Is Quantum Computing Harder than Expected?

This is a quite interesting article on Quantum Computing and how hard it really is.

It is well known that Quantum Computers are prone to Quantum Errors, and this issue grows with the number of Qubits. The typical estimate is that a useful Quantum Computer would need approx. 1,000 physical Qubits to correct the Quantum Errors of a single “logical” Qubit. Even if there are advancements in this area (see for example this post), this is still a problem to be solved in practice.
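As a rough back-of-the-envelope illustration of that overhead (the 1,000-to-1 ratio is the estimate quoted above; real figures depend on the error-correcting code and the physical error rates, and the logical-qubit count used below is only a commonly cited order of magnitude):

```python
# Rough estimate of physical qubits needed for error correction,
# assuming (as above) ~1,000 physical qubits per logical qubit.
PHYSICAL_PER_LOGICAL = 1_000

def physical_qubits(logical_qubits: int, overhead: int = PHYSICAL_PER_LOGICAL) -> int:
    """Total physical qubits needed for a given number of logical qubits."""
    return logical_qubits * overhead

# Factoring RSA-2048 is often estimated to need a few thousand logical qubits:
print(physical_qubits(4_000))  # 4,000,000 physical qubits
```

Millions of physical Qubits, against the dozens or hundreds available today: this is why the overhead is such a practical obstacle.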

Another potential issue is that Quantum Computers have been proposed as efficient solvers for many problems, including optimization, fluid dynamics etc., besides those problems for which a Quantum Computer would provide an exponential speed-up, such as factoring large numbers and simulating quantum systems. But if a Quantum Computer provides only a modest, non-exponential speed-up on a problem, there is the possibility that it would actually be slower than a current “classical” computer.
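To see why a merely quadratic speed-up may not be enough, consider a toy model (the step times below are my own invented numbers, not from the article): an unstructured search over N items takes roughly N classical steps versus roughly √N quantum steps (Grover's algorithm), but if each quantum step is far slower than a classical one, the quantum machine only wins for very large N.

```python
# Toy comparison: classical search (~N steps) vs Grover search (~sqrt(N) steps).
# Step times are invented for illustration only.
q_step = 1e-3   # seconds per quantum step (assumed, deliberately pessimistic)
c_step = 1e-9   # seconds per classical step (assumed)

# Total times are equal when sqrt(N) * q_step == N * c_step,
# i.e. N == (q_step / c_step) ** 2.
crossover = (q_step / c_step) ** 2
print(f"quantum wins only for N > {crossover:.0e}")  # ~1e12 items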
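```

With these (invented) numbers, the quantum machine is only worthwhile for searches over about a trillion items; a genuinely exponential speed-up, by contrast, overwhelms any constant-factor slowness.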

But the big question remains: will a real useful Quantum Computer arrive soon? If yes, how soon?

Towards Web 3, but first: What is it?

I am sorry, but I am confused.

I am reading and hearing about “Web 3”, but I am not sure if I understand what it is all about. It is quite possible that I missed some information.

So, from what I understand:

  1. Web 1 seems to correspond to the first incarnation of the “WWW”, from the early years up to the first e-commerce platforms (until approx. 2004)
  2. Web 2 seems to correspond to what we currently know as the “WWW”, based on dynamic pages and services, or the “Web as a platform”, where most services are centralised (e.g. Cloud) and/or users are also producers of content (social media etc.)
  3. Web 3 is not here yet, but should be arriving soon: a “decentralised online ecosystem based on blockchain” (see Wikipedia). It should also incorporate some features envisioned by Tim Berners-Lee in his 1999 proposal of a “Semantic Web” (or Web 3.0, just to add to the confusion): a web of data that can be processed by machines, that is, Internet data made machine-readable.

And 2021 should have been the year of the real beginning of Web 3, with crypto-currencies, NFTs and a general adoption of decentralised blockchain services. But opinions on this diverge widely: from extremely optimistic to “marketing buzzword”.

I’ve tried to think about it, and from the little I understand I see at least two points of view: that of individuals and that of companies. For personal use of the WWW I do not think that much will change: there will still be services to use online, apps to install (and update, but no pain please) and companies delivering all of that (at a price, or with other business models). From the company point of view, the only thing that comes to my mind is a parallel with the IT outsourcing / insourcing cycle: technologies and business models change, and approaches follow.

Still, it is not really clear to me what Web 3 actually is or should be.

Solar Superstorms and IT BC/DR

Very interesting research paper with a scary title: “Solar Superstorms: Planning for an Internet Apocalypse”. It is about a Black Swan event that has actually already happened, in 1859: a major solar Coronal Mass Ejection (CME), which has some chance of happening again in the near future. Without going into detail (the research paper is quite readable), the main point is that if a CME of 1859’s magnitude hit Earth today, the consequences would be catastrophic. Apart from the impact on the electric grid, and in particular on long-distance power distribution (though power operators should be aware of this threat), the paper points out that there would be severe damage to satellites, in particular low-orbit ones, with possible total failure of satellite communication including GPS, television broadcasting and data (Internet) transmission. But equally at risk are long-distance communication cables, most notably submarine optical fibre cables. Optical fibres per se would not be affected, but the optical repeaters placed along them every 50 – 150 km at the bottom of the oceans would burn out and stop almost all communication between continents.

I remember discussing a similar scenario years ago with some physicist friends, wondering whether it could be a real threat or not. It seems that it can be; but is the cost of mitigating this threat worth it? Should we act today?

iPhone X and Science Fiction

I usually do not comment on new products, but what I read about the new iPhone X made me wonder whether we are finally getting closer to the countless science-fiction computers that can really interact directly with a human being.

I guess that everyone remembers HAL 9000 from “2001: A Space Odyssey” (1968); it had plenty of ancestors and countless descendants.

On Manufacturing, IoTs and IT Security

For many years we have been used to the fact that products of any kind contain digital and electronic components. The process of manufacturing products and integrating digital and/or electronic components into them is by now quite well established and robust. The most important requirements for the digital / electronic components are that they perform their tasks correctly and that they are reliable. Security is mostly perceived as safety, for example from electric shock or from dangerous behaviour of the product induced by the digital / electronic components. It does not matter that the digital component has features which are not used by the product, or that it was designed for other purposes, as long as it performs correctly as a component of the product.

But the scenario changes dramatically if the digital component is connected to a network, in particular the Internet. In this case the product becomes part of the “Internet of Things” (IoT), and the security perspective changes completely. For example, those unused features of the digital component, if not correctly configured and managed, can be abused and become a serious security threat. What harm can be done with a washing machine connected to the Internet? Difficult to say, but if short of imagination one can always try to join the washing machine to a botnet for distributed denial-of-service (DDoS) attacks.

So the manufacturer should also take care of the full IT security of any digital / electronic component embedded in its products. This means that even unused features must be configured, managed and updated.
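As a minimal sketch of what “taking care of unused features” could mean in practice (the device address in the example is hypothetical, and a real audit would cover much more than TCP ports), one can at least check which network services an embedded component exposes:

```python
import socket

def open_tcp_ports(host: str, ports: range, timeout: float = 0.5) -> list[int]:
    """Return the list of TCP ports on `host` that accept a connection."""
    found = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the port is open
                found.append(port)
    return found

# Hypothetical IoT device on the local network; every open port found
# is a feature that must be configured, managed and updated:
# open_tcp_ports("192.168.1.50", range(1, 1025))
```

Any service listed by such an audit that the product does not actually need is exactly the kind of unused feature discussed above.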

But this is not all. The interaction between components in a product can create new types of security threats, which can be considered akin to side-channel threats and attacks. The abuse and misuse of digital components can be quite inventive; for example, I recently noticed the following in the news:

  • how to use a scanner to communicate, through a laser mounted on a drone, with malware on a PC (see e.g. this article)
  • how a smartphone or laptop’s ambient light sensor can be used to steal the browsing history from the device (see e.g. this article)
  • how to install malware on a Smart TV using DVB terrestrial radio signals (see e.g. this article)

and others concerning light-bulbs, surveillance cameras etc.

Typically, in IT security one first has to describe clearly the threat scenarios and, based on these, evaluate the risks and the security measures needed to mitigate them. In the case of IoT it seems very difficult to imagine all possible threat scenarios arising from the interaction between embedded Internet-connected digital components and the other components of the product.
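That usual process (enumerate threat scenarios, then score and rank their risks) can be caricatured in a few lines; the scenarios and the likelihood / impact scores below are invented purely for illustration:

```python
# Toy risk register: risk = likelihood x impact, on invented 1-5 scales.
scenarios = {
    "unused telnet service abused for a botnet": (4, 3),
    "ambient light sensor leaks browsing data":  (1, 2),
    "firmware update channel hijacked":          (2, 5),
}

risks = {name: likelihood * impact
         for name, (likelihood, impact) in scenarios.items()}

# Highest-risk scenarios first: these get mitigated with priority.
for name, score in sorted(risks.items(), key=lambda kv: -kv[1]):
    print(f"{score:2d}  {name}")
```

The hard part for IoT is not the arithmetic, of course, but filling in the list of scenarios in the first place.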

Even more difficult is to imagine how, in the current markets, manufacturers of products like light bulbs, refrigerators, television sets and more or less anything else one can imagine can devote time and money to the security of embedded digital components produced by someone else: components that should just work, cost as little as possible and require no maintenance.

PS. Products like cars, airplanes etc. in regulated sectors should constitute a welcome exception to this, thanks to the very stringent safety concerns and rules that apply to them.

PPS. Also of interest is this just-published Microsoft whitepaper on Cybersecurity Policy for IoT.

On D-Wave and Quantum Computing

I have been following from a distance, for a few years now, the development of Quantum Computers. One of the more controversial approaches to Quantum Computing is the one proposed by D-Wave. D-Wave is also the only company which claims to have a specialized version of a Quantum Computer ready to sell, and it actually did sell at least one Quantum Computer to a consortium formed by Google, NASA and the Universities Space Research Association.

What is not yet clear is whether it is really a Quantum Computer and, even if it is, whether it gives any advantage with respect to traditional computers. There are quite a few different opinions about this, and this IEEE Spectrum article tries to assess where we stand now.

Managing a Large ICT Implementation is Hard

Recently there has been quite some news about failed large ICT projects, starting with the Obamacare rollout. One of the latest items is that Bridgestone is suing IBM for fraud, for $600 million, over a failed IT implementation (see here for details).

We have known for at least 20 years that large ICT projects are hard and that quite often they fail, at least in the sense that they do not deliver what was agreed at the beginning. (A very easy and often adopted way of guaranteeing that an ICT project is successful is to change its requirements and goals at the end.)

What seems new to me is that the news about these failures is becoming more and more public, probably because the failures affect more and more people, and that someone is starting to complain, in this case to the point that the customer believes a fraud has been committed against him.

Actually, this trend could help the ICT business in the long run, since it will force us to learn how to manage large ICT projects and implementations and to produce (at last) higher-quality ICT software products.

Will tablets kill desktop PCs?

A few days ago IDC released a forecast (see here and here) according to which, by 2017, 87% of connected devices will be tablets and smartphones. Desktop PC sales will decline, whereas tablet and smartphone sales will grow by double digits.

This does not surprise me: most users do not need a full PC to browse the web and access the few applications, by now mostly “in the cloud”, that they use. Ease of access, intuitive interfaces and great graphics are more important than the full power of a desktop PC with all possible kinds of resident applications (which the user would then have to manage).

Security and management of all kinds should be handled by the device provider, ideally almost invisibly to the user or with very limited user participation.

Privacy and the dissemination of personal information are the only issues that directly involve every user, and on this point we will need to improve quite a lot.

Obviously, work-related PC requirements are different, and for this use desktop PCs will remain, albeit in reduced numbers.

Device fingerprinting and user tracking

A recent study by KU Leuven-iMinds researchers points out that device and web-browser fingerprinting is on the rise, in spite of all efforts to limit it, like the introduction of the “Do Not Track” HTTP header.

This does not surprise me, since advertising and marketing are usually at odds with privacy, and most people do not really understand the meaning and breadth of the information that can be collected by tracking users on the Internet.

On the other hand, device fingerprinting is a very useful tool for the ICT security of web transactions: knowing which device is making the transaction and to which user it is (usually) associated, combined with the geolocation of IP addresses and other information, can make the difference between a valid transaction and an attempted fraud.
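A naive sketch of how fingerprinting works (the attribute names and values below are invented; real fingerprints combine dozens of signals such as installed fonts, canvas rendering and time zone): hash a handful of device attributes into a stable identifier.

```python
import hashlib
import json

def fingerprint(attrs: dict) -> str:
    """Derive a stable identifier from device/browser attributes."""
    canonical = json.dumps(attrs, sort_keys=True)  # stable key ordering
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Invented example attributes for a hypothetical device:
device = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) ...",
    "screen": "1920x1080x24",
    "timezone": "Europe/Rome",
    "language": "it-IT",
}

print(fingerprint(device))  # same attributes -> same identifier
```

The same mechanism serves both masters: a fraud-detection system can recognise a returning customer's device, and an advertiser can recognise a returning user across sites.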

In the end, the most important issue is by whom and how a tool is used, and this holds true in particular for security tools: a gun in the hand of a policeman should be used for a good end, but the same gun in the hand of a thief certainly will not be.

It never happens to me :-(

ICT “Glitches” are the smaller brothers of “Bugs”, which in turn can become major security disasters. Well, sometimes they can make us (the ones benefitting innocently from the Glitch) happy, or at least they can make us laugh. This case is quite notable: “Bank error makes restaurant manager the world’s first ever trillionaire (and he even offered to pay off the national debt before the glitch was spotted)”.

I am also addicted to the blog “IT Hiccups of the Week”, which sometimes reports on the very good discounts (due to one kind of Glitch or another) we just missed at our local supermarket.