# Is “Post Quantum Crypto” Going Mainstream – part 2?

I do not really know what is going on, but news on Quantum Computers (or “Quantum Information Science” – QIS) and Post Quantum Crypto keeps piling up, the latest item coming from the White House (see e.g. here).

A few days ago OpenSSH released version 9.0 (here the announcement), which features the “*hybrid Streamline NTRU Prime + x25519 key exchange method by default.*” In other words, the key exchange is performed by the standard X25519 ECDH key exchange algorithm (the previous default) paired with Streamlined NTRU Prime, a Post Quantum Crypto algorithm “*believed to resist attacks enabled by future quantum computers.*” If one of the two algorithms fails to protect the confidentiality of the encryption key, the other should continue to protect it, even if a quantum computer eventually manages to break X25519 ECDH on its own.
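The core idea of a hybrid key exchange can be sketched in a few lines: derive the session key from *both* shared secrets, so that an attacker who recovers only one of them learns nothing about the key. This is a minimal illustration of the principle, not OpenSSH's actual key-derivation code; the function name and the placeholder secrets are made up for the example.

```python
import hashlib
import os

def combine_shared_secrets(pq_secret: bytes, ecdh_secret: bytes) -> bytes:
    """Derive a session key from both shared secrets.

    An attacker must recover BOTH inputs to compute the key:
    knowing only one of them reveals nothing about the output.
    """
    return hashlib.sha512(pq_secret + ecdh_secret).digest()

# Placeholder values standing in for the real NTRU Prime and
# X25519 shared secrets (hypothetical, for illustration only).
pq_secret = os.urandom(32)
ecdh_secret = os.urandom(32)

session_key = combine_shared_secrets(pq_secret, ecdh_secret)
assert len(session_key) == 64
```

Changing either input completely changes the derived key, which is what makes the hybrid construction at least as strong as the stronger of its two components.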

# Is “Post Quantum Crypto” Going Mainstream?

We do not know if or when Quantum Computers will arrive: the usual estimate is 10 years, “at best” from the point of view of Quantum Computing, “at worst” from that of Cryptography.

Today Post Quantum Cryptography (PQC) aims to provide algorithms resistant to Quantum Computers, but it is still in a development phase (see e.g. NIST for details).

Concerning information security and Quantum Computers, today we should worry about at least two issues:

- how long the transition to Post Quantum Crypto algorithms will take;
- how to protect information that is encrypted today with standard algorithms but must still be protected in 10 or more years.

For the second point, one possibility is to adopt the emerging PQC algorithms already today and “double encrypt” sensitive long-term data with both a current algorithm and a PQC candidate, in the hope that if one of the two fails, the other will keep protecting the data. And judging from this IBM announcement (see also here), this seems to be starting right now.
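The “double encrypt” idea can be sketched as a cascade: encrypt with algorithm A, then encrypt the result with algorithm B, so the plaintext is safe as long as at least one layer holds. The sketch below uses one-time-pad XOR layers purely as stand-ins for a real classical cipher and a real PQC cipher; the function names and keys are hypothetical.

```python
import os

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # One-time-pad style XOR; stands in for a real cipher here.
    return bytes(d ^ k for d, k in zip(data, key))

def double_encrypt(plaintext: bytes, key_a: bytes, key_b: bytes) -> bytes:
    # Layer A (say, a current algorithm) then layer B (say, a PQC
    # candidate): recovering the plaintext requires breaking both.
    return xor_bytes(xor_bytes(plaintext, key_a), key_b)

def double_decrypt(ciphertext: bytes, key_a: bytes, key_b: bytes) -> bytes:
    # Peel the layers off in reverse order.
    return xor_bytes(xor_bytes(ciphertext, key_b), key_a)

secret = b"long-term sensitive data"
key_a = os.urandom(len(secret))   # "classical" layer key
key_b = os.urandom(len(secret))   # "post-quantum" layer key

ct = double_encrypt(secret, key_a, key_b)
assert double_decrypt(ct, key_a, key_b) == secret
```

Note that an attacker who recovers only one of the two keys is still facing the other, intact layer, which is exactly the property the “double encrypt” strategy relies on.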

# Always about Security Patching and Updates

These days I keep coming back to the “security patching and updates” issue. So I am going to add another couple of comments.

The first is about Ripple20 (here the official link, but the news is already widespread), which carries an impressive number of vulnerabilities with CVSS v3 base score 10.0. The question is again:

how can we secure all of these millions or billions of vulnerable devices, given that security patching seems very unlikely to be an option for most of them?

The second is purely hypothetical, that is, in the “food for thought” class.

Assume, as some say, that in 2030 Quantum Computers will be powerful enough to break RSA and other asymmetric cryptographic algorithms, and that at the same time (or just before) Post Quantum Cryptography will deliver new secure algorithms to substitute for RSA and friends. At first sight all looks fine: we will just have to do a lot of security patching/updating of servers, clients, applications, CA certificates, credit cards (hardware), telephone SIMs (hardware), security keys (hardware), Hardware Security Modules (HSM) and so on and on… But what about all those micro/embedded/IoT devices into which the current cryptographic algorithms are baked? And all of those large devices (like aircraft, but also cars) which have been designed with cryptographic algorithms baked into them (no change possible)? We will probably have to choose between living dangerously and buying a new one. Or we could be forced to buy a new one, if the device stops working because its old algorithms are no longer accepted by the rest of the world.

PS. Concerning Quantum Computers, as far as I know nobody claims that a full Quantum Computer will be functioning by 2030: this is only the earliest estimated arrival, and it could take much, much longer, or never happen at all!

PS. I deliberately do not want to consider the scenario in which full Quantum Computers are available and Post Quantum Cryptography is not.

# NIST Announces Second Round of Post-Quantum-Cryptography Standardization

NIST has announced the conclusion of the first round of the standardization process for post-quantum-cryptography algorithms, that is, public-key encryption and digital-signature algorithms which are not susceptible to attacks by quantum computers.

The announcement can be found here, and a report on the 26 participants in the second round can be downloaded from here.

# New Developments in Cryptography

Wired reports in this article on a recent advance in deployed cryptography by Google.

Last summer the NSA published an advisory about the need to develop and implement new crypto algorithms resistant to quantum computers. Indeed, if and when quantum computers arrive, they will be able to easily crack some of the most fundamental crypto algorithms in use, like RSA and Diffie-Hellman. The development of quantum computers is slow, but it continues, and it is reasonable to expect that sooner or later, some say in 20 years, they will become reality. The development of new crypto algorithms is also slow, so the quest for crypto algorithms resistant to quantum computers, also called post-quantum crypto, has already been going on for a few years.

Very recently Google announced the first real-world test of one of these new post-quantum algorithms. Google will deploy to some Chrome browsers an implementation of the Ring-LWE post-quantum algorithm, which the chosen test users will use to connect to some Google services. Ring-LWE will be used together with the crypto algorithms currently adopted by the browser. Composing the current algorithms with Ring-LWE guarantees that the combined scheme is at least as secure as the stronger of the algorithms in the combination. It should be noted that Ring-LWE is a much more recent crypto algorithm than the standard ones, and its security has not yet been established to a comparable level of confidence.

While the level of security should not decrease, and will hopefully increase, it remains to be seen how the combination works in practice, in particular in terms of performance.

For modern cryptography, this two-year Google project could become a cornerstone of the development and deployment of post-quantum algorithms.

# A New Approach to Quantum Random Number Generators and news on Quantum Cryptography

I am still interested in developments in the area of Quantum phenomena which can be used in ICT, and in particular in ICT Security. Recently there have been quite a few announcements of interest. Here are some of them:

- A scientific paper proposes a new way of generating Quantum Random Numbers, that is, ‘real random numbers’ (whatever that means), by using everyday technology like the camera of our smartphone; this does not mean that the smartphone camera is enough to produce real random numbers (for the moment you still need a computer to process the data it produces), but it is a sign that the technology is providing us with tools of unprecedented power, and soon our smartphone will be enough for a good many things;
- New developments in Quantum Cryptography (see here and here for details) would make it easier to implement Quantum Cryptography in practice; this is nice, even if it does not change dramatically the current status and relevance of Quantum Cryptography;
- Another article (see here for a comment) instead leaves me quite puzzled: either I do not understand it or there is something fundamentally flawed in the argument; otherwise it would look like it is possible to obtain quantum effects in classical physics, which is exactly what is not possible.
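The post-processing step mentioned in the first item (the computer working on the camera's raw data) is essentially randomness extraction: condensing noisy, biased sensor samples into a smaller number of near-uniform bits. A minimal hash-based sketch, where the raw samples, the function name and the entropy assumption are all illustrative:

```python
import hashlib

def extract_random_bits(raw_samples: bytes, n_bytes: int = 32) -> bytes:
    """Condense noisy, biased raw samples into near-uniform bits.

    A simple hash-based extractor: the raw sensor noise is fed
    through SHA-256, keeping fewer output bytes than the estimated
    entropy of the input (assumed here, not measured).
    """
    digest = hashlib.sha256(raw_samples).digest()
    return digest[:n_bytes]

# Hypothetical raw sensor readings (in reality: pixel values from
# a smartphone camera frame taken in low light).
raw = bytes([12, 13, 11, 14, 12, 200, 13, 12] * 64)
bits = extract_random_bits(raw, 16)
assert len(bits) == 16
```

The security of such an extractor rests entirely on the entropy estimate of the raw source, which is exactly the part that the quantum analysis of the camera noise is meant to justify.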

# On D-Wave and Quantum Computing

I have been following the development of Quantum Computers at a distance for a few years. One of the more controversial approaches to Quantum Computing is the one proposed by D-Wave. D-Wave is also the only company that claims to have a specialized version of a Quantum Computer ready to sell, and indeed they did sell at least one Quantum Computer to a consortium made up of Google, NASA and the Universities Space Research Association.

What is not yet clear is whether it is really a Quantum Computer and, even if it is, whether it offers any advantage over traditional computers. Opinions on this differ widely, and this IEEE Spectrum article tries to assess where we stand now.

# Ross Anderson, Quantum Computing and fundamental Quantum Mechanics

A paper by Ross Anderson and Robert Brady on Quantum Computing, Quantum Cryptography and Quantum Mechanics has just been published here.

I personally know some of the people mentioned in the paper, who have worked for many years on these aspects of fundamental Quantum Mechanics and Particle Physics. Without discussing the details of the theory proposed in this paper, I think some comments can be useful, since I worked in research in theoretical physics for a good part of my life.

It is true that Bell’s inequalities and the EPR paradox have been, and still are, the cause of many debates in fundamental theoretical physics, beginning with Einstein’s rejection of these concepts. I believe that today there is enough experimental evidence that on this point Einstein was wrong and Bell’s inequalities are indeed violated. In other words, I believe that Quantum Mechanics is a valid description of elementary physics at the Quantum scale. We know very well that (non-relativistic) Quantum Mechanics does not work, e.g., at very high energy scales like the ones probed by the CERN experiments which recently led to the discovery of the Higgs particle.
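For readers who want to see the violation concretely: for the singlet state, quantum mechanics predicts the correlation E(a, b) = −cos(a − b) between measurements along directions at angles a and b, and the standard CHSH choice of angles gives S = 2√2 ≈ 2.83, above the bound of 2 that any local classical theory must obey. A quick numerical check of that textbook calculation:

```python
import math

def E(a: float, b: float) -> float:
    # Quantum-mechanical correlation for the singlet state,
    # measured along directions at angles a and b (radians).
    return -math.cos(a - b)

# Standard CHSH measurement angles.
a, a2 = 0.0, math.pi / 2
b, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))
print(S)  # ≈ 2.828, i.e. 2*sqrt(2), exceeding the classical bound of 2
assert S > 2
```

This is precisely the gap that the experiments confirming the violation of Bell’s inequalities have measured.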

We know very well that there is a lot we do not yet understand in Particle Physics. This could mean that Quantum Computing could turn out to be harder than we expect, due to our ignorance of some new (quantum) physics.

But this time I disagree with Ross Anderson, since I do not believe that Classical Mechanics can explain this kind of phenomena, nor that it can show that the theory of Quantum Cryptography is flawed (implementing Quantum Cryptography in practice is a completely different story).

My 2c.