Quantum Computers and Error Correction / Mitigation

Error correction remains one of the main hurdles in the development of Quantum Computers. Recent developments (see here) by IBM point to first gaining performance improvements and error mitigation on Quantum Computers with a limited number of qubits, such as IBM's latest 156-qubit Heron processor, instead of pushing for Quantum Computers with thousands of qubits without a better approach to error mitigation and correction.

A Quantum Computer Status Update

On IEEE Spectrum I found this quite interesting article by IBM researchers about the current status and possible future developments of Quantum Computers. Even if there is no direct mention of “breaking RSA” (though Shor's algorithm is mentioned), it is worth reading alongside the recent NIST announcement of the first 3 Post Quantum Encryption Standards (here and here).

The first World Quantum Readiness Day, September 26, 2024

Somehow I missed the announcement of DigiCert organizing the first “World Quantum Readiness Day” (see here and here). The purpose of this initiative is to help organizations prepare for the (future) arrival of Quantum Computers: to evaluate the risks and the opportunities, and to adopt measures that mitigate the former and take advantage of the latter.

Cryptanalysis, Hard Lattice Problems and Post Quantum Cryptography

Cryptanalysis is the study of weaknesses in cryptographic algorithms, and it is essential to proving that a cryptographic algorithm is safe. Assuming that Quantum Computers will arrive in the near or distant future, cryptographic algorithms will need to be safe against both classical and quantum cryptanalysis.

Quantum cryptanalysis is the study of weaknesses in cryptographic algorithms that can be exploited only by algorithms running on Quantum Computers. Shor's algorithm is the most important quantum algorithm because it would break algorithms such as RSA, and with them much of our current IT security.

Post Quantum algorithms are mostly based on hard lattice problems, which are believed to resist Shor's algorithm and thus should remain safe even if a full-scale Quantum Computer becomes available. But research in quantum cryptanalysis keeps advancing, as shown by this recent paper which, luckily for us, contained an error that invalidated the proof. As Bruce Schneier commented here, quantum cryptanalysis still has a long way to go before becoming a real threat: not only must a proof be formally correct and apply to the actual post-quantum algorithms rather than to reduced models, but the resulting attack must also actually run on a Quantum Computer, whenever one becomes available.

What can be Done with the Quantum Computers we can Build

This is an interesting article about how it will be possible to use the Quantum Computers that realistically will be built in the next decade. The main areas seem to be solving quantum problems in chemistry, materials science, and pharma. Google and XPrize are also offering prizes of up to US $5 million to those who can find more practical applications for the Quantum Computers that will be available in the near future.

Is Quantum Computing Harder than Expected?

This is a quite interesting article on Quantum Computing and how hard it really is.

It is well known that Quantum Computers are prone to quantum errors, and this issue grows with the number of qubits. The typical estimate is that a useful Quantum Computer would need approximately 1,000 physical qubits to correct the quantum errors of a single “logical” qubit. Even if there are advancements on this topic (see for example this post), it is still a problem to be solved in practice.
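
Why errors grow with qubit count is simple arithmetic: if every physical operation fails independently with some small probability, the chance of an error-free run decays exponentially with circuit size. A back-of-envelope sketch (the error rate `p=1e-3` is an illustrative assumption, not a measured figure):

```python
# If each operation fails independently with probability p, the
# probability that an entire computation of n_ops operations sees
# no error is (1 - p)^n_ops, which decays exponentially.
def p_error_free(n_ops, p=1e-3):
    return (1 - p) ** n_ops

for n in (100, 1_000, 10_000):
    print(f"{n:>6} ops -> error-free probability {p_error_free(n):.4f}")
```

At a 0.1% per-operation error rate, a 10,000-operation circuit almost never finishes cleanly, which is why redundancy on the scale of ~1,000 physical qubits per logical qubit is being discussed.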

Another potential issue is that Quantum Computers have been proposed to efficiently solve many problems, including optimization, fluid dynamics, etc., besides those problems for which a Quantum Computer would provide an exponential speed-up, such as factoring large numbers and simulating quantum systems. But if a Quantum Computer does not provide an exponential speed-up in solving a problem, it may actually turn out to be slower than a current “classical” computer.
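
The point about non-exponential speed-ups can be made concrete with a toy comparison: a quadratic (Grover-style) speed-up only pays off once the problem is large enough to amortize the quantum machine's much slower per-operation time. Both timing constants below are illustrative assumptions, not measured figures:

```python
# Toy model: classical machine does n operations at t_op each;
# a quadratically faster quantum machine does sqrt(n) operations,
# but each quantum operation is assumed far slower.
def classical_time(n, t_op=1e-9):   # assumed 1 ns per classical op
    return n * t_op

def quantum_time(n, t_op=1e-6):     # assumed 1 us per quantum op
    return (n ** 0.5) * t_op

for n in (10**4, 10**10):
    print(f"n={n:>12}: classical {classical_time(n):.2e}s, "
          f"quantum {quantum_time(n):.2e}s")
```

With these assumed constants the classical machine wins at small n and the quantum one only overtakes it for very large problems; an exponential speed-up, by contrast, wins at any realistic size.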

But the big question remains: will a real useful Quantum Computer arrive soon? If yes, how soon?

“Error Suppression” for Quantum Computers

Recently IBM announced that it has integrated into its Quantum Computers an “Error Suppression” technology from Q-CTRL which can reduce, even by orders of magnitude, the likelihood of quantum errors when running an algorithm on a Quantum Computer (see for example here).

Quantum errors are inherent to Quantum Computing, and the likelihood of error usually grows with the number of qubits. Theoretical Quantum Error Correction Codes exist, but their practical implementation is not easy; for example, even the simplest codes can require as many as a thousand error-correction qubits for each computational qubit.
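
The simplest such code is the repetition (bit-flip) code: each logical bit is stored redundantly, and a majority vote corrects any single flip. The classical simulation below illustrates the idea (real quantum codes must also handle phase errors and cannot copy arbitrary states, so this is only a sketch of the principle):

```python
import random

def encode(bit):
    # 3-qubit repetition code: store the logical bit three times
    return [bit, bit, bit]

def noisy_channel(code, p=0.1):
    # flip each physical bit independently with probability p
    return [b ^ (random.random() < p) for b in code]

def decode(code):
    # majority vote: corrects any single bit flip
    return int(sum(code) >= 2)

random.seed(0)
received = noisy_channel(encode(1), p=0.1)
print(received, "->", decode(received))
```

The cost is visible even here: three physical bits per logical bit, and real quantum codes that also protect against phase errors are far more expensive, hence the ~1,000-to-1 estimates above.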

The approach by Q-CTRL seems to adopt a mixture of techniques to identify the most efficient and least error-prone way of running a computation, for example by optimizing the distribution of quantum logic gates across the qubits and by monitoring quantum errors to design more efficient quantum circuits.

It is surely an interesting approach; we will see how effective it really turns out to be in reducing the likelihood of quantum errors.

NSA and Post Quantum Cryptography

The National Security Agency (NSA, USA) has announced the “Commercial National Security Algorithm Suite 2.0” (CNSA 2.0, you can find the announcement here and some FAQ here).

There are a few points of interest related to this announcement:

  • first of all, NIST has not yet completed the selection and standardization of all Post Quantum Cryptography algorithms (expected by 2024), but the NSA has nonetheless decided to require the implementation of the algorithms already standardized by NIST (see NIST SP 800-208) and to suggest getting ready to implement the others that will be standardized in the coming years; this can be interpreted as a sign that the NSA feels some urgency in introducing the new algorithms and foresees that Quantum Computers able to break current cryptographic algorithms like RSA will arrive in a not-too-distant future;
  • the already standardized new PQC algorithms are to be used only for software and firmware signing, and the transition to them must begin immediately;
  • the timelines are quite short considering the time it takes to implement all the new algorithms, including in hardware. In summary: the already standardized new PQC algorithms must be implemented by 2025 and used exclusively by 2030; all other new PQC algorithms should be supported or implemented by 2025 and used exclusively by 2033;
  • the above mentioned timelines suggest that the NSA believes a Quantum Computer able to break current cryptographic algorithms like RSA could be available around 2035.

On Cryptographic Agility

Cryptographic algorithms change, evolve, and are retired. This is nothing new, but we are still not good at swapping an old, weak, or deprecated algorithm for a new one. Many issues make this quite complex, such as:

  • algorithms can be implemented in hardware for performance; replacing them with software implementations can notably degrade system performance, while new hardware implementations can take years to develop and may require changing many hardware components;
  • new algorithms have different software interfaces, which can require updating all the programs that use them.

The experience of the last 30 years shows that it can take many years to move to new cryptographic algorithms: from DES to AES, from MD4 and MD5 to SHA-0, SHA-1, SHA-2, and SHA-3. To make things even more complicated, long-term sensitive information must be kept securely encrypted for many years, which requires algorithms that will remain effective for the same time span, whereas digital signatures must be re-applied with new and stronger algorithms before the old ones are deprecated.

To all this we can add the threat of Quantum Computers which, should they become truly operational, would be able to break practically all current asymmetric algorithms (e.g. RSA). Do we need to replace all asymmetric algorithms with the new Post Quantum Cryptographic algorithms as soon as these become available? And how long will this take? What if one of these new PQC algorithms, which are based on new types of math, is broken a short time after its deployment?

So we need to vastly improve our “agility” in swapping from old to new cryptographic algorithms, and to be proactive in doing it. This requires designing new interfaces and processes that make it easy to swap one cryptographic algorithm for another.
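
One common design that supports this kind of agility is to dispatch on an algorithm identifier carried alongside the data, so that swapping algorithms becomes a registry change rather than a code rewrite. The sketch below is a hypothetical illustration (all names are invented for the example; a MAC stands in for a signature scheme to keep it self-contained):

```python
import hashlib
import hmac

# Hypothetical algorithm registry: adding a new (e.g. post-quantum)
# scheme only means registering another entry here.
REGISTRY = {}

def register(name):
    def wrap(cls):
        REGISTRY[name] = cls
        return cls
    return wrap

@register("hmac-sha256")
class HmacSha256:
    @staticmethod
    def sign(key, msg):
        return hmac.new(key, msg, hashlib.sha256).digest()

    @staticmethod
    def verify(key, msg, sig):
        expected = hmac.new(key, msg, hashlib.sha256).digest()
        return hmac.compare_digest(sig, expected)

def sign(alg, key, msg):
    # The wire format carries the algorithm identifier, so verifiers
    # can handle old and new algorithms during a transition period.
    return alg.encode() + b"|" + REGISTRY[alg].sign(key, msg)

def verify(key, msg, tagged_sig):
    alg, sig = tagged_sig.split(b"|", 1)
    return REGISTRY[alg.decode()].verify(key, msg, sig)

tag = sign("hmac-sha256", b"secret-key", b"hello")
print(verify(b"secret-key", b"hello", tag))  # True
```

The design choice to tag each signature with its algorithm is what makes gradual migration possible: old signatures keep verifying while new ones are produced with the stronger scheme.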