NSA and Post Quantum Cryptography

The US National Security Agency (NSA) has announced the “Commercial National Security Algorithm Suite 2.0” (CNSA 2.0; you can find the announcement here and an FAQ here).

There are a few points of interest related to this announcement:

  • first of all, NIST has not yet completed the selection and standardization of all the Post Quantum Cryptography algorithms, a process that should be completed by 2024; nevertheless, the NSA has decided to require the implementation of the algorithms already standardized by NIST (see NIST SP 800-208) and to suggest getting ready to implement the others that will be standardized in the coming years; this can be interpreted as a sign that the NSA feels some urgency in introducing the new algorithms and foresees that Quantum Computers able to break current cryptographic algorithms like RSA will arrive in a not too distant future;
  • the already standardized new PQC algorithms are to be used only for software and firmware signing, and the transition to them must begin immediately;
  • the timelines are quite short considering the time it takes to implement all the new algorithms, also in hardware; summarizing: the already standardized new PQC algorithms must be implemented by 2025 and used exclusively by 2030, while all the other new PQC algorithms should be supported or implemented by 2025 and used exclusively by 2033;
  • the above mentioned timelines suggest that the NSA believes that a Quantum Computer able to break current cryptographic algorithms like RSA could be available around 2035.

On Cryptographic Agility

Cryptographic algorithms change, evolve and are retired. This is nothing new, but we are still not good at swapping an old, weak or deprecated algorithm for a new one. There are many issues that make this quite complex, such as:

  • algorithms can be implemented in hardware for performance; substituting them with software implementations can notably degrade the performance of the system, while new hardware implementations can take years to develop and require changing many hardware components;
  • new algorithms have different software interfaces, which can require that all programs using them be updated accordingly.

The experience of the last 30 years shows us that it can take many years to switch to new cryptographic algorithms: from DES to AES, from MD4 and MD5 to SHA-0, SHA-1, SHA-2 and SHA-3. To make things even more complicated, long-term sensitive information must be kept securely encrypted for many years, which requires using algorithms that will remain effective for the same time span, whereas digital signatures must be re-applied with the new and stronger algorithms before the old ones are deprecated.

To all this we can add the threat of Quantum Computers which, should they become really operational, will be able to break practically all current asymmetric algorithms (e.g. RSA). Do we need to replace all asymmetric algorithms with the new Post Quantum Cryptography algorithms as soon as these become available? How long will this take? And what if one of these new PQC algorithms, which are based on new types of math, is broken a short time after its deployment?

So we need to vastly improve our “agility” in swapping old cryptographic algorithms for new ones, and to be proactive in doing so. This requires designing new interfaces and processes that make it easy to swap one cryptographic algorithm for another.
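As a purely illustrative example, the minimal sketch below (in Python, with invented names) shows one way to decouple callers from a specific algorithm: code refers to an algorithm by identifier through a small registry, so migrating to a new algorithm becomes a configuration change, and stored digests remain verifiable because they carry their algorithm identifier.

```python
# Minimal sketch of an algorithm-agnostic hashing interface: callers refer to
# an algorithm by identifier, so swapping it is a configuration change, not a
# code change. Names and structure are illustrative, not a standard API.
import hashlib
from typing import Callable, Dict

# Registry mapping an algorithm identifier to its implementation.
_HASHES: Dict[str, Callable[[bytes], bytes]] = {
    "sha2-256": lambda data: hashlib.sha256(data).digest(),
    "sha3-256": lambda data: hashlib.sha3_256(data).digest(),
}

CURRENT_HASH = "sha2-256"  # single point of change when migrating


def digest(data: bytes, algorithm: str = CURRENT_HASH) -> tuple:
    """Return (algorithm-id, digest) so stored values remain verifiable
    even after the default algorithm has been changed."""
    return algorithm, _HASHES[algorithm](data)


if __name__ == "__main__":
    alg, d = digest(b"example data")
    print(alg, d.hex())
```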

Cryptography is still Difficult to Implement

This is a recurring theme (see e.g. this older post): how difficult it is to implement Cryptography in the “right” way. It requires a lot of knowledge and expertise in Cryptography; programming skills alone are not enough.

Recently I read the article “PRACTICAL BRUTEFORCE OF AES-1024 MILITARY GRADE ENCRYPTION” by Kudelski Security Research, in which they show how it was possible to brute-force what was called “AES-1024 military grade encryption”. AES is not broken (and “AES-1024” does not really stand for AES with 1024-bit keys): the main problem was the custom procedure adopted to transform a user-chosen password into an AES encryption key.

Which shows, once again, how difficult it is to implement Cryptography so that it attains the expected security goals.
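For reference, the usual way to turn a password into a symmetric key is a salted, deliberately slow, standard key derivation function rather than a custom construction. Here is a minimal sketch using PBKDF2 from Python's standard library; the iteration count and salt size are illustrative choices, not a recommendation for any specific product.

```python
# Sketch of deriving an AES-256 key from a password with a standard, salted,
# deliberately slow KDF (PBKDF2-HMAC-SHA256 from the standard library) instead
# of a custom construction. Iteration count and salt size are illustrative.
import hashlib
import os


def derive_key(password: str, salt: bytes = None, iterations: int = 600_000):
    """Return (salt, 32-byte key). The salt must be stored next to the
    ciphertext so the key can be re-derived for decryption."""
    if salt is None:
        salt = os.urandom(16)  # random per-password salt
    key = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"),
                              salt, iterations, dklen=32)
    return salt, key


if __name__ == "__main__":
    salt, key = derive_key("correct horse battery staple")
    print(salt.hex(), key.hex())
```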

Is “Post Quantum Crypto” Going Mainstream – part 2?

A few days ago OpenSSH released version 9.0 (here the announcement), which features the “hybrid Streamlined NTRU Prime + x25519 key exchange method by default.” In other words, the key exchange is performed by the standard X25519 ECDH key exchange algorithm (the previous default) paired with Streamlined NTRU Prime, a Post Quantum Crypto algorithm “believed to resist attacks enabled by future quantum computers.” If one of the two algorithms fails to protect the confidentiality of the exchanged key, the other should continue to protect it, even if a quantum computer becomes able to successfully attack X25519 ECDH alone.
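The underlying idea can be sketched as follows. This is not OpenSSH's exact sntrup761x25519-sha512 construction, and the post-quantum share below is just a placeholder, since Streamlined NTRU Prime is not available in common Python libraries: the point is only that both shared secrets are hashed together, so the derived key remains secret as long as at least one of the two algorithms holds.

```python
# Sketch of the hybrid key-exchange idea: hash the classical ECDH shared
# secret together with the shared secret of a post-quantum KEM, so the
# session key stays secret as long as at least ONE component is unbroken.
# NOT OpenSSH's exact construction; requires the 'cryptography' package.
import hashlib
import os
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Classical part: X25519 ECDH between two parties.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()
ecdh_secret = alice_priv.exchange(bob_priv.public_key())

# Post-quantum part: placeholder for the shared secret a KEM such as
# Streamlined NTRU Prime would produce; in reality both parties obtain the
# same value via the KEM's encapsulation/decapsulation, not random bytes.
pqc_secret = os.urandom(32)

# Combine both secrets; breaking only one of them is not enough.
session_key = hashlib.sha512(ecdh_secret + pqc_secret).digest()[:32]
print(session_key.hex())
```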

Is “Post Quantum Crypto” Going Mainstream?

We do not know if or when Quantum Computers will arrive: an estimate of 10 years is “at best” from the point of view of Quantum Computing, and “at worst” from the point of view of Cryptography.

Today Post Quantum Cryptography (PQC) aims to provide algorithms resistant to Quantum Computers, but it is still in a development phase (see e.g. NIST for details).

Concerning information security and Quantum Computers, today we should worry about at least two issues:

  1. how long it will take to perform the transition to Post Quantum Crypto algorithms;
  2. how to protect information which is encrypted today with standard algorithms but should still be protected in 10 or more years.

For the second point, one possibility is to adopt the emerging PQC algorithms already today and “double encrypt” sensitive long-term data with both a current algorithm and a still-in-development PQC algorithm, in the hope that if one of the two fails the other will keep protecting the data. And based on this IBM announcement (see also here), this seems to be starting right now.
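To illustrate only the layering idea, here is a minimal sketch of such “double encryption” with two independent keys and two different ciphers. The outer layer uses ChaCha20-Poly1305 purely as a stand-in for the layer whose keys would in practice be established or wrapped with a PQC algorithm; nothing here reflects IBM's actual implementation. It assumes the Python 'cryptography' package.

```python
# Sketch of "double encryption" of long-term data: two layers with
# independent keys, so an attacker must break BOTH to read the plaintext.
# The outer ChaCha20-Poly1305 layer is only a stand-in for a future
# PQC-protected layer. Requires the 'cryptography' package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM, ChaCha20Poly1305


def double_encrypt(plaintext: bytes, key_classic: bytes, key_second: bytes) -> bytes:
    # Inner layer: today's standard algorithm (AES-256-GCM).
    nonce1 = os.urandom(12)
    inner = nonce1 + AESGCM(key_classic).encrypt(nonce1, plaintext, None)
    # Outer layer: independent key and cipher (placeholder for a PQC layer).
    nonce2 = os.urandom(12)
    return nonce2 + ChaCha20Poly1305(key_second).encrypt(nonce2, inner, None)


def double_decrypt(blob: bytes, key_classic: bytes, key_second: bytes) -> bytes:
    inner = ChaCha20Poly1305(key_second).decrypt(blob[:12], blob[12:], None)
    return AESGCM(key_classic).decrypt(inner[:12], inner[12:], None)


if __name__ == "__main__":
    k1, k2 = os.urandom(32), os.urandom(32)
    blob = double_encrypt(b"long-term secret", k1, k2)
    assert double_decrypt(blob, k1, k2) == b"long-term secret"
```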

Fixing Cryptography is not Always Easy

The latest version of the Zloader banking malware is (also) exploiting a Microsoft Signature Verification bug (CVE-2013-3900) for which a bugfix has existed since 2013 (see for example here for more details). In this case the security issue is not due to users not updating their systems with the mandatory security patches, but to the fact that the patch is optional and must be enabled manually.

The problem is that the stricter signature verification implemented by the Microsoft Authenticode patch which fixes this bug carries an extremely high risk of false positives in many situations: for example, some installers can be identified as having an invalid signature. So Microsoft decided to let users decide for themselves whether the patch would create more problems than it solves.
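For completeness, my understanding is that the stricter verification is enabled by adding a registry value (EnableCertPaddingCheck) documented in Microsoft's advisory for CVE-2013-3900; the sketch below sets it via Python's winreg module, but the exact path and value type reflect my reading of that guidance and should be verified against the official advisory before use.

```python
# Sketch (Windows only, run as Administrator): opting in to the stricter
# Authenticode verification for CVE-2013-3900 by setting the registry value
# described in Microsoft's advisory. Path and value type below are my
# understanding of that advisory -- verify against the official guidance
# before using. 64-bit systems reportedly need the Wow6432Node path as well.
import winreg

PATHS = [
    r"Software\Microsoft\Cryptography\Wintrust\Config",
    r"Software\Wow6432Node\Microsoft\Cryptography\Wintrust\Config",
]

for path in PATHS:
    with winreg.CreateKeyEx(winreg.HKEY_LOCAL_MACHINE, path, 0,
                            winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "EnableCertPaddingCheck", 0, winreg.REG_SZ, "1")

print("Stricter Authenticode padding check enabled (a reboot may be required).")
```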

The Zloader malware uses this “bug” to be able to run some modified (and therefore unsigned) libraries. But this requires that the malware is already on the system, so applying this patch does not prevent a system from being infected by this malware.

The issue that this event points out, once again, is how difficult it is to balance strict security, in particular when cryptography is involved, with the usability and availability of systems and services.

Risks of Cryptography

Cryptography is too often considered the “final” solution for IT security. In reality cryptography is often useful, rarely easy to use, and brings its own risks, including an “all-in or all-out” scenario.

Indeed, suppose that your long-lived master key is exposed: then all the security provided by cryptography is immediately lost, which seems to be what happened in this case.
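As a purely illustrative sketch of this “all-in or all-out” property (not the scheme involved in the incident above), consider a typical key hierarchy in which one master key wraps every data-encryption key: whoever obtains the master key can unwrap all data keys and decrypt everything. The example uses Fernet from the Python 'cryptography' package.

```python
# Sketch of why a key hierarchy is "all-in or all-out": one master key wraps
# every data-encryption key, so whoever obtains the master key can unwrap all
# of them and decrypt everything. Purely illustrative; requires the
# 'cryptography' package.
from cryptography.fernet import Fernet

master_key = Fernet.generate_key()  # long-lived master key
master = Fernet(master_key)

# Many per-dataset data keys, each stored only in wrapped (encrypted) form.
wrapped_keys = []
ciphertexts = []
for record in [b"payroll", b"customer db", b"backups"]:
    data_key = Fernet.generate_key()
    ciphertexts.append(Fernet(data_key).encrypt(record))
    wrapped_keys.append(master.encrypt(data_key))  # wrapped with master key

# An attacker holding only the wrapped keys and ciphertexts learns nothing;
# an attacker holding the master key recovers every data key, hence everything.
for wk, ct in zip(wrapped_keys, ciphertexts):
    stolen_data_key = master.decrypt(wk)
    print(Fernet(stolen_data_key).decrypt(ct))
```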