Saving encryption: Thales on the future of quantum cryptography

Quantum computing promises to break contemporary encryption techniques entirely, but should you really be worried? We spoke to Thales eSecurity's VP of technical strategy, Sol Cates, to find out.

The prospects and business cases for quantum computing - while at times overstated - are undoubtedly fascinating. The theoretical advantages that quantum computing promises for areas such as science, pharmaceuticals and the general enterprise are tremendous, and we are edging ever closer to a quantum computer that provides useful advantages over traditional computing.

Many of tech's biggest names - such as IBM, Google, Microsoft, and Intel - are making big investments in the area, with numerous major governments following suit. The Trump administration recently signed a bill providing $1.2 billion (USD) for quantum research, and similar moves have been made by the European Union - which pledged €1 billion to the cause - and China, which has been investing billions of dollars in the technology for years.

While it must be said that we might still be a good while away from achieving what some organisations are calling the quantum advantage (i.e. developing a quantum computer that is better than traditional computing at a truly useful task), the future potential of these systems is both extremely exciting and somewhat concerning. That is, while quantum technology can be used to achieve a variety of remarkable milestones in science and technology, it can also be used in harmful ways. Perhaps the most pressing issue is the potential of quantum computing to completely disrupt contemporary encryption techniques, putting a massive amount of encrypted data at risk of attack.

Current public-key encryption standards rely on mathematical problems - such as factoring very large numbers - that standard computers cannot solve in any practical timeframe; quantum computers, in principle, could. This raises a whole new order of cybersecurity concern for organisations, which might find swathes of encrypted data cracked open with ease. To address this issue, the National Institute of Standards and Technology (NIST) has set out to find new standards for cryptographic algorithms that would not be vulnerable to attacks from quantum computers. To achieve this, it started an initiative called the Post-Quantum Cryptography Standardization project, which assessed a variety of new cryptographic algorithm submissions designed to be quantum-resistant.

NIST has now winnowed this list of submissions down to a total of 26, with 17 potential algorithms chosen for public-key encryption and nine chosen for digital signatures. These algorithms are diverse in their execution and represent a wide range of mathematical ideas, which NIST says is important in assessing the best possible approach to protecting data. In order to make sense of this list and to ascertain how organisations should approach issues in quantum cryptography, we spoke with data security expert Sol Cates, who serves as Thales eSecurity's VP of Technical Strategy. Cates talks about the true level of risk that quantum computing poses for the security of huge amounts of critical data and what organisations should be doing right now in order to prepare for quantum attacks.

 

Why are current methods of cryptography and data encryption insufficient against attacks from quantum computers?

Quantum computing affects our current public-key cryptography because we have based it on very particular mathematical problems - problems which are very hard for everyday computers to solve, but very easy for (as yet theoretical) quantum computers. However, there are some common misconceptions about the extent of the disruption quantum computers may cause: if and when they become a workable reality, all cryptography will be affected, but to very different degrees. For most current-day public-key cryptography (e.g. RSA and elliptic curve operations), the impact will be so large as to rule out our existing RSA and elliptic curve tools completely. In contrast, for symmetric-key cryptography and hash functions (e.g. AES encryption and SHA-256), the impact will be fairly minimal, with parameter and key size changes being the extent of the damage.
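
As a concrete (and deliberately toy-scale) illustration of the kind of mathematical problem involved, the Python sketch below shows how RSA's security collapses once the public modulus can be factored. The numbers here are tiny and insecure on purpose; real RSA moduli are thousands of bits long precisely because classical computers cannot factor them in any practical timeframe, which is exactly the barrier Shor's algorithm removes.

```python
# Toy illustration: recovering an RSA private key once the modulus is factored.
# Deliberately tiny numbers - real RSA moduli are 2048+ bits, far beyond
# brute-force factoring on classical hardware.

def trial_factor(n):
    """Brute-force factoring; only feasible because n is tiny."""
    for p in range(2, int(n ** 0.5) + 1):
        if n % p == 0:
            return p, n // p
    raise ValueError("no factor found")

# A toy RSA public key (n, e)
n, e = 3233, 17              # n = 53 * 61
p, q = trial_factor(n)

# Knowing p and q lets anyone derive the private exponent d
phi = (p - 1) * (q - 1)
d = pow(e, -1, phi)          # modular inverse (Python 3.8+)

message = 65
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d, n)
print(p, q, d, recovered == message)   # -> 53 61 2753 True
```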

Until the mid-1990s, the security of RSA and elliptic curve-based public-key cryptography was well understood. Even in the present day, with the best algorithms and most efficient computers, we can quite accurately estimate that one would need to use enough energy to boil Lake Geneva (89 cubic kilometres of water) in order to do any one of the following:

  • Solve a 3072-bit RSA problem
  • Crack a 128-bit symmetric AES problem
  • Crack an SHA-256 hash function problem

The quantum computing earthquake occurred in 1994 with the announcement of Shor's algorithm - a method which, given a large enough quantum computer, would allow the 3072-bit RSA problem to be solved with enough energy to boil a few cups of water rather than Lake Geneva. The impact on elliptic curve cryptography will likely be even more severe. To put this into perspective: to make RSA as hard for a quantum computer as it is for a classical computer, the 3072-bit RSA key would need to be replaced by one on the gigabyte scale.

In contrast, for symmetric cryptography (AES, SHA-256, etc.), the impact is relatively minimal. Grover's algorithm is another quantum algorithm, one which can be applied to solve such problems faster than a classical computer can. But while our 3072-bit RSA key needed to grow to the gigabyte scale to defend against Shor's algorithm, a typical 128-bit AES key only needs to double in size to 256 bits (AES-256) to negate the quantum computer's advantage and require it, once again, to boil Lake Geneva.
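
As a rough illustration of the asymmetry described above, the Python sketch below tabulates rule-of-thumb "security bit" estimates - classical versus quantum - for a few common primitives. The figures are the usual ballpark approximations, not precise attack costs.

```python
# Rule-of-thumb comparison of classical vs. quantum attack effort, expressed
# as "security bits" (roughly 2^bits operations). Grover's algorithm halves
# the exponent for symmetric keys and hash preimages; Shor's algorithm reduces
# RSA/ECC to a polynomial-time problem, i.e. effectively zero security bits.
# These are illustrative estimates, not precise figures.

primitives = {
    # name:             (classical security bits, quantum security bits)
    "RSA-3072":         (128, 0),    # broken outright by Shor's algorithm
    "ECC P-256":        (128, 0),    # also broken by Shor's algorithm
    "AES-128":          (128, 64),   # Grover: exponent roughly halved
    "AES-256":          (256, 128),  # doubling the key restores ~128-bit security
    "SHA-256 preimage": (256, 128),  # Grover applies to brute-force preimage search
}

for name, (classical, quantum) in primitives.items():
    print(f"{name:18} classical ~2^{classical:<3} quantum ~2^{quantum}")
```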

 

What data is at risk? And how much?

In the simplest terms, most communications which involve a potentially untrustworthy entity are at risk, while purely "local" uses of encryption are generally safe. Digital signatures, public-key encryption and key exchange are the main vulnerable practices.

HTTPS connections are the most obvious area for concern - digital signatures and key exchange methods are used both to let you agree on a temporary key with https://yourbank.com and to ensure that it really is your bank you're talking to. There are two main ways attacks on HTTPS traffic could be carried out using quantum computers:

  1. Recording present-day traffic and decrypting it ~5-10 years in the future using a quantum computer: a lot of current-day information will still have value to malicious parties 10 years from now. If TLS-secured traffic were recorded today, future quantum computers could be used to crack present-day key exchanges and decrypt the traffic. We can defend against this type of attack with a partial transition to quantum-resistant cryptography today (a minimal sketch of the idea follows this list).
  2. Using a quantum computer to mount a real-time attack against HTTPS. In this setting, if classical key exchange and signature algorithms are still in use by the time large-scale quantum computers are available, it may be possible to mount a man-in-the-middle attack on HTTPS in real-time.
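
One way the first risk is commonly addressed today is "hybrid" key establishment: deriving the session key from both a classical key exchange and a post-quantum key encapsulation mechanism (KEM), so that recorded traffic stays protected unless an attacker can break both. The Python sketch below is illustrative only - it uses the third-party cryptography package for the classical X25519 exchange and a random stand-in value in place of a real post-quantum KEM output, since no standardised post-quantum library is assumed here.

```python
# Minimal sketch of the "hybrid" key-derivation idea: the session key is
# derived from BOTH a classical ECDH secret and a post-quantum KEM secret,
# so recorded traffic stays safe unless an attacker can break both.
# The post-quantum secret here is a random stand-in (os.urandom), since this
# sketch does not assume any particular PQC library.
import os
import hashlib
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Classical X25519 key agreement between two parties
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
classical_secret = client_priv.exchange(server_priv.public_key())

# Stand-in for the shared secret a post-quantum KEM would produce
pq_secret = os.urandom(32)

# Combine both secrets into one session key; recovering the session key
# requires breaking both the classical and the post-quantum component.
session_key = hashlib.sha256(classical_secret + pq_secret).digest()
print(session_key.hex())
```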

 

Cryptocurrencies (the vast majority of them) rely on quantum-vulnerable public-key cryptography. In Bitcoin, for example, the sources and destinations of bitcoins are simply public/private key pairs. If a quantum computer were applied to Bitcoin, forging signatures - and thus making unauthorized (but seemingly genuine) transfers of bitcoins - would be easy, and trust (and hence value) would disappear very rapidly.
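
To make that concrete, the sketch below (a simplified illustration using the third-party cryptography package, ignoring Bitcoin's real address hashing and transaction format) shows that authorising a transfer reduces to producing an ECDSA signature over the secp256k1 curve. The public key needed to verify that signature is exactly the value a quantum attacker would feed to Shor's algorithm to recover the private key and forge further signatures.

```python
# Simplified illustration of why Bitcoin-style signatures are quantum-vulnerable:
# ownership of coins reduces to possession of an ECDSA private key on the
# secp256k1 curve. Anyone who recovers that private key (e.g. with Shor's
# algorithm on a large quantum computer) can sign "genuine-looking" transfers.
# Real Bitcoin address hashing and transaction formats are omitted entirely.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives import hashes

owner_key = ec.generate_private_key(ec.SECP256K1())   # the "wallet"
public_key = owner_key.public_key()                    # visible on the blockchain

transaction = b"pay 1 BTC from address A to address B"
signature = owner_key.sign(transaction, ec.ECDSA(hashes.SHA256()))

# Verification only needs the public key - which is exactly the value a
# quantum attacker would feed to Shor's algorithm to recover owner_key.
public_key.verify(signature, transaction, ec.ECDSA(hashes.SHA256()))
print("signature verified")
```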

VPNs, IPSEC and SSH are all at risk - all rely on legacy key exchange and signature mechanisms to establish trust and confidentiality with a remote entity.

IoT is another area of risk, particularly given the rapid deployment of devices intended to operate in the field for up to 20 years. If devices deployed now rely on quantum-vulnerable crypto with no capacity for algorithm replacement or upgrade, we risk costly and damaging replacement efforts in the future. We could quickly arrive at an internet of (untrusted) things in which quantum-vulnerable signatures and identities can no longer be relied on.
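
One mitigation for this, sketched below purely as an illustration, is to build "crypto-agility" into devices from the start: updates carry an algorithm identifier, and the device dispatches to the matching verifier, so a quantum-resistant scheme can later be added in software rather than by replacing hardware. The algorithm names here are hypothetical, and a symmetric HMAC check stands in for a real firmware-signature scheme to keep the example self-contained.

```python
# Minimal sketch of "crypto-agility" for long-lived IoT devices: firmware
# updates carry an algorithm identifier, and the device dispatches to the
# matching verifier. A quantum-resistant verifier can then be added by a
# (signed) software update instead of replacing hardware.
# Algorithm names and the HMAC stand-in are purely illustrative.
import hmac, hashlib

def verify_hmac_sha256(key: bytes, payload: bytes, tag: bytes) -> bool:
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

# Dispatch table keyed by the algorithm identifier carried in the update header.
# New (e.g. post-quantum signature) verifiers can be registered later.
VERIFIERS = {
    "hmac-sha256": verify_hmac_sha256,
    # "pq-signature-v1": verify_pq_signature,   # added by a future update
}

def verify_update(alg: str, key: bytes, payload: bytes, tag: bytes) -> bool:
    if alg not in VERIFIERS:
        return False          # unknown algorithm: reject rather than guess
    return VERIFIERS[alg](key, payload, tag)

key = b"device-shared-secret"
payload = b"firmware v2.1"
tag = hmac.new(key, payload, hashlib.sha256).digest()
print(verify_update("hmac-sha256", key, payload, tag))   # True
```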

The areas that will be minimally affected:

File and disk encryption, in contrast, are generally not at risk. Likewise, hashing for file and message integrity is largely unaffected. Applications of cryptography such as password storage/verification for logins will be largely unaffected.

Cell phone traffic is the main exception to the pattern that communications are at risk - in 2G/3G/4G/LTE networks, symmetric-key cryptography is used between handsets and infrastructure, with a copy of a long-term symmetric key residing in your SIM card. Symmetric crypto is relied on throughout to provide encryption, integrity and authentication instead of public-key methods (largely for efficiency reasons). While cellular encryption has something of a chequered history for other reasons, it will be largely unaffected (at least at the handset-to-infrastructure level) by the arrival of quantum computing. However, key sizes may need to be increased by a modest factor to defend against Grover's algorithm, and this may require replacement of equipment, SIM cards and so on. Note: communication between base stations and the core network still relies on IPSEC and would likewise be vulnerable to quantum computers.

 

How can new cryptographic algorithms help?

The good news is that, for every mainstream quantum-vulnerable algorithm and use case, several quantum-resistant replacements are waiting "in the wings".

Compared to the millennium bug, the amount of effort required to quantum-proof our communications will be much greater, but if these algorithms can be successfully deployed and integrated into all present-day applications, the arrival of quantum computers will hopefully be something of a non-event. HTTPS, SSH, VPN and IPSEC connections can all be switched over to quantum-proof algorithms, with no outward sign of the different cryptography employed within.

Research into quantum-resistant cryptography has been undertaken seriously since around 2005, with 2012-onwards seeing increasing urgency and effort in this direction. Several quantum-resistant replacements for RSA and elliptic curve based techniques have been devised, with several years of fruitful security analysis and learnings behind us already.

However, the adoption and acceptance of new cryptographic algorithms is a slow and arduous process - the best measure of cryptographic strength remains founded on the amount of effort and time that cryptographic algorithms have resisted expert attack. This process is well underway for quantum-resistant algorithms, with NIST's Post-Quantum Cryptography Standardization effort being, in part, an effort to focus scrutiny on the security of a formalised and narrowed set of candidates.

 

What do you generally look for in a good cryptographic algorithm? What distinguishes the more useful ones from the less useful?

  1. Security - a good theoretical foundation, a track record of withstanding expert attack, and certification by trusted institutes.
  2. Efficiency - implementation platforms can range from 8-bit processors through to high-end servers, so good algorithms can be deployed across heterogeneous devices with strong overall performance characteristics.
  3. Key/signature sizes - key and signature sizes have a direct impact on performance, especially in IoT and lightweight device settings. Minimising them improves efficiency and reduces the amount of secure key storage required.
  4. Side-channel resistance - theoretically secure algorithms, if implemented poorly, can rapidly leak secret keys via side channels. Algorithms designed from the outset to minimise side-channel leakage are easier to implement safely and require fewer mitigations (a simple illustration of this implementation-level concern follows this list).
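
As a simplified illustration of the side-channel point above, compare a naive secret comparison, whose running time leaks how many leading bytes of a guess are correct, with a constant-time comparison such as Python's standard-library hmac.compare_digest.

```python
# Simplified illustration of an implementation-level side channel: a naive
# byte-by-byte comparison returns early on the first mismatch, so its running
# time leaks how many leading bytes of the secret an attacker has guessed
# correctly. A constant-time comparison avoids that timing leak.
import hmac

def naive_equal(secret: bytes, guess: bytes) -> bool:
    if len(secret) != len(guess):
        return False
    for a, b in zip(secret, guess):
        if a != b:
            return False      # early exit: timing depends on the guess
    return True

def constant_time_equal(secret: bytes, guess: bytes) -> bool:
    return hmac.compare_digest(secret, guess)   # stdlib constant-time compare

secret = b"correct horse battery staple"
print(naive_equal(secret, b"correct horse battery stapls"))   # False
print(constant_time_equal(secret, secret))                    # True
```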

 

In terms of the algorithms that NIST has selected for final contention, which are the standouts in each category (public-key encryption and digital signatures) in your view? Why?

In the public-key category, one candidate is a supersingular isogeny scheme - a relatively new category, but one that has already seen experimentation by Microsoft and Cloudflare. We think this is definitely one to watch.

But there were also plenty of surprising inclusions. Classic McEliece, for example, is a code-based scheme that dates back to 1978 but has rarely seen use due to its large key sizes. While it failed to compete against RSA in the late 1970s for that reason, it has successfully withstood theoretical attack for the past 40 years and features among the second-round candidates chosen by NIST. It would be ironic if the McEliece scheme were finally to come into use some 45 years after its invention - had it become the standard public-key algorithm in place of (quantum-vulnerable) RSA in the late 1970s, quantum computing would not be such a threat to modern cryptography.

Once NIST has selected its three winners, is that it? Are we safe from quantum computing attacks on encrypted data?

We've got high hopes for the algorithms which emerge from the NIST effort - having undergone several years of scrutiny and testing by experts, we can have good confidence that they will be the best available protection against quantum computing. The field of cryptanalysis (or code-breaking) moves very quickly, however, with new and improved attacks against the candidate problems emerging regularly. These will continue to improve in the coming years, with knock-on effects for the key sizes and implementation details of the winning algorithms.
