Cyber Centre’s summary review of final candidates for NIST Post‑Quantum Cryptography standards

Introduction

The Canadian Centre for Cyber Security (Cyber Centre), part of the Communications Security Establishment, is Canada’s authority on cyber security. The Cyber Centre is a partner of the National Institute of Standards and Technology (NIST) on the Cryptographic Module Validation Program (CMVP) and recommends the use of NIST cryptographic standards through our guidance publications, such as ITSP.40.111. Cyber Centre cryptographic scientists are focused on evaluating the remaining finalist and alternate Post‑Quantum Cryptography (PQC) candidates, both to inform the candidate selection process and to ensure that the algorithms selected for standardization will continue to provide cryptographic protection for information and systems of importance to Canada and Canadians beyond the advent of quantum computers. The views expressed here are solely those of the Cyber Centre.

Background

In 1994, Peter Shor invented an algorithm which, run on a large quantum computer, could solve the hard mathematical problems underlying modern public‑key cryptography. While large quantum computers do not yet exist, many researchers believe they could become a reality in the coming decades. This has spurred a significant research effort into PQC: cryptography that would remain secure even in a world with large quantum computers.

As part of this effort, in 2016 NIST in the United States launched a program with the goal of standardizing one or more post-quantum cryptographic algorithms (the NIST PQC Standardization Process). The 2 types of algorithms being considered are public‑key encryption schemes (used to encrypt messages or establish shared secret keys) and signature schemes (used to digitally sign messages).

The deadline for initial submissions to the standardization process was November 2017, and in December of that year NIST announced that there were 69 valid submissions. These submissions came from researchers in over 25 countries and used techniques from a number of different mathematical families, including lattices, error‑correcting codes, multivariate equations, hash functions, elliptic curves, and others. This marked the start of the first round of evaluation. In January 2019 the field was cut to 26 candidates for the second round, and in July 2020 the third round officially began.

The third round contains two types of candidates: 7 finalists and 8 alternates. NIST intends that one or more finalists will be chosen and standardized at the end of the third round. The alternates were selected for further study, with the expectation that some of those algorithms may be standardized after a fourth round of evaluation.

While security will always be the most important criterion, algorithm speed and bandwidth requirements are also important factors in the standardization process. At the start of the third round, NIST stated that the most promising family for general-purpose use was lattice‑based cryptography; 5 of the 7 finalists are lattice‑based. NIST would like to standardize (at most) one lattice‑based encryption algorithm and (at most) one lattice‑based signature algorithm at the end of the third round. The other 2 finalists, one code-based and one multivariate‑based, are not suitable for general-purpose use due to their large bandwidth requirements. However, these other finalists (and the alternates) do have advantages of their own. In the next section we will briefly describe each mathematical family represented among the finalist and alternate candidates, and discuss their relative advantages and disadvantages.

Mathematical families

Lattices

Lattice-based cryptography bases its security on mathematical problems about lattices. This has been a very active research area for cryptographers, a fact often credited to Ajtai’s breakthrough ‘worst‑case to average-case’ hardness result in 1996. Twenty‑six of the initial 69 submissions were lattice‑based, as are 5 of the 7 finalists. Several techniques used in submissions to the process may not, at first glance, seem like problems about lattices, but their security proofs and/or best‑known attacks are nonetheless based on the difficulty of lattice problems. All 5 lattice‑based finalists rely on (variants of) either the NTRU problem or the Learning with Errors (LWE) problem, and offer competitive performance in terms of both speed and bandwidth requirements.
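For readers unfamiliar with the term, the following is a brief sketch of the standard definition of a lattice and one canonical hard lattice problem (generic notation, not tied to any particular submission):

```latex
% A lattice generated by n linearly independent basis vectors b_1, ..., b_n
% is the set of all integer linear combinations of those vectors:
\[
  \mathcal{L}(b_1, \dots, b_n) \;=\; \Bigl\{ \textstyle\sum_{i=1}^{n} a_i b_i \;:\; a_i \in \mathbb{Z} \Bigr\}
\]
% A canonical hard lattice problem is the Shortest Vector Problem (SVP):
% given a basis for the lattice, find a shortest non-zero vector in it.
```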

The NTRU problem was first introduced over 20 years ago and is based on factoring a certain type of polynomial into a product of 2 polynomials with special properties. The best‑known attacks against the NTRU problem are lattice‑based, and 2 of the round three candidates rely on the NTRU problem.
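As a rough illustration (this is one common textbook formulation, not the exact parameterization of any round three candidate), the NTRU problem can be stated as follows:

```latex
% One common formulation of the NTRU problem, in the polynomial ring
% R_q = \mathbb{Z}_q[x]/(x^n - 1): the secret is a pair of polynomials
% (f, g) with small coefficients, and the public key is the quotient
\[
  h \;\equiv\; g \cdot f^{-1} \pmod{q} .
\]
% NTRU problem: given h, which looks like a random element of R_q,
% recover small polynomials f and g with f \cdot h \equiv g \pmod{q}.
```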

Similar to the code-based cryptography described in the next section, the LWE problem is a modification of a simple linear algebra problem which is made hard to solve by intentionally adding errors. The original LWE problem requires very large public keys and is much slower than the lattice‑based finalists being considered by NIST. The LWE‑based finalists use modifications of LWE which permit smaller keys and provide very competitive speeds.
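To make the idea concrete, here is a minimal toy sketch of an LWE instance in Python; the parameters are far too small to be secure and are chosen purely for illustration, and the real candidates use structured variants of this problem:

```python
import numpy as np

# Toy LWE parameters (far too small to be secure; for illustration only)
n, m, q = 8, 16, 97          # secret dimension, number of samples, modulus
rng = np.random.default_rng(0)

A = rng.integers(0, q, size=(m, n))   # public random matrix
s = rng.integers(0, q, size=n)        # secret vector
e = rng.integers(-2, 3, size=m)       # small error vector

b = (A @ s + e) % q                   # public LWE samples (A, b)

# Without the errors e, recovering s from (A, b) is simple linear algebra.
# With the errors added, recovering s is the (hard) LWE problem.
```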

Among the remaining candidates, lattice‑based approaches are the only type that offers both encryption and signing, and they are among the fastest in both categories. There are specialized proposals which could provide better performance for specific applications; however, lattice‑based approaches also feature competitive bandwidth requirements, making them strong candidates for general-purpose use.

Error-correcting codes

Code‑based cryptography is based on the theory of error‑correcting codes. Error‑correcting codes are traditionally used to solve the following non‑cryptographic problem in information theory: two parties want to communicate over a noisy channel, which means that when one party sends a message to the other, the message arrives with many errors. Error‑correcting codes are used to remove the errors and recover the original message. In most code‑based cryptography, the key idea is that errors are added intentionally, in such a way that they can only be identified or removed by the party holding the secret key. Anyone without the secret key must solve the hard problem of decoding a corrupted codeword of a random‑looking linear code.
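The following toy Python sketch, using a trivially small made-up code, illustrates the idea of intentionally adding errors to a codeword; it is not the construction used by any actual candidate:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy binary linear code: k-bit messages encoded into n-bit codewords (n > k).
k, n = 4, 8
G = np.hstack([np.eye(k, dtype=int),
               rng.integers(0, 2, size=(k, n - k))])   # generator matrix

msg = rng.integers(0, 2, size=k)     # message
c = (msg @ G) % 2                    # codeword

# Intentionally add a small number of errors (flip 1 bit here).
e = np.zeros(n, dtype=int)
e[rng.integers(0, n)] = 1
y = (c + e) % 2                      # "noisy" word sent to the receiver

# A party who knows the code's hidden structure can correct the errors;
# for an attacker, decoding y for a random-looking code is the hard problem.
```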

Code-based cryptography was first proposed over 40 years ago as the McEliece encryption scheme. There were 17 code‑based encryption schemes and 2 code‑based signatures submitted to the original NIST call for PQC candidates, and as of round three, 3 encryption schemes remain: 1 finalist and 2 alternates. The finalist encryption scheme is a modern McEliece scheme which uses Goppa codes, while the alternate code-based encryption schemes use different code families from the traditional McEliece: one uses Quasi‑Cyclic Moderate Density Parity Check (MDPC) codes and the other uses random Quasi‑Cyclic codes.

The original McEliece scheme suffered from extremely large public keys, a problem which remains one of the main drawbacks of code-based cryptosystems today. Variants using different underlying code families have often been proposed to address this issue, but even in the best cases the public keys of code‑based systems are larger than those of the finalist lattice systems. The main advantages of the McEliece system are the confidence in its security due to its maturity, its very small ciphertexts (smaller than those of all other schemes), and its fast encapsulation and decapsulation.

Systems of Multivariate Quadratic Equations

Multivariate Quadratic (MVQ) schemes are based on solving systems of degree-two equations in 2 or more unknown variables over finite fields. A typical MVQ scheme builds a system in such a way that only the person with the secret key can find the solution. This is done by building equations with special structure and masking them so that they look random to an attacker. There were 7 MVQ signatures and 2 MVQ encryption schemes submitted to round one of the NIST standardization process, and in round three 2 MVQ signatures remain: one as a finalist and the other as an alternate.
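As a toy illustration (coefficients chosen arbitrarily, not taken from any real scheme), an MVQ public key is essentially a system of quadratic equations over a small finite field, and only someone who knows the hidden structure can efficiently produce a solution:

```latex
% A toy system of multivariate quadratic equations over the finite field F_7
% (a real public key has dozens of equations in dozens of variables):
\begin{align*}
  x_1 x_2 + 3x_3^2 + 5x_1 + 2 &\equiv 0 \pmod{7} \\
  2x_1^2 + x_2 x_3 + 4x_2 + 1 &\equiv 0 \pmod{7} \\
  x_1 x_3 + 6x_2^2 + x_3 + 3 &\equiv 0 \pmod{7}
\end{align*}
% Finding a common solution (x_1, x_2, x_3) is feasible for a toy system like
% this one, but is believed to be hard as the number of variables grows.
```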

One advantage of MVQ signature schemes is that they produce very small signatures (in some cases smaller than the RSA signatures used today) and, in the case of the MVQ finalist, provide relatively fast signing. Recent research has advanced the security theory of MVQ schemes, which has required parameter changes in the round three finalist and casts some doubt on the stability of the MVQ hard problems. A disadvantage of MVQ signature schemes is their large public (and sometimes private) keys, which make them unsuitable for general-purpose use.

Hash functions

Hash-based signatures (HBS) rely solely on the security of a cryptographic hash function: a non-reversible function that takes a string of any length as input and produces a fixed-length output, and which, for sufficiently large output sizes, is believed to be secure against quantum computers. The basic idea is that an HBS is a collection of many one-time signature (OTS) schemes (meaning each OTS can only sign a single message); an HBS uses a data structure known as a tree to combine many OTS keys efficiently. To sign a message, an HBS chooses a single OTS from its collection and uses it to sign the message. The critical issue is that the HBS must never choose the same OTS twice, or else security is compromised. The most efficient HBS schemes are stateful, meaning that the signer must remember which OTS keys have already been used. This requires a state management procedure which may not be feasible for some applications. NIST has published its recommendations for stateful hash-based signatures.
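To illustrate the one-time signature building block, here is a minimal Python sketch of a Lamport-style OTS, one classic OTS construction; the actual candidates use more efficient OTS variants and combine many such key pairs in a tree:

```python
import hashlib
import os

HASH_BITS = 256  # sign a 256-bit message digest

def keygen():
    # Secret key: two random values per digest bit; public key: their hashes.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(HASH_BITS)]
    pk = [(hashlib.sha256(a).digest(), hashlib.sha256(b).digest()) for a, b in sk]
    return sk, pk

def _bits(message: bytes):
    digest = hashlib.sha256(message).digest()
    return [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(HASH_BITS)]

def sign(sk, message: bytes):
    # Reveal one secret value per digest bit; this is why each key is one-time only.
    return [sk[i][bit] for i, bit in enumerate(_bits(message))]

def verify(pk, message: bytes, signature) -> bool:
    bits = _bits(message)
    return all(hashlib.sha256(signature[i]).digest() == pk[i][bits[i]]
               for i in range(HASH_BITS))

sk, pk = keygen()
sig = sign(sk, b"example message")
print(verify(pk, b"example message", sig))   # True
```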

Two stateless HBS schemes were submitted to the initial NIST call for PQC candidates, one of which is still being considered as an alternate scheme in round three. The idea is to use a very large tree and to choose each OTS at random; if the tree is large enough, the probability of repeating an OTS can be made very small. Although many improvements to stateless HBS schemes have been made, the signatures are large and the performance is slow compared to other PQC signature candidates. Nonetheless, hash-based signatures offer high confidence in their security and so remain under consideration.

Isogenies of Elliptic Curves

Isogeny-based cryptography uses maps between elliptic curves to build public‑key cryptosystems. The elliptic curves used in traditional public‑key cryptosystems are known to be vulnerable to quantum computers, but isogeny‑based schemes do not use the same arithmetic and are thus believed to be quantum‑resistant. The only isogeny‑based submission to the NIST standardization process is an encryption scheme that remains under consideration in the third round as an alternate candidate. It remains attractive since it has the smallest public keys of any of the remaining candidates and a very small ciphertext size as well. Although it has seen many optimizations, it is still significantly slower than its competitors. As with some of the other alternate candidates, NIST suggested isogeny‑based cryptography could benefit from further research in the third round.
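At a very high level (generic notation, not that of the actual submission), the underlying hard problem can be summarized as follows:

```latex
% An isogeny is a structure-preserving map between two elliptic curves:
\[
  \phi : E_1 \longrightarrow E_2
\]
% Rough statement of the hard problem: given two curves E_1 and E_2 that are
% connected by a secret isogeny \phi, recover \phi (or equivalent secret data).
```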

“Other”

Among the initial 69 submissions, some did not fit perfectly into one of the preceding categories. Examples include rank-metric codes (a variant of coding theory), random walks, and braid groups. Only one remains as an alternate scheme in the third round, a signature scheme that incorporates symmetric cryptography and what is called a ‘zero‑knowledge proof of knowledge’. This scheme has a small public key, but slow signing times and large signatures. One of its main advantages, similar to the ‘hash‑based’ category, is that its security depends only on symmetric cryptography. Interestingly, this scheme is based on a relatively new idea, which means that it has been less studied than schemes from the other families. The advantage of this is that significant improvements have been found during the first two rounds, and NIST is hopeful that further improvements will be found. The disadvantage is that it may take more time until the cryptographic community is confident in its security.

Conclusion

The evaluation of third round NIST PQC candidates is expected to last 12 to 18 months, with the intent that a small number of the finalists will be selected for standardization by early 2022 and that standards will be finalized by 2024. NIST is also considering a fourth round of evaluation for alternate candidates, or for finalists that require more evaluation, which could result in more standards at a later date. While lattices seem best for general-purpose use, the cryptographic community will continue studying the other families, both for their advantages in specific use cases and for diversity in case of advances in attacks against lattices. With the small number of candidates remaining in the third round, NIST is hopeful that the cryptographic research community will be able to evaluate each candidate in more detail in areas such as side‑channel resistance, performance in internet protocols, and performance in hardware implementations, along with continued rigorous security analysis.

For more in-depth information about NIST PQC standardization, we encourage you to visit NIST’s official Post-Quantum Cryptography website or read the NIST second round status report (NISTIR 8309, available at the official website), both of which were heavily consulted in the writing of this blog post. The Cyber Centre strongly advises consumers to wait until the standards for post‑quantum public-key encryption and signature schemes are finalized before using any PQC candidate to protect data or systems. Right now, the most important step is to plan for a cryptographic migration by identifying and preparing systems where cryptography will need to be quantum resistant.
