A report from the Fifth PQC Standardization Conference
In light of advances in building large-scale quantum computers and the threat they pose to existing public-key cryptosystems, NIST is running multiple PQC standardization processes to standardize next-generation, quantum-safe cryptographic algorithms. Submissions fall into two categories: post-quantum digital signature algorithms (PQC-DSA) and post-quantum key encapsulation mechanisms (PQC-KEM).
NIST’s first post-quantum standards are scheduled to drop in ‘summer 2024’ (though everyone seems to agree that means late summer at best, and more likely September 2024). These include:
FIPS 203: ML-KEM - a key encapsulation scheme that allows PQ key agreement based on lattices
FIPS 204: ML-DSA - a lattice-based generic digital signature scheme that is suitable for basically any application
FIPS 205: SLH-DSA - a hash-based digital signature scheme whose security assumptions have many years of trust, but which has very large signatures and slow operations; it is thus only really recommended for offline use cases such as firmware signing (vs. online protocols like TLS)
After those three comes another generic signature algorithm, FN-DSA, also based on lattices, scheduled to have its first draft standard in Fall 2024; it will then take possibly another year to go from draft to full standard (the FIPS documents above took that long). This scheme is possibly more attractive than ML-DSA because of its slightly smaller keys and signatures, but there are concerns that it is harder to implement securely (i.e., protected against side-channel attacks), and it may not see traction until the cryptography engineering community is more confident on this point. The fourth-round KEM candidates BIKE, HQC, and Classic McEliece are continuing on so that another KEM not based on lattices can be selected, but even the smallest candidates are not much smaller or faster than ML-KEM. It will be several years before standards for any of these selections are available.
Meanwhile, there is a signature ‘on-ramp’ standardization process to find additional PQ signature schemes that are not based on lattices and are smaller and more performant than the current batch. The signatures of all the current selections are over ~10x larger than those in a fresh TLS 1.3 connection, not counting the additional handshake bytes that ML-KEM adds, and the Web PKI cannot absorb that without dropped connections and severe degradation of latency and time-to-first-paint. Without smaller signatures, adopting PQ signatures will require significant restructuring of the Web PKI and CA ecosystem. Therefore, there is significant interest in the PQ signature on-ramp. Still, it will take several more years before selections are made and draft standards are available, plus another year between draft standard and full standard.
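The size gap is easy to see with a back-of-envelope calculation. The sizes below are approximate published figures (a DER-encoded ECDSA P-256 signature is ~71 bytes; an ML-DSA-44 signature is 2420 bytes per the FIPS 204 draft), and the assumption of five signatures per handshake (leaf and intermediate certificate signatures, two embedded SCTs, and CertificateVerify) is an illustrative count, not a measurement.

```python
# Back-of-envelope: signature bytes carried by a typical TLS 1.3 handshake,
# classical vs. post-quantum. Sizes are approximate; the per-handshake
# signature count of five is an illustrative assumption.
ECDSA_P256_SIG = 71        # typical DER-encoded ECDSA P-256 signature, bytes
ML_DSA_44_SIG = 2420       # ML-DSA-44 signature size, bytes (FIPS 204 draft)
SIGS_PER_HANDSHAKE = 5     # assumed: 2 cert sigs + 2 SCTs + CertificateVerify

classical = SIGS_PER_HANDSHAKE * ECDSA_P256_SIG
post_quantum = SIGS_PER_HANDSHAKE * ML_DSA_44_SIG

print(f"classical: {classical} B, post-quantum: {post_quantum} B, "
      f"ratio: {post_quantum / classical:.1f}x")
```

Even under these rough assumptions the per-handshake signature payload grows by well over an order of magnitude, which is why the on-ramp's focus on smaller signatures matters so much for the Web PKI.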
NIST’s Fifth PQC Standardization Conference
Day 1 provided updates from NIST on the progress of the PQC project in general, including the fourth-round KEM candidates BIKE, HQC, and Classic McEliece, plus a progress report on the upcoming FN-DSA signature scheme, previously known as Falcon. The FN-DSA draft standard is scheduled to land later in 2024, while the other selections (ML-KEM, ML-DSA, and SLH-DSA) are scheduled to land as final FIPS standards in late summer 2024. There were also research updates on how to implement some of these schemes in side-channel-resistant ways.
Sometime during the first day, a preprint describing an efficient quantum attack against the general learning-with-errors (LWE) problem landed on ePrint and became the talk of the hallways for the remaining two days. The paper turned out not to apply to the parameters used in ML-KEM (née Kyber) or ML-DSA (née Dilithium), but the long-term impact of the attack on other LWE-based or more exotic lattice-based schemes seemed to put a lot of work up in the air. Luckily, a week later, two independent researchers found a fundamental bug in the algorithm, and the attack paper was withdrawn. General LWE-based cryptography lives, but the ripple effects of some of the novel techniques presented by Yilei Chen remain to be seen.
Day 2 provided specific updates from NIST on each of the FIPS documents that have initial public drafts: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), and FIPS 205 (SLH-DSA). These included explicitly planned changes to the ML-KEM API: a public randomized API, which sources its own randomness, and a private deterministic API, to which the randomness and other arguments are passed explicitly. This aligns with general industry practice when implementing cryptographic algorithms, since it enables the testability and reproducibility needed to ensure correctness. It also allows key material to be stored as a root seed from which everything else is derived; this is apparently already FIPS-compatible today, but should be made more explicit in FIPS 203.
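The split described above can be sketched as a deterministic core wrapped by a randomized public entry point. The function names and the SHAKE-based key derivation below are illustrative stand-ins for the real FIPS 203 algorithm; only the structure of the API is the point.

```python
import hashlib
import os

def keygen_derand(seed: bytes):
    # Private deterministic API: all randomness is passed in explicitly,
    # so the function can be checked against known-answer test vectors.
    # Everything is derived from the root seed, so storing the seed alone
    # is enough to reconstruct the full keypair later.
    assert len(seed) == 64
    stream = hashlib.shake_256(seed).digest(96)   # stand-in key derivation
    decaps_key, encaps_key = stream[:64], stream[64:]
    return encaps_key, decaps_key

def keygen():
    # Public randomized API: sources its own randomness internally.
    return keygen_derand(os.urandom(64))

# Reproducibility: the same seed always yields the same keypair.
seed = bytes(64)
assert keygen_derand(seed) == keygen_derand(seed)
```

This mirrors how many production libraries already structure their implementations: the randomized wrapper is what applications call, while the deterministic core is what test harnesses and FIPS validation exercise.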
Research updates on SLH-DSA (née SPHINCS+) parameter sets were presented, including smaller parameters that would make signing more efficient but allow fewer signatures per keypair, as well as work on accelerating the scheme via efficient hardware implementations. There were also several presentations on threshold PQ signatures, based on MPC-in-the-head and on applying recent techniques from classical Schnorr signatures to lattice-based schemes. While these schemes don’t exactly match the signature schemes chosen for FIPS standards this year, the signature on-ramp continues and suggests there may be future standardized PQ signature schemes that support both single-signer and threshold signing modes. The remainder of day 2 covered updated cryptanalysis of the signature schemes Biscuit, AIM, ALTEQ, MEDS, SNOVA, VOX, and UOV.
The cryptanalysis talks were all on first-round on-ramp signature schemes. Curiously, the majority of the talks targeted multivariate polynomial schemes with structure. Among them, “Preliminary Cryptanalysis of the Biscuit Signature Scheme” showed how structures special to the polynomial systems underlying Biscuit can be exploited to solve the polynomial system faster than the designers claimed. Consequently, the parameters chosen by Biscuit need to be increased. There were a few talks analyzing schemes (including MAYO, QR-UOV, VOX, and SNOVA) that broadly fall under the “Oil and Vinegar” motif. Again, the techniques exploited structures special to the underlying polynomial systems and highlighted the need to increase the parameters. One exception to the talks on multivariate polynomial systems, “Finding isomorphism between trilinear forms, slightly faster,” described cryptanalytic algorithms for the tensor isomorphism problem, whose hardness underlies the MEDS signature scheme. The parameters of MEDS were found inadequate to meet the security requirements, but this is easily remedied by increasing the size of the underlying prime power. Notably absent was any cryptanalysis of lattice- or code-based systems.
Sandbox’s own Carlos Aguilar Melchor was part of a panel discussing the candidacy of the three remaining fourth-round KEM schemes. All three follow a similar framework and rely on code-based hardness assumptions, in particular variants of syndrome decoding. The primary distinction is that Classic McEliece instantiates the McEliece cryptosystem using Goppa codes, just as McEliece originally did. In contrast, BIKE and HQC instantiate it using quasi-cyclic moderate-density parity-check codes and a combination of polynomial and quasi-cyclic codes, respectively. Goppa-code-based McEliece cryptosystems have withstood over four decades of cryptanalytic attempts, which a panelist reiterated, suggesting it is perhaps the most conservative choice in terms of security. BIKE and HQC involve codes with structure that allows for better performance, such as smaller public key sizes. The panelists addressed questions and defended the three schemes against commonly raised issues while trying to highlight merits that distinguished one scheme from the others. NIST hinted that perhaps only one or two of these three candidates might be selected going forward.
Day 3 kicked off with a collection of research on optimized implementations of MAYO, SDitH hardware implementations, benchmarking of PQ signatures on microcontrollers, multiplication accelerators for NTRU-based schemes, estimating decoding failures in code-based schemes, multi-recipient KEMs, and a practical cost estimate of Grover’s algorithm against AES (hint: it’s impractical), and it ended with two panels: one on NIST standards readiness in general, and one on signature pre-hashing. Much of this work will inform future implementations of PQ schemes, which in turn will influence the specific parameters and specifications of schemes as the signature on-ramp continues.
Spotlight: SDitH
We would like to highlight a talk by Sanjay Deshpande, a PhD student at Yale. The talk was based on joint work with James Howe and Dongze (Steven) Yue of SandboxAQ and Jakub Szefer of Yale, carried out as part of Sanjay’s residency at SandboxAQ. The presentation covered a lightweight hardware implementation (https://eprint.iacr.org/2024/069) of the hypercube SDitH (Syndrome Decoding in the Head) signature scheme (https://csrc.nist.gov/Presentations/2024/sdith-in-hardware), encompassing all the parameter choices. The hardware designs are specification-compliant, constant-time (to thwart side-channel attacks), and can be tailored to the security level (λ), syndrome decoding field size (q), share-splitting size (d), repetition rate (τ), and random evaluation points (t) parameters. By splitting the protocol into an offline and an online phase, clock cycles are reduced by 27–33% at an additional cost of 30–60% in BRAMs. Many of the sub-modules designed could also be useful beyond SDitH in other MPC-in-the-Head schemes, particularly those that employ the hypercube optimization, many of which appear in NIST’s ‘on-ramp’ PQC process.
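To make the tunability concrete, the parameter knobs the talk lists can be gathered into a single configuration record, which is roughly what a parameterizable hardware generator would consume. The field names mirror the parameters named above; the example values are placeholders for illustration only, not the scheme’s actual parameter sets.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SDitHParams:
    """Knobs the SDitH hardware design can be tailored to (per the talk).

    Values used below are illustrative placeholders, not real parameters.
    """
    security_level: int   # lambda, target security in bits
    field_size: int       # q, syndrome decoding field size
    split_size: int       # d, hypercube share-splitting size
    repetitions: int      # tau, repetition rate
    eval_points: int      # t, number of random evaluation points

# A hypothetical low-security configuration for illustration.
params = SDitHParams(security_level=128, field_size=256,
                     split_size=8, repetitions=17, eval_points=3)
```

Capturing the design space this way is what lets one bitstream generator cover “all the parameter choices,” as the talk describes.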