The Quantum Frontier (Part 3): The Quantum Shield: How NIST and Post-Quantum Cryptography (PQC) Will Save Your Data

By Ryan Wentzel
10 Min. Read
Tags: Quantum, AI, PQC, Post-Quantum Cryptography, NIST

Fighting Math with... Better Math

In Part 2 of this series, we established that Shor's algorithm will eventually break RSA and ECC. A natural first reaction might be: why not just use longer keys? If a 2,048-bit RSA key is vulnerable, why not use a 4,096-bit key, or a 16,384-bit key, or something even larger?

The answer reveals why post-quantum cryptography requires fundamentally new mathematical foundations rather than incremental improvements to existing ones.

Why Longer Keys Cannot Save RSA and ECC

Shor's algorithm runs in polynomial time, specifically O(n^3) for an n-bit integer. Doubling the key length does not double the difficulty; it merely increases the quantum computation time by a factor of eight. To make RSA resistant to Shor's algorithm, you would need keys so large, potentially millions of bits, that the resulting cryptosystem would be impractical. TLS handshakes would take minutes. Certificates would be megabytes. Embedded devices could not store or process the keys. The performance penalty would effectively break the internet.
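The asymmetry is easy to see numerically. The sketch below is a back-of-envelope comparison using the standard heuristic cost formula for the General Number Field Sieve (the best classical factoring algorithm) against Shor's roughly cubic scaling; the exact constants do not matter, only the shape of the curves:

```python
# Back-of-envelope scaling: classical GNFS factoring cost vs. the
# ~n^3 gate count of Shor's algorithm, relative to RSA-2048.
import math

def gnfs_cost(bits):
    # Heuristic L-notation complexity of the General Number Field Sieve.
    ln_n = bits * math.log(2)
    return math.exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3)
                    * math.log(ln_n) ** (2 / 3))

def shor_cost(bits):
    # Shor's algorithm scales polynomially, roughly n^3.
    return bits ** 3

for bits in (2048, 4096, 16384):
    print(f"{bits:>6}-bit RSA:"
          f" classical x{gnfs_cost(bits) / gnfs_cost(2048):.1e},"
          f" quantum x{shor_cost(bits) / shor_cost(2048):.0f}")
```

Doubling the key multiplies the quantum cost by just 8, while the classical cost explodes: longer keys punish legitimate users far more than they punish a quantum attacker.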

ECC faces an even steeper problem. Shor's algorithm for the elliptic curve discrete logarithm problem is particularly efficient, requiring fewer logical qubits than factoring an RSA modulus of equivalent classical strength. Increasing the key size raises the quantum attack cost only polynomially, so there is no key size that makes ECC both secure against quantum attacks and practically usable.

The only viable path forward is to replace the underlying mathematical problems entirely, swapping factoring and discrete logarithms for problems that are believed to be hard for both classical and quantum computers. This is the core idea behind post-quantum cryptography (PQC): fight math with better math.

The Candidate Problem Families

Researchers identified several families of mathematical problems that resist known quantum algorithms:

  • Lattice-based problems: Finding short or close vectors in high-dimensional lattices. No known quantum algorithm provides a significant speedup.
  • Hash-based signatures: Security based entirely on the properties of cryptographic hash functions, which Grover's algorithm can attack but only with a quadratic speedup (addressed by doubling hash output sizes).
  • Code-based cryptography: Decoding random linear codes (McEliece cryptosystem, proposed in 1978 and still unbroken).
  • Multivariate polynomial cryptography: Solving systems of multivariate quadratic equations over finite fields.
  • Isogeny-based cryptography: Computing isogenies between elliptic curves (though SIKE, a prominent candidate, was spectacularly broken in 2022 by a classical attack).

Each family has different trade-offs in key sizes, computational performance, bandwidth requirements, and maturity of security analysis. The question was: which should become the new standards?

The Great Crypto "Bake-Off": The NIST PQC Competition

In 2016, the National Institute of Standards and Technology (NIST) launched a public competition to evaluate and standardize post-quantum cryptographic algorithms. The process was deliberately modeled after NIST's previous successful standardization efforts, particularly the AES competition of the late 1990s.

The Competition Timeline

The scale and rigor of the NIST PQC competition were unprecedented:

2016 (Call for Proposals): NIST published its call, requesting submissions for quantum-resistant key encapsulation mechanisms (KEMs) and digital signature schemes. The requirements specified security levels aligned with AES-128, AES-192, and AES-256.

2017 (82 Submissions): By the November 2017 deadline, NIST received 82 complete submissions from research teams around the world, spanning all major PQC families. Each submission included a detailed specification, reference implementation, security analysis, and performance benchmarks.

2019 (Round 2 - 26 Candidates): After extensive public review, NIST narrowed the field to 26 second-round candidates. Several submissions were broken or found to have significant weaknesses during the review period.

2020 (Round 3 - 15 Candidates): The field narrowed further to 7 finalists and 8 alternate candidates. This round involved intense scrutiny from the global cryptographic research community, with hundreds of published papers analyzing the security of the remaining candidates.

2022 (Winners Announced): NIST announced the first group of algorithms selected for standardization: CRYSTALS-Kyber for key encapsulation, and CRYSTALS-Dilithium, FALCON, and SPHINCS+ for digital signatures.

2024 (FIPS Standards Published): NIST published the final standards as Federal Information Processing Standards: FIPS 203 (ML-KEM, based on Kyber), FIPS 204 (ML-DSA, based on Dilithium), and FIPS 205 (SLH-DSA, based on SPHINCS+). FALCON was deferred to a later publication due to implementation complexity.

The entire process took eight years from call to published standards, a timeline that underscores both the difficulty of the problem and the thoroughness of the evaluation.

The Winning Algorithms

CRYSTALS-Kyber / ML-KEM (FIPS 203): The primary key encapsulation mechanism. Kyber is a lattice-based scheme built on the Module Learning With Errors (MLWE) problem. It provides key establishment, the quantum-resistant replacement for RSA key exchange and ECDH. Kyber was selected for its combination of strong security margins, compact key and ciphertext sizes, and excellent performance across a wide range of platforms.

CRYSTALS-Dilithium / ML-DSA (FIPS 204): The primary digital signature scheme. Also lattice-based, built on the Module Learning With Errors and Module Short Integer Solution (MSIS) problems. Dilithium replaces RSA signatures, ECDSA, and EdDSA. It was chosen for its simplicity of implementation (reducing the risk of side-channel vulnerabilities) and balanced performance characteristics.

SPHINCS+ / SLH-DSA (FIPS 205): A hash-based signature scheme included as a conservative backup. Its security relies only on the properties of hash functions, making it the most conservative choice. The trade-off is significantly larger signatures (up to roughly 50KB, compared to about 2.4-3.3KB for Dilithium, depending on parameter set). SPHINCS+ exists as an insurance policy: if a breakthrough attack is found against lattice-based schemes, hash-based signatures provide a fallback whose security assumptions are minimal.

FALCON: A lattice-based signature scheme using NTRU lattices and fast Fourier sampling. FALCON produces the most compact signatures among the winners but is significantly more complex to implement correctly, particularly the floating-point arithmetic in key generation. NIST deferred its standardization to allow more time for implementation guidance. It is expected to be published as FIPS 206.

Meet the New Standards: What is Lattice-Based Cryptography?

Three of the four winning algorithms are lattice-based, making lattice cryptography the cornerstone of the post-quantum transition. Understanding the underlying mathematics, at least at a conceptual level, is important for any technical leader navigating this migration.

Lattices and Hard Problems

A lattice is a regular, repeating grid of points in multi-dimensional space. In two dimensions, think of a sheet of graph paper: the intersections form a lattice. In the cryptographic context, lattices exist in hundreds or thousands of dimensions, where geometric intuition breaks down entirely.

Two foundational hard problems arise in lattice cryptography:

The Shortest Vector Problem (SVP): Given a lattice, find the shortest non-zero vector. In two dimensions, this is trivial. In hundreds of dimensions, the best known algorithms (both classical and quantum) run in exponential time relative to the dimension. The LLL algorithm and its descendants can find approximately short vectors, but finding the actual shortest vector remains intractable in high dimensions.
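A toy example makes the dimension dependence concrete. This sketch brute-forces the shortest vector of a deliberately skewed two-dimensional lattice (the basis vectors are chosen purely for illustration); the same coefficient search is exponentially out of reach in the hundreds of dimensions real schemes use:

```python
# Brute-force SVP in 2-D: enumerate small integer combinations of the
# basis vectors and keep the shortest nonzero result. Feasible here,
# hopeless in high dimensions.
import itertools, math

b1, b2 = (201, 37), (1648, 297)   # a skewed (non-reduced) basis
best = None
for x, y in itertools.product(range(-50, 51), repeat=2):
    if (x, y) == (0, 0):
        continue
    v = (x * b1[0] + y * b2[0], x * b1[1] + y * b2[1])
    if best is None or math.hypot(*v) < math.hypot(*best):
        best = v
print(best)   # far shorter than either basis vector
```

Note how the short vector is hidden: neither basis vector is anywhere near it, and only the right combination of coefficients reveals it.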

The Learning With Errors Problem (LWE): Given a system of approximate linear equations over a finite field (where each equation has a small random error added), recover the secret vector. Without the errors, this is simple linear algebra. With errors, the problem becomes as hard as worst-case lattice problems, a result proven by Oded Regev in 2005 in a landmark paper that earned him the 2018 Gödel Prize.
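A tiny numerical instance (toy parameters, nothing like production sizes) shows what "approximate linear equations" means: the secret satisfies every equation only up to a small residual error:

```python
# A toy LWE instance: b = A*s + e (mod q) with a small error vector e.
# Without e this is ordinary linear algebra; with e, recovering s is hard.
import random
random.seed(7)

q, n, m = 97, 4, 8                                   # toy parameters
s = [random.randrange(q) for _ in range(n)]          # secret vector
A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
e = [random.choice([-1, 0, 1]) for _ in range(m)]    # small errors
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(m)]

# The secret fits each public equation only approximately:
residuals = [(b[i] - sum(A[i][j] * s[j] for j in range(n))) % q
             for i in range(m)]
print(residuals)   # each entry is e[i] mod q, i.e. 0, 1, or q-1
```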

The structured variant used in the NIST standards, Module-LWE, works over polynomial rings rather than plain integers. This structure reduces key sizes and improves performance while maintaining the security reduction to hard lattice problems.

How Kyber Works (Conceptual Overview)

Kyber's key encapsulation works roughly as follows: the public key encodes a noisy system of linear equations whose solution is the private key. To encapsulate a shared secret, the sender creates a new noisy system related to the public key and derives a shared key from it. The receiver, knowing the private key, can "cancel out" the noise and recover the same shared key. An eavesdropper, lacking the private key, faces the full Module-LWE problem and cannot extract the shared secret.
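The idea can be demonstrated end to end with a toy single-bit LWE scheme. This is illustrative only: real ML-KEM works over module lattices with degree-256 polynomials and carefully calibrated noise, but the cancel-the-noise mechanic is the same:

```python
# Toy LWE encryption in the spirit of Kyber: the public key is a noisy
# linear system, and the private key cancels the structure so only
# small noise remains around the encoded bit.
import random
random.seed(1)

q, n = 3329, 16      # toy dimension (q = 3329 is also Kyber's modulus)

s = [random.randint(-2, 2) for _ in range(n)]              # small secret
A = [[random.randrange(q) for _ in range(n)] for _ in range(n)]
e = [random.randint(-2, 2) for _ in range(n)]              # small errors
b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q for i in range(n)]

def encrypt(bit):
    # Fresh noisy combination of the public equations; bit encoded near q/2.
    r = [random.randint(0, 1) for _ in range(n)]
    u = [sum(r[i] * A[i][j] for i in range(n)) % q for j in range(n)]
    v = (sum(r[i] * b[i] for i in range(n)) + bit * (q // 2)) % q
    return u, v

def decrypt(u, v):
    # v - u.s = r.e + bit*(q//2); the noise r.e is small, so round.
    d = (v - sum(u[j] * s[j] for j in range(n))) % q
    return 1 if q // 4 < d < 3 * q // 4 else 0

bits = [1, 0, 1, 1, 0, 0, 1, 0]
assert [decrypt(*encrypt(m)) for m in bits] == bits
```

An eavesdropper sees only (u, v), which is itself another noisy linear system; without s there is no way to separate the noise from the encoded bit.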

Key Size and Performance Tradeoffs

The most significant practical difference between PQC algorithms and their classical predecessors is key and signature sizes:

Algorithm               Public Key     Ciphertext/Signature   Security Level
RSA-2048                256 bytes      256 bytes              ~112-bit (classical)
X25519 (ECDH)           32 bytes       32 bytes               ~128-bit (classical)
ML-KEM-768 (Kyber)      1,184 bytes    1,088 bytes            NIST Level 3 (~AES-192)
ML-DSA-65 (Dilithium)   1,952 bytes    3,293 bytes            NIST Level 3 (~AES-192)
SLH-DSA (SPHINCS+)      32-64 bytes    7,856-49,856 bytes     NIST Levels 1-5

Kyber's keys are roughly 4-5 times larger than RSA keys and 37 times larger than X25519 keys. Dilithium signatures are about 13 times larger than RSA signatures and roughly 50 times larger than ECDSA's 64-byte signatures. These are meaningful increases that affect bandwidth, storage, and handshake latency, particularly for constrained environments like IoT devices and satellite communications.

However, the computational performance of lattice-based schemes is competitive with or faster than RSA. Kyber key generation and encapsulation are faster than RSA key generation. The performance impact on TLS handshakes has been measured at roughly 10-15% additional latency in most scenarios, which is acceptable for the vast majority of applications.

The Migration Has Begun

Standards are published. The algorithms are ready. Now comes the hardest part: migrating the world's cryptographic infrastructure.

Step 1: Cryptographic Inventory

The first step in any PQC migration is understanding what you have. Most organizations have no comprehensive inventory of where and how they use cryptography. This includes TLS certificates, VPN configurations, code signing keys, database encryption, email encryption (S/MIME, PGP), SSH keys, API authentication tokens, and dozens of other applications.

A cryptographic inventory must catalog every use of public-key cryptography, the algorithms and key sizes in use, the data protection requirements (how long must this data remain confidential?), and the dependencies between systems. This is not a trivial exercise; large enterprises may have thousands of distinct cryptographic touchpoints.
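A minimal sketch of what one inventory record might look like (the field names and the vulnerable-algorithm list are illustrative, not drawn from any standard schema):

```python
# Toy cryptographic-inventory record with a flag for algorithms that
# Shor's algorithm breaks and that therefore need PQC migration.
from dataclasses import dataclass

QUANTUM_VULNERABLE = {"RSA", "DSA", "ECDSA", "ECDH", "EdDSA"}

@dataclass
class CryptoAsset:
    system: str                 # e.g. "payments-api"
    usage: str                  # e.g. "TLS server certificate"
    algorithm: str
    key_bits: int
    data_lifetime_years: float  # how long the data must stay confidential

    @property
    def needs_pqc_migration(self) -> bool:
        return self.algorithm in QUANTUM_VULNERABLE

inventory = [
    CryptoAsset("payments-api", "TLS server certificate", "RSA", 2048, 7),
    CryptoAsset("build-pipeline", "code signing", "ECDSA", 256, 15),
    CryptoAsset("internal-wiki", "TLS key exchange", "ML-KEM", 768, 1),
]
todo = [a.system for a in inventory if a.needs_pqc_migration]
print(todo)   # ['payments-api', 'build-pipeline']
```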

Step 2: Prioritization Based on Risk

Not all systems need to migrate simultaneously. Prioritization should follow the HNDL risk model discussed in Part 2. Systems protecting data with long confidentiality requirements should migrate first. This typically means VPN and TLS infrastructure protecting classified or regulated data, then certificate authorities and code signing, then general-purpose web traffic.
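This prioritization is often framed as Mosca's inequality: if the data's required shelf life x plus the migration time y exceeds the estimated years z until a cryptographically relevant quantum computer, the data is already exposed. A small triage sketch (system names and numbers are illustrative):

```python
# Mosca's inequality as a triage rule: x + y > z means the data is
# already exposed to harvest-now-decrypt-later.
def at_hndl_risk(shelf_life_years, migration_years, crqc_eta_years):
    return shelf_life_years + migration_years > crqc_eta_years

CRQC_ETA = 12   # illustrative estimate of years until a CRQC, not a prediction
systems = {
    "medical-records-vpn": (25, 3),   # (shelf life, migration time)
    "session-tokens": (0.1, 1),
    "code-signing-ca": (10, 4),
}
for name, (x, y) in systems.items():
    verdict = "migrate first" if at_hndl_risk(x, y, CRQC_ETA) else "can wait"
    print(f"{name}: {verdict}")
```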

Step 3: Hybrid Mode Deployment

The recommended migration strategy is hybrid mode: running both classical and post-quantum algorithms simultaneously. A hybrid TLS handshake, for example, performs both an ECDH key exchange and a Kyber key encapsulation, combining both shared secrets. This ensures that if the PQC algorithm is later found to be flawed, the classical algorithm still provides protection (assuming a CRQC has not yet arrived). Conversely, if a CRQC arrives, the PQC algorithm provides protection even though the classical algorithm is broken.
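The combination step can be sketched with a standard HKDF construction. The secrets below are placeholder byte strings; real deployments derive them from actual X25519 and ML-KEM operations, and protocols differ in exactly how the inputs are framed:

```python
# Hybrid secret combination: concatenate the classical and post-quantum
# shared secrets and run them through HKDF (RFC 5869 style), so the
# session key is secure as long as either input secret is.
import hashlib, hmac

def hkdf(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()        # extract
    okm, block, counter = b"", b"", 1
    while len(okm) < length:                                  # expand
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

ecdh_secret = b"\x01" * 32    # stand-in for an X25519 shared secret
kyber_secret = b"\x02" * 32   # stand-in for an ML-KEM shared secret
session_key = hkdf(b"hybrid-handshake", ecdh_secret + kyber_secret,
                   b"key expansion")
print(session_key.hex())
```

Because both secrets feed the key derivation, an attacker must break both the classical and the post-quantum component to recover the session key.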

Google and Cloudflare began deploying hybrid key exchange (X25519 + Kyber) in Chrome and at the CDN edge as early as 2023. AWS, Apple, and Signal have all announced or deployed PQC support in their products and protocols.

Step 4: Crypto-Agility and Abstraction Layers

Perhaps the most important long-term lesson from the PQC transition is the need for crypto-agility: the ability to swap cryptographic algorithms without redesigning entire systems. Organizations should abstract their cryptographic operations behind well-defined interfaces so that the next algorithm migration (and there will be a next one) does not require another decade-long effort.

This means avoiding hard-coded algorithm references, using cryptographic libraries that support algorithm negotiation, and designing protocols with version flexibility built in. Practically, crypto-agility requires centralizing cryptographic configuration, using libraries like liboqs or AWS's s2n-tls that support multiple algorithm families, and building automated testing infrastructure that can validate new algorithms across the full application stack before deployment. Organizations that treat this migration as a one-time project rather than an ongoing capability will find themselves repeating the same painful process the next time standards evolve.
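One way to sketch such an abstraction layer (the interface and toy implementation are illustrative; a real registry would wrap vetted library code, never homemade primitives):

```python
# Crypto-agility sketch: callers ask a registry for "the current KEM"
# instead of hard-coding an algorithm, so swapping algorithms becomes
# a one-line configuration change.
from abc import ABC, abstractmethod
import os

class KEM(ABC):
    @abstractmethod
    def generate_keypair(self): ...
    @abstractmethod
    def encapsulate(self, public_key): ...
    @abstractmethod
    def decapsulate(self, private_key, ciphertext): ...

class ToyKEM(KEM):
    """Placeholder with no security; real entries would wrap X25519,
    ML-KEM, or a hybrid of both."""
    def generate_keypair(self):
        sk = os.urandom(32)
        return sk, sk                      # toy: public key == secret key
    def encapsulate(self, public_key):
        ss = os.urandom(32)                # fresh shared secret
        ct = bytes(a ^ b for a, b in zip(ss, public_key))
        return ct, ss
    def decapsulate(self, private_key, ciphertext):
        return bytes(a ^ b for a, b in zip(ciphertext, private_key))

REGISTRY = {"toy-kem": ToyKEM}
DEFAULT_KEM = "toy-kem"                    # the single knob to change later

kem = REGISTRY[DEFAULT_KEM]()
sk, pk = kem.generate_keypair()
ct, ss_sender = kem.encapsulate(pk)
assert kem.decapsulate(sk, ct) == ss_sender
```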

Industry Adoption Status

As of early 2025, adoption is accelerating:

  • Browsers: Chrome, Firefox, and Safari support hybrid PQC key exchange.
  • Cloud providers: AWS KMS, Google Cloud, and Azure have begun offering PQC options.
  • Messaging: Signal deployed the PQXDH protocol with Kyber in 2023. Apple announced PQ3 for iMessage in 2024.
  • Libraries: BoringSSL, OpenSSL 3.x, and liboqs provide implementation support.
  • Government mandates: NSM-10 requires U.S. federal agencies to complete cryptographic inventories and begin migration. CNSA 2.0 provides specific timelines for NSS (National Security Systems).

Conclusion

The shield exists. NIST's rigorous eight-year competition has produced standardized, peer-reviewed, quantum-resistant algorithms ready for deployment. The mathematical foundations (lattice problems, hash functions, and structured codes) have withstood years of intense analysis from the global cryptographic community.

Now begins the engineering migration to deploy it. This is not a future concern; it is an active, ongoing effort. Organizations that begin their cryptographic inventory and migration planning today will be positioned to protect their data against quantum threats. Those that wait risk finding themselves on the wrong side of the HNDL timeline.

In Part 4, we shift from defense to offense: exploring how quantum computing will revolutionize drug discovery and materials science, transforming R&D from a laboratory-first discipline into a simulation-first revolution.
