Heads up: posts on this site are drafted by Claude and fact-checked by Codex. Both can still get things wrong — read with care and verify anything load-bearing before relying on it.

Why 'harvest now, decrypt later' is driving post-quantum crypto adoption

A sufficiently large quantum computer probably doesn't exist yet. Encrypted traffic from 2018 might already be sitting on a tape, waiting for one. That asymmetry — encrypt now, decrypt later — is why the migration is happening before the threat is real.

Security · intermediate · May 2, 2026

Why it exists

Somewhere in a data center — or on a stack of LTO tapes in a warehouse, or in the bulk-collection storage of a state-level adversary — there is probably a copy of the encrypted VPN session you opened in 2018 to read your work email, the Signal messages you sent that year, and the HTTPS traffic between your phone and your bank. Nobody can read any of it today. The TLS handshake agreed on a key using elliptic-curve Diffie-Hellman, and recovering that key would mean solving a discrete-log problem that classical computers can’t solve at the relevant sizes. But “can’t solve today” and “can’t solve ever” are different claims. If a sufficiently large quantum computer is ever built — five years from now, fifteen, fifty — that 2018 tape becomes plaintext. The encryption was real; the recording was always allowed.

That asymmetry — encrypt now, decrypt later — is the whole reason the migration to PQC is happening on a timeline that has nothing to do with when the threat actually arrives. The adversary doesn’t need a quantum computer in 2026. They need a hard drive in 2026 and a quantum computer eventually. The people building the systems we trust today are already losing that race for any traffic with a long secrecy lifetime — diplomatic cables, medical records, source-code repositories, the contents of corporate VPN tunnels — and the only fix is to change the math before the recording stops being hypothetical.

The shorthand for this threat model is harvest now, decrypt later (HNDL; sometimes “store now, decrypt later”). It’s the reason “we have time” is a worse argument than it sounds. The clock you care about isn’t when quantum arrives — it’s how long you need yesterday’s secrets to stay secret, minus the gap between today and quantum. For some workloads that arithmetic is already negative.
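That clock arithmetic can be made concrete. Below is a sketch of the usual back-of-the-envelope check (a form of what's often called Mosca's inequality); every number in it is an illustrative assumption, not a prediction:

```python
# Toy arithmetic for the HNDL clock. All inputs are illustrative
# assumptions; nobody knows the real years-until-quantum number.

def secrecy_margin(secrecy_lifetime_years: int,
                   migration_years: int,
                   years_until_quantum: int) -> int:
    """Positive margin: you can migrate in time. Negative: traffic
    recorded today will still need to be secret when quantum arrives."""
    return years_until_quantum - (secrecy_lifetime_years + migration_years)

# Medical records that must stay secret 25 years, a 5-year migration,
# and an assumed 15 years until a relevant quantum computer:
print(secrecy_margin(25, 5, 15))   # -15: already too late for that workload

# A secret that only matters for 2 years, with a 1-year migration:
print(secrecy_margin(2, 1, 15))    # 12: comfortable margin
```

The point of the sketch is that the migration deadline depends on your data's lifetime, not on anyone's quantum roadmap.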

Why it matters now

The standards finally exist. On August 13, 2024, NIST published the first three finalized post-quantum standards as Federal Information Processing Standards: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), and FIPS 205 (SLH-DSA) (NIST announcement). Before that, the practical question of “which algorithm do we even ship?” didn’t have an answer; after that, it did, and deployments started landing in real products.

A non-exhaustive snapshot:

The pattern across all of these: hybrid mode — the new post-quantum KEM runs alongside a classical one (typically X25519), and the resulting shared secret mixes both. If lattice-based crypto turns out to have an unknown weakness, X25519 still protects you against a classical attacker. If quantum arrives, the lattice half protects you. The hybrid is the bet that “either of these holds” is safer than either alone.
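A minimal sketch of that mixing step, assuming the two shared secrets are already in hand. The combiner below (an HMAC-SHA-256 extract over the concatenation, with a made-up label) mirrors the general shape of deployed hybrids, but real protocols differ in labels, ordering, and KDF details:

```python
# Sketch of hybrid shared-secret derivation. Assumes ss_x25519 came from
# a classical X25519 exchange and ss_mlkem from an ML-KEM encapsulation;
# here they're stand-in byte strings. The label is invented for the demo.
import hashlib
import hmac

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    # HKDF-Extract (RFC 5869): HMAC over the input keying material.
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hybrid_secret(ss_x25519: bytes, ss_mlkem: bytes) -> bytes:
    # An attacker must recover BOTH inputs to learn the output:
    # breaking the lattice KEM alone, or X25519 alone, is not enough.
    return hkdf_extract(b"hybrid-kdf-demo", ss_x25519 + ss_mlkem)

key = hybrid_secret(b"\x01" * 32, b"\x02" * 32)
print(len(key))  # 32
```

The design choice to highlight: concatenating before the KDF means the output is pseudorandom as long as either input is secret, which is exactly the "either of these holds" bet.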

The short answer

post-quantum urgency = long-lived secrets + adversary patience + Shor's algorithm someday

Public-key crypto as it’s deployed today (RSA, classical Diffie-Hellman, ECC) rests on math problems that classical computers can’t solve at the sizes we use. Shor’s algorithm, published by Peter Shor in 1994, solves all of them in polynomial time on a sufficiently large quantum computer. Such a computer doesn’t exist yet and probably won’t for years. But anyone who records ciphertext today and waits holds a free option on it. The migration is to algorithms whose hardness isn’t broken by Shor — mostly based on lattice problems — before that option pays out.

How it works

What Shor breaks, and what it doesn’t

Shor’s algorithm efficiently solves two specific problems on a quantum computer: integer factoring and discrete logarithm. That happens to cover basically all currently deployed public-key crypto: RSA depends on factoring being hard; classical Diffie-Hellman depends on discrete log mod a prime; ECC depends on discrete log on an elliptic curve. Shor handles the elliptic-curve case too. So a working large-scale quantum computer running Shor would, in one stroke, break every TLS handshake, every SSH-key login, every signed certificate, every Git tag signed with an RSA or Ed25519 key, every passkey signature.
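It helps to see where the quantum work actually sits. The quantum computer's only job in Shor's algorithm is finding the period r of a^x mod N; everything after that is classical number theory. The toy below finds the period by brute force (exponential time, fine for N = 15) and then recovers the factors the way Shor's classical post-processing does:

```python
# Toy demo of Shor's classical post-processing. The period-finding step
# is the part a quantum computer does in polynomial time; here it's a
# brute-force loop, which only works because N is tiny.
from math import gcd

def find_period(a: int, N: int) -> int:
    """Smallest r > 0 with a**r == 1 (mod N). Stand-in for the quantum step."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical_part(a: int, N: int):
    r = find_period(a, N)
    if r % 2 == 1:
        return None          # bad luck with this a; Shor retries with another
    y = pow(a, r // 2, N)
    return sorted((gcd(y - 1, N), gcd(y + 1, N)))

print(shor_classical_part(7, 15))  # [3, 5]
```

For N = 15 and a = 7 the period is 4, so 7² ± 1 share factors with 15 and gcd hands them over. Scaling the period-finding step to RSA-2048 moduli is the entire unsolved hardware problem.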

Crucially, this is a story about public-key crypto. Symmetric crypto — AES, ChaCha20, SHA-256 — is not broken by Shor. The best known quantum attack on symmetric primitives is Grover’s algorithm, which gives roughly a square-root speedup on brute-force key search. The standard mitigation is just “double the key length”: AES-256 gives you about 128 bits of post-quantum security against Grover, which is still fine. This is why the migration is loud about KEMs and signatures and quiet about AES. The bulk encryption you do after the handshake is already fine; it’s the handshake that breaks.
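The Grover arithmetic is a one-liner: searching 2^k keys takes on the order of 2^(k/2) quantum queries, so effective security roughly halves:

```python
# Grover's square-root speedup, as key-length arithmetic.
def post_quantum_bits(key_bits: int) -> int:
    # Brute-forcing 2**k keys takes ~2**(k/2) Grover iterations.
    return key_bits // 2

for k in (128, 192, 256):
    print(f"AES-{k}: ~{post_quantum_bits(k)}-bit security against Grover")
```

Hence the "double the key length" rule: AES-256 lands at roughly the 128-bit level that AES-128 occupies today.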

How far away the quantum computer actually is

Nowhere near. The largest published claims about factoring numbers on real quantum hardware are tiny demonstrations in the low double digits (and even those have caveats about how much of the work the quantum processor actually did). Resource estimates for breaking real RSA-2048 are an entirely different order of magnitude. The widely cited Gidney & Ekerå 2021 estimate puts it at roughly 20 million noisy physical qubits, running for about 8 hours, using surface-code error correction (paper). A 2025 follow-up by Gidney (arXiv:2505.15917) tightens that to “less than a million noisy qubits” using more efficient constructions, which is still far beyond anything that exists.

The honest gap to name: nobody knows when a cryptographically relevant quantum computer will exist. Estimates from credible sources span roughly a decade to “never,” and anyone giving a precise year is guessing. That uncertainty is exactly why HNDL is the framing that matters — you can’t wait for the threat to be real, because by then your 2018 traffic is already decrypted.

Why lattices

The leading PQC family — and the one that landed in FIPS 203 and 204 — is lattice-based. The hand-wavy version: a lattice is the set of points you get by taking integer combinations of a few basis vectors in high-dimensional space (think a wallpaper pattern, but in 768 dimensions). The hard problem is, given a “messy” basis for the lattice, finding the shortest non-zero vector in it, or finding a lattice point very close to a given target. There’s no known polynomial-time quantum algorithm for these. Lattice problems have been studied since the 1980s, which is why they’re the most-trusted of the post-quantum families.
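A two-dimensional toy shows what "shortest vector from a messy basis" means. The basis below is a made-up example; real schemes work in hundreds of dimensions, where nothing like this brute force is feasible:

```python
# Toy shortest-vector search in a 2-D lattice: enumerate small integer
# combinations of a deliberately skewed basis. The basis is invented for
# the demo; the lattice it generates contains the short vector (1, 1).
from itertools import product

def shortest_vector(b1, b2, bound=10):
    """Shortest nonzero c1*b1 + c2*b2 with |c1|, |c2| <= bound."""
    best, best_norm2 = None, None
    for c1, c2 in product(range(-bound, bound + 1), repeat=2):
        if c1 == 0 and c2 == 0:
            continue
        v = (c1 * b1[0] + c2 * b2[0], c1 * b1[1] + c2 * b2[1])
        n2 = v[0] ** 2 + v[1] ** 2
        if best_norm2 is None or n2 < best_norm2:
            best, best_norm2 = v, n2
    return best

# A "messy" basis: both vectors are long, but a short combination exists.
v = shortest_vector((201, 100), (401, 199))
print(v, v[0] ** 2 + v[1] ** 2)
```

Both basis vectors have length in the hundreds, yet 2·b1 − b2 = (1, 1). Finding that short combination by enumeration explodes exponentially with dimension, which is the hardness the schemes lean on.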

ML-KEM (the standardized form of CRYSTALS-Kyber) is a key encapsulation mechanism built on the Module Learning With Errors problem, a lattice problem with a particular algebraic structure that makes the keys and ciphertexts compact. ML-DSA (CRYSTALS-Dilithium) is a signature scheme built on related lattice problems.
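A toy single-bit encryption in the style of Regev's original LWE scheme shows the core trick ML-KEM builds on: linear equations hidden by small noise. The parameters below are chosen for readability and are nowhere near secure:

```python
# Toy Regev-style LWE encryption of one bit. Demo parameters only:
# real schemes use structured (module) lattices and far larger sizes.
import random

q, n, m = 257, 8, 16              # modulus, secret length, sample count

def keygen():
    s = [random.randrange(q) for _ in range(n)]
    A = [[random.randrange(q) for _ in range(n)] for _ in range(m)]
    e = [random.choice([-1, 0, 1]) for _ in range(m)]      # small noise
    b = [(sum(A[i][j] * s[j] for j in range(n)) + e[i]) % q
         for i in range(m)]
    return s, (A, b)              # private key, public key

def encrypt(pub, bit):
    A, b = pub
    S = [i for i in range(m) if random.random() < 0.5]     # random subset
    u = [sum(A[i][j] for i in S) % q for j in range(n)]
    v = (sum(b[i] for i in S) + bit * (q // 2)) % q        # bit in high half
    return u, v

def decrypt(s, ct):
    u, v = ct
    d = (v - sum(u[j] * s[j] for j in range(n))) % q       # noise + bit*(q//2)
    return 1 if q // 4 < d < 3 * q // 4 else 0             # round away noise

s, pub = keygen()
for bit in (0, 1):
    assert decrypt(s, encrypt(pub, bit)) == bit
print("toy LWE round-trips both bits")
```

Decryption works because the accumulated noise stays far smaller than q/4, so rounding recovers the bit; an attacker without s faces exactly the "linear equations plus noise" problem. ML-KEM's module structure is what shrinks these keys from quadratic-size matrices to something practical.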

The other standardized scheme, SLH-DSA (SPHINCS+), is hash-based — its security rests only on hash functions like SHA-256 being hard to invert and collision-resistant. That’s the most conservative assumption in cryptography; if SHA-256 falls, you have bigger problems. The trade-off is that SLH-DSA signatures are large and slow. NIST standardized it as the belt-and-suspenders backup in case lattice cryptanalysis takes a turn.

KEM vs signature, and why HNDL bites the KEM

This is the distinction that decides where the urgency actually lands. A KEM agrees on the symmetric key that protects the rest of the conversation. If an adversary records that handshake and later breaks the KEM, they recover the symmetric key, and from there the full conversation. HNDL hits the KEM directly — that’s why the messaging-app and TLS rollouts all started with key exchange.

Signatures are different. An attacker who breaks the scheme in 2040 can forge new signatures from that moment on, but forging an old signature retroactively is rarely useful: the 2018 document still says what it said, and a forged signature on a malicious future update is the actual attack. So signatures become urgent the day quantum arrives, not the day before. That’s why TLS deployments led with PQ key exchange and have been slower to adopt PQ certificate signatures.

Show the seams

Going deeper

What I’m confident about: the algorithm names and FIPS numbers, the publication date of the standards, the rough shape of Shor’s reach (factoring + discrete log → RSA/DH/ECC), the SIKE break, and the named deployments above. What I’m explicitly not claiming: a year for when a cryptographically relevant quantum computer arrives, current production-share numbers for PQC across the internet beyond the Cloudflare-reported figure, or which lattice-based scheme will hold up longest under another decade of cryptanalysis.