Why 'harvest now, decrypt later' is driving post-quantum crypto adoption
A sufficiently large quantum computer probably doesn't exist yet. Encrypted traffic from 2018 might already be sitting on a tape, waiting for one. That asymmetry — encrypt now, decrypt later — is why the migration is happening before the threat is real.
Why it exists
Somewhere in a data center — or on a stack of LTO tapes in a warehouse, or in the bulk-collection storage of a state-level adversary — there is probably a copy of the encrypted VPN session you opened in 2018 to read your work email, the Signal messages you sent that year, and the HTTPS traffic between your phone and your bank. Nobody can read any of it today. The TLS handshake agreed on a key using elliptic-curve Diffie-Hellman, and recovering that key would mean solving a discrete-log problem that classical computers can’t solve at the relevant sizes. But “can’t solve today” and “can’t solve ever” are different claims. If a sufficiently large quantum computer is ever built — five years from now, fifteen, fifty — that 2018 tape becomes plaintext. The encryption was real; the recording was always allowed.
That asymmetry — encrypt now, decrypt later — is the whole reason the migration to PQC is happening on a timeline that has nothing to do with when the threat actually arrives. The adversary doesn’t need a quantum computer in 2026. They need a hard drive in 2026 and a quantum computer eventually. The people building the systems we trust today are already losing that race for any traffic with a long secrecy lifetime — diplomatic cables, medical records, source-code repositories, the contents of corporate VPN tunnels — and the only fix is to change the math before the recording stops being hypothetical.
The shorthand for this threat model is harvest now, decrypt later (HNDL; sometimes “store now, decrypt later”). It’s the reason “we have time” is a worse argument than it sounds. The clock you care about isn’t when quantum arrives — it’s how long you need yesterday’s secrets to stay secret, minus the gap between today and quantum. For some workloads that arithmetic is already negative.
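That arithmetic has a name: it's often formalized as Mosca's inequality. If x (years your secrets must stay secret) plus y (years the migration takes) exceeds z (years until a cryptographically relevant quantum computer), you're already exposed. A minimal sketch; the numbers below are illustrative assumptions, not predictions.

```python
def hndl_exposed(secrecy_years: float, migration_years: float,
                 years_to_quantum: float) -> bool:
    """Mosca's inequality: exposed if x + y > z.

    secrecy_years    (x): how long today's plaintext must stay secret
    migration_years  (y): how long the PQC migration takes to finish
    years_to_quantum (z): time until a cryptographically relevant
                          quantum computer exists (unknown; illustrative)
    """
    return secrecy_years + migration_years > years_to_quantum

# Medical records (x = 25) with a 5-year migration are exposed even if
# quantum is 20 years out: 25 + 5 > 20.
print(hndl_exposed(25, 5, 20))  # True
# Short-lived session data (x = 1) under the same assumptions is not.
print(hndl_exposed(1, 5, 20))   # False
```

The point of writing it down is that z is the only unknown; x and y are facts about your data and your organization, and for long-lived secrets they already dominate.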
Why it matters now
The standards finally exist. On August 13, 2024, NIST published the first three finalized post-quantum standards as Federal Information Processing Standards: FIPS 203 (ML-KEM), FIPS 204 (ML-DSA), and FIPS 205 (SLH-DSA) (NIST announcement). Before that, the practical question of “which algorithm do we even ship?” didn’t have an answer; after that, it did, and deployments started landing in real products.
A non-exhaustive snapshot:
- Signal rolled out PQXDH in 2023, upgrading the X3DH initial handshake to combine X25519 with CRYSTALS-Kyber in hybrid mode.
- Apple announced PQ3 for iMessage on February 21, 2024, a hybrid post-quantum messaging protocol that shipped with iOS 17.4 and macOS 14.4.
- Cloudflare enabled hybrid post-quantum key agreement (X25519Kyber768) for general availability across its edge in September 2023; by early 2024 about 1.8% of TLS 1.3 connections to Cloudflare were using PQC, mostly via Chrome.
- Chrome shipped X25519Kyber768 by default in Chrome 124 (April 2024), then switched to the standardized ML-KEM-768 in Chrome 131 (late 2024).
The pattern across all of these: hybrid mode — the new post-quantum KEM runs alongside a classical one (typically X25519), and the resulting shared secret mixes both. If lattice-based crypto turns out to have an unknown weakness, X25519 still protects you against a classical attacker. If quantum arrives, the lattice half protects you. The hybrid is the bet that “either of these holds” is safer than either alone.
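The combining step is simpler than it sounds: one common construction concatenates both shared secrets and runs the result through a key-derivation function, so an attacker must break both halves to learn the output. This sketch uses HKDF-SHA256 over stand-in byte strings; the real TLS combiner folds both secrets into the handshake key schedule, and the function names here are my own.

```python
import hmac
import hashlib

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    out, block, counter = b"", b"", 1
    while len(out) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

def hybrid_secret(classical_ss: bytes, pq_ss: bytes) -> bytes:
    # Concatenate both shared secrets, then extract-and-expand.
    # Recovering the output requires knowing BOTH inputs, which is
    # exactly the "either assumption holds" property hybrids buy.
    prk = hkdf_extract(salt=b"hybrid-kex", ikm=classical_ss + pq_ss)
    return hkdf_expand(prk, info=b"traffic-key")

# Stand-ins for the X25519 and ML-KEM shared secrets:
key = hybrid_secret(b"\x01" * 32, b"\x02" * 32)
print(len(key))  # 32
```

A quantum attacker who breaks X25519 still faces ML-KEM; a lattice cryptanalyst who breaks ML-KEM still faces X25519.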
The short answer
post-quantum urgency = long-lived secrets + adversary patience + Shor's algorithm someday
Public-key crypto as it’s deployed today (RSA, classical Diffie-Hellman, ECC) rests on math problems that classical computers can’t solve at the sizes we use. Shor’s algorithm, published by Peter Shor in 1994, solves all of them in polynomial time on a sufficiently large quantum computer. Such a computer doesn’t exist yet, and probably won’t for years. But anyone who records ciphertext today and waits has a free option on it. The migration is to algorithms whose hardness isn’t broken by Shor — mostly based on lattice problems — before that option pays out.
How it works
What Shor breaks, and what it doesn’t
Shor’s algorithm efficiently solves two specific problems on a quantum computer: integer factoring and discrete logarithm. That happens to cover basically all currently deployed public-key crypto: RSA depends on factoring being hard; classical Diffie-Hellman depends on discrete log mod a prime; ECC depends on discrete log on an elliptic curve. Shor handles the elliptic-curve case too. So a working large-scale quantum computer running Shor would, in one stroke, break every TLS handshake, every SSH-key login, every signed certificate, every Git tag signed with an RSA or Ed25519 key, every passkey signature.
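Worth seeing concretely: the quantum speedup in Shor's algorithm is only the period-finding step. The reduction from "order of a mod N" to factors of N is classical number theory. This sketch finds the order by brute force (the part a quantum computer does fast via the quantum Fourier transform) and then runs the standard classical post-processing, using the textbook example N = 15, a = 7.

```python
from math import gcd

def find_order(a: int, n: int) -> int:
    """Smallest r > 0 with a^r ≡ 1 (mod n). Brute force here;
    this is the step Shor's algorithm does in polynomial time."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_postprocess(a: int, n: int):
    """Classical step: turn an even order r into factors of n."""
    r = find_order(a, n)
    if r % 2 == 1:
        return None          # odd order: retry with another a
    y = pow(a, r // 2, n)
    if y == n - 1:
        return None          # trivial square root: retry
    return gcd(y - 1, n), gcd(y + 1, n)

print(shor_postprocess(7, 15))  # (3, 5)
```

For RSA-2048, `find_order` is the part that needs millions of physical qubits; everything after it runs on a laptop.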
Crucially, this is a story about public-key crypto. Symmetric crypto — AES, ChaCha20, SHA-256 — is not broken by Shor. The best known quantum attack on symmetric primitives is Grover’s algorithm, which gives roughly a square-root speedup on brute-force key search. The standard mitigation is just “double the key length”: AES-256 gives you about 128 bits of post-quantum security against Grover, which is still fine. This is why the migration is loud about KEMs and signatures and quiet about AES. The bulk encryption you do after the handshake is already fine; it’s the handshake that breaks.
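The Grover arithmetic is just a halving of effective key bits, which is why the fix is a longer key rather than a new algorithm:

```python
def grover_security_bits(key_bits: int) -> int:
    # Grover reduces brute-force search from ~2^k to ~2^(k/2) steps,
    # so effective post-quantum security is roughly k/2 bits.
    return key_bits // 2

for k in (128, 256):
    print(f"AES-{k}: ~{grover_security_bits(k)} bits vs quantum brute force")
```

AES-128's ~64 post-quantum bits are uncomfortable; AES-256's ~128 are fine, which is the whole symmetric-crypto migration in one line.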
How far away the quantum computer actually is
Nowhere near. The largest published claims about factoring numbers on real quantum hardware are tiny demonstrations in the low double digits (and even those have caveats about how much of the work the quantum processor actually did). Resource estimates for breaking real RSA-2048 are an entirely different order of magnitude. The widely cited Gidney & Ekerå 2021 estimate puts it at roughly 20 million noisy physical qubits, running for about 8 hours, using surface-code error correction (paper). A 2025 follow-up by Gidney (arXiv:2505.15917) tightens that to “less than a million noisy qubits” using more efficient constructions, which is still far beyond anything that exists.
The honest gap to name: nobody knows when a cryptographically relevant quantum computer will exist. Estimates from credible sources span roughly a decade to “never,” and anyone giving a precise year is guessing. That uncertainty is exactly why HNDL is the framing that matters — you can’t wait for the threat to be real, because by then your 2018 traffic is already decrypted.
Why lattices
The leading PQC family — and the one that landed in FIPS 203 and 204 — is lattice-based. The hand-wavy version: a lattice is the set of points you get by taking integer combinations of a few basis vectors in high-dimensional space (think a wallpaper pattern, but in 768 dimensions). The hard problem is, given a “messy” basis for the lattice, finding the shortest non-zero vector in it, or finding a lattice point very close to a given target. There’s no known polynomial-time quantum algorithm for these. Lattice problems have been studied since the 1980s, which is why they’re the most-trusted of the post-quantum families.
ML-KEM (the standardized form of CRYSTALS-Kyber) is a key encapsulation mechanism built on the Module Learning With Errors problem, a lattice problem with a particular algebraic structure that makes the keys and ciphertexts compact. ML-DSA (CRYSTALS-Dilithium) is a signature scheme built on related lattice problems.
The other standardized scheme, SLH-DSA (SPHINCS+), is hash-based — its security rests only on hash functions like SHA-256 being hard to invert and collision-resistant. That’s the most conservative assumption in cryptography; if SHA-256 falls, you have bigger problems. The trade-off is that SLH-DSA signatures are large and slow. NIST standardized it as the belt-and-suspenders backup in case lattice cryptanalysis takes a turn.
KEM vs signature, and why HNDL bites the KEM
This is the distinction that decides where the urgency actually lands. A KEM agrees on the symmetric key that protects the rest of the conversation. If an adversary records that handshake and later breaks the KEM, they recover the symmetric key, and from there the full conversation. HNDL hits the KEM directly — that’s why the messaging-app and TLS rollouts all started with key exchange.
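The KEM interface in all these deployments is the same three functions: keygen, encaps, decaps. Here's that shape sketched with a toy classical Diffie-Hellman KEM, stdlib only; ML-KEM exposes the same interface with lattice math inside, and these parameters are demonstration-only, not secure.

```python
import hashlib
import secrets

# Toy group: Mersenne prime 2^61 - 1, generator 3. Illustrative only.
P = 2**61 - 1
G = 3

def keygen():
    sk = secrets.randbelow(P - 2) + 1
    pk = pow(G, sk, P)
    return pk, sk

def encaps(pk):
    """Sender: produce a ciphertext and a fresh shared secret."""
    r = secrets.randbelow(P - 2) + 1
    ct = pow(G, r, P)                    # this is what goes on the wire
    ss = hashlib.sha256(str(pow(pk, r, P)).encode()).digest()
    return ct, ss

def decaps(ct, sk):
    """Receiver: recover the same shared secret from the ciphertext."""
    return hashlib.sha256(str(pow(ct, sk, P)).encode()).digest()

pk, sk = keygen()
ct, ss_sender = encaps(pk)
print(ss_sender == decaps(ct, sk))  # True
```

The HNDL attacker records `ct` off the wire. If they can ever break the KEM, they recover `ss`, and with it every symmetric key and every byte of the recorded session.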
Signatures are different. To forge a 2018 signature in 2040, you’d need the signer’s private key — but a 2040 attacker can already produce new forgeries of future documents the moment they break the scheme. Forging an old signature retroactively is rarely useful (the document still says what it said; a forged future signature on a malicious update is the actual attack). So signatures are urgent the day quantum arrives, not the day before. That’s why TLS deployments led with KEMs and have been slower with PQ certificate signatures.
Show the seams
- PQC algorithms are new. Lattice problems are old; “deploying Kyber on a billion devices” is not. The cautionary tale is SIKE (and its underlying SIDH), an isogeny-based KEM that had advanced to the fourth round of NIST’s process. In July 2022, Castryck and Decru published an efficient classical key-recovery attack that broke SIKE in roughly an hour on a single CPU core. NIST dropped it. The takeaway isn’t “PQC is broken”; it’s that real-world cryptanalytic experience with these schemes is much shorter than for RSA, and surprises are still possible. That’s part of why hybrid deployments matter.
- Sizes change everything downstream. Ed25519 signatures are 64 bytes. ML-DSA-65 signatures are 3,309 bytes (liboqs algorithm table); the public key is 1,952 bytes. Multiply by the number of certificates in a TLS chain and the handshake gets meaningfully bigger. Chrome’s initial X25519Kyber768 default rollout hit interop bugs in middleboxes precisely because the ClientHello no longer fit in one packet. Hardware tokens with tight storage budgets (some TPMs, some smart cards) need rework, not just a software update.
- The timeline is genuinely unknown. I don’t have a confident year for “cryptographically relevant quantum computer exists.” Public expert estimates I’ve seen range from roughly 2030 to “much later or never,” and they update with every hardware announcement. The defensible claim is the structural one — if it happens, harvested traffic is exposed retroactively, and the migration takes years to propagate through the ecosystem, so starting now is rational under almost any timeline assumption.
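The size arithmetic above, made concrete. The per-object byte counts are the Ed25519 and ML-DSA-65 figures already quoted; the three-certificate chain and the one-key-one-signature-per-certificate accounting are my simplification (real chains also carry SCTs, extensions, and intermediate quirks).

```python
# Per-object sizes in bytes (Ed25519 vs ML-DSA-65, from the text above).
CLASSICAL = {"sig": 64, "pubkey": 32}
ML_DSA_65 = {"sig": 3309, "pubkey": 1952}

def chain_crypto_bytes(sizes: dict, certs: int = 3) -> int:
    # Simplified: each certificate carries one public key and one signature.
    return certs * (sizes["sig"] + sizes["pubkey"])

print(chain_crypto_bytes(CLASSICAL))  # 288
print(chain_crypto_bytes(ML_DSA_65))  # 15783
```

Roughly 288 bytes of crypto material becomes roughly 15 KB, which is why "just swap the algorithm" runs into packet sizes, middleboxes, and hardware storage budgets.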
Famous related terms
- Shor’s algorithm — Shor's = quantum algorithm + period-finding via QFT → factoring + discrete log in poly time. The reason public-key crypto needs replacing.
- Grover’s algorithm — Grover's ≈ quantum brute-force search with sqrt speedup. Why symmetric crypto only needs longer keys, not new algorithms.
- ML-KEM (Kyber) — ML-KEM = lattice-based KEM + Module-LWE hardness assumption + small ciphertexts. The standardized post-quantum key exchange (FIPS 203).
- ML-DSA (Dilithium) — ML-DSA = lattice-based signature + Module-LWE/SIS + Fiat-Shamir. The standardized post-quantum signature (FIPS 204).
- SLH-DSA (SPHINCS+) — SLH-DSA = stateless hash-based signature + Merkle trees of one-time signatures. The conservative-assumption backup standard (FIPS 205).
- Hybrid key exchange — hybrid = classical KEM (X25519) + post-quantum KEM (ML-KEM) + combine into one shared secret. The deployment pattern everyone is using during the transition.
- HNDL — HNDL = adversary records ciphertext now + decrypts later when quantum arrives. The threat model that justifies migrating before the threat is real.
- Public-key crypto — the layer being replaced. The interface stays similar; the math underneath changes.
Going deeper
- Peter Shor, Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer (1994/1996) — the paper that started the field.
- NIST, Post-Quantum Cryptography Standardization — the running record of the competition and standards.
- Apple Security Engineering, iMessage with PQ3 (2024) — a clean write-up of how a real product designed a hybrid PQC protocol.
- Signal, Quantum Resistance and the Signal Protocol (2023) — same, for the X3DH → PQXDH upgrade.
- Cloudflare Research, The state of the post-quantum Internet — annual snapshot of real deployment numbers.
- Castryck & Decru, An efficient key recovery attack on SIDH (2022) — the SIKE break, as a reminder that PQC schemes are still being stress-tested.
- Gidney & Ekerå, How to factor 2048 bit RSA integers in 8 hours using 20 million noisy qubits (2019, revised 2021) — the canonical resource estimate; the 2025 follow-up tightens it but the headline shape is the same.
What I’m confident about: the algorithm names and FIPS numbers, the publication date of the standards, the rough shape of Shor’s reach (factoring + discrete log → RSA/DH/ECC), the SIKE break, and the named deployments above. What I’m explicitly not claiming: a year for when a cryptographically relevant quantum computer arrives, current production-share numbers for PQC across the internet beyond the Cloudflare-reported figure, or which lattice-based scheme will hold up longest under another decade of cryptanalysis.