A Prepper's Guide to Q Day
Why Nation-States Won't Wait for a Quantum Computer When Your Offshore Contractor Has Domain Admin
Current quantum computers: ~6,100 physical qubits. Breaking RSA-2048: estimates have dropped from ~20 million physical qubits (2019) to under a million (2025) to under 100,000 using QLDPC architectures (2026) — though the last result assumes hardware parameters nobody has demonstrated at scale and a real-time decoder nobody has built. The gap has narrowed from four orders of magnitude to roughly two, and it is narrowing faster than the comfortable estimates assumed.
NIST published the post-quantum standards in August 2024 — FIPS 203, 204, 205 — and an entire advisory industry has materialized to sell multi-year migration programs for estates that cannot be inventoried, using keys that cannot be located, on systems nobody knows they own. Before we go further: PQC is not quantum computing. Quantum computing is an emerging technology with standard adoption gates. PQC is the next iteration of the same cryptographic modernization that moved the enterprise from DES to AES, SHA-1 to SHA-256, TLS 1.0 to 1.3 — operational plumbing, not a speculative technology bet. The algorithms are standardized; the migration belongs in the CISO’s portfolio, not an emerging technology incubator.
The Test Nobody Applies
The PQC sales cycle runs on a single premise: adversaries are harvesting your encrypted traffic and will decrypt it when quantum computers mature. The premise is correct. The conclusion — that everything needs to migrate immediately and uniformly — does not follow, because it skips a triage step the discourse never performs.
Data is genuinely HNDL-susceptible only at the intersection of three conditions: it retains long-lived value, and it can only be obtained in encrypted form, and it is worth quantum decryption when cheaper access paths exist. Each added condition shrinks the true exposure. Most enterprise data fails at least one. A retail payment transaction fails the first: its value decays in months. An internal database fails the second: the adversary reaches it through an overprivileged service account. A classified weapons design with air-gapped key custody passes all three — and belongs on a compressed PQC timeline today.
Everything that follows applies this test.
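Stated as code, the test is a literal three-way conjunction. A minimal sketch: the `Asset` fields and the three example assets simply restate the examples above; nothing here is a prescribed schema.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    long_lived_value: bool      # condition 1: value outlasts any projected quantum timeline
    encrypted_only: bool        # condition 2: no cheaper cleartext path (identity, injection)
    worth_quantum_effort: bool  # condition 3: justifies a dedicated quantum computation

def hndl_susceptible(a: Asset) -> bool:
    # Genuine HNDL exposure is the intersection of all three conditions.
    return a.long_lived_value and a.encrypted_only and a.worth_quantum_effort

assets = [
    Asset("retail payment transaction", False, True, False),  # value decays in months
    Asset("internal database", True, False, False),           # reachable via service account
    Asset("air-gapped weapons design", True, True, True),     # passes all three
]
exposed = [a.name for a in assets if hndl_susceptible(a)]
# exposed == ["air-gapped weapons design"]
```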
Confidentiality in Transit
This is the bulk “nation-states are harvesting your TLS traffic” narrative that drives most PQC urgency. Signature ecosystems, data-at-rest, and crypto-native forgery are different failure modes with different timelines; they appear under exceptions below.
“Harvest Now, Decrypt Later” is a real threat model and deserves to be engaged at its strongest. HNDL is passive collection: a nation-state on a border gateway or cable landing station takes zero risk, leaves no logs, triggers no alerts. Patient adversaries archive what they intercept.
Conceded. Now apply the test.
Without forward secrecy — TLS 1.2 with static RSA key exchange — the adversary breaks the server’s private key once and decrypts every session that ever used that certificate. With forward secrecy (TLS 1.3 / ECDHE), each session requires an independent quantum computation. The difference between those two scenarios is the difference between an expensive program and a non-credible one. The attacker-side cost model is in the companion piece, A Quantum of Solace.
Enforcing forward secrecy on all TLS endpoints is the single highest-impact HNDL mitigation for data in transit that requires zero post-quantum cryptography. It converts “break one key, decrypt everything” to “break one key per session.” Technically simple, operationally hard: a cipher suite policy change that becomes a migration program in heterogeneous estates.
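What that cipher suite policy change looks like in practice can be sketched with Python's standard ssl module; the choice of stack is illustrative only, and the same policy applies at any TLS termination point.

```python
import ssl

# Forward-secrecy-only server policy: a sketch, not a hardening baseline.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)

# Refuse anything older than TLS 1.2; TLS 1.3 key exchange is always ephemeral.
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# For TLS 1.2, exclude static-RSA key exchange (kRSA) entirely, so only
# ephemeral ECDHE suites remain and each session gets its own exchange.
ctx.set_ciphers("ECDHE+AESGCM:!kRSA:!aNULL")
```

With `kRSA` excluded, a captured server key no longer decrypts recorded sessions: breaking one session requires breaking that session's own ECDHE exchange.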
But “per session” is less airtight than it sounds, and this is the part the conference keynotes skip. TLS 1.3’s KeyUpdate provides zero additional quantum resistance — it derives each new secret from its predecessor with no fresh randomness, so breaking the initial handshake exposes every subsequent epoch. PSK resumption is worse: one ECDHE break cascades through the entire resumption chain, including 0-RTT early data, until the ticket key rotates or a fresh exchange is forced. Default ticket lifetime in most implementations: 24 hours of resumed sessions exposed by one break. Blanco-Romero et al. (2026) validated this experimentally. And breaking P-256 ECDLP now requires fewer logical qubits than breaking RSA-2048 — 1,098 versus 1,409. The scaling law holds, but the vendor pitch is anchored to the wrong number.
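The KeyUpdate weakness is visible in a few lines. This sketch reimplements HKDF-Expand-Label from RFC 8446 (SHA-256, empty context, output no longer than one hash block) and walks the update chain; the initial secret is a stand-in value.

```python
import hashlib
import hmac

def hkdf_expand_label(secret: bytes, label: str, length: int = 32) -> bytes:
    # Minimal HKDF-Expand-Label (RFC 8446 s7.1): two-byte length,
    # length-prefixed "tls13 " label, empty context, single HKDF block.
    full = b"tls13 " + label.encode()
    info = length.to_bytes(2, "big") + bytes([len(full)]) + full + b"\x00"
    return hmac.new(secret, info + b"\x01", hashlib.sha256).digest()[:length]

# KeyUpdate (RFC 8446 s7.2): each epoch's traffic secret is derived purely
# from its predecessor -- no fresh randomness ever enters the chain.
secret = bytes(32)  # stand-in for the initial application traffic secret
epochs = []
for _ in range(3):
    secret = hkdf_expand_label(secret, "traffic upd")
    epochs.append(secret)

# An adversary who recovers the initial secret recomputes every epoch.
```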
These caveats make selective attacks against specific corridors more valuable. They do not resurrect bulk feasibility across high-volume endpoints. The bulk economics remain prohibitive: with forward secrecy properly enforced and PSK chains bounded, the adversary faces hundreds of millions of independent quantum computations per day of harvested wire data. You cannot rack-mount a quantum processor any more than you can rack-mount a tokamak. The adversary is searching Borges’ Library of Babel for one coherent book, except each volume requires its own run of Shor’s algorithm at near absolute zero, and most contain nothing of value.
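A back-of-envelope version of that arithmetic. Both inputs are assumptions chosen to land in the range the text describes, not figures from any source.

```python
# HNDL cost with forward secrecy enforced: one independent quantum
# computation per harvested session. Inputs are illustrative assumptions.
sessions_per_day = 400e6  # harvested sessions per day across an estate (assumed)
hours_per_break = 24      # runtime of one quantum key-recovery run (assumed)

# Quantum machines needed just to keep pace with one day of wire data.
machines_to_keep_pace = sessions_per_day * hours_per_break / 24
# With static RSA, the same day of traffic costs a single run.
```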
Most harvested financial traffic also fails the first condition of the test: transactions, trading activity, and counterparty relationships have limited shelf life. A twenty-year-old wire transfer record is not actionable intelligence; it is an archive.
The Door Without a Lock
Most enterprise data fails the second condition: it can be obtained without breaking the encryption at all. The adversary who can obtain cleartext through an identity path has no reason to assault the cryptographic wall — and the identity paths are numerous, cheap, and available today. The offshore contractor with production access in a jurisdiction where intelligence services operate with legal impunity. The overprivileged service account nobody audits. The service desk that will reset a password over the phone. The SQL injection that never required credentials in the first place. In every case, the cryptographic layer is never engaged; PQC is irrelevant because the encryption was never the barrier.
For encrypted data at rest — backups, cloud snapshots — apply the test again. It only qualifies as genuinely encrypted if key custody is segregated from data access; when the adversary can reach the decryption keys through overpermissioned KMS policies, the encryption is decorative and the decryption is classical. Kerckhoffs has been teaching this lesson since 1883: the security of the system is the security of the key management, and key management is an identity problem before it is a cryptographic one. Where DAR encryption does pass all three conditions — genuinely segregated key custody, long-lived value, no cheaper access path — it belongs in the exception class below.
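The custody check above can be made concrete. A sketch, pure stdlib: it flags a key policy where a data-plane principal can also call Decrypt. The policy document and role names are hypothetical, and a real review would also resolve attached IAM policies, not just the key policy.

```python
import json

# Hypothetical key policy: the application runtime role can decrypt directly.
policy = json.loads("""{
  "Statement": [
    {"Effect": "Allow",
     "Principal": {"AWS": "arn:aws:iam::111122223333:role/app-runtime"},
     "Action": ["kms:Decrypt"]}
  ]
}""")

data_plane_principals = {"arn:aws:iam::111122223333:role/app-runtime"}

def custody_is_segregated(policy: dict, data_principals: set) -> bool:
    for stmt in policy["Statement"]:
        if stmt["Effect"] != "Allow" or "kms:Decrypt" not in stmt["Action"]:
            continue
        principal = stmt["Principal"].get("AWS", "")
        if principal == "*" or principal in data_principals:
            return False  # the path to the data is also a path to the key
    return True

# custody_is_segregated(policy, data_plane_principals) -> False:
# the encryption is decorative and the decryption is classical.
```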
What Passes the Test
Three categories survive the conjunction and deserve compressed PQC timelines independent of everything else.
Long-horizon secrecy data — state secrets, genomic data, critical infrastructure designs — retains extreme value well beyond any projected quantum timeline. Not every asset warrants the same protection, but the assets that do need to be identified and triaged rather than subsumed into a uniform program that treats a retail banking app and a classified weapons design as posing equivalent HNDL risk.
Cryptographic-native systems — blockchain platforms, tokenized asset infrastructure, smart contracts, and identity signing infrastructure (PKI, SAML, code signing) — where cryptography is not defense-in-depth but the operational substrate. Breaking ECDSA here does not reveal a secret; it enables forgery — unauthorized transfers, fabricated contract executions, systemic trust collapse. The threat model is not “harvest now, decrypt later” but “forge at will,” and the cryptographic choices are embedded in consensus mechanisms and contract logic that may be immutable by design. For institutions building or investing in DLT infrastructure today, PQC is an architectural design constraint at inception — every month of deployment deepens the debt.
Hardware-embedded cryptography — HSMs, satellite systems, embedded controllers with 15–20 year deployment cycles — cannot wait for software-layer maturity. Migration planning starts immediately, informed by the inventory.
Ride the Budget Line
Mandates and examiner pressure exist across major jurisdictions. CISOs cannot tell regulators they are deferring PQC to fix identity governance.
But foundational security has failed to secure adequate budgets for decades because it lacks a hard external catalyst, and PQC mandates provide precisely that leverage. Cryptographic inventory, asset discovery, and identity governance are PQC readiness — not rebranded IT hygiene, but the only substrate that makes algorithm migration executable. The enterprise that knows what it has, who owns it, and whether it can change it has a deployment problem. The enterprise that doesn’t has a discovery problem wearing a compliance deadline.
The Playbook
Exception classes run on compressed timelines independent of this sequence.
Immediate — enforce and harden forward secrecy. Kill static RSA key exchange. Harden TLS 1.3 configuration so it actually delivers session independence: disable 0-RTT, force fresh key exchange on resumption, rotate ticket keys aggressively. Deploy hybrid PQC where centralized TLS termination already exists — coverage stops where centralization stops.
Immediate — build the crypto control plane. Cryptographic inventory, key custody standardization, CBOM embedded in the build pipeline so the inventory problem stops growing while legacy enumeration proceeds. Consolidate external services behind API gateways. Re-encrypt edge to origin. Map vendor dependencies — fixed crypto stacks are the migration blockers, and some will never migrate.
When you can answer the conjunction test for your own estate — algorithm migration at scale. Full internal PQC rollout, application-level library remediation, legacy system modernization. This is where the budget pressure lives and where the vendor pitch starts. It proceeds when the enterprise can demonstrate that it knows what it has, who owns it, and how to change it — not perfection, but demonstrated trajectory and capability. Without that substrate, algorithm migration is a roadmap that assumes infrastructure it doesn’t have.
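The CBOM step in the control-plane item above can be sketched as a build-time scan. A minimal illustration only: the patterns, entry fields, and file name are hypothetical, and a real CBOM generator would parse dependencies and call sites rather than grep source text.

```python
import re

# Minimal CBOM seed: scan source text for crypto primitives so new code
# enters the inventory at build time. Patterns are illustrative only.
PATTERNS = {
    "RSA": re.compile(r"\bRSA\b", re.I),
    "ECDSA": re.compile(r"\bECDSA\b", re.I),
    "SHA-1": re.compile(r"\bSHA-?1\b", re.I),
    "ML-KEM": re.compile(r"\bML-KEM\b|\bKyber\b", re.I),
}
QUANTUM_VULNERABLE = {"RSA", "ECDSA"}

def cbom_entries(path: str, source: str) -> list[dict]:
    # One inventory entry per primitive found in this file.
    return [
        {"file": path, "primitive": name,
         "quantum_vulnerable": name in QUANTUM_VULNERABLE}
        for name, pat in PATTERNS.items() if pat.search(source)
    ]

sample = "signer = ECDSA(key); digest = sha1(data)  # legacy"
entries = cbom_entries("signing.py", sample)
```

Run in CI on every changed file, a scan like this keeps the inventory problem from growing while legacy enumeration proceeds.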
The conjunction test is the triage tool the PQC discourse doesn’t use. Apply it to your own estate: what retains long-lived value, can only be obtained encrypted, and is worth quantum decryption when cheaper paths exist? That intersection is your actual HNDL exposure. Everything outside it is a priority conversation about operational maturity, not a quantum emergency. Fund the substrate. Accelerate the exceptions. The algorithm migration follows.
For the full attacker-side economics — including what happens when a fictional intelligence agency runs the HNDL business case and discovers the throughput bottleneck that kills the program — see A Quantum of Solace: How I Learned to Stop Worrying and Love the CBOM.