Security after quantum computing and the collapse of classical trust

The frightening version of the story goes like this: one day a quantum computer appears, and every password, every encrypted file, every VPN, every banking session, every PGP archive, and every “military-grade” cipher collapses at once. That story is wrong. It is wrong in the dangerous direction and in the comforting direction.

It is wrong in the comforting direction because a sufficiently powerful, fault-tolerant quantum computer would be catastrophic for much of today’s public-key cryptography. RSA, classic Diffie-Hellman, elliptic-curve Diffie-Hellman, ECDSA, and EdDSA all sit in the blast radius. That covers the part of cryptography that helps systems exchange keys, authenticate identities, sign software, validate certificates, and establish trust on the public internet. Security teams are not migrating to post-quantum cryptography for fun. NIST published its first post-quantum standards in August 2024, the UK’s National Cyber Security Centre (NCSC) has set milestone dates through 2035, and major vendors are already deploying hybrid or post-quantum-capable systems.

It is also wrong in the dangerous direction because quantum computing does not flatten all of security into dust. Password guessing, password storage, rate limits, MFA, access control, logging, network segmentation, key rotation, hardware roots of trust, and the entire operational side of defense do not vanish just because Shor’s algorithm exists on paper. Symmetric cryptography and secure hash functions are affected far less severely than the public-key layer. The UK NCSC states plainly that symmetric cryptography and secure hash functions are not significantly impacted, and that existing symmetric algorithms with at least 128-bit keys can continue to be used.

So, is security an illusion? No. Security is a moving engineering contract with physics, mathematics, software, and human behavior. Quantum computing changes the contract. It does not erase the idea of security itself. The right question is not “Will everything break?” The right question is which parts break, which parts bend, what survives, and what must change before an attacker gets there first.

Quantum computing without the fog

A lot of confusion starts with the machine itself. Quantum computing gets described with a strange mix of science fiction and marketing, and both are bad for clear thinking. A quantum computer is not a magical box that “tries every answer at once” and instantly wins. It is a machine that uses qubits, quantum states that can be prepared, transformed, entangled, and measured in ways that have no classical equivalent. IBM’s explainer puts it cleanly: qubits can exist in superposition, groups of qubits can interfere and entangle, and quantum algorithms work by amplifying useful outcomes while canceling others.

That detail matters because quantum advantage is not universal. A quantum computer does not accelerate every problem. It accelerates some specific problem structures. Cryptography lives or dies by that fact. If the hard mathematical problem under your crypto scheme happens to line up with a known powerful quantum algorithm, your scheme is in trouble. If it does not, the damage is smaller or more manageable.

Another source of confusion is the gap between today’s quantum hardware and the hypothetical machine cryptographers fear. The machine that matters here is often called a cryptographically relevant quantum computer, or CRQC. The NCSC describes current quantum computers as limited and affected by relatively high error rates, while also warning that a future large, general-purpose quantum computer could make most traditional public-key cryptography vulnerable. That is the right posture: do not panic about today’s lab hardware, but do not sleep through the migration either.

Recent research shows why people take the threat seriously without pretending the threat has already arrived. A December 2024 Nature paper on Google’s Willow processor reported below-threshold quantum error correction in a logical memory, a major technical milestone because error correction is one of the gates standing between noisy prototypes and useful large-scale computation. That result is not remotely the same thing as “RSA is dead tomorrow.” It does show steady movement on the hardest engineering problem in the field.

This is the first sober conclusion a beginner needs to keep: quantum computing is real, the cryptographic threat is real, and the timeline is uncertain. Those three facts can sit together without contradiction. Professionals have lived with this kind of uncertainty before. You do not wait for the breach to begin patching. You move when the probability-weighted cost of delay becomes irrational. Governments and standards bodies have already decided we are in that phase now.

The two algorithms that decide almost everything

For cryptography, most of the noise falls away once you understand two names: Shor and Grover.

Peter Shor showed that a quantum computer could factor integers and solve discrete logarithm problems in polynomial time. Those two problems are not obscure academic toys. They sit under the foundations of RSA and large chunks of classical public-key cryptography. Shor’s 1995 paper states exactly that: factoring integers and finding discrete logarithms, long believed hard for classical computers, can be solved efficiently on a hypothetical quantum computer.

That is the algorithm that keeps cryptographers awake. Shor is the reason RSA, finite-field Diffie-Hellman, elliptic-curve Diffie-Hellman, ECDSA, and EdDSA are all future casualties if a CRQC arrives. The NCSC says most traditional public-key cryptography in use today will be vulnerable, and it names the familiar families directly: RSA-style factorization systems and discrete-logarithm systems including finite-field Diffie-Hellman, ECDH, DSA, ECDSA, and EdDSA.

Grover’s algorithm is different. Grover gives a quadratic speedup for unstructured search. In rough terms, a brute-force search over N possibilities can drop to about √N quantum queries in the ideal model. That matters for symmetric keys, hashes, and password search spaces. It does not amount to the same kind of collapse that Shor causes for public-key systems. Grover weakens margins. Shor destroys assumptions.
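
The gap between those two failure modes is easy to see with back-of-the-envelope arithmetic. A minimal sketch of the idealized query counts (real Grover implementations carry large constant-factor and error-correction overheads, so these numbers flatter the attacker):

```python
def classical_bruteforce_bits(key_bits):
    """Classical exhaustive search over a k-bit key costs about 2^k guesses."""
    return key_bits

def grover_bruteforce_bits(key_bits):
    """Idealized Grover search costs about sqrt(2^k) = 2^(k/2) queries."""
    return key_bits // 2

for k in (128, 256):
    print(f"AES-{k}: classical ~2^{classical_bruteforce_bits(k)} guesses, "
          f"ideal Grover ~2^{grover_bruteforce_bits(k)} queries")
```

Even in this attacker-friendly model, AES-128 keeps a 64-bit margin and AES-256 keeps a 128-bit margin. No comparable arithmetic saves RSA or ECC, because Shor’s polynomial-time attack cannot be outrun by choosing a bigger key.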

That distinction is where a lot of bad headlines go wrong. If someone says “quantum computers break encryption,” ask a rude but useful follow-up: which encryption, and by which algorithm? If the answer is RSA key exchange or ECDSA signatures, that is a serious claim. If the answer is “all passwords and all AES instantly,” the speaker is compressing different threat models into a cartoon.

The practical takeaway is stark. Public-key cryptography is the urgent migration problem. Symmetric cryptography is the parameter and margin problem. That is why NIST’s first post-quantum standards were not replacements for AES. They were standards for a post-quantum key encapsulation mechanism and post-quantum signature algorithms: ML-KEM in FIPS 203, ML-DSA in FIPS 204, and SLH-DSA in FIPS 205.

Two quantum threats that people keep mixing together

Quantum effect     | What it targets                    | What happens to current crypto                                                                                           | What defenders do
Shor-type attack   | Factoring and discrete logarithms  | Breaks much of today’s public-key cryptography such as RSA, DH, ECDH, ECDSA, and EdDSA                                   | Migrate to post-quantum key exchange and signatures
Grover-type attack | Generic search and brute force     | Reduces security margins for symmetric keys, hashes, and password search spaces, but does not create the same collapse   | Use strong symmetric keys, good password hashing, rate limits, and crypto agility

Almost every scary public discussion blends these two rows together. That blend is the mistake. One row is a structural failure of current public-key assumptions. The other is a pressure increase on brute-force cost models.

What breaks first, what bends, and what mostly survives

Now the picture sharpens.

Public-key key exchange breaks first. When two systems use RSA or elliptic-curve methods to agree on a shared secret over an untrusted network, a future CRQC could attack the public-key math that protects the exchange. That is why NIST’s key establishment standard, FIPS 203, centers on ML-KEM. NIST describes a KEM as the tool two parties use to establish a shared secret over a public channel, and it says ML-KEM is believed secure even against adversaries who possess a quantum computer.
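
The KEM flow NIST describes can be sketched as an interface. Everything inside this toy is fake: random bytes and hashing stand in for the lattice math, and the “hardness” is nonexistent. But the three operations (keygen, encapsulate, decapsulate) mirror the shape that FIPS 203 standardizes:

```python
import os, hashlib

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def keygen():
    """Toy key generation. Real ML-KEM derives pk/sk from lattice samples."""
    sk = os.urandom(32)
    pk = hashlib.sha256(b"pk" + sk).digest()
    return pk, sk

def encapsulate(pk: bytes):
    """Sender's side: produce a ciphertext and a shared secret from pk alone."""
    m = os.urandom(32)
    ct = _xor(m, hashlib.sha256(b"mask" + pk).digest())  # no real hardness here
    return ct, hashlib.sha256(b"ss" + m).digest()

def decapsulate(sk: bytes, ct: bytes) -> bytes:
    """Receiver's side: recover the same shared secret using sk."""
    pk = hashlib.sha256(b"pk" + sk).digest()
    m = _xor(ct, hashlib.sha256(b"mask" + pk).digest())
    return hashlib.sha256(b"ss" + m).digest()

pk, sk = keygen()
ct, ss_sender = encapsulate(pk)
assert decapsulate(sk, ct) == ss_sender
```

In real ML-KEM, recovering the shared secret from the public key and ciphertext alone is believed hard even for a quantum adversary. In this sketch it is trivially easy, which is exactly the property the lattice math exists to remove.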

Digital signatures are the second major casualty. If your trust model depends on signatures produced with quantum-vulnerable classical algorithms, a CRQC threatens more than secrecy. It threatens authenticity. The NCSC spells it out: an adversary with a CRQC could forge signatures, impersonate legitimate private-key owners, or tamper with signed information. That matters for code signing, firmware, certificates, identity systems, secure boot, document signing, and any long-lived trust anchor.

Symmetric encryption survives far better. AES does not rely on factoring or discrete logarithms. Grover’s algorithm changes the cost curve for brute force, but not in the same catastrophic way. The NCSC says symmetric cryptography with at least 128-bit keys can continue to be used. That is why the post-quantum transition is not a total rewrite of cryptography. It is a selective but enormous rewrite of the asymmetric layer around it.

Hash functions also survive better than people assume. Secure hashes are not significantly affected in the NCSC guidance, though margin still matters. You still need good parameter choices and sane protocol design. You do not need to behave as if SHA-256 becomes instantly decorative the moment a useful quantum machine appears.

Passwords sit in a more complicated place. A password is not one thing. It can be an online login secret checked behind rate limits. It can be a password-derived key used to decrypt a local file or disk. It can be one input to a multi-factor scheme. It can be stored badly in a breached database. Each case behaves differently under quantum attack, because the real bottleneck is not always raw search size. It may be the server’s lockout policy, the cost of each hash evaluation, the presence of salt, the use of Argon2id, or the fact that the user chose “Summer2026!” in the first place.

That last point is worth saying plainly. A weak password does not become strong because the attacker lacks a quantum computer, and a strong authentication system does not become useless because the attacker has one. Quantum computing changes attack economics. Security architecture still decides whether the attacker gets an easy offline guessing game or a rate-limited slog with alerts, MFA, and account protection.

Passwords are still mostly a human and systems problem

The phrase “quantum computer will break all passwords” sounds dramatic because it hides the only distinction that matters: online guessing versus offline cracking.

In an online attack, the attacker interacts with the real service. They send login attempts. The service answers yes or no. The defender controls the tempo. NIST SP 800-63B requires verifiers to implement rate limiting on failed authentication attempts. That requirement does not evaporate in the quantum era. A quantum computer does not grant permission to try ten trillion guesses against a live service if the service allows five and then locks the account, triggers step-up authentication, or trips monitoring.
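
That verifier-side control is simple to express. A minimal sketch, with illustrative thresholds (SP 800-63B requires rate limiting, but the specific numbers below are not values from the standard):

```python
import time

class LoginThrottle:
    """Locks an account after repeated failures. A quantum speedup does not
    help an online attacker who is only allowed a handful of guesses."""

    def __init__(self, max_failures=5, lockout_seconds=900):
        self.max_failures = max_failures
        self.lockout_seconds = lockout_seconds
        self.failures = {}  # username -> (failure_count, first_failure_time)

    def allow_attempt(self, user, now=None):
        now = time.time() if now is None else now
        count, since = self.failures.get(user, (0, now))
        if count >= self.max_failures and now - since < self.lockout_seconds:
            return False  # locked out: the verifier controls the tempo
        if now - since >= self.lockout_seconds:
            self.failures.pop(user, None)  # lockout window expired, reset
        return True

    def record_failure(self, user, now=None):
        now = time.time() if now is None else now
        count, since = self.failures.get(user, (0, now))
        self.failures[user] = (count + 1, since)

throttle = LoginThrottle()
for _ in range(5):
    throttle.record_failure("alice", now=0.0)
assert not throttle.allow_attempt("alice", now=10.0)   # locked out
assert throttle.allow_attempt("alice", now=1000.0)     # window expired
```

Production systems layer this with step-up authentication, alerting, and breach-password rejection, but the core point stands: online guessing is bounded by policy, not by the attacker’s compute.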

In an offline attack, the attacker has stolen password hashes or encrypted data and can guess locally. That case is more dangerous because the defender no longer controls the rate. Here, password hashing design matters enormously. RFC 9106 describes Argon2 as a memory-hard function for password hashing, and states that Argon2id is the primary variant and must be supported by implementations of the RFC. OWASP’s password storage guidance says stored passwords must remain protected even if the database is compromised.
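
Argon2id itself lives in third-party libraries, but Python’s standard library ships scrypt, another memory-hard KDF, which is enough to sketch the salt-plus-expensive-hash pattern. A minimal sketch with illustrative cost parameters (not a tuning recommendation):

```python
import hashlib, hmac, os

def hash_password(password, salt=None):
    """Salted, memory-hard password hash using stdlib scrypt.
    Each guess costs the attacker real memory (~16 MiB here) and CPU."""
    salt = salt or os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt,
                            n=2**14, r=8, p=1, dklen=32)
    return salt, digest

def verify_password(password, salt, expected):
    _, digest = hash_password(password, salt)
    return hmac.compare_digest(digest, expected)  # constant-time compare

salt, stored = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, stored)
assert not verify_password("Summer2026!", salt, stored)
```

The per-guess cost is the whole game in the offline case: a defender-chosen work factor multiplies the attacker’s bill whether the search is classical or Grover-accelerated.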

That is why mature password defense has layers. You salt each password. You hash with a slow, memory-hard function. You tune parameters so each guess is expensive. You monitor breaches. You reject known-compromised passwords. You use MFA where it matters. None of that becomes irrelevant because Grover exists. Grover is still a search algorithm. It does not remove the per-guess work defenders intentionally impose.

A lot of people secretly hope for the wrong kind of answer here. They want either “quantum kills passwords, abandon all hope” or “quantum does not matter, ignore it.” The honest answer is less theatrical. Quantum pressure makes already-good password engineering more important, not less. If you store passwords badly, you are in trouble today and more trouble later. If you store them well and back them with rate limits and MFA, quantum computing does not flatten that architecture into nothing.

There is another nuance professionals should keep in mind. Passwords often protect symmetric material, not public-key math directly. A local password manager vault, a disk encryption passphrase, or a password-derived archive key is not the same problem as an RSA certificate. That does not make it easy for the attacker. It changes which quantum tool is relevant. In those cases the threat is closer to Grover-style brute-force acceleration, not a Shor-style demolition of the underlying scheme. The design response is to raise entropy, raise per-guess cost, and reduce the attacker’s offline opportunities.

So no, quantum computers do not “break all passwords” as a single class of object. They make bad password practices even less forgivable. That is a different statement, and it happens to be the true one.

PGP is one of the places where the threat feels painfully concrete

If you want a crisp example of quantum risk, look at PGP and OpenPGP: they already contain the split we have been discussing.

OpenPGP is not one algorithm. It is a format and protocol family for combining public-key cryptography, symmetric encryption, signatures, compression, and key management. The current OpenPGP standard is RFC 9580. It requires implementations to support Ed25519 for signatures and X25519 for encryption, and it also requires support for AES-128 on the symmetric side.

That hybrid structure is the key to understanding the quantum problem. When you encrypt a PGP message, the message body is typically protected with a symmetric session key, and that session key is then protected or delivered using public-key mechanisms. Under a future CRQC, the symmetric payload cipher is not the first thing that dies. The public-key envelope and the signature layer are. If an attacker can break the public-key mechanism used to recover the session key, the message is readable. If an attacker can forge the signature scheme, authenticity is gone.
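
That layering is easier to see as data flow. The sketch below is a deliberately insecure toy: a hash-derived XOR keystream stands in for AES, and the same toy cipher stands in for the RSA or X25519 key wrap. The point is only which layer a CRQC attacks:

```python
import hashlib, os

def toy_stream_cipher(key: bytes, data: bytes) -> bytes:
    """Insecure stand-in for a symmetric cipher (AES in real OpenPGP)."""
    stream = hashlib.sha256(key).digest() * (len(data) // 32 + 1)
    return bytes(a ^ b for a, b in zip(data, stream))

session_key = os.urandom(32)                      # per-message symmetric key
body = toy_stream_cipher(session_key, b"the actual message")

# The session key travels under the public-key envelope. A CRQC attacks
# THIS step (Shor vs RSA/ECC), not the symmetric payload above.
envelope_secret = os.urandom(32)                  # toy stand-in for pk math
wrapped_key = toy_stream_cipher(envelope_secret, session_key)

# If the envelope falls, the payload falls with it, regardless of how
# strong the symmetric cipher is:
recovered_key = toy_stream_cipher(envelope_secret, wrapped_key)
assert toy_stream_cipher(recovered_key, body) == b"the actual message"
```

Nothing about the symmetric layer needed to be weak for the message to be read. That asymmetry is the entire harvest-now, decrypt-later story for PGP archives.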

That is why PGP archives are a classic harvest-now, decrypt-later problem. The NCSC warns that attackers may collect and store encrypted data today and decrypt it later if a CRQC becomes available, especially when the information is high value and needs long-term confidentiality. Old email archives, diplomatic mail, legal material, M&A records, research correspondence, intelligence reporting, and long-retention backups all fit that profile.

There is some good news. OpenPGP is not frozen. The IETF has an active draft, Post-Quantum Cryptography in OpenPGP, which extends RFC 9580. The January 2026 draft says it defines a post-quantum public-key algorithm extension for OpenPGP, providing a basis for long-term secure signatures and ciphertexts, with composite public-key encryption based on ML-KEM and composite signatures based on post-quantum algorithms. That matters because it shows the ecosystem is moving from theory to actual wire-format migration.

Still, professionals should resist a comforting half-truth here. The existence of a draft is not the same as having a battle-tested, universally deployed, audited, interoperable post-quantum PGP world. Today’s standardized OpenPGP baseline is still classical. That is the uncomfortable point. If you are protecting information that must stay secret for decades, classical OpenPGP alone should no longer feel like a sufficient long-term answer.

The professional lesson is sharp. PGP is not obsolete because of quantum computing. PGP without a post-quantum migration path is time-limited for long-lived confidentiality and trust. Those are not the same statement. One is defeatist nonsense. The other is exactly what current standards work is trying to solve.

The migration is already happening in real systems

The post-quantum story stopped being speculative when standards and production deployments started landing.

On August 13, 2024, NIST approved the first three Federal Information Processing Standards for post-quantum cryptography: FIPS 203 for ML-KEM, FIPS 204 for ML-DSA, and FIPS 205 for SLH-DSA. NIST’s broader PQC page says these standards can be implemented now to secure a wide range of electronic information. In March 2025, NIST also selected HQC as a backup algorithm for post-quantum encryption, explicitly as a backup to ML-KEM rather than a replacement.

That standards movement has been matched by policy movement. The NCSC’s March 2025 migration guidance gives organizations three milestone dates: by 2028 define goals, discover crypto dependencies, and build an initial plan; by 2031 complete early high-priority migrations and refine the roadmap; by 2035 complete migration to PQC across systems, services, and products. That is not a laboratory curiosity. That is operational planning guidance for real organizations.

Large-scale internet deployment is already visible. Cloudflare wrote in October 2025 that more than half of human-initiated traffic with Cloudflare was using post-quantum encryption, mitigating the store-now-decrypt-later threat. Its documentation also shows hybrid key agreement between Cloudflare and origin servers using X25519 and ML-KEM, while noting awkward real-world issues such as the larger ClientHello sometimes spilling into multiple packets. That is exactly what serious migration looks like: not just new math, but protocol edge cases, middlebox behavior, and compatibility pain.
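
The hybrid construction itself is conceptually small: run both exchanges, then derive the session key from both shared secrets, so the attacker must break both. A minimal sketch using an HKDF-style derivation from the standard library (the placeholder byte strings stand in for real X25519 and ML-KEM outputs; real TLS feeds the concatenated secrets into its own key schedule rather than this exact function):

```python
import hashlib, hmac

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes,
                           context: bytes = b"hybrid-demo") -> bytes:
    """Derive one session key from both exchanges. Breaking only one input
    (say, X25519 via a future CRQC) leaves the output unpredictable as
    long as the other (ML-KEM) holds."""
    ikm = classical_ss + pq_ss
    prk = hmac.new(b"\x00" * 32, ikm, hashlib.sha256).digest()        # extract
    return hmac.new(prk, context + b"\x01", hashlib.sha256).digest()  # expand

x25519_ss = b"\x11" * 32   # placeholder for the classical shared secret
mlkem_ss = b"\x22" * 32    # placeholder for the ML-KEM shared secret
key = combine_shared_secrets(x25519_ss, mlkem_ss)
assert len(key) == 32
assert key != combine_shared_secrets(x25519_ss, b"\x33" * 32)
```

The design choice is defensive on both axes: if the new post-quantum scheme turns out to have a flaw, the classical exchange still protects today’s traffic, and vice versa.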

Messaging systems are moving too. Signal’s PQXDH specification states that the protocol establishes a shared secret with post-quantum forward secrecy, though it still relies on the hardness of the discrete log problem for mutual authentication in the current revision. That sentence is valuable because it captures the real character of this transition: partial upgrades, hybrid designs, and stepwise replacement of vulnerable assumptions rather than a single dramatic cutover.

The tooling layer is catching up as well. OpenSSL 3.5, released in April 2025, added support for ML-KEM, ML-DSA, and SLH-DSA. That is not a guarantee of painless deployment, but it is a very strong signal that post-quantum support is leaving the “research demo” bucket and entering mainstream cryptographic infrastructure.

The beginner should take one message from all this: the migration has started before the quantum emergency has fully arrived. The professional should take a second message: organizations that wait for final certainty will be forced into rushed, brittle, expensive changes under pressure. Standards bodies are buying time. That only helps if people use it.

The professional threat model looks different from the consumer one

A consumer using current mainstream browsers, operating systems, and messaging apps will mostly inherit the transition through software updates. The NCSC says that for users of commodity IT, the switchover should ideally happen seamlessly as software updates land. That is reassuring, but it should not be read as “problem solved.” It means the burden shifts upward into the supply chain, platform vendors, protocol designers, certificate ecosystems, HSM vendors, embedded device makers, and operators of long-lived infrastructure.

For professionals, the first hard problem is discovery. You cannot migrate cryptography you cannot find. The NCSC’s 2025 timeline guidance spends an unusual amount of space on asset discovery for a reason. You need to know which services use public-key cryptography, where certificates live, how trust anchors are embedded, what data must remain confidential for ten or twenty years, which hardware cannot be upgraded, and which third parties or managed service providers own part of the crypto path.

The second hard problem is prioritization by lifetime, not just by current criticality. Data that stays sensitive for three days and data that stays sensitive for thirty years do not face the same quantum risk. A system that rotates short-lived session material and stores little is different from one that signs firmware for industrial devices expected to remain deployed for fifteen years. The NCSC calls out long-lived trust anchors and long-term protection as present-day reasons to care before a CRQC exists.
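
That lifetime reasoning is often compressed into one heuristic, sometimes called Mosca’s inequality: if the years your data must stay confidential plus the years migration will take exceed the years until a CRQC exists, data encrypted today is already exposed. A sketch:

```python
def already_late(shelf_life_years, migration_years, years_until_crqc):
    """Mosca's inequality: x + y > z means data encrypted today will still
    need protection after a CRQC can read it."""
    return shelf_life_years + migration_years > years_until_crqc

# 20-year medical archives, a 5-year migration, CRQC guessed 12 years out:
assert already_late(20, 5, 12)          # harvest-now-decrypt-later wins
# 3-day session data under the same assumptions:
assert not already_late(3 / 365, 5, 12)
```

The CRQC estimate is the unknowable input, which is exactly why the heuristic pushes long-lived data to the front of the migration queue rather than waiting for the estimate to firm up.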

The third problem is protocol fit. Post-quantum algorithms are not neat drop-in replacements in every context. They can have larger keys, larger signatures, more bandwidth cost, more CPU cost, different implementation hazards, and rough interactions with legacy protocols or middleboxes. Cloudflare’s discussion of split ClientHello packets is a perfect micro-example. The math may be fine while the deployed network path still breaks in annoying ways.

The fourth problem is signature diversity. General-purpose signatures might use ML-DSA, while firmware or software signing may be better served by SLH-DSA, LMS, or XMSS depending on assurance and operational constraints. The NCSC recommends ML-DSA for general-purpose use, and treats SLH-DSA, LMS, and XMSS as suitable for cases such as firmware and software signing. NIST SP 800-208 separately approved LMS and XMSS for stateful hash-based signatures.

The fifth problem is crypto agility. This phrase gets abused, but the real need is simple: do not weld your system so tightly to one algorithm family that you need surgery every time standards change. NIST already selected HQC as a backup because healthy cryptographic ecosystems do not bet civilization on one mathematical assumption forever. A professional system should be designed to replace algorithms, parameters, certificate types, and protocol suites without architectural trauma.
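
In code, agility usually reduces to an indirection layer: call sites name an algorithm suite and a registry resolves it, so swapping suites means changing configuration rather than performing surgery. A minimal sketch (stdlib hash functions stand in for KEM and signature suites purely to keep the example self-contained):

```python
import hashlib

REGISTRY = {
    "sha256": hashlib.sha256,
    "sha3_256": hashlib.sha3_256,  # rolling to a new suite touches one table
}

def digest(data: bytes, suite: str = "sha256") -> bytes:
    try:
        algo = REGISTRY[suite]
    except KeyError:
        raise ValueError(f"unknown suite {suite!r}; registered: {sorted(REGISTRY)}")
    return algo(data).digest()

# Call sites name the suite; no algorithm is welded into the call path.
assert digest(b"hello", "sha256") == hashlib.sha256(b"hello").digest()
assert digest(b"hello", "sha3_256") != digest(b"hello", "sha256")
```

The same shape scales up: a registry of key-establishment and signature suites, negotiated per connection and driven by policy, is what lets an organization adopt a backup algorithm like HQC without rearchitecting.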

There is a final point experts should keep close: quantum migration is a security project, not just a cryptography project. The NCSC frames the work as part of broader cyber resilience, and that is right. A rushed migration can create new bugs, downgrade paths, operational blind spots, broken interoperability, and false assurance. The goal is not to sprinkle ML-KEM into a diagram and declare victory. The goal is to preserve confidentiality, authenticity, and recoverability while the mathematical foundation under current PKI changes beneath you.

Security is not an illusion, but permanence is

The easiest answer to the opening question would be “yes, security is an illusion.” It is also lazy. It confuses temporary assumptions with impossibility.

Security has never meant “unbreakable forever.” It has meant expensive enough to attack, controlled enough to trust, monitored enough to respond, and adaptable enough to survive change. Quantum computing does not end that idea. It is simply one of the biggest environmental changes modern cryptography has ever had to absorb.

If you strip away the headlines, the deeper lesson is almost conservative. Strong security still comes from layers. Use public-key algorithms that still deserve trust. Use strong symmetric primitives. Hash passwords with memory-hard functions. Rate-limit login attempts. Use MFA. Patch systems. Rotate keys. Reduce long-term secrets. Inventory dependencies. Upgrade roots of trust. Demand roadmaps from vendors. That was good security yesterday. It remains good security in the quantum transition.

The quantum era does add one new cultural demand: stop treating cryptography as furniture. It is not a static fixture you install once and ignore for a decade. It is a living dependency with shelf life. Organizations that understand that will manage the shift. Organizations that assume RSA and classical signatures can sit untouched forever are already on borrowed time.

So the clean answer is this: a future cryptographically relevant quantum computer would break much of today’s public-key cryptography, would put classical PGP at serious long-term risk, and would force a broad migration already underway. It would not make all passwords fall over at once, and it would not make security itself meaningless. Security is not an illusion. Comfort is.

Author:
Jan Bielik
CEO & Founder of Webiano Digital & Marketing Agency


FAQ

Will a quantum computer instantly break all passwords?

No. Online passwords are still constrained by rate limits, lockouts, monitoring, and MFA, and NIST requires verifiers to limit failed authentication attempts. Offline password cracking remains dangerous, which is why defenders need salted, memory-hard password hashing such as Argon2id.

Is AES broken by quantum computing?

Not in the way RSA or ECDH would be. Symmetric cryptography is affected much less severely, and the NCSC says existing symmetric algorithms with at least 128-bit keys can continue to be used. The urgent migration problem is mostly in the public-key layer.

Why is PGP especially exposed to the quantum threat?

Because OpenPGP combines symmetric encryption with public-key wrapping and signatures. The symmetric payload is not the first thing that fails; the public-key part that protects the session key and the classical signature layer are. Current OpenPGP is standardized in RFC 9580, while post-quantum OpenPGP extensions are still moving through the IETF process.

Can attackers steal encrypted data now and read it later?

Yes, for high-value long-lived data that is the core of the harvest-now, decrypt-later threat. The NCSC warns that attackers may store encrypted data today and decrypt it later if a CRQC becomes available, which is why long-term confidentiality needs earlier migration.

Are real systems already using post-quantum cryptography?

Yes. NIST’s first PQC standards were finalized in August 2024, Cloudflare reported in October 2025 that a majority of human-initiated traffic on its network used post-quantum encryption, Signal published PQXDH, and OpenSSL 3.5 added support for major PQC algorithms.

Do small companies need to rebuild everything now?

Usually not all at once. For commodity IT, the transition will often arrive through vendor updates, but organizations still need discovery, vendor pressure, upgrade planning, and a roadmap. The NCSC’s milestone guidance points to planning now, early priority migrations by 2031, and full migration by 2035.

Is quantum key distribution the answer instead of post-quantum cryptography?

Not for most real-world systems. NSA says it does not recommend quantum key distribution for securing National Security Systems unless major limitations are overcome. The practical mainstream path is post-quantum cryptography running on classical computers and classical networks.

This article is an original analysis supported by the sources cited below.

What Is Quantum Computing?
IBM’s overview of qubits, superposition, interference, entanglement, and the limits of quantum advantage.

Polynomial-Time Algorithms for Prime Factorization and Discrete Logarithms on a Quantum Computer
Peter Shor’s foundational result showing efficient quantum algorithms for factoring and discrete logarithms.

A fast quantum mechanical algorithm for database search
Lov Grover’s original search algorithm paper, the source of the quadratic speedup discussed for brute-force search.

Next steps in preparing for post-quantum cryptography
UK NCSC guidance on the threat model, affected algorithms, migration posture, and recommended post-quantum algorithms.

Timelines for migration to post-quantum cryptography
NCSC milestone guidance with the 2028, 2031, and 2035 target dates for migration work.

Post-quantum cryptography
NIST’s central PQC page with standards status, background, and current algorithm portfolio.

Module-Lattice-Based Key-Encapsulation Mechanism Standard
FIPS 203, the primary NIST standard for post-quantum key establishment using ML-KEM.

Module-Lattice-Based Digital Signature Standard
FIPS 204, NIST’s standard for ML-DSA digital signatures.

Stateless Hash-Based Digital Signature Standard
FIPS 205, NIST’s standard for SLH-DSA, a stateless hash-based post-quantum signature scheme.

NIST releases first 3 finalized post-quantum encryption standards
NIST’s announcement of the first finalized PQC standards in August 2024.

NIST selects HQC as fifth algorithm for post-quantum encryption
NIST’s March 2025 announcement selecting HQC as a backup key-establishment algorithm.

Quantum Readiness: Migration to Post-quantum Cryptography Fact Sheet
Joint fact sheet from CISA, NSA, and NIST urging organizations to begin readiness planning.

Post-Quantum Cybersecurity Resources
NSA’s official post-quantum resource hub, including its position on QKD and CNSA-related guidance.

RFC 9106 Argon2 Memory-Hard Function for Password Hashing and Proof-of-Work Applications
The Argon2 specification used to support the password storage and offline cracking discussion.

Password Storage Cheat Sheet
OWASP guidance on protecting stored passwords even after database compromise.

NIST Special Publication 800-63B
NIST’s digital identity guidance used here for rate-limiting requirements and authentication controls.

RFC 9580 OpenPGP
The current OpenPGP standard describing algorithms, message formats, and interoperability requirements.

Standard
OpenPGP project page pointing to RFC 9580 as the current proposed standard.

Post-Quantum Cryptography in OpenPGP
Current IETF draft extending OpenPGP with post-quantum public-key mechanisms.

The PQXDH Key Agreement Protocol
Signal’s technical specification for post-quantum extended Diffie-Hellman.

State of the post-quantum Internet in 2025
Cloudflare’s October 2025 deployment report showing post-quantum traffic adoption at scale.

Post-quantum between Cloudflare and origin servers
Operational documentation on hybrid X25519 and ML-KEM deployment details and compatibility concerns.

OpenSSL 3.5 Final Release
Release announcement showing PQC algorithm support entering one of the most widely used crypto libraries.

EVP_PKEY-ML-KEM
OpenSSL’s ML-KEM documentation, useful for understanding the implementation surface now available.

Recommendation for Stateful Hash-Based Signature Schemes
NIST guidance for LMS and XMSS, relevant to software and firmware signing in the post-quantum transition.

Quantum error correction below the surface code threshold
Nature paper on a major 2024 error-correction milestone, used to separate real hardware progress from exaggerated claims about immediate cryptographic collapse.