In 1994, Peter Shor — a mathematician at AT&T Bell Labs — published an algorithm that could theoretically shatter the security of the internet. The algorithm could factor large integers in polynomial time, which would break RSA encryption, the mathematical foundation on which most secure digital communication rests. The catch was that it required a computer that did not yet exist: a machine capable of maintaining and manipulating quantum mechanical states across hundreds or thousands of components simultaneously, without those states collapsing from environmental noise. Shor's algorithm was a complete theoretical result. What it required was an engineering achievement of historically unprecedented difficulty.
Three decades later, that engineering challenge remains largely unsolved — but not entirely. Real quantum computers exist. They have demonstrated genuine quantum speedup for specific, narrow problems. Google's 2019 claim of "quantum supremacy," in which a 53-qubit processor completed a specific sampling task in 200 seconds that researchers estimated would take 10,000 years on a classical supercomputer, was disputed in its magnitude (IBM argued the classical estimate was wrong by many orders of magnitude) but not in its basic physics. Quantum computers are real devices that exploit real quantum phenomena. The question is not whether they work but what they are actually good for, how far they are from being useful for cryptographic attacks, and what the honest timeline is.
Understanding quantum computing requires confronting three common misconceptions that distort nearly every popular account. First: qubits are not simply "both 0 and 1 at the same time" — they are quantum systems whose states are described by probability amplitudes, and the full meaning of that requires more precision. Second: quantum computers do not work by trying all possible answers simultaneously — they work by using interference to amplify the probability of correct answers. Third: breaking RSA encryption requires not the 50-to-1,000 noisy qubits that current machines have, but millions of fault-tolerant qubits operating at error rates far below anything achieved today. The gap between where quantum computing is and where it would need to be to threaten current encryption is not a near-term engineering problem. It is a fundamental scientific challenge.
"Quantum computers are not magic. They are machines that exploit very specific mathematical structures in nature to solve very specific mathematical problems faster than we know how to solve them classically. Everything else is hype." — John Preskill, Quantum Computing in the NISQ Era and Beyond (2018)
Key Definitions
| Concept | Classical Computing Analog | Quantum Distinction | Practical Significance |
|---|---|---|---|
| Qubit | Classical bit (0 or 1) | Described by probability amplitudes; collapses to 0 or 1 on measurement | More representational capacity per physical unit in theory; error-prone in practice |
| Superposition | N/A | Pre-measurement state is a linear combination of basis states | Enables quantum algorithms to process information about multiple states simultaneously |
| Entanglement | N/A | Correlated qubits whose states cannot be described independently | Enables quantum communication protocols and increases computational power in some algorithms |
| Quantum interference | N/A | Probability amplitudes can cancel or reinforce, allowing amplification of correct answers | The mechanism by which quantum algorithms achieve speedup |
| Decoherence | N/A | Qubits losing their quantum properties through environmental interaction | The primary engineering challenge preventing scalable quantum computation today |
| Fault-tolerant quantum computing | Error-corrected classical computing | Requires thousands of physical qubits per logical qubit to achieve reliable computation | Current machines are NISQ (noisy intermediate-scale quantum); fault tolerance is years away |
Qubit — A quantum bit, the basic unit of quantum information. Unlike a classical bit, which is always either 0 or 1, a qubit's state before measurement is described by probability amplitudes: a complex linear combination of basis states |0> and |1>, written as alpha|0> + beta|1>, where |alpha|^2 + |beta|^2 = 1. Measuring the qubit collapses it irreversibly to either 0 or 1, with probabilities given by the squared amplitudes.
Superposition — The quantum mechanical property by which a qubit's pre-measurement state is a weighted combination of 0 and 1 described by probability amplitudes. This is not "being both at once" in a classical sense; it is a description of quantum indeterminacy that governs measurement outcomes.
Entanglement — A correlation between two or more qubits such that the quantum state of each cannot be described independently. Measuring one entangled qubit instantly determines correlated properties of its partner regardless of physical separation. This is not faster-than-light communication; no classical information is transferred.
Quantum interference — The process by which probability amplitudes for different computational paths add constructively or cancel destructively. Well-designed quantum algorithms use interference to amplify the probability of correct answers and suppress incorrect ones. This, not parallelism, is the source of quantum speedup.
Decoherence — The process by which a qubit loses its quantum state through interaction with the environment. Heat, vibration, and stray electromagnetic fields all cause decoherence. It is the central engineering obstacle to practical quantum computing.
Quantum error correction — A technique for encoding one logical qubit across many physical qubits so that errors can be detected and corrected without collapsing the quantum state. Current best estimates require roughly 1,000 physical qubits per fault-tolerant logical qubit using surface code methods.
NISQ era — Noisy Intermediate-Scale Quantum, a term coined by John Preskill in 2018 to describe the current period in which quantum processors have 50 to 1,000 qubits with error rates too high for fault-tolerant computation.
Post-quantum cryptography — Classical cryptographic algorithms designed to resist attacks from both classical and quantum computers, based on mathematical problems believed hard for both. NIST finalized post-quantum standards in 2024.
What Quantum Mechanics Actually Says
Superposition Without the Misconception
The popular description of superposition — that a qubit is "both 0 and 1 at the same time" — is not exactly wrong but is seriously misleading in ways that matter for understanding what quantum computers can and cannot do.
The accurate description: a qubit is a quantum system whose state before measurement is the complex linear combination alpha|0> + beta|1>. The numbers alpha and beta are probability amplitudes — complex numbers whose squared magnitudes give the probability of each measurement outcome. When you measure the qubit, it collapses to |0> with probability |alpha|^2 or to |1> with probability |beta|^2. After measurement, the superposition is gone. You get one definite bit.
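The collapse rule can be made concrete with a small simulation. The sketch below (illustrative Python with NumPy, not real quantum hardware) prepares the state (|0> + i|1>)/sqrt(2) and samples repeated measurements:

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# The state alpha|0> + beta|1> as a length-2 complex vector.
alpha, beta = 1 / np.sqrt(2), 1j / np.sqrt(2)
state = np.array([alpha, beta])

# Amplitudes must be normalized: |alpha|^2 + |beta|^2 = 1.
assert np.isclose(np.sum(np.abs(state) ** 2), 1.0)

# Each measurement collapses the state to 0 or 1; the probabilities
# are the squared magnitudes of the amplitudes (here 0.5 and 0.5).
probs = np.abs(state) ** 2
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print(outcomes.mean())  # close to |beta|^2 = 0.5
```

Each run yields a stream of definite bits; the amplitudes are only visible statistically, across many repetitions.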
A register of n qubits has a state space of dimension 2^n. Fifty qubits can be in a superposition of roughly one quadrillion states simultaneously. This is what people mean when they say quantum computers are exponentially more powerful — but the qualification is essential. Measuring the 50-qubit register yields exactly one of those quadrillion states, sampled according to the probability distribution encoded in the amplitudes. Simply putting a register into a large superposition is no more computationally useful than rolling a trillion-sided die. The useful work happens in how the amplitudes are manipulated before measurement.
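One way to feel the exponential growth is to compute the classical memory needed just to store a full state vector (a back-of-the-envelope Python sketch, assuming 16-byte complex128 amplitudes):

```python
# Classical memory needed to store a full n-qubit state vector:
# 2^n complex amplitudes at 16 bytes each (complex128).
def statevector_bytes(n_qubits: int) -> int:
    return (2 ** n_qubits) * 16

for n in (10, 30, 50):
    print(f"{n} qubits: {statevector_bytes(n):,} bytes")

# 10 qubits fits in 16 KiB; 30 qubits already needs 16 GiB;
# 50 qubits needs 2^54 bytes, about 16 PiB -- beyond any single machine.
```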
Entanglement and the Bell Tests
The physical reality of entanglement — and the impossibility of explaining it with hidden local variables — was settled experimentally, definitively, by a sequence of experiments beginning in the 1970s and culminating in loophole-free tests in 2015.
John Bell's 1964 theorem derived a mathematical inequality: any theory based on local hidden variables (Einstein's preferred explanation for quantum correlations) predicts that correlations between measurements on separated particles are bounded by a specific value. Quantum mechanics predicts violations of this bound. The two predictions are distinct and testable.
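The CHSH form of Bell's inequality makes the testable difference concrete. The sketch below (Python, using the textbook singlet-state correlation E(a,b) = -cos(a-b)) evaluates the CHSH quantity at the standard measurement angles:

```python
import numpy as np

def E(a: float, b: float) -> float:
    """Quantum correlation for spin measurements at angles a and b
    on a singlet (maximally entangled) pair: E(a, b) = -cos(a - b)."""
    return -np.cos(a - b)

# Standard CHSH angle choices for the two parties.
a, a2 = 0.0, np.pi / 2
b, b2 = np.pi / 4, 3 * np.pi / 4

# Any local hidden variable theory satisfies |S| <= 2.
S = E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)
print(abs(S))       # 2*sqrt(2), approximately 2.828
print(abs(S) > 2)   # True: quantum mechanics violates the classical bound
```

The quantum prediction of 2*sqrt(2) versus the classical bound of 2 is precisely the gap the Aspect and Hensen experiments measured.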
Alain Aspect and colleagues performed decisive Bell test experiments at the Institut d'Optique in 1982, using entangled photons and rapidly switching detector orientations. They observed violations of Bell inequalities consistent with quantum mechanics and inconsistent with any local hidden variable theory. Aspect, John Clauser, and Anton Zeilinger were awarded the 2022 Nobel Prize in Physics for this line of work.
The most rigorous confirmation came in 2015, when Bas Hensen and colleagues at Delft University of Technology published a loophole-free Bell test in Nature. By entangling nitrogen-vacancy centers in diamonds separated by 1.3 kilometers and closing both the locality and detection loopholes simultaneously, they demonstrated Bell inequality violations at statistical significance levels that excluded all major objections. Quantum entanglement is not a theoretical speculation. It is an experimentally confirmed feature of physical reality that cannot be explained by any classical mechanism.
For quantum computing, entanglement is not merely philosophically interesting — it is computationally essential. Entangled qubits allow quantum algorithms to establish correlated states across the entire register that can be manipulated coherently, enabling the interference effects that drive quantum speedup. Without entanglement, quantum algorithms degrade to classical computation.
Quantum Interference: The Actual Source of Speedup
The correct way to understand quantum computational power is through interference, not simultaneous parallelism. When a quantum algorithm evolves a superposition of states through a sequence of quantum gates, the probability amplitudes of those states interfere. Some amplitudes add constructively, growing larger; others cancel destructively, approaching zero.
A well-designed quantum algorithm is engineered so that correct answers accumulate constructively interfering amplitude while incorrect answers cancel. When the system is finally measured, the correct answer emerges with high probability. This is fundamentally different from running many computations in parallel. If you could simply read off all 2^50 results from a 50-qubit register, quantum computing would be essentially magic. You cannot. What makes a quantum algorithm powerful is the specific interference structure its author has engineered to concentrate probability on useful outputs.
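The simplest possible demonstration of interference is applying the Hadamard gate twice (an illustrative NumPy sketch): the two computational paths leading to |1> carry amplitudes +1/2 and -1/2 and cancel exactly, returning the qubit to |0> with certainty.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
ket0 = np.array([1.0, 0.0])                   # the state |0>

after_one = H @ ket0       # equal superposition: amplitudes [0.707, 0.707]
after_two = H @ after_one  # paths to |1> cancel; paths to |0> reinforce

print(np.round(after_one, 3))  # [0.707 0.707]
print(np.round(after_two, 3))  # [1. 0.] -- measured as 0 with certainty
```

A classical probabilistic process cannot do this: mixing two 50/50 coin flips never produces a certain outcome. Cancellation requires signed (in general, complex) amplitudes.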
This means quantum speedup is not general. It applies only to problems with mathematical structure that can be exploited through interference engineering. Problems without this structure — the vast majority of computation — receive no quantum advantage.
Quantum Algorithms That Matter
Shor's Algorithm: Why Cryptographers Pay Attention
Shor's 1994 algorithm factors large integers in polynomial time by reducing the factoring problem to period-finding. Finding the period of a modular exponential function is exponentially hard for classical computers but can be done in polynomial time using quantum phase estimation and the quantum Fourier transform. The algorithm is one of the most elegant results in theoretical computer science.
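The reduction can be illustrated classically. The sketch below (plain Python; the period-finding step is brute force here, which is exactly the part a quantum computer replaces with phase estimation and the quantum Fourier transform) factors N = 15 from the period of 7^x mod 15:

```python
from math import gcd

def find_period(a: int, N: int) -> int:
    """Find the period r of f(x) = a^x mod N, i.e. the smallest r > 0
    with a^r = 1 (mod N). Brute force here; quantum on real hardware."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N: int, a: int):
    """Shor's classical post-processing: turn a period into factors."""
    r = find_period(a, N)
    if r % 2 != 0:
        return None            # odd period: retry with a different a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None            # trivial square root: retry with another a
    return gcd(y - 1, N), gcd(y + 1, N)

print(shor_classical(15, 7))   # period of 7^x mod 15 is 4 -> factors (3, 5)
```

For a 2048-bit N this brute-force loop would take longer than the age of the universe; the entirety of Shor's speedup lives in replacing it.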
RSA encryption relies on the asymmetric difficulty of multiplication versus factoring. Multiplying two large primes is computationally trivial; factoring their product is believed computationally infeasible for classical machines at the key sizes currently in use. RSA-2048, the standard used to protect most internet communication, relies on the infeasibility of factoring a 2048-bit number. Shor's algorithm eliminates this asymmetry for quantum hardware.
The catch is the hardware. Running Shor's algorithm on RSA-2048 at the scale needed to factor it in practical time requires approximately 4,000 fault-tolerant logical qubits. Each logical qubit, using current best estimates for surface code error correction, requires roughly 1,000 physical qubits operating at gate fidelities far above anything demonstrated today. The total physical qubit count needed is on the order of 4 million — compared to the 1,121 physical qubits in IBM's best current processor. This is not an incremental gap. It is a categorical difference that will require qualitative improvements in qubit coherence, gate fidelity, and error correction overhead.
Grover's Algorithm: Quadratic Speedup, Manageable Implications
In 1996, Lov Grover published an algorithm for searching an unsorted database of N items in O(sqrt(N)) time, compared to the classical O(N). Unlike Shor's exponential speedup, Grover's advantage is quadratic — large but bounded. It is also proven optimal: no quantum algorithm can do better for unstructured search.
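The algorithm itself is simple enough to simulate on a state vector (an illustrative NumPy sketch): repeatedly flip the sign of the marked item's amplitude (the oracle) and invert all amplitudes about their mean (the diffusion step). After about (pi/4)*sqrt(N) iterations the marked item dominates the distribution.

```python
import numpy as np

def grover(n_items: int, marked: int, iterations: int) -> np.ndarray:
    """Simulate Grover's algorithm: start in a uniform superposition,
    then alternate the oracle and the diffusion operator."""
    state = np.full(n_items, 1 / np.sqrt(n_items))
    for _ in range(iterations):
        state[marked] *= -1               # oracle: flip the marked amplitude
        state = 2 * state.mean() - state  # diffusion: invert about the mean
    return state

N, marked = 16, 3
k = int(round(np.pi / 4 * np.sqrt(N)))    # optimal iteration count, here 3
probs = grover(N, marked, k) ** 2
print(k, round(probs[marked], 3))         # 3 0.961
```

Three iterations — roughly sqrt(16) — concentrate over 96 percent of the probability on the marked item, versus the 1/16 chance of a single classical guess.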
Grover's algorithm has real cryptographic implications. AES-128 has an effective security level of 64 bits against a quantum adversary using Grover's algorithm, which halves the bit security of symmetric ciphers. The practical response is straightforward: move to AES-256, which provides 128-bit post-quantum security. This is a manageable adjustment within existing cryptographic practice.
The more significant point is comparative. Grover's quadratic speedup is provably optimal and has been demonstrated on small instances; Shor's exponential speedup is just as real in theory, and has likewise been run only on toy numbers. But the hardware requirements differ completely: Grover-style amplitude amplification can run on near-term devices for small problems, while running Shor's algorithm against real key sizes requires fault-tolerant hardware that does not yet exist.
The Engineering Reality: Decoherence and Error Correction
Why Building Qubits Is Hard
Physical qubits are implemented in several competing technologies: superconducting circuits (IBM, Google), trapped ions (IonQ, Quantinuum), photonic qubits (PsiQuantum), and nitrogen-vacancy centers in diamond. Each has different coherence times, gate speeds, and connectivity characteristics. All face the same fundamental enemy: decoherence.
Decoherence occurs when a qubit's quantum state becomes entangled with its environment — heat, vibration, stray photons, electromagnetic noise — collapsing the superposition and destroying the quantum information. Current superconducting qubits at IBM and Google operate near 15 millikelvin (colder than deep space) and maintain coherence for 50 to 500 microseconds. A deep quantum algorithm for a practical problem requires many thousands of gate operations within this window before errors accumulate beyond recovery.
Current two-qubit gate fidelities are approximately 99.0 to 99.5 percent for the best superconducting systems. An error rate of 0.5 percent per gate sounds acceptable, but quantum algorithms for cryptographic problems require millions of gate operations. Even at 99.5 percent fidelity per gate, the probability that a 10,000-gate circuit runs without a single error is 0.995 raised to the 10,000th power — roughly 10^-22, indistinguishable from zero. Without error correction, the result is almost certainly wrong.
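The compounding is easy to check directly (a quick Python sketch):

```python
# Probability that a circuit of g gates runs with no error at all,
# given a per-gate success probability (fidelity) f, is f ** g.
fidelity = 0.995
for gates in (100, 1_000, 10_000):
    print(f"{gates:>6} gates: {fidelity ** gates:.2e}")

# 100 gates:    ~6.1e-01  (usually survives)
# 1,000 gates:  ~6.7e-03  (almost always fails)
# 10,000 gates: ~1.7e-22  (effectively never succeeds)
```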
Quantum Error Correction: The 1,000-to-1 Problem
Quantum error correction works by encoding one logical qubit across many physical qubits using a redundant code. The most promising approach is the surface code, which detects and corrects errors by measuring stabilizer operators across neighboring qubits without collapsing the logical state. The overhead is substantial.
Achieving a logical qubit with error rate below the threshold needed for practical computation, using the surface code with current physical qubit error rates around 0.5 percent, requires approximately 1,000 physical qubits per logical qubit. Shor's algorithm for RSA-2048 needs roughly 4,000 logical qubits. The arithmetic: 4,000 logical qubits times 1,000 physical qubits per logical qubit equals approximately 4 million physical qubits.
IBM's Condor processor achieved 1,121 physical qubits in 2023. The distance between 1,121 and 4,000,000 is not a near-term engineering challenge. It represents a roughly 3,600-fold increase in qubit count alongside qualitative improvements in gate fidelity that have not yet been demonstrated at any scale. Google's Willow processor, announced in December 2024, demonstrated that logical error rates decrease as more physical qubits are added to the surface code — a necessary condition for scalable error correction and a genuine milestone — but it is not yet a practical computation. The honest consensus among leading researchers is that cryptographically relevant quantum computers are 10 to 20 or more years away.
What Quantum Computers Are Actually Useful For Now
Molecular Simulation: Feynman's Original Motivation
In 1982, Richard Feynman proposed that a computer built from quantum components could simulate quantum systems efficiently — that the best simulator of a quantum system is another quantum system. Classical computers cannot efficiently simulate quantum mechanics because the state space of a system of N particles grows exponentially with N. A classical simulation of a molecule with 50 electrons requires representing and manipulating a state space of 2^50 amplitudes, which quickly exceeds the memory and computation available on any classical machine.
Quantum computers with a few hundred reliable logical qubits could in principle simulate molecular systems of direct practical relevance to drug discovery, battery materials, nitrogen fixation catalysts, and high-temperature superconductor design. These are problems where quantum mechanics is being used to study quantum phenomena — exactly the application domain Feynman identified. This is the most compelling near-term application for quantum hardware, and it does not require breaking RSA or achieving general quantum supremacy.
Quantum Key Distribution
Quantum key distribution (QKD) is already commercially deployed and does not require fault-tolerant computation. QKD protocols such as BB84 use the quantum mechanical principle that measuring a quantum system disturbs it: any eavesdropper on a QKD exchange leaves detectable traces, making it physically impossible to intercept a key without being detected. The security derives from physical law, not computational hardness.
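The detection mechanism can be illustrated with a toy simulation (NumPy; an idealized noiseless channel, not a real BB84 implementation): an intercept-resend eavesdropper who must guess measurement bases introduces roughly 25 percent errors into the sifted key, while a clean channel introduces none.

```python
import numpy as np

rng = np.random.default_rng(seed=7)

def measure(bits, photon_bases, meas_bases):
    """Measuring in the photon's own basis returns its bit exactly;
    measuring in the wrong basis gives a uniformly random outcome."""
    wrong = photon_bases != meas_bases
    out = bits.copy()
    out[wrong] = rng.integers(0, 2, wrong.sum())
    return out

def bb84_error_rate(n: int, eavesdrop: bool) -> float:
    alice_bits = rng.integers(0, 2, n)
    alice_bases = rng.integers(0, 2, n)   # 0 = rectilinear, 1 = diagonal

    photon_bits, photon_bases = alice_bits, alice_bases
    if eavesdrop:  # intercept-resend: Eve measures, re-sends in her bases
        eve_bases = rng.integers(0, 2, n)
        photon_bits = measure(photon_bits, photon_bases, eve_bases)
        photon_bases = eve_bases

    bob_bases = rng.integers(0, 2, n)
    bob_bits = measure(photon_bits, photon_bases, bob_bases)

    # Sifting: keep only positions where Bob guessed Alice's basis.
    sift = bob_bases == alice_bases
    return float(np.mean(alice_bits[sift] != bob_bits[sift]))

print(bb84_error_rate(20_000, eavesdrop=False))  # 0.0: clean channel
print(bb84_error_rate(20_000, eavesdrop=True))   # ~0.25: Eve leaves traces
```

Alice and Bob detect Eve by publicly comparing a random sample of the sifted key: any error rate far above the channel's baseline reveals the interception.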
Commercial QKD networks operate at scale in China, where the Beijing-Shanghai quantum backbone extends 2,000 kilometers. European and Japanese networks are also deployed. The limitation is distance: quantum signals attenuate in optical fiber and cannot be amplified by classical repeaters without destroying the quantum state. Quantum repeaters that extend range using entanglement are a near-term research priority but not yet commercially available.
Post-Quantum Cryptography: Deploying Now
The most practically urgent consequence of quantum computing research is not that current encryption will be broken imminently but that the migration to quantum-resistant algorithms must begin now. Adversaries can collect encrypted communications today and decrypt them later when quantum computers become available — a "harvest now, decrypt later" attack strategy. Data with long-term sensitivity is vulnerable to this strategy even if the quantum computer needed to break it does not yet exist.
In 2024 NIST finalized its first post-quantum cryptographic standards. CRYSTALS-Kyber (now formally ML-KEM) is a lattice-based key encapsulation mechanism. CRYSTALS-Dilithium (now ML-DSA) is a lattice-based digital signature scheme. Both are based on mathematical problems in lattice theory — specifically the Learning With Errors problem — believed to be hard for both classical and quantum computers. Google began deploying CRYSTALS-Kyber in Chrome in 2023. Apple introduced post-quantum protections in iMessage with the PQ3 protocol in 2024.
The migration challenge is comparable in scope to the Y2K transition: essentially all internet infrastructure that implements RSA or elliptic-curve cryptography will need updating. The process will take years to decades across global IT infrastructure, which is precisely why NIST moved urgently and why major technology companies are already deploying the new standards.
For more on how current encryption works and what post-quantum migration means in practice, see "What Is Encryption". For the relationship between quantum computing and AI development trajectories, see "AI Safety and Alignment Challenges".
Honest Assessment: The NISQ Era and What Comes Next
The public discourse about quantum computing oscillates between two failure modes. Dismissal — "it is just academic research, it will never work" — ignores the genuine physics and engineering progress of the past decade. Breathless acceleration — "quantum computers will transform everything within five years" — ignores the categorical gap between current NISQ devices and the fault-tolerant systems that would be needed for most claimed applications.
The honest picture, as of 2025: quantum computing is real and progressing. The underlying physics is not in question. Genuine quantum speedup has been demonstrated for specific problems. The engineering trajectory is clear and positive. But the distance between today's noisy 50-to-1,000 qubit machines and the fault-tolerant systems needed for cryptographically relevant computation requires qualitative breakthroughs in qubit coherence, gate fidelity, and error correction overhead that leading researchers estimate will take at minimum a decade and possibly considerably longer.
Google's 2024 demonstration with the Willow processor that logical error rates decrease as the surface code is expanded with more physical qubits — below-threshold scaling — is a genuine and significant milestone. It proves that the surface code approach to error correction is physically viable, not merely theoretical. But it is not yet a fault-tolerant logical qubit. It is a demonstration that the right ingredients are present. Assembling those ingredients into a system that can run Shor's algorithm on cryptographic key sizes remains a multi-decade project at best.
The meaningful milestones between today and cryptographically relevant quantum computing are: sustained operation of a single fault-tolerant logical qubit through a full algorithm run; demonstration of tens of logical qubits with below-threshold error rates; and quantum advantage on a practically useful application rather than a benchmark task. None of these has been achieved as of 2025. The transition from demonstration to deployment at cryptographic scale involves engineering challenges that have not yet been confronted at the necessary scale.
The appropriate response for organizations managing long-lived sensitive data is not panic but preparation: begin deploying post-quantum cryptographic standards where they are available, prioritize the migration of systems that protect data with multi-decade sensitivity, and track the meaningful engineering milestones rather than the headline qubit counts.
References
- Shor, P. W. (1994). Algorithms for quantum computation: discrete logarithms and factoring. Proceedings 35th Annual Symposium on Foundations of Computer Science, 124-134.
- Grover, L. K. (1996). A fast quantum mechanical algorithm for database search. Proceedings of the 28th Annual ACM Symposium on Theory of Computing, 212-219.
- Arute, F., et al. (2019). Quantum supremacy using a programmable superconducting processor. Nature, 574(7779), 505-510. https://doi.org/10.1038/s41586-019-1666-5
- Aspect, A., Grangier, P., & Roger, G. (1982). Experimental realization of Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: a new violation of Bell's inequalities. Physical Review Letters, 49(2), 91-94. https://doi.org/10.1103/PhysRevLett.49.91
- Preskill, J. (2018). Quantum computing in the NISQ era and beyond. Quantum, 2, 79. https://doi.org/10.22331/q-2018-08-06-79
- Bell, J. S. (1964). On the Einstein-Podolsky-Rosen paradox. Physics, 1(3), 195-200.
- Hensen, B., et al. (2015). Loophole-free Bell inequality violation using electron spins separated by 1.3 kilometres. Nature, 526(7575), 682-686. https://doi.org/10.1038/nature15759
- Feynman, R. P. (1982). Simulating physics with computers. International Journal of Theoretical Physics, 21(6-7), 467-488.
- NIST. (2024). Post-Quantum Cryptography Standardization: Final Standards. Federal Information Processing Standards 203, 204, 205.
Frequently Asked Questions
What is a qubit and how is it different from a classical bit?
A classical bit is always either 0 or 1. A qubit is a quantum system whose pre-measurement state is described by probability amplitudes — it can exist in a superposition of 0 and 1 until measured, at which point it collapses to a definite value. This is not the same as being 'both at once'; it means the qubit's state is a weighted combination of both possibilities, and the weights determine measurement probabilities.
Does a quantum computer try all answers simultaneously?
No. This is a common misconception. A quantum computer does not evaluate all solutions at once. Instead, quantum algorithms use interference — amplifying the probability amplitude of correct answers and suppressing incorrect ones — so that when the system is measured, the right answer appears with high probability. Without a clever algorithm designed to exploit interference, a quantum computer offers no advantage over a classical one.
What is Shor's algorithm and why does it matter for encryption?
Shor's algorithm, developed by Peter Shor in 1994, factors large integers in polynomial time using quantum phase estimation and the quantum Fourier transform. Classical factoring algorithms require exponential time, which is the security foundation of RSA encryption. A quantum computer running Shor's algorithm at scale could break RSA-2048, but this requires thousands of fault-tolerant logical qubits — far beyond current hardware.
How close are we to a quantum computer that can break RSA encryption?
Cryptographically relevant quantum computers — those capable of breaking RSA-2048 in practical timeframes — are estimated to be 10 to 20 or more years away. Current quantum processors have 50 to 1,000 noisy physical qubits. Breaking RSA-2048 with Shor's algorithm requires roughly 4,000 fault-tolerant logical qubits, each requiring approximately 1,000 physical qubits for error correction — totaling millions of physical qubits with error rates far below current capabilities.
What is quantum decoherence and why is it the central engineering challenge?
Quantum decoherence is the process by which a qubit loses its quantum state through interaction with its environment — heat, vibration, stray electromagnetic fields. Current qubits maintain coherence for microseconds to milliseconds. Useful computation requires many gate operations before decoherence destroys the quantum state. Quantum error correction can mitigate this but requires roughly 1,000 physical qubits per reliable logical qubit, which is the central barrier to practical quantum computing.
What are quantum computers actually good at today?
Near-term quantum computers show genuine promise for simulating quantum systems (molecular dynamics for drug discovery and materials science, as Richard Feynman proposed in 1982), quantum key distribution for cryptographic security, and quantum sensing and metrology. These are applications where quantum mechanics is being used to study or interact with quantum phenomena directly, rather than replace classical computation for general tasks.
What is post-quantum cryptography and is it already being deployed?
Post-quantum cryptography refers to classical encryption algorithms designed to be secure against quantum attacks. In 2024 NIST finalized standards for post-quantum algorithms including CRYSTALS-Kyber for key encapsulation and CRYSTALS-Dilithium for digital signatures. These algorithms are based on mathematical problems — lattice problems — believed to be hard for both classical and quantum computers. Major technology companies and governments are already beginning deployment.