In 1905, Albert Einstein published four papers, any one of which would have made a physicist famous for a lifetime. All four appeared in a single year, while he worked as a patent clerk in Bern, at the age of 26. One paper explained Brownian motion, vindicating the atomic theory of matter. One introduced special relativity, rewriting our understanding of space and time. One derived the equation E = mc². And one explained the photoelectric effect — the puzzling experimental finding that light knocks electrons out of metal only if the light exceeds a threshold frequency, regardless of its intensity. Einstein's explanation was that light comes not in continuous waves but in discrete packets he called quanta. The energy of each packet is proportional to the light's frequency, not its intensity. It was this paper, not relativity, for which Einstein received the Nobel Prize in 1921.
Einstein had unwittingly launched a revolution that would spend the next twenty years destroying his worldview. The quantum theory that Niels Bohr, Werner Heisenberg, Erwin Schrodinger, Max Born, Paul Dirac, and others built on his insight turned out to imply things that Einstein found intolerable: that physical reality at the quantum level is irreducibly probabilistic, that particles do not have definite properties until they are measured, and that two particles can remain correlated across arbitrary distances in a way that seemed to violate his deepest convictions about the locality of physical causation. "God does not play dice," he reportedly said. Bohr's response has become equally famous: "Einstein, stop telling God what to do."
Einstein spent the last thirty years of his life in pursuit of a unified field theory that would restore determinism and locality to physics, and he failed. The dice-playing God won. Quantum mechanics, despite Einstein's objections and despite a century of experimental scrutiny, has never been falsified. Its predictions have been confirmed to greater precision than any other theory in science. The anomalous magnetic moment of the electron — a quantity that quantum electrodynamics predicts and experiments measure — agrees to eleven significant figures. And every technology of the modern world, from the microprocessor to the laser to the MRI machine, depends on quantum mechanical principles that work exactly as the theory predicts.
"I think I can safely say that nobody understands quantum mechanics." — Richard Feynman, The Character of Physical Law (1965)
Key Definitions
Quantum: The smallest discrete unit of a physical quantity. Energy is quantized, meaning it can only be exchanged in discrete multiples of a minimum unit related to Planck's constant.
Wave-particle duality: The phenomenon by which quantum objects such as electrons and photons exhibit both wave-like properties (interference, diffraction) and particle-like properties (localized detection events), depending on the experimental context.
Wavefunction: A mathematical function, typically denoted psi, that encodes the complete quantum state of a system. The square of its magnitude at any point gives the probability density for finding the particle at that location.
Superposition: The quantum mechanical principle that a system can exist in a combination of multiple states simultaneously, until a measurement forces it into one definite state.
Heisenberg uncertainty principle: The fundamental quantum mechanical constraint that the product of the uncertainties in position and momentum of a particle cannot be less than h-bar divided by two.
Quantum entanglement: The phenomenon by which two or more quantum particles become correlated such that the state of each cannot be described independently, regardless of the spatial separation between them.
Copenhagen interpretation: The dominant historical interpretation of quantum mechanics, holding that the wavefunction describes probabilities of measurement outcomes and collapses to a definite value upon measurement.
Many-worlds interpretation: Hugh Everett's 1957 interpretation that the wavefunction never collapses; instead, all outcomes of a measurement occur on separate branches of a universal wavefunction.
Schrodinger's cat: A 1935 thought experiment by Erwin Schrodinger illustrating the apparent paradox of applying quantum superposition to macroscopic objects.
Planck constant: The fundamental physical constant, denoted h, relating a photon's energy to its frequency. Its reduced form h-bar equals h divided by 2 pi.
Spin: An intrinsic quantum mechanical property of particles, analogous to angular momentum but with no classical equivalent. Electrons have spin-1/2, meaning they can be in one of two spin states (up or down) along any chosen axis.
Pauli exclusion principle: The quantum mechanical principle that no two identical fermions (particles with half-integer spin) can occupy the same quantum state simultaneously. It explains the structure of atoms and the stability of matter.
Quantum tunneling: The quantum mechanical phenomenon by which a particle can pass through an energy barrier that it classically could not surmount. Essential for nuclear fusion, radioactive decay, and transistor operation.
Decoherence: The process by which quantum superpositions of macroscopic objects rapidly decay due to interactions with the environment, explaining why we do not observe superpositions at everyday scales.
Interpretations of Quantum Mechanics Compared
| Interpretation | What happens to the wavefunction | Is measurement special? | Locality | Determinism | Key figure | Key problem |
|---|---|---|---|---|---|---|
| Copenhagen | Collapses to a definite state upon measurement | Yes — measurement is a primitive, fundamental act | Local (in outcomes) | No — intrinsically probabilistic | Bohr, Heisenberg | "Measurement" is undefined; boundary between quantum/classical unclear |
| Many-worlds (Everett) | Never collapses; all outcomes occur in branching parallel worlds | No — decoherence, not measurement, explains apparent collapse | Local | Deterministic (global wavefunction evolves unitarily) | Everett (1957); DeWitt | No mechanism to derive Born rule (probability law) from branching; ontologically extravagant |
| Pilot wave / de Broglie-Bohm | Never collapses; it is a real field guiding particles whose positions are always definite | No — position is always definite | Nonlocal (but no signaling) | Deterministic | de Broglie; Bohm (1952) | Nonlocality; awkward in relativistic quantum field theory |
| Objective collapse (GRW) | Randomly collapses spontaneously at low rate; more particles → faster collapse | No — collapse is physical, not observer-induced | Nonlocal in collapse events | No — collapse is stochastic | Ghirardi, Rimini, Weber (1986) | Parameters of collapse rate are ad hoc; no empirical confirmation yet |
| Relational QM | Wavefunction describes relations between systems; facts are relative | No — all interactions are measurements relative to a system | Local | No | Rovelli (1996) | Counter-intuitive ontology; contested what "relative facts" means |
| QBism | Wavefunction is an agent's belief state about future experiences; not ontic | Yes — measurement is an agent's action updating beliefs | Local | No | Fuchs, Mermin | Science-as-prediction vs. science-as-explanation tension; subjective flavor |
The Problem Quantum Mechanics Solved
By the late 19th century, classical physics appeared almost complete. Newton's mechanics and Maxwell's electromagnetism together seemed to describe every physical phenomenon. But three experimental puzzles stubbornly resisted classical explanation, and their resolution required an entirely new framework.
The first was blackbody radiation. A blackbody — an idealized object that absorbs and re-emits all radiation — emits a characteristic spectrum of electromagnetic radiation at any given temperature. Classical physics, in the calculation known as the Rayleigh-Jeans law, predicted that the energy emitted would increase without bound at high frequencies — the "ultraviolet catastrophe." No real blackbody does this; real objects emit less energy at high frequencies, producing a peaked spectrum. Max Planck solved this in 1900 by making a desperate mathematical assumption: he postulated that the energy of the vibrating atoms in the blackbody walls could only take discrete values, multiples of a fundamental unit proportional to frequency. Planck considered this a mathematical trick, not a physical statement. But the formula worked — it reproduced the observed blackbody spectrum exactly.
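The divergence and its quantum cure can be checked numerically. The sketch below compares Planck's formula for the spectral energy density with the classical Rayleigh-Jeans law at a fixed temperature: the two agree at low frequencies (where hν ≪ kT), but the classical law grows without bound while Planck's is exponentially suppressed. The temperature and frequencies are illustrative choices.

```python
import math

H = 6.62607015e-34   # Planck constant, J·s
K = 1.380649e-23     # Boltzmann constant, J/K
C = 2.99792458e8     # speed of light, m/s

def planck(nu, T):
    """Planck spectral energy density u(nu, T), J·s/m^3."""
    return (8 * math.pi * H * nu**3 / C**3) / math.expm1(H * nu / (K * T))

def rayleigh_jeans(nu, T):
    """Classical Rayleigh-Jeans density: grows as nu^2 without bound."""
    return 8 * math.pi * nu**2 * K * T / C**3

T = 5000.0  # kelvin, an illustrative stellar-surface temperature
for nu in (1e13, 1e14, 1e15):  # infrared through ultraviolet
    print(f"nu = {nu:.0e} Hz: Planck = {planck(nu, T):.3e}, "
          f"Rayleigh-Jeans = {rayleigh_jeans(nu, T):.3e}")
```

At 10¹³ Hz the two formulas nearly coincide; at 10¹⁵ Hz (hν ≈ 10 kT) the Planck value is suppressed by orders of magnitude — the ultraviolet catastrophe averted.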
The second puzzle was the photoelectric effect. When light hits a metal surface, it can eject electrons. Classical wave theory predicted that sufficient intensity of any frequency should eject electrons — more light waves, more energy transferred. But experiment showed that electrons were only ejected when the light exceeded a threshold frequency, independent of intensity. Below the threshold frequency, no electrons were ejected no matter how bright the light. Einstein's 1905 solution took Planck's mathematical trick seriously: light is not a wave but a stream of particles (later called photons), each carrying energy proportional to frequency. If a photon's energy (frequency times Planck's constant) is not enough to overcome the electron's binding energy, no ejection occurs, regardless of how many photons there are.
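Einstein's relation, kinetic energy = hf − φ (where φ is the metal's work function), is simple enough to verify directly. A minimal sketch, using an illustrative work function of about 2.1 eV (roughly that of cesium):

```python
H = 6.62607015e-34    # Planck constant, J·s
EV = 1.602176634e-19  # joules per electronvolt

def ejected_electron_energy_ev(freq_hz, work_function_ev):
    """Einstein's photoelectric relation E_kin = h*f - phi.
    Returns the ejected electron's kinetic energy in eV,
    or None if the light is below the threshold frequency."""
    e_kin = H * freq_hz / EV - work_function_ev
    return e_kin if e_kin > 0 else None

phi = 2.1  # eV, illustrative work function (roughly cesium's)
threshold = phi * EV / H  # minimum frequency that ejects electrons
print(f"threshold frequency ~ {threshold:.2e} Hz")

# Green light (~5.5e14 Hz) ejects electrons with a little energy to spare;
# infrared (~3.0e14 Hz) ejects none, no matter how many photons arrive.
print(ejected_electron_energy_ev(5.5e14, phi))
print(ejected_electron_energy_ev(3.0e14, phi))
```

Intensity never enters the threshold condition — only frequency does, exactly as the experiments demanded.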
The third puzzle was atomic emission spectra. When elements are heated or electrified, they emit light at discrete, characteristic frequencies — not a continuous spectrum but sharp spectral lines. Hydrogen's visible emission lines follow a precise mathematical formula (the Balmer series, 1885) that classical physics could not explain. Niels Bohr's 1913 model postulated that electrons orbit the nucleus only at certain discrete energy levels, and that light is emitted when an electron drops from a higher to a lower level, the photon's frequency determined by the energy difference. The model was ad hoc and internally inconsistent, but it predicted hydrogen's spectral lines correctly. The puzzle of why only certain orbits are allowed would eventually be resolved by de Broglie's matter waves and Schrodinger's wave equation.
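The Balmer lines follow from the Rydberg formula, 1/λ = R(1/2² − 1/n²) for transitions down to the second level. A short sketch reproducing hydrogen's visible lines:

```python
RYDBERG = 1.0973731568e7  # Rydberg constant R_infinity, 1/m

def balmer_wavelength_nm(n):
    """Wavelength (nm) of the hydrogen emission line for the
    transition n -> 2, from the Rydberg formula
    1/lambda = R * (1/2^2 - 1/n^2)."""
    if n <= 2:
        raise ValueError("Balmer series requires n > 2")
    inv_wavelength = RYDBERG * (1 / 4 - 1 / n**2)
    return 1e9 / inv_wavelength  # metres -> nanometres

for n in (3, 4, 5, 6):
    print(f"n={n} -> 2: {balmer_wavelength_nm(n):.1f} nm")
# The n=3 transition (H-alpha) comes out near 656 nm -- the red line
# Balmer's 1885 formula fit, which classical physics could not explain.
```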
Wave-Particle Duality
Thomas Young demonstrated in 1801 that light is a wave: two narrow slits in a barrier produce an interference pattern on a screen behind them, with alternating bright and dark bands where the waves from the two slits reinforce or cancel each other. This is a wave phenomenon; particles passing independently through two slits would produce two bands, not an interference pattern.
But Einstein had shown in 1905 that light is also a stream of particles. How can it be both? The answer, which took decades to digest, is that quantum objects are neither classical waves nor classical particles but something with properties of both, manifesting whichever aspect is relevant to the experimental question being asked. Louis de Broglie proposed in 1924 that matter — electrons, protons, atoms — also has wave properties, with wavelength inversely proportional to momentum. This was confirmed by the Davisson-Germer experiment (1927), which showed that electrons produce diffraction patterns when scattered from a crystal.
The double-slit experiment is the canonical illustration of wave-particle duality in its most paradoxical form. When electrons are fired one at a time at a double-slit apparatus, each individual electron is detected as a localized point on the screen — a particle event. But as thousands of electrons accumulate, they build up an interference pattern, as if each electron simultaneously passed through both slits and interfered with itself. Claus Jönsson performed this experiment with electrons in 1961; Akira Tonomura and colleagues performed the definitive single-electron version in 1989, accumulating the interference pattern electron by electron.
What happens when we try to find out which slit each electron passes through? Any measurement capable of determining the electron's path destroys the interference pattern. The electron then produces the two-band distribution expected of particles. This is not a disturbance-by-measurement story, though that is the common explanation; more precisely, it is a case where the experimental conditions are incompatible: interference requires a superposition of both paths, and determining the path collapses the superposition. The quantum mechanical description is completely consistent; what is inconsistent is the attempt to assign simultaneously both wave-like (path superposition) and particle-like (definite path) properties.
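The distinction between "add the amplitudes, then square" (superposed paths, interference) and "square, then add" (definite paths, no interference) can be made concrete with a toy two-slit model. The wavelength, slit separation, and screen distance below are illustrative choices, not parameters from any particular experiment:

```python
import cmath, math

WAVELEN = 50e-12   # 50 pm, roughly a few-hundred-eV electron (illustrative)
SLIT_SEP = 1e-6    # slit separation, m (illustrative)
SCREEN_D = 1.0     # slit-to-screen distance, m (illustrative)

def amplitude(x, slit_y):
    """Complex amplitude at screen position x from a single slit,
    modeled as a unit-magnitude spherical wave e^{2*pi*i*r/lambda}."""
    r = math.sqrt(SCREEN_D**2 + (x - slit_y)**2)
    return cmath.exp(2j * math.pi * r / WAVELEN)

def intensities(x):
    a1 = amplitude(x, SLIT_SEP / 2)
    a2 = amplitude(x, -SLIT_SEP / 2)
    quantum = abs(a1 + a2)**2            # superposed paths: interference
    classical = abs(a1)**2 + abs(a2)**2  # definite paths: probabilities add
    return quantum, classical

for x_um in (0, 12.5, 25, 37.5, 50):
    q, c = intensities(x_um * 1e-6)
    print(f"x = {x_um:5.1f} um: quantum {q:.2f}, classical {c:.2f}")
```

The quantum intensity oscillates between 4 and 0 across the screen (bright and dark fringes), while the definite-path sum is flat at 2 everywhere — the two-band, no-fringe outcome seen whenever the path is measured.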
The Uncertainty Principle
Heisenberg derived his uncertainty principle in 1927, and it requires no special intuition to understand in mathematical terms: it follows directly from the fact that position and momentum are related by the Fourier transform. A function that is sharply localized in position has a Fourier transform that is broadly spread in frequency (and hence momentum); a function with precisely defined frequency is spread across all positions. The uncertainty relation delta-x delta-p greater than or equal to h-bar over 2 is not an approximation or a practical limit — it is an exact mathematical theorem about Fourier pairs.
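That the relation is a theorem about Fourier pairs can be verified numerically: discretize a Gaussian wavepacket, compute the spread of |ψ|² in position and the spread of its Fourier transform in wavenumber, and check that the product lands at the minimum value 1/2 (a Gaussian saturates the bound). A sketch using NumPy, with an arbitrary packet width:

```python
import numpy as np

# Discretize a Gaussian wavepacket, then check the Fourier-pair
# uncertainty relation delta_x * delta_k >= 1/2 numerically.
N = 4096
L = 80.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
SIGMA = 1.7                               # chosen packet width (arbitrary)
psi = np.exp(-x**2 / (4 * SIGMA**2))      # |psi|^2 then has std dev SIGMA

def spread(grid, density):
    """Standard deviation of an (unnormalized) density on a uniform grid."""
    w = density / density.sum()
    mean = (grid * w).sum()
    return np.sqrt(((grid - mean)**2 * w).sum())

dx = spread(x, np.abs(psi)**2)

# Momentum-space amplitude via FFT, with the matching wavenumber grid.
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(N, d=L / N))
phi = np.fft.fftshift(np.fft.fft(psi))
dk = spread(k, np.abs(phi)**2)

print(f"delta_x = {dx:.4f}, delta_k = {dk:.4f}, product = {dx * dk:.4f}")
# The product comes out ~0.5 regardless of SIGMA: squeeze the packet in
# position and it broadens in k. With p = hbar*k, this is dx*dp = hbar/2.
```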
The physical interpretation is profound. A quantum particle does not have a definite position and a definite momentum simultaneously. These are not properties waiting to be discovered but properties that are brought into being by measurement. Asking "what is the electron's exact position and momentum before I measure?" is asking a question that has no answer — not because we are ignorant but because the question is malformed.
The uncertainty principle has direct consequences for atomic physics. The ground state energy of a hydrogen atom can be estimated by minimizing the total energy subject to the constraint that position uncertainty and momentum uncertainty satisfy Heisenberg's relation. The calculation correctly predicts the size of the atom (the Bohr radius) and its ground state energy without any additional assumptions. This explains why atoms are stable: confining an electron to a region smaller than the Bohr radius would require enormous kinetic energy (from the resulting momentum uncertainty), so the electron settles at the scale where kinetic and potential energy are balanced.
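The estimate can be reproduced in a few lines. Taking Δp ~ ħ/r for an electron confined to radius r, the total energy E(r) = ħ²/(2mr²) − e²/(4πε₀r) is minimized by a simple scan; the minimum lands at the Bohr radius and −13.6 eV. (The choice Δp ~ ħ/r rather than ħ/2r is the conventional order-of-magnitude version that happens to reproduce the exact values.)

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J·s
M_E = 9.1093837015e-31   # electron mass, kg
E_CH = 1.602176634e-19   # elementary charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def energy(r):
    """Total energy of an electron confined to radius r: a kinetic term
    from the uncertainty principle (p ~ hbar/r) plus Coulomb attraction."""
    kinetic = HBAR**2 / (2 * M_E * r**2)
    potential = -E_CH**2 / (4 * math.pi * EPS0 * r)
    return kinetic + potential

# Minimize by scanning radii around the atomic scale.
radii = [i * 1e-13 for i in range(1, 2000)]
r_min = min(radii, key=energy)

print(f"optimal radius ~ {r_min:.2e} m   (Bohr radius: 5.29e-11 m)")
print(f"ground state energy ~ {energy(r_min) / E_CH:.2f} eV  (actual: -13.6 eV)")
```

Below the Bohr radius the kinetic term (which grows as 1/r²) dominates the Coulomb attraction (which grows only as 1/r) — which is exactly why the electron cannot collapse into the nucleus.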
There is a time-energy uncertainty relation that is analogous: the product of the uncertainty in energy and the lifetime of a quantum state is at least h-bar over 2. This means that short-lived excited states have inherently broad energy spectra — the shorter the lifetime, the less precisely defined the energy. This spectral broadening is observed in atomic transitions and has practical implications for laser linewidth and atomic clock precision.
Schrodinger's Equation and the Wavefunction
In 1926, Erwin Schrodinger published the wave equation that bears his name, which governs how the quantum wavefunction evolves over time. The time-dependent Schrodinger equation is the quantum analogue of Newton's second law: given the wavefunction at one time and the potential energy landscape, it predicts the wavefunction at all future times. The evolution is perfectly deterministic and linear — superpositions remain superpositions.
The physical content of the wavefunction comes from Max Born's 1926 interpretation: the probability density for finding a particle at position x at time t is given by the square of the magnitude of the wavefunction at (x, t). The wavefunction is not a physical wave in space — it is a probability amplitude, and its physical meaning is entirely statistical. This interpretation, for which Born received the Nobel Prize in 1954, resolved the puzzle of what was "waving" in de Broglie's matter waves.
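The Born rule is easy to exercise on the simplest solvable system, a particle in a one-dimensional box, whose normalized states are ψ_n(x) = √(2/L) sin(nπx/L). Integrating |ψ|² over an interval gives the probability of finding the particle there — a minimal sketch:

```python
import math

def prob_in_interval(n, box_l, a, b, steps=100_000):
    """Born-rule probability of finding a particle-in-a-box, in state
    psi_n(x) = sqrt(2/L) * sin(n*pi*x/L), between positions a and b.
    Integrates |psi|^2 numerically with the midpoint rule."""
    h = (b - a) / steps
    total = 0.0
    for i in range(steps):
        x = a + (i + 0.5) * h
        psi = math.sqrt(2 / box_l) * math.sin(n * math.pi * x / box_l)
        total += psi * psi * h
    return total

L = 1.0
# |psi|^2 integrates to 1 over the whole box (normalization)...
print(prob_in_interval(1, L, 0.0, L))
# ...the ground state is most often found in the middle third...
print(prob_in_interval(1, L, L / 3, 2 * L / 3))
# ...but the first excited state, with a node at the centre, is not.
print(prob_in_interval(2, L, L / 3, 2 * L / 3))
```

The ground-state probability for the middle third comes out near 0.61, versus about 0.20 for the first excited state — the statistical content of the wavefunction in miniature.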
The Schrodinger equation is exactly solvable for the hydrogen atom, yielding the energy levels and the spatial distributions (the atomic orbitals) that are now familiar from chemistry. The principal quantum number, angular momentum quantum number, and magnetic quantum number that emerge from the solution are not postulated but derived. Paul Dirac's 1928 relativistic generalization of the Schrodinger equation — the Dirac equation — naturally incorporated electron spin and predicted the existence of antimatter (the positron was discovered experimentally in 1932).
Schrodinger's cat thought experiment (1935) was a response to the apparent absurdity of the Copenhagen interpretation: if the wavefunction evolves continuously until measurement, and if a quantum event (radioactive decay) is connected to a macroscopic outcome (cat alive or dead), then before observation the cat is formally in a superposition of both states. Schrodinger intended this as a reductio ad absurdum of the Copenhagen view, but the thought experiment is taken seriously today as a genuine puzzle about the quantum-classical boundary.
Entanglement and Non-Locality
Einstein, Podolsky, and Rosen published their famous paper in 1935 arguing that quantum mechanics was incomplete. They constructed a thought experiment in which two particles interact and then separate. After separation, measuring the position of one particle instantly tells you the position of the other (by momentum conservation), and measuring the momentum of one instantly tells you the momentum of the other. Since quantum mechanics says you cannot simultaneously know both position and momentum, either the particles carried definite values for these quantities all along (hidden variables), or measuring one particle somehow instantaneously affects the other at arbitrary distance (non-locality). Einstein found both options unacceptable and concluded quantum mechanics must be incomplete.
John Bell's 1964 paper proved that hidden variable theories with local causality make different statistical predictions from quantum mechanics. Specifically, correlations between measurements on entangled particles in any local hidden variable theory must satisfy a set of inequalities (Bell inequalities). Quantum mechanics predicts violations of these inequalities. The experimental question was: do real entangled particles violate Bell inequalities? Alain Aspect and colleagues answered yes in a 1982 series of experiments using entangled photons, including one with time-varying analyzer settings that addressed the locality loophole. Their results violated the Bell inequalities by more than 40 standard deviations and matched quantum mechanical predictions precisely. The 2022 Nobel Prize in Physics was awarded to Aspect, John Clauser, and Anton Zeilinger for this and related work.
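The quantum prediction driving the violation is the singlet-state correlation E(a, b) = −cos(a − b) for spin measurements along angles a and b. Plugging the standard optimal angles into the CHSH combination — whose magnitude any local hidden variable theory must keep at or below 2 — yields the quantum maximum of 2√2:

```python
import math

def singlet_correlation(a, b):
    """Quantum prediction for the correlation E(a, b) between spin
    measurements on two singlet-state particles along angles a and b."""
    return -math.cos(a - b)

def chsh(a1, a2, b1, b2):
    """CHSH combination S = |E(a1,b1) - E(a1,b2) + E(a2,b1) + E(a2,b2)|.
    Local hidden variable theories require S <= 2."""
    E = singlet_correlation
    return abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))

# Standard optimal settings: 0 and 90 degrees for Alice, 45 and 135 for Bob.
deg = math.pi / 180
S = chsh(0.0, 90 * deg, 45 * deg, 135 * deg)
print(f"S = {S:.4f}  (classical bound: 2, quantum maximum: 2*sqrt(2) ~ 2.828)")
```

The value 2√2 ≈ 2.83 is what the Aspect-type experiments confirmed, well outside what any local hidden variable account permits.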
Entanglement is real, and it is non-local in the sense that entangled particles cannot be described by any local hidden variable theory. But — as Bell himself emphasized — this non-locality does not allow faster-than-light signaling. The correlations between measurement outcomes can only be observed by comparing results through a classical channel. Alice and Bob each see random results locally; the correlations appear only when they compare notes. No information travels faster than light.
Interpretations: What Does It All Mean?
Quantum mechanics predicts measurement outcomes with extraordinary accuracy. What it does not settle is the question of what is happening in the world when no one is measuring. This is the measurement problem, and it has generated more philosophical argument per correct experimental prediction than any other branch of science.
The Copenhagen interpretation remains the most widely taught and operationally most common. It holds that quantum mechanics is a theory about measurement outcomes, not about a reality independent of measurement. The wavefunction collapses upon measurement, yielding a definite result with the Born rule probabilities. Questions about what happens between measurements, or what constitutes a "measurement," are either meaningless (Bohr) or unanswered. The Copenhagen interpretation's pragmatic "shut up and calculate" stance suits the working physicist but leaves ontological questions open.
Hugh Everett's many-worlds interpretation (1957) takes the Schrodinger equation at face value and refuses to add any collapse postulate. Every possible outcome occurs in a branching universal wavefunction. The universe you find yourself in after measuring an electron's spin is one of many parallel branches in which the electron went the other way. The formalism is minimal, but the ontological price is staggering: an uncountable proliferation of parallel universes. The main technical challenge — deriving the Born rule probabilities in a theory where all outcomes happen — remains controversial despite sophisticated attempts by David Deutsch (2000) and David Wallace (2012) using decision theory.
Decoherence theory, developed by Wojciech Zurek in the 1980s and 1990s, provides a dynamical explanation for why quantum superpositions appear to collapse at macroscopic scales. Macroscopic objects constantly interact with their environments (air molecules, photons, thermal radiation), and these interactions entangle the object with its environment in ways that effectively wash out interference terms between macroscopic states. Decoherence happens extraordinarily fast for large objects — for a dust grain in air, within about 10^-31 seconds. Decoherence does not by itself resolve the measurement problem — it shows why we cannot observe macroscopic superpositions, but not why measurements yield definite outcomes — but it clarifies what needs to be explained.
Why Quantum Mechanics Is Right
The question "is quantum mechanics correct?" has a clear empirical answer: yes, to a degree that should provoke something like awe. Quantum electrodynamics, the quantum field theory of electromagnetic interactions, predicts the anomalous magnetic moment of the electron — the tiny deviation of the electron's g-factor from the Dirac value of 2 — to eleven significant figures. The theoretical prediction and the best experimental measurement agree to within experimental uncertainty. This is widely cited as the most precisely tested prediction in all of science.
The experimental program testing quantum mechanics has been relentless. Bell inequality violations have been confirmed in dozens of experiments across multiple physical systems — photons, electrons, atoms, ions — closing successive loopholes until the 2015 "loophole-free" Bell tests of Hensen et al. (Nature) and Giustina et al. (Physical Review Letters), which simultaneously closed the detection loophole and the locality loophole. Quantum mechanics passed every test.
Applications that work only because quantum mechanics is correct include: the laser (stimulated emission), the transistor (semiconductor band structure and tunneling), the MRI machine (nuclear magnetic resonance), the LED (semiconductor electroluminescence), the atomic clock (transition frequencies), GPS (relativistic corrections from special and general relativity combined with quantum atomic clocks), and solar cells (photoelectric effect). The global economy runs on quantum mechanics, usually without acknowledging it.
Quantum Mechanics and Everyday Life
The practical applications of quantum mechanics constitute perhaps the most remarkable case in history of abstruse fundamental physics producing transformative technology within decades of the theory's development. The transistor, invented in 1947, would have been impossible without understanding quantum mechanical band theory — the explanation for why some materials conduct electricity and others do not, which depends on quantum mechanical energy levels and the Pauli exclusion principle.
A modern microprocessor contains billions of transistors fabricated at feature sizes of a few nanometers. At these scales, quantum tunneling — the classically forbidden passage of electrons through thin insulating barriers — is not just a theoretical curiosity but an engineering challenge that processor designers must account for and control. The continued miniaturization of electronics is limited by quantum effects, not by classical mechanical constraints.
Laser technology, from barcode scanners to fiber optic communications to precision surgery, depends on stimulated emission, which Einstein predicted in 1917 from the quantum mechanical properties of atomic transitions. Nuclear magnetic resonance, discovered in the 1940s and developed into MRI in the 1970s, exploits the quantum mechanical spin of atomic nuclei to produce detailed images of soft tissue without ionizing radiation.
Quantum computing represents the frontier of quantum mechanical engineering. Unlike a classical bit that is either 0 or 1, a quantum bit (qubit) can be in a superposition of both states, and multiple qubits can be entangled, allowing certain computations to proceed along multiple paths simultaneously. For specific problems, quantum computers offer dramatic speedups over the best known classical algorithms: superpolynomial for factoring large integers (Shor's algorithm) and for simulating quantum chemical systems, and quadratic for searching unsorted databases (Grover's algorithm). The first quantum computers capable of performing computations that no classical computer could match in a reasonable time were demonstrated by Google (2019) and others for specific designed problems, though fault-tolerant large-scale quantum computing remains an engineering challenge.
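At small scales, the superposition-and-entanglement machinery of a gate-based quantum computer is just linear algebra. A sketch that prepares the standard two-qubit Bell state with a Hadamard gate followed by a CNOT, then reads off the Born-rule measurement probabilities:

```python
import numpy as np

# Build the Bell state (|00> + |11>)/sqrt(2) from |00>:
# a Hadamard on the first qubit creates superposition, and a CNOT
# entangles the two qubits.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

state = np.zeros(4)
state[0] = 1.0                              # start in |00>
state = CNOT @ np.kron(H, I) @ state        # apply H (qubit 0), then CNOT

probs = np.abs(state)**2                    # Born rule
for basis, p in zip(("00", "01", "10", "11"), probs):
    print(f"P(|{basis}>) = {p:.2f}")
# Only |00> and |11> appear, each with probability 0.5: each qubit's
# outcome is individually random, but the two are perfectly correlated.
```

Real quantum hardware manipulates physical two-level systems rather than explicit vectors, but the state space this simulation walks through is exactly what the machine's gates act on — and its dimension doubles with every added qubit, which is why classical simulation breaks down at scale.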
Cross-References
Related articles: how quantum computing works, how the universe began, what is entropy
References
- Feynman, R.P., Leighton, R.B., & Sands, M. (1965). The Feynman Lectures on Physics, Vol. III: Quantum Mechanics. Addison-Wesley.
- Heisenberg, W. (1927). Über den anschaulichen Inhalt der quantentheoretischen Kinematik und Mechanik. Zeitschrift für Physik, 43, 172-198.
- Bell, J.S. (1964). On the Einstein Podolsky Rosen paradox. Physics, 1(3), 195-200.
- Aspect, A., Grangier, P., & Roger, G. (1982). Experimental realization of Einstein-Podolsky-Rosen-Bohm Gedankenexperiment: A new violation of Bell's inequalities. Physical Review Letters, 49(2), 91-94. https://doi.org/10.1103/PhysRevLett.49.91
- Everett, H. (1957). "Relative state" formulation of quantum mechanics. Reviews of Modern Physics, 29(3), 454-462.
- Wheeler, J.A., & Zurek, W.H. (Eds.). (1983). Quantum Theory and Measurement. Princeton University Press.
- Tonomura, A., et al. (1989). Demonstration of single-electron buildup of an interference pattern. American Journal of Physics, 57(2), 117-120.
- Wallace, D. (2012). The Emergent Multiverse: Quantum Theory According to the Everett Interpretation. Oxford University Press.
Frequently Asked Questions
What is quantum mechanics in simple terms?
Quantum mechanics is the branch of physics that describes how matter and energy behave at the smallest scales — the level of atoms, electrons, photons, and other subatomic particles. It differs from classical physics (the physics of everyday objects) in several fundamental ways. First, energy at the quantum level comes in discrete packets called quanta, rather than flowing in continuous streams. Second, quantum particles exhibit wave-particle duality: they behave like waves (spreading out, interfering with each other) and like particles (arriving at detectors as localized events) depending on how they are observed. Third, quantum mechanics is irreducibly probabilistic — it cannot predict where a particular particle will be found, only the probability distribution of possible locations. Fourth, particles exist in superpositions of multiple states simultaneously until they are measured, at which point the wavefunction appears to 'collapse' to a definite state. Fifth, quantum entanglement allows two particles to become correlated in ways that persist across any distance, so that measuring one instantly determines the corresponding property of the other. These features seem bizarre from the perspective of classical physics, but quantum mechanics is also the most experimentally verified theory in all of science, and it underlies the technology of lasers, transistors, MRI machines, and LEDs. The theory was developed between roughly 1900 and 1935 by Max Planck, Albert Einstein, Niels Bohr, Werner Heisenberg, Erwin Schrodinger, Paul Dirac, Max Born, and others, in what may be the most concentrated flowering of theoretical genius in the history of science.
What is the Heisenberg uncertainty principle?
The Heisenberg uncertainty principle, formulated by Werner Heisenberg in 1927, states that there is a fundamental limit to how precisely you can simultaneously know the position and momentum of a quantum particle. Mathematically, the product of the uncertainties in position and momentum must be at least as large as the reduced Planck constant divided by two: delta-x times delta-p is greater than or equal to h-bar over 2. This is not a limitation of our measuring instruments — it is not saying that we disturb the particle when we measure it, though we do. It is a statement about the nature of reality: a quantum particle does not simultaneously have a precise position and a precise momentum to be discovered. Position and momentum are complementary observables in a formal mathematical sense; they are represented by non-commuting operators in the quantum mechanical formalism, and non-commutativity is the mathematical source of the uncertainty relation. The physical intuition is related to wave-particle duality: a particle with precisely defined momentum corresponds to a wave of precisely defined frequency, but such a wave is spread out infinitely in space (the particle could be anywhere). A particle localized in space corresponds to a superposition of many different frequencies (momenta). You cannot have precise position and precise momentum simultaneously because position-localization and momentum-localization are wave properties that trade off against each other. The uncertainty principle has direct physical consequences: it explains why atoms are stable (the electron cannot collapse into the nucleus, because precise localization would require enormous momentum uncertainty and thus enormous kinetic energy), and it sets fundamental limits on quantum computing and quantum sensing.
What is quantum entanglement and does it allow faster-than-light communication?
Quantum entanglement is a phenomenon in which two or more particles become correlated in such a way that the quantum state of each cannot be described independently of the others, no matter how far apart they are. When you measure a property of one entangled particle — say, its spin — you instantly know the corresponding property of its partner, even if the partner is on the other side of the galaxy. Einstein called this 'spooky action at a distance' and considered it evidence that quantum mechanics must be incomplete, preferring the explanation that the particles carry 'hidden variables' (predetermined values for their properties) that we simply have not measured yet. John Bell's 1964 theorem proved that any hidden variable theory that is also consistent with the principle of locality (no influences faster than light) makes different statistical predictions from quantum mechanics. Alain Aspect and colleagues performed experiments in 1982 that tested these predictions, violating the Bell inequalities in exactly the way quantum mechanics predicts, not in the way local hidden variable theories predict. Entanglement is real, not explained by hidden variables. However — and this is crucial — entanglement does not allow faster-than-light communication. When Alice measures her particle and gets a result, she gets a random result. She cannot control what result she gets, so she cannot encode a message in it. Bob's result is correlated with Alice's, but Bob's result is also random from his perspective until he compares notes with Alice through a classical channel. The correlations only become apparent after the classical comparison. No information travels faster than light. Alain Aspect, John Clauser, and Anton Zeilinger shared the 2022 Nobel Prize in Physics for their experimental work on entanglement.
What is Schrodinger's cat and what does it illustrate?
Schrodinger's cat is a thought experiment proposed by Erwin Schrodinger in 1935 to illustrate what he saw as an absurdity in the Copenhagen interpretation of quantum mechanics. Imagine a cat in a sealed box with a device that has a 50 percent chance of triggering in one hour — triggered by the decay of a radioactive atom, which is a quantum event governed by probability. If the device triggers, it releases poison and kills the cat. If it does not, the cat lives. Now: the radioactive atom is in a superposition of 'has decayed' and 'has not decayed' until it is measured. The Copenhagen interpretation says the atom has no definite state until measurement. But the atom is connected by the device to a macroscopic system — the cat. If the atom is in superposition, is the cat simultaneously alive and dead until someone opens the box? Schrodinger thought this was a reductio ad absurdum: surely cats are either alive or dead, not in superpositions. The thought experiment does not resolve the measurement problem — it dramatizes it. Different interpretations of quantum mechanics answer it differently. The Copenhagen interpretation says the wavefunction collapses when the box is opened by a macroscopic observer. The many-worlds interpretation says the universe splits into a branch where the cat is alive and a branch where it is dead, with the observer splitting too. Decoherence theory argues that macroscopic objects interact so rapidly with their environment that superpositions decay almost instantaneously at macroscopic scales, explaining why we never observe 'classical' objects in superpositions even though quantum mechanics formally allows it. Schrodinger's cat remains the most famous illustration of the quantum measurement problem, which is still not fully resolved philosophically even though quantum mechanics works with extraordinary precision.
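The decoherence argument can be illustrated with a deliberately crude toy model: treat each environment particle that interacts with the cat as shrinking the interference (off-diagonal) terms of the cat's density matrix by a constant overlap factor. The overlap value 0.9 below is an arbitrary assumption, as is the whole single-number model; the point is only the exponential suppression.

```python
import numpy as np

# Toy model, illustration only. A qubit in (|alive> + |dead>)/sqrt(2)
# has density matrix entries of 0.5 everywhere; the off-diagonal 0.5 is
# what carries the "alive AND dead" interference. Each environment
# particle scattered into a state depending on the cat multiplies that
# term by the overlap c of the two environment states.
c = 0.9  # assumed per-particle overlap |<env_alive|env_dead>|
rho = 0.5 * np.array([[1.0, 1.0], [1.0, 1.0]])  # pure superposition

for n in (1, 10, 100, 1000):
    off = rho[0, 1] * c**n
    print(f"after {n} environment particles: off-diagonal = {off:.3e}")
```

A macroscopic object scatters enormous numbers of photons and air molecules per second, so on this picture the interference terms are driven to practically zero almost instantly, leaving the ordinary either/or statistics we actually observe.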
What is the Copenhagen interpretation of quantum mechanics?
The Copenhagen interpretation is the most historically dominant framework for understanding what quantum mechanics means about the nature of reality. Developed primarily by Niels Bohr and Werner Heisenberg in the late 1920s, it holds that a quantum system is completely described by its wavefunction, which encodes the probabilities of different measurement outcomes. Before measurement, the particle does not have a definite position, spin, or momentum — it is in a superposition of all possible values. When a measurement is performed, the wavefunction 'collapses' to a definite value, with the probability of each outcome given by the Born rule: the probability of finding the particle in state x is the squared magnitude of the wavefunction's amplitude at x. The Copenhagen interpretation emphasizes that quantum mechanics is a theory about measurement outcomes, not about 'what is really happening' when no measurement is being made. Bohr's principle of complementarity holds that certain pairs of properties — like position and momentum, or wave-like and particle-like behavior — cannot be simultaneously defined; each experimental context reveals one aspect while necessarily obscuring the other. The Copenhagen interpretation is deliberately agnostic about what happens between measurements ('shut up and calculate' is a frequently cited summary, though Bohr himself never said this). Many physicists are operationally Copenhagenist — they use quantum mechanics to calculate probabilities without worrying about ontological questions — but philosophers and physicists who want a complete account of physical reality find the interpretation unsatisfying, because it leaves the measurement process and the role of observers unexplained.
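Operationally, the Born rule is a sampling recipe, which a few lines of NumPy can illustrate. The amplitudes below are arbitrary choices for the demo, and one is made complex to emphasize that it is the squared magnitude, not the amplitude itself, that becomes a probability.

```python
import numpy as np

# A qubit state with amplitudes (0.6, 0.8i); the Born rule says repeated
# measurements yield outcome 0 with probability |0.6|^2 = 0.36 and
# outcome 1 with probability |0.8i|^2 = 0.64.
amps = np.array([0.6, 0.8j])
probs = np.abs(amps) ** 2
assert np.isclose(probs.sum(), 1.0)  # a valid state is normalized

rng = np.random.default_rng(0)  # fixed seed for reproducibility
samples = rng.choice([0, 1], size=100_000, p=probs)
print(samples.mean())  # fraction of outcome 1, ≈ 0.64
```

Nothing in the formalism says what the single outcome of one run will be; the rule only fixes the long-run frequencies, which is exactly the agnosticism about individual events that the Copenhagen interpretation builds in.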
What is the many-worlds interpretation of quantum mechanics?
The many-worlds interpretation (MWI), proposed by Hugh Everett III in his 1957 Princeton doctoral thesis, holds that the wavefunction never collapses. When a measurement is performed and quantum mechanics predicts multiple possible outcomes, all outcomes occur — but in different branches of a universal wavefunction. The universe continuously splits into parallel branches corresponding to every possible quantum outcome. In one branch, Schrodinger's cat is alive; in another, it is dead. The observer splits too, each copy aware only of their own branch. There is no special role for measurement, no collapse, no observers needed to actualize reality — the Schrodinger equation applies universally and continuously. The MWI has several attractive features: it is the most mathematically minimal interpretation (it adds nothing to the formalism), it preserves determinism (the universal wavefunction evolves deterministically), and it takes the quantum formalism seriously as a complete description of reality. Its challenges are profound. The preferred basis problem asks: why do branches split along 'measurement' lines rather than other arbitrary quantum superpositions? (Decoherence provides a partial answer.) The probability problem is deeper: in a theory where all outcomes happen, what does it mean to say that one outcome has probability 0.9 and another 0.1? If both occur, why should we expect to find ourselves in the more probable branch more often? Attempts to derive the Born rule from the MWI — particularly David Deutsch's 1999 decision-theoretic derivation and David Wallace's extension — remain controversial. The MWI has attracted serious defenders including David Deutsch, David Wallace, and Sean Carroll, and serious critics including Adrian Kent.
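The MWI's picture of measurement, entanglement between system and apparatus rather than collapse, can be sketched in a few lines. A CNOT-style interaction (a standard stand-in for an idealized measurement, and an assumption of this demo) copies the system's basis state into the apparatus, leaving a superposition of two 'branches' with the original amplitudes intact.

```python
import numpy as np

# System qubit in superposition, apparatus in its "ready" state |0>.
# Amplitudes are arbitrary values for the demo.
alpha, beta = np.sqrt(0.3), np.sqrt(0.7)
system = np.array([alpha, beta])
apparatus = np.array([1.0, 0.0])
state = np.kron(system, apparatus)  # joint state, (alpha|0> + beta|1>)|0>

# "Measurement" as a unitary interaction: CNOT (system as control)
# correlates the apparatus with the system's basis state.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)
branched = CNOT @ state

print(branched)  # alpha|00> + beta|11>: two branches, no collapse
```

Nothing here singled out one outcome; the Schrodinger-picture evolution simply produced an entangled state whose two components are the "cat alive" and "cat dead" branches, with |alpha|² and |beta|² left as the quantities the probability problem asks us to interpret.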
Why does quantum mechanics matter for everyday technology?
Quantum mechanics is not just a philosophical puzzle about the nature of reality — it is the foundation of most modern technology, much of it developed by engineers and scientists who explicitly used quantum mechanical principles. The laser depends on stimulated emission, which Einstein predicted in 1917 from quantum mechanical reasoning: photons in a particular quantum state can stimulate other atoms to emit identical photons, producing coherent amplified light. The transistor, invented at Bell Labs in 1947 and now fabricated at densities of billions per square centimeter in microprocessors, operates through quantum mechanical principles of semiconductor band structure and electron tunneling. Without quantum mechanics, there is no explanation for why silicon conducts electricity under some conditions and not others. MRI (magnetic resonance imaging) uses nuclear magnetic resonance, a quantum mechanical phenomenon in which atomic nuclei in a magnetic field absorb and re-emit radio waves at specific frequencies determined by quantum mechanical energy levels. LEDs and solar cells both depend on quantum mechanical properties of semiconductors — the photovoltaic effect, a close relative of the photoelectric effect for which Einstein won the Nobel Prize, is the quantum mechanical process underlying solar cell operation. Quantum cryptography exploits the no-cloning theorem (an eavesdropper cannot copy an unknown quantum state without disturbing it) to create communications secure against any computationally powerful eavesdropper. Quantum computing, still in early development, exploits superposition and entanglement to perform certain calculations exponentially faster than the best known classical algorithms. The technology of the 21st century is quantum mechanical at its foundations.
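The tunneling mentioned above has a simple quantitative handle: in the thick-barrier (WKB) limit, the transmission probability falls off as exp(-2κL) with κ = √(2m(V−E))/ħ. The sketch below evaluates this for an electron and a 5 eV barrier (illustrative numbers, not tied to any particular device), showing why sub-nanometre distances are where tunneling becomes significant.

```python
import numpy as np

# Thick-barrier (WKB) estimate of electron tunnelling through a
# rectangular barrier: T ~ exp(-2*kappa*L), kappa = sqrt(2m(V-E))/hbar.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
m_e = 9.1093837015e-31   # electron mass, kg
eV = 1.602176634e-19     # electron-volt in joules

V, E = 5.0 * eV, 1.0 * eV  # barrier height, electron energy (assumed)
kappa = np.sqrt(2 * m_e * (V - E)) / hbar  # ≈ 1.0e10 per metre

T = {L_nm: np.exp(-2 * kappa * L_nm * 1e-9) for L_nm in (0.1, 0.5, 1.0)}
for L_nm, t in T.items():
    print(f"barrier width {L_nm} nm: T ~ {t:.2e}")
```

The exponential sensitivity to L is the working principle of the scanning tunnelling microscope and a central leakage concern in modern transistors: widening a gap by a few tenths of a nanometre suppresses the current by orders of magnitude.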