Quantum physics is the rulebook for atoms, electrons, and photons — the building blocks of everything. At this scale, the familiar rules of classical physics break down completely.
A particle can be in two places at once. Observing it changes it. Two particles can be correlated across the universe. None of this makes intuitive sense. All of it is experimentally verified.
This isn't a limitation of our instruments. It's how reality works.
Particles exist in all possible states simultaneously — until measured.
Two particles can share a state across any distance, instantaneously.
Position and momentum cannot both be precisely known — not even in principle.
In 1900, Planck solved a long-standing puzzle: classical physics predicted that hot objects should radiate infinite energy at short wavelengths — the "ultraviolet catastrophe." Planck found the solution by proposing that energy could only be emitted or absorbed in discrete packets, which he called quanta. He found the result so bizarre that he tried for years to find a classical explanation. He never did. Quantum physics was born.
Use the navigation above to explore these concepts interactively. Start with Wave-Particle Duality →
Light — and matter — behaves as a wave when unobserved and as a particle when measured. The double-slit experiment makes this visible.
Fire electrons one at a time at a barrier with two slits. A detector screen sits behind it. What pattern builds up over thousands of shots?
Toggle to watch which slit each particle uses.
When the detector is switched ON, it tells us which slit each particle used — and the interference pattern disappears. The particle suddenly behaves like a classical particle. The act of observation changed the outcome.
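The fringe pattern itself follows simple wave arithmetic: the two paths to a point on the screen differ in length, and the intensity depends on that difference. A minimal sketch, assuming point-like slits, a small-angle far-field screen, and illustrative values for the slit separation and electron wavelength:

```python
import math

def two_slit_intensity(x, d=1e-6, wavelength=50e-12, L=1.0):
    """Far-field two-slit intensity, treating the slits as point sources.

    x: position on the screen (m), d: slit separation (m),
    wavelength: de Broglie wavelength of the electrons (m),
    L: slit-to-screen distance (m). All values are illustrative.
    """
    delta = d * x / L                       # path difference, small-angle approximation
    phase = math.pi * delta / wavelength    # half the phase difference between the paths
    return math.cos(phase) ** 2             # normalised intensity, 0..1

# Bright fringe at the centre; dark fringe where the paths differ by half a wavelength
print(two_slit_intensity(0.0))              # 1.0 (fully constructive)
print(round(two_slit_intensity(25e-6), 6))  # 0.0 (fully destructive)
```

Switching the which-slit detector on corresponds to dropping the cross term: the screen then shows the sum of two single-slit blobs, with no cos² fringes.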
Young performed the original double-slit experiment using light — not electrons — and observed the interference pattern that proved light behaves as a wave. At the time, Newton's corpuscular (particle) theory of light was dominant, and Young's result was met with fierce resistance. He was also the first to decipher significant parts of the Rosetta Stone. His experiment is now routinely voted the most beautiful in physics.
In his PhD thesis — just 70 pages long — de Broglie proposed that if light (a wave) can behave like a particle, then matter (particles) must also have wave-like properties. He derived a wavelength for any moving object: λ = h/p, where h is Planck's constant and p is momentum. Einstein read the thesis and called it "a first feeble ray of light on this worst of our physics enigmas." The prediction was confirmed experimentally within three years.
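The de Broglie relation is easy to evaluate directly. A quick sketch with rounded constants, comparing an electron to an everyday object (the speeds are illustrative):

```python
# de Broglie wavelength λ = h / p = h / (m·v), non-relativistic
h = 6.626e-34  # Planck's constant (J·s)

def de_broglie_wavelength(mass_kg, speed_m_s):
    return h / (mass_kg * speed_m_s)

electron = de_broglie_wavelength(9.109e-31, 1e6)  # electron at 10^6 m/s
baseball = de_broglie_wavelength(0.145, 40.0)     # 145 g baseball at 40 m/s

print(f"electron: {electron:.2e} m")  # ~7.3e-10 m: atomic scale, measurable
print(f"baseball: {baseball:.2e} m")  # ~1.1e-34 m: far too small to ever observe
```

This is why wave behaviour shows up for electrons but never for baseballs: the baseball's wavelength is some 24 orders of magnitude smaller than an atomic nucleus.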
In 1935, Erwin Schrödinger devised a thought experiment to show how absurd quantum superposition becomes at the scale of everyday objects — if you take the Copenhagen interpretation literally.
A cat is placed in a sealed box with a radioactive atom, a Geiger counter, and a vial of poison. If the atom decays (a quantum event), the counter triggers, smashing the vial and killing the cat. Until you open the box, the atom is in superposition — decayed and not decayed simultaneously. So… what about the cat?
The box is sealed. The atom is in superposition.
Schrödinger invented this to show that Copenhagen quantum mechanics leads to absurdity at large scales. A cat cannot be both alive and dead. So either: (1) quantum superposition doesn't apply to large objects, (2) "measurement" collapses the wave function before you open the box, or (3) both outcomes happen in parallel universes.
Schrödinger developed the wave equation bearing his name — the central equation of quantum mechanics, which describes how the quantum state of a system evolves over time. In doing so he provided quantum mechanics with one of its two mathematical foundations (the other being Heisenberg's matrix mechanics). Ironically, his most famous contribution — the cat — was intended as a reductio ad absurdum to mock the Copenhagen interpretation, not celebrate it. He shared the 1933 Nobel Prize in Physics with Paul Dirac.
The wave function collapses upon measurement. What counts as "measurement"? Stop asking.
The universe branches. In one branch the cat lives, in the other it dies. Both are real.
Interaction with the environment destroys superposition before any conscious observer acts.
The more precisely you know a particle's position, the less precisely you can know its momentum — and vice versa. This is not a measurement limitation. It is a fundamental property of reality.
Position uncertainty × Momentum uncertainty ≥ ℏ/2 (Δx · Δp ≥ ℏ/2, where ℏ is the reduced Planck constant)
Drag the sliders to see how position and momentum uncertainty trade off. The product must always stay above ℏ/2.
The wave above shows the particle's probability distribution. Narrow = well-localised (high position certainty, high momentum uncertainty). Wide = spread out (low position certainty, low momentum uncertainty).
The uncertainty principle is not about clumsy instruments disturbing particles when we measure them. It's deeper than that: position and momentum don't simultaneously have precise values. The fuzziness is in the particle itself, not in our measurement.
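The bound has a concrete consequence: confining a particle forces a minimum spread in its momentum, and hence a minimum "jiggle". A sketch with rounded constants, for an electron confined to an illustrative atom-sized region:

```python
hbar = 1.055e-34  # reduced Planck constant (J·s)
m_e = 9.109e-31   # electron mass (kg)

def min_momentum_uncertainty(delta_x):
    """Heisenberg bound: Δp ≥ ħ / (2·Δx)."""
    return hbar / (2 * delta_x)

# Electron confined to an atom-sized region (~0.1 nm)
dx = 1e-10
dp = min_momentum_uncertainty(dx)
v = dp / m_e  # minimum speed spread implied by the bound

print(f"Δp ≥ {dp:.2e} kg·m/s")   # ~5.3e-25 kg·m/s
print(f"speed spread ≥ {v:.2e} m/s")  # ~6e5 m/s: the electron cannot sit still
```

This is exactly the zero-point motion described below: squeeze Δx and Δp must grow, so a confined electron always moves, even at absolute zero.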
At just 23, while working alone on the island of Helgoland to escape hay fever, Heisenberg invented matrix mechanics — the first complete and consistent formulation of quantum mechanics. Two years later, in Copenhagen, he derived the uncertainty principle. He showed mathematically that the position and momentum of a particle are conjugate variables — the more precisely one is defined, the less the other can be, not because of measurement disturbance, but because of the wave nature of matter itself. He received the Nobel Prize at 31, one of the youngest physics laureates ever.
Nothing can ever be perfectly still. If position were perfectly known, momentum uncertainty would be infinite — so particles always jiggle. Even at absolute zero.
Momentum uncertainty means particles sometimes have enough energy to cross barriers they classically shouldn't — making nuclear fusion in stars possible.
By common estimates, around 30% of the global economy depends on technologies that only work because of quantum mechanics. Most people have no idea.
Every computer chip on Earth runs on quantum band theory and quantum tunnelling.
Stimulated emission of photons — a purely quantum process first predicted by Einstein.
Nuclear magnetic resonance exploits quantum spin to image soft tissue non-invasively.
The photoelectric effect — photons knocking electrons free — is what turns sunlight into electricity.
Qubits in superposition give access to an exponentially large state space — letting quantum computers tackle certain problems classical machines never could.
Entanglement makes eavesdropping physically detectable — unbreakable encryption based on physics.
A transistor is a switch — it controls whether current flows. The smallest features in modern transistors are just a few nanometres wide — only tens of atoms across. At this scale, electrons don't stay where you put them: they quantum tunnel through barriers.
The band theory of solids — a quantum mechanical theory — explains why some materials conduct electricity and others don't, and why silicon works perfectly as a semiconductor. Without quantum mechanics, there are no transistors. Without transistors, there are no computers.
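How sensitive tunnelling is to barrier thickness can be estimated with the standard rectangular-barrier approximation T ≈ e^(−2κL). The barrier height, electron energy, and widths below are illustrative numbers, not real device parameters:

```python
import math

hbar = 1.055e-34  # reduced Planck constant (J·s)
m_e = 9.109e-31   # electron mass (kg)
eV = 1.602e-19    # joules per electronvolt

def tunnel_probability(barrier_eV, energy_eV, width_m):
    """Rough WKB-style estimate T ≈ exp(-2κL) for a rectangular barrier,
    with κ = sqrt(2m(V - E)) / ħ. Valid when T is small."""
    kappa = math.sqrt(2 * m_e * (barrier_eV - energy_eV) * eV) / hbar
    return math.exp(-2 * kappa * width_m)

# Same 1 eV barrier facing a 0.5 eV electron, at two thicknesses
print(f"{tunnel_probability(1.0, 0.5, 3e-9):.2e}")  # ~4e-10: leakage negligible
print(f"{tunnel_probability(1.0, 0.5, 1e-9):.2e}")  # ~7e-4: about a million times larger
```

The exponential in the width is the whole story: shrinking an insulating layer from 3 nm to 1 nm multiplies the leakage by roughly a factor of a million, which is why ever-smaller transistors must be engineered around tunnelling.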
A laser works via stimulated emission — a quantum process Einstein described in 1917. When a photon hits an excited electron, it triggers the release of a second, identical photon. Both photons then trigger more — a cascade of perfectly coherent light.
Lasers are used in: surgery, communications (fibre optic), manufacturing (cutting steel), barcode scanners, LiDAR in self-driving cars, and atomic clocks.
Protons have a quantum property called spin — they behave like tiny magnets. In a strong magnetic field, spins align. Pulse them with radio waves at the right frequency and they flip. When they flip back, they emit radio waves we can detect.
Different tissues have different proton densities and relaxation times — giving us detailed soft-tissue images without ionising radiation. All of this relies entirely on quantum mechanics.
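The "right frequency" is the Larmor frequency, f = γB, where γ ≈ 42.58 MHz per tesla is the proton's gyromagnetic ratio (divided by 2π). A quick sketch across common clinical field strengths:

```python
GAMMA_PROTON = 42.577  # proton gyromagnetic ratio / 2π, in MHz per tesla

def larmor_frequency_mhz(field_tesla):
    """Radio frequency at which proton spins flip: f = γ·B."""
    return GAMMA_PROTON * field_tesla

# Common clinical and research MRI field strengths
for b in (1.5, 3.0, 7.0):
    print(f"{b} T  →  {larmor_frequency_mhz(b):.1f} MHz")
# 1.5 T → 63.9 MHz, 3.0 T → 127.7 MHz, 7.0 T → 298.0 MHz
```

These are ordinary radio frequencies, which is why an MRI scanner can excite and listen to spins with what is essentially a very sensitive radio antenna.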
Einstein's 1905 paper — the one that won him the Nobel Prize — explained that light comes in discrete packets, now called photons. Each photon has energy proportional to its frequency: E = hf.
When a photon hits a solar cell, it knocks an electron free — generating current. This only works because light is quantised. Classical wave theory predicted that light of any frequency should eject electrons if made bright enough; in reality, below a threshold frequency no electrons emerge at all, no matter the intensity — because each electron must absorb a single photon carrying enough energy. Photons are real.
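The threshold behaviour drops straight out of E = hf: an electron is ejected only if one photon's energy exceeds the material's work function. A sketch with rounded constants; the sodium work function below is an illustrative textbook value:

```python
h = 6.626e-34   # Planck's constant (J·s)
c = 3.0e8       # speed of light (m/s)
eV = 1.602e-19  # joules per electronvolt

def ejected_electron_energy_eV(wavelength_nm, work_function_eV):
    """Max kinetic energy of a photoelectron: E = hf - φ.
    Returns None below the threshold frequency (no emission)."""
    photon_eV = h * c / (wavelength_nm * 1e-9) / eV
    kinetic = photon_eV - work_function_eV
    return kinetic if kinetic > 0 else None

# Sodium surface, work function ≈ 2.3 eV
print(ejected_electron_energy_eV(400, 2.3))  # ~0.8 eV: violet light ejects electrons
print(ejected_electron_energy_eV(700, 2.3))  # None: red light ejects nothing, however bright
```

Making the red light a million times brighter just means a million times more photons, each individually too weak — which is exactly what classical wave theory could not explain.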
A classical bit is 0 or 1. A qubit (quantum bit) can be 0, 1, or any superposition of both simultaneously. Entangle multiple qubits and you have an exponentially large state space — a 300-qubit computer can represent more states than there are atoms in the observable universe.
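The measurement statistics of a single qubit and of a Bell pair can be mimicked classically for intuition (no real quantum hardware or interference here, just the probabilities). A toy sketch: a qubit as a pair of amplitudes, and a Bell pair as two outcomes forced to agree:

```python
import random

# A qubit as complex amplitudes (α, β) with |α|² + |β|² = 1
def measure(state, shots=10000):
    """Sample measurement outcomes; returns the fraction of 0s. P(0) = |α|²."""
    p0 = abs(state[0]) ** 2
    return sum(1 for _ in range(shots) if random.random() < p0) / shots

plus = (2 ** -0.5, 2 ** -0.5)  # equal superposition of 0 and 1
print(measure(plus))           # ≈ 0.5: half the shots give 0, half give 1

# Bell pair (|00⟩ + |11⟩)/√2: in this basis, the two outcomes always agree
def measure_bell_pair():
    outcome = random.choice((0, 1))  # 50/50 for each qubit alone...
    return outcome, outcome          # ...but both qubits share the result

a, b = measure_bell_pair()
print(a == b)  # True, every time
```

What a classical simulation cannot do cheaply is track *all* the amplitudes of many entangled qubits at once: n qubits need 2^n amplitudes, which is where the exponential state space comes from.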
Quantum algorithms exploit interference to amplify correct answers and cancel wrong ones. Shor's algorithm factors large numbers exponentially faster than the best known classical methods — which would break RSA encryption. Grover's algorithm searches an unsorted list in about √N steps instead of N.
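Grover's √N speed-up can be simulated exactly for a tiny search space by tracking the state vector directly. A sketch for N = 16 items with a hypothetical target index:

```python
import math

# Toy Grover search over N = 16 items, target index 11 (pure state-vector simulation)
N, target = 16, 11
amps = [1 / math.sqrt(N)] * N  # start in a uniform superposition

iterations = round(math.pi / 4 * math.sqrt(N))  # optimal ≈ (π/4)·√N steps
for _ in range(iterations):
    amps[target] = -amps[target]         # oracle: flip the phase of the target
    mean = sum(amps) / N
    amps = [2 * mean - a for a in amps]  # diffusion: inversion about the mean

p_target = amps[target] ** 2
print(iterations, round(p_target, 3))  # 3 0.961: near-certainty after only 3 steps
```

A classical search over 16 items needs 8 looks on average; Grover gets ~96% success probability in 3. The gap grows as √N, which is dramatic for large N.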
Quantum key distribution (QKD) uses entangled photons to share encryption keys. Because of the no-cloning theorem — a consequence of quantum mechanics — any eavesdropper physically cannot copy a quantum state without disturbing it.
If someone intercepts the key exchange, both parties will detect statistical anomalies in their measurements and know the channel is compromised. The security comes from the laws of physics, not computational difficulty.
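The detection logic is easiest to see in the single-photon BB84 protocol, a sibling of the entanglement-based scheme: an intercept-resend eavesdropper who guesses the measurement basis corrupts about 25% of the sifted key. A toy Monte Carlo sketch (classical simulation of the measurement statistics only):

```python
import random

random.seed(7)  # deterministic demo

def bb84_error_rate(n_photons=20000, eavesdrop=True):
    """Error rate Alice and Bob see on their sifted key, with or without
    an intercept-resend eavesdropper (Eve)."""
    errors = sifted = 0
    for _ in range(n_photons):
        bit = random.getrandbits(1)      # Alice's key bit
        a_basis = random.getrandbits(1)  # Alice's encoding basis
        wire_basis, wire_bit = a_basis, bit
        if eavesdrop:
            e_basis = random.getrandbits(1)
            # Wrong basis → Eve's result is random, and she re-sends in her own basis
            e_result = bit if e_basis == a_basis else random.getrandbits(1)
            wire_basis, wire_bit = e_basis, e_result
        b_basis = random.getrandbits(1)
        b_result = wire_bit if b_basis == wire_basis else random.getrandbits(1)
        if b_basis == a_basis:           # sifting: keep only matching-basis rounds
            sifted += 1
            errors += b_result != bit
    return errors / sifted

print(bb84_error_rate(eavesdrop=False))  # 0.0: clean channel, no errors
print(round(bb84_error_rate(), 2))       # ≈ 0.25: Eve's presence is unmistakable
```

Alice and Bob simply compare a random sample of their sifted bits: a ~25% disagreement rate cannot be hidden, so they discard the key and start over.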