Fundamental Theory

The advent of quantum theory at the turn of the 20th century led to a fundamental change in our understanding of the nature of reality. Quantum theory is inherently probabilistic, resulting in seemingly counterintuitive behaviour. Whilst the philosophical meaning of the theory is still debated, it has become one of the best-tested theories in science. Our understanding of quantum mechanics has provided deeper insight into a multitude of fields within physics. It has become indispensable in other scientific disciplines such as chemistry and biology, whilst also influencing mathematics and computer science.

1900: Quantisation of Light

The continuous nature of energy was challenged by the ultraviolet catastrophe, whereby classical theories failed to explain experimental observations of the black-body radiation spectrum, instead predicting that a hot body should emit ever more power at shorter wavelengths. Planck resolved this failure by quantising black-body energy into discrete packets.
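
Planck's hypothesis can be stated compactly (a standard formulation in modern notation, not taken verbatim from the reference below): the energy of an oscillator of frequency ν is restricted to integer multiples of a fundamental quantum,

$$E = nh\nu, \qquad n = 0, 1, 2, \ldots$$

where h is Planck's constant. Summing over the radiation modes with this restriction reproduces the observed black-body spectrum.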

Reference: https://archive.org/details/theoryofheatradi00planrich/page/n5/mode/2up

This minutephysics video details how Planck quantised the electromagnetic field and how this laid the foundations of quantum theory.

Above: a portrait of Max Planck. Planck was born in what is present-day Germany in 1858. His proposed energy quantisation laid the groundwork from which quantum theory grew.

1905: Photoelectric Effect

The photoelectric effect is a phenomenon whereby light incident upon a metal must be above a threshold frequency in order to eject electrons from the metal, irrespective of the light's intensity. Whilst this couldn't be explained using classical theories, Einstein used Planck's concept of energy quantisation to explain these experimental observations. He theorised that light is composed of discrete packets, later named photons, with energy dependent on frequency. This discovery later earned Einstein the 1921 Nobel Prize in Physics.
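
Einstein's explanation can be summarised in one relation (standard modern notation, with symbols defined here rather than drawn from the article below): the maximum kinetic energy of an ejected electron is the photon energy minus the work function φ of the metal,

$$E_{\mathrm{max}} = h\nu - \phi$$

so electrons are ejected only above the threshold frequency ν₀ = φ/h, no matter how intense the light.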

Reference: http://astro1.panet.utoledo.edu/~ljc/PE_eng.pdf

This article details how Einstein built off the work of Planck to explain the photoelectric effect.

Above: A portrait of Albert Einstein taken in 1928, seven years after he won his Nobel Prize in Physics. While most people picture Einstein as an older man with wild hair, he published his most impactful work when he was in his twenties!

1924: Wave Particle Duality

De Broglie proposed that matter should show wave-like behaviour, complementing the particle-like behaviour of electromagnetic waves. These matter waves have a wavelength inversely proportional to the particle's momentum. He was awarded the 1929 Nobel Prize in Physics for his work.
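
De Broglie's relation, in standard notation, links a particle's wavelength λ to its momentum p through Planck's constant:

$$\lambda = \frac{h}{p}$$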

Reference: https://fondationlouisdebroglie.org/LDB-oeuvres/De_Broglie_Kracklauer.pdf

Try to recreate the famous double-slit experiment, which verified the theory of wave-particle duality, in this simulation from the University of Colorado Boulder.

1925: Quantum Wave Mechanics

Schrödinger, building on the work of de Broglie, published his famous equation describing the evolution of a wave function. The wave function describes the properties of an isolated quantum system and provides the probability of obtaining a specific outcome when an observable is measured. The Schrödinger equation laid the foundation for the wave theory of quantum mechanics and earned Schrödinger the 1933 Nobel Prize in Physics.

Reference: https://link.springer.com/content/pdf/10.1007/BF01328377.pdf

Above: The state of the quantum system across space and time is represented by a wavefunction, Ψ, which is acted on by a Hamiltonian that describes the dynamics of the system, e.g. electric fields and interactions between particles. By solving this famous equation with the Hamiltonian for the system, the evolution of the quantum state can be determined. The presence of the imaginary unit i in the equation highlights how complex numbers are integral to describing reality.
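
For reference, the equation described in this caption is the time-dependent Schrödinger equation, which in standard notation reads

$$i\hbar \frac{\partial}{\partial t} \Psi(\mathbf{r}, t) = \hat{H}\, \Psi(\mathbf{r}, t)$$

where ħ is the reduced Planck constant and Ĥ is the Hamiltonian operator.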

1927: Uncertainty Principle

In quantum mechanics, observables have an intrinsic uncertainty that depends upon the quantum state. Heisenberg showed that the uncertainties of position and momentum are fundamentally constrained: the more precisely one is known, the less precisely the other can be.

Reference: https://doi.org/10.1007/BF01397280

This video from PBS SpaceTime explains how the uncertainty principle arises as a natural consequence of quantum mechanics being a wave theory.

Above: The mathematical expression of the uncertainty principle for position and momentum. The left-hand side of the inequality represents the uncertainties in measurements of position and momentum: a measure of how spread out the results would be if you repeated the measurement many times. The product of these two must exceed the quantity ħ/2, placing a fundamental limit on how sure we can be about the results of measurements of conjugate variables.
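
In standard notation, the inequality described in this caption is

$$\Delta x \, \Delta p \geq \frac{\hbar}{2}$$

where ħ = h/2π is the reduced Planck constant.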

1927: Quantum Tunnelling

Hund discovered the phenomenon of quantum tunnelling, whereby quantum particles can pass through barriers that are classically impenetrable.
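
As a rough illustration (a standard textbook result for a rectangular barrier, not Hund's original molecular analysis): a particle of mass m and energy E incident on a barrier of height V > E and width L is transmitted with probability of order

$$T \approx e^{-2\kappa L}, \qquad \kappa = \frac{\sqrt{2m(V - E)}}{\hbar}$$

so tunnelling is exponentially suppressed by barrier width and height, but never strictly forbidden.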

Reference: https://doi.org/10.1063/1.1510281

This GIF from Quanta Magazine gives a visual demonstration of how quantum tunnelling works as a wave packet hits a potential energy barrier.

1928: Dirac Notation

Dirac developed a new notation for quantum mechanics which has since become standard.

Reference: https://doi.org/10.1017/S0305004100021162

Follow this casual guide on Dirac notation to get a basic understanding of how it is used to represent quantum mechanics. Follow this for a more in-depth dive into Dirac notation.
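
As a minimal illustration of the notation: states are written as 'kets', so a general qubit state is the superposition

$$|\psi\rangle = \alpha|0\rangle + \beta|1\rangle, \qquad |\alpha|^2 + |\beta|^2 = 1$$

while the inner product of two states is the 'bra-ket' ⟨φ|ψ⟩, whose squared magnitude gives the probability of finding |ψ⟩ in the state |φ⟩.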

1935: EPR Paradox

Einstein, Podolsky, and Rosen (EPR) proposed a thought experiment that they believed showed quantum mechanics to be an incomplete theory. They argued that measurements made on a pair of distant entangled particles could violate the uncertainty principle, and therefore undermine the foundations of quantum mechanics, if the theory of relativity was upheld.

Reference: https://doi.org/10.1103/PhysRev.47.777

This article details the EPR Paradox and its legacy on the field of quantum physics.

1964: Bell’s Inequality

Bell demonstrated that no local hidden variable theory can reproduce all the predictions of quantum mechanics. Local hidden variable theories require that measurement outcomes are predetermined, via some probability distribution, upon creation of a system. Bell derived an inequality that every local hidden variable theory must satisfy; quantum mechanics predicts violations of this inequality, suggesting that it is a non-local theory.
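
Bell's bound is most often tested in the CHSH form (a later refinement due to Clauser, Horne, Shimony, and Holt, rather than Bell's original 1964 expression): for measurement settings a, a′ and b, b′, any local hidden variable theory must satisfy

$$|E(a, b) + E(a, b') + E(a', b) - E(a', b')| \leq 2$$

where E denotes the correlation between outcomes, whereas quantum mechanics predicts values as large as 2√2.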

Reference: https://cds.cern.ch/record/111654/files/vol1p195-200_001.pdf

Bell’s seminal theorem is explained along with its historical context and implications in this article.

1970: No Cloning Theorem

The no-cloning theorem states that perfect copies of an unknown quantum state cannot be made. Initially proven by Park, it was later generalised in 1982 by work from Wootters and Zurek, in addition to independent work from Dieks, at which point the theorem gained its name. The theorem forms the basis of security for many quantum communication protocols. It also prevents classical error-correcting codes from being directly applied to quantum computers.
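
The core of the proof is a short linearity argument (a sketch in standard notation, not the presentation of the original paper): suppose a unitary U could clone arbitrary states, U(|ψ⟩ ⊗ |0⟩) = |ψ⟩ ⊗ |ψ⟩. Since unitaries preserve inner products, applying this to two states |ψ⟩ and |φ⟩ gives

$$\langle\phi|\psi\rangle = \langle\phi|\psi\rangle^2$$

so ⟨φ|ψ⟩ must equal 0 or 1: only identical or orthogonal states can be cloned, never an arbitrary unknown state.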

Reference: https://doi.org/10.1007/BF00708652

Watch this minutephysics video to find out more about the no-cloning theorem.

1982: First Bell Violation

Aspect et al. demonstrated the first violation of Bell’s inequality using polarisation-entangled photons, giving the first evidence of the non-local nature of reality. However, sceptics identified certain loopholes through which a local theory could evade this conclusion.

Reference: https://doi.org/10.1103/PhysRevLett.49.1804

This review article discusses the loopholes that can invalidate the result of a Bell inequality test. It also discusses the details of how one would perform an experiment of this kind.

Above: A modern example of the optics setup needed to demonstrate a Bell violation. Setups like this are commonplace in experimental physics labs; this one comes from Dr Siddarth Joshi’s lab right here at the University of Bristol. The principles of bulk optics setups have not changed since the very first experiments, even though the design of the components has substantially improved.

Optical components such as mirrors, beam splitters and fibre-optic cables are mounted on posts and bolted to an optical table. They are then precisely aligned to guide the light along the required path. Small changes in temperature, or vibrations in the lab, can completely misalign the experiment, motivating the development and use of integrated optics.

1987: HOM Interference

Hong, Ou, and Mandel demonstrated that two indistinguishable photons entering the two input ports of a beam splitter interfere and both emerge from the same output port. This result is integral to many fields of quantum physics but was initially proposed to measure small timing intervals for metrology.
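
The effect follows from how a 50:50 beam splitter transforms a photon pair (a standard calculation, with a sign convention chosen here for illustration): one photon in each input port evolves as

$$|1, 1\rangle \;\longrightarrow\; \frac{1}{\sqrt{2}}\left(|2, 0\rangle - |0, 2\rangle\right)$$

The two amplitudes for one photon in each output interfere destructively and cancel, so perfectly indistinguishable photons always exit together and coincidence counts vanish.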

Reference: https://doi.org/10.1103/PhysRevLett.59.2044

This review article covers HOM interference in detail and its application to different situations.

Above: A graph of an experimental HOM dip, the signature of HOM interference, provided by Dr Alex Clark. If we send two identical photons into a beam splitter and place photon detectors at the outputs, we expect the two detectors never to click at the same time, as both photons exit through the same output. If one of the photons has travelled a different distance, the photons are no longer indistinguishable and so will not interfere, meaning there is a chance that both detectors click at once (called a coincidence), as the photons can exit through different outputs. By changing the difference in distance travelled, we control how indistinguishable the photons are and therefore how many coincidences we see. When the distances are the same, we see no coincidences, leading to a dip in the graph.

1993: Quantum Teleportation Theorised

Bennett et al. published a paper proposing quantum teleportation. The scheme consists of two parties who share a maximally entangled pair of qubits. By making a local measurement on the unknown qubit together with their half of the pair, and communicating the outcome classically, one party can project the state of their qubit onto the other party’s qubit.
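
The protocol rests on a simple identity (standard notation: qubit 1 holds the unknown state, qubits 2 and 3 the shared pair |Φ⁺⟩ = (|00⟩ + |11⟩)/√2):

$$|\psi\rangle_1 |\Phi^+\rangle_{23} = \frac{1}{2}\left[\,|\Phi^+\rangle_{12}|\psi\rangle_3 + |\Phi^-\rangle_{12} Z|\psi\rangle_3 + |\Psi^+\rangle_{12} X|\psi\rangle_3 + |\Psi^-\rangle_{12} XZ|\psi\rangle_3\,\right]$$

A Bell measurement on qubits 1 and 2 therefore leaves qubit 3 in |ψ⟩ up to a known Pauli correction, which the two classical bits of the measurement outcome tell the receiver how to undo.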

Reference: https://journals.aps.org/prl/pdf/10.1103/PhysRevLett.70.1895

Follow through this Brilliant article for an explanation of the concepts and mathematics behind quantum teleportation, or watch this minutephysics video.

1996: Entanglement Concentration

Popescu, with collaborators, demonstrated that partially entangled qubit pairs may be transformed into a smaller number of maximally entangled pairs. This process uses only local operations and classical communication, and was shown to asymptotically conserve the degree of entanglement in the system. Notably, the process can be reversed to generate any partially entangled state using only standard singlet states.
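
The conserved quantity is the entropy of entanglement (a standard definition, where ρ_A is the reduced state of one half of the pair):

$$E(\psi) = -\operatorname{Tr}\left(\rho_A \log_2 \rho_A\right)$$

Asymptotically, n copies of a partially entangled state |ψ⟩ can be converted into approximately nE(ψ) maximally entangled singlet pairs, and vice versa.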

Reference: https://journals.aps.org/pra/pdf/10.1103/PhysRevA.53.2046

2000: Quantum Computation and Quantum Information

Michael Nielsen and Isaac Chuang, sometimes affectionately labelled ‘Mike and Ike’, first published “Quantum Computation and Quantum Information”. This textbook has since been regarded as the go-to resource for all in the field.

Follow this link to get a copy from the University library!

2002: Proof Entanglement is Needed

Jozsa and Linden proved the necessity of multi-particle entanglement for achieving an exponential quantum-computational speed-up. This is because systems containing only a small amount of entanglement can be simulated efficiently on a classical computer. They ultimately caution that entanglement should not be treated as the only key to quantum speed-up, as large-scale entanglement can still be simulated efficiently in some circumstances.

Reference: https://doi.org/10.1098/rspa.2002.1097

2009: Quantum Thermal Equilibrium

Linden, Popescu, Short, and Winter proposed a mechanism by which quantum systems equilibrate when in contact with a thermal reservoir. This result was one of many contributions towards the effort to derive the laws of thermodynamics as an emergent property of quantum mechanics. This foundational area of research has the potential to underpin future quantum technologies such as nanoscale heat engines, refrigerators and batteries.

Reference: https://journals.aps.org/pre/pdf/10.1103/PhysRevE.79.061103

Read this Quanta Magazine article for an overview of the field of quantum thermodynamics. For a more detailed introduction, read this review article about the role of quantum information in quantum thermodynamics or this quantum thermodynamics review article.

2015: Loophole Free Bell Violation

The first Bell test claiming to close all significant loopholes was performed by Hensen et al., using entangled electron spins separated by 1.3 km. This provided strong evidence that nature is not described by any local realistic theory.

Reference: https://doi.org/10.1038/nature15759

Read this Physics World article on the world’s first loophole-free Bell test.

2017: Quantum Volume Proposed

Quantum volume was proposed as a hardware-agnostic metric for the performance of quantum computing devices. It assesses the amount of useful quantum computation a device can perform across both space (the number of qubits) and time (the depth of circuits it can run reliably).

Reference: https://storageconsortium.de/content/sites/default/files/quantum-volumehp08co1vbo0cc8fr.pdf

This article explains the quantum volume metric through the lens of Qiskit, IBM’s quantum programming framework.

Above: This equation defines quantum volume, the maximum-sized circuit a quantum computer can implement. It is the logarithmic version of the original quantum volume equation, as defined by IBM. The terms are defined as follows: n is the number of qubits used and d is the achievable circuit depth, which takes the effective error rate into account. The ‘argmax’ operation then finds the largest circuit with n = d that the device can run.
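
In the caption’s notation, the logarithmic form of the equation (following the definition in the paper referenced above, with the maximisation running over the qubit subsets available on the device) is

$$\log_2 V_Q = \operatorname{argmax}_n \min\left(n, d(n)\right)$$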
