Quantum mechanics is a fundamental theory in physics that provides a description of the physical properties of nature at the scale of atoms and subatomic particles. It is the foundation of all quantum physics including quantum chemistry, quantum field theory, quantum technology, and quantum information science.
Classical physics, the description of physics that existed before the theory of relativity and quantum mechanics, describes many aspects of nature at an ordinary (macroscopic) scale, while quantum mechanics explains the aspects of nature at small (atomic and subatomic) scales, for which classical mechanics is insufficient. Most theories in classical physics can be derived from quantum mechanics as an approximation valid at large (macroscopic) scale.
Quantum mechanics differs from classical physics in that energy, momentum, angular momentum, and other quantities of a bound system are restricted to discrete values (quantization), objects have characteristics of both particles and waves (wave-particle duality), and there are limits to how accurately the value of a physical quantity can be predicted prior to its measurement, given a complete set of initial conditions (the uncertainty principle).[note 1]
Quantum mechanics arose gradually, from theories to explain observations which could not be reconciled with classical physics, such as Max Planck's solution in 1900 to the black-body radiation problem, and the correspondence between energy and frequency in Albert Einstein's 1905 paper which explained the photoelectric effect. Early quantum theory was profoundly re-conceived in the mid-1920s by Niels Bohr, Erwin Schrödinger, Werner Heisenberg, Max Born and others. The original interpretation of quantum mechanics is the Copenhagen interpretation, developed by Niels Bohr and Werner Heisenberg in Copenhagen during the 1920s. The modern theory is formulated in various specially developed mathematical formalisms. In one of them, a mathematical function, the wave function, provides information about the probability amplitude of energy, momentum, and other physical properties of a particle.
In classical physics, reality is described by objects, such as particles or fields, with defined spatial dependence: in the state of a classical system, at each moment of time, every particle has a definite position, and every field has a definite value at each position.
By contrast, a quantum state is a combination, called a superposition, of several different options for classical states, and these different options can interact with each other in a process called interference. Whether this picture of reality holds only at the particle level, or for large objects as well, is contested, with the answer depending on the interpretation of quantum mechanics; the possible answers, however, have little or no measurable effect, since interference cannot happen in large objects due to a process known as decoherence.
Thus, the state of a quantum system is described as a linear combination of classical states, or more generally as a vector in some linear space, where the classical (or other) states may be used as a vector basis for this space.
A quantum state can be described as a vector, called a state vector, in a linear space spanned by basis vectors. Its evolution in time can be thought of as stemming from the interaction of these basis vectors, with the coefficient of each basis vector representing the measure by which this basis vector influences this interaction and time evolution. These coefficients define an inner product on the linear space, making it an inner product space (and in particular a normed vector space).
Since, colloquially speaking, time evolution from one moment to the next is thought to depend only on the previous moment, time evolution is represented by a differential equation called the Schrödinger equation that is first order with respect to time: the change in a state vector over a short time interval (t, t+dt) equals the interval length dt times a linear operator acting on the state vector. This linear operator is the Hamiltonian operator H, up to a constant of proportionality that will be explained below. Thus in any vector basis chosen to span the linear space, the Hamiltonian takes the form of a matrix, whose diagonal elements describe the time evolution of each basis vector independently, and whose off-diagonal elements describe the effect of one basis vector on the evolution of another. Also note that in order for the time derivative to be defined everywhere, the vector space is taken to be a Hilbert space.
The derivative with respect to time is defined by d|ψ(t)⟩/dt = (−i/ℏ) H |ψ(t)⟩, so that time evolution along a time t can be written as the following operator:

U(t) = e^(−iHt/ℏ)
The constant 1/ℏ is introduced so that the Hamiltonian is reduced to the classical Hamiltonian in cases where the quantum system can be approximated by a classical system; the ability to make such an approximation in certain limits is called the correspondence principle. The factor −i is explained further below; the universal constant ℏ has the role of transforming between the energy units of the Hamiltonian and the frequency units (one over time) of the derivative. It is related to the Planck constant h by a factor of 2π, namely ℏ = h/2π, as we now see.
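The time evolution described above can be sketched numerically. The following is a minimal illustration (NumPy, with ℏ = 1 and an arbitrary two-level Hamiltonian chosen purely for demonstration) showing that U(t) = e^(−iHt/ℏ) preserves the norm of the state and agrees, over a short interval dt, with the first-order Schrödinger update:

```python
import numpy as np

# Toy two-level system; matrix entries are illustrative, hbar = 1.
hbar = 1.0
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])          # a Hermitian Hamiltonian

# U(t) = exp(-i H t / hbar), computed via the eigendecomposition of H.
t = 0.7
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * t / hbar)) @ evecs.conj().T

psi0 = np.array([1.0, 0.0], dtype=complex)   # initial state vector
psi_t = U @ psi0
print(np.linalg.norm(psi_t))                 # norm is preserved: ~1.0

# Over a short interval dt, U reproduces the first-order update
# psi(dt) ≈ psi(0) - (i/hbar) * H @ psi(0) * dt
dt = 1e-6
evolved = evecs @ np.diag(np.exp(-1j * evals * dt / hbar)) @ evecs.conj().T @ psi0
euler = psi0 - 1j / hbar * (H @ psi0) * dt
print(np.abs(evolved - euler).max())         # tiny: O(dt^2)
```

The eigendecomposition route works because H is Hermitian; for large systems one would use a dedicated matrix-exponential or time-stepping routine instead.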
Since the quantum mechanical Hamiltonian must behave as a classical Hamiltonian in certain limits, in general it may depend on quantities that can be classically interpreted as position and momentum. Every basis vector may have different values of these, so position and momentum must be defined as linear operators in quantum mechanics, having different results on different basis vectors. 
This means that they are not simultaneously defined accurately over a quantum state. If the position is defined within an error Δx, and the momentum is defined within an error Δp, then these satisfy

Δx · Δp ≈ ℏ

where there can be a numerical factor preceding the ℏ, depending on the precise definition of "within an error" here. Note that when measuring a state, the errors can always be larger, which is why we have the following relation, known as the uncertainty principle:

Δx · Δp ≥ ℏ/2
This relation is identical to the one found in the Fourier transform, so that a description of an object according to its momentum is the Fourier transform of its description according to its position. Thus an object with a defined momentum p is in fact a wave, with its quantum state depending upon position x according to:

ψ(x) ∝ e^(ipx/ℏ)
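The Fourier-transform origin of the uncertainty relation can be checked numerically. The sketch below (NumPy, ℏ = 1, grid sizes chosen arbitrarily) builds a Gaussian wave packet, obtains its momentum description by Fourier transform, and confirms that the widths saturate Δx · Δp = 1/2:

```python
import numpy as np

# Numerical check (hbar = 1) that a Gaussian wave packet saturates
# the uncertainty bound; N and L are arbitrary grid choices.
N, L = 4096, 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]

psi = np.exp(-x**2 / 2).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)        # normalize

# The momentum-space description is the Fourier transform of the position one.
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)
phi = np.fft.fft(psi) * dx / np.sqrt(2 * np.pi)
dp = p[1] - p[0]

dx_sigma = np.sqrt(np.sum(x**2 * np.abs(psi)**2) * dx)   # <x> = 0 by symmetry
dp_sigma = np.sqrt(np.sum(p**2 * np.abs(phi)**2) * dp)
print(dx_sigma * dp_sigma)   # ≈ 0.5, i.e. hbar/2 with hbar = 1
```

A Gaussian is the minimum-uncertainty case; any other packet shape would give a product strictly greater than 1/2.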
This is why the basic constituents of quantum mechanics have both particle-like and wave-like properties: They behave as particles when their position is well-defined, and as waves when their momentum is well-defined. This is known as the wave-particle duality.
The fact that the dependence in momentum is the Fourier transform of the dependence in position further means that the momentum operator is equivalent (up to a factor of −iℏ) to taking the derivative with respect to position, since in Fourier analysis differentiation corresponds to multiplication in the dual space. This is why in quantum equations in position space, the momentum p is replaced by −iℏ ∂/∂x, and in particular in the non-relativistic Schrödinger equation in position space the momentum-squared term is replaced with a Laplacian times −ℏ².
Finally, note that the same relation seen between position and momentum can be derived for every pair of a generalized coordinate and its conjugate momentum (see canonical coordinates). Furthermore, since the classical Hamiltonian is the energy of the system, and in quantum mechanics it is the time derivative (up to a factor of iℏ), energy and time follow the same relations as momentum and position (up to a sign): the time-dependence of an object with energy E is by a factor of:

e^(−iEt/ℏ)
And time and energy obey a similar uncertainty principle:

ΔE · Δt ≥ ℏ/2

where ΔE is the error in the energy measurement, and Δt is the time this measurement takes.
The uncertainty principle gives a natural solution to one of the main paradoxes of classical statistical mechanics: when counting different microstates in the phase space, one has to assume a minimal volume in phase space; this volume is the minimal product of a change in position and a change in momentum. In classical physics there is no such minimal volume, but in quantum mechanics it is automatically given by the fact that position and momentum are mutually definable only to a given accuracy. The phase-space volume for any single space dimension turns out to be h, the Planck constant, whose value can already be measured in semi-classical thermodynamic phenomena, such as black-body radiation.
When a measurement is performed, the introduction of a measurement device changes the Hamiltonian of the observed system. Note that such a measurement device may be any large object interacting with the observed system - including a lab measurement device, eyes, ears, cameras, microphones etc. When the measurement device is coupled to the observed system, the change in the Hamiltonian can be described by adding to the Hamiltonian a linear operator that couples the time evolution of the observed system to that of the measurement device. This linear operator can thus be described as the product of a measurement operator, acting on the observed system, with another operator, acting on the measurement device.
After the observed system and the measurement device interact in a manner described by this operator, they are said to be entangled, so that the quantum state of the measurement device together with the observed system is a superposition of different states, with each such state consisting of two parts: A state of the observed system with a particular measurement value, and a corresponding state of the measurement device measuring this particular value. For example, if the position of a particle is measured, the quantum state of the measurement device together with the particle will be a superposition of different states, in each of which the particle has a defined position and the measurement device shows this position; e.g. if the particle has two possible positions, x1 and x2, the overall state would be a linear combination of (particle at x1 and device showing x1) with (particle at x2 and device showing x2). The coefficients of this linear combination are called probability amplitudes; they are the inner products of the physical state with the basis vectors.
Because the measurement device is a large object, the different states where it shows different measurement results can no longer interact with each other due to a process called decoherence. Any observer (e.g. the physicist) only measures one of the results, with a probability that depends on the probability amplitude of that result according to the Born rule. How this happens is a matter of interpretation: either only one of the results will continue to exist due to a hypothetical process called wavefunction collapse, or all results will co-exist in different hypothetical worlds, with the observer we know of living in one of these worlds.
After a quantum state is measured, the only relevant part of it (due to decoherence and possibly also wavefunction collapse) has a well-defined value of the measurement operator. This means that it is an eigenstate of the measurement operator, with the measured value being the eigenvalue. Thus the different parts corresponding to the possible outcomes of the measurement are given by looking at the quantum state in a vector basis in which all basis vectors are eigenvectors of the measurement operator, i.e. a basis which diagonalizes this operator. Thus the measurement operator has to be diagonalizable. Further, if the possible measurement results are all real numbers, then the measurement operator must be Hermitian.
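Diagonalizing a Hermitian measurement operator is a standard linear-algebra computation. The sketch below (NumPy, with arbitrary illustrative matrix entries) shows that a Hermitian operator has real eigenvalues - the possible measurement results - and an eigenvector basis in which it is diagonal:

```python
import numpy as np

# A Hermitian "measurement operator"; entries are illustrative.
M = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, -1.0]])

eigenvalues, eigenvectors = np.linalg.eigh(M)
print(eigenvalues)                          # real numbers: possible results

# The eigenvector basis diagonalizes M, eigenvalues on the diagonal.
D = eigenvectors.conj().T @ M @ eigenvectors
print(np.round(D, 12))
```

`np.linalg.eigh` is specific to Hermitian matrices; for a general diagonalizable operator one would use `np.linalg.eig` and the eigenvalues need not be real.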
As explained previously, the measurement process, e.g. measuring the position of an electron, can be described as consisting of an entanglement of the observed system with the measuring device, so that the overall physical state is a superposition of states, each of which consists of a state for the observed system (e.g. the electron) with defined measured value (e.g. position), together with a corresponding state of the measuring device showing this value. It is usually possible to analyze the possible results with the corresponding probabilities without analyzing the complete quantum description of the whole system: Only the part relevant to the observed system (the electron) should be taken into account. In order to do that, we only have to look at the probability amplitude for each possible result, and sum over all resulting probabilities. This computation can be performed through the use of the density matrix of the measured object.
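The density-matrix computation mentioned above can be sketched for the two-position example (NumPy; the amplitudes 0.6 and 0.8 are the same illustrative values as before). Tracing out the measurement device leaves the reduced density matrix of the particle, whose diagonal gives the outcome probabilities:

```python
import numpy as np

# Entangled particle+device state, basis ordering kron(particle, device).
a, b = 0.6, 0.8
state = np.zeros(4, dtype=complex)
state[0] = a          # (particle at x1, device shows x1)
state[3] = b          # (particle at x2, device shows x2)

rho = np.outer(state, state.conj())               # full density matrix (4x4)
rho4 = rho.reshape(2, 2, 2, 2)                    # axes: (p, d, p', d')
rho_particle = np.trace(rho4, axis1=1, axis2=3)   # partial trace over device

print(np.diag(rho_particle).real)   # [0.36, 0.64]: the outcome probabilities
```

Note that the reduced matrix has no off-diagonal terms here: once entangled with the device, the particle on its own behaves like a classical probabilistic mixture.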
It can be shown that under the above definition for the inner product, the time evolution operator is unitary, a property often referred to as the unitarity of the theory. This is equivalent to stating that the Hamiltonian is Hermitian:

H = H†
This is desirable in order for the Hamiltonian to correspond to the classical Hamiltonian, which is why the -i factor is introduced (rather than defining the Hamiltonian with this factor included in it, which would result in an anti-Hermitian Hamiltonian). Indeed, in classical mechanics the Hamiltonian of a system is its energy, and thus in an energy measurement of an object, the measurement operator is the part of the Hamiltonian relating to this object. The energy is always a real number, and indeed the Hamiltonian is Hermitian.
Let us choose a vector basis that diagonalizes a certain measurement operator; then, if this measurement is performed, the probability of getting the measurement result corresponding to a particular basis vector must somehow depend on the inner product of the physical state with this basis vector, i.e. on the probability amplitude for this result. It turns out to be the absolute square of the probability amplitude; this is known as the Born rule.
Note that the probability given by the Born rule to get a particular state is simply the squared norm of the corresponding component of the state. Unitarity then means that the sum of probabilities over any isolated set of states is invariant under time evolution, as long as there is no wavefunction collapse. Indeed, interpretations with no wavefunction collapse (such as the many-worlds interpretation) always exhibit unitary time evolution, while interpretations which include wavefunction collapse (such as the Copenhagen interpretation) involve both unitary and non-unitary time evolution, the latter occurring during wavefunction collapse.
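Both points - Born-rule probabilities as squared amplitudes, and unitary evolution preserving their sum - can be checked in a few lines (NumPy, ℏ = 1; the state amplitudes and Hamiltonian entries are illustrative):

```python
import numpy as np

psi = np.array([0.6, 0.8j])                 # amplitudes in some basis
probs = np.abs(psi)**2                      # Born rule
print(probs, probs.sum())                   # [0.36, 0.64], total 1.0

H = np.array([[0.0, 1.0], [1.0, 0.0]])      # a Hermitian Hamiltonian
evals, evecs = np.linalg.eigh(H)
U = evecs @ np.diag(np.exp(-1j * evals * 0.3)) @ evecs.conj().T

psi_t = U @ psi
print(np.abs(psi_t)**2)                     # individual probabilities change...
print(np.sum(np.abs(psi_t)**2))             # ...but their sum stays 1.0
```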
Scientific inquiry into the wave nature of light began in the 17th and 18th centuries, when scientists such as Robert Hooke, Christiaan Huygens and Leonhard Euler proposed a wave theory of light based on experimental observations. In 1803 English polymath Thomas Young described the famous double-slit experiment. This experiment played a major role in the general acceptance of the wave theory of light.
In 1838 Michael Faraday discovered cathode rays. These studies were followed by the 1859 statement of the black-body radiation problem by Gustav Kirchhoff, the 1877 suggestion by Ludwig Boltzmann that the energy states of a physical system can be discrete, and the 1900 quantum hypothesis of Max Planck. Planck's hypothesis that energy is radiated and absorbed in discrete "quanta" (or energy packets) precisely matched the observed patterns of black-body radiation.
In 1896 Wilhelm Wien empirically determined a distribution law of black-body radiation, called Wien's law. Ludwig Boltzmann independently arrived at this result by considerations of Maxwell's equations. However, it was valid only at high frequencies and underestimated the radiance at low frequencies.
The foundations of quantum mechanics were established during the first half of the 20th century by Max Planck, Niels Bohr, Werner Heisenberg, Louis de Broglie, Arthur Compton, Albert Einstein, Richard Feynman, Erwin Schrödinger, Max Born, John von Neumann, Paul Dirac, Enrico Fermi, Wolfgang Pauli, Max von Laue, Freeman Dyson, David Hilbert, Wilhelm Wien, Satyendra Nath Bose, Arnold Sommerfeld, and others. The Copenhagen interpretation of Niels Bohr became widely accepted.
Max Planck corrected this model using Boltzmann's statistical interpretation of thermodynamics and proposed what is now called Planck's law, which led to the development of quantum mechanics. After Planck's solution in 1900 to the black-body radiation problem (reported 1859), Albert Einstein offered a quantum-based explanation of the photoelectric effect (1905, reported 1887). Around 1900-1910, the atomic theory but not the corpuscular theory of light first came to be widely accepted as scientific fact; these latter theories can be considered quantum theories of matter and electromagnetic radiation, respectively. However, the photon theory was not widely accepted until about 1915. Even until Einstein's Nobel Prize, Niels Bohr did not believe in the photon.
Among the first to study quantum phenomena were Arthur Compton, C. V. Raman, and Pieter Zeeman, each of whom has a quantum effect named after him. Robert Andrews Millikan studied the photoelectric effect experimentally, and Albert Einstein developed a theory for it. At the same time, Ernest Rutherford experimentally discovered the nuclear model of the atom, and Niels Bohr developed a theory of atomic structure, confirmed by the experiments of Henry Moseley. In 1913 Peter Debye extended Bohr's theory by introducing elliptical orbits, a concept also introduced by Arnold Sommerfeld. This phase is known as old quantum theory.
According to Planck, each energy element (E) is proportional to its frequency (ν):

E = hν
where h is Planck's constant.
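As a worked example of Planck's relation, the energy of a single quantum of green light can be computed directly (the 532 nm wavelength is an illustrative choice, typical of a green laser pointer):

```python
# Planck's relation E = h * nu for a single photon.
h = 6.62607015e-34      # Planck constant, J*s (exact SI value)
c = 299792458.0         # speed of light, m/s (exact SI value)
wavelength = 532e-9     # m; illustrative green-light wavelength
nu = c / wavelength     # frequency, Hz
E = h * nu              # energy of one quantum, J
print(E)                # ≈ 3.73e-19 J
```

The tiny result - a few times 10⁻¹⁹ joules - is why the discreteness of light is invisible at everyday scales.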
Planck cautiously insisted that this was only an aspect of the processes of absorption and emission of radiation and was not the physical reality of the radiation. In fact, he considered his quantum hypothesis a mathematical trick to get the right answer rather than a sizable discovery. However, in 1905 Albert Einstein interpreted Planck's quantum hypothesis realistically and used it to explain the photoelectric effect, in which shining light on certain materials can eject electrons from the material. Einstein won the 1921 Nobel Prize in Physics for this work.
Einstein further developed this idea to show that an electromagnetic wave such as light could also be described as a particle (later called the photon), with a discrete amount of energy that depends on its frequency. In his paper "On the Quantum Theory of Radiation," Einstein expanded on the interaction between energy and matter to explain the absorption and emission of energy by atoms. Although overshadowed at the time by his general theory of relativity, this paper articulated the mechanism underlying the stimulated emission of radiation, which became the basis of the laser.
In the mid-1920s quantum mechanics was developed to become the standard formulation for atomic physics. In the summer of 1925, Bohr and Heisenberg published results that closed the old quantum theory. Due to their particle-like behavior in certain processes and measurements, light quanta came to be called photons (1926). In 1926 Erwin Schrödinger suggested a partial differential equation for the wave functions of particles like electrons. And when effectively restricted to a finite region, this equation allowed only certain modes, corresponding to discrete quantum states - whose properties turned out to be exactly the same as implied by matrix mechanics. Einstein's simple postulation spurred a flurry of debate, theorizing, and testing. Thus, the entire field of quantum physics emerged, leading to its wider acceptance at the Fifth Solvay Conference in 1927.
By 1930 quantum mechanics had been further unified and formalized by David Hilbert, Paul Dirac and John von Neumann with greater emphasis on measurement, the statistical nature of our knowledge of reality, and philosophical speculation about the 'observer'. It has since permeated many disciplines, including quantum chemistry, quantum electronics, quantum optics, and quantum information science. It also provides a useful framework for many features of the modern periodic table of elements, and describes the behaviors of atoms during chemical bonding and the flow of electrons in computer semiconductors, and therefore plays a crucial role in many modern technologies. Its speculative modern developments include string theory and quantum gravity theory.
The word quantum derives from the Latin, meaning "how great" or "how much". In quantum mechanics, it refers to a discrete unit assigned to certain physical quantities such as the energy of an atom at rest (see Figure 1). The discovery that particles are discrete packets of energy with wave-like properties led to the branch of physics dealing with atomic and subatomic systems which is today called quantum mechanics. It underlies the mathematical framework of many fields of physics and chemistry, including condensed matter physics, solid-state physics, atomic physics, molecular physics, computational physics, computational chemistry, quantum chemistry, particle physics, nuclear chemistry, and nuclear physics. Some fundamental aspects of the theory are still actively studied.
Quantum mechanics is essential for understanding the behavior of systems at atomic length scales and smaller. If the physical nature of an atom were solely described by classical mechanics, electrons would not orbit the nucleus, since orbiting electrons emit radiation (due to circular motion) and so would quickly lose energy and collide with the nucleus. This framework was unable to explain the stability of atoms. Instead, electrons remain in an uncertain, non-deterministic, smeared, probabilistic wave-particle orbital about the nucleus, defying the traditional assumptions of classical mechanics and electromagnetism.
Quantum mechanics was initially developed to provide a better explanation and description of the atom, especially the differences in the spectra of light emitted by different isotopes of the same chemical element, as well as subatomic particles. In short, the quantum-mechanical atomic model has succeeded spectacularly in the realm where classical mechanics and electromagnetism falter.
Broadly speaking, quantum mechanics incorporates four classes of phenomena for which classical physics cannot account:
In the mathematically rigorous formulation of quantum mechanics developed by Paul Dirac, David Hilbert, John von Neumann, and Hermann Weyl, the possible states of a quantum mechanical system are symbolized as unit vectors (called state vectors). Formally, these vectors are elements of a complex separable Hilbert space - variously called the state space or the associated Hilbert space of the system - that is well defined up to a complex number of norm 1 (the phase factor). In other words, the possible states are points in the projective space of a Hilbert space, usually called the complex projective space. The exact nature of this Hilbert space is dependent on the system - for example, the state space for position and momentum states is the space of square-integrable functions, while the state space for the spin of a single proton is just the product of two complex planes. Each observable is represented by a Hermitian (more precisely, self-adjoint) linear operator acting on the state space. Each eigenstate of an observable corresponds to an eigenvector of the operator, and the associated eigenvalue corresponds to the value of the observable in that eigenstate. If the operator's spectrum is discrete, the observable can attain only those discrete eigenvalues.
In the formalism of quantum mechanics, the state of a system at a given time is described by a complex wave function, also referred to as state vector in a complex vector space. This abstract mathematical object allows for the calculation of probabilities of outcomes of concrete experiments. For example, it allows one to compute the probability of finding an electron in a particular region around the nucleus at a particular time. Contrary to classical mechanics, one can never make simultaneous predictions of conjugate variables, such as position and momentum, to arbitrary precision. For instance, electrons may be considered (to a certain probability) to be located somewhere within a given region of space, but with their exact positions unknown. Contours of constant probability density, often referred to as "clouds", may be drawn around the nucleus of an atom to conceptualize where the electron might be located with the most probability. Heisenberg's uncertainty principle quantifies the inability to precisely locate the particle given its conjugate momentum.
According to one interpretation, as the result of a measurement, the wave function containing the probability information for a system collapses from a given initial state to a particular eigenstate. The possible results of a measurement are the eigenvalues of the operator representing the observable - which explains the choice of Hermitian operators, for which all the eigenvalues are real. The probability distribution of an observable in a given state can be found by computing the spectral decomposition of the corresponding operator. Heisenberg's uncertainty principle is represented by the statement that the operators corresponding to certain observables do not commute.
The probabilistic nature of quantum mechanics thus stems from the act of measurement. This is one of the most difficult aspects of quantum systems to understand. It was the central topic in the famous Bohr-Einstein debates, in which the two scientists attempted to clarify these fundamental principles by way of thought experiments. In the decades after the formulation of quantum mechanics, the question of what constitutes a "measurement" has been extensively studied. Newer interpretations of quantum mechanics have been formulated that do away with the concept of "wave function collapse" (see, for example, the relative state interpretation). The basic idea is that when a quantum system interacts with a measuring apparatus, their respective wave functions become entangled, so that the original quantum system ceases to exist as an independent entity. For details, see the article on measurement in quantum mechanics.
Generally, quantum mechanics does not assign definite values. Instead, it makes a prediction using a probability distribution; that is, it describes the probability of obtaining the possible outcomes from measuring an observable. Electron locations, for example, are described by probability clouds: approximate pictures (better than the Bohr model) in which the electron's location is given by a probability function, with the probability of finding the electron at a position equal to the squared modulus of the wave function's complex amplitude there. Naturally, these probabilities will depend on the quantum state at the "instant" of the measurement. Hence, uncertainty is involved in the value. There are, however, certain states that are associated with a definite value of a particular observable. These are known as eigenstates of the observable ("eigen" can be translated from German as meaning "inherent" or "characteristic").
In the everyday world, it is natural and intuitive to think of everything (every observable) as being in an eigenstate. Everything appears to have a definite position, a definite momentum, a definite energy, and a definite time of occurrence. However, quantum mechanics does not pinpoint the exact values of a particle's position and momentum (since they are conjugate pairs) or its energy and time (since they too are conjugate pairs). Rather, it provides only a range of probabilities for the values these quantities might take when measured. Therefore, it is helpful to use different words to describe states having uncertain values and states having definite values (eigenstates).
Usually, a system will not be in an eigenstate of the observable we are interested in. However, if one measures the observable, the wave function will instantaneously become an eigenstate (or "generalized" eigenstate) of that observable. This process is known as wave function collapse, a controversial and much-debated process that involves expanding the system under study to include the measurement device. If one knows the corresponding wave function at the instant before the measurement, one will be able to compute the probability of the wave function collapsing into each of the possible eigenstates.
For example, a free particle will usually have a wave function that is a wave packet centered around some mean position x0 (an eigenstate of neither position nor momentum). When one measures the position of the particle, it is impossible to predict the result with certainty. It is probable, but not certain, that it will be near x0, where the amplitude of the wave function is large. After the measurement is performed, having obtained some result x, the wave function collapses into a position eigenstate centered at x.
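This measurement statistics can be simulated. The sketch below (NumPy; the center x0 = 3.0, the packet width, and the sample count are illustrative choices) draws repeated position "measurements" from the Born-rule distribution |ψ(x)|² of a Gaussian packet and confirms that outcomes cluster around x0:

```python
import numpy as np

# Simulated position measurements on a Gaussian packet centered at x0.
rng = np.random.default_rng(0)
x0 = 3.0
x = np.linspace(-10, 16, 5001)
dx = x[1] - x[0]

psi2 = np.exp(-(x - x0)**2)     # |psi(x)|^2 for the packet (unnormalized)
psi2 /= psi2.sum() * dx         # normalize the probability density

# Each "measurement" samples one outcome from the Born-rule distribution.
outcomes = rng.choice(x, size=20000, p=psi2 * dx)
print(outcomes.mean())          # close to x0: most results land near the peak
```

Any single draw is unpredictable; only the distribution of many repeated measurements is fixed by the wave function.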
The time evolution of a quantum state is described by the Schrödinger equation, in which the Hamiltonian (the operator corresponding to the total energy of the system) generates the time evolution. The time evolution of wave functions is deterministic in the sense that - given a wave function at an initial time - it makes a definite prediction of what the wave function will be at any later time.
During a measurement, on the other hand, the change of the initial wave function into another, later wave function is not deterministic, it is unpredictable (i.e., random). A time-evolution simulation can be seen here.
Wave functions change as time progresses. The Schrödinger equation describes how wave functions change in time, playing a role similar to Newton's second law in classical mechanics. The Schrödinger equation, applied to the aforementioned example of the free particle, predicts that the center of a wave packet will move through space at a constant velocity (like a classical particle with no forces acting on it). However, the wave packet will also spread out as time progresses, which means that the position becomes more uncertain with time. This also has the effect of turning a position eigenstate (which can be thought of as an infinitely sharp wave packet) into a broadened wave packet that no longer represents a (definite, certain) position eigenstate.
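The spreading of a free wave packet can be reproduced numerically. The sketch below (NumPy, with ℏ = m = 1 and an illustrative initial width σ₀ = 1) evolves a Gaussian packet by multiplying its Fourier transform by the free-particle phase e^(−ip²t/2), then checks the width against the known analytic result σ(t) = σ₀√(1 + (t/2σ₀²)²):

```python
import numpy as np

# Free-particle wave-packet spreading (hbar = m = 1, sigma0 = 1).
N, L = 4096, 200.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
p = 2 * np.pi * np.fft.fftfreq(N, d=dx)

sigma0 = 1.0
psi = np.exp(-x**2 / (4 * sigma0**2)).astype(complex)
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)

def width(psi):
    """Standard deviation of position for the state psi."""
    prob = np.abs(psi)**2 * dx
    mean = np.sum(x * prob)
    return np.sqrt(np.sum((x - mean)**2 * prob))

t = 2.0
# Free evolution is exact in momentum space: multiply by exp(-i p^2 t / 2).
psi_t = np.fft.ifft(np.exp(-1j * p**2 * t / 2) * np.fft.fft(psi))

print(width(psi))    # 1.0
print(width(psi_t))  # sigma0 * sqrt(1 + (t / (2 sigma0^2))^2) = sqrt(2) ≈ 1.414
```

The packet's center does not move here (its mean momentum is zero); only the uncertainty in position grows with time.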
Some wave functions produce probability distributions that are constant, or independent of time - such as in a stationary state of definite energy, where the time dependence cancels in the absolute square of the wave function (this is related to the energy-time uncertainty principle). Many systems that are treated dynamically in classical mechanics are described by such "static" wave functions. For example, a single electron in an unexcited atom is pictured classically as a particle moving in a circular trajectory around the atomic nucleus, whereas in quantum mechanics, it is described by a static, spherically symmetric wave function surrounding the nucleus (Fig. 1) (however, only the lowest angular momentum states, labeled s, are spherically symmetric).
The Schrödinger equation acts on the entire probability amplitude, not merely its absolute value. Whereas the absolute value of the probability amplitude encodes information about probabilities, its phase encodes information about the interference between quantum states. This gives rise to the "wave-like" behavior of quantum states.
Analytic solutions of the Schrödinger equation are known for very few relatively simple model Hamiltonians including the quantum harmonic oscillator, the particle in a box, the dihydrogen cation, and the hydrogen atom. Even the helium atom - which contains just two electrons - has defied all attempts at a fully analytic treatment.
However, there are techniques for finding approximate solutions. One method, called perturbation theory, uses the analytic result for a simple quantum mechanical model to create a result for a related but more complicated model by (for example) the addition of a weak potential energy. Another method is called "semi-classical equation of motion", which applies to systems for which quantum mechanics produces only small deviations from classical behavior. These deviations can then be computed based on the classical motion. This approach is particularly important in the field of quantum chaos.
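The core idea of perturbation theory can be illustrated on a finite matrix model (NumPy; all matrix entries and the perturbation strength are illustrative). For H = H₀ + εV with small ε and a nondegenerate H₀, the first-order eigenvalue shifts are the diagonal elements of V in the H₀ eigenbasis, and the error of this approximation is of order ε²:

```python
import numpy as np

# First-order perturbation theory on a toy 3-level system.
H0 = np.diag([0.0, 1.0, 4.0])                  # "unperturbed" energy levels
V = np.array([[0.3, 0.1, 0.0],
              [0.1, -0.2, 0.1],
              [0.0, 0.1, 0.5]])                # Hermitian perturbation
eps = 1e-3                                     # small coupling strength

exact = np.linalg.eigvalsh(H0 + eps * V)       # full numerical answer
first_order = np.diag(H0) + eps * np.diag(V)   # E_n + eps * V_nn

print(np.abs(exact - first_order).max())       # tiny: O(eps^2)
```

In realistic problems H₀ is an exactly solvable Hamiltonian (such as the harmonic oscillator) rather than a small matrix, but the structure of the approximation is the same.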
There are many mathematically equivalent formulations of quantum mechanics. One of the oldest and most common is the "transformation theory" proposed by Paul Dirac, which unifies and generalizes the two earliest formulations of quantum mechanics - matrix mechanics (invented by Werner Heisenberg) and wave mechanics (invented by Erwin Schrödinger).
Especially since Heisenberg was awarded the Nobel Prize in Physics in 1932 for the creation of quantum mechanics, the role of Max Born in the development of QM was overlooked until his own Nobel award in 1954. That role is noted in a 2005 biography of Born, which recounts his part in the matrix formulation and in the use of probability amplitudes. Heisenberg himself acknowledged having learned matrices from Born, as published in a 1940 festschrift honoring Max Planck. In the matrix formulation, the instantaneous state of a quantum system encodes the probabilities of its measurable properties, or "observables". Examples of observables include energy, position, momentum, and angular momentum. Observables can be either continuous (e.g., the position of a particle) or discrete (e.g., the energy of an electron bound to a hydrogen atom). An alternative formulation of quantum mechanics is Feynman's path integral formulation, in which a quantum-mechanical amplitude is considered as a sum over all possible classical and non-classical paths between the initial and final states. This is the quantum-mechanical counterpart of the action principle in classical mechanics.
The rules of quantum mechanics are fundamental. They assert that the state space of a system is a Hilbert space (crucially, that the space has an inner product) and that observables of the system are Hermitian operators acting on vectors in that space - although they do not tell us which Hilbert space or which operators. These can be chosen appropriately in order to obtain a quantitative description of a quantum system. An important guide for making these choices is the correspondence principle, which states that the predictions of quantum mechanics reduce to those of classical mechanics when a system moves to higher energies or, equivalently, larger quantum numbers. In other words, whereas a single particle exhibits a degree of randomness, in systems incorporating millions of particles averaging takes over and, in the high-energy limit, the statistical probability of random behaviour approaches zero: classical mechanics is simply a quantum mechanics of large systems. This "high energy" limit is known as the classical or correspondence limit. One can even start from an established classical model of a particular system, then try to guess the underlying quantum model that would give rise to the classical model in the correspondence limit.
Unsolved problem in physics: In the correspondence limit of quantum mechanics, is there a preferred interpretation of quantum mechanics? How does the quantum description of reality, which includes elements such as the "superposition of states" and "wave function collapse", give rise to the reality we perceive?
When quantum mechanics was originally formulated, it was applied to models whose correspondence limit was non-relativistic classical mechanics. For instance, the well-known model of the quantum harmonic oscillator uses an explicitly non-relativistic expression for the kinetic energy of the oscillator, and is thus a quantum version of the classical harmonic oscillator.
Early attempts to merge quantum mechanics with special relativity involved the replacement of the Schrödinger equation with a covariant equation such as the Klein-Gordon equation or the Dirac equation. While these theories were successful in explaining many experimental results, they had certain unsatisfactory qualities stemming from their neglect of the relativistic creation and annihilation of particles. A fully relativistic quantum theory required the development of quantum field theory, which applies quantization to a field (rather than a fixed set of particles). The first complete quantum field theory, quantum electrodynamics, provides a fully quantum description of the electromagnetic interaction. The full apparatus of quantum field theory is often unnecessary for describing electrodynamic systems. A simpler approach, one that has been used since the inception of quantum mechanics, is to treat charged particles as quantum mechanical objects being acted on by a classical electromagnetic field. For example, the elementary quantum model of the hydrogen atom describes the electric field of the hydrogen atom using a classical Coulomb potential. This "semi-classical" approach fails if quantum fluctuations in the electromagnetic field play an important role, such as in the emission of photons by charged particles.
Quantum field theories for the strong nuclear force and the weak nuclear force have also been developed. The quantum field theory of the strong nuclear force is called quantum chromodynamics, and describes the interactions of subnuclear particles such as quarks and gluons. The weak nuclear force and the electromagnetic force were unified, in their quantized forms, into a single quantum field theory (known as electroweak theory), by the physicists Abdus Salam, Sheldon Glashow and Steven Weinberg. These three men shared the Nobel Prize in Physics in 1979 for this work.
It has proven difficult to construct quantum models of gravity, the remaining fundamental force. Semi-classical approximations are workable, and have led to predictions such as Hawking radiation. However, the formulation of a complete theory of quantum gravity is hindered by apparent incompatibilities between general relativity (the most accurate theory of gravity currently known) and some of the fundamental assumptions of quantum theory. The resolution of these incompatibilities is an area of active research. Candidates for a future theory of quantum gravity include string theory.
Predictions of quantum mechanics have been verified experimentally to an extremely high degree of accuracy. According to the correspondence principle between classical and quantum mechanics, all objects obey the laws of quantum mechanics, and classical mechanics is just an approximation for large systems of objects (or a statistical quantum mechanics of a large collection of particles). The laws of classical mechanics thus follow from the laws of quantum mechanics as a statistical average at the limit of large systems or large quantum numbers (Ehrenfest theorem). However, chaotic systems do not have good quantum numbers, and quantum chaos studies the relationship between classical and quantum descriptions in these systems.
Quantum coherence is an essential difference between classical and quantum theories, as illustrated by the Einstein-Podolsky-Rosen (EPR) paradox - an attack on a certain philosophical interpretation of quantum mechanics by an appeal to local realism. Quantum interference involves adding together probability amplitudes, whereas for classical "waves" it is the intensities that add. For microscopic bodies, the extension of the system is much smaller than the coherence length, which gives rise to long-range entanglement and other nonlocal phenomena characteristic of quantum systems. Quantum coherence is not typically evident at macroscopic scales, except perhaps at temperatures approaching absolute zero, at which quantum behavior may manifest macroscopically.
A big difference between classical and quantum mechanics is that they use very different kinematic descriptions.
In Niels Bohr's mature view, quantum mechanical phenomena are required to be experiments, with complete descriptions of all the devices for the system, preparative, intermediary, and finally measuring. The descriptions are in macroscopic terms, expressed in ordinary language, supplemented with the concepts of classical mechanics. The initial condition and the final condition of the system are respectively described by values in a configuration space, for example a position space, or some equivalent space such as a momentum space. Quantum mechanics does not admit a completely precise description, in terms of both position and momentum, of an initial condition or "state" (in the classical sense of the word) that would support a precisely deterministic and causal prediction of a final condition. In this sense, a quantum phenomenon is a process, a passage from initial to final condition, not an instantaneous "state" in the classical sense of that word. Thus there are two kinds of processes in quantum mechanics: stationary and transitional. For a stationary process, the initial and final condition are the same. For a transition, they are different. Obviously by definition, if only the initial condition is given, the process is not determined. Given its initial condition, prediction of its final condition is possible, causally but only probabilistically, because the Schrödinger equation is deterministic for wave function evolution, but the wave function describes the system only probabilistically.
For many experiments, it is possible to think of the initial and final conditions of the system as being a particle. In some cases it appears that there are potentially several spatially distinct pathways or trajectories by which a particle might pass from initial to final condition. It is an important feature of the quantum kinematic description that it does not permit a unique definite statement of which of those pathways is actually followed. Only the initial and final conditions are definite, and, as stated in the foregoing paragraph, they are defined only as precisely as allowed by the configuration space description or its equivalent. In every case for which a quantum kinematic description is needed, there is always a compelling reason for this restriction of kinematic precision. An example of such a reason is that for a particle to be experimentally found in a definite position, it must be held motionless; for it to be experimentally found to have a definite momentum, it must have free motion; these two are logically incompatible.
Classical kinematics does not primarily demand experimental description of its phenomena. It allows completely precise description of an instantaneous state by a value in phase space, the Cartesian product of configuration and momentum spaces. This description simply assumes or imagines a state as a physically existing entity without concern about its experimental measurability. Such a description of an initial condition, together with Newton's laws of motion, allows a precise deterministic and causal prediction of a final condition, with a definite trajectory of passage. Hamiltonian dynamics can be used for this. Classical kinematics also allows the description of a process analogous to the initial and final condition description used by quantum mechanics. Lagrangian mechanics applies to this. For processes that need account to be taken of actions of a small number of Planck constants, classical kinematics is not adequate; quantum mechanics is needed.
Even with the defining postulates of both Einstein's theory of general relativity and quantum theory being indisputably supported by rigorous and repeated empirical evidence, and while they do not directly contradict each other theoretically (at least with regard to their primary claims), they have proven extremely difficult to incorporate into one consistent, cohesive model.
Gravity is negligible in many areas of particle physics, so that unification between general relativity and quantum mechanics is not an urgent issue in those particular applications. However, the lack of a correct theory of quantum gravity is an important issue in physical cosmology and the search by physicists for an elegant "Theory of Everything" (TOE). Consequently, resolving the inconsistencies between both theories has been a major goal of 20th- and 21st-century physics. Many prominent physicists, including Stephen Hawking, worked for many years to create a theory underlying everything. This TOE would combine not only the models of subatomic physics, but also derive the four fundamental forces of nature - the strong force, electromagnetism, the weak force, and gravity - from a single force or phenomenon. However, after considering Gödel's Incompleteness Theorem, Hawking concluded that a theory of everything is not possible, and stated so publicly in his lecture "Gödel and the End of Physics" (2002).
The quest to unify the fundamental forces through quantum mechanics is ongoing. Quantum electrodynamics (or "quantum electromagnetism"), which is (at least in the perturbative regime) the most accurately tested physical theory in competition with general relativity, has been merged with the weak nuclear force into the electroweak force; work continues to merge it with the strong force into an electrostrong force. Current predictions state that at around 10^14 GeV these three forces fuse into a single field. Beyond this "grand unification", it is speculated that it may be possible to merge gravity with the other three gauge symmetries, expected to occur at roughly 10^19 GeV. However - and while special relativity is parsimoniously incorporated into quantum electrodynamics - general relativity, currently the best theory describing the gravitational force, has not been fully incorporated into quantum theory. One of those searching for a coherent TOE is Edward Witten, a theoretical physicist who formulated M-theory, an attempt at describing supersymmetry-based string theory. M-theory posits that our apparent 4-dimensional spacetime is actually an 11-dimensional spacetime containing 10 spatial dimensions and 1 time dimension, although 7 of the spatial dimensions are - at lower energies - completely "compactified" (or infinitely curved) and not readily amenable to measurement or probing.
Another popular theory is loop quantum gravity (LQG), proposed by Carlo Rovelli, which describes quantum properties of gravity. It is also a theory of quantum spacetime and quantum time, because in general relativity the geometry of spacetime is a manifestation of gravity. LQG is an attempt to merge and adapt standard quantum mechanics and standard general relativity. This theory describes space as granular, analogous to the granularity of photons in the quantum theory of electromagnetism and the discrete energy levels of atoms. More precisely, space is an extremely fine fabric or network "woven" of finite loops called spin networks. The evolution of a spin network over time is called a spin foam. The predicted size of this structure is the Planck length, which is approximately 1.616×10^-35 m. According to this theory, there is no meaning to length shorter than this (cf. Planck scale energy).
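The Planck length quoted above can be recovered from fundamental constants via l_P = sqrt(ħG/c³). A quick consistency check (constant values from CODATA, rounded):

```python
import math

# Recovering the Planck length from CODATA constants: l_P = sqrt(hbar*G/c^3).
# This is a consistency check on the quoted value, not a derivation of LQG.
hbar = 1.054571817e-34   # reduced Planck constant, J*s
G    = 6.67430e-11       # Newtonian constant of gravitation, m^3 kg^-1 s^-2
c    = 2.99792458e8      # speed of light in vacuum, m/s

planck_length = math.sqrt(hbar * G / c**3)
print(planck_length)     # about 1.616e-35 m, matching the value in the text
```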
Since its inception, the many counter-intuitive aspects and results of quantum mechanics have provoked strong philosophical debates and many interpretations. Even fundamental issues, such as Max Born's basic rules about probability amplitudes and probability distributions, took decades to be appreciated by society and many leading scientists. Richard Feynman once said, "I think I can safely say that nobody understands quantum mechanics." According to Steven Weinberg, "There is now in my opinion no entirely satisfactory interpretation of quantum mechanics."
The Copenhagen interpretation - due largely to Niels Bohr and Werner Heisenberg - remains the most widely accepted some 75 years after its enunciation. According to this interpretation, the probabilistic nature of quantum mechanics is not a temporary feature which will eventually be replaced by a deterministic theory, but is instead a final renunciation of the classical idea of "causality". It also states that any well-defined application of the quantum mechanical formalism must always make reference to the experimental arrangement, due to the conjugate nature of evidence obtained under different experimental situations.
Albert Einstein, himself one of the founders of quantum theory, did not accept some of the more philosophical or metaphysical interpretations of quantum mechanics, such as rejection of determinism and of causality. He famously said about this, "God does not play with dice". He rejected the concept that the state of a physical system depends on the experimental arrangement for its measurement. He held that a state of nature occurs in its own right, regardless of whether or how it might be observed. That view is supported by the currently accepted definition of a quantum state, which does not depend on the configuration space for its representation, that is to say, manner of observation. Einstein also believed that underlying quantum mechanics must be a theory that thoroughly and directly expresses the rule against action at a distance; in other words, he insisted on the principle of locality. He considered, but rejected on theoretical grounds, a particular proposal for hidden variables to obviate the indeterminism or acausality of quantum mechanical measurement. He believed that quantum mechanics was a currently valid but not a permanently definitive theory for quantum phenomena. He thought its future replacement would require profound conceptual advances, and would not come quickly or easily. The Bohr-Einstein debates provide a vibrant critique of the Copenhagen interpretation from an epistemological point of view. In arguing for his views, he produced a series of objections, of which the most famous has become known as the Einstein-Podolsky-Rosen paradox.
John Bell showed that this EPR paradox led to experimentally testable differences between quantum mechanics and theories that rely on local hidden variables. By the early 1980s, experiments had shown that Bell inequalities were indeed violated in practice, so that there were in fact correlations of the kind suggested by quantum mechanics; Alain Aspect's experiments in 1982 and many later experiments definitively verified quantum entanglement. These experiments confirmed the accuracy of quantum mechanics, showing that it cannot be improved upon by the addition of local hidden variables. Entanglement, as demonstrated in Bell-type experiments, does not violate causality, since it does not involve transfer of information. At first these just seemed like isolated esoteric effects, but by the mid-1990s they were being codified in the field of quantum information theory, leading to constructions with names like quantum cryptography and quantum teleportation. Quantum cryptography is proposed for use in high-security applications in banking and government.
The Everett many-worlds interpretation, formulated in 1956, holds that all the possibilities described by quantum theory simultaneously occur in a multiverse composed of mostly independent parallel universes. This is not accomplished by introducing a "new axiom" to quantum mechanics, but by removing the axiom of the collapse of the wave packet. All possible consistent states of the measured system and the measuring apparatus (including the observer) are present in a real physical - not just formally mathematical, as in other interpretations - quantum superposition. Such a superposition of consistent state combinations of different systems is called an entangled state. While the multiverse is deterministic, we perceive non-deterministic behavior governed by probabilities, because we can only observe the universe (i.e., the consistent state contribution to the aforementioned superposition) that we, as observers, inhabit. Everett's interpretation is perfectly consistent with John Bell's experiments and makes them intuitively understandable. However, according to the theory of quantum decoherence, these "parallel universes" will never be accessible to us. The inaccessibility can be understood as follows: once a measurement is done, the measured system becomes entangled with both the physicist who measured it and a huge number of other particles, some of which are photons flying away at the speed of light towards the other end of the universe. In order to prove that the wave function did not collapse, one would have to bring all these particles back and measure them again, together with the system that was originally measured. Not only is this completely impractical, but even if one could theoretically do this, it would have to destroy any evidence that the original measurement took place (including the physicist's memory).
In light of the Bell tests, Cramer in 1986 formulated his transactional interpretation, which is unique in providing a physical explanation for the Born rule. Relational quantum mechanics appeared in the late 1990s as the modern derivative of the Copenhagen interpretation.
Quantum mechanics has had enormous success in explaining many of the features of our universe, with regards to small-scale and discrete quantities and interactions which cannot be explained by classical methods. Quantum mechanics is often the only theory that can reveal the individual behaviors of the subatomic particles that make up all forms of matter (electrons, protons, neutrons, photons, and others). Quantum mechanics has strongly influenced string theories, candidates for a Theory of Everything (see reductionism).
In many aspects modern technology operates at a scale where quantum effects are significant. Important applications of quantum theory include quantum chemistry, quantum optics, quantum computing, superconducting magnets, light-emitting diodes, the optical amplifier and the laser, the transistor and semiconductors such as the microprocessor, medical and research imaging such as magnetic resonance imaging and electron microscopy. Explanations for many biological and physical phenomena are rooted in the nature of the chemical bond, most notably the macro-molecule DNA.
For example, consider a free particle. In quantum mechanics, a free particle is described by a wave function. The particle properties of matter become apparent when we measure its position and velocity, while the wave properties become apparent when we measure wave phenomena such as interference. The wave-particle duality feature is incorporated in the relations of coordinates and operators in the formulation of quantum mechanics. Since the particle is free (not subject to any interactions), its quantum state can be represented as a wave of arbitrary shape extending over space as a wave function. The position and momentum of the particle are observables. The uncertainty principle states that both the position and the momentum cannot simultaneously be measured with complete precision. However, one can measure the position (alone) of a moving free particle, creating an eigenstate of position with a wave function that is very large (a Dirac delta) at a particular position x, and zero everywhere else. If one performs a position measurement on such a wave function, the result x will be obtained with 100% probability (i.e., with full certainty, or complete precision). This is called an eigenstate of position - or, stated in mathematical terms, a generalized position eigenstate (eigendistribution). If the particle is in an eigenstate of position, then its momentum is completely unknown. On the other hand, if the particle is in an eigenstate of momentum, then its position is completely unknown. In an eigenstate of momentum having a plane wave form, it can be shown that the wavelength is equal to h/p, where h is Planck's constant and p is the momentum of the eigenstate.
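The relation λ = h/p at the end of the paragraph is easy to evaluate numerically. The electron energy chosen below (acceleration through 100 V) is an illustrative assumption:

```python
import math

# De Broglie wavelength of a momentum eigenstate, lambda = h / p,
# for an electron accelerated through 100 V (illustrative numbers).
h  = 6.62607015e-34    # Planck constant, J*s
me = 9.1093837015e-31  # electron mass, kg
e  = 1.602176634e-19   # elementary charge, C

E = 100 * e                 # kinetic energy after 100 V, in joules
p = math.sqrt(2 * me * E)   # non-relativistic momentum
wavelength = h / p
print(wavelength)           # about 1.23e-10 m, i.e. atomic scale
```

The result, roughly an angstrom, is why electron diffraction off crystal lattices was an early confirmation of wave-particle duality.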
The particle in a one-dimensional potential energy box is the most mathematically simple example where restraints lead to the quantization of energy levels. The box is defined as having zero potential energy everywhere inside a certain region, and therefore infinite potential energy everywhere outside that region. For the one-dimensional case in the $x$ direction, the time-independent Schrödinger equation may be written

$$-\frac{\hbar^2}{2m}\frac{d^2\psi}{dx^2} = E\psi.$$

With the differential operator defined by

$$\hat{p}_x = -i\hbar\frac{d}{dx},$$

the previous equation is evocative of the classic kinetic energy analogue,

$$\frac{1}{2m}\hat{p}_x^2\,\psi = E\psi,$$

with state $\psi$ in this case having energy $E$ coincident with the kinetic energy of the particle.

The general solutions of the Schrödinger equation for the particle in a box are

$$\psi(x) = Ae^{ikx} + Be^{-ikx}, \qquad E = \frac{\hbar^2 k^2}{2m},$$

or, from Euler's formula,

$$\psi(x) = C\sin(kx) + D\cos(kx).$$

The infinite potential walls of the box determine the values of $C$, $D$, and $k$ at $x = 0$ and $x = L$, where $\psi$ must be zero. Thus, at $x = 0$,

$$\psi(0) = 0 = C\sin(0) + D\cos(0) = D,$$

and $D = 0$. At $x = L$,

$$\psi(L) = 0 = C\sin(kL),$$

in which $C$ cannot be zero, as this would conflict with the Born interpretation. Therefore, since $\sin(kL) = 0$, $kL$ must be an integer multiple of $\pi$,

$$k = \frac{n\pi}{L}, \qquad n = 1, 2, 3, \ldots$$

The quantization of energy levels follows from this constraint on $k$, since

$$E = \frac{\hbar^2\pi^2 n^2}{2mL^2} = \frac{n^2 h^2}{8mL^2}.$$

The ground state energy of the particle is

$$E_1 = \frac{\hbar^2\pi^2}{2mL^2}$$

for $n = 1$. The energy of the particle in the $n$th state is

$$E_n = n^2 E_1, \qquad n = 1, 2, 3, \ldots$$
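The level formula E_n = n²h²/(8mL²) is straightforward to evaluate; the electron mass and the 1 nm box width below are illustrative choices, not from the text:

```python
# Energy levels of the one-dimensional infinite box, E_n = n^2 h^2 / (8 m L^2),
# evaluated for an electron confined to a 1 nm box (illustrative numbers).
h  = 6.62607015e-34    # Planck constant, J*s
me = 9.1093837015e-31  # electron mass, kg
L  = 1e-9              # box width, m

def box_energy(n):
    return n**2 * h**2 / (8 * me * L**2)

E1 = box_energy(1)
print(E1 / 1.602176634e-19)   # ground state, about 0.38 eV
print(box_energy(3) / E1)     # spectrum scales as n^2, so this prints 9.0
```

Note that the spacing between adjacent levels grows with n, unlike the evenly spaced harmonic oscillator spectrum derived later.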
Particle in a box with boundary condition $-L/2 \le x \le +L/2$

In this condition the general solution will be the same; there will be little change to the final result, since the boundary conditions are changed only slightly:

$$\psi(x) = C\sin(kx) + D\cos(kx), \qquad k = \frac{n\pi}{L}.$$

At $x = 0$ the wave function is not actually zero at all values of $n$. From the variation of the wave function one sees that at odd $n$ the wave function follows a cosine curve with $x = 0$ as the origin, and at even $n$ it follows a sine curve with $x = 0$ as the origin. From this observation we can conclude that the wave function is alternately sine and cosine, so in this case the resultant wave equation is

$$\psi_n(x) = \begin{cases} A\cos(k_n x), & n \text{ odd}, \\ A\sin(k_n x), & n \text{ even}, \end{cases} \qquad k_n = \frac{n\pi}{L},$$

with the energy levels unchanged, $E_n = \dfrac{n^2\hbar^2\pi^2}{2mL^2}$.
A finite potential well is the generalization of the infinite potential well problem to potential wells having finite depth.
The finite potential well problem is mathematically more complicated than the infinite particle-in-a-box problem as the wave function is not pinned to zero at the walls of the well. Instead, the wave function must satisfy more complicated mathematical boundary conditions as it is nonzero in regions outside the well.
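The boundary conditions reduce to a transcendental equation; for the even bound states of a symmetric well the textbook form is tan z = sqrt((z0/z)² − 1), where z = kL/2 and z0 = (L/2)·sqrt(2mV0)/ħ. A minimal sketch, with units ħ = m = 1 and an illustrative well strength z0, solves it by bisection:

```python
import math

# Even bound states of the finite square well satisfy
#   tan(z) = sqrt((z0/z)^2 - 1),  z = k*L/2,  z0 = (L/2)*sqrt(2*m*V0)/hbar
# (standard textbook reduction; hbar = m = 1 and z0 = 2 are illustrative).

def f(z, z0):
    return math.tan(z) - math.sqrt((z0 / z)**2 - 1.0)

def solve_even_state(z0, lo=1e-9, hi=math.pi / 2 - 1e-9):
    # f < 0 near z = 0 and diverges to +inf near pi/2, so bisect the root.
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid, z0) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

z0 = 2.0                  # dimensionless well strength
z = solve_even_state(z0)
E_over_V0 = (z / z0)**2   # bound-state energy as a fraction of the well depth
print(z, E_over_V0)       # the energy lies below the top of the well
```

Odd states satisfy the companion condition with -cot(z) in place of tan(z), and deeper wells (larger z0) admit more bound states.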
This is a model for the quantum tunneling effect which plays an important role in the performance of modern technologies such as flash memory and scanning tunneling microscopy. Quantum tunneling is central to physical phenomena involved in superlattices.
As in the classical case, the potential for the quantum harmonic oscillator is given by

$$V(x) = \frac{1}{2} m\omega^2 x^2.$$

This problem can either be treated by directly solving the Schrödinger equation, which is not trivial, or by using the more elegant "ladder method" first proposed by Paul Dirac. The eigenstates are given by

$$\psi_n(x) = \frac{1}{\sqrt{2^n\, n!}}\left(\frac{m\omega}{\pi\hbar}\right)^{1/4} e^{-\frac{m\omega x^2}{2\hbar}}\, H_n\!\left(\sqrt{\frac{m\omega}{\hbar}}\, x\right), \qquad n = 0, 1, 2, \ldots,$$

where $H_n$ are the Hermite polynomials

$$H_n(x) = (-1)^n e^{x^2} \frac{d^n}{dx^n}\left(e^{-x^2}\right),$$

and the corresponding energy levels are

$$E_n = \hbar\omega\left(n + \frac{1}{2}\right).$$
This is another example illustrating the quantization of energy for bound states.
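The spectrum E_n = ħω(n + ½) can be checked numerically by discretizing the Hamiltonian on a grid and diagonalizing. The grid parameters and the units (ħ = m = ω = 1) below are illustrative assumptions:

```python
import numpy as np

# Numerical check of E_n = hbar*omega*(n + 1/2): discretize the oscillator
# Hamiltonian H = -1/2 d^2/dx^2 + x^2/2 (units hbar = m = omega = 1) with
# central finite differences and diagonalize the resulting matrix.
N, xmax = 1000, 10.0
x = np.linspace(-xmax, xmax, N)
dx = x[1] - x[0]

# Tridiagonal second-derivative matrix, then H = kinetic + potential.
main = np.full(N, -2.0)
off = np.ones(N - 1)
D2 = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / dx**2
H = -0.5 * D2 + np.diag(0.5 * x**2)

levels = np.linalg.eigvalsh(H)[:4]
print(levels)   # close to [0.5, 1.5, 2.5, 3.5]
```

The evenly spaced levels are characteristic of the oscillator and contrast with the n² scaling of the box spectrum above.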
The potential in this case is given by

$$V(x) = \begin{cases} 0, & x < 0, \\ V_0, & x \ge 0. \end{cases}$$

The solutions are superpositions of left- and right-moving waves:

$$\psi_1(x) = \frac{1}{\sqrt{k_1}}\left(A e^{ik_1 x} + B e^{-ik_1 x}\right), \quad x < 0,$$
$$\psi_2(x) = \frac{1}{\sqrt{k_2}}\left(C e^{ik_2 x} + D e^{-ik_2 x}\right), \quad x > 0,$$

where the wave vectors are related to the energy via $k_1 = \sqrt{2mE}/\hbar$ and $k_2 = \sqrt{2m(E - V_0)}/\hbar$, and the coefficients are determined by requiring $\psi$ and its derivative to be continuous at $x = 0$.
Each term of the solution can be interpreted as an incident, reflected, or transmitted component of the wave, allowing the calculation of transmission and reflection coefficients. Notably, in contrast to classical mechanics, incident particles with energies greater than the potential step are partially reflected.
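For E > V0, matching ψ and its derivative at the step gives the standard coefficients R = ((k1 − k2)/(k1 + k2))² and T = 4 k1 k2/(k1 + k2)². A quick sketch (units ħ = m = 1, illustrative energies) verifies the partial reflection and that R + T = 1:

```python
import math

# Reflection and transmission at a potential step with E > V0 (hbar = m = 1).
# With k1 = sqrt(2E) and k2 = sqrt(2(E - V0)), matching psi and psi' at x = 0
# gives R = ((k1 - k2)/(k1 + k2))^2 and T = 4*k1*k2/(k1 + k2)^2.

def step_coefficients(E, V0):
    k1 = math.sqrt(2 * E)
    k2 = math.sqrt(2 * (E - V0))
    R = ((k1 - k2) / (k1 + k2))**2
    T = 4 * k1 * k2 / (k1 + k2)**2
    return R, T

R, T = step_coefficients(E=2.0, V0=1.0)
print(R, T, R + T)   # R is nonzero even though E > V0, and R + T = 1.0
```

A classical particle with E > V0 would simply slow down and continue; the nonzero R is purely a wave effect.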
"...it was long believed that the wave function of the Schrödinger equation would never have a macroscopic representation analogous to the macroscopic representation of the amplitude for photons. On the other hand, it is now realized that the phenomena of superconductivity presents us with just this situation.
The following titles, all by working physicists, attempt to communicate quantum theory to lay people, using a minimum of technical apparatus.