What is the most essential reason that actually leads to quantization? I am reading Griffiths's book on quantum mechanics. The quanta in the infinite potential well, for example, arise from the boundary conditions, and the quanta in the harmonic oscillator arise from the commutation relations of the ladder operators, which give energy eigenvalues differing by multiples of $\hbar\omega$. But what actually is the reason for the discreteness in quantum theory? Which postulate is responsible for it? I tried working backwards, but it somehow seems to come magically out of the mathematics.
7 Answers
If I'm only allowed to use one single word to give an oversimplified intuitive reason for the discreteness in quantum mechanics, I would choose the word 'compactness'. Examples:
The finite number of states in a compact region of phase space. See e.g. this and this Phys.SE post.
The discrete spectrum for Lie algebra generators of a compact Lie group, e.g. angular momentum operators. See also this Phys.SE post.
On the other hand, the position space $\mathbb{R}^3$ in elementary non-relativistic quantum mechanics is not compact, consistent with the fact that we can in principle find the point particle at any continuous position $\vec{r}\in\mathbb{R}^3$. See also this Phys.SE post.
There are several forms of discreteness in quantum theory. The simplest one is the discreteness of eigenvalues and the associated countable eigenstates. Those arise similarly to the discrete standing waves on a guitar string. The boundary conditions only allow certain standing waves that nicely fit into the enforced region in space. Even though the string is a continuous object, its spectrum becomes discrete and is naturally labeled with natural numbers. Exactly the same thing happens in quantum potentials that are unbounded from above, like the infinite well or the harmonic oscillator, where you also get discrete standing quantum waves. (Other potentials can generate both discrete and continuous eigenvalues at the same time.)
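The standing-wave picture can be checked numerically. Below is a minimal sketch (assuming NumPy; the grid size and the units $\hbar=m=L=1$ are my own arbitrary choices, not from the answer): discretizing the infinite well with the boundary conditions $\psi(0)=\psi(1)=0$ and diagonalizing reproduces the discrete levels $E_n = n^2\pi^2/2$.

```python
import numpy as np

# Infinite square well on [0, 1], units hbar = m = L = 1.
# Discretize H = -(1/2) d^2/dx^2 with psi(0) = psi(1) = 0: the boundary
# conditions are exactly what make the spectrum discrete.
N = 500                                   # interior grid points (arbitrary)
h = 1.0 / (N + 1)                         # grid spacing
main = np.full(N, 1.0) / h**2             # diagonal of -(1/2) * central difference
off = np.full(N - 1, -0.5) / h**2         # off-diagonals
H = np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

E = np.linalg.eigvalsh(H)[:4]             # lowest numerical levels
exact = np.array([(n * np.pi) ** 2 / 2 for n in range(1, 5)])
print(E)       # approaches exact = n^2 pi^2 / 2 for n = 1, 2, 3, 4
```

The continuum of the string survives in the wavefunctions themselves; only the spectrum, forced by the walls, is discrete.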
Another reason for discreteness comes in with multi-particle systems. Quantum theory requires that a system realized in space-time contains a unitary representation of the symmetry group of space-time, the Lorentz group. In fact, you can define a particle in quantum theory as a subsystem that contains such a group representation. And because you can't have a non-integer fraction of a unitary group representation, you need to have an integer number of them in your total system. So the number of particles is also an (expected) discrete feature, and it plays a role when you talk about single photons, for example, which are either absorbed completely or not at all.
And finally there is a form of discreteness that comes with quantum measurement. The measurement postulate says that the result of a measurement is an eigenvalue of a Hermitian operator called an observable. Now the existence of discrete spectra for these operators is related to my first point (boundary conditions), but this one goes deeper. While the existence of a discrete energy spectrum still allows all continuous energy expectation values by superposition, a measurement yields exactly one (often discrete) result. This is responsible for the discreteness of the beams in the Stern-Gerlach experiment, for example. Why quantum measurement works this way is essentially an open question even today. There are some approaches to answering it, but there is no generally accepted answer that explains all aspects convincingly.
If you want, you can go back to Planck's derivation of the black-body energy spectrum, otherwise known as Planck's law, as well as Einstein's use of Planck's work in his explanation of the photoelectric effect (which garnered him the Nobel prize), to first understand some of the experimental motivation. However, to understand the roots of quantum mechanics in atomic physics, one must go back to the Bohr-Rutherford model of hydrogen. An Introduction to Quantum Physics by French and Taylor discusses the Bohr-Rutherford model of the hydrogen atom on page 24. This model was introduced around 1913, and Bohr provided two key postulates:
An atom has a number of possible "stationary states." In any one of these states the electrons perform orbital motions according to Newtonian mechanics, but (contrary to the predictions of classical electromagnetism) do not radiate so long as they remain in fixed orbits.
When the atom passes from one stationary state to another, corresponding to a change in orbit (a "quantum jump") by one of the electrons in the atom, radiation is emitted in the form of a photon. The photon energy is just the energy difference between the initial and final states of the atom. The classical frequency $\nu$ is related to this energy through the Planck-Einstein relation:
$$E_{photon} = E_i - E_f = h\nu$$
This was described in Bohr's paper On the Constitution of Atoms and Molecules. These postulates are slightly dated in modern conceptions of electron motion, since we now understand things better in terms of the Schrödinger equation, which allows for an extremely accurate model of the hydrogen atom. However, one of the key concepts Bohr introduced is the Correspondence Principle, which according to French and Taylor:
...requires classical and quantum predictions to agree in the limit of large quantum numbers...
This is a key ingredient in modern physics, and is best understood in terms of asymptotic analysis. Most modern theories connect to real observed phenomena at the large N limit of the theory.
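The large-$n$ limit is easy to check numerically in the Bohr model itself. Here is a small sketch in atomic units ($\hbar=m_e=e=1$); the helper names `omega_quantum` and `omega_classical` are my own, not from French and Taylor:

```python
# Correspondence principle in the Bohr model, atomic units (hbar = m_e = e = 1).
# Bohr energies: E_n = -1/(2 n^2).  For large n the angular frequency of the
# photon emitted in the n -> n-1 jump should approach the classical orbital
# angular frequency 1/n^3 of the n-th circular orbit.

def omega_quantum(n):
    """Angular frequency of the photon emitted in the n -> n-1 jump."""
    return -1.0 / (2 * n**2) + 1.0 / (2 * (n - 1)**2)

def omega_classical(n):
    """Classical orbital angular frequency of the n-th circular Bohr orbit."""
    return 1.0 / n**3

for n in (2, 10, 100, 1000):
    print(n, omega_quantum(n) / omega_classical(n))
# the ratio tends to 1 as n grows: quantum jumps blend into classical radiation
```

For $n=2$ the ratio is still 3, but by $n=1000$ it is within about 0.2% of unity: classical and quantum predictions agree in the limit of large quantum numbers.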
Admittedly, these are the practical origins of why we have quantum mechanics. As for why nature chose these things, the answer might be very anthropic: we simply wouldn't exist without them. Dirac frequently pondered the question why, and here was his answer in 1963:
It seems to be one of the fundamental features of nature that fundamental physical laws are described in terms of a mathematical theory of great beauty and power, needing quite a high standard of mathematics for one to understand it. You may wonder: Why is nature constructed along these lines? One can only answer that our present knowledge seems to show that nature is so constructed. We simply have to accept it. One could perhaps describe the situation by saying that God is a mathematician of a very high order, and He used very advanced mathematics in constructing the universe.
Despite several modern attempts to attack the more metaphysical aspects of this and give them rigor, there is still no really good answer... as Mermin put it (in a line often attributed to Feynman):
Shut up and calculate!
A few of the answers offered are either (a) not valid and/or (b) not answering the question. For issue (a): it is not the mathematics that determines the physics; the mathematics only models the physics. For issue (b): the question is best answered by the reply noting that bound states are where energy levels are discrete, as in systems with semi-bounding potentials, like a wall. The postulate of QM responsible is the superposition postulate. This is why waveforms arise as solutions to the spinor equation, which essentially describes stochastic dynamics. (The result is the probability density.)
On this interpretation it is not that energy levels are discrete per se, but that the transition probabilities become very sharply peaked. (Extremely sharply peaked.)
An alternate view was expressed by Carl Bender in his lectures on perturbation methods. I cannot find where he cashes this out, but he claimed perturbation theory explains why the energy levels get "quantized". Time stamp here: https://youtu.be/LYNOGk3ZjFM?list=PLOFVFbzrQ49TNlDOxxCAjC7kbnorAR1MU&t=4738 I have not located where he delivers on this promise, nor precisely where it gets highlighted in the textbook with Orszag; it is a bit buried. However, this uses the framework of standard QM, so it is not a great physical insight; it just says "do the math". At that point your best physical appeal is the analogy with resonances in ordinary classical wave phenomena: you can try to imagine how the statistical account of QM can possibly behave analogously to waves, or wonder why the superposition solutions to the governing equations of QM are valid. Then it comes down to an appeal to experiment, which is all very unsatisfying to many philosophically inclined ontological realists.
So that sort of reply does not address the deeper question, which is why superposition occurs. No one really knows. It is just postulated, and validated empirically.
However, some hints can be given. In recent decades alternative axiom schemes for QM have been found: generalized probability theories (GPTs) and the Barandes stochastic-dynamics model both suggest entanglement is the reason there are superpositions (or at least that entanglement and superposition are inseparable). The Barandes model suggests the wavefunction is fictional and Hilbert space pure abstraction; but whatever the real ontology is, the Schrödinger-Pauli and Dirac theories that provide solutions for the wavefunction (spinors) have a stochastic interpretation, and the quantum mechanics of it all is the fact that the probability transitions are indivisible, or non-Markovian (see Barandes' papers for details: https://arxiv.org/abs/2302.10778 and https://arxiv.org/pdf/2402.16935).
Being a stochastic-process model, Barandes' work does not explain indivisibility, and hence does not explain entanglement. For the same reason, neither do GPTs. Hence entanglement, or superposition, is still just at the level of a postulate, or easily derived from postulates, in both these approaches.
Currently the cutting-edge deeper explanation would be either holography (ER=EPR) or extending GR to admit wormhole topology at the "Planck scale" (vaguely speaking). In the latter proposal, GR becomes a quantum theory according to GPT frameworks, since a ubiquitous wormhole topology in the bulk is equivalent to entanglement in a (possibly only asymptotically defined) boundary theory. You can loosely take the latter to mean "10 meters or more away from a scattering event" without too much loss of the essential physics of it all.
GPTs assume measurement instruments, so they have a concept of a boundary that is operational, not cosmological, which makes that ER=EPR rubric acceptable, I would hazard a guess. In Barandes' model (which reproduces all of orthodox QM and RQM) you are back to just having entanglement as a postulate, though it is a different postulate: the postulate that generic probability transition matrices are indivisible. (Roughly, the stochastic dynamics is not memoryless.)
More technically, a transition matrix $\Gamma(t)$ will generically fail to divide, meaning one cannot find an intervening $\Gamma(t\leftarrow t_0)$ that is still unistochastic such that $$ \text{(Divisible:)}\quad\Gamma(t) = \Gamma(t\leftarrow t_0)\,\Gamma(t_0). $$ If such a unistochastic $\Gamma(t\leftarrow t_0)$ can be found, then the system is divisible, Markovian, and evolves classically (stochastically). Barandes gives no reason why the stochastic dynamics of our universe is generically indivisible; it is the postulate. It implies that entanglement exists, and hence superpositions, and hence the rest of the story leading to discrete energy levels for bounded systems.
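This failure to compose is easy to see in a toy example. The sketch below is my own illustration (assuming NumPy; the random Hermitian generator is not from Barandes' papers): it builds the unistochastic matrices $\Gamma_{ij}(t)=|U_{ij}(t)|^2$ from a unitary $U(t)=e^{-iHt}$ and shows that the only unistochastic candidate for the intervening matrix does not reproduce $\Gamma(t)$.

```python
import numpy as np

# Unistochastic transition matrices Gamma_ij(t) = |U_ij(t)|^2 for a unitary
# U(t) = exp(-i H t).  Each Gamma(t) is a valid stochastic matrix, but the
# family generically fails to divide: Gamma(t) != Gamma(t <- t0) Gamma(t0).
# H here is just a random 2x2 Hermitian matrix chosen for illustration.
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
Hm = (A + A.conj().T) / 2                      # Hermitian generator
w, V = np.linalg.eigh(Hm)

def U(t):
    # matrix exponential exp(-i Hm t) via the eigendecomposition of Hm
    return V @ np.diag(np.exp(-1j * w * t)) @ V.conj().T

def Gamma(U_mat):
    return np.abs(U_mat) ** 2                  # unistochastic: rows sum to 1

t0, t = 0.7, 1.5
G_t = Gamma(U(t))
G_t0 = Gamma(U(t0))
G_between = Gamma(U(t) @ U(t0).conj().T)       # the unistochastic candidate
print(np.linalg.norm(G_t - G_between @ G_t0))  # generically nonzero: indivisible
```

The product $\Gamma(t\leftarrow t_0)\,\Gamma(t_0)$ is a perfectly good stochastic matrix, but it misses the interference terms hiding inside $|U_{ij}(t)|^2$, which have no classical counterpart.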
In a more mathematical sense, the discreteness just arises out of the mathematics. For example, the time-independent Schrödinger equation is a classic Sturm-Liouville problem in ODE theory: https://en.wikipedia.org/wiki/Sturm–Liouville_theory
That means we get eigenfunctions (our eigenstates in QM) and the eigenvalues corresponding to those eigenfunctions (our energy levels). The Hamiltonian operator in the Schrödinger equation is our self-adjoint Sturm-Liouville operator.
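The Sturm-Liouville structure can be seen in a few lines of numerics. Here is a minimal sketch (assuming NumPy; the grid parameters and units $\hbar=m=\omega=1$ are my own choices): discretizing the harmonic-oscillator Hamiltonian gives a real symmetric matrix, and its eigenvalues approximate the discrete spectrum $E_n = n + \tfrac12$.

```python
import numpy as np

# Discretize the harmonic oscillator H = -(1/2) d^2/dx^2 + (1/2) x^2
# (units hbar = m = omega = 1) on a finite grid.  The self-adjoint
# (Sturm-Liouville) structure guarantees a real, discrete spectrum.
N, L = 800, 8.0                           # grid points and half-width (arbitrary)
x = np.linspace(-L, L, N)
h = x[1] - x[0]
kinetic = (np.diag(np.full(N, 1.0)) / h**2
           + np.diag(np.full(N - 1, -0.5), 1) / h**2
           + np.diag(np.full(N - 1, -0.5), -1) / h**2)
H = kinetic + np.diag(0.5 * x**2)         # symmetric (self-adjoint) matrix
E = np.linalg.eigvalsh(H)[:5]
print(E)    # close to n + 1/2 = 0.5, 1.5, 2.5, 3.5, 4.5
```

The eigenvalues come out evenly spaced by 1 (i.e. $\hbar\omega$), exactly the ladder-operator spacing Griffiths derives algebraically.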
A very interesting question, indeed!
In the late 19th century, physics faced a serious crisis: classical physics at the time predicted that the radiation intensity emitted by a black body must increase monotonically with increasing frequency. This can be seen from a graph (black curve, 5000 K):

[Graph: black-body spectra at 5000 K, classical prediction (black curve) vs. observed spectrum (blue curve)]
By summing the energies which the black body radiates away over all frequencies, one can show that the total must approach infinity. Thus a black body would almost instantly radiate all its energy away and cool down to absolute zero. This is known as the "ultraviolet catastrophe". But in practice this was not the case: the black body actually radiated according to a law unknown at that time (blue curve, 5000 K).
In 1900 Max Planck, using the then-strange assumption that energy is absorbed or emitted discretely, in energy quanta ($E=h\nu$), was able to derive the correct spectral intensity distribution law and resolve the ultraviolet catastrophe:
$$ B_{\lambda }(\lambda ,T)={\frac {2hc^{2}}{\lambda ^{5}}}{\frac {1}{e^{hc/(\lambda k_{\mathrm {B} }T)}-1}} $$
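As a numerical sanity check (my own sketch, assuming NumPy and standard CODATA constants, not part of the original answer), evaluating this law at $T = 5000\,\mathrm{K}$ and locating its peak recovers Wien's displacement law, $\lambda_{\max} T \approx 2.898\times 10^{-3}\,\mathrm{m\,K}$:

```python
import numpy as np

# Evaluate Planck's spectral radiance B_lambda(lambda, T) at T = 5000 K and
# locate its peak, checking Wien's displacement law lambda_max * T ~ 2.898e-3 m K.
h = 6.62607015e-34     # Planck constant, J s
c = 2.99792458e8       # speed of light, m/s
kB = 1.380649e-23      # Boltzmann constant, J/K
T = 5000.0             # temperature, K

lam = np.linspace(50e-9, 3e-6, 200_000)               # 50 nm .. 3 um
B = (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * kB * T))
lam_max = lam[np.argmax(B)]
print(lam_max * T)     # ~ 2.898e-3 m K (Wien's displacement law)
```

Note that the finite value of the peak, and the exponential cutoff at short wavelengths, are exactly what the classical (divergent) prediction lacked.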
Albert Einstein in 1905 once again patched physics and showed that Planck's quanta are not just an empty theoretical construct but real physical particles, which we now call photons.
The discreteness of quantum mechanics is evident from the experimental evidence. Any experiment, take for example the Stern-Gerlach experiment, will yield probabilistic answers under identical experimental conditions. The matrix structure of quantum mechanics allows us to calculate only the probability amplitudes for processes to happen.