
Take, for example, the hydrogen atom. Both the classical and the quantum models are based on the same Hamiltonian, describing the Coulomb potential. The classical model, however, misses important properties such as the discrete energy spectrum. The quantum model gets it right (of course, the simple Coulomb model only works well up to some limit, but that is another story).

Apparently, to obtain the correct observables such as the energy spectrum, one only needs to know that the right description is quantum. No new model-specific parameters appear (Planck's constant is universal).
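As a concrete illustration of this point (my own numerical sketch, not part of the original question): discretizing the very same Coulomb Hamiltonian on a radial grid and diagonalizing it immediately produces the discrete spectrum $E_n = -\frac{1}{2n^2}$ in atomic units, with no input beyond the classical Hamiltonian and $\hbar$ (set to 1 here). The grid size and box length below are arbitrary choices.

```python
import numpy as np
from scipy.linalg import eigh_tridiagonal

# Radial Schroedinger equation for hydrogen (l = 0) in atomic units:
#   -(1/2) u''(r) - u(r)/r = E u(r),   u(0) = u(r_max) = 0.
# Same Coulomb Hamiltonian as the classical problem; only the framework changes.
N, r_max = 4000, 200.0              # grid points and box size (arbitrary choices)
h = r_max / (N + 1)
r = h * np.arange(1, N + 1)         # interior grid, avoiding the singular point r = 0

diag = 1.0 / h**2 - 1.0 / r         # finite-difference kinetic term + Coulomb potential
off = -0.5 / h**2 * np.ones(N - 1)

E, _ = eigh_tridiagonal(diag, off, select='i', select_range=(0, 3))
for n, e in enumerate(E, start=1):
    print(f"n={n}: E = {e:+.4f} Ha   (exact: {-0.5/n**2:+.4f} Ha)")
```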

Speaking more generally and more loosely, the quantum description becomes relevant at a very small scale. It seems natural to expect that a lot more details are visible at this scale. However, the input of our model, the Hamiltonian, stays basically the same. Only the general theoretical framework changes.

Perhaps the question can be rephrased as follows: why do quantization rules exist? By quantization rules I mean the procedures that allow one to pass from the classical description to the quantum one in a uniform fashion applicable to many systems.

Most likely my question is not well posed and contains some wrong assumptions. However, if it were not for this confusion, I would not be asking!

2 Answers


The "reason" why the procedure of quantization works cannot be known. Asking why that which describes nature describes nature is not a question physics can answer.

However, the procedure of quantization does not work without extra knowledge. In fact, it's not even known in all cases what the "correct" procedure for quantization is. I'll list several hurdles (without any claim to completeness) that should convince you that there is additional information necessary to quantize a classical system:

  • The Groenewold-van Hove no-go theorem (see also this answer of mine) says that canonical quantization, which just replaces Poisson brackets with commutators, does not work in the generality we would like. There are several possible modifications to the Poisson bracket (or rather to the product of classical observables on the phase space) that yield a consistent quantization procedure, but that choice is not unique. You are using additional information when you pick a particular modification. This is essentially the formal reflection of what is usually called an "ordering ambiguity": given a classical observable $x^np^m = x^{n-1}p^m x = \dots = p^m x^n$, which of these classically equivalent expressions do you turn into the corresponding quantum operator, given that the CCR between $x$ and $p$ makes them all unequal in the quantum theory? (A small numerical illustration of this ambiguity follows this list.)

  • Quantum anomalies: For a general discussion of anomalies, see this excellent answer by DavidBarMoshe; for a formal derivation of the possible appearance of central charges in the passage from the classical to the quantum theory, see this answer of mine. The bottom line is that in the course of quantization our classical symmetry groups can get "enlarged", and formerly invariant objects may not be invariant anymore. This usually introduces a new parameter into the quantum theory, the central charge of the enlarged symmetry group, which again needs additional input to be determined, if it doesn't wreck the quantum theory altogether.

    In fact, this might be the most important aspect of such an anomaly: If you have an anomaly of a gauge or gravitational symmetry, you don't have a consistent quantum theory. In certain field theories, the anomaly term is naturally determined by the rest of the theory, so unless those "miraculously" cancel, the quantum theory of such field theories does not exist in the usual sense. No amount of additional information can fix this, we simply do not know a consistent quantization of such theories.

  • The lattice problem: Classically, it is rather uncontroversial that we can view continuum field theories as limits of discretized theories. Quantum mechanically, this becomes horrendously difficult: it is not known whether the continuum limit of a quantized lattice theory coincides with the quantization of the continuum theory; in fact, I believe this is not always the case, see, for example, the problem of triviality of the lattice $\phi^4$ theory. However, one might remark that this particular problem is due to the absence of a fully rigorous framework for quantum field theory in general.
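To make the ordering ambiguity in the first point tangible, here is a small numerical sketch (my own construction, not part of the theorem): representing $x$ and $p$ as matrices in a truncated harmonic-oscillator basis shows that expressions which are identical as classical observables become distinct operators.

```python
import numpy as np

# x and p in a truncated harmonic-oscillator basis (hbar = 1).
N = 60
a = np.diag(np.sqrt(np.arange(1, N)), k=1)     # annihilation operator: a|n> = sqrt(n)|n-1>
x = (a + a.T) / np.sqrt(2)
p = 1j * (a.T - a) / np.sqrt(2)

# CCR [x, p] = i holds exactly away from the truncation edge:
comm = x @ p - p @ x
print(np.allclose(comm[:-1, :-1], 1j * np.eye(N)[:-1, :-1]))   # True

# Classically x*x*p*p = x*p*x*p, but as operators they differ (by x [x,p] p = i x p):
print(np.allclose(x @ x @ p @ p, x @ p @ x @ p))               # False
```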

Finally, let me remark that thinking of quantization as a fundamental operation has it the wrong way around if we take quantum mechanics seriously: it is the classical system that must be obtained from the quantum system in a certain limit, not the other way around. It is perfectly possible that there are quantum systems without a corresponding classical system; there is just no way to view them that would look classical to us. For a handwavy example, think of fermionic/spin-1/2 degrees of freedom: these are very hard to come by in a classical theory, since there is simply no motivation to consider them, but they emerge rather naturally from the quantum viewpoint.

In this sense, it is remarkable how well quantization works as a general guiding principle, but it shouldn't surprise us that the idea that "we don't need any extra knowledge" is not really accurate.

ACuriousMind

Some years ago I started with almost the same question: "What is it that makes us quantize a system, and what happens when we quantize a system?"

Questions like this were asked in a similar way roughly 50 to 90 years ago, by analyzing whether or not the description given by quantum mechanics is complete and real (that is, whether all its elements have a real counterpart).

The topic was settled with the so-called Copenhagen interpretation, the EPR paradox and finally Bell's inequalities, which together tell us that quantum mechanics is a bit strange. For example, one shouldn't think of the wavefunction as a real particle unless it is currently being measured by some classical measurement apparatus, and such things stand in absolute contradiction to a reasonable pictorial explanation of quantum mechanics.

I found all that a bit dissatisfying and set out to find a flaw in that view of quantum mechanics.


The first thing I stumbled across was Bohmian mechanics, which attributes the quantization procedure to the fact that we simply didn't know the "right" classical equations. One can show that solving the Schrödinger equation (at which one arrives by canonical quantization) $$ \left(-\frac{\hbar^2}{2m}\Delta + V(x)\right)\ \Psi = i\hbar \frac{\partial}{\partial t}\ \Psi $$ is equivalent to solving the two equations \begin{align} (1)&\ \ \dot{\vec{p}} = \vec{F} - \vec{\nabla} Q\\ (2)&\ \ \frac{\partial R^2}{\partial t} + \vec{\nabla} \cdot \left(\frac{\vec{p}}{m} \cdot R^2\right) = 0 \end{align}

when one considers wavefunctions $\Psi = R \cdot \exp \left(i\frac{S}{\hbar}\right)$, which is no restriction of generality. Equation (2) is the continuity equation for a charge density $\rho = R^2$, which happens to be the probability distribution $\rho = |\Psi|^2 = R^2$ of quantum mechanics. Equation (1) is just usual classical mechanics extended by an additional potential $Q = -\frac{\hbar^2}{2m} \frac{\Delta R}{R}$, the so-called quantum potential. This interpretation has some problems, though. First and foremost, it can't explain (only axiomatize) why a real charge distribution $R^2$ governs the whole statistical behavior of a system regardless of the other acting forces $\vec{F}$.
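A quick numerical check of the role of $Q$ (my own sketch, using the harmonic oscillator in units $\hbar = m = \omega = 1$): for a stationary state one has $\vec{p} = \vec{\nabla}S = 0$, so equation (1) can only hold if the quantum force $-\vec{\nabla}Q$ exactly cancels the classical force, i.e. if $V + Q$ is constant and equal to the energy $E$.

```python
import numpy as np

# Quantum potential Q = -(1/2) R''/R for the harmonic-oscillator ground state
# (units hbar = m = omega = 1). For a stationary state p = grad(S) = 0, so
# equation (1) demands V + Q = const = E: the quantum force cancels the classical one.
x = np.linspace(-3, 3, 601)
dx = x[1] - x[0]
R = np.exp(-x**2 / 2)                        # ground-state amplitude (normalization irrelevant)

Rpp = np.gradient(np.gradient(R, dx), dx)    # numerical second derivative
Q = -0.5 * Rpp / R
V = 0.5 * x**2

# V + Q is flat at the ground-state energy E = 1/2 (edges trimmed: boundary stencils):
print(np.allclose((V + Q)[5:-5], 0.5, atol=1e-3))   # True
```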

The key to understanding quantum mechanics is understanding its statistical nature. So could it be that quantum mechanics is some kind of usual classical statistical mechanics (since both seem to be related by the same Lagrangians/Hamiltonians)?

Bell investigated this question with his famous inequalities and came to the conclusion that there are expectation values in quantum mechanics (which are in agreement with experiment) that cannot be reproduced by any classical statistical mechanics (in the usual sense of non-instantaneous action, e.g. relativistic mechanics). He was nominated for a Nobel prize, which speaks for the credibility physicists put into these inequalities. As a result, there should be no way of founding quantum mechanics on the basis of classical statistical mechanics.
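For reference, here are the numbers Bell's argument turns on (a standard textbook computation, independent of this answer's later criticism): the singlet-state correlation $E(a,b) = -\cos(a-b)$ pushes the CHSH combination to $|S| = 2\sqrt{2}$, while local hidden-variable models are bounded by $|S| \le 2$.

```python
import numpy as np

# CHSH combination for the spin singlet: E(a, b) = -cos(a - b).
E = lambda a, b: -np.cos(a - b)
a, ap, b, bp = 0.0, np.pi/2, np.pi/4, 3*np.pi/4   # standard measurement angles

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S), 2*np.sqrt(2))   # |S| = 2*sqrt(2) ~ 2.828 > 2, the classical bound
```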

However, as far as my analysis goes, there is a major flaw in the derivation of those inequalities, which makes them devoid of meaning (e.g. classical systems can violate them too). I'm not the first to come to this conclusion; in fact, there is a long list of so-called loopholes in Bell's theorem, which for the most part concentrate on the measurement process and on whether or not violations, if found, can be interpreted according to Bell's theorem.

Unfortunately, due to the philosophical nature of the question, that whole field of research has drifted toward the crackpot area. Only lately (the last 10-20 years or so) has it become a bit more popular again.

Now, if you accept my statement that Bell's theorem is wrong, there is no need to discard the possibility of quantum mechanics being some kind of statistical mechanics. In fact, there might be a way to show that the process of quantizing a theory is just doing classical statistical mechanics with some further assumptions.

Still, this cannot account for the fact that usual classical statistical mechanics is an ensemble statistical mechanics, while standard QM and experiments are usually about single particles. In ensemble mechanics, one calculates expectation values on the basis of many similar and independent particles that have different initial values (e.g. position and momentum). In an experiment, however, a single particle seems to mystically know how to behave according to different, not actually present ensemble particles. This problem can be solved by the so-called principle of ergodicity, which states that for some systems the mean value over time is the same as the ensemble mean. Usually this only holds for chaotic systems, and we clearly have counterexamples (not every system we observe behaves chaotically).
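Here is a toy illustration of the ergodic principle invoked above (my own sketch, using the fully chaotic logistic map rather than any physical field): the time average of an observable along a single orbit matches the ensemble average over the invariant density $\rho(x) = \frac{1}{\pi\sqrt{x(1-x)}}$, for which $\langle x^2 \rangle = 3/8$.

```python
import numpy as np

# Time average along one orbit of the chaotic logistic map x -> 4x(1-x).
rng = np.random.default_rng(0)
x = rng.random()
orbit = np.empty(200_000)
for i in range(orbit.size):
    orbit[i] = x
    x = 4.0 * x * (1.0 - x)

# Ensemble average of x^2 over the invariant density rho(x) = 1/(pi*sqrt(x(1-x))) is 3/8.
print(np.mean(orbit**2), 3/8)   # agree to ~1e-3: time mean = ensemble mean
```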

The current pinnacle of quantum mechanics, QFT, abandons the description of nature in terms of particles. Everything becomes a field, an object with infinitely many degrees of freedom. E.g., there is an electron field, as well as a photon field. Only later does one introduce states that are in close relation to particles as we know them. In the context of the classical statistical interpretation, this means that particles are just statistical artifacts of the theory, that is, the fields can be in states that "simulate" the behaviour of particles. Due to the infinitely many degrees of freedom of such a field, it is quite possible that the principle of ergodicity holds, such that a measurement within a certain finite time interval $\Delta t$ actually reflects the ensemble mean of the field!


As a result we have regained the following pictorial view of quantum mechanics:

Take for instance the hydrogen atom. It consists of an electron field, a photon field and a proton field (or rather quark and gluon fields, which form the proton). Those fields behave according to the non-quantized equations of the QFT Lagrangians. Due to the infinitely many degrees of freedom, the behaviour is highly chaotic. As a result, we are only interested in the mean behavior of such a system. One would then try to calculate the time mean of that system, which (by the principle of ergodicity) is the same as the ensemble mean. The process of canonical quantization is then just the use of usual ensemble statistical mechanics. We know that there are statistical states that correspond to our pictorial view of single particles, and thus we can explain why experiments show that the hydrogen atom consists of particles that behave differently from free particles. E.g. the electron doesn't radiate Bremsstrahlung and has a quantized mean energy level, because bound (statistical) particle states are ultimately different from the ones formed by non-interacting fields (free statistical particle states).


So, to come back to your question: "Why is there no need for extra knowledge to go from the classical to the quantum description of a system?"

Answer: We simply do statistical mechanics based on the classical equations.

This is a highly hypothetical standpoint, but it represents my current views on the quantization process and quantum mechanics. It all stands and falls with the assumption $\textrm{quantization} \leftrightarrow \textrm{statistical mechanics}$. There has been some work on this topic, e.g. in the form of Koopman–von Neumann classical mechanics, which shows that classical statistical mechanics can be brought into the form of operators on Hilbert spaces. Recently I also found a way to derive the quantization rule $\vec{p} \rightarrow -i\hbar \vec{\nabla}$ from a classical statistical-mechanical expectation value, but it's not yet in a form that can be published. So take all this with caution.
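For the curious, the core of the Koopman–von Neumann construction mentioned above can be stated in one line (this is standard material, not the unpublished derivation): classical states become wavefunctions $\psi$ on phase space with $\rho = |\psi|^2$, evolving as $$ i\,\frac{\partial \psi}{\partial t} = \hat{L}\,\psi, \qquad \hat{L} = -i\left(\frac{\partial H}{\partial p}\,\frac{\partial}{\partial x} - \frac{\partial H}{\partial x}\,\frac{\partial}{\partial p}\right), $$ where $\hat{L}$ is self-adjoint and $\rho$ obeys the classical Liouville equation, so the formalism is genuinely that of operators on a Hilbert space.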

image357