I am currently exploring the mathematical structure of Quantum Mechanics at an introductory level. A couple of books and online sources (including this website) state that the Uncertainty Principle is a consequence of two non-commuting operators, such as position and momentum or the spin operators. But in some books, including Stephen Hawking's A Brief History of Time, the Uncertainty Principle is interpreted as a limitation imposed by an inevitable tradeoff between wavelength and energy. For example, on page 60 of A Brief History of Time the author writes:

In order to predict the future position and velocity of a particle, one has to be able to measure its present position and velocity accurately. The obvious way to do this is to shine light on the particle. Some of the waves of light will be scattered by the particle and this will indicate its position. However, one will not be able to determine the position of the particle more accurately than the distance between the wave crests of light, so one needs to use light of a short wavelength in order to measure the position of the particle precisely. Now, by Planck's quantum hypothesis, one cannot use an arbitrarily small amount of light; one has to use at least one quantum. This quantum will disturb the particle and change its velocity in a way that cannot be predicted. Moreover, the more accurately one measures the position, the shorter the wavelength of the light that one needs and hence the higher the energy of a single quantum. So the velocity of the particle will be disturbed by a larger amount. In other words, the more accurately you try to measure the position of the particle, the less accurately you can measure its speed, and vice versa.

Hawking clearly stated it as a physical consequence, which is more intuitive than the mathematical one. Finally, my question is: which of the two interpretations is correct? Or is it possible that both interpretations are equally plausible, and that the mathematical derivation just happens to explain the physical cause through a different approach?

Qmechanic
rigel_13

5 Answers


Physics is a mathematical description of the behavior of the universe. The uncertainty principle is written as it is to describe the physical behavior. So it is a physical consequence.

But it is also a consequence of the math, because the math matches the physical behavior. That is not a coincidence: the math was purposely written that way. Given the close match, physicists work with the math and often forget that it isn't the universe. They see the math as the reason the universe behaves the way it does, which is a perfectly reasonable attitude. They use the math to predict new laws and new behavior. Experiments then often show the prediction is right; if not, they look for flaws in their reasoning or assumptions, or accept that the behavior is not as they expected.

mmesser314

I generally have a problem with such "intuitive" explanations of the Uncertainty Principle, as I feel that they create more problems than they solve. Don't get me wrong: it's certainly a very useful way to introduce the topic to the uninitiated, since going into detail about wave functions and non-commutativity of operators representing physical observables is usually quite tedious. (Indeed, Feynman himself uses it beautifully when he describes the double-slit experiment.) Furthermore, most of us do tend to think classically, and such a "semi-classical" intuitive explanation seems (or at least seemed to me, when I first heard it) a welcome escape from the inherent "spookiness" of Quantum Mechanics.

My problem with such an explanation is that it seems to associate the Uncertainty Principle in some way with measurement, which I do not think is true. At the core of the explanation is the idea that in order to measure the position of a particle, one must interact with it. But the Uncertainty Principle is not a result of any measurement; it is intrinsic to the mathematics of the quantum theory. One could then (incorrectly) assume -- as we all do, when we hear such an explanation -- that the electron actually has a definite position and momentum somewhere, and that it is the act of shining light on it that introduces the uncertainty.

This is, I believe, untrue. Our current understanding is that even if we had a measuring device which did not disturb the system at all, even in principle, there would still be unavoidable uncertainty, which stems from the fact that it does not make sense to speak of a particle's position and momentum in definite terms, simultaneously. Such ideas are a classical hangover that we need to get over, what Wittgenstein would call a "grammatical mistake".
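This intrinsic character has a precise statement: for any two Hermitian operators $\hat{A}$, $\hat{B}$ and any state $|\psi\rangle$, the Robertson uncertainty relation

$$\Delta A\,\Delta B \;\geq\; \frac{1}{2}\,\bigl|\langle\psi|[\hat{A},\hat{B}]|\psi\rangle\bigr|$$

holds, and with $[\hat{x},\hat{p}] = i\hbar$ it reduces to $\Delta x\,\Delta p \geq \hbar/2$. The spreads $\Delta A$ and $\Delta B$ are properties of the state alone; no measuring apparatus appears anywhere in the inequality.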

This "explanation" has a lot in common with Fraunhofer diffraction, where if you shine a coherent beam of light on a tiny obstacle (or aperture), the width of the resulting diffraction pattern and the size of the aperture are inversely related: a smaller object produces a "wider" diffraction pattern. This is sometimes (erroneously, in my opinion) described as a manifestation of the uncertainty principle, when in fact it is just the manifestation of the wave nature of light. The mathematical functions describing the aperture and the pattern on the screen are related by a Fourier Transform, and the properties of this transform are such that if one of them becomes more constrained, the other will become less constrained. It turns out that the wave functions of a particle in position and momentum space are also related by a Fourier Transform, for different reasons, to do with the nature of the operators that represent the physical observables. As you try to squeeze one of them, you have to loosen your grip on the other.
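This Fourier tradeoff is easy to see numerically. Here is a minimal NumPy sketch (my illustration, not part of the argument above): a Gaussian wave packet that is made narrower in position space comes out correspondingly wider in wavenumber space, with $\Delta x\,\Delta k \approx 1/2$ regardless of the width:

```python
import numpy as np

def spread(values, weights):
    """Standard deviation of `values` under the given probability weights."""
    p = weights / weights.sum()
    mean = np.sum(values * p)
    return np.sqrt(np.sum((values - mean) ** 2 * p))

x = np.linspace(-50, 50, 4096)
dx = x[1] - x[0]
# Wavenumber grid corresponding to the FFT of the position grid
k = np.fft.fftshift(np.fft.fftfreq(x.size, d=dx)) * 2 * np.pi

products = []
for sigma in (0.5, 1.0, 2.0):
    psi_x = np.exp(-x**2 / (4 * sigma**2))      # Gaussian packet, Delta-x = sigma
    psi_k = np.fft.fftshift(np.fft.fft(psi_x))  # its Fourier transform
    product = spread(x, np.abs(psi_x)**2) * spread(k, np.abs(psi_k)**2)
    products.append(product)
    print(f"sigma = {sigma}: Delta-x * Delta-k = {product:.4f}")
```

Squeezing `sigma` tightens the position-space spread and widens the wavenumber-space one in exact compensation; the Gaussian saturates the bound at $\Delta x\,\Delta k = 1/2$, i.e. $\Delta x\,\Delta p = \hbar/2$ with $p = \hbar k$.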

There is, as far as I know, no proof of why certain conjugate pairs of physical observables behave in this fashion. In fact, as @ZeroTheHero points out, we may not even know how to measure some of the operators that satisfy these "generalised uncertainty principles". It may even be -- as some have conjectured -- that one of the operators of the pair may not even be a "physical" observable!

So I would say that -- in this case at least -- the "physical" explanation is more misleading than anything.

Philip

If the uncertainty relation is a physical consequence of a theory, we have not found why or how, at least in general.

The uncertainty relation applies to any pair of (usually non-commuting) operators, even some which have limited physical interpretation, like $(x+p)^3$ and $(x^2-p)^7$, or even to totally abstract operators which are general functions of observables like spin or angular momentum. In general we don’t even know how to perform a measurement for such weird operators, and indeed it’s possible that some of these observables do not have a unique quantization.
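For concreteness, the general (Robertson) inequality $\Delta A\,\Delta B \geq \frac{1}{2}|\langle[\hat{A},\hat{B}]\rangle|$ can be checked numerically; here is a minimal sketch (my illustration, in units where $\hbar = 1$) using the spin-$\frac{1}{2}$ operators $S_x$ and $S_y$ on random states:

```python
import numpy as np

# Spin-1/2 operators in units where hbar = 1 (Pauli matrices / 2)
Sx = 0.5 * np.array([[0, 1], [1, 0]], dtype=complex)
Sy = 0.5 * np.array([[0, -1j], [1j, 0]], dtype=complex)

def expval(op, psi):
    """<psi| op |psi>; real for a Hermitian operator."""
    return np.vdot(psi, op @ psi).real

def spread(op, psi):
    """Standard deviation of `op` in the state `psi`."""
    var = expval(op @ op, psi) - expval(op, psi) ** 2
    return np.sqrt(max(var, 0.0))  # clamp tiny negative rounding errors

comm = Sx @ Sy - Sy @ Sx  # the commutator [Sx, Sy], equal to i*Sz

rng = np.random.default_rng(0)
for _ in range(5):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)                 # normalise the random state
    lhs = spread(Sx, psi) * spread(Sy, psi)
    rhs = 0.5 * abs(np.vdot(psi, comm @ psi))  # Robertson lower bound
    print(f"Delta-Sx * Delta-Sy = {lhs:.4f} >= {rhs:.4f}")
```

Every random state satisfies the bound, and the bound itself is state-dependent: $\frac{1}{2}|\langle[\hat{S}_x,\hat{S}_y]\rangle| = \frac{1}{2}|\langle\hat{S}_z\rangle|$ here.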

In some of the simpler cases, we can connect the uncertainty relation to Fourier-like relations between classical observables, but extending this to the general case would be highly non-trivial.

ZeroTheHero

Hawking clearly stated it as a physical consequence, which is more intuitive than the mathematical one.

In general in physics, laws, postulates, principles are distillates of a large number of observations. The mathematical models and the solutions are chosen so that these laws, principles, postulates are obeyed.

Finally, my question is which one of the interpretation is correct?

Observations and data come first; the mathematics is then chosen to model them.

Look at page 10 here. Since quantum mechanics works with wave equations, it is reasonable that mathematics exists to model quantum mechanical states. It is the interpretation of the data that differs and makes it a physics axiom.

In particular, the Heisenberg uncertainty is directly connected to the probabilistic nature of quantum mechanics: the waves of quantum mechanical wave mechanics are probability waves.

anna v

Heisenberg himself, who discovered the uncertainty principle, originally thought it was a measurement issue, just as Hawking describes it.

As time went by, it was found to explain many phenomena unrelated to measurement as such. For example it is what allows virtual particles to exist for fleeting moments of time - by definition not long enough to be directly measured, but long enough to produce the Casimir effect and to greatly influence particle-particle interactions in general. It also explains quantum tunnelling, a phenomenon exploited in some solid-state electronic devices such as the Josephson junction and the tunnel diode.

Ultimately, Heisenberg came to concede that he had discovered a fundamental characteristic of reality.

Hawking was being very naughty when he gave the measurement-based explanation. One consequence of virtual particles is the evaporation of black holes via the radiation that is named after him, so he knew perfectly well that he was giving an outdated and false account. I suppose he felt it necessary to talk down to his readers.

Guy Inchbald