
Is this correct?

The Heisenberg uncertainty principle is not a fundamental principle or law of physics. It's one of many results you can derive from quantum theory with some math:

-- In quantum theory observables can be represented by Hermitian matrices.

-- If an observable of a system can be represented by a particular matrix at a particular instant, then all Hermitian matrices of the same dimension represent observables of that system.

-- In a state specified by the vector |psi>, an observable X is sharp if and only if X|psi> = x|psi> for some real number x, in which case x is an eigenvalue of X and |psi> is an eigenvector of X.

Now let Y be any matrix that does not have |psi> among its eigenvectors. (For any vector, there exists an infinity of such matrices.)

If the actual state is |psi>, the observable Y cannot be sharp. (Because of the 'if and only if' above.)
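This can be checked numerically: "Y is not sharp" means Y has nonzero variance in the state |psi>. A minimal sketch with NumPy, where the Pauli matrices and the state |0> are my own illustrative choices (Z has |0> as an eigenvector, X does not):

```python
import numpy as np

# Pauli-Z is sharp in the state |0> = (1, 0): Z|0> = (+1)|0>.
Z = np.array([[1, 0], [0, -1]], dtype=complex)
# Pauli-X does not have |0> among its eigenvectors.
X = np.array([[0, 1], [1, 0]], dtype=complex)

psi = np.array([1, 0], dtype=complex)  # normalized state |0>

def mean(op, psi):
    """<psi|op|psi> for a normalized state vector (real for Hermitian op)."""
    return np.vdot(psi, op @ psi).real

def variance(op, psi):
    return mean(op @ op, psi) - mean(op, psi) ** 2

print(variance(Z, psi))  # 0.0 -> Z is sharp in |0>
print(variance(X, psi))  # 1.0 -> X is not sharp in |0>
```

Zero variance is exactly the eigenvector condition: the measurement outcome is certain only when |psi> is an eigenvector of the observable.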

EDIT: I'm trying to find out if this line of reasoning is correct. Those other questions don't address this.

curi

1 Answer


Your statements about quantum mechanics are correct, but there's more that needs to be said to get to the Heisenberg uncertainty principle. Specifically, we need a lower bound on products of variances.

Fix a state $\left|\psi\right\rangle$. In this state, any linear operator $\hat{\mathcal{O}}$ on the Hilbert space (i.e. a square matrix) has mean $\left\langle\hat{\mathcal{O}}\right\rangle:=\left\langle\psi\left|\hat{\mathcal{O}}\right|\psi\right\rangle$. Let $\hat{A},\,\hat{B}$ denote two operators for which $\hat{A},\,\hat{B},\,\hat{A}^2,\,\hat{B}^2$ have finite means in our chosen state. Then $\hat{A}$ has variance $\sigma_A^2=\left\langle\hat{A}^2\right\rangle-\left\langle\hat{A}\right\rangle^2$, and similarly for $\hat{B}$.

The Cauchy-Schwarz inequality on the Hilbert space implies the Schrödinger uncertainty relation $$\sigma_A^2\sigma_B^2\geq\left|\frac{1}{2}\left\langle\left[\hat{A},\,\hat{B}\right]_+\right\rangle-\left\langle\hat{A}\right\rangle\left\langle\hat{B}\right\rangle\right|^2+\left|\frac{1}{2}\left\langle\left[\hat{A},\,\hat{B}\right]_-\right\rangle\right|^2$$(proven here), where $\left[\hat{A},\,\hat{B}\right]_\pm:=\hat{A}\hat{B}\pm\hat{B}\hat{A}$. From this we obtain the Robertson uncertainty relation $$\sigma_A\sigma_B\geq\left|\frac{1}{2}\left\langle\left[\hat{A},\,\hat{B}\right]_-\right\rangle\right|.$$
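Both inequalities can be verified numerically for any concrete choice of operators and state. A sketch using NumPy, where the Pauli matrices $\hat{A}=\sigma_x$, $\hat{B}=\sigma_y$ and the random state are my own illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)

# A = sigma_x, B = sigma_y (illustrative choice of non-commuting observables)
A = np.array([[0, 1], [1, 0]], dtype=complex)
B = np.array([[0, -1j], [1j, 0]], dtype=complex)

# random normalized state
psi = rng.normal(size=2) + 1j * rng.normal(size=2)
psi /= np.linalg.norm(psi)

def mean(op):
    return np.vdot(psi, op @ psi)

def std(op):
    return np.sqrt((mean(op @ op) - mean(op) ** 2).real)

comm = A @ B - B @ A          # [A, B]_-
anti = A @ B + B @ A          # [A, B]_+

lhs = std(A) * std(B)
robertson = abs(0.5 * mean(comm))
schrodinger = np.sqrt(abs(0.5 * mean(anti) - mean(A) * mean(B)) ** 2
                      + abs(0.5 * mean(comm)) ** 2)

print(lhs, schrodinger, robertson)
```

The printed values always satisfy `lhs >= schrodinger >= robertson`: the Schrödinger bound is tighter, since its right-hand side contains the Robertson term plus a non-negative anticommutator term.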

(Note: we call $\left[\hat{A},\,\hat{B}\right]_-$ a commutator, and may denote it $\left[\hat{A},\,\hat{B}\right]$. Similarly, we call $\left[\hat{A},\,\hat{B}\right]_+$ an anticommutator, and may denote it $\left\{\hat{A},\,\hat{B}\right\}$.)

If $\left[\hat{A},\,\hat{B}\right]_-$ is a nonzero multiple of the identity operator, we call $A,\,B$ conjugate variables, as happens for example with $A=x,\,B=p_x$, where $\left[\hat{x},\,\hat{p}_x\right]_-=i\hbar$. (I've not placed hats on $A,\,B$ here, because "they are conjugate variables" is actually a statement about the classical observables. You may wish to read how it's defined in terms of Lagrangian mechanics, Poisson brackets etc.) Conjugate variables have a positive lower bound on $\sigma_A\sigma_B$, so a state for which $\hat{A}$ has low variance is one for which $\hat{B}$ has high variance.
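For the conjugate pair $x,\,p_x$ the Robertson bound is $\sigma_x\sigma_p\geq\hbar/2$, and a Gaussian wavepacket saturates it. A sketch of a numerical check (my own discretization, with $\hbar=1$; momentum statistics come from the discrete Fourier transform of the wavefunction):

```python
import numpy as np

# hbar = 1. Discretize x and build a normalized Gaussian wavepacket.
N = 2048
L = 40.0
x = np.linspace(-L / 2, L / 2, N, endpoint=False)
dx = x[1] - x[0]
psi = np.exp(-x**2 / 2)                       # Gaussian of unit width
psi /= np.sqrt(np.sum(np.abs(psi)**2) * dx)   # normalize

# position statistics from |psi(x)|^2
prob_x = np.abs(psi)**2 * dx
mean_x = np.sum(x * prob_x)
sigma_x = np.sqrt(np.sum((x - mean_x)**2 * prob_x))

# momentum statistics from the Fourier transform (p = hbar k = k)
k = 2 * np.pi * np.fft.fftfreq(N, d=dx)
prob_k = np.abs(np.fft.fft(psi))**2
prob_k /= prob_k.sum()
mean_k = np.sum(k * prob_k)
sigma_p = np.sqrt(np.sum((k - mean_k)**2 * prob_k))

print(sigma_x * sigma_p)  # approximately 0.5 = hbar/2
```

Narrowing the Gaussian (replacing `psi` with `np.exp(-x**2 / (2 * s**2))` for small `s`) shrinks `sigma_x` but inflates `sigma_p` by the same factor, so the product stays pinned at the bound.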

J.G.
  • 25,615