
For context, I think the comparison to tests of general relativity here is apt. There is the parameterized post-Newtonian formalism, which has well-defined parameters that can discriminate between general relativity and other theories.

Is there anything like this but for quantum mechanics? If there is, is there any neatly defined parameter that can be thought of as a "measure of nonlinearity" for tests of quantum mechanics?

(Obviously, the parameter would be zero according to quantum mechanics, but that's beside the point.)

This physics SE post is relevant, but it doesn't ask the same question. I haven't yet looked at Weinberg's paper.

MaximusIdeal

2 Answers


The post-Newtonian expansion in GR is an expansion in powers of $1/c^2$. You can derive this expansion from an effective field theory approach in which every possible term consistent with the symmetries is included and ranked according to its importance in the $1/c^2$ expansion. We can therefore identify which terms would give the largest contributions if they differed from their GR values. (To be precise, the post-Newtonian expansion is just a way of doing perturbation theory within GR; it is the parameterized post-Newtonian (PPN) framework that is used to test GR, by replacing the known, computable coefficients in the post-Newtonian expansion with free parameters that can be constrained by observations.)
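As a concrete illustration of the PPN logic (my own sketch, not part of the original answer): light deflection at the solar limb is $(1+\gamma)/2$ times $4GM/(c^2 b)$, where the PPN parameter $\gamma$ equals 1 in GR, so a deflection measurement directly constrains $\gamma$.

```python
import math

# Illustrative PPN light-deflection calculation; constants are standard values.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m (impact parameter at the limb)

def deflection_arcsec(gamma):
    """PPN light-deflection angle (1 + gamma)/2 * 4GM/(c^2 b), in arcseconds."""
    rad = (1 + gamma) / 2 * 4 * G * M_sun / (c**2 * R_sun)
    return rad * 180 / math.pi * 3600

print(deflection_arcsec(1.0))   # GR value (gamma = 1), about 1.75 arcsec
print(deflection_arcsec(0.0))   # half the GR value, about 0.88 arcsec
```

Constraining $\gamma$ then amounts to asking how far the measured deflection is allowed to sit from the $\gamma = 1$ prediction.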

The situation in quantum mechanics is quite different, since there's not really a physical parameter with respect to which we know it makes sense to deform the theory, and therefore there is not a physically motivated, systematic expansion of correction terms ordered by importance. You could try to test specific non-linear alternatives to quantum mechanics if you have one in mind, but this is more analogous to testing specific alternative theories of gravity, as opposed to the post-Newtonian expansion.

Actually, I am not even sure if there is a viable, non-linear alternative to quantum mechanics that can be experimentally tested, although admittedly I am not an expert on this topic so there may well be one that someone can point to here. Weinberg proposed a framework for introducing nonlinearities into quantum mechanics in [1], but this was shown to lead to superluminal signaling in [2] and [3]. Some of the history is discussed in [4].

References

[1] S. Weinberg. Precision tests of quantum mechanics, Phys. Rev. Lett. 62:485, 1989.

[2] N. Gisin. Weinberg’s non-linear quantum mechanics and superluminal communications, Phys. Lett. A 143:1–2, 1990.

[3] J. Polchinski. Weinberg’s nonlinear quantum mechanics and the Einstein-Podolsky-Rosen paradox, Phys. Rev. Lett. 66:397–400, 1991.

[4] S. Aaronson. Is Quantum Mechanics An Island In Theoryspace? https://arxiv.org/abs/quant-ph/0401062

Andrew

Nonlinear terms in the Schrödinger equation appear in so-called objective collapse theories. These are theories of quantum mechanics that specify explicitly, in mathematical terms, under what conditions the famous Copenhagen wavefunction collapse happens. This is unlike the traditional Copenhagen interpretation, which leaves that question unanswered, resulting in an incomplete physical theory.

Wavefunction collapse is nonlinear because we might have $$ |\psi_1\rangle = \frac{1}{\sqrt{2}}|0\rangle + \frac{1}{\sqrt{2}}|1\rangle \rightarrow |0\rangle $$

and

$$ |\psi_2\rangle = \frac{1}{2}|0\rangle + \frac{\sqrt{3}}{2}|1\rangle \rightarrow |0\rangle $$

but no linear operator could map both $|\psi_1\rangle$ and $|\psi_2\rangle$ to $|0\rangle$: linearity would force such a map to send the difference $|\psi_1\rangle - |\psi_2\rangle$, itself a nonzero state with a nonzero $|0\rangle$ component, to the zero vector rather than collapsing it to a normalized outcome.
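A quick numerical check of this claim (my own sketch), modeling collapse onto $|0\rangle$ as project-then-renormalize:

```python
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])

def collapse_to_0(psi):
    """Project onto |0> and renormalize -- the 'collapse to |0>' map."""
    proj = psi[0] * ket0
    return proj / np.linalg.norm(proj)

psi1 = (ket0 + ket1) / np.sqrt(2)
psi2 = 0.5 * ket0 + np.sqrt(3) / 2 * ket1

# Both states collapse to |0>:
print(collapse_to_0(psi1))   # [1. 0.]
print(collapse_to_0(psi2))   # [1. 0.]

# But the map is not linear: a linear f would satisfy f(2*psi) = 2*f(psi)
print(collapse_to_0(2 * psi1))   # still [1. 0.], not [2. 0.]
# and f(psi1 + psi2) = f(psi1) + f(psi2), which also fails:
print(np.allclose(collapse_to_0(psi1 + psi2),
                  collapse_to_0(psi1) + collapse_to_0(psi2)))   # False
```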

Therefore, objective collapse theories add explicit nonlinear (and stochastic) terms to the Schrödinger equation to account for wavefunction collapse.

These theories make predictions that differ from those of regular quantum mechanics. See my answer at https://physics.stackexchange.com/a/659421/128186 for more details. In short, regular quantum mechanics says that we could perform double-slit experiments with bigger and bigger objects and still see interference patterns, provided we can sufficiently isolate the large particles from environmental interactions. By contrast, objective collapse theories have terms that induce wavefunction collapse once a particle becomes too "macroscopic". Here "macroscopic" can be defined as some combination of too massive, too spatially large, too many particles, or various other measures of macroscopicity.

Therefore, one way to test for the presence of, and bound the magnitude of, these nonlinear terms is to perform macroscopic superposition experiments, such as double-slit experiments with large objects. See https://arxiv.org/abs/1410.0270.
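As a toy illustration of what such experiments look for (my own sketch, not taken from the linked paper): if a collapse mechanism damps the two-path interference cross term at some rate $\lambda$, the fringe visibility falls as $e^{-\lambda t}$, while standard quantum mechanics predicts full visibility for a well-isolated system.

```python
import numpy as np

def fringe_pattern(phi, lam, t):
    """Two-path detection probability vs. phase, with the interference
    cross term damped at rate lam -- a generic stand-in for the effect
    of an objective-collapse term. Visibility = exp(-lam * t)."""
    return 0.5 * (1 + np.exp(-lam * t) * np.cos(phi))

phi = np.linspace(0, 2 * np.pi, 200)
for lam in (0.0, 1.0, 5.0):
    p = fringe_pattern(phi, lam, 1.0)
    visibility = (p.max() - p.min()) / (p.max() + p.min())
    print(lam, round(visibility, 3))   # visibility shrinks as lam grows
```

A measured visibility consistent with 1 for ever-larger objects pushes the allowed collapse rate $\lambda$ downward.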

Finally, I'll put in a quick plug for a totally different approach, though I don't have a reference. Another way to test for deviations from the Schrödinger equation would be to take a system of many particles, perform operations that put them into complicated superposition and entangled states, and let them evolve under various interactions. Then measure the final state of the system very precisely and check whether you get the prediction of regular quantum mechanics. The bigger your system and the more complicated the interactions you perform, the more sensitive you become to deviations from the linearity of the Schrödinger equation. In fact, what I have just described is a quantum computer: a large, controllable quantum system.
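To make the idea concrete, here is a toy numerical sketch (entirely illustrative: the perturbation $\epsilon\,|\psi_i|^2\psi_i$ is an ad hoc stand-in, not any specific proposed nonlinearity, and all parameters are made up): evolve a small multi-qubit state with and without a weak nonlinear term and watch the overlap with the linear prediction degrade.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 4                # number of qubits
dim = 2 ** n

# A fixed random unitary stands in for one step of Schrodinger evolution.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
U, _ = np.linalg.qr(A)

def evolve(psi, eps, steps=20):
    """Linear step U, plus a toy nonlinear kick eps * |psi_i|^2 * psi_i."""
    for _ in range(steps):
        psi = U @ psi
        psi = psi + eps * (np.abs(psi) ** 2) * psi
        psi = psi / np.linalg.norm(psi)   # keep the state normalized
    return psi

psi0 = np.zeros(dim, dtype=complex)
psi0[0] = 1.0

linear = evolve(psi0, eps=0.0)
for eps in (1e-3, 1e-2, 1e-1):
    fidelity = np.abs(np.vdot(linear, evolve(psi0, eps))) ** 2
    print(eps, fidelity)   # overlap with the linear prediction drifts below 1
```

The experimental analogue is measuring such a fidelity on real hardware; agreement with the linear prediction bounds $\epsilon$ from above.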

In summary, we can test for nonlinear terms in the Schrödinger equation by performing experiments with large quantum systems and checking that we get the results expected from regular quantum mechanics, not those expected from a theory with a nonlinear term.

Jagerber48