I have looked at several books on the foundations of quantum theory and found that the path integral formulation is hardly ever discussed in detail. I find this surprising, because this formulation of quantum mechanics leads quite naturally to classical mechanics in the limit where the action (in units of Planck's constant) is large. Moreover, the path integral formulation is more widely applied than the canonical formulation in quantum field theory. Why is the path integral formulation not used as an ontological basis for quantum theory? Why are issues like the measurement problem, the EPR problem, nonlocality, decoherence, etc., not studied within the framework of the path integral formulation?
2 Answers
Here is my own opinion as a mathematical physicist who is deeply interested in foundational issues of quantum theory.
The point is that the path integral formulation, in spite of being quite intuitive, has no definite, general, mathematically rigorous foundation, or at least none as powerful as the Hilbert space formulation.
The fundamental obstruction is that the Feynman integral is not an integral in the sense of measure theory, so it requires ad hoc mathematical manipulations.
I am the first to admit that this viewpoint may be considered a pure prejudice; after all, one is dealing with physics and not mathematics.
ADDENDUM
Since the question has now been made more precise in philosophical terms, I would like to say something further.
What are the ontological elementary objects of the path integral formulation?
I am not an expert, but I suspect that the path integral formulation, in its modern version, cannot exist without the standard formulation. In the modern view it is just another way to compute the very objects of the standard formulation. The elementary ontological objects, however, are those of the standard formulation: states, probability amplitudes, and so on. I do not know whether the original formulation by Feynman is still used; what is actually used, at least in QFT, is the translation due to Freeman Dyson.
I've published several papers on interpreting the path integral, and have some thoughts about this.
My points below can be summarized as follows: 1) the obvious ontology can't possibly work, 2) there are nevertheless different ontologies which might work, and 3) even then, the particular ontology in any case would necessarily have to depend on the choice of a future measurement basis. I'm not sure whether this line of thinking is not pursued because people get stuck at 1), or because people really dislike the retrocausal implications of 3). Still, the path integral is built in a time-symmetric way, using both initial and final boundary conditions, so many people have noted that any interpretation of the path integral would almost inevitably have a retrocausal flavor. I'll note here that all the retrocausality would safely remain at the hidden level (no retro-signaling, no time travel, etc.).
The particle path integral gives the probability that a particle will go from A to B by calculating a complex probability amplitude for each possible path from A to B, summing/integrating all those complex numbers, and finally taking the squared magnitude of the total to get a probability. The obvious ontology is that the particle takes one real path from A to B and we just don't know which one. But if that were the case, then we could assign a positive probability to each path and sum those probabilities, rather than summing complex numbers and squaring.
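To make the structure of that calculation concrete, here is a minimal numerical sketch (my illustration, not part of the original argument): a free particle hopping on a coarse space-time lattice in natural units, with the endpoints, lattice spacing, and number of time slices all chosen arbitrarily. It coherently sums exp(iS/ħ) over the enumerated paths, and for contrast also forms the naive incoherent sum of per-path probabilities.

```python
# Minimal sketch of a discretized particle path integral (illustrative only).
# Assumptions: free particle, natural units (hbar = m = 1), a coarse lattice
# of intermediate positions, and arbitrarily chosen endpoints A and B.
import itertools
import numpy as np

hbar = 1.0                        # natural units (assumption)
m = 1.0                           # particle mass (assumption)
dt = 0.1                          # time step of the lattice (assumption)
xs = np.linspace(-2.0, 2.0, 9)    # allowed intermediate positions (assumption)
n_steps = 3                       # number of intermediate time slices
x_A, x_B = 0.0, 1.0               # fixed endpoints "A" and "B"

def action(path):
    """Discretized free-particle action: S = sum_k (m/2) * ((x_{k+1} - x_k) / dt)**2 * dt."""
    steps = np.diff(np.asarray(path))
    return np.sum(0.5 * m * steps**2 / dt)

amplitude_sum = 0.0 + 0.0j        # coherent sum of exp(i S / hbar) over paths
naive_prob_sum = 0.0              # incoherent sum of per-path |amplitude|**2

for middle in itertools.product(xs, repeat=n_steps):
    path = (x_A, *middle, x_B)
    amp = np.exp(1j * action(path) / hbar)
    amplitude_sum += amp
    naive_prob_sum += abs(amp)**2

print("coherent  |sum of amplitudes|^2 :", abs(amplitude_sum)**2)
print("incoherent sum of |amplitude|^2 :", naive_prob_sum)  # each term is 1, so this just counts paths
```

The coherent result depends on interference between paths, while the incoherent sum (each per-path probability is 1 here, since there is no normalization factor) simply counts the paths; that difference is exactly what a one-real-path ontology cannot reproduce.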
The proof that this is impossible is simple. Just consider a case with zero probability, due to two paths which have opposite finite-magnitude amplitudes. The only probabilities one could assign to those paths would be zero. (Leaving aside Feynman's dalliance with negative probabilities, which makes no mathematical sense.) Finally, note that one could block one of those paths and end up with a non-zero probability, in contradiction to the claim that each path had zero probability. So a one-real-path interpretation doesn't work.
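Spelled out in symbols (my notation, a schematic restatement of the argument above):

```latex
\begin{align*}
A_1 &= a, \qquad A_2 = -a, \qquad a \neq 0
  && \text{(two paths with opposite finite amplitudes)}\\
P_{12} &= |A_1 + A_2|^2 = 0
  && \text{(both paths open: the amplitudes cancel)}\\
p_1 + p_2 &= 0, \quad p_1, p_2 \ge 0 \;\Rightarrow\; p_1 = p_2 = 0
  && \text{(any one-real-path account forces zero path probabilities)}\\
P_1 &= |A_1|^2 = |a|^2 > 0
  && \text{(path 2 blocked: contradicts } p_1 = 0\text{)}
\end{align*}
```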
But who says the only ontology is a particle ontology? It turns out that a one-real-field ontology is still available. The detailed proof is here, but the general point is that it's always possible to construct groups of possible particle paths, where each group has some particular probability. The idea is that one group of particle-paths happens, we just don't know which one. In the above example, those two paths would be in one group, and that group would have a probability of zero. This happens all the time in classical optics, where multiple paths cancel, given an underlying field ontology. More generally, given a group of non-intersecting classical paths, that entire path-group can correspond to a single realistic field. (The reverse move is made in Bohmian mechanics, where a single realistic field is mapped onto a large number of possible particle paths, aka "field streamlines".) So a "one real field" interpretation of the path integral is in principle available, as long as one doesn't think in terms of particles. Instead, one would think in terms of a field which can start in a particle-like clump, end in a particle-like clump, and is allowed to split up into a non-particle-like structure in the middle.
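Schematically, and in my own notation rather than that of the linked proof, the requirement is that the paths be partitioned into groups within which interference is confined, so that each group carries a well-defined probability:

```latex
\begin{align*}
A_G &= \sum_{\gamma \in G} A_\gamma
  && \text{(amplitude carried by a path-group } G\text{)}\\
P(G) &= |A_G|^2
  && \text{(probability assigned to that whole group)}\\
\sum_G P(G) &= \Big|\sum_\gamma A_\gamma\Big|^2
  && \text{(consistency: cross-group interference terms must vanish)}
\end{align*}
```

In the two-path example above, the cancelling pair forms a single group with P(G) = 0; the choice of grouping is precisely what has to secure the consistency condition in the last line.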
However, in determining those allowed fields (those possible groups of paths), it turns out that different future measurement geometries change the possible path-groups. If you focus a photon source onto a single slit, you end up with one parsing of path-groups, but if you allow the photon source to spread through a double-slit you end up with a different parsing of path-groups. And those path-groups differ not just after the lens and the slits, but all the way back to the source. So for this to work, the future measurement geometry would somehow have to constrain what is allowed to happen in the past. This isn't all that surprising given that we're interpreting a calculation which takes the future boundary "B" to be fixed, as a future boundary constraint, but it's beyond the sort of ontology which physicists tend to accept (outside of closed timelike curves in general relativity).
This has all been single-particle analysis, but here are two papers (1, 2) which show how the path integral can be formally extended to entanglement as well. Those papers are neutral as to the underlying ontology, but the one-real-field ontology could apply to these cases as well, if one were willing to allow for future-input dependence. (See this Rev. Mod. Phys. paper detailing how future-input dependence trivially allows violations of Bell inequalities, the so-called 'retrocausal loophole'.) I'll note that several of the quantum foundations issues you ask about are indeed addressed via a path-integral perspective in the second of these two papers.
So the answer to your question is probably: quantum physicists generally don't like explanations of the past which depend on the future, and that's where the path integral leads. Nevertheless, the "solve for entire histories all at once" perspective is starting to be advocated as an under-appreciated framework that might solve some problems in quantum foundations. In my view, if we have to give up our conventional notions of forward-time causality in the hidden quantum world in order to come up with a realistic account of what might be going on when we're not looking, that would be a tradeoff well worth considering.