21

In just about every interpretation of quantum mechanics, there appears to be some form of dualism. Is this inevitable or not?

In the orthodox Copenhagen interpretation of Bohr and Heisenberg, the world is split into a quantum part and a classical part. Yes, that is actually what they wrote, and not a straw man. The Heisenberg cut is somewhat adjustable, though, and mysteriously so. This adjustability also shows up in other interpretations, and it suggests the cut is unphysical, yet it has to appear somewhere. There is a duality between the observer and the observables.

Von Neumann postulated a two-step evolution: one Schrödinger and unitary, the other a collapse triggered by measurement, whatever a measurement is. This is another form of the duality. What a measurement is, and when exactly it happens, is also adjustable.

In decoherence, there is a split into the system and the environment. A split has to be made for decoherence to come out, but again, the position of the split is adjustable with almost no physical consequence.

In the many-worlds interpretation, there is the wavefunction on the one hand, and a splitting into a preferred basis on the other followed by a selection of one branch over the others. This picking out of one branch is also dualistic, and is an addendum over and above the wavefunction itself.

In the decoherent histories approach, there is the wavefunction on the one hand, and on the other an arbitrary choice of history operators followed by a collapse to one particular history. The choice of history operators depends upon the questions asked, and these questions stand in dual opposition to the bare wavefunction, which is itself oblivious to them.

In Bohmian mechanics, there is the wavefunction, and dual to it is a particle trajectory.

Why is there a duality? Can there be a nondual interpretation of quantum mechanics?

Qmechanic
  • 220,844
Sebastian
  • 227

5 Answers

13

Some people ascribe the duality to the duality between the classical apparatus and the quantum microscopic system, but I think this is a little old-fashioned. The quantum description also works for a bad apparatus and a big apparatus--- like my eye looking at a mesoscopic metal ball with light shining on it. This situation does not measure the position of the ball, nor the momentum, nor anything precise at all. In fact, it is hard to determine exactly what operator my eye is measuring by looking at some photons.

A modern approach to quantum mechanics treats the whole system as quantum mechanical, including my eye and myself. But then the source of the dualism becomes apparent. If I simulate my own wavefunction on a computer, along with that of the ball and the light (the simulation would be enormously large, but ignore that for now), where is my perception of the ball contained in the simulation?

It is not clear, because the evolution would produce an enormously large set of wavefunction values in an extremely high dimension, most of which are vanishingly small, but a few of which are smeared over configurations describing one of many plausible possible outcomes. The linear time evolution would produce a multiplying collection of weighted configurations, but it would never contain a data bit corresponding to my experience. But I can introspect and find out my own experience, so this data bit is definitely accessible to me. So I can see a data bit using my mind which is not clearly extractable from this computer simulation of my mind.

The basic problem is that the knowledge in our heads is classical information; it might as well be data on a computer. But the quantum system is made up not of classical information but of wavefunction data, and wavefunction data is not classical information, nor is it a probability distribution on classical information, so it has no obvious interpretation as ignorance of classical information.

The reason probability is unique is that only probability calculus has the Monte-Carlo property: if you sample the distribution and average over the time-evolution of the samples, it is the same as averaging over the time-evolution of the distribution. In quantum mechanics, samples can interfere with other samples, making the restriction to a collection of independent classical samples inconsistent. So I can't say the simulation is simulating one of many samples; at best I can say it is approximately simulating one of many clumps-of-samples corresponding to nearly completely decohered histories.
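Here is a minimal numpy sketch of the point (my own illustration; the two-state example and the particular matrices are arbitrary choices). For a classical stochastic process, evolving the distribution and averaging over evolved samples agree; for amplitudes, "sampling" in between destroys the interference:

```python
import numpy as np

rng = np.random.default_rng(0)

# Classical case: a stochastic matrix T evolves a distribution p directly,
# or you can sample p and evolve the samples; the two agree.
T = np.array([[0.9, 0.2],
              [0.1, 0.8]])          # columns sum to 1
p0 = np.array([0.7, 0.3])
p_direct = T @ T @ p0               # evolve the distribution for two steps

n = 100_000
states = rng.choice(2, size=n, p=p0)
for _ in range(2):                  # evolve each sample independently
    states = (rng.random(n) < T[1, states]).astype(int)
p_sampled = np.bincount(states, minlength=2) / n
print(p_direct, p_sampled)          # equal up to sampling noise

# Quantum case: a Hadamard applied twice returns |0> exactly, by interference.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
psi0 = np.array([1.0, 0.0])
print(np.abs(H @ H @ psi0) ** 2)    # [1, 0]

# "Sampling" after the first Hadamard and evolving the samples independently
# loses the interference: each branch gives 50/50, so the mixture stays 50/50.
p_mid = np.abs(H @ psi0) ** 2                 # [0.5, 0.5]
p_branches = np.abs(H @ np.eye(2)) ** 2       # column j: outcomes from branch j
print(p_branches @ p_mid)                     # [0.5, 0.5], not [1, 0]
```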

But when I entangle myself with a quantum system, using a device which itself entangles with the quantum system, I find _by_doing_it_ that the result is probabilistic on the classical information in my mind. The classical information is determined after the entanglement event; the result is random, with probabilities given by the Born rule, so the result is definitely a probability. But in quantum mechanics the result is at best only asymptotic to a probability.

Why Duality?

The duality in quantum descriptions is always between the linear evolution of the quantum mechanical wavefunction and the production of classical data according to a probability distribution. Wavefunctions are not probabilities, but when they produce classical data, they can only be probabilities, so they turn into probabilities. How exactly do they turn into probabilities?

This is the mismatch between the probabilistic calculus for knowledge and information and the quantum mechanical formalism for states. In order to produce probabilities from pure quantum mechanics, you have to find the proper reason why wavefunctions are linked to probabilities.
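Spelling the link out in symbols (this is just the standard Born rule, restated here for reference): writing the wavefunction in the basis of possible classical records, $|\psi\rangle = \sum_i c_i |i\rangle$, the classical data comes out distributed as

$$ p_i \;=\; |c_i|^2 \;=\; |\langle i|\psi\rangle|^2, \qquad \sum_i p_i = 1 . $$

The amplitudes $c_i$ evolve unitarily, preserving $\sum_i |c_i|^2$, while a genuine probability distribution would evolve stochastically, preserving $\sum_i p_i$; the interpretations differ over why, and at what point, the first description gets replaced by the second.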

Each interpretation has a bit of a different flavor for explaining the link, but of these, Copenhagen, many-worlds, consciousness-causes-collapse (CCC), many-minds, and decoherence/consistent-histories all place the reason in the transition to a macroscopic observer-realm. The details are slightly different--- Copenhagen has a ritualized system/apparatus/observer divide, a classical-quantum divide which looks artificial. Many-worlds has an observer's path of memories, which selects which world is observed. Many-minds does too; I can't distinguish between many-minds and many-worlds, not even philosophically. I think many-minds was invented by someone who misunderstood many-worlds as being something other than many-minds. Consciousness-causes-collapse is the same as well, except it rejects the alternate counterfactual mental histories as "nonexisting" (whatever that means exactly; I can't differentiate this one from many-worlds either). Decoherence/consistent-histories insists that the path is a decoherence-consistent selection, which is simply a good direction in which the wavefunction has become incoherent and the density matrix is diagonal, but it is specified outside the theory. It's always the same dualism--- the classical data is not in the simulation, yet we can see it in our heads, and the reduction to a diagonal density matrix is only asymptotically true, while it needs to be exactly true to work.

The variables that describe our experience of the macroscopic world are discrete packets of information with a definite value, or probability distributions on such, which model our ignorance before we get the value. There is nothing else out there which can describe our experience. The quantum simulation just doesn't contain these classical bits, nor does it contain anything which is exactly and precisely a classical probability distribution.

Quantum mechanically simulate a particle in a superposition interacting with a miniature model brain, where light from the particle triggers a molecule in the brain to store the information about the particle's position. The quantum formalism will produce a superposition of at least two different configurations of the molecule and of the brain, but at no point will it contain the actual value of the observed bit, nor a probability distribution for this value.
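A toy version of this, with the "brain" cut down to a single record qubit (my own sketch of the point, nothing more):

```python
import numpy as np

# Particle in an equal superposition of two positions; one "memory" qubit
# standing in for the recording molecule, initially blank.
particle = np.array([1, 1]) / np.sqrt(2)     # (|x0> + |x1>) / sqrt(2)
memory = np.array([1.0, 0.0])                # |blank>
psi = np.kron(particle, memory)

# The recording interaction copies the particle's basis label into the
# memory (a CNOT with the particle as control); this is the whole "measurement".
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])
psi = CNOT @ psi
print(psi)   # [0.707, 0, 0, 0.707] : (|x0,0> + |x1,1>) / sqrt(2)

# The joint wavefunction holds both records in superposition.  Nowhere in it
# is a single classical bit saying which outcome happened; the memory's
# reduced density matrix is an even mixture, not a definite value.
rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)   # indices (i, a, j, b)
rho_memory = np.einsum('iaib->ab', rho)               # trace out the particle
print(rho_memory)   # [[0.5, 0.], [0., 0.5]]
```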

If this quantum wavefunction simulation is a proper simulation of the brain, then this internal brain has access to more information than the complete simulation contains when viewed from the outside. As far as I can see, there are exactly two possible explanations for this.

Many Worlds

The idea starts with the observation that you can't know in advance what it's supposed to feel like to be in a superposition, because what a physical phenomenon "feels like" is not part of physics. There is always a dictionary between physics and "feels like" which tells you how to match physical descriptions to experience. For example, matching light of a certain wavelength to the experience of seeing red.

If you simulate a classical brain, and you copy the data in the classical brain simulation, by querying the copies, you will see that they cannot differentiate between their pasts, and they will both think they are the same person. The quantum simulation contains all sorts of things inside, and it is not clear how it feels to internal things, because that all depends on how you query the things. If you query extremely unlikely components of the superposition, you can get any answer at all to any question you ask. You have to ask questions, because without a positive way to investigate the brain's feelings, there is no meaning you can assign to the assertion that it has feelings at all. When you ask the question, you must choose which branch of the simulated quantum system to query.

So there is no obvious way to embed classical experiences into the simulation, and the many-worlds interpretation takes the point of view that it is just a perceptual axiom, like seeing red, that the way our classical minds are embedded into a quantum universe is that they feel a unique path through a decohering net of spreading quantum events. A classical mind just doesn't "feel" superposed, it can't feel superposed because feelings are classical things.

The embedding into the model is just a little off because of this, and our minds have to select a path through the diverging possible histories. The path-selection by the mind produces new classical information through time, and the duality in quantum mechanics is identified with the philosophers' mind-body duality.

Quantum mechanics is measurably wrong

I think this is the only other plausible possibility. The existence of classical data in our experience makes it philosophically preferable to have a theory which can say something about this classical data, which can interpret it as a sharp value of a quantity in the theory, rather than as a history-specification which is outside the physics of the theory. This can be philosophically preferred for two reasons:

  • It allows a physical identification of mental data with actual bits which can be extracted from the simulation, so that the definite bit values encoding our experiences are contained in a fundamental simulation directly, as they are in the classical model of the world.
  • It means that simulations of the physical world could be fully comprehended--- they are classical computations on classical data, or probability distributions which represent ensembles of classical data.

I think the only real reason to prefer such a theory is if it could describe the world with a smaller model than quantum mechanics, one which would require fewer numbers to simulate. It seems like an awful waste to require exponentially growing resources to simulate N particles, especially when the result in real life is almost always classical behavior with a state variable linear in N.
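To put rough numbers on "exponentially growing resources" (a back-of-the-envelope sketch; the choice of complex128 amplitudes for spin-1/2 particles and of one real number per particle classically is just illustrative):

```python
# Brute-force state-vector storage for N spin-1/2 particles versus a
# classical description that keeps, say, one real number per particle.
for N in (10, 30, 50):
    quantum_bytes = (2 ** N) * 16      # 2^N complex amplitudes, 16 bytes each
    classical_bytes = N * 8            # linear in N
    print(N, quantum_bytes, classical_bytes)
# N = 50 already needs ~1.8e16 bytes (~18 petabytes) of amplitudes,
# while the classical description needs 400 bytes.
```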

But the only way a theory can get by with fewer resources is if it fails to coincide with quantum mechanics at least when doing Shor's algorithm. So this position is that quantum mechanics is wrong for heavily entangled many-particle systems. In this case, the dualism of quantum mechanics would be because it is an approximation to something else deeper down which is not dual, but the approximation makes wavefunctions out of probability distributions in some unknown limit, and this limit is imperfect. So the wavefunctions are approximations to probabilities, not the other way around, and we see the real deal--- the probabilities--- because on our scale the wavefunction description is no good.

Nobody has such a theory. The closest thing is the Bohm version of quantum mechanics, which is computationally even bigger than quantum mechanics, and so even less philosophically satisfying.

It might even be good to find a half-way house: just a method of simulating quantum systems which does not require exponential resources, except in those cases where you set up a quantum computer to do exponential things. Nobody has such a method either.

5

The duality has something to do with the strength of the interaction of a system with its environment, which may or may not consist largely of a piece of measurement apparatus of which we are consciously aware. In short, the duality arises from fixating on two extremes of behaviour: strongly coupling with the environment, or not. (Realizing this doesn't necessarily simplify our understanding of QM, but it is the theme underlying the dualities you have noted.)

What all of the interpretations agree on is this: a system which is isolated evolves according to the Schrödinger equation, and a system which interacts strongly enough with a macroscopic system — such that we can observe a difference in the behaviour of that large system — does not. These are two polar extremes of behaviour; so it is not in principle surprising that they exhibit somewhat different evolutions. This seems to me where the duality comes from: stressing these two opposite poles.

  • In the Copenhagen interpretation, the "quantum" systems are the isolated ones; the "classical" systems are the large macroscopic ones whose conditions we can measure. Nothing is said about the regime in between.
  • In von Neumann's description, the evolution of isolated systems is by the Schrödinger equation; ones strongly coupled to macroscopic systems get projected. Again, nothing is said about the regime in between.

"Decoherence" and "Many-Worlds" are not really distinguishable interpretations of quantum mechanics (indeed, in Many-Worlds, the preferred basis is thought to be selected by decoherence, though this must still be demonstrated as a technical point). While there is some debate about the precise ontological nature of the phenomenon, and important technical issues to resolve, pretty much everyone in the "decoherence" camp (with or without many worlds) agrees that the statistical nature of quantum mechanics — as opposed to the determinism of the unitary dynamics itself — arises from interaction with the environment. The fuzziness of the boundary between the two situations of "isolated system" and "strong coupling to the environment", in fact, is a symptom of the fact that "not completely isolated" does not automatically take you all the way to the regime of "strongly coupled to the environment". There is, presumably, a gradient. Furthermore, you get to choose what the boundaries of "the environment" — that part of the world which is just too big and messy for you to try to understand, or more to the point, experimentally control — are. So, if a physical system is only a little leaky, or is interfered with only slightly by the outside world, you can try to account for this outside meddling, and so describe the system as one which may be somewhat less leaky.

Some of the projects of interpretations of quantum mechanics are trying precisely to describe the two extremes, and so everything in between, using a monism of dynamics. Many-worlds, for instance, seems to shrug at the question of why we only perceive one world out of many, but wholeheartedly believes that all dynamics is in principle unitary, and is trying to prove it. And Bohmian Mechanics already has monism, albeit at the cost of faster-than-light signalling between particles by way of the quantum potential field — signalling which manifests macroscopically only as correlations, for essentially thermodynamical reasons — which understandably puts most people off.

Note that there are also dualisms in science, historically and in modern times, outside of quantum mechanics:

  • historically: terrestrial and celestial mechanics (subsumed by Newtonian mechanics)
  • historically: organic versus inorganic matter (subsumed once the chemistry of carbon started to become well-understood)
  • currently: gravity (treated geometrically) versus other elementary forces (treated by boson mediation)
  • currently: "hard sciences" (theories of the world largely excluding human behaviour) versus soft "sciences" (theories of the world largely concerning human behaviour)

Any time you have two different models of the world which do not seem obviously compatible, but which do (at least somewhat successfully) describe systems well in some domain, there is a sort of duality between those two models. The dualities in our current understanding of quantum mechanics are somewhat unique in that they concern exactly the same systems, and in the fact that interactions in one of the regimes ("strong coupling with the environment") seem to be the only way for us to obtain information about what happens in the other ("weak coupling with the environment")!

4

The duality is inherent in the way we do physics. We never consider the whole universe with all its details. In order to make sense of what we observe (which is always only a small part of the universe), we - the users of physics - must make a distinction between ''the observed = the system'' and ''the remainder = the environment''.

The observed system is then described as closely as warranted, while the remaining environment is described in a simple, effective way - e.g., as an external classical field (in many applications), as a classical measurement (in the Copenhagen interpretation), as a bath of harmonic oscillators in equilibrium (in decoherence studies), or as ignored details (in thermodynamics and in cosmology).

This is necessary in order that we can get rid of unwanted details without losing predictability for the system of interest.

Thus the duality you mentioned is imposed on the universe by inquisitive minds.

2

To take a different approach to the variety of ways in which you present QM (which all seem fine, but perhaps they miss the underlying structure), we compute expected values of an observable $O$ using the trace rule in QM, $E[O]=\mathsf{Tr}[\hat O\hat\rho]$, in which on one side there is an operator that represents a measurement and on the other side there is a density matrix that represents a state, essentially because of the Hilbert space structure of vectors and an inner product. Loosely, the inner product of the Hilbert space allows us to ask what components a prepared vector state has "in the same direction" as each of a (possibly infinite) set of reference states.
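As a concrete instance of the trace rule for a single qubit (the particular state and observable here are just illustrative choices of mine):

```python
import numpy as np

# E[O] = Tr[O rho] for a qubit prepared in |psi> = cos(a)|0> + sin(a)|1>,
# with the observable O = sigma_z.
a = 0.6
psi = np.array([np.cos(a), np.sin(a)])
rho = np.outer(psi, psi.conj())              # density matrix of the pure state
sigma_z = np.diag([1.0, -1.0])

expectation_trace = np.trace(sigma_z @ rho)
expectation_born = np.abs(psi) ** 2 @ np.array([1.0, -1.0])  # Born-rule average
print(expectation_trace, expectation_born)   # both equal cos(2a) = 0.362...
```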

Hilbert spaces are the mathematical structure at the very bottom of all quantum mechanics, and the inner product (which every Hilbert space has as part of its construction) induces a linear duality between prepared states and reference states. That duality may play out in different interpretations in different ways, but it will always be there.

In short, if we have a Hilbert space structure, we have a linear duality. If we don't have a Hilbert space structure, we're not doing quantum mechanics. Not that we can't use other mathematical structures, but it will not be QM unless it can be presented in terms of the mathematics of Hilbert spaces, effectively as a matter of definition.

And welcome to PhysicsSE.

EDIT: As a result of Niel's and Ron's Comments, I looked at what I've missed in the Question (not infrequently I find that my first response misses some "detail" or another, and sometimes it's the whole point). My initial Answer addresses the cut into System and Observer, which I see as inevitable just because of the underlying mathematics I point out above, but it does not explicitly address the difference between unitary and collapse evolutions. I see these two evolutions so much as an obvious consequence of the mathematical duality that I didn't notice that I was conflating something that would not be obvious. I find Niel's Answer somewhat more congenial to my own thinking, which I would say, still too concisely, as: the difference between unitary and collapse evolutions comes from placing the Heisenberg cut in such a way that there is an (effectively) infinite number of DoFs on the human Observer's side of the mathematical duality, while there is only a relatively small number of DoFs on the other side. That's a somewhat Decoherence-y way of looking at things, to which I do not fully subscribe, but I find it a useful approach nonetheless. I find both Niel's and Ron's Answers Useful, although as different sides of a coin, and I commend them both to you.

The duality between the wave function and Bohmian trajectories is rather different, and rather unbalanced, as Niel points out, and it looks as if Ron hasn't much addressed it. I find that I can't see how to address that duality in a unified way, partly because its attractions have never seemed compelling enough for me to work within the mathematics of the Bohmian POV.

Peter Morgan
  • 10,016
1

You correctly noticed that in some interpretations there is a "split" between "quantum" and "classical", and this split is somewhat arbitrary. You can move it closer to the observer without losing consistency. If you take this to the extreme and move it as close to the observer as possible, you will find that the whole universe, when separated from the observer, follows certain laws such as unitary evolution, and only the observer, a single isolated person, does not.

This is what you should obtain and it is correct.

What is wrong with it? Only one problem: it makes the most fruitful instrument of research ever invented by humans, the scientific method, inapplicable.

The scientific method requires independent confirmation of observations and repeatability. If there is a special person in the universe, then the scientific community would be unable to predict the observations made by that person based on their own experiments, or their predictions will be wrong, however advanced the instruments they use.

That is why there are quantum interpretations. All of them are designed to reconcile the scientific method with quantum mechanics to a degree which allows practical results to be obtained. Still, the scientific method remains in conflict with quantum mechanics, but this conflict can be kept contained so that practical results in applied science are possible.

Anixx
  • 11,524