3

If information and entropy are equivalent, and information is a conserved quantity because of unitarity, what does it mean to say that entropy has been growing since the big bang? (I know why it grows; I'm not seeing how to reconcile the two statements.)

I understand the process by which entropy grows, and I know why information (quantum states) is conserved. What I am not getting is how these two facts can be reconciled with each other, given that information is equivalent to entropy.

donut
  • 393

3 Answers

3

I will be using this answer by Bob Bee to address this part of the question:

If information is conserved, why did the early universe have lower entropy than now?

It seems that the concept of conservation of information arises within a complete quantum mechanical solution for the early universe. This presupposes that quantization of gravity has been achieved. The quantum level does not have a definition of entropy as defined by thermodynamics, which is a classical theory, emergent from the underlying statistical mechanics level.

From Bob:

information in its basic simplest form, in quantum theory, is the state of the system (which could be composed of many subsystems). A physical system is defined by a state vector. It could and often is infinite dimensional, but could also have finite dimensional Hilbert subspaces (like the spin). The evolution of a system, considered a pure state, is given by a unitary operator which preserves causality (at the Hilbert space level, not in the probabilistic interpretation of collapse and measurements). You can always go back by applying the inverse operator. When the state becomes mixed, information can be considered to be lost, and entropy increases.

italics mine

The way I understand it, during the era of the universe when conservation of information holds, because a pure quantum mechanical solution applies, entropy is constant. Once decoherence sets in and classical statistical mechanics takes over, entropy increases. In this view there is no conflict between a low entropy, fixed at a given value, at the beginning of the universe, and its increase after the quantum mechanical solutions decohere. The entropy law has a greater-than-or-equal sign in front.
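This split can be made concrete with a small numerical sketch (my illustration, not from the answers above): the von Neumann entropy of a pure state is zero, any unitary evolution leaves it at zero, and only decoherence, modeled crudely here as erasing the off-diagonal elements of the density matrix, makes it grow.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # convention: 0 * log 0 -> 0
    return float(-np.sum(evals * np.log(evals)))

# Pure state |psi> = (|0> + |1>) / sqrt(2)
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())

# Any unitary (here a rotation) leaves the entropy unchanged
theta = 0.3
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
rho_evolved = U @ rho @ U.conj().T

# Crude decoherence model: kill the off-diagonal (coherence) terms
rho_decohered = np.diag(np.diag(rho))

print(von_neumann_entropy(rho))            # ~0 (pure)
print(von_neumann_entropy(rho_evolved))    # ~0 (still pure: unitarity)
print(von_neumann_entropy(rho_decohered))  # ln 2 ~ 0.693 (mixed)
```

The entropy stays zero under any reversible (unitary) step and jumps only at the irreversible one, which is the "greater-than-or-equal" sign in miniature.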

"Since information is equal to entropy" is not true at the quantum level. Look at the article on information entropy, which uses classical probabilities, not quantum mechanical ones. Also, this article on thermodynamics and information theory does not involve the unitarity argument. It seems that the unitarity argument is important for conservation of information in quantum systems, but not for defining information entropy.
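For reference, the information (Shannon) entropy mentioned above is defined purely over a classical probability distribution; a minimal sketch:

```python
import math

def shannon_entropy(probs):
    """H = -sum_i p_i log2(p_i), in bits; classical probabilities only."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # 1.0 bit (fair coin)
print(shannon_entropy([1.0]))         # 0.0 (certain outcome)
print(shannon_entropy([0.25] * 4))    # 2.0 bits (four equally likely outcomes)
```

Nothing here refers to state vectors or unitarity, which is the point: this notion of information lives at the classical probabilistic level.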

One should also keep in mind that quantization of gravity is still an open research field.

anna v
  • 236,935
1

As far as I know, there's no mainstream consensus regarding this. Reconciling irreversibility with quantum mechanics (much like quantum mechanics itself) doesn't have a well-accepted interpretation.

But that doesn't mean we don't have a formalism for it! We do, and it involves a generalized form of quantum state: the density matrix formalism. Within it, it's much more natural to include and derive irreversible (non-unitary) terms through couplings to systems that aren't of interest to our model (say, a thermalized electromagnetic field).

The irreversibility then stems from coupling to an external system about which we have no information. Quantum theory in the density matrix formalism is extremely well grounded in information-theoretic terms.

With this in mind, for the entropy of the Universe to increase, it would have to couple to something which is extremely complicated to describe in detail. We can then argue (this is just my intuition, and could be horrendously wrong) that, since the Universe is infinite, any local region I describe is coupled to the endlessness that surrounds it, which is certainly complicated to describe. The information is lost to the rest of the Universe.
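The "information lost to the rest of the Universe" picture can be sketched numerically (my illustration, under the standard reduced-density-matrix treatment): take a globally pure entangled state of a system plus an "environment", trace out the environment, and the local description comes out mixed with nonzero entropy, even though nothing was lost globally.

```python
import numpy as np

def von_neumann_entropy(rho):
    """S = -Tr(rho ln rho) from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]
    return float(-np.sum(evals * np.log(evals)))

# Bell state (|00> + |11>)/sqrt(2): a pure state of system + "environment"
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho_full = np.outer(bell, bell)

# Partial trace over the environment (second qubit): the local description
rho_system = np.einsum('ijkj->ik', rho_full.reshape(2, 2, 2, 2))

print(von_neumann_entropy(rho_full))    # ~0: globally pure, nothing lost
print(von_neumann_entropy(rho_system))  # ln 2: locally mixed
```

Unitarity holds for the whole; entropy appears only for the observer confined to a part.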

0

Entropy and information are not identical. The entropy of a system is a measure of what you do not know about the microscopic details of the system given some macroscopic observation. Boltzmann's entropy equation, $S = k_B \log W$, means that the entropy $S$ increases when the number of microstates corresponding to a given macrostate ($W$) increases.
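As a quick numerical illustration of Boltzmann's formula (a sketch of my own, not from the answer):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(W):
    """S = k_B ln W for W equally likely microstates."""
    return k_B * math.log(W)

# A macrostate with exactly one compatible microstate carries zero entropy;
# more compatible microstates mean more entropy.
print(boltzmann_entropy(1))                  # 0.0
print(boltzmann_entropy(10**23) / k_B)       # ln(10^23) ~ 52.96
```

Entropy here counts ignorance: how many microscopic configurations are consistent with what you can observe macroscopically.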

The second law means that as the universe evolves it becomes describable by more and more microstates, i.e., entropy increases. The information is still 'in' the microstate, but it is not information available to macroscopic observation.

Perhaps an example will help. Imagine a box with a gas confined to one half of it. The gas is made up of $N$ particles, each with a position (3 components: $x$, $y$, $z$) and a velocity (also 3 components). Therefore we can describe the gas with $6N$ variables. As the gas spreads to the rest of the box this number doesn't change; this is the type of 'information' your question wonders about. However, there are fewer ways to arrange the particles such that they are all in one half of the box than there are ways to arrange them spread over the whole box, so the entropy increases as the gas spreads.
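The counting in this example can be made quantitative (my sketch, assuming an ideal gas and one mole of particles for concreteness): when each particle's accessible volume doubles, the number of microstates is multiplied by $2^N$, so the entropy rises by $\Delta S = k_B \ln 2^N = N k_B \ln 2$.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
N = 6.022e23        # number of particles (one mole, assumed for illustration)

# Each particle's accessible volume doubles, so W is multiplied by 2^N:
# Delta S = k_B ln(2^N) = N k_B ln 2
delta_S = N * k_B * math.log(2)
print(delta_S)   # ~5.76 J/K
```

The $6N$ variables describing the microstate are untouched; what grows is only the count of microstates compatible with the macroscopic description.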

Stuart
  • 163