
The second law says that entropy can only increase, and entropy is proportional to phase space volume. But Liouville's theorem says that phase space volume is constant.

Taken naively, this seems to imply that the entropy can never change. What's wrong with the reasoning here?

Qmechanic
knzhou

8 Answers


So, the short answer is that you're quite correct: if the dynamics of a system is subject to Liouville's theorem, then phase space volume is conserved, so the entropy associated to a given probability distribution remains constant as it evolves under those dynamics. This is actually just one instance of a much more general puzzle: how do we reconcile the irreversibility of thermodynamics with the reversibility of classical mechanics (if we are seeking a way of "reducing" thermodynamics to classical statistical mechanics)? The literature on this puzzle is huge. If you're interested, a good introduction is "Time and Chance", by David Z Albert.

In terms of how this is handled in practice, the answer is (as Ross Millikan says) that we use processes of coarse-graining or projection, exploiting the fact that the probability distribution spreads out into filaments. Again, the details of this process (and its conceptual significance) are somewhat involved. Good papers to look at for that are "The Logic of the Past Hypothesis" (available at http://philsci-archive.pitt.edu/8894/) and "What Statistical Mechanics Actually Does" (http://philsci-archive.pitt.edu/9846/), both by David Wallace.

dexterdev

Liouville's theorem says the accessible volume in phase space does not increase, but it tends to become narrow filaments that "fill up" a much larger volume. If you think of a particle in a reflecting box, you might start it with a known position $\pm 1$ mm in all three axes and a known velocity $\pm 1$ mm/s in all three axes. This is a phase-space volume of $64\ \mathrm{mm^6/s^3}$. If you follow the evolution of lots of points within the starting volume, they will scatter throughout the box at various velocities. After enough time, the particle can be within $\pm 1$ mm of anywhere in the box, with a range of velocities. When we look at the entropy at a later time, we lump all of these together, so we say the particle can be in the whole volume of the box at any of a range of energies. That gives a much higher entropy. If you tracked the exact regions of phase space the particle could be in, the volume would not have increased, but the smearing out has made the effective volume increase, and with it the entropy.
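This filamentation is easy to see numerically. The following sketch (my addition, not part of the answer) follows a cloud of sample points for the box example above, reduced to one dimension for simplicity: the dynamics only shears the cloud, so its fine-grained volume is conserved, but the number of occupied coarse-graining cells, and with it a coarse-grained entropy, grows.

```python
import numpy as np

rng = np.random.default_rng(0)
L = 100.0                              # box length in mm
n = 100_000
x = 50.0 + rng.uniform(-1.0, 1.0, n)   # position known to within +/- 1 mm
v = 5.0 + rng.uniform(-1.0, 1.0, n)    # velocity known to within +/- 1 mm/s

def evolve(x, v, t):
    # free flight with reflecting walls: fold the trajectory back into [0, L]
    return L - np.abs((x + v * t) % (2 * L) - L)

def coarse_entropy(x, v, bins=50):
    # Shannon entropy of the coarse-grained (position, velocity) occupation
    h, _, _ = np.histogram2d(x, v, bins=bins, range=[[0, L], [4.0, 6.0]])
    p = h[h > 0] / h.sum()
    return float(-(p * np.log(p)).sum())

entropies = [coarse_entropy(evolve(x, v, t), v) for t in (0.0, 10.0, 1000.0)]
print(entropies)  # the coarse-grained entropy grows as the cloud filaments
```

The speeds never change, so the exact occupied volume is constant; only the histogram over fixed cells, which cannot resolve the thin filaments, reports a growing entropy.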


I think the intended question was about a thermally isolated system, so no heat can be exchanged, though exchange of work is allowed.

What's wrong with the reasoning here?

In short, the statistical entropy (= the statistical physics concept of thermodynamic entropy), which is defined for any equilibrium state and can increase in a thermally isolated system, is not the same thing as the information entropy (= a functional of the probability density, or the phase-space volume where the probability density is the same everywhere), which remains constant due to the Liouville theorem. These two entropies are related, but they are neither the same concept nor do they always have the same value. When, in a non-equilibrium process, the system comes to a new equilibrium, the statistical entropy has increased, while the information entropy has remained constant.

1)

The second law says that entropy can only increase

More accurately, it says that when the state is changed from an equilibrium macrostate $A$ to an equilibrium macrostate $B$, thermodynamic entropy cannot decrease. An equilibrium macrostate can be specified by stating the values of a sufficient number of macroscopic variables $X_1,X_2,...$ (we can denote them together as a tuple $\mathbf X$). (For example, the equilibrium state of an ideal gas in a closed vessel is specified by giving the values of volume $V$ and internal energy $U$.) Then a thermodynamic entropy function $S(\mathbf X)$ of these state variables can be introduced.

With this, the consequence of the 2nd law for adiabatic systems - the non-decrease of thermodynamic entropy - can now be stated in this way:

$$ S(\mathbf X_A) \leq S(\mathbf X_B). $$

2)

entropy is proportional to phase space volume.

Here we get into statistical entropy, the statistical physics concept of thermodynamic entropy. The statistical entropy of the equilibrium macrostate $\mathbf X$ is proportional to the logarithm of the volume of the phase-space region defined by all those points that are compatible with the macrostate $\mathbf X$. Let us denote the volume of this region by $\Omega(\mathbf X)$. The statistical entropy of the macrostate $\mathbf X$ is defined as

$$ S^{(stat)}(\mathbf X) = k_B \ln \Omega(\mathbf X). $$

Thus statistical entropy is a function of the macrostate $\mathbf X$. If the macrostate changes, $\Omega$ may change, and if so, then so does $S^{(stat)}$.

Statistical entropy changes are supposed (in the so-called thermodynamic limit) to be the same as the changes of thermodynamic entropy of the system being modeled - otherwise something in the statistical model is wrong.
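As a sanity check of this correspondence, here is a minimal numerical sketch (my addition, not from the answer) comparing the thermodynamic and statistical entropy changes for the free expansion of one mole of ideal gas into double the volume at fixed internal energy; the $\Omega(\mathbf X) \propto V^N$ scaling is the standard ideal-gas result.

```python
import math

k_B = 1.380649e-23        # Boltzmann constant, J/K
N_A = 6.02214076e23       # Avogadro constant, 1/mol
R = k_B * N_A             # gas constant, J/(mol K)

n_mol, V1, V2 = 1.0, 1.0, 2.0   # one mole; the volume doubles at fixed U

# thermodynamic entropy change for free expansion of an ideal gas
dS_thermo = n_mol * R * math.log(V2 / V1)

# statistical entropy change: at fixed energy Omega(X) scales as V^N,
# so Delta S = k_B ln(Omega_B / Omega_A) = N k_B ln(V2 / V1)
N = n_mol * N_A
dS_stat = N * k_B * math.log(V2 / V1)

print(dS_thermo, dS_stat)   # both about 5.76 J/K
```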

3)

Liouville's theorem says that phase space volume is constant

[...in time].

Yes, but the "volume" in Liouville's theorem is something different. Consider, at time $t_1$, a set of representative points in phase space that are all compatible with the macrostate $\mathbf X_A$. Let the volume of this set be denoted $\Delta \omega(t_1)$; of course, we have $\Delta \omega(t_1) = \Omega(\mathbf X_A)$.

As time goes on, these points move along their phase trajectories, but the volume of the set of all the points remains constant, at least as long as the set is "nice enough" (measurable). Thus the function $\Delta \omega(t)$ is a constant function.

If this set of moving representative points comes, at some time $t_2$, to positions that are all consistent with the macrostate $\mathbf X_B$, we have a kind of description of the evolution of the macrostate $\mathbf X_A$ into the macrostate $\mathbf X_B$ (an ensemble of imagined systems that move from one macrostate to another). The set of representative points has moved from the region $A$ into the region $B$. But this does not mean that the set of moving points became the same as the set of all points in region $B$; at time $t_2$, there may be "holes" in $B$ not occupied by any representative point. The motion of the representative points is described by "all points of the region $A$ come into the region $B$", not by "all points of the region $A$ move onto all points of the region $B$". Mathematically speaking, the mapping of the phase region $A$ from time $t_1$ to $t_2$ is into $B$ (an injection), not onto $B$ (not a surjection). So $\Delta \omega(t_2)$ is not the volume of the same set that $\Omega(\mathbf X_B)$ measures; $\Delta \omega(t_2)$ is usually the smaller of the two.

The volume that is sometimes used to formulate the Liouville theorem is, in a made-up notation, the volume of the set $M(L_t A)$: the set of phase points reached by evolving the original set $A$ for a time $t$. At time $t_2$, this is a different set from the set $M(B)$ of all microstates compatible with the macrostate $\mathbf X_B$, even if $M(L_{t_2}A)$ is a subset of $M(B)$.

The big "If" above is an assumption about the behaviour of the chosen set of representative points. But in mechanics, if we track all points in region $A$, then in general not all of them will end up in $B$. Some may go to completely different regions of phase space, e.g. regions describing macrostates of lower phase volume and lower statistical entropy. So the Liouville theorem implies neither an increase nor a decrease of statistical entropy; it is compatible with both. And it is a statement about the phase volume of a different set from the one referred to in statistical entropy.

In other words, the actual problem many people have with understanding how the 2nd law is compatible with the Liouville theorem is that they mistakenly think the formula for statistical entropy is $$ k_B\ln \Delta\omega(t). $$ But this is not so. The latter expression is a different kind of entropy, a special case of information entropy (the Gibbs entropy), which is not a function of the macrostate but a functional of the probability distribution:

$$ S^{Gibbs}[\rho] = k_B\int -\rho \ln \rho ~ dq\,dp . $$

When $\rho$ is $\frac{1}{\Delta \omega}$ inside the region made up of the moving points, and 0 outside, we have

$$ S^{Gibbs}[\rho] = k_B \int -\frac{1}{\Delta \omega} \ln \frac{1}{\Delta \omega}~ dq\,dp = k_B \ln \Delta \omega, $$

which is constant in time.

In a scenario where the Liouville theorem applies, information/Gibbs entropy remains constant in time, while statistical entropy, being a function of macrostate $\mathbf X$, may change in time.
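The distinction can be made concrete with a toy model (my sketch, not part of the answer above), using the area-preserving baker's map in place of Hamiltonian flow. The Gibbs entropy of the exactly evolved distribution stays pinned at $k_B \ln \Delta\omega$ for all time, while a coarse-grained entropy computed on a fixed grid grows as the initial cell is stretched into filaments.

```python
import numpy as np

rng = np.random.default_rng(1)
# 200k sample points filling the initial cell [0, 0.1]^2, Delta_omega = 0.01
pts = rng.uniform(0.0, 0.1, size=(200_000, 2))

def baker(p):
    # the baker's map on the unit square: stretch, cut, and stack.
    # It is area-preserving, so it obeys a Liouville theorem.
    x, y = p[:, 0], p[:, 1]
    return np.column_stack([(2 * x) % 1.0, (y + np.floor(2 * x)) / 2.0])

def coarse_S(p, bins=32):
    # entropy of the occupation probabilities on a fixed 32 x 32 grid
    h, _, _ = np.histogram2d(p[:, 0], p[:, 1], bins=bins, range=[[0, 1], [0, 1]])
    q = h[h > 0] / h.sum()
    return float(-(q * np.log(q)).sum())

S = []
for step in range(12):
    S.append(coarse_S(pts))
    pts = baker(pts)

# The fine-grained (Gibbs) entropy is k_B ln(0.01) at every step, since the
# map preserves the occupied volume exactly; the coarse-grained entropy
# climbs towards ln(32 * 32) as the cell is stretched into thin filaments.
print(S[0], S[-1])
```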


Your logic is actually correct. The discordance between the conservation of phase-space volume according to the Liouville theorem and the Second Law is known as the Ergodic Problem. Heuristic explanations such as the one provided by Ross Millikan, or coarse-graining the dynamics, to give another example, do not hold up under closer formal examination, since mathematical rigor consistently breaks down at some point or other. There is a rich history (read: a large number of tomes) of attempts to rigorously eliminate said discordance, but the ergodic problem is still theoretically open. Practically, however, nobody cares much, as long as the techniques of non-equilibrium statistical mechanics, quantum (fields included) or classical, produce meaningful results that can be used consistently.

udrv

Infinitesimal Perturbations

It is true that entropy would not increase in a completely isolated system that had no perturbations.

As a system evolves, the occupied region of phase space often distorts, elongating and forming twists or folds. Generally, as this process progresses, the whole accessible space becomes covered, but with gaps, so as to maintain the original volume. As time progresses, the width of those gaps becomes smaller and smaller. This thinning process can be seen in this lovely gif from Wikimedia Commons:

[Animation: Hamiltonian_flow_classical, Wikimedia Commons]

Now, if the system were perturbed by an amount less than or equal to half the width of those gaps, the entire volume would suddenly be filled in. In any real system there are always perturbations, from things like black-body radiation or, debatably, even fluctuations in vacuum energy. While these perturbations are small, they increase the phase-space volume in proportion to its surface area, and that surface area grows continuously as the filaments stretch, apart from the decrease due to the perturbations closing the gaps. Thus, eventually, a perturbation of any size will significantly affect the phase-space volume, and thus the entropy.

Eph

Jaynes has a compelling argument whereby he actually "proves" the 2nd law (or an aspect of it, at least) using Liouville's theorem, which goes roughly like this: Say you have a system which is measured to be in macrostate A, and it reproducibly evolves to macrostate B in a time $\tau$. We don't know the microscopic details of state A, and it corresponds to a region of phase space $P(A)$. Similarly, state B corresponds to a region $P(B)$. Given that A reliably evolves to B, any microstate compatible with A must evolve to a microstate compatible with B. Well then, in phase space, start with the volume of all points that are microstates of A. They all evolve into B by the time $t=\tau$, and the phase-space cloud (which has the same volume, because of Liouville's theorem) must be fully contained within $P(B)$, so $\mathrm{vol}(P(B)) \geq \mathrm{vol}(P(A))$.
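Spelled out (my paraphrase of the argument, with $U_\tau$ denoting the time-evolution map), the inequality follows in two steps:

```latex
% Liouville: time evolution preserves phase-space volume
\mathrm{vol}\big(U_\tau P(A)\big) = \mathrm{vol}\big(P(A)\big).
% Reproducibility: every microstate of A ends up as a microstate of B
U_\tau P(A) \subseteq P(B)
\;\Rightarrow\;
\mathrm{vol}\big(P(A)\big) = \mathrm{vol}\big(U_\tau P(A)\big) \le \mathrm{vol}\big(P(B)\big),
% hence, with S = k_B \ln \mathrm{vol}(P(\cdot)), entropy cannot decrease:
S_B - S_A = k_B \ln \frac{\mathrm{vol}(P(B))}{\mathrm{vol}(P(A))} \ge 0.
```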

user42541

My answer will add to others by describing an easy-to-visualize analogy, and then by making a point about irreversibility.

First the analogy.

Compare phase space (i.e. the state space in which the Liouville theorem applies) to a large grassy field. The blades of grass represent small regions of the space. Now imagine a pool of water in a small square container of side $1$ metre, somewhere in the field. This represents a set of initial states. Now let the container move and change shape, while maintaining its area (so the depth of the water is a constant, with the water being incompressible). Suppose the container extends out into a long filament and this filament wraps around itself in a spiral, but the water depth remains constant because the container preserves its area. Now look down on this spiral from above. Say the spiral has a circular outer edge with radius $R = 2$ metres. The water which had area $1\,{\rm m}^2$ to begin with still occupies that same area, but now in a long filament wrapped around in a spiral inside a region of area $\pi R^2 = 12.6\,{\rm m}^2$.

Finally, let's make assertions about macroscopic parameters. We could furnish the values of some macroscopic parameters sufficient to specify the initial state as falling in that square. How about the final state? The best we can do using macroscopic parameters is to say the state is somewhere inside the circle of radius $R$. The macroscopic parameters simply do not provide a fine-detail description. So in terms of macroscopic quantities the final state is indeed one of higher entropy and no contradiction has arisen.

However, a contradiction of the 2nd law (entropy increase) can still arise if a state such as the final one here could somehow evolve into a state such as the initial one. This logically implies one of the following:

  1. either the 2nd law can be broken (so it is no longer a universal law)
  2. or states such as the spiral with reversed velocities never arise in practice
  3. or there is a physical blurring of trajectories such that Liouville's theorem does not in fact describe the dynamics at 100% accuracy

Many physicists at the moment opt for 1 or 2, but I would like to remind readers that 3 is also a viable option. Newton's laws may, for all we know, be approximate at the level of, say, $10^{-100}$ in precision, and chaotic systems will then rapidly amplify the imprecision so that the spiral described above washes out to fill the circle and the water level falls. Similarly, the unitary treatment commonly adopted for quantum mechanics may be approximate. One reason to think that physical evolution is not unitary is that it would require physical variables and quantities to be specified with exponentially growing precision (during chaotic motion), and that goes against the idea that only finite amounts of information can be physically represented in finite volumes of space.

So whereas at the moment the fashion in physics is to say that reversibility is the precise statement and irreversibility is a convenient approximation, one can also take the reverse point of view. The empirical and theoretical evidence is quite strong for this other point of view, where irreversibility is what happens and reversibility is a convenient approximation for many motions over sufficiently small intervals of time.

Andrew Steane

This is a very interesting question. I have often asked myself this question, and first found a good answer by reading Prigogine's works on non-equilibrium thermodynamics. He outlines this paradox and focuses mostly on the microscopic processes. Microscopic processes might be reversible or irreversible. If all microscopic processes are reversible, then the system is not able to produce entropy. In this case Liouville's theorem is fulfilled, and the phase-space volume stays the same at all times. The problem is that, by observing our universe, we realize that there is such a thing as an increase in entropy. The second law of thermodynamics must allow for entropy increase. Therefore a paradox exists, named in another answer here the ergodic problem, and mostly referred to as "Loschmidt's paradox".

Compare this problem to a billiard table (without holes): the balls represent the particles, and once pushed, a ball rolls over the table, colliding with other balls, transferring momentum, and so on. If all processes on this table are reversible, the system will never go back to the quiet state with all balls resting. There will always be a rolling ball, so the phase-space volume will stay the same. In contrast, a specific resting configuration is a single point in phase space. The whole dynamics change once irreversible processes are allowed. If these irreversible processes all obey the 2nd law of thermodynamics, then the balls will come to rest after a finite time. This is what we actually perceive in reality.

So a quite easy solution is to introduce elementary particle processes which are not completely reversible. Unfortunately, whenever physicists unveil the dynamics at these time scales, all processes are found to be reversible. What seems to be an irreversible process, like the fission of an atom, becomes an interaction of smaller particles, neutrons and protons, if one looks closer. These smaller particles then interact with each other using only reversible processes, thus offering a description without irreversible processes. One can then raise a hand and ask, "What about CP violation?" A good objection, but when one looks at CPT symmetry, reversibility is restored. So even going to quantum mechanics does not help, because there is a Liouville theorem in quantum mechanics too: the von Neumann equation. Still, we know that entropy increases when wave functions collapse (or at least stays the same, if the density matrix does not change).

Prigogine stays classical and does not discuss the quantum aspects of this paradox, but he offers a solution for the classical version. He basically says that the idea of elementary particles is flawed. If you assume elementary particles exist, and "elementary" means there is no irreversible process that breaks them up into smaller particles, then Liouville's theorem becomes true, because describing a model world with particles that only obey reversible processes conserves phase space. But the concept of elementary particles might be a human concept, and nature can produce an infinite number of "new particles" if necessary. What is elementary, then, is not particles, but rather irreversible processes. This is why Prigogine starts his book "From Being To Becoming" with a quote from Goethe: "Komm, drücke mich recht zärtlich an dein Herz! Doch nicht zu fest, damit das Glas nicht springe! Das ist die Eigenschaft der Dinge: Natürlichem genügt das Weltall kaum; Was künstlich ist, verlangt geschloßnen Raum."

In English (if someone has a better translation, please post it :D): "Come, press me tenderly to your heart! But not too firmly, so that the glass does not break! That is the nature of things: for the natural, the universe is hardly enough; what is artificial demands enclosed space."