
I understand that, in thermodynamics, entropy has a precise definition (the infinitesimal change of entropy is the infinitesimal reversible heat transfer divided by the temperature), and that in statistical mechanics, for a system consisting of a large number of identical subsystems, it is (up to a constant) the log of the number of ways the subsystems can be distributed over the available energy levels.
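In symbols, what I have in mind is the usual pair of formulas (just to fix notation; please correct me if I am misstating them):

$$ dS = \frac{\delta Q_{\mathrm{rev}}}{T}, \qquad S = k_B \ln W, $$

where $\delta Q_{\mathrm{rev}}$ is the heat transferred reversibly and $W$ is the number of microstates (distributions of the subsystems over the energy levels) compatible with the macroscopic state.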

And from what I understand, the Second Law of Thermodynamics basically says that, for simple systems such as two reservoirs at different temperatures connected to each other (and isolated from the rest), heat goes from hot to cold, so that the tendency of the entropy to become as large as it possibly can amounts to the system moving towards an equilibrium state (at least for simple systems such as the one above).
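To spell out the two-reservoir example as I understand it: if a small amount of heat $\delta Q > 0$ passes from the hot reservoir at temperature $T_h$ to the cold one at $T_c < T_h$, the total entropy changes by

$$ \delta S = -\frac{\delta Q}{T_h} + \frac{\delta Q}{T_c} = \delta Q \left( \frac{1}{T_c} - \frac{1}{T_h} \right) > 0, $$

so the spontaneous direction of heat flow is precisely the one that increases the total entropy, and the flow stops (equilibrium) once the temperatures are equal.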

(Please forgive me if my descriptions here are not so accurate from a physical point of view, for I am not a physicist, just a mathematician.)

Fine. But I frequently see people linking entropy to chaos. I assume there is some scientific work which started this train of thought, and then the media kept stretching the words further and further. Can someone please point me to that scientific work? Also, is the link between entropy and chaos valid, in the eyes of modern physicists?

I have seen a few questions that overlap with mine, but I have not found the exact answer to my question.

Malkoun

3 Answers


I would say the connection between chaos and entropy is through ergodic theory, and the fundamental assumption of statistical mechanics that a system with a given energy is equally likely to be found in any 'microstate' with that energy.

Although chaos is a very general aspect of dynamical systems, Hamiltonian chaos (encountered in classical mechanics) is characterized by a paucity of conserved quantities, the few that remain being quantities such as energy and total linear/angular momentum. The crucial fact is not that additional conserved quantities are merely difficult to find, but that they do not exist. Because of this, the trajectories of a chaotic dynamical system will trace out a high-dimensional submanifold of phase space, rather than a simple one-dimensional curve. Each trajectory is locally one-dimensional, but if you looked at the set of all points in phase space traced out over all time, you would find a higher-dimensional space, with dimension $2D-N_C$, where $D$ is the number of degrees of freedom (so the phase space has dimension $2D$) and $N_C$ is the number of globally conserved quantities.
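To make the counting concrete (an illustrative example I am adding, assuming total energy is the only conserved quantity): for $N$ particles moving in three dimensions, $D = 3N$ and the phase space has dimension $6N$; with $N_C = 1$, a chaotic trajectory explores the $(6N-1)$-dimensional energy shell

$$ H(q_1,\dots,q_{3N},\,p_1,\dots,p_{3N}) = E, $$

rather than remaining on a low-dimensional curve.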

In most many-body systems, $N_C$ is finite, or at least subextensive (i.e., any 'hidden' conservation laws are insignificant in the scaling limit as volume and particle number go to infinity, while intrinsic parameters such as density and temperature are kept constant). One takes total energy as the single most important conserved quantity, and the rest is textbook statistical mechanics.

Now, the precise type of non-linear behavior that would introduce ergodicity to the system is usually ignored in physics, because everything absorbs and emits radiation, which almost always causes systems to reach equilibrium. However, going a step deeper and considering the self-thermalization of non-linear wave equations historically led to the important discovery of the Fermi-Pasta-Ulam problem. Essentially, the discovery was that many nonlinear differential equations have hidden conservation laws that cause the heuristic argument described above to break down.
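To illustrate what is meant (a minimal sketch of my own, with arbitrary illustrative parameters rather than those of the original study): a chain of masses coupled by weakly nonlinear springs, started with all of its energy in the lowest normal mode, does not simply share that energy democratically among the other modes, which is the surprise at the heart of the Fermi-Pasta-Ulam problem.

    import numpy as np

    # Minimal sketch of the Fermi-Pasta-Ulam (alpha) chain: N oscillators with
    # fixed ends, linear coupling plus a weak quadratic nonlinearity 'alpha'.
    # All parameter values here are illustrative.

    N = 32           # number of moving masses
    alpha = 0.25     # strength of the nonlinear term
    dt = 0.05        # time step of the leapfrog integrator
    steps = 100_000  # number of integration steps

    def acceleration(q):
        """Force on each mass from its neighbours (fixed walls at both ends)."""
        ext = np.concatenate(([0.0], q, [0.0]))   # add the fixed endpoints
        d_right = ext[2:] - ext[1:-1]             # stretch of the right spring
        d_left = ext[1:-1] - ext[:-2]             # stretch of the left spring
        return (d_right - d_left) + alpha * (d_right**2 - d_left**2)

    def mode_energies(q, p):
        """Harmonic energy in each normal mode of the linearised chain."""
        k = np.arange(1, N + 1)
        modes = np.sqrt(2.0 / (N + 1)) * np.sin(
            np.outer(k, np.arange(1, N + 1)) * np.pi / (N + 1))
        Q = modes @ q
        P = modes @ p
        omega = 2.0 * np.sin(k * np.pi / (2 * (N + 1)))
        return 0.5 * (P**2 + (omega * Q)**2)

    # Start with all the energy in the lowest normal mode.
    x = np.arange(1, N + 1)
    q = 4.0 * np.sin(np.pi * x / (N + 1))
    p = np.zeros(N)

    for _ in range(steps):
        p += 0.5 * dt * acceleration(q)   # velocity-Verlet / leapfrog update
        q += dt * p
        p += 0.5 * dt * acceleration(q)

    E = mode_energies(q, p)
    print("fraction of energy still in the lowest four modes:",
          E[:4].sum() / E.sum())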

TLDR

I'll have to disagree that those notions of entropy are disjoint. I'll try to explain my view.

In Statistical Mechanics, entropy is defined in terms of the accessible region of phase space: it is the logarithm of this region's volume times a constant. In deriving this formula from the number of accessible configurations, one postulates that all configurations are equally accessible. This postulate is called the Ergodic Hypothesis. Since you're a mathematician, you're probably familiar with the term ergodic: the evolution preserves a measure (in our case the Liouville measure, which is Lebesgue measure on phase space), and time averages along typical trajectories agree with averages over phase space. Now, not every system is ergodic. Even so, estimates can be carried out showing that in a generic gas, which has a huge number of particles, non-ergodicity would result in an extremely small error in physical measurements (Landau does this in the first volume of his Statistical Physics). Still, systems like spin glasses are canonical examples of non-ergodic systems where the usual Statistical Mechanics is not applicable.
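Written out, the definition described above is (in the standard notation, which I am supplying here)

$$ S = k_B \ln \Gamma(E), $$

where $\Gamma(E)$ is the Liouville volume of the region of phase space accessible to the system at energy $E$ and $k_B$ is Boltzmann's constant.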

You see that the ergodic hypothesis is a key assumption in Statistical Mechanics. But what does chaos mean in Classical Mechanics? Roughly speaking, it means that typical trajectories wander over the whole accessible phase space. If you take a chaotic system (which is not only ergodic but also strongly mixing), a particle's trajectory will come arbitrarily close to each and every bit of the phase space accessible to it, subject only to conservation laws such as energy.
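A minimal numerical sketch of that statement (my own illustration, using the Chirikov standard map as a stand-in for a strongly chaotic Hamiltonian system; the parameters are arbitrary): for a single chaotic trajectory, the fraction of time spent in a region of phase space approaches that region's share of the total accessible area, which is exactly what the ergodic hypothesis asks for.

    from math import sin, pi

    # Chirikov standard map on the torus [0, 2*pi) x [0, 2*pi).  For a large
    # kicking strength K the dynamics is strongly chaotic, so the time average
    # along one trajectory should match the phase-space (area) average.

    K = 7.0              # kicking strength; K = 7 gives an essentially chaotic sea
    n_iter = 500_000
    two_pi = 2.0 * pi

    theta, p = 1.0, 1.3  # an arbitrary initial condition
    time_in_cell = 0     # iterates falling in the quarter cell theta, p < pi

    for _ in range(n_iter):
        p = (p + K * sin(theta)) % two_pi
        theta = (theta + p) % two_pi
        if theta < pi and p < pi:
            time_in_cell += 1

    # For an ergodic trajectory, the fraction of time spent in the cell should
    # approach the cell's relative area, which here is 1/4.
    print("time fraction :", time_in_cell / n_iter)
    print("area fraction :", 0.25)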

The conclusion is that assuming Statistical Mechanics is applicable amounts to assuming you cannot predict individual trajectories in phase space, either because there are too many initial conditions or because you cannot track each and every trajectory, and that over infinite time the trajectories cover the whole accessible phase space. This is intrinsically connected to the notion of chaos in Classical Mechanics.

In Thermodynamics, I think no one really understood what entropy meant, so I can't elaborate on that; it only becomes clear in Statistical Mechanics.

QuantumBrick

The (physical) concept of entropy is predominantly applied to many-particle systems. We can regard such a system as a high-dimensional dynamical system whose dynamical variables comprise the positions, momenta, and other variable properties of all particles. In theory, it can exhibit three types of dynamical behaviour (a small numerical sketch contrasting them follows the list):

  1. A low-dimensional regular (i.e., non-chaotic) dynamics, i.e., a fixed-point, periodic or quasiperiodic¹ one. Such dynamics are possible for very low temperatures, e.g., a completely frozen system would correspond to a fixed-point dynamics and simple lattice vibrations would correspond to periodic dynamics. For higher temperatures, however, such dynamics do not correspond to what we observe in reality and simulations.

  2. A high-dimensional regular dynamics, i.e., a quasiperiodic¹ dynamics. Such a system could be described as the superposition of many independent periodic processes, each having a different, incommensurable frequency. While these processes need not each be tied to a single particle and could be rather obfuscated, there is no reason why they should not interact at all (at high temperature and with sufficiently many processes). Moreover, it can be argued that high-dimensional quasiperiodicity is practically indistinguishable from chaos.

  3. A high-dimensional chaotic dynamics.
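Here is the small sketch referred to above (my own illustration; the three signals are simple stand-ins, not a simulation of an actual many-particle system). The diagnostic is how a tiny perturbation of the initial condition grows: regular dynamics keep nearby states nearby, while chaotic dynamics amplify the difference to order one.

    import numpy as np

    # Toy signals standing in for the three behaviours listed above.
    # We compare each system with a copy whose initial condition is
    # perturbed by eps and record the maximum separation.

    eps = 1e-8                      # size of the initial perturbation
    t = np.arange(0.0, 50.0, 0.01)  # observation times for the regular signals

    # 1. periodic: a single oscillation mode
    periodic = np.sin(2 * np.pi * t)
    periodic_pert = np.sin(2 * np.pi * (t + eps))

    # 2. quasiperiodic: two incommensurable frequencies (ratio sqrt(2))
    quasi = np.sin(2 * np.pi * t) + np.sin(2 * np.pi * np.sqrt(2) * t)
    quasi_pert = (np.sin(2 * np.pi * (t + eps))
                  + np.sin(2 * np.pi * np.sqrt(2) * (t + eps)))

    # 3. chaotic: the logistic map at full nonlinearity (r = 4)
    def logistic_orbit(x0, n=200):
        xs = []
        for _ in range(n):
            xs.append(x0)
            x0 = 4.0 * x0 * (1.0 - x0)
        return np.array(xs)

    chaotic = logistic_orbit(0.3)
    chaotic_pert = logistic_orbit(0.3 + eps)

    for name, a, b in [("periodic", periodic, periodic_pert),
                       ("quasiperiodic", quasi, quasi_pert),
                       ("chaotic", chaotic, chaotic_pert)]:
        print(f"{name:>13}: maximum separation = {np.max(np.abs(a - b)):.2e}")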

So, it makes sense to say that a system that has some entropy (i.e., whose temperature is not close to absolute zero) also exhibits a chaotic dynamics on the microscopic level. But this does not mean that the two are the same. In a multi-particle system, it’s not the mere presence of entropy that we care about but how and when it increases. So, entropy and chaos are as much linked as entropy and temperature or, say, mass and momentum.

Note that this is not really about ergodicity. Complex systems with insurmountable energy barriers (consider spin glasses) can still be chaotic; and quasiperiodic dynamics can be ergodic (consider the example of a single particle moving freely on a square torus, which is ergodic and quasiperiodic if the components of its momentum are incommensurable, but periodic and not ergodic otherwise).
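A small sketch of that torus example (my own illustration; the grid used to gauge coverage is an arbitrary choice): with an incommensurable velocity ratio the single trajectory eventually enters every cell of the grid, while with a commensurable ratio it closes on itself and covers only a one-dimensional curve.

    from math import sqrt

    # Free motion on the unit torus: the position advances linearly and wraps
    # around.  We count how many cells of a coarse grid the trajectory visits.

    def fraction_of_cells_visited(vx, vy, n_steps=200_000, dt=0.01, grid=20):
        x = y = 0.0
        visited = set()
        for _ in range(n_steps):
            x = (x + vx * dt) % 1.0
            y = (y + vy * dt) % 1.0
            visited.add((int(x * grid) % grid, int(y * grid) % grid))
        return len(visited) / grid**2

    print("incommensurable ratio (1 : sqrt 2):", fraction_of_cells_visited(1.0, sqrt(2)))
    print("commensurable ratio   (1 : 2)     :", fraction_of_cells_visited(1.0, 2.0))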

Finally, note that the information-theoretic concept of entropy is used to characterise dynamical systems and to distinguish between chaos and regularity, but I assume this is not the reason for your question.


¹ Quasiperiodic dynamics are dynamics which can be described as a superposition of at least two periodic dynamics but are not themselves periodic (which is why the frequencies of the sub-dynamics must not be commensurable, i.e., must not be rational multiples of each other). While a periodic trajectory traces out a topological circle, a quasiperiodic trajectory densely fills a torus or hypertorus.

Wrzlprmft