
From what I gather (and please correct me if I'm wrong), Jaynes argues that thermodynamic and information entropy are the same: the assumption in statistical thermodynamics that the energy distribution attained is the one that maximizes the number of ways energy can be distributed is equivalent to assuming the maximum-ignorance distribution, which seems to me a subjective concept, or at least one relating to information.
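For concreteness, the two quantities I mean are Shannon's information entropy and the Gibbs form of the thermodynamic entropy,

$$H = -\sum_i p_i \log_2 p_i, \qquad S = -k_B \sum_i p_i \ln p_i,$$

which, once the $p_i$ are read as microstate probabilities, differ only by the constant $k_B$ and the base of the logarithm.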

However, isn't this similarity just a coincidence? There are physical reasons (the second law) why energy spreads over the maximum number of ways, and these have nothing to do with the fact that the resulting distribution happens to maximize our inability to describe the energy (maximizes our ignorance).

(Here I am talking about classical physics. I realize the uncertainty principle may unite information and thermodynamics in quantum mechanics.)

Update: The discussion in the link below claims that information and thermodynamic entropy are independent specific examples of a more general concept, not that they are equivalent.

Is information entropy the same as thermodynamic entropy?


2 Answers


We needn't go to QM for this. If you take a lecture course in classical thermodynamics, the second law is derived from informational considerations, although it is typically phrased as maximising the number (usually denoted $W$ or $\Omega$) of microstates compatible with a macrostate. One can then show that the second law's entropy-increasing formulation is equivalent to heat flowing from warmer bodies to cooler ones, but again the informational analysis is fundamental (it is even needed to define what temperature is).
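To see this concretely, here is a small numerical sketch (the Einstein-solid model and all the numbers are illustrative choices, not part of the argument above): two solids share a fixed number of energy quanta, and simply maximising the microstate count $W$ picks out the equilibrium split, at which the temperatures defined by $1/T = \partial S/\partial E$ (with $S = \ln W$ in units of $k_B$) agree.

```python
# Toy illustration: two Einstein solids exchanging energy quanta.
# Sizes below are arbitrary illustrative values.
from math import comb, log

def multiplicity(N, q):
    """Microstates of an Einstein solid with N oscillators holding
    q quanta: Omega(N, q) = C(q + N - 1, q)."""
    return comb(q + N - 1, q)

N_A, N_B = 300, 200   # oscillators in each solid (hypothetical)
q_total = 100         # total quanta shared between them (hypothetical)

# The most probable macrostate maximizes the combined multiplicity
# Omega_A(q_A) * Omega_B(q_total - q_A) -- the "maximize W" step.
q_A = max(range(q_total + 1),
          key=lambda q: multiplicity(N_A, q) * multiplicity(N_B, q_total - q))

def beta(N, q):
    """1/T in units of k_B per quantum, from the discrete slope
    of S = ln Omega with respect to energy."""
    return log(multiplicity(N, q + 1)) - log(multiplicity(N, q))

print(f"most probable split: q_A = {q_A}, q_B = {q_total - q_A}")
print(f"1/T_A = {beta(N_A, q_A):.4f}, 1/T_B = {beta(N_B, q_total - q_A):.4f}")
# The two inverse temperatures coincide (up to the discrete step),
# so equal temperature falls out of the microstate count itself.
```

Changing $N_A$ or $N_B$ moves the split, but the two inverse temperatures always meet at the maximum of $W$.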

J.G.

The similarity pointed out by Jaynes between thermodynamic entropy and information entropy could be coincidental, but its appeal is too powerful to resist. Just think about it:

  • Information entropy: Someone tells us they have a coin in their pocket and asks us to guess the probability of heads and tails. They won't let us see the coin, let alone experiment with it. Knowing nothing else and forced to give an answer, most people would say the probability is 50-50. It turns out that this is the assignment that maximizes the Shannon entropy over all possible distributions we could assign to the coin (see the numeric check after this list).
  • Thermodynamic entropy: A system at constant energy, volume, and number of particles exists in an enormous number of possible microstates. We want to guess their probability distribution, but we cannot see the microstates, nor can we play with them to find their relative probabilities. Knowing nothing else and forced to give an answer, we say that all microstates are equally probable. Again, it turns out that this is the distribution that maximizes the Shannon entropy among all possible distributions we could assign to the microstates (the same check below confirms this).
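Both bullet points can be checked numerically. Here is a minimal sketch (the number of microstates, the random seed, and the sample size are arbitrary illustrative choices):

```python
# Shannon entropy is maximized by the 50-50 coin and, for a system
# of n microstates, by the uniform distribution.
import numpy as np

def shannon_entropy(p):
    """H(p) = -sum_i p_i ln p_i, with the convention 0 ln 0 = 0."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# Coin: scan the bias; the entropy peaks at p = 1/2.
biases = np.linspace(0.01, 0.99, 99)
best = biases[np.argmax([shannon_entropy([b, 1 - b]) for b in biases])]
print("bias that maximizes H:", best)          # -> 0.5

# Microstates: no randomly drawn distribution on n outcomes beats uniform.
rng = np.random.default_rng(0)
n = 6
print("H(uniform)    =", shannon_entropy(np.full(n, 1 / n)))   # ln 6
samples = rng.dirichlet(np.ones(n), size=10_000)
print("best random H =", max(shannon_entropy(s) for s in samples))
```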

The argument does not "prove" anything, but it is hard to ignore. To Jaynes it was so profound that the first version of his paper on this topic was titled How Does the Brain Do Plausible Reasoning?. The paper was rejected (the review and Jaynes's response can be found at the end of the manuscript in the previous link), but its subsequent reincarnation as Information Theory and Statistical Mechanics has some 15,000 citations as of today.

Themis