
I am having trouble understanding the following concepts:

Page 231, Appendix B of Chaos and the Evolving Universe by Sally J. Goerner (http://books.google.ca/books?id=lEu7CTGjdDkC&printsec=frontcover&source=gbs_ge_summary_r&cad=0#v=onepage&q=entropy&f=false) states that the entropy $S$ is $$ S = \ln V $$ where $V$ is the phase-space volume. According to the book, this equation comes from the concept of Boltzmann's entropy.

(Q1) Where does this equation come from? How can we say that entropy equals the logarithm of the phase-space volume? References and an explanation would be appreciated. According to the book, the Boltzmann constant, $k_B$, is taken to be $1$. But the constant actually has a value. Can I take the Boltzmann constant to be equal to $1$?

Also, if the entropy increases, does this mean that the volume decreases?

(Q2) Secondly, can the Kolmogorov entropy from information theory be stated as the logarithm of the phase-space volume, i.e. as equivalent to the entropy from statistical mechanics? I am unsure whether I can replace the Boltzmann entropy with the Kolmogorov-Sinai (KS) entropy.

(Q3) What is the difference between the Gibbs entropy and the Shannon entropy, since the formula (http://en.wikipedia.org/wiki/Entropy_%28statistical_views%29) is the same?


3 Answers


Since you sound like a self-learner, I'll recommend Chris Cramer's free MOOC course on Statistical Molecular Thermodynamics via Coursera. He's a great lecturer and will give you a very clear explanation of the derivation behind your first question within the first 3-4 lectures. Depending on the units one is using, it is common to massage them so that Boltzmann's constant equals one, which makes the mathematics easier.
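
As a rough sketch of what setting the constant to one amounts to (my illustration, not from the course): Boltzmann's relation
$$ S = k_B \ln W \quad\longrightarrow\quad S = \ln W \qquad (k_B = 1) $$
simply measures entropy as a dimensionless number and temperature in units of energy (every $T$ then appears as $k_B T$); multiplying the result by $k_B \approx 1.38 \times 10^{-23}\,\mathrm{J/K}$ restores SI units.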

Your remaining questions can be most easily answered by referencing Arieh Ben-Naim's text A Farewell to Entropy, which, building on E.T. Jaynes' work in statistical mechanics (there's a great fundamental paper he wrote in the mid-20th century), creates an elegant tie relating entropy in statistical mechanics to that of Shannon. You're not very likely to find this type of link in much of the literature, given its recent and growing acceptance; most statistical mechanics texts and journal articles will generally say the two concepts are completely unlinked aside from their names and the forms of their equations.


Q1: The important thing to know is that there are several distinct concepts of entropy in statistical physics and mathematics, and there is no single "the entropy" (life is hard). Only in thermodynamics does the word entropy have a clear meaning by itself, because there it is the Clausius entropy. To answer your question in short: it can be shown that for macroscopic systems (large numbers of particles), in quasi-static processes that an isolated system undergoes, the quantity $\ln V$ behaves as the Clausius entropy does in thermodynamics, that is, it does not change as the process proceeds. So it is sometimes called entropy too (I do not think Boltzmann entropy is a good name for it, since it is not clear whether Boltzmann thought this to be "the entropy"; it is said that he never wrote it in his papers and that it was first written down by Max Planck). It would be better to call it, say, phase-volume entropy or so :-).

The volume of the phase space could be that of the region of phase space accessible to the system, or that of the region corresponding to lower energy than the system has. For ordinary macroscopic systems, these give the same value of entropy.
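
A rough worked illustration of this (mine, not from the book): for a monatomic ideal gas of $N$ particles in a box of volume $V_{\mathrm{box}}$ with total energy $E$, the accessible phase-space volume scales as
$$ V \propto V_{\mathrm{box}}^{N}\, E^{3N/2}, \qquad S = \ln V \approx N \ln V_{\mathrm{box}} + \tfrac{3}{2} N \ln E + \text{const}. $$
In a quasi-static adiabatic expansion $E \propto V_{\mathrm{box}}^{-2/3}$, so the two terms change in opposite directions and $S$ stays constant, just as the Clausius entropy does.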

Q2: I do not know, but there seems to be a connection.

Q3: They are largely the same; the difference is that the Gibbs formula with probabilities is meant for states of a macroscopic physical system. For such a system in thermodynamic equilibrium, the Gibbs formula with the Boltzmann exponential probabilities gives a value that is practically equal to the value of the phase-volume entropy (for the phase-space region that is consistent with the macroscopic variables of the equilibrium state).
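
For reference, the two formulas being compared are, in their standard forms,
$$ S_{\text{Gibbs}} = -k_B \sum_i p_i \ln p_i, \qquad H_{\text{Shannon}} = -\sum_i p_i \log_2 p_i, $$
the same functional up to the factor $k_B$ and the base of the logarithm. In the Gibbs case the $p_i$ are the probabilities of the microstates of a physical system (in equilibrium, the Boltzmann weights $p_i = e^{-E_i/k_B T}/Z$), while in the Shannon case they can be any probability distribution.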

The Shannon expression describes something very different: a degree of uncertainty about the actual value of some variable (say, one character of a message). There is a connection, though; the maximum possible value of the Shannon expression, given fixed average energy and temperature, is almost equal to the phase-volume entropy for the phase-space region that would be assigned were the system isolated and in a state with the same values of the macroscopic quantities (energy, volume, ...). This is the basis of the information-theoretical approach to statistical physics (see the works of Edwin Jaynes on statistical physics: http://bayes.wustl.edu/etj/node1.html ).
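
A minimal sketch of that maximization (the standard Jaynes argument, stated here for illustration): maximize
$$ -\sum_i p_i \ln p_i \quad \text{subject to} \quad \sum_i p_i = 1, \quad \sum_i p_i E_i = \langle E \rangle; $$
solving with Lagrange multipliers gives $p_i = e^{-\beta E_i}/Z$, and the maximized value coincides (in units of $k_B$) with the equilibrium thermodynamic entropy.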


Thought I would throw a bit of history and philosophy of science in here for your amusement, starring none other than Von Neumann and Shannon...

Shannon replied: My greatest concern was what to call it. I thought of calling it 'information', but the word was overly used, so I decided to call it 'uncertainty' . . . John von Neumann, he had a better idea. Von Neumann told me, "You should call it entropy, for two reasons. In the first place your uncertainty function has been used in statistical mechanics under that name. In the second place, and more importantly, no one knows what entropy really is, so in a debate you will always have the advantage."

(McIrvine and Tribus, 1971; see also Tribus, 1988)