
Sorry, but I have lost the reference to the author who said this.

Is $k \ln 2$ the largest or the smallest entropy that a single particle can carry?

And why is this so?

EDIT: It was a paper from the 1970s.

KlausK

1 Answer

In units where $k=1$, the entropy of a system is the logarithm of the "multiplicity" of indistinguishable microscopic states that correspond to its observable macroscopic state. The usual symbol for multiplicity, in modern textbooks, is $\Omega$, and this statistical definition of entropy (due to Boltzmann and engraved on his tombstone) is $S=k\ln\Omega$.

This definition of entropy makes it clear why entropy is an "extensive" quantity, proportional to the size of the system like mass or volume or particle number or internal energy, rather than an "intensive" quantity like pressure or temperature. If a volume of gas has entropy $S$, there are $e^{S/k}=\Omega$ indistinguishable ways for its molecules to arrange themselves. The total multiplicity of two adjacent samples, with entropies $S_1$ and $S_2$, is (in $k=1$ units)

$$ \Omega=e^S=e^{S_1+S_2}=e^{S_1}e^{S_2}=\Omega_1\Omega_2 $$

from the experimental fact that entropies add. This is like saying that if I have a shuffled deck of cards in one of $52!$ orderings, and you have your own deck of cards in one of $52!$ orderings, the total number of states of our two-deck system is $(52!)^2$. However, if we shuffle our decks together, the number of microstates becomes $104!\gg(52!)^2\gg2^{52}52!$, so we are unlikely to ever re-separate your deck from my deck just by randomly shuffling. This is the statistics behind the "entropy of mixing," and is the statistical-thermodynamical reason why you can't get the yellow and blue Play-Doh back after you've made green with them.
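The deck-of-cards comparison is easy to check numerically. Here is a short sketch (in $k=1$ units), using log-factorials so the huge numbers stay manageable; the entropy of mixing comes out as $\ln\binom{104}{52}$, since $104!/(52!)^2=\binom{104}{52}$:

```python
import math

def log_factorial(n):
    # ln(n!) computed via the log-gamma function, ln(n!) = lgamma(n + 1),
    # to avoid forming astronomically large integers
    return math.lgamma(n + 1)

# Entropy (k = 1 units) of two separate shuffled decks vs. one combined deck
S_separate = 2 * log_factorial(52)   # ln((52!)^2)
S_mixed = log_factorial(104)         # ln(104!)

# The difference is the entropy of mixing, ln(104! / (52!)^2) = ln C(104, 52)
S_mixing = S_mixed - S_separate
print(S_separate, S_mixed, S_mixing)  # S_mixing is about 69.5
```

Since $e^{69.5}\sim 10^{30}$, a random shuffle of the combined deck essentially never lands back in a "your deck / my deck" configuration, which is the statistical content of the mixing argument above.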

If you restrict yourself to a spin degree of freedom, a spin-zero particle has zero spin entropy, since it has only a single state ($\Omega=1$, and $\ln 1=0$). A spin-half particle, whose orientation has multiplicity $\Omega=2$, may carry spin entropy $S=k\ln2$, which is known as "one bit" in information theory.
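For concreteness, here is a minimal sketch of what one bit of entropy amounts to in SI units, using the exact SI value of the Boltzmann constant (the numeric value is my addition, not part of the original answer):

```python
import math

k = 1.380649e-23  # Boltzmann constant in J/K (exact by the SI definition)

# Spin-half particle: Omega = 2 microstates, so S = k * ln(Omega) = k * ln 2
S_bit = k * math.log(2)
print(S_bit)  # roughly 9.57e-24 J/K -- one bit of entropy
```

The tiny size of this number is why, as noted below, thermodynamic multiplicities are so huge that the integer quantization of $\Omega$ never matters in practice.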

If multiplicity is quantized in integer steps, as for card-counting and other probabilistic systems, then the bit is the smallest nonzero entropy. In thermodynamic systems, we deal with huge multiplicities, and the quantization of $\Omega$ is irrelevant.

rob