
We are all familiar with the Boltzmann-Gibbs-Shannon entropy formula:

$H_{\text{BGS}} = -\sum_{k}p_{k}\log{p_{k}}$

In information theory, this can be interpreted as the expectation value of the "surprise."
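Concretely, the surprise of outcome $k$ is $-\log p_k$, so the formula is just its average over the distribution:

$H_{\text{BGS}} = \sum_{k}p_{k}\left(-\log{p_{k}}\right) = \langle -\log{p}\rangle$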

Using the maximum entropy principle, one can derive the micro, macro, and grand canonical ensembles from this expression. However, when it comes to Bose-Einstein and Fermi-Dirac statistics, these cannot be derived directly. Instead, we need to use the following entropy expressions:

$H_{\text{FD}} = -\sum_{k}(1-p_{k})\log{(1-p_{k})} - \sum_{k}p_{k}\log{p_{k}}$

$H_{\text{BE}} = \sum_{k}(p_{k}+1)\log{(p_{k}+1)} - \sum_{k}p_{k}\log{p_{k}}$
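For concreteness, here is a sketch of the maximization I have in mind (my own working; the Lagrange multipliers $\beta$ and $\mu$ enforce fixed mean energy $\sum_k \epsilon_k p_k$ and fixed particle number $\sum_k p_k$). Setting the derivative of $H_{\text{FD}} - \beta\sum_k\epsilon_k p_k + \beta\mu\sum_k p_k$ with respect to $p_k$ to zero gives

$\log{\frac{1-p_{k}}{p_{k}}} = \beta(\epsilon_{k}-\mu) \quad\Rightarrow\quad p_{k} = \frac{1}{e^{\beta(\epsilon_{k}-\mu)}+1}$

and the same steps applied to $H_{\text{BE}}$ give

$\log{\frac{1+p_{k}}{p_{k}}} = \beta(\epsilon_{k}-\mu) \quad\Rightarrow\quad p_{k} = \frac{1}{e^{\beta(\epsilon_{k}-\mu)}-1}$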

Interestingly, both of these expressions include an additional term. It caught my attention that they involve $1-p_k$ and $p_k+1$, reminiscent of the $\pm 1$ that appear in the Fermi-Dirac and Bose-Einstein distributions. I would like to understand the significance and physical meaning of these terms. Is there an intuitive way to interpret them?

1 Answer


The similarity between $H_{\text{BGS}}$ and $H_{\text{FD}}$, $H_{\text{BE}}$ is purely formal. $H_{\text{BGS}}$ is the entropy of a multiparticle system: $k$ enumerates the states of the whole system and $p_k$ are the probabilities of those states. $H_{\text{FD}}$ and $H_{\text{BE}}$ express the entropy of quantum ideal gases: here $k$ enumerates single-particle states, and $p_k$ are the average numbers of particles occupying these states, not probabilities. To emphasize that the $p_k$ are not probabilities in the latter case, note that in an ideal Bose-Einstein gas $p_k$ can be greater than $1$.
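One way to see this concretely (a sketch assuming a grand canonical ensemble, in which the single-particle modes are statistically independent): a fermionic mode $k$ is either empty, with probability $1-p_k$, or occupied, with probability $p_k$, so the Shannon entropy of its occupation number is exactly the $k$-th term of

$H_{\text{FD}} = -\sum_{k}\left[(1-p_{k})\log{(1-p_{k})} + p_{k}\log{p_{k}}\right]$

For a bosonic mode, the occupation number follows a geometric distribution with mean $p_k$, and its Shannon entropy works out to $(p_k+1)\log(p_k+1) - p_k\log p_k$, i.e. the $k$-th term of $H_{\text{BE}}$.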

Quantum ideal gases become classical in the limit $p_k\ll 1$. In this limit, $H_{\text{FD}}$ and $H_{\text{BE}}$ formally reduce to $H_{\text{BGS}}$, while the $p_k$ are still average particle numbers rather than probabilities. The additional terms in $H_{\text{FD}}$ and $H_{\text{BE}}$ are therefore tied to the quantum statistics of indistinguishable particles.
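As a quick check of this limit (my own expansion): for $p_k\ll 1$ one has $-(1-p_k)\log(1-p_k)\approx p_k$ and $(p_k+1)\log(p_k+1)\approx p_k$, so

$H_{\text{FD}},\; H_{\text{BE}} \;\approx\; \sum_{k}p_{k} - \sum_{k}p_{k}\log{p_{k}}$

The extra term is just $\sum_k p_k = N$, a constant at fixed particle number, and each $p_k$ is negligible next to $-p_k\log p_k$ when $p_k\ll 1$, so only the $H_{\text{BGS}}$-like piece survives.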

Gec