What is the entropy of a hydrogen atom, a bound proton and electron?
First attempt: The standard molar entropy of hydrogen gas is $130.68 \, \mathrm{J\,mol^{-1}\,K^{-1}}$ at $298\,\mathrm{K}$, and $1 \, \text{mol}$ contains $6.02214076 \times 10^{23}$ particles (Avogadro's number).
Therefore, one hydrogen atom has an entropy of $2.17 \times 10^{-22} J/K$.
A thermodynamic definition of entropy is $S=-k_B \sum_{i} p_i \log{p_i}$ where $p_i$ is the probability of a given microstate and $k_B$ is Boltzmann’s constant. This closely resembles the formula for Shannon entropy, $H=-\sum_i p_i \log{p_i}$ where $p_i$ is the probability of a message $m_i$ taken from some message space $M$. See info-entropy relationship.
Dividing by $k_B$ “yields” the Shannon information, which appears to be 15.72 nats, i.e. about 22.7 bits or roughly three bytes. This disagrees with the linked question, which gives a much larger information content.
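For reference, the first-attempt arithmetic checks out numerically (a quick sketch using the molar entropy quoted above and the CODATA value of $k_B$):

```python
import math

S_molar = 130.68         # J/(mol K), standard molar entropy of hydrogen gas at 298 K
N_A = 6.02214076e23      # 1/mol, Avogadro's number
k_B = 1.380649e-23       # J/K, Boltzmann's constant (CODATA)

# Note: 130.68 J/(mol K) is the tabulated value for H2, so dividing by N_A
# gives the entropy per H2 molecule.
S_atom = S_molar / N_A           # J/K per particle
H_nats = S_atom / k_B            # dimensionless Shannon information, in nats
H_bits = H_nats / math.log(2)    # nats -> bits

print(f"S per particle: {S_atom:.3e} J/K")                 # ~2.17e-22 J/K
print(f"H = {H_nats:.2f} nats = {H_bits:.1f} bits "
      f"≈ {H_bits / 8:.1f} bytes")                          # ~15.72 nats ≈ 2.8 bytes
```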
Second attempt:
The probabilities $ P_i $ can be obtained from the Boltzmann distribution:
$ P_i = \frac{e^{-E_i/(k_B T)}}{\sum_j e^{-E_j/(k_B T)}} $
where $ E_i $ is the energy of the $ i $th eigenstate.
The energy levels of a hydrogen atom are given by the formula:
$ E_n = -\frac{13.6 \, \text{eV}}{n^2} $
where $ n $ is the principal quantum number (1, 2, 3, ...).
At $ T = 300 \, \text{K} $, the probability of each energy eigenstate from the formula above is:
$ P_n = \frac{e^{-E_n/(k_B \cdot 300 \, \text{K})}}{\sum_{j=1}^{\infty} e^{-E_j/(k_B \cdot 300 \, \text{K})}} = \frac{e^{13.6 \, \text{eV}/(n^2 k_B \cdot 300 \, \text{K})}}{\sum_{j=1}^{\infty} e^{13.6 \, \text{eV}/(j^2 k_B \cdot 300 \, \text{K})}} $
Calculate the entropy using the formula: $ S = -k_B \sum_{n=1}^{\infty} P_n \log(P_n) $
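Attempting this numerically requires two ad-hoc choices (this is only a sketch, not a claim about the right answer): the infinite sum has to be cut off at some arbitrary `n_max`, and the energies have to be shifted by $E_1$ to keep the exponentials finite, since $e^{13.6\,\text{eV}/(k_B \cdot 300\,\text{K})}$ overflows a float. The shift cancels between numerator and denominator, so each $P_n$ is unchanged:

```python
import math

k_B_eV = 8.617333262e-5          # Boltzmann constant in eV/K
T = 300.0                        # K
E = lambda n: -13.6 / n**2       # hydrogen energy levels, eV (degeneracy ignored,
                                 # as in the formulas above)

n_max = 1000                     # arbitrary cutoff of the infinite sum
# Boltzmann weights relative to the ground state: exp(-(E_n - E_1)/(k_B T))
weights = [math.exp(-(E(n) - E(1)) / (k_B_eV * T)) for n in range(1, n_max + 1)]
Z = sum(weights)
P = [w / Z for w in weights]

S_over_kB = -sum(p * math.log(p) for p in P if p > 0)
print(P[0])        # ground-state probability: 1.0 to many decimal places
print(S_over_kB)   # essentially 0: any finite truncation at 300 K is dominated by n=1
```

Whatever the cutoff, the $n=1$ term outweighs the rest by factors like $e^{394}$, so the truncated entropy comes out as essentially zero.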
The sum in the denominator of $P_n$ diverges.
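The divergence is easy to see numerically: the exponent $13.6\,\text{eV}/(n^2 k_B T)$ tends to $0$, so the terms of the sum tend to $e^0 = 1$ rather than to $0$, and a series whose terms approach 1 cannot converge (including the level degeneracies, which the formulas above omit, only makes this worse). A quick check, with the same assumed constants:

```python
import math

k_B_eV = 8.617333262e-5   # Boltzmann constant, eV/K
T = 300.0                 # K

def term(n):
    """n-th term of the partition sum: exp(+13.6 eV / (n^2 k_B T))."""
    return math.exp(13.6 / (n**2 * k_B_eV * T))

for n in (100, 1000, 10000):
    print(n, term(n))     # terms approach 1 from above as n grows

# Partial sums of the tail beyond n = 100: every term exceeds 1,
# so the sum grows roughly linearly in the cutoff and never converges.
tail = lambda N: sum(term(n) for n in range(100, N))
print(tail(10_000), tail(100_000))
```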
In this paper, the “Shannon entropy” is computed and appears to be something like $3+\ln(\pi)$. Any help in understanding where I’m going wrong would be greatly appreciated!