
I have been reading about the Gibbs paradox, in which the assumption that particles of a monoatomic ideal gas are distinguishable leads to a paradox in which entropy is not extensive. In Schroeder's Thermal Physics textbook, this is corrected by assuming that ideal gas particles are indistinguishable, and so the multiplicity function is divided by $N!$ ($N$ = number of particles).

That makes sense to me; however, my confusion arises when applying this logic to other systems, for example the Einstein solid. In the textbook, the formula for the entropy of an Einstein solid is $S = Nk\ln\left(\frac{q}{N}+1\right)$, where $q$ is the number of energy units shared among the oscillators and $N$ is the number of oscillators. This is extensive, since if you replace $q \rightarrow 2q$ and $N \rightarrow 2N$ the entropy doubles exactly. But this formula is derived assuming the particles of the Einstein solid are distinguishable, while on the other hand the ideal gas entropy assumes the particles are indistinguishable.
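As a quick numerical check (assuming the standard Einstein-solid multiplicity $\Omega = \binom{q+N-1}{q}$ derived in the textbook), both the quoted formula and the exact $\ln\Omega$ double when $q \rightarrow 2q$ and $N \rightarrow 2N$:

```python
from math import comb, log

def s_exact(q, N):
    # S/k from the exact Einstein-solid multiplicity Omega = C(q + N - 1, q)
    return log(comb(q + N - 1, q))

def s_approx(q, N):
    # S/k from the quoted formula N * ln(q/N + 1)
    return N * log(q / N + 1)

q, N = 50_000, 10_000
print(s_exact(2 * q, 2 * N) / s_exact(q, N))    # ~2.000 (extensive up to small Stirling corrections)
print(s_approx(2 * q, 2 * N) / s_approx(q, N))  # exactly 2.0
```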

My questions are:

  1. Why is it correct to assume ideal gas particles are indistinguishable, but not correct to assume oscillators in the Einstein solid are indistinguishable? What specifically results in this difference between the two systems, or is there a general principle that can describe when particles should be considered indistinguishable?
  2. How does this notion of indistinguishability play into different definitions of entropy? Specifically, I have read a derivation of the Gibbs entropy formula ($S = -k \sum_i p_i \ln p_i$, where the sum runs over all states $i$ and $p_i$ is the probability of a particle being in the $i$th state). That derivation also seems to assume distinguishability of the particles, since the multiplicity function it starts from is just a multinomial coefficient. So does that mean that this definition of entropy, applied to the monoatomic ideal gas, is incorrect?
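To make question 2 concrete, here is a small numerical illustration of the kind of derivation I mean (the three-state probabilities are made up for the example): for $N$ particles distributed with fractions $p_i = n_i/N$, $\frac{1}{N}\ln\binom{N}{n_1,\dots,n_k}$ approaches $-\sum_i p_i \ln p_i$ as $N$ grows.

```python
from math import comb, log

def log_multinomial(counts):
    # ln of the multinomial coefficient N! / (n_1! n_2! ... n_k!),
    # accumulated as a product of binomial coefficients
    total, result = 0, 0.0
    for n in counts:
        total += n
        result += log(comb(total, n))
    return result

probs = [0.5, 0.3, 0.2]  # made-up occupation fractions for a 3-state example
for N in (100, 10_000, 1_000_000):
    counts = [int(p * N) for p in probs]
    gibbs = -sum(p * log(p) for p in probs)               # -sum p_i ln p_i (in units of k)
    per_particle = log_multinomial(counts) / sum(counts)  # (1/N) ln(multiplicity)
    print(N, round(gibbs, 4), round(per_particle, 4))
```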

2 Answers


Ok, I'm going to attempt an answer.

Let's start by treating the Einstein solid just like we treat an ideal gas: we assume all the oscillators are distinguishable. Then what does a microstate of the Einstein solid look like? It looks like an assignment of energies to each oscillator, plus a description of how the oscillators are arranged in the solid. So the microstates are given by $(E_1,E_2,\dots,E_N; x_1,x_2,\dots,x_N)$ where $E_i$ is the energy of the $i$th oscillator, and $x_i$ is its position. When computing the number of microstates, we can write

$$ \Omega=\sum_{\text{states}} 1 = \sum_{\sigma\in S_N}\ \ \sum_{E_1+\dots+E_N=E_{tot}} 1 $$ where we have summed over the possible permutations $\sigma$ in $S_N$ of the indices $1,2,\dots, N$. But the inner sum does not depend on which permutation we have chosen, and there are $N!$ possible permutations, so we find

$$ \Omega= N!\sum_{E_1+\dots+E_N=E_{tot}} 1 $$

BUT of course we have overcounted, because our oscillators are indistinguishable: each of those $N!$ permutations really referred to the same physical state. So we should divide by $N!$ and get back to where we started:

$$ \Omega= \sum_{E_1+\dots+E_N=E_{tot}} 1 $$
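Here is a tiny brute-force sketch of that counting argument (the values $N=3$, $q=4$ are arbitrary, just small enough to enumerate): keeping track of which labelled oscillator sits on which site multiplies the number of energy assignments by $N!$, and dividing it back out returns the original count.

```python
from itertools import permutations, product
from math import comb, factorial

N, q = 3, 4  # a tiny "solid": 3 oscillators sharing 4 energy units

# energy assignments (E_1, ..., E_N) with E_1 + ... + E_N = q
energy_states = [e for e in product(range(q + 1), repeat=N) if sum(e) == q]
assert len(energy_states) == comb(q + N - 1, N - 1)  # 15

# also recording which labelled oscillator sits on which site counts
# every energy assignment once per permutation of the labels
labelled_states = [(e, perm) for e in energy_states
                   for perm in permutations(range(N))]
assert len(labelled_states) == factorial(N) * len(energy_states)  # 6 * 15 = 90

# dividing the labelled count by N! returns the original count
print(len(labelled_states) // factorial(N), len(energy_states))  # 15 15
```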

Jahan Claes

Also, in an Einstein solid you have indistinguishable particles, so the Gibbs paradox also arises there if you do not take the $\frac{1}{N!}$ into account in the partition function.

To point 2: indistinguishability plays a role in the entropy because Boltzmann's definition is

$S = k_B \ln\Omega$, where $\Omega$ is the number of microstates compatible with the given macrostate.

The formula you have with the probabilities can be derived from the entropy formula above, but indistinguishability plays a role because an $N!$ appears, which you handle with Stirling's approximation:

$\ln N! \approx N \ln N - N$.
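As a quick sanity check of that approximation (a minimal sketch; `lgamma(N + 1)` gives $\ln N!$ exactly), the relative error shrinks as $N$ grows:

```python
from math import lgamma, log

# compare ln(N!) = lgamma(N + 1) with Stirling's N ln N - N
for N in (10, 1_000, 100_000):
    exact = lgamma(N + 1)
    stirling = N * log(N) - N
    print(N, exact, stirling, (exact - stirling) / exact)
```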

This is also quite important if you want to compute the chemical potential $\mu$, because there you have to take the derivative of the logarithm of the partition function with respect to $N$.
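As a concrete illustration (assuming the standard factorization $Z_N = Z_1^N/N!$ for $N$ indistinguishable, non-interacting particles, which is not spelled out above):

$$ \ln Z_N = N\ln Z_1 - \ln N! \approx N\ln Z_1 - N\ln N + N, \qquad \mu = -k_B T\,\frac{\partial \ln Z_N}{\partial N} = -k_B T\ln\frac{Z_1}{N}. $$

Without the $1/N!$, the $\ln N$ term would be absent and $\mu$ would not depend on $N$ at all.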

I hope that helps you a bit :)

Armani42