While having a look at a book on statistical physics (*Statistical Physics and Protein Folding*), I came across an argument for the principle of maximum entropy which I don't understand. It goes as follows. Assume there are two subsystems $S_1$ and $S_2$ in contact with each other, with total energy $E = E_1 + E_2$. The total energy is constant, but the subsystems are allowed to transfer energy to each other. Assume that we have an experimental resolution $\Delta$ for the energy levels of the system, and that $E_0$ is the minimum energy the subsystems can achieve.

Let $\Gamma_1(E_1)$ and $\Gamma_2(E_2) = \Gamma_2(E - E_1)$ be the total number of accessible states in each subsystem at the given energy $E_1$. Then the total number of states of the whole system is $\sum_{E_0 < E_1 < E} \Gamma_1(E_1)\Gamma_2(E - E_1)$, and the total entropy is
$$ S(E) = k_B \ln\left(\sum_{E_0 < E_1 < E} \Gamma_1(E_1)\Gamma_2(E - E_1)\right). $$
Now assume that the maximum term in the summation is attained at $E_1 = \bar{E}_1$. Then the total entropy has the lower bound
$$ k_B \ln\!\left(\Gamma_1(\bar{E}_1)\Gamma_2(E - \bar{E}_1)\right) $$
and the upper bound
$$ k_B \ln\!\left(\Gamma_1(\bar{E}_1)\Gamma_2(E - \bar{E}_1)\right) + k_B \ln\!\left(\frac{E}{\Delta}\right). $$

The book then goes on to say that in a macroscopic system of $N$ particles we expect both $S$ and $E$ to be of order $N$. Therefore we can write
$$ S(E) = k_B \ln\Gamma_1(\bar{E}_1) + k_B \ln\Gamma_2(E - \bar{E}_1) + O\!\left(\ln\frac{N}{\Delta}\right). $$
So if we "neglect" the last term, then
$$ S(E) = S_1(\bar{E}_1) + S_2(\bar{E}_2), $$
with $\bar{E}_2 = E - \bar{E}_1$.

Now the principle of maximum entropy says that the entropy of an isolated system never decreases. I don't quite understand how the equation above gives us this, since I don't see how it says anything about the subsystems $S_1$ or $S_2$. I don't think it implies that the system $S_1$ prefers the energy $\bar{E}_1$; moreover, as our ability to make more precise measurements increases and $\Delta \rightarrow 0$, the error term $\ln(N/\Delta)$ grows, so it sort of says that the above equality is susceptible to fluctuations. Can somebody clarify these points?
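For reference, here is how I understand the two bounds to arise (assuming the sum runs over energy bins of width $\Delta$, so roughly $E/\Delta$ terms at most): since every term is positive and no term exceeds the maximal one,
$$
\Gamma_1(\bar{E}_1)\,\Gamma_2(E-\bar{E}_1)
\;\le\; \sum_{E_0 < E_1 < E} \Gamma_1(E_1)\,\Gamma_2(E-E_1)
\;\le\; \frac{E}{\Delta}\,\Gamma_1(\bar{E}_1)\,\Gamma_2(E-\bar{E}_1),
$$
and taking $k_B \ln$ of all three sides gives the lower and upper bounds quoted above (more precisely the count of terms is $(E - E_0)/\Delta$, but $E_0$ doesn't matter at this level of accuracy).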
2 Answers
I think you're misreading their argument. What they're presenting isn't a proof of the second law of thermodynamics; it's a proof that the entropy is dominated by states in which the energy is within $\Delta$ of being shared in a certain way. You're certainly correct that if $\Delta$ is small the error term gets big. That makes sense, because it's not true for all $\Delta$ that the energy is within $\Delta$ of being shared in the preferred way.
Although this argument could be part of a proof of the second law starting from statistical principles, one way to see that it isn't complete for that purpose is that it doesn't mention time.
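To make the "dominated by the maximum term" point concrete, here is a quick numerical sketch with a toy density of states of my own choosing (ideal-gas-like, $\ln\Gamma_i(E_i) = \tfrac{3}{2}N_i\ln E_i$); the values of $N_1$, $N_2$, $E$ and $\Delta$ are arbitrary and not from the book:

```python
import numpy as np

# Toy model: two ideal-gas-like subsystems sharing total energy E
# on an energy grid of resolution Delta.
N1, N2 = 500, 1000          # particle numbers (arbitrary)
E, Delta = 100.0, 1e-3      # total energy and resolution (arbitrary units)

E1 = np.arange(Delta, E, Delta)                      # allowed values of E1
log_terms = 1.5 * N1 * np.log(E1) + 1.5 * N2 * np.log(E - E1)  # ln of each term

# ln of the full sum, computed stably with the log-sum-exp trick
log_max = np.max(log_terms)
log_sum = log_max + np.log(np.sum(np.exp(log_terms - log_max)))

print("ln(sum of terms) :", log_sum)
print("ln(maximum term) :", log_max)
print("difference       :", log_sum - log_max)
print("ln(E/Delta) bound:", np.log(E / Delta))
```

With these numbers, $\ln$(sum) and $\ln$(max term) are both of order several thousand (i.e. of order $N$), while their difference sits below the $\ln(E/\Delta) \approx 11.5$ bound, so neglecting it changes nothing at leading order.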
Yes, the equation doesn't imply any preferred state. It is the second law that states that, for an isolated system, the entropy will keep increasing until it reaches its maximum. This is a law, so there is no need for a proof.