This question was asked long ago on this site, but never properly answered.

In classical thermodynamics, entropy is defined only up to an additive constant. In statistical thermodynamics, there is no such freedom. What exactly is the reason for each point of view?

In classical thermodynamics, only entropy changes play a role. Why does this argument not carry over to statistical mechanics?

Also, several formulations of the third law allow the entropy to approach a constant (possibly non-zero) value at the lowest temperatures.

Addenda

  1. The paper by Steane, arxiv.org/abs/1510.02311 (he added it below), shows that the discussion is moot, because absolute entropy values can also be determined in classical thermodynamics. He shows that classical entropy is NOT defined only up to a constant. A great read.

  2. The entropy meant here is the observed entropy.

KlausK

1 Answer

The key point is that the statistical-mechanical entropy is just a name for $$ \eta = -k_B \sum_i p_i \ln p_i \tag{1} $$ (the Gibbs expression in terms of the microstate probabilities $p_i$). It is clear that for a macrostate in which all probabilities but one vanish (a single microstate has $p_i = 1$), we get $\eta = 0$.
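As a minimal numerical sketch of equation $(1)$ (with $k_B$ set to 1 for convenience, and using the standard convention $0 \ln 0 = 0$):

```python
import numpy as np

def gibbs_entropy(p, k_B=1.0):
    """Gibbs entropy eta = -k_B * sum_i p_i ln p_i, with 0*ln(0) taken as 0."""
    p = np.asarray(p, dtype=float)
    nz = p[p > 0]                  # drop zero probabilities: lim p->0 of p ln p = 0
    return -k_B * np.sum(nz * np.log(nz))

# A macrostate realized by a single microstate has eta = 0:
print(gibbs_entropy([1.0, 0.0, 0.0]))                 # 0.0
# A uniform distribution over W microstates gives k_B ln W (Boltzmann's formula):
W = 4
print(gibbs_entropy(np.full(W, 1 / W)), np.log(W))    # both equal ln 4
```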

To identify $\eta$ with the thermodynamic entropy, we need to establish a link between $\eta$ and $$ S = S_0 + \int \frac{\delta q_{\mathrm{rev}}}{T}. $$ Such a link can be established by working with differentials (see the accepted answer to this other question for the case of the canonical ensemble): in a closed system, we have $d\eta = dS$.
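To make that link concrete, here is a small numerical check for the canonical ensemble (the two-level system and the temperature are illustrative choices, not part of the original argument): for the Boltzmann probabilities $p_i = e^{-\beta E_i}/Z$, the Gibbs expression $(1)$ reproduces $(U - F)/T$, i.e. the thermodynamic entropy obtained from $F = U - TS$.

```python
import numpy as np

k_B = 1.0                          # Boltzmann constant set to 1 for convenience
E = np.array([0.0, 1.0])           # illustrative two-level system (energies in units of k_B)
T = 0.7                            # arbitrary temperature for the check
beta = 1.0 / (k_B * T)

Z = np.sum(np.exp(-beta * E))      # canonical partition function
p = np.exp(-beta * E) / Z          # Boltzmann probabilities p_i = exp(-beta E_i) / Z
U = np.sum(p * E)                  # mean energy
F = -k_B * T * np.log(Z)           # Helmholtz free energy

eta = -k_B * np.sum(p * np.log(p))     # Gibbs expression (1)
S_thermo = (U - F) / T                 # thermodynamic entropy from F = U - T S
print(eta, S_thermo)                   # the two agree to machine precision
```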

However, equality of the differentials only determines the functions up to an arbitrary additive constant. Therefore, even though equation $(1)$ is introduced without an additive constant, its relation to the thermodynamic entropy still allows for one. Of course, the constant can be chosen such that the entropy at $0\ \mathrm{K}$ vanishes.
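As a sanity check that any additive constant drops out of measurable quantities, the following sketch (reusing the illustrative two-level system from above) compares the Clausius integral $\int_{T_1}^{T_2} C(T)/T\, dT$ with the difference of Gibbs entropies; both give the same entropy change:

```python
import numpy as np
from scipy.integrate import quad

k_B = 1.0
E = np.array([0.0, 1.0])           # same illustrative two-level system as above

def gibbs_entropy(T):
    """Gibbs entropy of the canonical distribution at temperature T."""
    p = np.exp(-E / (k_B * T))
    p /= p.sum()
    return -k_B * np.sum(p * np.log(p))

def mean_energy(T):
    p = np.exp(-E / (k_B * T))
    p /= p.sum()
    return np.sum(p * E)

def heat_capacity(T, dT=1e-6):
    """C = dU/dT, by central finite difference."""
    return (mean_energy(T + dT) - mean_energy(T - dT)) / (2 * dT)

T1, T2 = 0.5, 2.0
dS_clausius, _ = quad(lambda T: heat_capacity(T) / T, T1, T2)  # integral of C/T dT
dS_gibbs = gibbs_entropy(T2) - gibbs_entropy(T1)
print(dS_clausius, dS_gibbs)       # the two entropy differences coincide
```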