
The Boltzmann entropy equation is commonly used in the statistical interpretation of entropy to relate the entropy $S$ to the number of microstates $\Omega$:

$$S=k\ln(\Omega) \, .$$
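
For concreteness, this relation can be sketched numerically. (The two-state-spin example and the mole-sized particle number below are my own illustration, not part of the question.)

```python
import math

# Boltzmann constant in J/K (exact CODATA value)
k_B = 1.380649e-23

def boltzmann_entropy(ln_omega):
    """S = k ln(Omega). We take ln(Omega) as the argument directly,
    since Omega itself overflows floats for macroscopic systems."""
    return k_B * ln_omega

# Illustrative system: N independent two-state spins, so Omega = 2^N
# and ln(Omega) = N ln 2.
N = 6.022e23  # roughly one mole of spins
S = boltzmann_entropy(N * math.log(2))
print(S)  # entropy in J/K, about 5.76 J/K
```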

The classical thermodynamic entropy $S$ is related to heat $Q$ and absolute temperature $T$:

$$dS=\frac{\delta Q}{T} \, .$$

Now, the number of microstates $\Omega$ for the distribution of energy to the particles in a system happens to depend on temperature $T$, the partition function $P$, and the number of particles $n$ (assuming constant volume):

$$\ln(\Omega_{\mathrm{thermal}}) = n\ln(P) + \frac{U}{kT}$$

$$P=\sum_i \exp \left( -\frac{E_i}{kT} \right) \, .$$
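
The two formulas above can be sketched as follows. (The two-level spectrum, particle number, and temperatures are assumptions for illustration only.)

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def partition_function(energies, T):
    """Single-particle partition function P = sum_i exp(-E_i / kT)."""
    return sum(math.exp(-E / (k_B * T)) for E in energies)

def mean_energy(energies, T):
    """Mean energy per particle under the Boltzmann distribution."""
    Z = partition_function(energies, T)
    return sum(E * math.exp(-E / (k_B * T)) for E in energies) / Z

def ln_omega_thermal(energies, n, T):
    """ln(Omega_thermal) = n ln(P) + U/(kT), with U = n * <E>."""
    P = partition_function(energies, T)
    U = n * mean_energy(energies, T)
    return n * math.log(P) + U / (k_B * T)

# Hypothetical two-level system with level spacing k_B * 100 K:
levels = [0.0, k_B * 100.0]
print(ln_omega_thermal(levels, n=1000, T=50.0))    # smaller at low T
print(ln_omega_thermal(levels, n=1000, T=5000.0))  # larger at high T
```

The printed values show the temperature dependence the question points to: $\ln(\Omega_{\mathrm{thermal}})$ grows with $T$, approaching $n\ln 2$ for this two-level example.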

So it seems that the dependence of $\Omega_{\mathrm{thermal}}$ on temperature in the above equation links the classical thermal entropy and the statistical thermal entropy.

However, I'm not aware of $\Omega_\text{config}$, the number of spatial configuration microstates, as having a similar dependence on temperature $T$ or internal energy $U$.

So why is the Boltzmann entropy equation also used for the case of configuration entropy?

Doing so would seem to imply that there is heat transfer involved in changing the configuration entropy.
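
The contrast can be made concrete with a counting sketch. (The lattice-gas model, site counts, and particle number are assumptions for illustration; the question does not specify a model.)

```python
import math

def ln_omega_config(n_sites, n_particles):
    """ln of the number of spatial configurations for n_particles
    indistinguishable particles on n_sites lattice sites:
    Omega_config = C(n_sites, n_particles).
    Note that neither T nor U appears anywhere in the count."""
    return (math.lgamma(n_sites + 1)
            - math.lgamma(n_particles + 1)
            - math.lgamma(n_sites - n_particles + 1))

# Doubling the available volume (sites) at fixed particle number raises
# ln(Omega_config) by about n ln 2 in the dilute limit -- the familiar
# entropy of free expansion, with no heat term in the count.
n = 100
print(ln_omega_config(10_000, n))
print(ln_omega_config(20_000, n))
```

The increase between the two printed values is close to $n\ln 2 \approx 69.3$, purely from the volume change.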

DanielSank

2 Answers


The Boltzmann entropy equation IS configurational entropy.

Also, the third equation you have written is wrong, or at least $P$ isn't pressure: $P$ has to be dimensionless. If I had to guess, I'd say $P$ is a probability and that the expression comes from considering the microcanonical ensemble, i.e., an ensemble in which the energy, the volume, and the number of particles are fixed.

The connection between the Boltzmann entropy and the thermodynamic entropy, to my knowledge, comes when you take the entropy

$$S=-k\int dq\,dp\, P(q,p)\log P(q,p)+\text{constraints},$$

and maximize it; you then obtain equations which are very similar to the thermodynamic ones. However, this is from a mathematical standpoint.
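
A small numerical check of this connection (the discrete four-level spectrum and the value of $\beta$ are my own assumptions): the maximizer of the Gibbs entropy under fixed mean energy is the Boltzmann distribution $p_i = e^{-\beta E_i}/Z$, and its Gibbs entropy reproduces the thermodynamic relation $S/k = \ln Z + \beta U$.

```python
import math

# Arbitrary discrete spectrum, in units where k = 1.
energies = [0.0, 1.0, 2.0, 3.0]
beta = 0.7

# Boltzmann distribution p_i = exp(-beta E_i) / Z.
Z = sum(math.exp(-beta * E) for E in energies)
p = [math.exp(-beta * E) / Z for E in energies]

# Gibbs entropy of that distribution vs. the thermodynamic combination.
U = sum(pi * E for pi, E in zip(p, energies))
S_gibbs = -sum(pi * math.log(pi) for pi in p)
S_thermo = math.log(Z) + beta * U

print(abs(S_gibbs - S_thermo) < 1e-12)  # True: the two entropies agree
```

The agreement is exact here by construction, since $\ln p_i = -\beta E_i - \ln Z$, so $-\sum_i p_i \ln p_i = \beta U + \ln Z$.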

There is an interesting discussion about this here: Proving that the Boltzmann entropy is equal to the thermodynamic entropy

Henry Kel

The thermodynamic entropy is normally unknown. But you can use the old trick of assigning all plausible cases the same probability. Of course, you obtain some unphysical results, but at least you don't have the problem of discarding feasible results.

It is not really a physics-specific trick. It is used in statistical treatments of data in general, when there is no other option.

Of course, you have to decide on a cutoff, as you always do in statistics. In Bayesianism they set very high cutoffs (1%, 0.5%) and afterwards say "statistics fails", "science fails" and other sensationalist commentary. There is a physically realistic cutoff (you can check my other answers). It is very low, $\approx 10^{-25}$, for the probability when you don't know the prior distribution (which is often assumed but rarely actually known; it makes no sense to be cautious about cutoffs but not about the form of the distribution), but it is physically justified.