
The entropy of an ideal gas is defined as Boltzmann's constant times the logarithm of the number of possible states of the gas:

$$ S = k_B \log \Omega. $$

In deriving the Maxwell-Boltzmann distribution, we start by counting a finite number of states, so this definition of entropy makes sense. In the end, however, we say that the number of possible states is so large that we can actually treat the distribution as continuous. But if the distribution is continuous, the number of possible states is infinite. So why is entropy not always infinite when a continuous distribution is used?
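
To make the worry concrete, here is a minimal Python sketch (the uniform distribution is just a toy example of mine): discretizing a continuous distribution into $n$ equal cells gives $S = k_B \ln n$, which grows without bound as the cells shrink.

```python
# Toy illustration: the discrete entropy of a continuous distribution
# diverges as the discretization is refined.
import numpy as np

k_B = 1.380649e-23  # Boltzmann constant, J/K

def discrete_entropy(n_bins):
    # Uniform density on [0, 1) split into n_bins equal cells,
    # so each cell carries probability p = 1/n_bins.
    p = np.full(n_bins, 1.0 / n_bins)
    return -k_B * np.sum(p * np.log(p))  # equals k_B * ln(n_bins)

for n in (10, 1000, 100000):
    print(n, discrete_entropy(n) / k_B)  # prints ln(n): 2.3, 6.9, 11.5
```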


2 Answers


That the distribution is continuous is exactly true only in the thermodynamic limit, in which case $E$, $V$, $N$, and $S$ are all infinite, while $E/V$, $N/V$, and $S/V$ are all finite (or, equivalently, $E/N$, $V/N$, and $S/N$ are finite). In a real-life situation where the volume isn't infinite, the distribution is technically discrete. But usually the volume is macroscopic while the length scales in the problem are microscopic, in which case treating the distribution as continuous is an extraordinarily good approximation. The total entropy is then enormous but finite.
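
To get a feel for the numbers, here is a rough sketch (helium in a 1-litre box at room temperature; all numbers are assumed example values, not from the answer above). For a particle in a one-dimensional box the levels are $E_n = n^2 h^2 / (8 m L^2)$, and near the thermal energy the spacing between adjacent levels comes out about nine orders of magnitude below $k_B T$:

```python
# Rough estimate: energy-level spacing for a particle in a macroscopic
# box versus the thermal energy, showing why a continuous spectrum is
# an excellent approximation. 1-D levels: E_n = n^2 h^2 / (8 m L^2).
import math

h = 6.62607015e-34   # Planck constant, J*s
k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.646e-27        # helium-4 atom mass, kg (assumed example)
L = 0.1              # side of a 1-litre cubic box, m
T = 300.0            # room temperature, K

eps = h**2 / (8 * m * L**2)        # ground-level energy scale
n_th = math.sqrt(k_B * T / eps)    # quantum number at the thermal energy
gap = (2 * n_th + 1) * eps         # spacing E_{n+1} - E_n near n_th

print(f"thermal quantum number n_th ~ {n_th:.1e}")       # ~ 2e9
print(f"level spacing / k_B T ~ {gap / (k_B * T):.1e}")  # ~ 1e-9
```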


> So why is entropy not always infinite when a continuous distribution is used?

If the space of possible states is a continuous region, then the original definition of entropy is not useful (all entropies would be the same: infinite). One may introduce a different definition:

$$ S = k_B \ln \Omega $$

where $\Omega$ is the volume of the region (as opposed to the number of points in that region). This is sometimes done in statistical physics (pre-quantum statistical physics).
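
As a sketch of how this plays out (my illustration, with assumed example values): the continuum analogue of the state count is an integral over a density, $S = -k_B \int f \ln f \, dv$, which is finite, though its value shifts when the units of $v$ change; quantum mechanics later fixes the cell size via Planck's constant.

```python
# Differential entropy of the 1-D Maxwell-Boltzmann velocity density,
# which is a Gaussian with variance k_B*T/m. It is finite, unlike a
# naive count of continuum states.
import numpy as np

k_B = 1.380649e-23   # Boltzmann constant, J/K
m = 6.646e-27        # helium-4 atom mass, kg (assumed example)
T = 300.0            # temperature, K

sigma = np.sqrt(k_B * T / m)                   # thermal velocity spread
v = np.linspace(-8 * sigma, 8 * sigma, 100001)
dv = v[1] - v[0]
f = np.exp(-v**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

S_num = -k_B * np.sum(f * np.log(f)) * dv      # numerical integral
S_exact = 0.5 * k_B * np.log(2 * np.pi * np.e * sigma**2)  # Gaussian formula

print(S_num, S_exact)  # agree, and both are finite
```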