
Boltzmann's entropy formula: $S = k_{\mathrm{B}} \ln \Omega$

where $\Omega$ is the number of real microstates corresponding to the gas's macrostate.
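For concreteness, a minimal numerical sketch of what the formula does (the value of $\Omega$ here is made up, and `scipy` is just one convenient source for the value of $k_{\mathrm{B}}$):

```python
# Minimal sketch: evaluate S = k_B * ln(Omega) for an assumed microstate count.
import math
from scipy.constants import Boltzmann as k_B  # k_B in J/K

Omega = 1e23                  # hypothetical number of microstates
S = k_B * math.log(Omega)
print(f"S = {S:.3e} J/K")     # ~ 7.3e-22 J/K
```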

Let's assume that we are talking about an ideal gas in a fixed, closed, isolated container, with $n+1$ atoms.

It seems that the number of possible microstates in the following two situations is the same. Is their entropy the same ($x>y>0$)?

    1. $n$ atoms have energy $x$, and one atom has energy $y$;
    2. $n$ atoms have energy $x/2$, and one atom has energy $y + nx/2$.

There is a bijection between the microstates of 1 and 2, so the number of possible microstates for these two is the same.

However, intuitively I would expect that 2 should be more likely, since each time two atoms interact, I would expect some energy to be transferred from the higher-energy atom to the lower-energy atom.

Where am I getting things wrong?

And generally, how do we compute the number of possible microstates of a system as in the two cases above? (I wonder how experimental physicists use the formula in practice.)

Kaveh
  • 207

2 Answers


When discussing formulae for entropies, we need to be careful about the realm of validity of these formulae.

As I also explain here, the Boltzmann entropy formula $S = k \ln \Omega$ is specifically for microcanonical macrostates in thermodynamic equilibrium or those that otherwise have all microstates equiprobable.

But the question has just written down the possible states of a system, not a macrostate. Asking what the entropy of these systems is is not a meaningful question, since entropy is a property of a macrostate (= a probability distribution over the microstates of a system), not of a system.

If you assume both systems are in a macrostate where all states are equally likely, then indeed both these macrostates will have the same entropy, but this observation doesn't really mean anything: since the states have different energies, this kind of equiprobable macrostate is not an equilibrium state for either of these systems, and there is no physical reason we should specifically look at these particular macrostates.
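To make the distinction concrete, here is a small sketch (an illustration, not something from the question) of entropy as a property of a probability distribution over microstates, via the Gibbs formula $S = -k_{\mathrm{B}} \sum_i p_i \ln p_i$; for an equiprobable distribution over $\Omega$ microstates this reduces to $k_{\mathrm{B}} \ln \Omega$, while a different distribution over the same microstates gives a different entropy:

```python
# Sketch: a macrostate is a probability distribution over microstates, and its
# entropy is S = -k_B * sum_i p_i ln p_i. For a uniform distribution over
# Omega microstates this reduces to the Boltzmann formula k_B * ln(Omega).
import math

k_B = 1.380649e-23  # J/K

def gibbs_entropy(probs):
    return -k_B * sum(p * math.log(p) for p in probs if p > 0)

Omega = 4
uniform = [1.0 / Omega] * Omega   # equiprobable macrostate
peaked = [0.7, 0.1, 0.1, 0.1]     # different macrostate over the same microstates

print(gibbs_entropy(uniform))     # equals k_B * ln(4)
print(k_B * math.log(Omega))
print(gibbs_entropy(peaked))      # smaller: the entropy depends on the distribution
```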

ACuriousMind
  • 132,081

You're comparing the two microstates incorrectly.

Scenario 1

  • There are $n$ atoms with energy $x$ each.
  • There is one atom with energy $y$.

Scenario 2

  • There are $n$ atoms with energy $x/2$ each.
  • There is one atom with energy $y + nx/2$.

Now, let's analyze these scenarios in terms of the number of microstates:

In Scenario 1, you have $n$ identical atoms with energy $x$ and one atom with energy $y$. The number of microstates for this scenario is determined by the number of ways you can distribute the energy $x$ among the $n$ identical atoms and the number of ways you can assign the energy $y$ to the remaining atom. This can be calculated using combinatorics.

The number of ways to distribute $x$ energy units among $n$ identical atoms is given by a stars-and-bars counting argument. Let's call this $\Omega_1$:

$\Omega_1 = \binom{n+x-1}{x} = \frac{(n+x-1)!}{x!\,(n-1)!}$

The number of ways to distribute $y$ energy units to the one remaining atom is simply 1.

So, in total, the number of microstates for Scenario 1 is $\Omega_1 = \frac{(n+x-1)!}{x!\,(n-1)!}$.
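As a sanity check on this counting (a sketch that assumes energy comes in integer units and treats the atoms as distinguishable boxes, as the stars-and-bars formula implicitly does), one can brute-force the count for small numbers and compare with the closed form:

```python
# Sketch: brute-force count of ways to distribute x indistinguishable energy
# units over n atoms, compared with the stars-and-bars result C(n + x - 1, x).
from itertools import product
from math import comb

def count_distributions(n, x):
    # Count n-tuples of non-negative integers that sum to x.
    return sum(1 for levels in product(range(x + 1), repeat=n) if sum(levels) == x)

n, x = 4, 6
print(count_distributions(n, x))   # 84
print(comb(n + x - 1, x))          # 84, agrees with the closed form
```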

In Scenario 2, you have $n$ identical atoms with energy $x/2$ each and one atom with energy $y + nx/2$. Similar to Scenario 1, you can calculate the number of microstates $\Omega_2$ for Scenario 2 using the same counting argument.

$\Omega_2$ is the number of ways to distribute $x/2$ energy units among the $n$ identical atoms, multiplied by the single way of assigning $(y+nx/2)$ energy units to the one remaining atom.

So, to directly compare the entropy of these two scenarios, you would need to compute the entropy using Boltzmann's formula for each scenario, taking into account the appropriate $\Omega$ value for each.
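For instance, with made-up integer values of $n$ and $x$ and the counting described above, the comparison would look like this (again just a sketch; the last atom contributes a factor of 1 in both scenarios, so $y$ drops out of the count):

```python
# Sketch: plug each scenario's microstate count into S = k_B * ln(Omega)
# and compare, using the counting described above and made-up integer values.
from math import comb, log

k_B = 1.380649e-23  # J/K

n, x = 4, 6
Omega_1 = comb(n + x - 1, x)              # distribute x units over n atoms (times 1 for the last atom)
Omega_2 = comb(n + x // 2 - 1, x // 2)    # distribute x/2 units over n atoms (times 1 for the last atom)

S_1 = k_B * log(Omega_1)
S_2 = k_B * log(Omega_2)
print(Omega_1, Omega_2)   # 84 vs 20
print(S_1, S_2)           # different Omega -> different entropy under this counting
```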

The misconception might be in assuming that the two scenarios have the same entropy just because there is a bijection between their microstates. While the number of microstates may be the same, the energies of the atoms are different, and this affects the entropy calculation. Entropy is a measure of the distribution of energy among particles, and the same number of microstates does not necessarily mean the same entropy when the energy distributions are different.