7

In any sizable system, the number of equilibrium states is much, much greater than the number of non-equilibrium states. Since each accessible microstate is equally probable, it is overwhelmingly likely that we will find the system in an equilibrium state.

However, for a closed system, one that does not interact with any external system, the number of microstates is fixed, and therefore so is the entropy.

At any given time, the system will be in one of its microstates. Even if the system is in a microstate that does not look like equilibrium from a macroscopic point of view, its entropy will remain the same, since entropy is a property of the set of all accessible microstates, not of any one microstate.

So, is the number of accessible microstates (and hence the entropy) of an isolated system constant?

Qmechanic
  • 220,844
yalis
  • 1,025

5 Answers

5

Isolated system: Since the matter, energy, and momentum are fixed, the total number of microstates that satisfy these constraints is fixed. So is the entropy constant? Yes, if the system is in equilibrium. No, if the system is not in equilibrium. What does that mean in terms of microstates?

If the system is in equilibrium, all these microstates are equally probable and the system visits each of them over the course of time (the ergodic hypothesis). Therefore each microstate has equal probability and $S=k \ln\Omega$. In such a state, no further increase in entropy is possible.
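Here is a minimal sketch of this counting for a toy system (my own illustration; the four-spin setup and all numbers are invented for the example): an isolated set of four two-state spins whose fixed energy forces exactly two of them to be "up". Every microstate compatible with that constraint is enumerated and weighted equally, and $S = k\ln\Omega$ follows directly:

```python
import math
from itertools import product

k = 1.380649e-23  # Boltzmann's constant in J/K (the k in S = k ln Omega)

# Toy isolated system: 4 two-state spins whose fixed energy forces exactly 2 "up".
N, n_up = 4, 2
microstates = [s for s in product((0, 1), repeat=N) if sum(s) == n_up]
Omega = len(microstates)   # number of accessible microstates
S = k * math.log(Omega)    # Boltzmann entropy

print(f"Omega = {Omega}, each visited with probability {1 / Omega:.3f}")
print(f"S = k ln Omega = {S:.3e} J/K")
```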

If the system is not in equilibrium, it does not have equal probability of being in every microstate. In fact, if the system is stuck out of equilibrium (e.g., a hot part and a cold part of a box separated by a thermally insulating wall), it cannot access some microstates at all, so it is restricted to fewer microstates. Technically, there is then no unique global thermodynamic state for the whole system, and you cannot define its entropy directly. But you can calculate an entropy by summing the entropies of the different local-equilibrium parts (e.g., the entropy of the hot part and of the cold part separately). Since the total number of accessible microstates is smaller, this entropy is less than what it would be if you removed the thermally insulating wall and let the system equilibrate.

Thus, when the system equilibrates, more microstates become accessible: think of the system spreading out in phase space. Entropy therefore increases. Once the system has reached complete thermodynamic equilibrium, no further entropy increase is possible. All allowed microstates have been made accessible and equally probable!
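To make the wall-removal argument concrete, here is a hedged numerical sketch using two small Einstein solids (a standard toy model; the oscillator and quanta counts are arbitrary choices of mine, not from the answer above). An Einstein solid with $N$ oscillators and $q$ energy quanta has multiplicity $\Omega(N,q)=\binom{q+N-1}{q}$. With the insulating wall in place, each side's energy is fixed separately; removing the wall makes every split of the quanta accessible, and the entropy jumps:

```python
from math import comb, log

def omega(N, q):
    """Multiplicity of an Einstein solid: ways to place q quanta in N oscillators."""
    return comb(q + N - 1, q)

N1, N2 = 5, 5   # oscillators in the "hot" and "cold" parts
q1, q2 = 8, 2   # energy quanta on each side while the wall is in place

# Wall present: each side's energy is fixed, so the multiplicities multiply.
omega_wall = omega(N1, q1) * omega(N2, q2)

# Wall removed: every split of the q1+q2 quanta becomes accessible.
omega_free = sum(omega(N1, q) * omega(N2, q1 + q2 - q) for q in range(q1 + q2 + 1))

print(f"Omega (wall in place) = {omega_wall}")    # 7425
print(f"Omega (wall removed)  = {omega_free}")    # 92378
print(f"Entropy change / k    = {log(omega_free / omega_wall):.2f} > 0")
```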

Sankaran
  • 1,702
2

Since entropy is

$$S=k_b \ln \Omega,$$

it is synonymous with a certain phase space volume $\Omega$ which we have to fix somehow. Phase space volume is essentially a count of the possible states of the system (in the simplest case, the positions and momenta of all particles). This is normally done by setting some macroscopic parameters, like temperature, pressure, number of particles, etc.

So, if you have a closed system (i.e. a box where nothing can get in or out), you can e.g. fix volume, energy and the number of particles. You end up with $S(E,V,N)=k_b \ln \Omega(E,V,N)$.

This can be understood as the number of all possible configurations inside the box which have the total energy $E$, fit into the total volume $V$ and consist of $N$ particles - including all the stuff in the box being on the left side, on the right side, evenly distributed, half of it moving and the other half standing still, etc. $S$ is a constant number because $E$, $V$ and $N$ are fixed in a closed system.

The problem is that this alone lets you make no statement about what is going on inside the box. But instead of $S(E,V,N)$, you can look at other dependencies of $S$. Let's say you divide the box into two equally sized parts, a left part and a right part. $N_1$ is the number of particles in the left part, and $N_2$ is the number of particles in the right part (so $N_1+N_2=N$).

Now you can calculate $S(E,V,N_1,N_2)$ for given values of $N_1$ and $N_2$. That way, $S$ can change, because $N_1$ and $N_2$ can change (particles may go from the left part to the right part and vice versa). The important point is that $S$ is much greater for $N_1\approx N_2$ than for any other values.

This means that most of the possible configurations have approximately the same number of particles on the left and on the right side. Because every configuration is equally probable, the system will look like this most of the time - but not all the time. Even when $S(E,V,N_1,N_2)=0$, so that there is exactly one microscopic configuration which fulfils the macroscopic parameters, that configuration will occur at some time (if the system is ergodic, but that is a different story). In the given example, the entropy will increase and decrease all the time as particles cross from the left half to the right half.

The fraction of time the system spends in a specific configuration $N_1$, $N_2$ is then proportional to $\Omega(E,V,N_1,N_2)=\exp\left( \frac {S(E,V,N_1,N_2)} {k_b}\right)$ (I just thought of this last formula, please correct me if I'm wrong).
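A short sketch of this counting (my own illustration, with $E$ and $V$ suppressed for simplicity): if each of $N$ particles is independently in the left or right half, then $\Omega(N_1)=\binom{N}{N_1}$, so $S(N_1)=k_b\ln\binom{N}{N_1}$ peaks sharply at $N_1\approx N/2$, and the fraction of time spent at a given split is $\Omega(N_1)/2^N$, which is exactly the proportionality above:

```python
from math import comb, log

N = 20  # total particles; each of the 2**N left/right assignments is equally likely

for n1 in range(N + 1):
    Omega = comb(N, n1)        # microstates with n1 particles in the left half
    S_over_k = log(Omega)      # S(N1)/k_b = ln Omega(N1); zero when Omega = 1
    time_frac = Omega / 2**N   # fraction of time spent with this split
    print(f"N1={n1:2d}  S/k_b={S_over_k:6.3f}  time fraction={time_frac:.2e}")
```

The printout shows $S=0$ (a single configuration) at $N_1=0$ or $N_1=N$, occupied only about $10^{-6}$ of the time, while $N_1=10$ alone accounts for roughly 18% of it.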

In conclusion: The term entropy is meaningless without the parameters one wants to fix/vary.

Funny example: let's calculate $S(N_{\textrm{people alive on earth}})$ for the whole universe, which counts the possible configurations of the universe for a given number of living people on earth. $S(0)$ will probably be much bigger than for any other $N_{\textrm{paoe}}$, which means that, most of the time, there will be no people on earth and the entropy will be huge (this is my interpretation of the heat death of the universe). Then they might come back, and the entropy will be very small.

zonksoft
  • 1,805
1

Yes, if we say the entropy is just the log of the number of accessible microstates, then the entropy cannot change for an isolated system as it evolves. But this contradicts common sense. For an isolated system, we must introduce a notion of coarse graining for entropy to be a useful concept. Coarse graining can be done by dividing the system into subsystems and then adding the entropies of the subsystems. This coarse-grained entropy can and will change over time, as energy is exchanged between the different subsystems. The exact value of the coarse-grained entropy will in general depend on the details of how you form the subsystems. The coarse-grained entropy should eventually approach the original fine-grained entropy as the system reaches equilibrium.
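A hedged sketch of one way to do this (my construction, not the answerer's; it uses the Shannon form of the coarse-grained entropy over spatial cells, and every parameter is a toy value): divide a one-dimensional box into cells, start all random walkers in the leftmost cell, and watch the coarse-grained entropy rise toward its equilibrium value of $N\ln(\text{cells})$ even though the fine-grained microstate count never changes:

```python
import math
import random

random.seed(0)
N_CELLS, N_PART, STEPS = 10, 1000, 2000
pos = [0] * N_PART  # far-from-equilibrium start: every particle in the leftmost cell

def coarse_entropy(positions):
    """Coarse-grained entropy / k: N times the Shannon entropy of cell occupations."""
    counts = [0] * N_CELLS
    for p in positions:
        counts[p] += 1
    return -N_PART * sum(c / N_PART * math.log(c / N_PART) for c in counts if c)

for step in range(STEPS + 1):
    if step % 500 == 0:
        print(f"step {step:4d}: S/k = {coarse_entropy(pos):7.1f}")
    # each particle hops to a random neighboring cell (reflecting walls)
    for i in range(N_PART):
        pos[i] = min(N_CELLS - 1, max(0, pos[i] + random.choice((-1, 1))))

print(f"equilibrium value: S/k = {N_PART * math.log(N_CELLS):.1f}")
```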

user1631
  • 5,027
0

The entropy can decrease as well. On average the entropy increases (see the answers above), but it is possible for the entropy to decrease in a fluctuation.
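This is easy to see numerically for a small system. A hedged sketch (my own illustration; it is the Ehrenfest urn model with toy numbers): ten particles hop one at a time between the two halves of a box, and the entropy $S/k=\ln\binom{N}{n_\text{left}}$ moves down as well as up:

```python
import math
import random

random.seed(1)
N = 10
n_left = N  # start with all particles on the left: S/k = ln C(10, 10) = 0

prev = 0.0
for step in range(1, 31):
    # Ehrenfest urn: pick a random particle and move it to the other half
    if random.random() < n_left / N:
        n_left -= 1
    else:
        n_left += 1
    S = math.log(math.comb(N, n_left))
    print(f"step {step:2d}: n_left={n_left:2d}  S/k={S:5.2f}  {'down' if S < prev else 'up/flat'}")
    prev = S
```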

Willi
  • 1
0

I believe it depends on your definition of equilibrium. If you fully describe all parts of an isolated system macroscopically, and this description does not change over time, then yes, entropy is already at its maximum.

What confuses me about some of the given answers is that they describe states of the system that should yield different macroscopic measurements. For instance, consider the example where half of the particles are standing still, even though the energy (E), number of particles (N), and volume (V) are the same as in equilibrium. In such a case, the pressure (P) will still be evolving significantly over time. Therefore, this microstate should not be counted in the computation of the entropy, since that computation includes only equilibrium states.

If you consider as equilibrium all states with energy E, then in an isolated system all feasible microstates will be equilibrium states - but you are just neglecting other macroscopic degrees of freedom. Of course, isolated systems can be out of equilibrium; as was pointed out, see our universe.