3

Suppose we fill a box with a certain amount of gas with a specific total energy and allow the gas to reach a maximum-entropy state. What happens next?

Would the gas remain in a maximum entropy state indefinitely?

What would prevent the gas atoms/molecules from ending up in a more orderly state at some point, just by coincidence? After all, the gas particles have some total energy in this closed system and will keep moving around.

Is it even possible that, if we waited a long, long time, those gas molecules could fall back into a state of minimum entropy at some point?

edit: Just how many maximum-entropy states are there in a gas with N particles inside a closed system, versus lower-entropy states? Would it even be more likely for the gas to decrease in entropy than to remain in a maximum-entropy state? If yes, then this would be a clear violation of the 2nd law of thermodynamics, and the law would have to be stated more precisely.

pZombie
  • 379

5 Answers

3

If you put a gas in a box and wait for it to reach equilibrium, then (a) its full behaviour from then on is described by equilibrium statistical mechanics, and (b) it will remain in this state, as described by equilibrium statistical mechanics, forever (really forever) if nothing is done to it.

The key point is that, although it carries "equilibrium" in its name, equilibrium statistical mechanics allows (in the sense that it does not forbid) gas states in which the particles are all lumped in a single corner of the box.

This can seem counter-intuitive because we would imagine that, say, the density is uniform at equilibrium, while in fact uniformity is only very probable.
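To put a rough number on "only very probable" (a back-of-the-envelope sketch, not from the answer itself): for N non-interacting particles, the probability that all of them happen to be in, say, the left half of the box at a given instant is (1/2)^N, which is already tiny for N = 100 and absurdly small for a mole of gas.

```python
# Probability that all N non-interacting particles are found in the
# left half of the box at one instant: p = (1/2)**N.
# We work with log10(p) to avoid floating-point underflow at large N.
from math import log10

for n in [10, 100, 6.022e23]:
    log_p = n * log10(0.5)
    print(f"N = {n:.3g}: p = 10^({log_p:.3g})")
```

For N = 10 this gives p = 1/1024, small but observable; for Avogadro's number of particles the exponent is of order -10^23, which is why the lumped states allowed by equilibrium statistical mechanics are never seen in practice.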

The way to look at it is the following (which I got from a very nice review by Oliver Penrose):

  • When you consider a single system over a certain time, it will first reach "equilibrium" in the sense that it explores mostly the same region of phase space (or a region that is essentially self-averaging), until it eventually reaches a state outside this self-averaging region. One obvious time scale for such a thing to happen is the recurrence time.

  • Now, if you repeat the experiment many times, you will never start from the same microstate, i.e. you cannot control the initial conditions of the microstate exactly.

  • As a consequence, there will be a strong dispersion in the recurrence times that can be observed, and thus seemingly unlikely recurrence events can occur.

  • One way to account for these recurrences and the strong dispersion in their observation is simply to allow them to occur at equilibrium; they then correspond mainly to the tails of the distribution, which are very, very unlikely but can still happen.

In equilibrium statistical mechanics, a system that "goes out of its way" is simply said to fluctuate, and as long as those fluctuations are quantitatively described by equilibrium statistical mechanics, they are called equilibrium fluctuations and are perfectly normal.
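A standard toy model for such equilibrium fluctuations (my illustration, not part of the answer) is the Ehrenfest urn: N labelled particles occupy the two halves of a box, and at each step one uniformly random particle hops to the other half. Started far from equilibrium, the left-half occupation relaxes toward N/2 and then fluctuates around it forever, exactly in the spirit described above.

```python
import random

def ehrenfest(n_particles=50, n_steps=20000, seed=1):
    """Ehrenfest urn model: at each step a uniformly random particle
    hops to the other half of the box. Returns the time series of the
    left-half occupation number."""
    rng = random.Random(seed)
    left = n_particles  # start far from equilibrium: all particles on the left
    series = []
    for _ in range(n_steps):
        # The chosen particle is on the left with probability left / n_particles.
        if rng.random() < left / n_particles:
            left -= 1
        else:
            left += 1
        series.append(left)
    return series

series = ehrenfest()
tail = series[5000:]  # discard the initial relaxation toward equilibrium
mean = sum(tail) / len(tail)
print(f"mean left-half occupation ~ {mean:.1f} (N/2 = 25)")
print(f"min/max in equilibrium: {min(tail)}/{max(tail)}")
```

The min/max values show excursions well away from N/2; larger excursions (up to full recurrence of the initial state) are allowed too, just with waiting times that grow explosively with N.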

gatsu
  • 7,550
2

The gas could fall into a state of low entropy randomly. It is important to remember that the laws of thermodynamics are probabilistic: they do not say what will happen, only what will usually happen.

Jimmy360
  • 4,019
1

The second law of thermodynamics says that the entropy of an isolated system can increase, but that entropy cannot decrease without the addition of energy to the system or the transfer of entropy to another system.

Increasing entropy has been associated with the arrow of time, as entropy seems to be the only quantity in physical processes that requires time to be directional.

I am unsure how one could determine whether an isolated system had reached "maximum entropy", as that term seems to have several technical meanings, some of which are used in the social sciences:

http://en.wikipedia.org/wiki/Maximum_entropy

Ernie
  • 8,598
1

I want to post the model I currently have in mind (after reading your answers) and see whether you agree or disagree with it.

A gas in a box (an idealized closed system) has N possible states, each with a certain entropy value. Given that the gas is in one of those N states, there are X states with a lower entropy value, and we could calculate the total probability of the gas reaching one of those X states.

There are also Y states with a higher entropy value, and we could likewise calculate the total probability of the gas reaching one of those.

total states N = X+Y (+1?)

Each of the X and Y states has its own probability value, depending on which state the gas is currently in. (I am not sure whether there are states with zero probability, depending on the current state of the gas; input appreciated.)

The gas could jump to one of the X states or one of the Y states, hence reach a state of lower or higher entropy, at any given time. However, it is more likely to reach one of the Y higher-entropy states if the combined probability of all Y states is larger (and vice versa).

The point where reaching one of the X states is just as likely as reaching one of the Y states, i.e. where falling into a state of higher or lower entropy is equally likely, is what some seem to call the equilibrium state (I would rather call it the equilibrium point). This is NOT the maximum-entropy state of a gas.

In fact, reaching a maximum-entropy state seems highly unlikely for a gas, just as it is highly unlikely for a gas to reach a minimum-entropy state. Neither is impossible, given enough time.
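One way to make this X/Y counting concrete (a toy model I am adding for illustration, not part of the question): label each of N particles as sitting in the left or right half of the box, so there are 2^N microstates. The entropy of the macrostate with k particles on the left is set by the number of microstates C(N, k) realizing it. Counting shows that the maximum-entropy macrostate (k = N/2) is realized by an enormous number of microstates, and almost all microstates lie at or very near maximum entropy:

```python
# Toy microstate counting: N particles, each in the left or right half.
# The macrostate "k particles on the left" contains C(N, k) microstates.
from math import comb

N = 100
total = 2 ** N  # all microstates

# Fraction of microstates whose left-half count k is within 10 of N/2
near_max = sum(comb(N, k) for k in range(40, 61))
print(f"fraction of microstates with |k - N/2| <= 10: {near_max / total:.4f}")

# Microstates at exactly maximum entropy (k = N/2) vs. minimum (k = 0)
print(f"k = 50: {comb(N, 50):.3e} microstates")
print(f"k = 0 : {comb(N, 0)} microstate")
```

Already for N = 100, the overwhelming majority of microstates sit within a narrow band around k = N/2, while the minimum-entropy macrostate is a single microstate out of 2^100. This is why "reaching maximum entropy" is not unlikely at all in the macrostate sense: nearly every state the gas can jump to belongs to the maximum-entropy band.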

A gas inside a box will always tend to move towards the equilibrium point because of the above considerations.

Once at the equilibrium point, where the next state of the gas is just as likely to be one of higher entropy as one of lower entropy, the 2nd law of thermodynamics has no meaning at all, not even in the form "a closed system is more likely to increase in entropy".

edit: Furthermore, once at the equilibrium point, nothing stops the gas from reaching a state of minimum entropy, given enough time. It is just not very likely to happen within a short time (though not impossible). Given enough time, however, it is a given that it will happen at some point.

Feel free to toast me now, and hopefully tell me where I am wrong so I can adjust my worldview if required.

pZombie
  • 379
1

This question is based on a fundamental misunderstanding of entropy. Entropy is not one state; it is many states that are functionally equivalent to each other. The behaviour of large-scale systems that we describe with entropy is really the result of our inability to distinguish between unique states that seem identical to each other.

If you look at a closed container of gas, the expected high-entropy state is that collisions will be distributed evenly and pressure will be uniform over any surface. However, as the regions you examine become smaller and smaller, you reach a threshold where the probability of a collision within a region decreases to the extent that you will see "quantum" effects, i.e. non-uniform behaviour. The divide between the large scale and the quantum scale is artificial, due to our inability to distinguish between equivalent states at the large scale.

The high-entropy state that you describe in your example above is not one state; it is the system moving from one state to another, each of which is functionally the same.