
I have a classical gas in a box with adiabatic walls so it's isolated from its environment, and at $t=0$ I (magically) record the exact microstate of all the particles. Since this is classical, the behavior may be chaotic but it is theoretically deterministic.

It seems like I have two descriptions of this system. One is the standard statistical mechanical description with 6N degrees of freedom, and another is "since the state is known and deterministic, there's exactly 1 possible state, and 0 DOF." If we say that the entropy is nonzero at some point, then there are multiple possible states that arose from the single known state, which breaks determinism.

But of course, they're the same box, so they should have the same temperature, entropy, etc. (Or at least, if we put them in contact with another system, they should behave the same way.) How do we resolve this? Is the second description of the system 'not in thermodynamic equilibrium' because by knowing the microstate, those motions aren't microscopic anymore? Do we have to somehow "lose track of time" when considering the entropy of the ensemble?

Kaia

2 Answers


The physical system will do what it will do regardless of how we describe it. Your knowing the microstate will not affect how the system behaves.

You have discussed two different frameworks you can use to describe a system in thermal equilibrium.

The "classical mechanics" framework is that if you know the position and momentum of every particle (ie, the microstate), then you can evolve it using Newton's laws and know the state at every future time. From that, you can compute any observable you want (in principle). For example, if we put a pressure sensor in the gas, you could work out how each particle would interact with the sensor, and derive the precise reading as a function of time.

The "thermodynamics" framework is an effective description that we use when we coarse grain over irrelevant or unknown microscopic degrees of freedom. We start with the observables we actually can measure about the gas, such as the temperature, pressure, and volume, and think about what those look like in an equilibrium state and how they are related to each other. Using statistical mechanics, we can say that an ensemble of microstates subject to certain constraints will have a given average value for, say, pressure. Because the number of particles is so large, the fluctuations around this average are also small. In this framework, we assume that we don't know anything about the microstate, beyond the macroscopic observables. You could try to force your full knowledge of the microstate into this framework, and think of a distribution highly peaked around your microstate (with a zero or very small entropy), but you would be running against the spirit of this way of thinking.

Even though we give up information in the thermodynamics/statistical mechanics description, the two frameworks are consistent when they are both valid. If you measure the microstate (and didn't arrange for the gas to be in some super special state), then your microstate will simply be one draw from the distribution considered in statistical mechanics. Since we know the average of observable quantities in that distribution, and since the fluctuations around those averages are very small, we expect the observables predicted by your microstate to match the statistical-mechanics averages to a very good approximation.
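A small numerical illustration of "one draw from the distribution" (my sketch, assuming $k_BT = m = 1$): the kinetic energy per particle of any single Maxwell-Boltzmann draw sits within $\sim 1/\sqrt{N}$ of the ensemble value $\tfrac{3}{2}k_BT$.

```python
import numpy as np

# Sketch (assumptions: kT = m = 1): each "measured microstate" is one draw
# from the Maxwell-Boltzmann velocity distribution; compare the kinetic
# energy per particle of each single draw with the ensemble average 3/2.
rng = np.random.default_rng(1)

for N in (100, 10_000, 1_000_000):
    draws = [0.5 * np.mean(np.sum(rng.normal(size=(N, 3)) ** 2, axis=1))
             for _ in range(5)]               # five independent microstates
    rel_spread = np.std(draws) / 1.5          # relative to <E>/N = 3/2
    print(N, np.round(draws, 4), f"relative spread ~ {rel_spread:.1e}")
# The spread shrinks like 1/sqrt(N): for large N, any one microstate
# reproduces the ensemble average to high accuracy.
```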

Of course, there are questions for which you do need the microstate, and thermodynamics/statistical mechanics will not be sufficient. For example, you could ask for the momentum of particle $j$ at time $t$. So knowing the microstate does strictly give you more information. However, these questions are usually not interesting, and not even observable in practice. Another situation where you would benefit from knowing the microstate is if you started the system in some bizarre, atypical state, like having all the particles of the gas in one corner of the container. This isn't strictly inconsistent with statistical mechanics, but you will get results far from the average because you started in a state that was assumed to be very unlikely, so the statistical-mechanics predictions will be inaccurate. However, the system will tend toward equilibrium, and sooner or later reach a more typical state closer to the average.
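That relaxation from an atypical state can be seen directly in a toy simulation (again my sketch, using the same idealized gas as above): start every particle in one corner and watch a macroscopic observable drift to its equilibrium value.

```python
import numpy as np

# Sketch (assumptions: same ideal 2D gas with elastic walls as above):
# start all particles in one corner -- a valid but extremely atypical
# microstate -- and track the fraction in the left half of the box.
rng = np.random.default_rng(2)
N, L, dt = 5000, 1.0, 1e-3

x = rng.uniform(0, 0.1 * L, size=(N, 2))   # everyone in one corner
v = rng.normal(0, 1.0, size=(N, 2))

for step in range(4001):
    x += v * dt
    for d in range(2):                     # elastic wall reflections
        lo, hi = x[:, d] < 0, x[:, d] > L
        x[lo, d] *= -1
        x[hi, d] = 2 * L - x[hi, d]
        v[lo | hi, d] *= -1
    if step % 800 == 0:
        print(step * dt, np.mean(x[:, 0] < L / 2))
# The fraction starts at 1.0 and settles near 0.5 (with ~1/sqrt(N)
# fluctuations): the deterministic dynamics carries the atypical
# microstate toward the overwhelmingly more numerous typical ones.
```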

Andrew

Entropy, unlike energy or volume, is not a quantity that is unique to a physical system. It is bound to its description.

I like this example from stochastic thermodynamics, where you have the following system:

$$dX/dt = F(X)+\eta(t)\tag 1$$

where $\eta$ is Gaussian noise with correlation $\langle\eta(t)\eta(0)\rangle = e^{-|t|/\tau}/\tau$. $X$ can represent whatever you like: the position of a particle in a complex environment, for example.

The dynamics of $X$ is equivalently described by the pair of equations:

$$dX/dt = F(X)+\Omega\\ d\Omega/dt=-\Omega/\tau+\zeta(t) \tag 2$$

where $\zeta$ is white noise with $\langle\zeta(t)\zeta(0)\rangle = (2/\tau^2)\,\delta(t)$ (this normalization reproduces the correlation of $\eta$). The equivalence between (1) and (2) follows from integrating the dynamical equation for $\Omega$. Now, you can assign an entropy to these systems (called a stochastic entropy). And even though the dynamics of $X$ is exactly the same in both descriptions, the entropy production will be different. Why? Because in one case we have more information about the system: (1) is a marginalization of (2). You can learn more about this in "Nonequilibrium and information: The role of cross correlations" if you wish.
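To spell out that integration step (my addition; other noise normalizations appear in the literature): the stationary solution of the $\Omega$ equation is

$$\Omega(t)=\int_{-\infty}^{t} e^{-(t-s)/\tau}\,\zeta(s)\,ds \quad\Longrightarrow\quad \langle\Omega(t)\Omega(0)\rangle = \frac{2}{\tau^2}\int_{-\infty}^{0} e^{-(t-s)/\tau}\,e^{s/\tau}\,ds = \frac{e^{-|t|/\tau}}{\tau}$$

for $t\ge 0$ (and by symmetry for $t<0$), so $\Omega$ has exactly the statistics of $\eta$: integrating out $\Omega$ in (2) recovers (1).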


Back to your question. The usual conceptual issue with thermodynamic entropy is that, in this formalism, entropy seems to be a property of the system itself, like energy or volume. That is partly true, but it still depends on what you choose to include in the description of your system. If you have a gas of electrons, do you want to include the magnetization in the description or not? This will affect the entropy. So in this sense the thermodynamic entropy is already not unique. Another conceptual issue often raised is that entropy is related to heat flow, and heat is something with a "hard" physical meaning (not something that depends on probability, marginalization, etc.). This is also false! Heat is energy passing into degrees of freedom that have been marginalized over, or equivalently, energy that is lost to potential use because it went into uncontrolled degrees of freedom. But if you know everything about your system, there is no heat, since you keep track of all your energy and know where it is. This is discussed in the book Stochastic Energetics by Ken Sekimoto (a great mind!), who provides an example in which different descriptions of the same system lead to different heat productions.


To answer your question: it does not make sense to talk about *the* entropy of a system. I would go as far as to say that the thermodynamic entropy of a system is not well defined, because it depends on which macroscopic variables you want to use. If you want to talk about entropy, you have to specify the ensemble, or probability distribution, associated with it.

Syrocco