The most common definition I’ve heard of entropy in physics is the number of micro-states for a given macro-state. Most examples use the atomic scale as the micro-setting and some kind of simple, closed, multi-atom system (such as gas in a box) as the macro scale.

While this generally makes sense, I’m really struggling to understand how to think about this without subjectively defining specific macro-states of interest, and a specific scale for the micro-states. I understand that a low-entropy macro-state such as “all gas in the corner of the box” will likely become a high-entropy macro-state such as “gas diffused throughout the box”, but this feels a bit circular, as I would really only be able to recognize (and assign value to) low-entropy (i.e. structured, at my perceptive scale) states. My questions are as follows:

  1. For an objective viewer at an arbitrary macro scale, how does one define a set of macrostates to evaluate the entropy for?

  2. Is the micro state always defined in terms of atoms, or sub-atomic particles? Can it be generalized to any scale < macro with the second law still holding?

FYI I’m a mathematician (working in information theory) and have only taken a few physics courses in my career, so apologies for the elementary question.


4 Answers

A macrostate can be fixed by "macroscopic" thermodynamic variables (how many of them depends on the system). Macroscopic in this case means averaged over "infinite time". (Notice this makes sense, since we define thermodynamics as the physics of equilibrium, i.e. no macroscopic time dependence by definition. You may or may not agree, and there are non-equilibrium formulations too.)

Alright, here is an example. If I have an isolated ideal-gas system, as you considered, then I have the average total energy (conserved in this case) and the volume. Given the macrostate fixed by these two variables, how many microstates, i.e. how much phase-space volume (positions and momenta), do I have?

A scale is indeed necessary, since those are dimensionful quantities (the conventional choice being $\hbar$). But this is arbitrary: as long as you and I agree to use the same scale, an entropy change will always have the same sign.

Besides, even in information theory, you can choose any logarithm base you want.
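To see concretely why the choice of scale does not matter for entropy *changes*, here is a minimal Python sketch. It assumes the ideal-gas configurational entropy $S/k_B = N\ln(V/\lambda^3)$, where $\lambda$ is an arbitrary length scale; the particle number and volumes are made up for illustration:

```python
import math

def config_entropy(N, V, scale):
    """Configurational entropy (in units of k_B) of an ideal gas:
    S/k_B = N * ln(V / scale**3), where `scale` is an arbitrary
    length used to make the phase-space volume dimensionless."""
    return N * math.log(V / scale**3)

N = 1000
V1, V2 = 1.0, 2.0  # gas expands to twice its volume

for scale in (1e-10, 1e-9, 1e-8):  # three different micro-scale choices
    dS = config_entropy(N, V2, scale) - config_entropy(N, V1, scale)
    print(scale, dS)  # dS = N*ln(2) for every choice of scale
```

Changing `scale` shifts each entropy by the same additive constant, so it cancels in any difference; only absolute entropies depend on the convention.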

  1. For an objective viewer at an arbitrary macro scale, how does one define a set of macrostates to evaluate the entropy for?

In thermodynamics, the set of variables that the fundamental entropy function is a function of depends on the system and the level of detail we choose to describe the system with; but it always starts with internal energy $U$, volume $V$, and continues with additional variables, characterizing numbers of particles of different kind, and possibly additional volume-like quantities (external constraints), such as external magnetic field induction, or external gravity field strength. This works for systems in a lab, but does not work for astronomic systems which are not in equilibrium and which we can't do experiments with, such as solar systems or galaxies.

The above works for equilibrium systems. Out of equilibrium, when the system can be divided into many small parts which still have a well-defined temperature varying across the system, we can generalize and introduce a description of the macrostate in terms of energy and population counts in cells: if volume is constant, we can have $(U_i,\{N_i\})$. Or we can describe the system in terms of continuous densities on a continuous region $V$: $(u(\mathbf x), n(\mathbf x))$. This is similar to the coarse-graining description in statistical physics, but it is different: we are not smoothing out a complicated density in phase space, but instead just introducing a more detailed description of the macrostate. There is no restriction on the size or scale of these cells, except that if we want to make a link to thermodynamics, they should still contain many particles, so that the energy inside a cell is much greater than the energy of interaction with the other cells.

In statistical physics, we then use the above chosen macroscopic description variables as constraints on the microscopic probability description, either phase space function $\rho(q,p)$ in classical theory, or density matrix $\rho_{ik}$ in quantum theory. Then we can seek the probability function/density matrix that maximizes the information entropy functional.

We can also introduce cells in phase space that are small on the macroscopic scale but still much greater than $h^{3N}$, introduce a coarse-grained probability/population description over these cells, and use it as constraints in the maximization problem to find the most likely state and the corresponding coarse-grained entropy of statistical physics. This should work at almost any scale.
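As a toy illustration of this maximization, here is a Python sketch that finds the maximum-entropy distribution over a handful of hypothetical discrete energy levels, subject to a fixed mean energy as the macroscopic constraint (the levels and the constraint value are invented; the result is the familiar Boltzmann distribution):

```python
import math

# Hypothetical "microstate" energies (arbitrary units) and a macroscopic
# constraint: the mean energy is fixed at U_target.
energies = [0.0, 1.0, 2.0, 3.0]
U_target = 1.2

def boltzmann(beta):
    """Probabilities p_i ∝ exp(-beta * E_i): the distribution that
    maximizes Shannon entropy subject to a fixed mean energy."""
    w = [math.exp(-beta * E) for E in energies]
    Z = sum(w)
    return [x / Z for x in w]

def mean_energy(beta):
    return sum(p * E for p, E in zip(boltzmann(beta), energies))

# Solve mean_energy(beta) = U_target by bisection; mean energy is
# strictly decreasing in beta, so the bracket [-50, 50] suffices here.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if mean_energy(mid) > U_target:
        lo = mid
    else:
        hi = mid
beta = 0.5 * (lo + hi)

p = boltzmann(beta)
S = -sum(x * math.log(x) for x in p)  # information entropy of the macrostate
print(beta, S)
```

The same pattern (constraints from the macroscopic description, entropy maximized over the microscopic probabilities) is what the density-matrix version does in the quantum case.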

  2. Is the micro state always defined in terms of atoms, or sub-atomic particles? Can it be generalized to any scale < macro with the second law still holding?

It need not be atoms or particles; it can be any detailed description of microstates to which we can assign probabilities in a consistent way. One can try to apply the same methods to a field with an infinite number of degrees of freedom, but problems may arise, e.g. some variant of the UV catastrophe, or the lack of a consistent measure on the space of field states.

The second law is not guaranteed to hold, though. It seems to hold for systems in our world, but in a reversible model of the microworld this can only be explained by a special initial condition, not by the equations of the model. In non-reversible models, such as stochastic/kinetic models, which break reversibility by construction, non-decrease of entropy can be shown to follow from the model itself.


Given your mathematical background, I believe you will find a mathematical answer more appealing than a physical one.

For an objective viewer at an arbitrary macro scale, how does one define a set of macrostates to evaluate the entropy for?

To understand micro- and macrostates, we are going to start with the basics: a physical system is any set of physical objects which we have chosen to examine. Every physical system has properties, which we call parameters if they can be quantified mathematically. For example, the pressure of a gas is a parameter because it can be quantified by a number. Other parameters of a gas are the positions of its particles, since those can be quantified by vectors. However, a gas's colour is not a parameter, because colour is a percept (the closest thing to a parameter is the set of wavelengths of light which the gas reflects, but these are still not the colour itself, for different people will perceive them as different colours).

Depending on which set of parameters of a system we choose to examine, we have different parameterisations or descriptions of it. Note that these always describe the same system but represent different aspects of it. Given a parameterisation with $n$ parameters $P = (p_1,\cdots,p_n)$, we call any $n$-tuple of values $(v_1,\cdots,v_n)$ for the parameters $p_1,\cdots,p_n$ a physical state of the system in the parameterisation $P$. The set of all possible physical states of the system in $P$ is known as its phase space, and I will denote it as $S(P)$.

To talk about entropy, we always need two different parameterisations of the same system. Suppose we have two parameterisations $P = (p_1,\cdots, p_n)$ and $P' = (p_1',\cdots,p_m')$ of the same system, such that the phase space of $P$ is bigger than the phase space of $P'$ and there is a surjection $f: S(P) \to S(P')$ from the phase space of $P$ to the phase space of $P'$. This just means that if we know the values of the parameters in $P$, we can calculate the values of the parameters in $P'$. In this case, we call any physical state in $S(P')$ a macrostate and any physical state in $S(P)$ a microstate. The multiplicity $\Omega$ of a given macrostate $M \in S(P')$ is the number of microstates which correspond to it, i.e. it is the number of physical states $m \in S(P)$ such that $f(m) = M$.

Since, in real physical scenarios, multiplicities can become very large, we introduce the concept of entropy. The entropy $S$ of the physical system described by $P$ and $P'$ is defined as the Boltzmann constant $k_B$ times the natural logarithm of the multiplicity $\Omega$ of the system's current macrostate.

$$S \overset{\text{def}}{=} k_B \cdot \ln \Omega$$

The Boltzmann constant is there for historical reasons, and the logarithm gives us a relatively small number to work with, which is easier than working with multiplicities directly. And... this is basically it - that's all there is to entropy. It is just a relation between two different parameterisations which we choose to describe a physical system. Note that the choice of the two parameterisations $P$ and $P'$ results in different entropies.
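A minimal Python sketch of these definitions, using a toy system of $N$ particles that can each sit in the left or right half of a box (the choice of microstate, macrostate, and surjection $f$ is mine, purely for illustration; the Boltzmann constant is dropped, so entropy is in units of $k_B$):

```python
import math
from itertools import product

N = 10  # particles, each in the Left or Right half of a box

# Microstate: an N-tuple of 'L'/'R'; macrostate: f(m) = number on the left.
microstates = list(product('LR', repeat=N))
f = lambda m: m.count('L')

# Multiplicity of each macrostate, counted directly from the surjection f.
omega = {k: sum(1 for m in microstates if f(m) == k) for k in range(N + 1)}

# Boltzmann entropy (in units of k_B): S = ln(Omega).
S = {k: math.log(omega[k]) for k in range(N + 1)}

print(omega[0], omega[5])  # 1 microstate vs C(10,5) = 252 microstates
```

The "all particles on one side" macrostate has multiplicity 1 and entropy 0, while the evenly-split macrostate has multiplicity $\binom{10}{5} = 252$, which is exactly the sense in which the diffuse state is the high-entropy one.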

Is the micro state always defined in terms of atoms, or sub-atomic particles? Can it be generalized to any scale < macro with the second law still holding?

In the case of gases, $P = (\vec{r}_1,\vec{p}_1,\cdots,\vec{r}_n,\vec{p}_n)$ is taken to be the parameterisation which studies the positions and momenta of all gas particles and $P' = (p,V,T)$ is the parameterisation which studies the pressure, volume and temperature of the gas. The entropy of the gas is thus a measure of how many different configurations of the gas's particles give rise to the specific pressure, volume and temperature that the gas is measured to have. The second law of thermodynamics applies only in this specific context - if you change the parameterisations, the entropy value will be different, even if the temperature $T$ is part of one of the parameterisations.
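To see concretely that the entropy value depends on the chosen pair of parameterisations, here is a small Python sketch comparing two coarse descriptions of the same microstate space (both coarse-grainings are invented for illustration):

```python
import math
from itertools import product

N = 8
microstates = list(product('LR', repeat=N))
m = tuple('L' if i < 4 else 'R' for i in range(N))  # one fixed microstate

# Two different coarse parameterisations of the same system:
f_count = lambda s: s.count('L')            # P': number of particles on the left
f_half  = lambda s: s.count('L') >= N // 2  # P'': a cruder yes/no description

for f in (f_count, f_half):
    omega = sum(1 for s in microstates if f(s) == f(m))
    print(math.log(omega))  # different parameterisation, different entropy
```

The same microstate `m` sits in a macrostate of multiplicity 70 under the first description and 163 under the second, so its entropy genuinely depends on which macroscopic description we commit to.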


I'm answering as a practical matter rather than a theoretical one. Entropy is not scale-invariant; relative entropy, however, can be shown to be scale-invariant.

For an objective viewer at an arbitrary macro scale, how does one define a set of macrostates to evaluate the entropy for?

The confounding issue here is that physics has shown that physical space does not accommodate arbitrary macro-scales. We know as a fact that we cannot keep subdividing physical space without seeing fundamental changes in the physics that governs it. In this sense, space is not fractal. At present we still cannot affirmatively say that the dimensionality of space stays constant at arbitrary scales.

Is the micro state always defined in terms of atoms, or sub-atomic particles? Can it be generalized to any scale < macro with the second law still holding?

The enormous success of particle approaches to understanding physics is best epitomized by the parton model used in understanding physics in the interior spaces of hadrons.

Trying to understand the distribution of energy within this parton framework is still being worked out, with results appearing as recently as this year (2024). So it is a de facto practice to approach any problem by subdividing everything in terms of point particles. The problem of physical space not admitting arbitrary scales is part of what drives the proposal of strings, instead of point particles, as the fundamental basis of division.

The mental hang-up with strings rather than points is that strings carry an indivisible dimensionality (hence the use of the word fundamental). Many people cannot wrap their minds around the idea that something carrying a notion of length cannot itself be further subdivided into particles, and this is why many people cannot accept string theory.