
A system has a Hamiltonian that depends on a few external parameters $V, X_1, X_2, \dots$:

$$H = H(V, X_1, X_2, \dots).$$

We can assume the dependence is as smooth as needed. A process is in the limit of infinite slowness when the parameters are changed continuously, over a time that tends to infinity. This is often taken as the definition of "quasistatic", but "quasistatic" has other meanings, so I prefer to speak of "infinite slowness" (see the note below).

Is there a known assumption about the system (rather than about the process) that ensures that all infinitely slow processes are reversible? Example: an ideal gas where $V$ is the volume has such a property (assuming no solid friction in the piston or that sort of imperfection).

Note: Formally, infinite slowness is obtained by considering a change $X_i(t)$ and slowing it down as $X_i(\lambda t)$ with $\lambda\rightarrow 0$. Typically, a free (Joule) expansion broken down into small steps is not infinitely slow in this sense: it is a succession of infinitely many, infinitely small, and infinitely fast steps, with a wait for equilibrium between steps. Under some definitions of "quasistatic" this infinitesimal free expansion can be considered quasistatic, but it is not infinitely slow as I mean it here.
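To spell out the rescaling: if the reference protocol $X_i(t)$ runs over $t\in[0,T]$, the slowed protocol is
$$X_i^{(\lambda)}(t)=X_i(\lambda t),\qquad t\in[0,\,T/\lambda],$$
so its duration $T/\lambda$ diverges as $\lambda\rightarrow 0$ while the path traced in parameter space is unchanged.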

Benoit

3 Answers


Your question is broad and I am limiting myself to one particular point (and I do not consider systems with solid friction or hysteresis). The question would be: is it enough to break a path down into a series of elementary paths to make the transformation reversible?

The answer is yes when we adiabatically compress a gas from a pressure $P_1$ to a pressure $P_2$: if we decompose it into a series of elementary compressions, the transformation becomes reversible in the limit of infinitely many steps.

But in general, the answer is no. Take the example of Joule expansion into a vacuum, from a volume $V_1$ to a volume $V_2$. Even if the expansion is broken down into an infinity of elementary expansions, the process remains irreversible, and at the end the entropy variation is the same.

The reason for this difference has to do with the maximum-entropy postulate. If the maximum is reached inside the variation interval, the first derivative of the entropy is zero at equilibrium. This is what happens in the first example: for an elementary variation of the pressure, the entropy variation is of second order, and an infinite sum of second-order terms gives a first-order total, which vanishes in the limit.
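In symbols (my sketch of the argument, for a single parameter $X$ changed by $\Delta X$ in $N$ equal steps): at an interior entropy maximum $\partial S/\partial X=0$, so each elementary step produces
$$\delta S\sim\frac{1}{2}\left|\frac{\partial^2 S}{\partial X^2}\right|\left(\frac{\Delta X}{N}\right)^2=O(N^{-2}),\qquad \Delta S_{\rm total}=N\,\delta S=O(N^{-1})\xrightarrow{N\to\infty}0.$$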

But in the case of Joule expansion, the equilibrium volume is simply the maximum volume available, and the derivative of the entropy with respect to volume is not zero at equilibrium. For a small variation in volume, the entropy variation is of first order, and the sum of an infinity of first-order terms leads to a finite total variation.
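A minimal numerical sketch of both cases (my own illustration, not from the answer above; it assumes one mole of a monatomic ideal gas, sudden exposure to each intermediate external pressure, and geometric step spacing, all arbitrary choices):

```python
import numpy as np

R, Cv = 8.314, 1.5 * 8.314  # gas constant and C_V for 1 mol of monatomic ideal gas

def staged_compression(P1, P2, T0, N):
    """Adiabatic compression from P1 to P2 in N sudden pressure steps,
    waiting for equilibrium after each step; returns the entropy produced."""
    T, P, S = T0, P1, 0.0
    for Pk in P1 * (P2 / P1) ** (np.arange(1, N + 1) / N):  # geometric steps
        V = R * T / P
        # Energy balance for sudden exposure to constant external pressure Pk:
        #   Cv*(T' - T) = -Pk*(V' - V),  with  V' = R*T'/Pk
        T_new = (Cv * T + Pk * V) / (Cv + R)
        S += Cv * np.log(T_new / T) + R * np.log((R * T_new / Pk) / V)
        T, P = T_new, Pk
    return S

def staged_joule_expansion(V1, V2, N):
    """Free expansion from V1 to V2 in N small steps: T is unchanged for an
    ideal gas, so each step adds dS = R*ln(V_next/V) > 0."""
    vols = V1 * (V2 / V1) ** (np.arange(N + 1) / N)
    return sum(R * np.log(vols[k + 1] / vols[k]) for k in range(N))

for N in (1, 10, 100, 1000):
    print(N,
          staged_compression(1e5, 2e5, 300.0, N),   # -> 0 as N grows
          staged_joule_expansion(1.0, 2.0, N))      # -> R*ln 2 for every N
```

The compression column shrinks roughly like $1/N$, consistent with the second-order argument above, while the expansion column equals $R\ln 2\approx 5.76\ \mathrm{J/K}$ for every $N$.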

Hope it can help, and sorry for my poor English.


Update
I would like to take the point brought forward in the answer by @Themis (and in my own answer to my related question What maximizes entropy?) a notch further.

  • As @Themis points out, the Gibbs entropy does not change during a Hamiltonian evolution: each configuration of phase-space variables is mapped onto exactly one configuration of the evolved variables, so the statistical average over configurations remains constant.
  • The Gibbs entropy is, however, not the same as the Boltzmann-Einstein-Planck entropy that enters the second law of thermodynamics. The two can be linked via the ergodicity assumption: if we observe a system for long enough, it samples all available microstates, and the time average becomes equivalent to the ensemble average.
  • One could then guess that, to ensure that the entropy does not increase, the parameters should change slowly enough that the system samples all of its microstates faster than the parameters change.
  • The idea above can actually be formalized using classical mechanics, e.g., in the case of a microcanonical ensemble: if the parameters change slowly, the entropy is an adiabatic invariant in its mechanical sense! See the discussion and the references in the last paragraph of the question Adiabatic invariant and Liouville's theorem, and the sketch after the references below.

Hans Henrik Rugh, A Micro-Thermodynamic Formalism
P. Hertz, Ann. Phys. (Leipzig) 33, 537 (1910) (in German)
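For concreteness, here is a sketch of the microcanonical construction from the Hertz reference above. Define the phase-space volume enclosed by the energy shell and the corresponding entropy,
$$\Omega(E,X)=\int \theta\big(E-H(q,p;X)\big)\,dq\,dp,\qquad S(E,X)=k_B\ln\Omega(E,X);$$
Hertz's result is that $\Omega$ (and hence $S$) is an adiabatic invariant: it remains constant when the parameters $X$ vary infinitely slowly, provided the motion on the energy shell is ergodic.

For a single oscillator $H=\tfrac12 p^2+\tfrac12\omega(t)^2q^2$ the enclosed volume is $\Omega=2\pi E/\omega$, so the invariance is easy to check numerically. Below is a minimal sketch of mine (the linear ramp of $\omega$ from 1 to 2 and the fixed-step RK4 integrator are arbitrary choices):

```python
import numpy as np

def deriv(state, t, T):
    """Hamilton's equations for H = p^2/2 + omega(t)^2 q^2 / 2,
    with omega(t) = 1 + t/T ramped linearly from 1 to 2 over duration T."""
    q, p = state
    omega = 1.0 + t / T
    return np.array([p, -omega**2 * q])

def rk4_step(state, t, dt, T):
    # classic fourth-order Runge-Kutta step for the (q, p) pair
    k1 = deriv(state, t, T)
    k2 = deriv(state + 0.5 * dt * k1, t + 0.5 * dt, T)
    k3 = deriv(state + 0.5 * dt * k2, t + 0.5 * dt, T)
    k4 = deriv(state + dt * k3, t + dt, T)
    return state + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

for T in (10.0, 100.0, 1000.0):                      # slower and slower ramps
    state, t, dt = np.array([1.0, 0.0]), 0.0, 1e-3   # q = 1, p = 0: E/omega = 0.5
    while t < T:
        state = rk4_step(state, t, dt, T)
        t += dt
    q, p = state
    omega_f = 1.0 + t / T                            # final frequency (about 2)
    E = 0.5 * p**2 + 0.5 * (omega_f * q)**2
    print(T, E / omega_f)                            # -> 0.5 as the ramp slows
```

The printed ratio $E/\omega$ approaches its initial value $0.5$ as the ramp slows, i.e. $S=k_B\ln(2\pi E/\omega)$ is conserved in the infinitely slow limit.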


Old answer
Entropy is a state function that depends on the same parameters as the Hamiltonian (in an appropriate ensemble). If we change the parameters in time as $X_i(t)$ (for simplicity of notation, I include $V$ among the parameters), then $$ \frac{dS}{dt}=\sum_i\frac{\partial S}{\partial X_i}\dot{X}_i(t). $$ Taking $|\dot{X}_i(t)|\rightarrow 0$ means that the entropy does not change.

If resorting to the entropic argument is not enough, we could instead look at the probabilities of the states, $$ P_i=\frac{e^{-\beta E_i(X)}}{Z(X)},$$ where $E_i(X)$ are the energy levels of $H(X)$; these likewise remain the instantaneous equilibrium probabilities if the parameters change continuously and very slowly, regardless of the path in parameter space.

It looks a bit trivial - perhaps I have missed something? (I am willing to discuss it.)

Roger V.

The external parameters of the Hamiltonian can be varied arbitrarily fast or arbitrarily slowly; the resulting motion is always isentropic. Quoting Gibbs:

Let us imagine a great number of independent systems, identical in nature, but differing in phase, that is, in their condition with respect to configuration and velocity. The forces are supposed to be determined for every system by the same law, being functions of the coordinates of the system $q_1,\cdots q_n$, either alone or with the coordinates $a_1$, $a_2$, etc. of certain external bodies. It is not necessary that they should be derivable from a force-function. The external coordinates $a_1$, $a_2$, etc. may vary with the time, but at any given time have fixed values. In this they differ from the internal coordinates $q_1,\cdots q_n$, which at the same time have different values in the different systems considered. (Gibbs, Elementary Principles in Statistical Mechanics, p. 5)

Gibbs here is deriving the Liouville equation. No conditions are placed on the external coordinates other than that at any given time they must have fixed values. The resulting Liouville equation conserves entropy, as is well known.
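In modern notation, the equation Gibbs derives for the ensemble density $\rho(q,p,t)$ is
$$\frac{\partial\rho}{\partial t}=\{H,\rho\},$$
which transports $\rho$ unchanged along the phase-space trajectories. Consequently the Gibbs entropy $S_G=-k_B\int\rho\ln\rho\;dq\,dp$ satisfies $dS_G/dt=0$ for any time dependence, fast or slow, of the external coordinates $a_1$, $a_2$, etc.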

Themis