
I am confused about the difference between a reversible process and an equilibrium process when considering their energy aspects.

Here is what I know so far.

(1) Equilibrium and Reversibility

Equilibrium processes are processes in which the system, the surroundings, or both remain infinitesimally close to equilibrium but never fully reach it (only in the limit). When the system and the surroundings go through an equilibrium process at the same time, they are in equilibrium with each other. If they both return to their original states, the cycle is called reversible; when only one of them goes through an equilibrium cycle, it is called irreversible.

(2) Entropy

From my understanding, entropy does not have much to do with energy but rather with the likelihood that an event will happen. If there are more ways for a specific event to happen (high entropy), the probability that it happens is greater than if the number of ways were smaller (low entropy). A system in a state of low entropy always seeks to increase its entropy.

(3) Difference and Relationship

I found a post that explained the difference between reversible and equilibrium processes, but I am not sure whether the formula the author used is correct because I have not seen it anywhere before. The author explained that reversibility is entropic, while equilibrium has to do with the chemical potentials being equal to each other. He used the following formula to support his point:

\begin{equation} dS = \frac{dQ_{rev}}{T} + dS_{irr} \end{equation}

where $dS_{irr}$ is the entropy generated by the irreversibilities the system contains.

(4) Questions

  • Is my stated understanding correct?
  • If entropy has to do with probability, why was the entropy change defined as $dS = \frac{dQ_{rev}}{T}$? What does heat have to do with entropy, and how do these definitions relate to each other?
  • Is the term $dS_{irr}$ accurate?

Thank you for your time!

Skaeler

2 Answers


The equation

\begin{equation} dS = \frac{dQ_{rev}}{T} + dS_{irr} \end{equation}

is incorrect. It should be

$$dS=\frac{\delta Q}{T} + dS_{irr}\tag{1}$$

That's because the differential change in entropy $dS$ of a system is, for any process, defined (following Clausius) in terms of a reversible transfer of heat, i.e.

$$dS=\frac{\delta Q_{rev}}{T}\tag{2}$$

The first term on the right of eq (1) is the entropy transfer due to heat crossing the boundary where the boundary temperature is $T$, sometimes written $T_B$. The second term is the entropy generated due to irreversible heat and/or work.
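
As a minimal worked illustration of eq (1), with numbers chosen arbitrarily for the sketch, let $Q = 1000\ \mathrm{J}$ be conducted from a reservoir at $T_H = 500\ \mathrm{K}$ to a reservoir at $T_C = 300\ \mathrm{K}$:

$$\Delta S_{hot} = -\frac{Q}{T_H} = -2\ \mathrm{J/K}, \qquad \Delta S_{cold} = +\frac{Q}{T_C} \approx +3.3\ \mathrm{J/K}.$$

Each reservoir by itself only sees the transfer term (heat crossing its boundary at its own boundary temperature). Taking the two reservoirs together as the system, no heat crosses the combined boundary, so the transfer term is zero and the entire change of roughly $+1.3\ \mathrm{J/K}$ is entropy generated by conduction across the finite temperature difference.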

  • If entropy has to do with probability, why was the entropy change defined as $dS = \frac{dQ_{rev}}{T}$? What does heat have to do with entropy, and how do these definitions relate to each other?

Entropy can be described in the realm of statistical mechanics using Boltzmann's entropy formula, or in the realm of classical thermodynamics using equation 2 from Clausius. Basically, Boltzmann's equation provides a microscopic foundation for the concept of entropy whereas Clausius' equation provides a macroscopic foundation. It's my understanding that Boltzmann's equation can be used to derive the Clausius entropy definition.
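
One standard consistency check connecting the two pictures, assuming an ideal gas and ignoring the momentum contribution (which is unchanged at constant temperature): for a reversible isothermal expansion of $n$ moles from $V_1$ to $V_2$, $\Delta U = 0$, so $Q_{rev}$ equals the work done by the gas, $nRT\ln(V_2/V_1)$, and from eq (2)

$$\Delta S = \frac{Q_{rev}}{T} = nR\ln\frac{V_2}{V_1}.$$

From Boltzmann's formula $S = k_B\ln\Omega$, the number of spatial arrangements available to $N$ molecules scales as $V^N$, so

$$\Delta S = k_B\ln\frac{V_2^{\,N}}{V_1^{\,N}} = Nk_B\ln\frac{V_2}{V_1} = nR\ln\frac{V_2}{V_1},$$

the same result.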

  • Is the term $dS_{irr}$ accurate?

The term is correctly shown in eq (1), not as shown by the author you cited. It is important to know that the total entropy change of a system is the sum of the entropy transfer (the first term on the right of eq (1)) plus any entropy generated if the process is irreversible (the second term on the right). If the process is reversible, $dS_{irr}=0$ and all the entropy change is simply due to reversible heat.

Is a process reversible if and only if there is no entropy change?

No. A process can be reversible if the entropy change is due to a reversible transfer of heat. Examples are the reversible isothermal expansion and compression processes of a Carnot cycle. But a process is irreversible if the process generates entropy due to irreversible work or irreversible heat.
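
To make the distinction concrete, compare (assuming an ideal gas) two processes between the same end states $(T, V_1) \to (T, V_2)$. In the reversible isothermal expansion, the gas absorbs $Q_{rev} = nRT\ln(V_2/V_1)$ from a reservoir at the same temperature:

$$\Delta S_{sys} = \frac{Q_{rev}}{T} = nR\ln\frac{V_2}{V_1}, \qquad \Delta S_{surr} = -\frac{Q_{rev}}{T}, \qquad dS_{irr} = 0.$$

In a free expansion into vacuum, $Q = 0$, so the entropy-transfer term vanishes, yet the gas reaches the same final state:

$$\Delta S_{sys} = nR\ln\frac{V_2}{V_1} > 0 \ \text{(all of it generated)}, \qquad \Delta S_{surr} = 0.$$

The system's entropy change is identical in both cases; what differs is whether it arrives by reversible heat transfer or by generation.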

Hope this helps.

Bob D

The equation $dS=\frac{dQ}{T}+dS_\mathrm{irr}$ is a great example of combining two aspects of entropy:

  1. Entropy is the "stuff" that shifts upon heat transfer.

    When we look at ways to transfer energy (heat transfer, work, mass transfer), we see a driving force expressed as an unevenness in some intensive property, paired with a driven shift in the conjugate extensive property. The product of each conjugate pair has units of energy. A list of all the relevant conjugate pairs makes up the fundamental relation for the problem of interest (a common form is written out after this list).

    As an example, the driving force in pressure–expansion work is some mismatch in pressure, and the result is a shift in volume that tends to eliminate the pressure difference. The driving force in electrostatic work is an electric field, and the result is an acceleration of charge carriers that tends to eliminate that field. The driving force in diffusion is a concentration difference (more precisely, a chemical potential difference), and the result is mass transport that tends to eliminate this difference.

    Temperature differences drive heat transfer. So what is the conjugate to temperature in this framework? Entropy, as it turns out. When one object conductively heats another, for instance, the hotter object loses entropy, and the colder object gains it.

  2. Entropy is also generated whenever energy moves down a gradient, including gradients associated with work and mass transfer, as well as heat transfer. Entropy quantifies the number of microstates consistent with a given macrostate we can measure, and since we tend to more often see scenarios with more ways of occurring, we tend to see an increasing total entropy, regardless of the process or combination of processes.
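
For reference, here is one common way to write out the fundamental relation mentioned in item 1, under the usual assumption of a simple compressible system with mass transfer (other work modes, such as electrostatic work, would add their own conjugate pairs):

$$dU = T\,dS - P\,dV + \sum_i \mu_i\,dN_i$$

Each term is an intensive driving quantity multiplied by the differential of its conjugate extensive quantity, and each product has units of energy: $(T, S)$ for heat transfer, $(P, V)$ for pressure–expansion work, and $(\mu_i, N_i)$ for mass transfer of species $i$.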

So if we're examining how entropy might change in a system, we should consider (1) whether the system is undergoing heat transfer, and incorporate the associated entropy transfer along a reversible path, and (2) whether an irreversible process of any type is occurring and what the associated entropy increase is. This is expressed by the equation above.

How do we consolidate the two aspects? One interpretation possibly useful to you is that entropy represents a lack of information. In a crystal very near 0 K, knowledge of the position of one atom tells us nearly all we need to know about the position of every other atom. This is a low-entropy state. With heating, the atomic movement increases, and some atoms may hop out of place. In this higher-entropy state, we know less about the atomic positions.

When one object heats another, as a larger number of rapidly moving particles in the hotter object encounter and interact with a smaller number in the colder object, the distribution of particle energies contracts in the former and expands in the latter. (You could also express this as a larger and smaller number of slow-moving particles in the colder and hotter objects, respectively; the point is that one distribution, such as a Boltzmann distribution, is encountering a different distribution, with the resulting energy exchanges between particles being mediated by energy conservation.) We now have a better grasp on the possible positions and velocities of the cooled object, in the sense that a certain range is more likely to contain more of them. This is one way to link the entropy shift in heat transfer with entropy generation in all spontaneous processes.
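
To put a number on this bookkeeping, here is a minimal sketch assuming two identical bodies of constant heat capacity $C$ that exchange heat only with each other, starting at $T_H$ and $T_C$ and ending at the common temperature $T_f = (T_H + T_C)/2$:

$$\Delta S_{tot} = C\ln\frac{T_f}{T_H} + C\ln\frac{T_f}{T_C} = C\ln\frac{(T_H + T_C)^2}{4\,T_H T_C} \ge 0,$$

with equality only when $T_H = T_C$. Entropy leaves the hotter body and enters the colder one, but the total grows whenever a finite temperature difference is being eliminated, which is the entropy-generation aspect showing up alongside the entropy-transfer aspect.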

When we discuss equalization of the chemical potential at equilibrium, we're referring to the tendency of the Gibbs free energy to be minimized for systems in thermal and mechanical contact with their surroundings, as the chemical potential is just the partial molar Gibbs free energy. This also brings in both aspects of entropy mentioned above, as (1) strong bonding (as in low-enthalpy condensed matter) exothermally heats the rest of the universe and thus transfers entropy to it, but (2) minimal bonding (as in gases) provides the opportunity for many molecular arrangements. The Gibbs free energy, which contains both the enthalpy and the entropy, is our way of modeling how Nature balances these two counteracting tendencies.
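
For the statement about chemical potential, the standard definitions (not specific to the post you cited) are

$$G = H - TS, \qquad \mu_i = \left(\frac{\partial G}{\partial n_i}\right)_{T,\,P,\,n_{j\neq i}},$$

and at constant $T$ and $P$ a spontaneous process has $dG \le 0$, so equilibrium sits at the minimum of $G$, where the chemical potential of each species is equal across coexisting phases. The enthalpy term carries the bonding tendency (1) and the $-TS$ term carries the multiplicity tendency (2).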

I think this touches on most of the points of your question and could be useful in moving forward. Please let me know what's unclear.

Edit: Bob D made a good point that it's confusing to specify both "reversible" and "irreversible" in the same equation. I've removed the "rev" subscript.