
To make this question concrete, consider a toy model: a 1D polymer with $N$ units, where $N$ is large enough for fluctuations to be negligible in the following argument. The entropy of this polymer (taking $k_B = 1$) is $S(x) = \ln {N \choose \frac{N}{2} - x}$, which, after a bit of work with Stirling's formula and the approximation $x \ll N$, gives $S(x) = N \ln 2 - \frac{2}{N} x^2$.
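For reference, the Stirling step goes as follows: with $\ln n! \approx n \ln n - n$,

$$ S(x) = \ln \frac{N!}{\left(\frac{N}{2}-x\right)!\,\left(\frac{N}{2}+x\right)!} \approx N \ln N - \left(\frac{N}{2}-x\right)\ln\left(\frac{N}{2}-x\right) - \left(\frac{N}{2}+x\right)\ln\left(\frac{N}{2}+x\right), $$

and expanding $\ln\left(\frac{N}{2}\mp x\right) = \ln\frac{N}{2} + \ln\left(1 \mp \frac{2x}{N}\right) \approx \ln\frac{N}{2} \mp \frac{2x}{N} - \frac{2x^2}{N^2}$ and collecting terms (the linear terms cancel) gives $S(x) \approx N \ln 2 - \frac{2}{N} x^2$.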

Now, in a lecture, we defined the entropic force $f$ by analogy with a force arising from a potential, but with the free energy $F$ in place of the ordinary energy $E$:

$$ F = E - TS $$

$$ f = - \frac{\mathrm{d}F}{\mathrm{d}x} $$

There are no interactions here, so $E = 0$ and $F = -TS$. With the entropy derived above, this gives:

$$ f = T \frac{\mathrm{d}}{\mathrm{d}x} \left( N \ln 2 - \frac{2}{N} x^2 \right) = - \frac{4}{N} T x $$

Thus, the polymer behaves as an entropic spring with spring constant $k = \frac{4}{N} T$.
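A quick numerical sanity check of this (a minimal sketch with $k_B = 1$ as above; the values of $N$, $T$, and the range of $x$ are arbitrary), comparing the exact binomial entropy and its derivative against the quadratic approximation and the spring law:

```python
import numpy as np
from scipy.special import gammaln

# Exact entropy S(x) = ln C(N, N/2 - x), with k_B = 1
def S_exact(N, x):
    return gammaln(N + 1) - gammaln(N / 2 - x + 1) - gammaln(N / 2 + x + 1)

N, T = 1000, 1.0
x = np.arange(-50, 51)                        # extensions with |x| << N

# Quadratic (Stirling) approximation from above
S_quad = N * np.log(2) - 2 * x**2 / N
print(np.max(np.abs(S_exact(N, x) - S_quad)))   # O(ln N), small compared to S ~ N ln 2

# Entropic force f = T dS/dx via a central difference, vs the spring law f = -(4T/N) x
f_num = T * (S_exact(N, x + 1) - S_exact(N, x - 1)) / 2
f_spring = -4 * T * x / N
print(np.max(np.abs(f_num - f_spring)))         # small; corrections are higher order in x/N
```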

Where I am confused: In thermodynamics lectures, we always emphasized the fact that there is no time variable in thermodynamics, and though entropy rises, there are no guarantees as to how long it takes to do so. Therefore, I would expect that $f$ provides the right direction of change, but I don't see why its magnitude should have a direct physical meaning.

My specific question is: Does Newton's second law, $a = \frac{F}{m}$, hold in this model (assuming that we put a bead of mass $m$ at the end of the polymer)? If so, why? If not, what is the meaning of the magnitude of $f$?

Kotlopou

2 Answers


There's absolutely nothing magical about entropic forces. They're just ordinary forces, which happen to be easier to compute by thinking about entropy.

For example, the pressure an ideal gas exerts on the wall of a container can be computed like an ordinary force in mechanics (i.e. by considering the momentum imparted by individual collisions), or as an entropic force. In the first case, the derivation goes like $$P = \frac{1}{L^2} \sum_{i=1}^N \frac{\Delta p}{\Delta t} = \frac{1}{L^2} \frac{N \langle p_x v_x \rangle}{L} = \frac{N}{V} \frac{\langle \mathbf{p} \cdot \mathbf{v} \rangle}{3} = \frac{N k_B T}{V}$$ as you've probably seen in an earlier statistical mechanics class. In the second case, it goes like $$P = T \frac{\partial S}{\partial V}\bigg|_T = T \, \frac{N k_B}{V}$$ where we used the Sackur-Tetrode equation, $S(V, T) = N k_B \log V + (\text{terms dependent on } T)$.
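As a quick numerical illustration of the two routes (a minimal sketch; the particle number, volume, temperature, and mass below are arbitrary made-up values), one can estimate the kinetic-theory average $\langle p_x v_x \rangle$ by sampling and compare it with $N k_B T / V$:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary illustrative values (SI units): argon-like atoms in a 1-litre box
N, V, T = 1.0e5, 1.0e-3, 300.0
m, kB = 6.6e-26, 1.380649e-23

# Kinetic route: P = (N/V) <p_x v_x> = (N/V) m <v_x^2>,
# sampling v_x from its equilibrium (Gaussian) distribution
vx = rng.normal(0.0, np.sqrt(kB * T / m), size=int(N))
P_kinetic = (N / V) * m * np.mean(vx**2)

# Entropic route: P = T dS/dV|_T = N kB T / V
P_entropic = N * kB * T / V

print(P_kinetic, P_entropic)   # agree to within Monte Carlo sampling noise
```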

For a sparse ideal gas, these are just two ways of deriving the exact same effect. The benefit of thinking about the force "entropically" is that in other situations, it might be easier than the microscopic way. For example, if we had a denser gas with plenty of self-interactions, then the first derivation wouldn't work because the particles can't be treated independently (they would collide with each other before bouncing off opposing walls), but the second might if the entropy is still a simple expression. On the other hand, if the gas wasn't in equilibrium (e.g. if somebody stirred it up by suddenly moving a wall), then the second derivation wouldn't work, and you'd have to fall back to the first, averaging over the gas's nonequilibrium distribution.

When your professor says thermodynamics has "no time variable", they're expressing something which is technically true but kind of misleading. Of course, thermodynamics is meant to describe real systems, which always evolve in time. However, the basic tools you learn in a first course only work for systems in thermodynamic equilibrium. When you're in equilibrium, time doesn't matter since the system essentially isn't changing, and we can often avoid talking about the dynamics that bring about equilibrium because their details won't matter. If you're not in equilibrium (e.g. if a gas in a box has hotter or colder patches, or carries sound waves) those tools won't work.

Once you let an entropic force act, it can accelerate stuff, and that can drive your system out of thermodynamic equilibrium. In order to tell if it does, you'll have to understand the microscopic dynamics of the system. (For example, if I let a piston expand under the pressure of a sparse ideal gas, the ideal gas law certainly won't work if the piston ends up with a speed comparable to that of the gas molecules themselves, because then they won't have time to re-equilibrate with each other.) Your professor is trying to shield you from these subtleties by just avoiding the question.

knzhou

Where I am confused: In thermodynamics lectures, we always emphasized the fact that there is no time variable in thermodynamics, and though entropy rises, there are no guarantees as to how long it takes to do so. Therefore, I would expect that f provides the right direction of change, but I don't see why its magnitude should have a direct physical meaning.

(emphasis is mine)

Indeed, as far as the thermodynamic analysis of a polymer is concerned, there is no time variable - we always assume that we have waited "long enough" for thermodynamic equilibrium to be established (and for the entropy to reach its maximum).

However, when talking about an entropic force and Newton's second law, we have a different setting in mind: e.g., a weight suspended on an elastic cord or a spring (made of a polymer). The key assumption for using the entropic force here is that the motion of the system is slow enough for thermodynamic equilibrium to establish itself - that is, the "long enough" time of thermodynamics is actually very short on the time scale on which we apply Newton's law. In thermodynamics this is called a quasistatic process. Note that if this were not the case, the force would still exist, but its estimate from equilibrium thermodynamics would give an incorrect value (the process would not be quasistatic/reversible).
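To make this concrete, here is a minimal numerical sketch (with $k_B = 1$ and arbitrary illustrative values of $m$, $N$, $T$) that integrates Newton's second law for a bead pulled by the entropic spring force $f = -\frac{4T}{N}x$ of the question, assuming the motion is slow enough to remain quasistatic:

```python
import numpy as np

# Illustrative parameters (k_B = 1, arbitrary units)
m, N, T = 1.0, 100.0, 1.0
k = 4.0 * T / N                       # entropic spring constant from the question

# Velocity-Verlet integration of m x'' = -k x : a bead pulled back by the entropic force
dt, steps = 0.01, 20000
x, v = 3.0, 0.0                       # initial extension and velocity
xs = []
for _ in range(steps):
    a = -k * x / m
    x += v * dt + 0.5 * a * dt**2
    v += 0.5 * (a + (-k * x / m)) * dt
    xs.append(x)

# Measured oscillation period vs the harmonic prediction 2*pi*sqrt(m/k)
crossings = np.where(np.diff(np.sign(xs)) < 0)[0]
print(np.diff(crossings).mean() * dt, 2 * np.pi * np.sqrt(m / k))
```

The bead oscillates with the period of an ordinary harmonic oscillator of stiffness $k = \frac{4T}{N}$; the entropic origin of the force changes nothing about how it enters Newton's law, as long as the quasistatic assumption holds.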

The assumption that thermodynamic equilibrium is established quickly, or at least locally, is very common in physics (e.g., whenever we use a term like temperature gradient, we imply such an assumption). Essentially, thermodynamics and Newtonian mechanics are but small building blocks from which much more complex models are constructed to describe real-world phenomena. (Strictly speaking, constructing complex models from known blocks is engineering, whereas science/physics is about discovering and studying the elementary blocks, but in practice there is significant overlap between what physicists and engineers do.)

Related
Could Navier-Stokes equation be derived directly from Boltzmann equation? - on the place of local equilibrium (in time and space) in deriving the continuum description of matter (hydrodynamics, elasticity, continuum electrodynamics).
How does radiation become black-body radiation? - the assumption of local equilibrium is essential for describing hot objects as black bodies, even though overall they are not in thermodynamic equilibrium.

Roger V.