
I know that in an isolated system entropy always tends to increase, but what about the speed of that increase (i.e. the rate of change of entropy, and the "acceleration" of entropy, the derivative of that rate)? Is there any law or relation giving the rate of that increase? Can it be decreased? (I am not asking about a way to decrease entropy itself, but its rate of increase: is that rate always increasing too, is it constant, or can it decrease?) Also, does the answer differ between isolated and non-isolated (both closed and open) systems?

Can anybody show me any links to work done on the subject? Or, if there isn't any such work, explain the reasons why nobody has tackled the issue up until now (e.g. impossibility of experimental verification, lack of a theoretical framework for the question, difficulty of building a sound mathematical model, and so on)?

hft

5 Answers


In the absence of mass transfer, the local rate of generation of entropy in a system is proportional to the square of the velocity gradient (actually, the double dot product of the rate of deformation tensor with itself) and to the square of the temperature gradient. See Transport Phenomena by Bird, Stewart, and Lightfoot, John Wiley, NY, 2nd Edition, 2002, p372, problem 11D.1 for the exact relationship for the local rate of entropy generation. Since this relationship involves squares of the gradients, the terms involved make positive definite contributions to the entropy generation rate.
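A sketch of what that relationship looks like, for an incompressible Newtonian fluid with Fourier heat conduction (this is a paraphrase rather than the exact BSL expression; $\mathbf{D}$ is the rate-of-deformation tensor, $k$ the thermal conductivity, $\mu$ the viscosity):

$$ \sigma \;=\; \frac{k\,\lvert\nabla T\rvert^{2}}{T^{2}} \;+\; \frac{2\mu}{T}\,\left(\mathbf{D}\!:\!\mathbf{D}\right) \;\ge\; 0 $$

Both terms involve squares, which is why each contribution to the local entropy generation rate $\sigma$ is nonnegative.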

Chet Miller

From a thermodynamic point of view, you may consider the following: 1) entropy is only defined for a system in a state of equilibrium; 2) an isolated system in equilibrium cannot depart from it spontaneously. Therefore, the entropy of an isolated system is constant.

From a quantum mechanical point of view: the Von Neumann entropy is invariant under unitary transformations, and since time evolution for isolated systems is unitary, the entropy remains constant.
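This invariance is easy to verify numerically. A minimal sketch (the mixed state and the rotation unitary are arbitrary choices):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]          # drop numerically-zero eigenvalues
    return float(-np.sum(evals * np.log(evals)))

# A mixed qubit state: diag(0.7, 0.3)
rho = np.diag([0.7, 0.3]).astype(complex)

# An arbitrary unitary (a real rotation mixing the basis states)
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]], dtype=complex)

rho_evolved = U @ rho @ U.conj().T        # unitary time evolution

S_before = von_neumann_entropy(rho)
S_after = von_neumann_entropy(rho_evolved)
# S_before and S_after agree to machine precision
```

The eigenvalues of $\rho$ are unchanged by $U\rho U^\dagger$, so any function of the spectrum, entropy included, is invariant.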

However, I understand your question. Imagine a bottle with some gas trapped inside, which you might prepare as an isolated system in equilibrium with entropy $S_1$. Then, if the gas is released and allowed to fill the entire room, after some time it will again reach equilibrium, with entropy $S_2>S_1$. As I said before, entropy is only defined at the initial and final states, because there is no equilibrium intermediate state; therefore there is no rate of change of the entropy. In addition, the system could only change its entropy because it wasn't isolated: some external agent had to open the bottle. Right?
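The bottle example can be made quantitative for an ideal gas: although entropy is undefined during the expansion, the difference $S_2-S_1$ depends only on the initial and final volumes. A minimal sketch (the particle number and volumes are illustrative):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def free_expansion_entropy_change(N, V1, V2):
    """Entropy change of an ideal gas that expands irreversibly from V1 to V2
    at fixed internal energy: Delta S = N * k_B * ln(V2 / V1)."""
    return N * k_B * math.log(V2 / V1)

N = 6.022e23  # roughly one mole of molecules
dS = free_expansion_entropy_change(N, V1=1.0e-3, V2=2.0e-3)  # 1 L -> 2 L
# dS is approximately N * k_B * ln 2, about 5.76 J/K
```

Only the endpoints matter; the formula says nothing about how fast the entropy appeared in between, which is exactly the questioner's point of frustration.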


There has been a lot of study on trying to reduce entropy production in things like turbines and compressors. The efficiency of turbines and compressors is measured against an ideal isentropic process (one in which entropy does not increase). So an ideal turbine or compressor with 100% efficiency would not increase entropy at all. Typical real-world efficiencies rarely get into the 90% range.
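As a rough sketch of how isentropic efficiency connects to entropy generation, here is a toy adiabatic-turbine calculation assuming an ideal-gas air model with constant $c_p$ (the inlet conditions and efficiency value are illustrative):

```python
import math

R = 287.0     # J/(kg K), specific gas constant for air
cp = 1005.0   # J/(kg K), specific heat at constant pressure

def turbine_exit_state(T1, p1, p2, eta):
    """Exit temperature and specific entropy generation for an adiabatic
    turbine with isentropic efficiency eta (constant-cp ideal-gas model)."""
    T2s = T1 * (p2 / p1) ** (R / cp)   # isentropic (ideal) exit temperature
    T2 = T1 - eta * (T1 - T2s)         # actual exit temperature
    # Ideal-gas entropy change: s2 - s1 = cp ln(T2/T1) - R ln(p2/p1)
    s_gen = cp * math.log(T2 / T1) - R * math.log(p2 / p1)
    return T2, s_gen

# 100% efficient turbine: no entropy generated
_, s_ideal = turbine_exit_state(T1=1200.0, p1=10e5, p2=1e5, eta=1.0)
# 85% efficient turbine: positive entropy generation
_, s_real = turbine_exit_state(T1=1200.0, p1=10e5, p2=1e5, eta=0.85)
```

At $\eta = 1$ the exit state lies on the same isentrope as the inlet, so `s_gen` vanishes; any $\eta < 1$ leaves the exit gas hotter than the isentropic ideal and generates entropy.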

As Chet Miller noted in their answer, where and how entropy increases in fluid dynamics has been thoroughly studied analytically. In simple terms, shearing of the fluid and temperature gradients both generate entropy, so trying to limit those should result in more efficient processes.

This is just one small area of fluid dynamics where people have spent time and effort trying to improve efficiency by reducing entropy. The general principle applies to many fields of engineering from electrical, to civil, to chemical.

One thing I wanted to highlight, which might be a misunderstanding in your question: a closed system that has already reached equilibrium will not increase in entropy, so the rate is already zero. It is only during a process that the entropy changes, and the rate of change always relates to the efficiency of the process. So in general, people designing processes will be trying to minimize the entropy increase, even if they aren't thinking about it directly.

Eph

in an isolated system entropy always tends to increase, but what about the speed of that increase

As per the second law of thermodynamics (the Clausius inequality), the entropy change of a system is bounded from below by: $$ \tag 1 \Delta S \ge \int {\frac {\delta Q}{T}} $$

so it scales with the heat $\Delta Q$ absorbed by a typical subsystem and inversely with the temperature. As the process goes on and the temperature becomes more and more homogeneous across the system's parts, they exchange heat more and more slowly, i.e. $\Delta Q \to 0$, and the system approaches thermodynamic equilibrium, meaning $\Delta S \to 0$ as well: there is no further entropy increase, because entropy has been maximized.
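This approach to equilibrium can be sketched numerically with two finite heat baths coupled by Newtonian heat exchange (all constants are arbitrary illustrative values; `G` is a hypothetical thermal conductance):

```python
# Two identical bodies exchange heat at a rate Q_dot = G * (T1 - T2).
# Total entropy production rate: S_dot = Q_dot * (1/T2 - 1/T1) >= 0,
# which decays toward zero as the temperatures equalize.
C, G, dt = 10.0, 0.5, 0.01   # heat capacity, conductance, time step
T1, T2 = 400.0, 200.0        # initial temperatures, K
rates = []
for _ in range(5000):
    Q_dot = G * (T1 - T2)                 # heat flow from hot to cold
    rates.append(Q_dot * (1.0 / T2 - 1.0 / T1))
    T1 -= Q_dot * dt / C                  # explicit Euler update
    T2 += Q_dot * dt / C
# rates is nonnegative and decreases monotonically toward zero
```

The entropy production rate here is $G\,(T_1-T_2)^2/(T_1 T_2)$: positive whenever a temperature difference exists, and vanishing at equilibrium, which is the claim in the paragraph above.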

The inverse relation between entropy and temperature can be understood like this. Imagine you have some gas in a container near absolute zero, say at a few $\mu K$. At such temperatures the substance is in a solid state: its molecules are highly ordered and, apart from small oscillations, do not move freely. Now heat the container, raising $T$ by $1000\,\mathrm{K}$. Suddenly the perfectly ordered system becomes chaotic: it melts, evaporates, and the molecules dash all over the container. Raise the temperature further and the gas becomes a plasma; but if you add another $1000\,\mathrm{K}$ to a plasma, apart from a higher average molecular speed, nothing radically changes. An already highly chaotic system cannot easily be made even more chaotic. In such a disordered system, supplying the same heat to its sub-parts will not produce the same overall increase in entropy, because there is not much order left to "disorder".
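A one-line illustration of the same point, assuming reversible heat addition so that $\mathrm{d}S=\delta Q/T$ (the numbers are illustrative round values):

```python
# The same heat input Q produces less entropy at higher temperature,
# since dS = delta_Q / T for reversible heat addition.
Q = 1000.0               # J of heat supplied
dS_cold = Q / 300.0      # entropy gained by a system at 300 K
dS_hot = Q / 2000.0      # entropy gained by a system at 2000 K
# dS_cold > dS_hot: the cold system gains far more entropy per joule
```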

Now, to understand how entropy changes over time, ${\partial S}/{\partial t}$, we need to analyze the specific heat-transfer processes between system parts. That is too broad a question, with many variables to consider: local material densities, thermal conductivities, radiative heat transfer, convective currents, and so on.
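As a toy version of such an analysis, here is a sketch of 1-D heat conduction on a rod, with the local entropy production rate taken as $k\,(\partial T/\partial x)^2/T^2$ (grid size, material constants, and the initial temperature profile are all illustrative):

```python
import numpy as np

# 1-D heat conduction: dT/dt = alpha * d2T/dx2, fixed 300 K at both ends.
# Local entropy production per unit volume: sigma = k * (dT/dx)^2 / T^2.
k, alpha = 1.0, 1.0
nx = 51
dx = 1.0 / (nx - 1)
dt = 0.5 * dx * dx / alpha               # stable explicit time step
x = np.linspace(0.0, 1.0, nx)
T = 300.0 + 100.0 * np.sin(np.pi * x)    # warm bump in the middle

def total_entropy_production(T):
    grad = np.gradient(T, dx)
    return float(np.sum(k * grad ** 2 / T ** 2) * dx)

sigma_initial = total_entropy_production(T)
for _ in range(2000):
    # explicit Euler update of the interior points; ends stay at 300 K
    T[1:-1] += alpha * dt * (T[2:] - 2.0 * T[1:-1] + T[:-2]) / dx ** 2
sigma_final = total_entropy_production(T)
# sigma_final << sigma_initial: entropy production dies out as gradients relax
```

Even in this toy model the entropy production rate is not governed by a single law: it depends on the material constants and the evolving temperature field, which is the point being made above.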


One definition of entropy relates it to an increase in volume; but without pressure no energy is involved in expansion, and entropy measures the unavailability of energy to do more work. Pressure is a rate of momentum change, i.e. a force or acceleration. If there is no pressure, then there is no force or acceleration, yet an increase in volume is still possible. What this means is that when entropy is maximal, the system is in a state of equilibrium: there is no further increase in the work available from the given energy, and all the energy supplied is already doing the maximum work the system allows.

This also applies to the maximum speed that can be gained by an object subjected to a constant force for infinite time; thus Newton's laws of motion hold just as well for the transient state. To show how this can be expressed, take the example of an isothermal process, in which the increase in work done is an increase in volume and hence an increase in entropy, or the change in entropy is proportional to the availability of work: $$\frac{W}{T}=S=Nk\ln V$$ As we see, the entropy has a maximum value, and if this change in entropy is divided by the volume, we get the expression $$S_V=\frac{-Nk}{V}$$ which shows that the change in entropy decreases as the volume further increases, somewhat similar to Wien's law. A better expression was first given by Planck, which shows both increases and decreases in entropy; the contributions of the higher modes are negligible or zero.

This suggests that what classical mechanics called instantaneous action was answered not by relativity but by quantum mechanics: by Boltzmann, who treated thermodynamics at the particle level, and significantly by Planck. The approach of classical mechanics was to apply a low rate of energy, i.e. power, while equating it to an instantaneous result in work done. That is possible only if a very high amount of energy is provided; at very high temperature, entropy tends to a minimum and the result is instantaneous. An example: if the capacitance of a capacitor is very low, then a change in voltage appears across it instantaneously; any voltage can fill its capacity, and a higher voltage can do so as well.