26

From the second law of thermodynamics:

The second law of thermodynamics states that the entropy of an isolated system never decreases, because isolated systems always evolve toward thermodynamic equilibrium, a state with maximum entropy.

Now I understand why the entropy can't decrease, but I fail to understand why the entropy tends to increase as the system reaches thermodynamic equilibrium. Since an isolated system can't exchange work or heat with the external environment, and the change in entropy of a system is the heat transferred divided by the temperature, and since the total heat of an isolated system never changes (it receives no heat from outside), it seems natural to me that the entropy change of an isolated system should always be zero. Could someone explain to me why I am wrong?

PS: There are many questions with a similar title, but they're not asking the same thing.

6 Answers

28

Take a room containing an ice cube as an example, and let the room be the isolated system. The ice will melt and the total entropy inside the room will increase. This may seem like a special case, but it's not: the room as a whole is not at equilibrium, which means its subsystems are exchanging heat with each other, and since entropy is extensive, the system as a whole is increasing its entropy. At any instant the cube and the room exchange an infinitesimal amount of heat $Q$: the cube gains entropy $\frac{Q}{T_1}$, where $T_1$ is the temperature of the cube, because it gains heat $Q$, while the room loses entropy $\frac{Q}{T_2}$, where $T_2$ is the temperature of the room, because it loses heat $Q$. Since $T_1 < T_2$, we have $\frac{1}{T_1}>\frac{1}{T_2}$, so the total change in entropy is positive. This exchange continues until the temperatures are equal, meaning that equilibrium has been reached. A system at equilibrium already has maximum entropy.
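The bookkeeping above is easy to check with numbers. A minimal sketch, with made-up values (a 10 g ice cube at its melting point in a 300 K room, and $T_1$, $T_2$ treated as constant during the exchange, which is an approximation):

```python
# Entropy bookkeeping for the ice-cube-in-a-room example.
# All numerical values are illustrative assumptions, not from the answer.
LATENT_HEAT_FUSION = 334.0   # J/g, latent heat of fusion of water
m_ice = 10.0                 # g, made-up mass of the cube
T1 = 273.15                  # K, temperature of the ice cube
T2 = 300.0                   # K, temperature of the room

Q = m_ice * LATENT_HEAT_FUSION   # heat flowing from the room into the ice
dS_ice = Q / T1                  # entropy gained by the cube (Q/T1)
dS_room = -Q / T2                # entropy lost by the room (-Q/T2)
dS_total = dS_ice + dS_room      # positive, because 1/T1 > 1/T2

print(f"dS_ice   = {dS_ice:+.2f} J/K")
print(f"dS_room  = {dS_room:+.2f} J/K")
print(f"dS_total = {dS_total:+.2f} J/K")
```

The room loses less entropy than the cube gains because the same $Q$ is divided by a larger temperature.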

Bubble
  • 2,070
12

For completeness, an information theoretical answer is needed. Entropy is, after all, defined for arbitrary physical states and does not require a notion of thermal equilibrium, temperature, etc. We need to use the general definition of entropy, which is the amount of information that you lack about the exact physical state of the system given its macroscopic specification.

If you knew everything there is to know about the system, then the entropy would be zero, and it would remain equal to zero at all times. In reality, you will only know a few parameters of the system, and there is then a huge amount of information that you don't know. Now, this still does not explain why the entropy should increase, because the time evolution of an isolated system is unitary (there is a one-to-one map between final and initial states). So, naively, you would expect the entropy to remain constant. To see why this is not (necessarily) the case, let's focus on the free expansion experiment carried out inside a perfectly isolated box. In this thought experiment we make the rather unrealistic assumption that there is no quantum decoherence, so that we don't smuggle in extra randomness from the environment, forcing us to address the problem instead of hiding it.

So, let's assume that before the free expansion the gas can be in one of $N$ states, and we don't know which of the $N$ states the gas actually is in. The entropy is proportional to $\log(N)$, which is proportional to the number of bits you need to specify the number $N$. But this $N$ does not come out of thin air; it is the number of different physical states that we cannot tell apart from what we observe. After the gas has expanded, there are still only $N$ possible final states. However, there is a larger number of states that have the same macroscopic properties as those $N$ states, because the total number of physical states has increased enormously. While the gas cannot actually be in any of these additional states, the macroscopic properties of the gas would be similar. So, given only the macroscopic properties of the gas after the free expansion, there is now a larger number of exact physical states compatible with it, and therefore the entropy will have increased.
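A back-of-the-envelope version of this counting argument can be sketched in a few lines. The model below is my own assumption (not from the answer): each particle independently occupies one of some number of coarse-grained cells, so doubling the volume doubles the cells per particle and multiplies the number of compatible microstates by $2^N$:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

# Toy microstate counting for free expansion into double the volume.
# Assumption: each of N particles independently picks a cell, so
# Omega_final / Omega_initial = 2**N.  We work with ln(Omega) directly,
# since 2**N itself would overflow for macroscopic N.
N = 6.022e23                       # roughly one mole of particles
ln_omega_ratio = N * math.log(2)   # ln(Omega_final / Omega_initial)

dS = K_B * ln_omega_ratio          # Boltzmann: dS = k_B * ln(Omega_f/Omega_i)
print(f"dS = {dS:.2f} J/K")        # about N * k_B * ln 2 for one mole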

Count Iblis
  • 10,396
4

While Bubble gave a nice example, let me try to explain this with the Clausius inequality. (You can read about this in several sources; I like the explanation in Atkins' Physical Chemistry.)

Let's start with the statement
$$ |\delta w_{rev}| \geq |\delta w| $$
For energy leaving the system as work, this can be written as
$$ \delta w - \delta w_{rev} \geq 0 $$
where $\delta w_{rev}$ is the reversible work. The first law states
$$ du = \delta q + \delta w = \delta q_{rev} + \delta w_{rev} $$
since the internal energy $u$ is a state function, so all paths between two states (reversible or irreversible) lead to the same change in $u$. Rearranging the first law and inserting it into the inequality above gives
$$ \delta w - \delta w_{rev} = \delta q_{rev} - \delta q \geq 0 $$
and therefore
$$ \frac{\delta q_{rev}}{T} \geq \frac{\delta q}{T} $$
The change in entropy is defined through the reversible heat,
$$ ds = \frac{\delta q_{rev}}{T} $$
so we can state
$$ ds \geq \frac{\delta q}{T} $$
There are alternative expressions for the latter inequality. We can introduce an "entropy production" term $\sigma$:
$$ ds = \frac{\delta q}{T} + \delta \sigma, \qquad \delta \sigma \geq 0 $$
This production term accounts for all irreversible changes taking place in our system. For an isolated system, where $\delta q = 0$, it follows that
$$ ds \geq 0 \,. $$
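The entropy-production term can be made concrete numerically. A sketch of my own (the setup and all values are assumptions, not from Atkins): two identical blocks at different temperatures exchange heat in small steps inside an isolated enclosure; each step produces entropy $\delta q\,(1/T_{cold} - 1/T_{hot}) \geq 0$, and the accumulated production matches the exact state-function result $\Delta S = C\ln(T_f/T_{hot,0}) + C\ln(T_f/T_{cold,0})$:

```python
import math

# Two identical blocks, heat capacity C each, exchanging heat until
# their temperatures equalise.  All values are illustrative assumptions.
C = 100.0                      # J/K, heat capacity of each block
T_hot, T_cold = 400.0, 200.0   # K, initial temperatures
dq = 0.5                       # J, heat moved per step

sigma_total = 0.0
while T_hot - T_cold > 1e-6:
    step = min(dq, C * (T_hot - T_cold) / 2)  # don't overshoot equilibrium
    sigma_total += step * (1.0 / T_cold - 1.0 / T_hot)  # entropy produced
    T_hot -= step / C
    T_cold += step / C

# Exact result from integrating ds = C dT / T for each block separately:
T_f = 300.0  # final common temperature, (400 + 200) / 2 for equal blocks
dS_exact = C * math.log(T_f / 400.0) + C * math.log(T_f / 200.0)

print(f"sigma (stepwise) = {sigma_total:.3f} J/K")
print(f"dS (exact)       = {dS_exact:.3f} J/K")
```

Every step contributes a non-negative $\delta\sigma$, so the total entropy of the isolated pair can only go up, and it stops changing once $T_{hot} = T_{cold}$.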

g.b.
  • 161
2

We know that $ds_{\rm (universe)}$ is equal to $ds_{\rm (system)} + ds_{\rm (surroundings)}$, and for an isolated system $ds_{\rm (surroundings)} = 0$ because $dq_{\rm (reversible)} = 0$; therefore, for an isolated system, $ds_{\rm (universe)}$ is equal to $ds_{\rm (system)}$.

Now, we know that the spontaneity criterion for any process is $ds_{\rm (universe)} > 0$, or, failing that, it should be at least $0$ at equilibrium.

Therefore, $ds_{\rm (system)} \geq 0$.

299792458
  • 3,214
Surya
  • 21
0

This may directly answer your question.

Note that the requirement for an isolated system is that the net heat exchanged between the initial and final states be zero, not that no heat can be exchanged at all with the outside world along the (comparison) path between them.

Now notice that entropy is defined as $\displaystyle dS=\frac{dQ_{rev}}{T}$, where $\displaystyle dQ_{rev}$ is the reversible heat exchanged with the outside. We must specify a reversible path between the initial and final states of the system to calculate this quantity. If the heat is reversible in this path, it can be exchanged back to the system, and so in general $\displaystyle dQ_{rev}$ is not zero for each step in the reversible path.

But hold on, doesn't the total $\displaystyle dQ_{rev}$ still need to be zero for the system to be isolated, as you said in the beginning? Yes, but that doesn't mean that the integral of $\displaystyle dS=\frac{dQ_{rev}}{T}$ needs to be zero. This means that the total entropy change can still sum to a nonzero value even though the total heat can't.
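A standard concrete case of this is the free expansion of an ideal gas: the actual process exchanges no heat at all, yet the reversible isothermal comparison path between the same two states does, and that path is what the entropy integral uses. A small sketch with made-up values (1 mol doubling its volume at 300 K):

```python
import math

R = 8.314           # J/(mol K), molar gas constant
n = 1.0             # mol, illustrative amount of gas
T = 300.0           # K; free expansion of an ideal gas leaves T unchanged
V1, V2 = 1.0, 2.0   # initial/final volumes (only the ratio matters)

# Along the reversible isothermal comparison path the gas DOES absorb heat:
q_rev = n * R * T * math.log(V2 / V1)   # nonzero, even though actual Q = 0

# The entropy change is computed from the reversible heat:
dS = q_rev / T                          # = n R ln(V2/V1) > 0

print(f"q_rev = {q_rev:.1f} J")
print(f"dS    = {dS:.3f} J/K")
```

The actual (irreversible) free expansion has $Q = 0$, consistent with isolation, but $\Delta S$ is evaluated on the reversible path and comes out positive.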

Just for completeness, note that $\displaystyle dQ_{irrev}$, unlike $\displaystyle dQ_{rev}$, needs to be zero at all points along the actual path, because it is heat that cannot be recovered on the way between the initial and final states and would therefore violate our requirement for an isolated system.

Cr0xx
  • 41
0

First, we note that in an isolated system, entropy increases as long as the system is not in equilibrium. It reaches its maximum and stops changing when the system reaches equilibrium. So entropy can be equivalently described as the degree of uniformity of a system.

Second, note that entropy is essentially a non-dimensional quantity despite the use of J/K units. That is clear from the definition $dS = dQ/T$: both terms on the right can be expressed as heat energy or as temperature, because the two are connected by a constant, since temperature is a measure of the average energy in a region (as in the average temperature of a block of ice inside an insulated room). Thus we can say that entropy is a non-dimensional number that describes the statistical distribution of temperature in the system, or the configuration of the temperature distribution.

If we have a room full of people and money is held by only a few, the uniformity/entropy is low and work can easily be done. If the money is distributed equally, the uniformity/entropy is high and there is less need to do work.

In statistics, a distribution is described by summing the deviations from the mean (ignoring the sign), i.e. the absolute deviation, and then dividing by the actual mean. This is essentially what we do when we write $S = \operatorname{mean}(\sum |dQ|)/T$ (with $T$ positive and constant). In the usual standard-deviation calculation we sum the squares and then take the square root, so that only positive contributions enter the sum. Thus entropy is not a quantity like heat, for example, but rather a descriptor of the shape of a distribution.

Riad
  • 583