
I am currently doing some research on entropy and I am trying to get my head around the concept.

One thing that I am getting right now is that entropy is just an application of probability and statistics, right? So it's not a force like gravity, but simply a result of a mathematical rule: because there are so many more ways for a system to be arranged in some states than in others, the system will move towards the state where it is the most "spread around". Am I getting this right?

What then would be some physical manifestations of entropy? (I've seen the examples of an ice cube and a cup of hot tea, but I don't quite see how these two relate to entropy. Could you tell me how?)

Thank you!

2 Answers


Edit: Looks like Steeven beat me to it with a similar explanation, but I'll leave mine here for posterity.

Entropy is just a property of a system in a given state; in other words, it is a state function.

I hope that by elaborating on the statistical interpretation of entropy you will gain some intuitive notion of the meaning of entropy. Consider the following example:

Suppose that I flip three coins. There are eight possible outcomes for the faces of the coins (HHH, HHT, etc.). If the coins are fair, each outcome is equally probable, so the probability of getting three heads or three tails is one in eight. But there are three different ways of getting two heads and a tail, so the probability of getting exactly two heads is 3/8, as is the probability of getting exactly one head and two tails.

In the language of statistical mechanics, each outcome, or way of arranging the coins, is known as a microstate. On the other hand, the result that is observed (e.g. two heads and a tail) is known as a macrostate. Since there are three ways in which the macrostate with two heads and one tail can occur, we say that the macrostate has a multiplicity of 3. (The clause about equally probable outcomes is actually known as the fundamental assumption of statistical mechanics.)
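If it helps to see the counting made explicit, here is a minimal Python sketch (my own illustration, not part of the original argument) that enumerates the eight microstates of three fair coins and tallies the multiplicity of each macrostate:

```python
from itertools import product
from collections import Counter

# Enumerate all 2^3 equally probable microstates of three fair coins.
microstates = list(product("HT", repeat=3))

# The macrostate is the number of heads; its multiplicity g is the
# number of microstates that produce it.
multiplicity = Counter(state.count("H") for state in microstates)

for heads in sorted(multiplicity):
    g = multiplicity[heads]
    print(f"{heads} heads: g = {g}, probability = {g}/{len(microstates)}")
```

This prints $g = 1, 3, 3, 1$ for zero through three heads, matching the probabilities 1/8, 3/8, 3/8, 1/8 described above.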

Now, about entropy. In statistical mechanics, entropy is defined as $$S\equiv k_B\ln g.$$ In this equation, $S$ is the entropy of the system in a given state and $g$ is the multiplicity of that state (Boltzmann's constant $k_B$ is present mainly for historical reasons). Typically $g$ is very large, so it is convenient to deal with its logarithm. From this definition it is clear that entropy is just a measure of the number of ways in which the system may be arranged.
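As a rough sense of scale (my own worked numbers): for the two-heads macrostate above, $g = 3$, so $$S = k_B\ln 3 \approx (1.38\times 10^{-23}\ \mathrm{J/K})(1.10)\approx 1.5\times 10^{-23}\ \mathrm{J/K},$$ a vanishingly small value. Entropy only becomes thermodynamically significant because the multiplicities of macroscopic systems are astronomically large.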

This gives rise to a natural explanation of the second law of thermodynamics:

Any large system in equilibrium will be found in the macrostate with the greatest multiplicity (aside from fluctuations that are normally too small to measure). This is the state of maximum entropy.

sxwzd

Entropy is not a force, no. It is a "chaos factor", if you will. The more entropy, the less structured a system is.

A system will always move towards the state that is most probable. The most probable state (macro-state) will be the state with the most micro-state configurations.

Consider as an example four coins, each landing heads (H) or tails (T). Flip them and you can get one of the following five outcomes:

1: HHHH
2: HTHH, HHHT, HHTH, THHH
3: HHTT, HTHT, HTTH, THTH, TTHH, THHT
4: HTTT, THTT, TTTH, TTHT
5: TTTT

The five different results correspond to five different macro-states. The sixteen individual micro-states are all equally likely, but the macro-states are not: each is as likely as the number of micro-states it contains.

Macro-state 3 is the most probable since it corresponds to the most micro-state configurations. So, if your outcome is macro-state 3, and you flipped the coins without keeping track of each of them, then you know the least about the micro-state configuration, because there are six different possibilities for which coins gave which result. If the outcome is macro-state 1, then you know everything about the micro-state configuration, because there is only one possibility.

In macro-state 3 we have more chaos, i.e. higher entropy. If you flip the coins 100 times, it is highly likely that you reach macro-state 3 in most of the trials. This tells us that the system tends to end up in the macro-state where the entropy (the chaos) is highest, because that is the same as the macro-state with the most possible micro-states, and therefore the macro-state that encompasses most of the outcomes.
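Here is a small simulation sketch of my own (Python; I use 100,000 trials rather than 100 so the frequencies come out smooth) that flips four fair coins repeatedly and tallies how often each macro-state occurs:

```python
import random
from collections import Counter

trials = 100_000

# For each trial, flip four fair coins and record the number of tails (0-4).
counts = Counter(
    sum(random.randint(0, 1) for _ in range(4))
    for _ in range(trials)
)

# Exact probabilities are 1/16, 4/16, 6/16, 4/16, 1/16 for 0..4 tails.
for tails in range(5):
    print(f"macro-state {tails + 1} ({tails} tails): {counts[tails] / trials:.3f}")
```

Macro-state 3 should come out near $6/16 = 0.375$, roughly six times as frequent as macro-states 1 and 5.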

> I've seen the example of an ice cube or a cup of hot tea, but I don't quite see how these two relate to entropy

An ice cube's structure is more "ordered" the colder it is, and less ordered as it melts, simply because the hotter it gets, and the more loosely its atoms and particles are bound, the less you know about their positions within the structure.

If an ice cube melts, it has gained entropy. The room around it, which supplied the heat, has lost entropy. The entropy change of either the ice cube or the room can be calculated from the heat energy added and the temperature:

$$\Delta S=\int \frac{\mathrm dQ}{T}.$$
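For an illustrative number of my own (not from the original answer): melting happens at a constant $T \approx 273\ \mathrm{K}$, so the integral reduces to $\Delta S = Q/T$. Melting $10\ \mathrm{g}$ of ice, with a latent heat of about $334\ \mathrm{J/g}$, gives $$\Delta S_{ice} = \frac{3340\ \mathrm{J}}{273\ \mathrm{K}} \approx +12.2\ \mathrm{J/K}.$$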

The total entropy change $\Delta S = \Delta S_{object}+\Delta S_{surroundings}$ turns out never to be negative. This is the 2nd law of thermodynamics: the total entropy change is always positive (or zero for a perfectly reversible process) in any process that happens.

The cup of hot tea is the other way around. The tea loses heat, and the surroundings gain that same heat. Entropy is lowered in the tea and raised in the surroundings, but the total change is still positive, because the same heat is divided by a lower temperature for the surroundings than for the hot tea.
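A made-up but representative calculation: suppose $Q = 1000\ \mathrm{J}$ of heat flows from tea at $350\ \mathrm{K}$ into a room at $293\ \mathrm{K}$. Then $$\Delta S_{tea} = \frac{-1000\ \mathrm{J}}{350\ \mathrm{K}} \approx -2.86\ \mathrm{J/K},\qquad \Delta S_{room} = \frac{+1000\ \mathrm{J}}{293\ \mathrm{K}} \approx +3.41\ \mathrm{J/K},$$ so the total change is about $+0.56\ \mathrm{J/K} > 0$, consistent with the 2nd law.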

Steeven