
Truesdell explains in his essay of 1966, “Method and taste in natural philosophy” (C. Truesdell, Six Lectures on Modern Natural Philosophy, Springer-Verlag, Berlin, 1966.):

As mechanics is the science of motions and forces, so thermodynamics is the science of forces and entropy. What is entropy? Heads have split for a century trying to define entropy in terms of other things. Entropy, like force, is an undefined object, and if you try to define it, you will suffer the same fate as the force-definers of the seventeenth and eighteenth centuries: Either you will get something too special or you will run around in a circle.

I cannot find his book for less than about $90, so I was wondering whether anyone could explain a little further why he thinks that.

Jerome

2 Answers


You are probably familiar with the way mathematical theories are formally constructed: we have a collection of mathematical objects or notions that have particular properties and are related in particular ways. Some of these notions are taken as primitives: that is, we don't define them, but only give a list of their properties. For example, the notion of "point" in geometry or the notion of "set" in set theory. Other notions are then defined in terms of the primitive ones. For example, the notion of "circle" in Euclidean geometry, defined as a set of points satisfying a particular property. This process is called the "axiomatization" of a theory.

Let's stress two points about axiomatizations.

First, we often have several different choices of primitive notions. So we can have an axiomatization that uses notions $A$ and $B$ as primitives and notion $C$ as a defined one, and an axiomatization that uses notions $A$ and $C$ as primitives and $B$ as defined. The choice between different axiomatizations is a matter of mathematical economy and subjective taste.

Second, just because one notion is taken as primitive and another as defined, it doesn't mean that the primitive one is more intuitive than the defined one. In fact, sometimes primitive notions are less intuitive than defined ones. They are nevertheless chosen as primitives for reasons of economy – for example, they lead to theorems that have more compact statements or easier proofs, as compared with equivalent theorems stated in terms of another set of primitives.

Now, Truesdell's remarks largely concern the axiomatization of a physical theory – which is also a mathematical theory. We can choose a set of physical notions as primitives, in the mathematical sense, and others as defined in terms of the primitives. The choice is again a matter of mathematical economy, subjective taste, and sometimes pedagogical value. See the examples in Suppes's book (1957).

The fact that a physical notion is chosen as primitive doesn't mean that it's historically older or that it's easier to grasp with intuition or experiment.

Experimentally, the second law concerns cyclic processes and exchanges of work and heat. We could use these notions as primitives and do without speaking about entropy. The notion of entropy appears as a sort of convenient book-keeping device when we do not want to consider cyclic processes (see the discussion in Ricou 1986).
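To make the book-keeping remark concrete, here is the standard Clausius construction (stated for orientation; it is not specific to Ricou's paper). For cyclic processes the second law can be expressed as
$$\oint \frac{\delta Q}{T} \le 0,$$
with equality for reversible cycles. The equality makes $\int \delta Q_{\mathrm{rev}}/T$ path-independent, so one may introduce a state function
$$S(B) - S(A) = \int_A^B \frac{\delta Q_{\mathrm{rev}}}{T},$$
defined up to an additive constant fixed by the reference state. This function lets us compare states $A$ and $B$ directly, without mentioning the cycle that connects them.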

As it turns out, if we use it as a primitive concept (again, in the mathematical axiomatization sense) we obtain a mathematical system that's more compact and doesn't force us to consider cyclic processes. But, as Truesdell remarks and we all know from our studies in school, entropy is much less intuitive than heat, work, and cyclic processes – and these have their own intuitive difficulties already!

In fact, entropy cannot even be measured in general. And, contrary to what many basic thermodynamics courses say, it isn't true that we can always define it up to a constant term. It turns out that there are many entropy functions that are equivalent in describing the thermodynamics of particular systems, and they don't differ by only a constant term. This was pointed out by Day (1977, 1988) and is discussed with examples in Owen's very mathematical textbook (1984), and very neatly in Samohýl & Pekař's book (2014). Only for some systems and processes do we have an entropy function defined up to a constant term.

There are some works that try to avoid entropy as a primitive and instead derive it as a defined notion; see e.g. Day (1968), Coleman & Owen (1974), and especially Ricou (1986), the discussion by Serrin (1986), and other contributions to the book edited by Serrin (1986b).

I think that Truesdell's quote must, at least partially, be understood in the axiomatization sense above. Entropy is not an intuitive notion, as he himself says in the quote from hyportnex's answer. But it can be a convenient primitive. When we try to define it, e.g. in microscopic terms, we usually obtain something that can't cover all of its uses in continuum mechanics, or something that turns out to implicitly assume a notion similar to that of entropy, so that the definition is circular.

There is research going on about this, and maybe things will change one day. Jaynes's (1979, 1985) generalized notion of entropy, for example, has many interesting properties shared by the entropy functions used in more general continuum thermomechanics.

References

  • Coleman, B. D., D. R. Owen (1974): A mathematical foundation for thermodynamics https://doi.org/10.1007/BF00251256

  • Day, W. A. (1968): Thermodynamics based on a work axiom https://doi.org/10.1007/BF00251512

  • Day, W. A. (1977): An objection to using entropy as a primitive concept in continuum thermodynamics https://doi.org/10.1007/BF01180089

  • Day, W. A. (1988): A Commentary on Thermodynamics (Springer).

  • Jaynes, E. T. (1979): Where do we Stand on Maximum Entropy? http://bayes.wustl.edu/etj/articles/stand.on.entropy.pdf

  • Jaynes, E. T. (1985): Macroscopic prediction http://bayes.wustl.edu/etj/articles/macroscopic.prediction.pdf

  • Owen, D. R. (1984): A First Course in the Mathematical Foundations of Thermodynamics (Springer).

  • Ricou, M. (1986): The laws of thermodynamics for non-cyclic processes, in Serrin (1986b) pp. 85–97.

  • Samohýl, I., M. Pekař (2014): The Thermodynamics of Linear Fluids and Fluid Mixtures (Springer), especially chap. 2.

  • Serrin, J. (1986): An outline of thermodynamical structure, in Serrin (1986b) pp. 3–32.

  • Serrin, J. (ed.) (1986b): New Perspectives in Thermodynamics (Springer).

  • Suppes, P. (1957): Introduction to Logic (Van Nostrand), especially chap. 12.

pglpm

The quoted text below is from the same Truesdell book and follows your quote immediately; it might help illuminate its meaning. I just wish that, when I first heard of the subject many years ago, I had been taught this together with Jaynes's (or Wigner's) idea of the anthropomorphic nature of entropy...

"The history of thermostatics has put a block between us and the understanding of entropy by denying that a substance called "heat" exists. If we were to start afresh, free of the bad associations originating in word-fights a century ago and devoutly embalmed in the physics texts today, we might call "heat" what now is called "entropy", thus bringing the concept as close to ordinary experience as are "force" and "mass". To say that heat is what goes into a body when it gets hotter or is deformed and usually goes out when it gets colder, and that the mean heat increases at least as fast as the mean increase of energy divided by temperature, seems to me as good as the explanation of force as being a push or pull in the amount of the mass times the acceleration produced, or as the explanation of mass as being the quantity of matter in a body.

Just as force can be defined kinematically in certain special and common systems, such as a pendulum, entropy can be, and is, defined thermally for certain special systems, such as a perfect gas. Just as force can never be measured directly — for all measurements of force presume some special mechanical constitutive equation such as that of the rigid body or the linear spring or the gravitational field — so entropy cannot be measured except indirectly through use of special thermal constitutive equations. While we can never define entropy except in cases where we do not need it, and while we cannot measure it directly, we can, in time, get used to it, as we have, despite the opposition of some of the greatest of all scientists, gotten used to an a priori concept of force. We are not yet sufficiently used to entropy to state, with any measure of agreement, what axioms it obeys."

hyportnex