According to reference [1] in Does the act of storing information (not its erasure) locally increase entropy in Maxwell's demon's memory?, information-theoretic entropy and thermodynamic entropy may not be the same. My question comes in 2 parts: i) If they are not the same, then which is more fundamental and ii) must both still be attributed to an observable?
3 Answers
On the one hand, thermodynamic entropy is just a special kind of information entropy, so the latter is more fundamental.
On the other hand, informational entropy is generally subjective and depends on the observer, whereas thermodynamic entropy is the same for all observers (up to a shift by a constant, which is not significant); it is therefore objective, and because of that it plays a fundamental role in physics.
Note that, in principle, one can imagine an observer for whom thermodynamic entropy is meaningless, but such an observer would have to be nonlocal, and the dynamical laws of physics apparently do not allow such observers to exist.
You can find some details in "The Physical Basis of The Direction of Time" by Zeh.
"More fundamental" is not a clear and unambiguous concept. If "concept A is more fundamental than concept B" means that B can be deduced from A but not vice versa, it may seem that information-theoretic entropy is more fundamental than thermodynamic entropy. On closer scrutiny, however, I think this point of view turns out to be inconsistent with the facts.
Indeed, the "derivation" of Thermodynamics from Information Theory assumes the concepts and relations of Thermodynamics, and shows that the statistical-mechanical formulae for the fundamental equations of Thermodynamics can be derived from Information Theory. However, physical theories are more than a set of symbols: they must include interpretation rules for the formalism. It is enough to think of the concept of heat. How would it be defined within Information Theory without using Thermodynamics?
Therefore, I wouldn't speak of more fundamental. I would rather say that they are two different entropies, with different scopes. In some cases, Information Theory may provide results consistent with thermodynamics, provided the interpretation rules of Thermodynamics have been used.
A different but related observation is that giving the same name, entropy, to two different entities in Thermodynamics and Information Theory does not necessarily imply a hierarchical relation between them.
About your second question, the answer is negative. There are cases where only one of them can be meaningful for a physical system. I'll try to explain the previous statement with a couple of examples:
- Shannon's formula only requires assigning a probability to each microstate of a system. It is therefore meaningful even for a non-equilibrium probability distribution, whereas thermodynamic entropy is defined only for equilibrium systems.
- Thermodynamic systems are physical systems with an underlying dynamics controlling the evolution of the microscopic states. Information entropy, by contrast, can be assigned even to static (non-evolving) systems.
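To illustrate the first point above, here is a minimal sketch (in Python, with a hypothetical helper name `shannon_entropy`) of how Shannon's formula assigns an entropy to any probability distribution over microstates, equilibrium or not:

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum_i p_i log(p_i), in units set by `base`
    (base 2 -> shannons/bits, base e -> nats)."""
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    # Terms with p = 0 contribute nothing (p log p -> 0).
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform (equilibrium-like) distribution over 4 microstates: H = 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A non-equilibrium distribution is just as acceptable to the formula:
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))
```

Nothing in the computation refers to dynamics or equilibrium; only the probabilities matter, which is exactly why the informational entropy has a wider scope.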
The relationship between information and thermodynamic entropy is that they are conceptually distinct but equivalent, in a sense similar to how mass and energy are: there is a one-to-one proportionality between the two, but their semantic meanings are quite different. This can be summarized in the equation
$$\Delta S = [k_B \ln(2)]\ \Delta H$$
(if one measures $H$ in shannons instead of nats). Most importantly, this pertains to changes in the entropy, because there may be an arbitrary constant shift related to the amount of information already held by the agent to whom the informational entropy is ascribed.
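As a sketch of what the conversion factor amounts to numerically (the function name `delta_S` is my own, not from the answer):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)

def delta_S(delta_H_shannons):
    """Thermodynamic entropy change Delta S = k_B ln(2) * Delta H (J/K),
    for Delta H measured in shannons (bits)."""
    return K_B * math.log(2) * delta_H_shannons

# One bit of informational entropy corresponds to
# k_B ln 2, roughly 1e-23 J/K of thermodynamic entropy.
print(delta_S(1))
```

The tiny size of $k_B \ln 2$ is why informational book-keeping is thermodynamically invisible for everyday amounts of data, even though the proportionality is exact.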
Conceptually, the difference is that thermodynamic entropy applies to a physical system, whereas informational entropy applies to a message, which is more abstract, or better, to the relationship between a message and a receiving agent. Because they apply to different things, it is hard to call one more fundamental than the other; if anything, we might call thermal entropy the "more fundamental" one from a physical point of view, because it is a direct physical property of a system. The equivalence arises when we consider that the "message" in question is the microscopic description of the physical system, that is, the message that completely specifies the system's microstate.
The objections raised by other posters here concern agents learning or acquiring information; those kinds of changes are not the ones to which the above equivalence refers. It refers to how the agent's knowledge, without acquiring new information, becomes "dated" over time as the system undergoes physical evolution according to the laws of physics. If the agent learns new information, meaning it refines its description of the macrostate to be less "macro" and more "micro", then $H$ jumps in a way inconsistent with $S$, because $S$ is based on a fixed, standard macrostate (e.g. energy and volume, for a gas). Another way of looking at this is that when the system and agent interact in the measurement, the isolation on which the attributions of $S$ and $H$ depend breaks down. In such situations one should instead ask about the $S$ and $H$ attributed to the composite system/agent pair by a second agent, if one really wants to discuss the equivalence.
(P.S. the usage of $H$ here should not be confused with enthalpy. That is an unfortunate notational collision in this context but hey - the pigeonhole principle after all ...)