
Is there a relation between information and entropy (as defined by Shannon) and entropy in physics, other than that they have similar formulas?

For example: Is information physical? Is it conserved for a closed system?

When Entropy increases in a closed system, is information lost (in some sense)? Can we answer as to where that information went (if it was ever there)?

PhyEnthusiast

3 Answers


Shannon entropy is a generalization of thermodynamic entropy: the Gibbs entropy of statistical mechanics is Shannon's formula applied to the probabilities of microstates, multiplied by Boltzmann's constant.
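To make the connection concrete, here is a minimal sketch of Shannon's formula $H = -\sum_i p_i \log p_i$ (the function name and examples are mine):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p_i * log(p_i)), in bits by default."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform distribution over 4 outcomes: maximum uncertainty -> 2 bits
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0

# A certain outcome carries no uncertainty -> 0 bits
print(shannon_entropy([1.0, 0.0, 0.0, 0.0]))  # 0.0
```

Replacing $\log_2$ with $\ln$ and multiplying by $k_B$ turns this into the Gibbs entropy over microstate probabilities.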

Yes information is physical.

If you have all the information about a system, you can use it to extract energy and do work by manipulating the system's particles out of thermal equilibrium. But acquiring that information requires computation, which consumes energy and increases entropy, so the second law can't be evaded.

Suppose a system in thermal equilibrium is divided into two parts by a door, and you have complete information about the particles (this is Maxwell's demon thought experiment). Each time a fast particle approaches the door you let it into one side, and each time a slow one approaches you let it into the other. One side becomes hotter, and if you then remove the door you can extract energy from the resulting heat flow. But obtaining that information requires a great deal of computation, and hence a great deal of energy.
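The sorting step can be sketched in a toy simulation. This is not a real kinetic model; the speed distribution and threshold are arbitrary choices of mine, just to show that sorting by speed separates mean kinetic energies:

```python
import random

random.seed(0)

# Start in "equilibrium": all particles drawn from one speed distribution.
N = 10000
threshold = 1.0  # speed cutoff the demon uses (arbitrary units)

# Illustrative speed distribution (not Maxwell-Boltzmann)
speeds = [random.expovariate(1.0) for _ in range(N)]

# The demon lets fast particles into chamber A and slow ones into chamber B.
chamber_a = [v for v in speeds if v > threshold]
chamber_b = [v for v in speeds if v <= threshold]

# Mean squared speed as a stand-in for temperature
mean_a = sum(v * v for v in chamber_a) / len(chamber_a)
mean_b = sum(v * v for v in chamber_b) / len(chamber_b)

print(f"mean v^2, fast side: {mean_a:.2f}")
print(f"mean v^2, slow side: {mean_b:.2f}")
```

The fast side ends up "hotter", but only because the demon used one bit of information per particle; accounting for the cost of acquiring and erasing those bits is what rescues the second law.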

Kugutsu-o

Yes, information is physical.

Yes, information is conserved. But you have to remember that all information is conserved, not just the tiny part known to you. Microscopic information tends to be very sensitive and fragile.

Yes, entropy relates to information, though not directly in physics, because physical information is quite difficult to extract at the microscopic level. We cannot observe single atoms directly yet, because they are smaller than X-ray resolution. We can only observe disturbances in quantum-mechanical tunneling currents, which is an indirect approach.

Also, molecular physics is governed by chaotic motion, which makes observation of single atoms even more difficult. Add the mathematical chaos that is always present, and you will see that retrieving full microscopic information is still an impossible task even today. On top of that, quantum-mechanical chaos breaks the link between you as observer and the system you observe.

sanaris

Is there a relation between Information and Entropy (as defined by Shannon) and Entropy in Physics?

Yes, as far as I know, the relation is given by Landauer's principle, which states that erasing information necessarily dissipates energy: erasing one bit of information releases at least $kT \ln 2$ of heat (equivalently, it creates $k \ln 2$ of entropy). This establishes the equivalence of logical irreversibility and thermodynamic irreversibility.
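Plugging in numbers makes the scale clear. A minimal calculation of the Landauer bound at room temperature (temperature choice is mine):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact in the 2019 SI)
T = 300.0           # room temperature, K

# Landauer bound: minimum heat dissipated when erasing one bit
E_min = k_B * T * math.log(2)
print(f"Minimum energy to erase one bit at {T:.0f} K: {E_min:.3e} J")  # ~2.871e-21 J
```

This is tiny compared with what real hardware dissipates per bit, which is why the bound is of conceptual rather than engineering importance today.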

(taken from: The physics of forgetting: Thermodynamics of Information at IBM 1959–1982 by Aaron Sidney Wright https://www.mitpressjournals.org/doi/pdfplus/10.1162/POSC_a_00194)

Is information physical?

Yes, information is physical. As far as I am aware, Swanson asserted that information can be defined per unit volume rather than per symbol as in Shannon's information theory (sic).

(taken from the same Wright article cited above)

Is it (information) conserved for a closed system?

Yes. You can see this previous post (Where does deleted information go?) for more details about it.

When Entropy increases in a closed system, is information lost (in some sense)?

Yes, that's because of Landauer's principle: 'erasing a bit of information creates $kT \ln 2$ entropy.'

Can we answer as to where that information went (if it was ever there)?

Yes, we can. You can find the answer to that question in the post Where does deleted information go?

Peter