Questions tagged [entropy]

An important extensive property of systems in thermodynamics, statistical mechanics, and information theory that quantifies their disorder (randomness), i.e., our lack of information about them. It characterizes the degree to which the energy of a system is unavailable to do useful work.

Entropy is defined to be extensive (that is, additive when two systems are considered together).

In statistical mechanics, the entropy is identified with Boltzmann's constant times the logarithm of the number of microscopic states (microstates) compatible with a given set of macroscopic properties.
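
For reference (a standard textbook statement, using $k_\mathrm{B}$ for Boltzmann's constant and $\Omega$ for the number of accessible microstates), the two definitions invoked most often in the questions below are the statistical (Boltzmann) entropy and the Clausius (thermodynamic) entropy,

$$S = k_\mathrm{B}\ln\Omega, \qquad \mathrm{d}S = \frac{\delta Q_\mathrm{rev}}{T},$$

where $\delta Q_\mathrm{rev}$ is heat exchanged reversibly at absolute temperature $T$.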

3145 questions
119 votes, 2 answers

Is it necessary to consume energy to perform computation?

As far as I know, most computers today are made from semiconductor devices, so all of the energy consumed eventually turns into heat emitted into space. But I wonder: is it necessary to consume energy to perform computation? If so, is there a…
jiakai • 1,243
109 votes, 10 answers

What is time, does it flow, and if so what defines its direction?

This is an attempt to gather the various questions about time that have been asked on this site and provide a single set of hopefully authoritative answers. Specifically, we attempt to address issues such as: What do physicists mean by…
98 votes, 3 answers

Where is the flaw in this machine that decreases the entropy of a closed system?

I was thinking about a completely unrelated problem (Quantum Field Theory Peskin & Schroeder kind of unrelated!) when the diagram below sprang into my mind for no apparent reason. After some thinking, I can't figure out why it wouldn't work, other…
QuantumFool • 1,274
89 votes, 8 answers

Why is information indestructible?

I really can't understand what Leonard Susskind means when he says in the video Leonard Susskind on The World As Hologram that information is indestructible. Is information that is lost through the increase of entropy really recoverable? He…
79 votes, 6 answers

What is information?

We're all familiar with basic tenets such as "information cannot be transmitted faster than light" and ideas such as information conservation in scenarios like Hawking radiation (and in general, obviously). The Holographic Principle says, loosely,…
Mitchell • 1,025
65 votes, 5 answers

The Sun is giving us a low entropy, not energy

While I was watching a popular science lecture on YouTube, I came across the sentence "Sun is giving us a low entropy, not energy," said by Prof. Krzysztof Meissner. I am not a physicist, but this sounds to me like a huge leap. I would…
janusz • 1,013
61 votes, 10 answers

How do different definitions of entropy connect with each other?

I have tried to understand entropy from many places on the Internet. Many definitions are presented, among which I can formulate three (please correct me if any definition is wrong): Entropy = disorder, and systems tend to the most possible…
60 votes, 11 answers

Is a spontaneous decrease in entropy *impossible* or just extremely unlikely?

I was reading this article from Ethan Siegel and I have some doubts about a sentence concerning entropy, specifically where Ethan explains the irreversibility of the conditions of the hot-and-cold room, as in this figure: In his words: It's like taking a…
Andy4983948 • 805
59 votes, 2 answers

Does entropy depend on the observer?

Entropy, as it is explained on this site, is a Lorentz invariant. But we can define it as a measure of the information hidden from an observer in a physical system. In that sense, is entropy a relative quantity depending on the computation, measurement…
59 votes, 7 answers

Is there any proof for the 2nd law of thermodynamics?

Are there any analytical proofs for the 2nd law of thermodynamics? Or is it based entirely on empirical evidence?
AIB • 1,394
55 votes, 4 answers

Maximum theoretical data density

Our ability to store data on or in physical media continues to grow, with the maximum amount of data you can store in a given volume increasing exponentially from year to year. Storage devices continue to get smaller and their capacity gets bigger.…
49 votes, 14 answers

What is entropy really?

On this site, change in entropy is defined as the amount of energy dispersed divided by the absolute temperature. But I want to know: What is the definition of entropy? Here, entropy is defined as the heat capacity averaged over the specific…
user36790
47 votes, 5 answers

Where does deleted information go?

I've heard that, in classical and quantum mechanics, the law of conservation of information holds. I always wonder where the files and folders I delete on my computer have gone. They must be somewhere, I think. Can anyone in principle recover them even if…
46 votes, 7 answers

How is $\frac{dQ}{T}$ a measure of the randomness of a system?

I am studying entropy and it's hard for me to grasp what exactly entropy is. Many articles and books write that entropy is a measure of the randomness or disorder of a system. They say that when a gas is allowed to expand, the randomness increases, etc.…
pranphy • 734
45 votes, 3 answers

Why isn't the entropy of a gas infinite if there are infinitely many microstates available?

How could it be that the entropy of a gas is not infinite? If we use the formula $S=k\ln(\Omega)$ we get infinity, because there are infinitely many possible situations (because there are infinitely many possibilities of position and momentum for each…
Jacob • 1,563