34

Computers generate heat when they work. Is it a result of information processing or friction (resistance)? Are these just different ways to describe the same thing? Or does some definite part of the heat "come from each explanation"?

I often read that it's a necessary byproduct of information processing. There are irreversible operations, such as AND gates, that discard information, and the discarded information ends up as heat.

But so many other things generate heat as well! A light bulb, electric hotplates, gears, etc. (These probably don't process information the way a computer does, but I may be wrong from a physical perspective.) I had always assumed the computer is like this as well: it basically has small wires in the processor, and their resistance could explain the heat.

Maybe these are parallel explanations. The information processing aspect may say that there has to be some heat as byproduct in some way in any realization of an abstract computer, and the friction aspect could then describe how this actually happens in this concrete wires-and-transistors-type physical implementation of the abstract computer.

But maybe the two explanations account for separate amounts of the heat. Or maybe one accounts for a subset of the other, again in a partially parallel explanation way.

Can someone clarify?

isarandi
  • 915
  • 2
  • 8
  • 14

4 Answers

35

Landauer's principle (original paper pdf | doi) expresses a non-zero lower bound on the amount of heat that must be generated by computers.

However, this entropy-necessitated heat is dwarfed by the heat generated through ordinary electrical resistance of the circuitry (the same reason light bulbs give off heat).
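To make that gap concrete, here is a quick back-of-the-envelope comparison in Python. The Landauer bound follows from k_B·T·ln 2; the per-switch CMOS energy is an illustrative order-of-magnitude assumption, not a measured figure:

```python
import math

# Landauer limit: minimum heat to erase one bit at temperature T
k_B = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0              # room temperature, K
landauer_per_bit = k_B * T * math.log(2)   # ~2.9e-21 J

# Illustrative energy dissipated per logic switch in a modern CMOS
# chip (assumed order of magnitude: around a femtojoule)
cmos_per_switch = 1e-15  # J

print(f"Landauer bound per bit: {landauer_per_bit:.2e} J")
print(f"Assumed CMOS switch:    {cmos_per_switch:.2e} J")
print(f"Ratio: roughly {cmos_per_switch / landauer_per_bit:.0f}x")
```

Under these assumptions, real circuitry dissipates on the order of 10^5 times more heat per operation than thermodynamics requires.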

lemon
  • 13,320
6

Computers manipulate internally stored values "0" and "1", represented as different voltages. Every change 0-to-1 and 1-to-0 involves an electric current I passing through a circuit resistance R, which gives rise to ohmic or "Joule" heating.
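A rough sketch of this mechanism is the standard CMOS dynamic-power model: each switching event charges or discharges a node capacitance C to voltage V, dissipating roughly P = α·C·V²·f as heat in the resistive wiring at clock frequency f. All numeric values below are illustrative assumptions:

```python
# Dynamic (switching) power of one CMOS node: P = alpha * C * V^2 * f
# All values are illustrative assumptions, not measurements.
alpha = 0.1      # activity factor (fraction of cycles the node switches)
C = 1e-15        # node capacitance, farads (~1 fF)
V = 1.0          # supply voltage, volts
f = 3e9          # clock frequency, Hz (3 GHz)

P_node = alpha * C * V**2 * f   # watts dissipated as heat in this node
print(f"Heat per node: {P_node:.1e} W")

# A chip with ~1e8 such nodes would then dissipate tens of watts:
print(f"Chip estimate: {P_node * 1e8:.0f} W")
```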

1

The heat generated in a computer has nothing to do with the reversibility condition in Landauer's principle. Computations can be carried out reversibly, if required. What cannot be made reversible is the RESET of the computer.

The first time we turn the machine on, the memory is in a random state, and it takes energy and entropy to turn that random state into a well-defined initial state, without which computations cannot be carried out.

The same is true for the state of the program memory, without which the computer's output is utterly meaningless: writing the program would, as one would naively expect, take a non-zero amount of energy and entropy. As has been pointed out, technological implementations are many orders of magnitude away from these limits (which, by the way, are far, far lower than the power demands of the human brain).
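For scale, the thermodynamic cost of the RESET itself is minuscule. Erasing N random bits costs at least N·k_B·T·ln 2; here is a rough sketch for resetting a gigabyte of memory at room temperature:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K
T = 300.0            # room temperature, K
bits = 8e9           # 1 GB of memory = 8e9 bits

# Landauer cost of resetting every bit from a random state
E_reset = bits * k_B * T * math.log(2)
print(f"Minimum heat to reset 1 GB: {E_reset:.1e} J")
```

That comes out to roughly 10^-11 joules, utterly negligible next to the resistive heat a real machine produces during boot.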

CuriousOne
  • 16,486
0

There are two main components to the heat produced by a computer. The first is the "steady state" power usage: the power drawn even when the computer is doing nothing. The second component is proportional to the amount of computing being done. The more computations per second, the more power is required, thereby generating additional heat.

Guill
  • 2,573