I've been using a Bitcoin miner as a heater in the winter. I figure if I'm going to use resistive heat, I might as well generate some possibly-valuable wooden nickels as a side effect of the conversion from electrical energy to heat.
But I'm curious: does all the energy get turned into heat during computation? In practice, I understand that not all of the energy truly goes into the computation in the first place: some is lost to power-supply inefficiency, fans, and any spinning platters. And a little bit of energy leaves the system through LED indicators, EMI radiation, signals along network cables, etc.
But does any of the energy actually get "consumed" in the computation itself, turned into information or something? From a thermodynamic perspective, is there any difference between a chip that draws 5W while quietly calculating digits of π inside, versus a length of nichrome wire drawing 5W? Do you get exactly 5W of heat out in both cases?
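For scale, here's my back-of-envelope sanity check, assuming (and I may be wrong that it's the relevant bound here) that Landauer's principle sets the floor of kT·ln 2 per irreversibly erased bit. The erasure rate below is a made-up round number, not a measured figure for any real chip:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # roughly room temperature, K

# Landauer limit: minimum energy dissipated per irreversibly erased bit
landauer_j_per_bit = k_B * T * math.log(2)
print(f"Landauer limit at {T:.0f} K: {landauer_j_per_bit:.3e} J/bit")

# Hypothetical erasure rate: 1e15 bits/s is just an assumed round number,
# not a measurement of any actual ASIC
erase_rate = 1e15  # bits/s
landauer_power = landauer_j_per_bit * erase_rate
print(f"Minimum power tied up in erasure at that rate: {landauer_power:.3e} W")
print(f"Fraction of a 5 W draw: {landauer_power / 5.0:.2e}")
```

If I've done that right, even a wildly generous erasure rate accounts for microwatts out of the 5W, which makes me suspect the answer is "essentially all of it becomes heat", but I'd like a proper thermodynamic argument.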
There's a related question at "Is it necessary to consume energy to perform computation?", but I'm sort of asking the opposite: when energy is consumed performing pure computation, does every last bit of it end up as heat?