25

I am trying to find a good rough guesstimate of how much of the maximum wattage usage of a graphics card is turned into hot air.

Say a graphics card is rated at using X watts at maximum usage. How much of that is released as heat into the air via the cooling setup, and how much is not? (If not, where does it go? To ground?)

Yes, I know there are a lot of variables here; I am just looking for a guesstimate from someone who understands this better than me. I'm specifically using this GPU as the example (Radeon 6990, 375 watts max).

KDecker
  • 451
  • 5
  • 10

3 Answers

28

My numbers might not be exactly accurate, but I would say that about

99.99% of the energy that enters the GPU, and even the CPU, is converted into heat.

The other 0.01% is the actual signal out of the GPU to your display.

The job of the GPU is to take in a lot of data, and process it, requiring a lot of calculations. These calculations consume energy, producing heat, and eventually a result.


Now it is important to note that while this says it is a 375 W card, it will not be drawing 375 W the entire time it is in operation. Just like your CPU, your GPU will only do as much work as you need it to, and may step down to well under 100 W.

Simply browsing around your Windows desktop, the card is doing next to nothing and will draw next to nothing; but launch Crysis, and the clock frequencies will max out and the card will start drawing close to its maximum rating.
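
A rough back-of-envelope check of what that split means at full load; the 0.01% signal share is the assumed figure from above, not a measurement:

```python
# Back-of-envelope sketch; the 0.01% "signal" share is the assumed figure
# from the answer above, not a measured value.
rated_power_w = 375.0       # Radeon 6990 board power from the question
signal_fraction = 0.0001    # assumed fraction leaving as display signal

heat_w = rated_power_w * (1.0 - signal_fraction)
print(f"Heat at full load: ~{heat_w:.2f} W of {rated_power_w:.0f} W")
# -> Heat at full load: ~374.96 W of 375 W, i.e. effectively all of it
```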

Matt Clark
  • 485
  • 1
  • 5
  • 15
  • 8
    And in turn, that 0.01% of energy in the signal is consumed by the display - it's also turned into heat, just in a different place. – Li-aung Yip Oct 30 '13 at 02:23
  • Some of it is turned into light, though! – rjp Oct 30 '13 at 03:39
  • 11
    @RJP The energy that goes to the panel? No, it does not. It will get dissipated in termination resistors in the HDMI or DVI receiver chip. From there, the signal is decoded, stored in a buffer, and finally sent to the panel drivers where it is driven out onto the panel matrix. However, the original power from the graphics card is basically completely lost in the termination resistors, and what little is left does not leave the TMDS receiver circuitry. – alex.forencich Oct 30 '13 at 04:12
  • 4
    Yeah, somehow I managed to forget that the display is not powered by the graphics card. That would be a really terrible design. – rjp Oct 30 '13 at 04:46
  • 13
    Even the /light/ is eventually turned into heat, when it hits an object in your house and heats it up a little bit. The universe is marching towards maximum entropy, which usually involves all forms of energy being dissipated as heat... eventually. – Li-aung Yip Oct 30 '13 at 05:53
  • @rjp it's not unheard of - I had a G4 with one of these https://en.wikipedia.org/wiki/Apple_Display_Connector – Pete Kirkham Sep 23 '16 at 09:28
13

Basically it all goes to heat. Inside the chip there are billions of tiny transistors, and each one acts like a really tiny switch. These transistors are connected to form various logic functions. Due to the geometry of the transistors, they all have small parasitic capacitances. Whenever the logic changes state, which happens on average at the clock rate, these capacitances need to be charged and discharged. The current to do this has to pass through the transistors, which have some resistance. Resistance and current flow mean voltage drop, and volts times amps is power. Every time the clock cycles, power is consumed charging and discharging these capacitors.

This happens in every integrated circuit - your GPU, your CPU, your RAM, your hard drive controller, your cell phone, etc. This is why, when you overclock your CPU or GPU, it uses more power and generates more heat. This is also why your laptop will vary its clock speed on the fly to save power instead of just letting the CPU run idle.
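
That charge-and-discharge mechanism is the usual dynamic-power relation, roughly P ≈ α·C·V²·f. Here is a minimal sketch of it; the activity factor, capacitance, voltage, and clock below are illustrative assumptions chosen to land near a big GPU's power budget, not Radeon 6990 specifications:

```python
# Dynamic (switching) power of a CMOS chip: P ~ alpha * C * V^2 * f.
# Every value below is an illustrative assumption, not a Radeon 6990 spec;
# the numbers are picked to land in the ballpark of a large GPU.
activity_factor = 0.15    # fraction of gates switching each cycle (assumed)
switched_cap_f = 2.0e-6   # total switchable capacitance in farads (assumed)
core_voltage_v = 1.1      # core supply voltage in volts (assumed)
core_clock_hz = 830e6     # core clock in hertz (assumed)

dynamic_power_w = (activity_factor * switched_cap_f
                   * core_voltage_v**2 * core_clock_hz)
print(f"Estimated switching power: {dynamic_power_w:.0f} W")
# -> about 300 W; raising voltage or clock raises this, which is why
#    overclocking increases both power draw and heat output.
```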

In terms of how much doesn't go to heat, it isn't very much. A watt or two for the fan, maybe. Perhaps a hundred milliwatts for the LEDs, if the card has a bunch. A couple of milliwatts get moved over the PCI Express bus, to get absorbed in terminations on the other end of the bus. A couple more milliwatts get sent out over the DVI or HDMI port to get absorbed in terminations on the other end. All in all, with a TDP of 375 watts, less than 0.1 percent of the power coming from the PSU doesn't get directly dissipated as heat.
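
Taking those figures at face value (they are order-of-magnitude guesses, not measurements), a quick tally looks like this:

```python
# Order-of-magnitude tally of the power that is NOT dissipated as heat on
# the card itself, using the rough guesses from the answer above.
tdp_w = 375.0
non_heat_w = {
    "fan (work done on the air)": 2.0,                  # "a watt or two"
    "LEDs (emitted light)": 0.1,                        # "a hundred milliwatts"
    "PCIe signalling into far-end terminations": 0.005,
    "DVI/HDMI signalling into far-end terminations": 0.005,
}

total_w = sum(non_heat_w.values())
print(f"Non-heat power: ~{total_w:.2f} W of {tdp_w:.0f} W "
      f"({100 * total_w / tdp_w:.2f}%)")
# The fan dominates this tally, and the work it does on the air ends up as
# heat in the room anyway, so well under 0.1% truly leaves as anything else.
```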

alex.forencich
  • 40,922
  • 1
  • 69
  • 109
  • And the rest gets indirectly dissipated as heat, as the photons and phonons get absorbed by other materials. – Ignacio Vazquez-Abrams Oct 30 '13 at 03:23
  • 1
    Well, you could probably make that argument about a lot of the electricity we consume in general. Lightbulbs, stove, microwave, hot water heater, refrigerator, stereo, tv, etc. The washer and dryer might be a slightly different story though. – alex.forencich Oct 30 '13 at 03:45
3

It all gets turned into hot air, except for the tiny amount of light generated by any LEDs. There's no place else for the energy to go.

Joe Hass
  • 8,487
  • 1
  • 29
  • 41
  • All electrical energy sent into an IC is released as heat? – KDecker Oct 30 '13 at 01:50
  • 1
    @BumSkeeter: All electrical energy consumed by an IC is released as heat. – Ignacio Vazquez-Abrams Oct 30 '13 at 01:51
  • So if (as in the GPU's case) the IC in question is rated at 375 W TDP (thermal design power; the amount the cooling system is required to dissipate), then it will release all 375 watts into the air. (After looking up the definition of TDP, that makes sense.) What electrical energy might not be consumed? – KDecker Oct 30 '13 at 01:54
  • 1
    All of the electrical energy supplied to the GPU is "consumed". The question is what form of energy does it become. Virtually all of it turns into heat. Of course, a GPU rated at 375W may typically consume much less than that. – Joe Hass Oct 30 '13 at 01:57
  • 1
    In the context I am trying to use this value for, the GPU is running all out (bitcoin mining). I'm trying to figure out (for fun) the viability of heating a house with graphics cards. – KDecker Oct 30 '13 at 02:01
  • It is most certainly "viable" to heat a space with computer hardware...one of the biggest problems in large server farms is getting rid of the excess heat. However, it is a very inefficient way to heat a space. That's why we replaced our electric baseboard heaters with a heat pump. – Joe Hass Oct 30 '13 at 02:03
  • 2
    @BumSkeeter, using a circuit to heat a room is about as efficient as an electric heater. That said, if your purpose was to mine instead of heat a room, GPU is no longer viable. You have to use FPGA or ASICs to make any money over your electricity bill. Or better yet, don't fall for the scam at all. – travisbartley Oct 30 '13 at 02:06
  • There is an added benefit to using the GPU, though, which is that it will produce bitcoins (if you know of or have heard of them), which can be sold or used elsewhere. If the value produced by the GPUs outweighs the electricity usage it might be worth it. (probably not) – KDecker Oct 30 '13 at 02:07
  • 1
    @trav1s I'm well aware of that. Also, it's not a scam. – KDecker Oct 30 '13 at 02:07
  • FPGA and ASICs also turn almost all the energy they consume into heat - you can basically assume this about almost any electrical circuit. Of course, the fact that an electric heater isn't the cheapest way to heat a house still applies here. – Random832 Oct 30 '13 at 13:37
  • 1
    @Random832: An electric resistance heater can only turn 100% of its input into heat. Heat pumps can go past 100% since they move the cold outside :) (agreeing with you; a rough cost comparison sketch follows these comments) – Bryan Boettcher Oct 30 '13 at 15:48
  • I find the idea appealing :) If you're heating your home by electricity anyway, why not let the "heater" generate some free BTC in the process? - As was said above: some 99% of the energy will leave a PC as heated air; the rest may be other emissions: airflow, sound (noise), RF (WLAN, Bluetooth, RF noise), light maybe. Except for the RF part it will all probably be turned into heat before leaving the room. – JimmyB Oct 30 '13 at 17:11
  • We determined that it would be super unworth it to do this @HannoBinder ( https://bitcointalk.org/index.php?topic=320984.msg3439302#msg3439302 ) – KDecker Oct 30 '13 at 20:34
  • 1
    And I agree. While the energy efficiency will be at almost 100%, hardware costs and time/money spent for maintenance will yield big red numbers. – JimmyB Oct 31 '13 at 11:08
  • One might note though that if an (existing) BTC miner is moved into a room that's already electrically heated every single Joule put into mining will be saved in heating, causing basically no extra cost in energy consumption of the room. – JimmyB Oct 31 '13 at 11:16
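
For anyone curious about the arithmetic behind the heat-pump comparison in the comments above, a hedged sketch with assumed numbers (the electricity price, COP, and daily heat demand are illustrative, not measurements):

```python
# Illustrative comparison of heating a room with a GPU (resistive heating)
# versus a heat pump. Price, COP, and heat demand are assumed values.
heat_needed_kwh_per_day = 10.0   # heat delivered to the room per day (assumed)
price_per_kwh = 0.15             # electricity price in $/kWh (assumed)
heat_pump_cop = 3.0              # heat out per unit electricity in (assumed)

gpu_cost = heat_needed_kwh_per_day * price_per_kwh   # ~1 unit heat per unit in
heat_pump_cost = (heat_needed_kwh_per_day / heat_pump_cop) * price_per_kwh

print(f"GPU / resistive heating: ${gpu_cost:.2f} per day")
print(f"Heat pump (COP {heat_pump_cop:.0f}): ${heat_pump_cost:.2f} per day")
print(f"Mining income needed just to break even: "
      f"${gpu_cost - heat_pump_cost:.2f} per day")
# Hardware cost and maintenance (as noted in the comments) come on top of that.
```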