I'm trying to get a rough estimate of how much of a graphics card's maximum power draw ends up as hot air.
Say a graphics card is rated to draw X watts at full load. How much of that is released as heat into the air through the cooling setup, and how much is not? (If some of it isn't, where does it go? To ground?)
Yes, I know there are a lot of variables here; I'm just looking for a ballpark figure from someone who understands this better than I do. I'm using this GPU as the example: a Radeon HD 6990, rated at 375 W maximum.
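
To make the question concrete, here's the back-of-envelope arithmetic I'd like sanity-checked. It assumes essentially all 375 W becomes heat (is that the right assumption?), and the 30 CFM airflow figure is just a number I guessed for illustration, not a measured spec:

    # Back-of-envelope: assume ~100% of the electrical power the card
    # draws is eventually dissipated as heat into the room.
    # The airflow figure below is a guess, not a measured value.

    POWER_W = 375.0           # Radeon HD 6990 maximum board power
    W_TO_BTU_PER_HR = 3.412   # watts -> BTU/hr conversion factor

    AIR_DENSITY = 1.2         # kg/m^3, room-temperature air near sea level
    AIR_CP = 1005.0           # J/(kg*K), specific heat of air
    CFM_TO_M3S = 0.000471947  # cubic feet/minute -> cubic metres/second

    def heat_output_btu_per_hr(power_w: float) -> float:
        """Heat dumped into the room, assuming all draw becomes heat."""
        return power_w * W_TO_BTU_PER_HR

    def exhaust_temp_rise_c(power_w: float, airflow_cfm: float) -> float:
        """Temperature rise of the cooling air, from Q = m_dot * c_p * dT."""
        mass_flow = airflow_cfm * CFM_TO_M3S * AIR_DENSITY  # kg/s
        return power_w / (mass_flow * AIR_CP)

    print(f"{heat_output_btu_per_hr(POWER_W):.0f} BTU/hr")              # ~1280 BTU/hr
    print(f"{exhaust_temp_rise_c(POWER_W, 30.0):.1f} C rise @ 30 CFM")  # ~22 C

If the "all of it becomes heat" assumption is wrong, that's exactly the part I'd like corrected.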