When I took my first thermo class, a tucked-away chapter introduced exergy in terms of electrical energy, meaning that the amount of electrical energy you could extract from something is functionally its exergy. Wikipedia doesn't go so far, but this has bothered me for years.
Forms of energy such as macroscopic kinetic energy, electrical energy, and chemical Gibbs free energy are 100% recoverable as work, and therefore have an exergy equal to their energy. However, forms of energy such as radiation and thermal energy cannot be converted completely to work, and have an exergy content less than their energy content.
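For the thermal case this is just the Carnot factor: heat $Q$ available at temperature $T$, with an environment at $T_0$, carries exergy $Q(1 - T_0/T)$. A quick sketch (the $300\ \mathrm{K}$ environment and the source temperatures are illustrative choices of mine):

```python
# Exergy of heat: heat Q at temperature T, rejected to an environment
# at T0, can yield at most Q * (1 - T0/T) of work (the Carnot factor),
# so its exergy is strictly less than its energy for any finite T.
T0 = 300.0  # environment temperature, K (assumed)

for T in (400.0, 600.0, 1200.0):  # illustrative source temperatures, K
    print(f"heat at {T:.0f} K: exergy/energy = {1 - T0 / T:.2f}")
```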
Let's say that we have 1 volt of potential. The smallest unit of energy we could isolate from this would be a single electron accelerated through the potential. Numerically and naively, I could write:
$$ \frac{3}{2} k T = e V $$

where $e$ is the electron charge and $k$ is the Boltzmann constant. Doing this I would get something like $7{,}700\ \mathrm{K}$ for the temperature. This is high enough to approximate as infinite for conversion processes where your sink is at room temperature.
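Here's that arithmetic as a minimal Python sketch (the $300\ \mathrm{K}$ sink is my assumption for room temperature):

```python
# Naive "temperature" of 1 V: set (3/2) k T = e V, solve for T,
# then take the Carnot efficiency against a room-temperature sink.
k = 1.380649e-23     # Boltzmann constant, J/K
e = 1.602176634e-19  # elementary charge, C

V = 1.0                      # potential, volts
T_hot = 2 * e * V / (3 * k)  # ~7.7e3 K
T_sink = 300.0               # assumed room-temperature sink, K
print(f"T = {T_hot:.0f} K, Carnot efficiency = {1 - T_sink / T_hot:.1%}")
```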
I can imagine accelerating something with a greater charge than an electron through that potential. This method would find a higher temperature, which presents a counter-argument: the temperature isn't unique. So which is it? Is there a theoretical limit to the energy you can get from electricity, giving it a functional temperature? That would imply an unavoidable loss in DC-DC voltage converters.
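A sketch of that counter-argument, applying the same naive conversion to a carrier of charge $ze$ (the charge numbers are arbitrary examples): the "temperature" scales linearly with $z$, so it can't be a property of the potential alone.

```python
# Same conversion for a carrier of charge z*e through the same 1 V:
# T scales linearly with z, so the "temperature" depends on what you
# accelerate, not just on the potential.
k = 1.380649e-23     # Boltzmann constant, J/K
e = 1.602176634e-19  # elementary charge, C
V = 1.0

for z in (1, 2, 3):  # e.g. electron, alpha particle, triply ionized ion
    print(f"z = {z}: T = {2 * z * e * V / (3 * k):.0f} K")
```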
Another thing that bothers me is the claim that radiation can't be converted completely to work. Thermal radiation, yes, but most nuclear radiation is at $\mathrm{keV}$ or $\mathrm{MeV}$ energies, which would give a dramatically higher temperature than electrical energy (if I've understood correctly).
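Running the same naive $E = \frac{3}{2} k T$ conversion at those energy scales (the specific energies are illustrative):

```python
# The naive E = (3/2) k T conversion at nuclear energy scales:
# each factor of 1000 in quantum energy scales the "temperature" by 1000.
k = 1.380649e-23        # Boltzmann constant, J/K
J_per_eV = 1.602176634e-19

for E in (1.0, 1e3, 1e6):  # 1 eV, 1 keV, 1 MeV
    print(f"E = {E:.0e} eV -> T = {2 * E * J_per_eV / (3 * k):.2e} K")
```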