
I have a red LED (623 nm peak wavelength) which I am able to light up at 1.48 V. I thought the switch-on voltage should be determined by $hc/(e\lambda)$, but that would give 1.99 V for 623 nm. Even assuming the LED spectrum is really wide and I'm only seeing the deepest visible red (~720 nm), the voltage required should still be at least 1.72 V.
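
For reference, a quick numerical check of that arithmetic (a minimal sketch; the constants are standard CODATA values):

```python
# Photon-energy "voltage" hc/(e*lambda) for the two wavelengths above.
h = 6.62607015e-34    # Planck constant, J s
c = 2.99792458e8      # speed of light, m/s
e = 1.602176634e-19   # elementary charge, C

for wavelength_nm in (623, 720):
    v = h * c / (e * wavelength_nm * 1e-9)
    print(f"{wavelength_nm} nm -> {v:.2f} V")
# 623 nm -> 1.99 V
# 720 nm -> 1.72 V
```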

Does the missing energy come from heat? There is "thermal voltage" $(kT/e)$, but it's only 26 mV at room temperature — an order of magnitude smaller than the discrepancy between the prediction and the measurement.
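
The same kind of check for the thermal voltage, assuming room temperature means $T = 300$ K:

```python
k = 1.380649e-23      # Boltzmann constant, J/K
e = 1.602176634e-19   # elementary charge, C
T = 300.0             # assumed room temperature, K
print(f"kT/e = {1e3 * k * T / e:.1f} mV")  # -> kT/e = 25.9 mV
```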

ex-punctis

1 Answer


There is nothing strange about a forward voltage drop lower than $E_{gap}/e$. According to the Shockley diode equation, any voltage bias $V_D \gt 0$ induces forward conduction in the diode:

$$ I(V_D) =I_0 \left(e^{eV_D/kT}-1 \right), $$

and as long as some current flows through the LED, light emission is possible, although it may be faint.
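
A quick numerical illustration of this point (a minimal sketch assuming an ideal diode with ideality factor 1 at $T = 300$ K; a real LED would also have series resistance and high-injection effects that limit the current):

```python
import math

k, e, T = 1.380649e-23, 1.602176634e-19, 300.0
VT = k * T / e  # thermal voltage, ~26 mV

# Ratio I/I0 = exp(V_D/V_T) - 1 from the Shockley equation: any positive
# bias gives some forward current, long before V_D reaches E_gap/e.
for vd in (0.1, 0.5, 1.48):
    print(f"V_D = {vd:.2f} V -> I/I0 = {math.expm1(vd / VT):.3g}")
# roughly 47, 2.5e8 and 7e24 respectively
```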

However, from the point of view of energy conservation, every photon of energy $E_{gap}$ is created at the expense of an injected electron of energy $eV_D$. When, at low current injection, $E_{gap} \gt eV_D$, the missing energy has to come from heat. It is similar to the evaporation of a liquid: only the few electrons that happen to have enough thermal energy to overcome the built-in potential recombine, and by doing so they remove thermal energy from the crystal.
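
Plugging in the numbers from the question makes this concrete. Taking the 623 nm photon energy of about 1.99 eV as a proxy for $E_{gap}$, while the injected electron supplies only about 1.48 eV, roughly

$$ E_{gap} - eV_D \approx 1.99\ \text{eV} - 1.48\ \text{eV} \approx 0.5\ \text{eV} $$

per emitted photon must be drawn from the thermal energy of the lattice.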

This doesn't necessarily mean that the LED cools down. Most of the electrons recombine non-radiatively, producing heat instead of light, and there is also Joule heating due to the flow of current. But under some special conditions, cooling of the device has indeed been observed.

iwakun