
While I understand the effect of varying wavelength and frequency on the photoelectric effect, I can't seem to wrap my mind around that question... I suspect it has to do with quanta and the non-continuous aspect of the electron's nature, but I am really not sure...

Thanks in advance.

4 Answers


Let us first describe a relevant experiment: you have a photomultiplier tube, hooked to a loudspeaker for convenience. If you shine light on the detector you hear noise, which is louder if the light source is brighter. But if you use only a very feeble light, you'll notice a peculiar thing: the loudspeaker no longer makes noise but produces distinct clicks (at apparently random intervals)! Also, all those clicks are equally loud and long (except perhaps when two are too close together and cannot be separated), i.e. identical in every way. Getting curious, you might experiment with different colours and notice that only blue light makes clicks, red does not, etc. The intensity of the light source only influences the number of clicks.

The above obviously cannot be explained by the behaviour of ordinary waves: waves are inherently continuous.
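
To make the click statistics concrete, here is a minimal sketch (my own, not part of the original answer) that models photon detections as a Poisson process: dimming the lamp only lowers the click rate, while every click stays identical. The rate constant is a made-up number chosen for illustration.

```python
import random

PHOTON_RATE_PER_INTENSITY = 5.0  # clicks per second at intensity 1.0 (made-up scale)

def click_times(intensity, duration, seed=0):
    """Generate photon-detection times over `duration` seconds.

    Exponential waiting times between events <=> Poisson arrivals.
    A dimmer lamp only lowers the rate; each click stays the same size.
    """
    rng = random.Random(seed)
    rate = PHOTON_RATE_PER_INTENSITY * intensity
    t, times = 0.0, []
    while True:
        t += rng.expovariate(rate)
        if t > duration:
            return times
        times.append(t)

bright = click_times(intensity=1.0, duration=10.0)
feeble = click_times(intensity=0.05, duration=10.0)
print(f"bright lamp: {len(bright)} clicks in 10 s")
print(f"feeble lamp: {len(feeble)} clicks in 10 s")  # a few randomly spaced clicks
```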

This chapter describes these matters in more detail on an intuitive level and is certainly worth reading.

caconyrn

The photoelectric effect is demonstrated by a cool experiment which shows exactly why the wave model of light breaks down.

Here's a diagram of the experiment: [image lost: a lamp shining on the cathode of an evacuated tube, with emitted electrons travelling to the anode]

In the wave model, the strength (amplitude) of a light wave determines its brightness, suggesting that a sufficiently bright light should be strong enough to generate a large current.

EDIT: Physically, this means that the energy of the emitted electrons would increase as the intensity of the lamp increases, which implies that the electrons would also travel faster through the tube from the cathode to the anode.

However, the experiment shows that, as long as the lamp is on (intensity $> 0\%$), the energy of the electrons remains the same regardless of the intensity of the lamp (energy = $x$ joules whether the intensity is 50% or 100%). Only the number of electrons emitted from the cathode increases, causing an increase in the current through the circuit. The current is therefore directly proportional to the intensity.

The experiment performed with varying frequencies of light shows that the energy of the electrons increases with higher frequencies (the relationship is linear). It is described by $E_e = hf - \phi = \frac{hc}{\lambda} - \phi$, where $h$ is Planck's constant and $\phi$ is the work function of the metal (the minimum amount of energy an electron needs to escape the surface of the cathode).
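
To see the formula in action, here is a small numeric sketch (my addition); the sodium work function $\phi \approx 2.28\ \text{eV}$ is an assumed illustrative value, not something stated in the answer.

```python
HC_EV_NM = 1239.84    # h*c in eV·nm
PHI_SODIUM_EV = 2.28  # assumed work function of a sodium cathode, in eV

def photoelectron_energy_ev(wavelength_nm, phi_ev=PHI_SODIUM_EV):
    """Kinetic energy E_e = hc/lambda - phi, or None below threshold."""
    photon_ev = HC_EV_NM / wavelength_nm
    return photon_ev - phi_ev if photon_ev > phi_ev else None

print(photoelectron_energy_ev(400))  # ~0.82 eV: blue light ejects electrons
print(photoelectron_energy_ev(700))  # None: red photons lack the energy, at any intensity
```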

EDIT: Einstein was inspired by Planck's black-body radiation results and used the idea that energy is obtained from the electromagnetic field in discrete portions, which pointed him to the existence of quanta of light, later called photons.

These experimental results strongly indicate that the wave model for electromagnetic waves is not compatible with the photoelectric effect, as no energy variation in the electrons is observed with a change in intensity.

EDIT: Here's a link to an interactive Java animation which allows you to "play" around with this experiment beautifully.


With a classical wave model for light and a classical mechanics model for matter, the energy absorbed by one particle will be a function of the wave amplitude $E$. If we reduce the wave amplitude, the absorbed energy $W$ will smoothly go to zero. In mathematical terms: $$ W(E\rightarrow0,\omega) \rightarrow 0 $$ If an electron requires a minimum amount of energy to escape a material, a minimum wave amplitude would be required to achieve that.

However, this is not observed in the photoelectric effect: there is no minimum light intensity (i.e. wave amplitude) required to extract electrons from the cathode.

It turns out that either a quantum description of matter or a quantum description of radiation can explain this behaviour. The former allows particles to absorb energy only in quantized bits of $W = hf$ (for more details see Fermi's golden rule), while the latter provides only quantized energy bits, which we call photons, with energy $E = hf$. In both cases the energy absorbed by one particle is a function of $f$ alone and does not depend on $E$.
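
A toy contrast of the two pictures (my sketch, with made-up units): classically the absorbed energy scales with the squared amplitude and vanishes smoothly as $E \to 0$, while quantum absorption delivers fixed bits of $hf$ at any amplitude.

```python
H_EV_S = 4.1357e-15  # Planck's constant in eV·s

def classical_absorbed(amplitude):
    """Classical picture: absorbed energy ~ amplitude^2 (arbitrary units)."""
    return amplitude ** 2

def quantum_absorbed(frequency_hz, phi_ev):
    """Quantum picture: energy arrives in bits of h*f; only the excess over phi is kept."""
    quantum = H_EV_S * frequency_hz
    return quantum - phi_ev if quantum > phi_ev else 0.0

for amp in (1.0, 0.1, 0.001):
    # Classical absorption fades smoothly to zero with amplitude; the quantum
    # result (blue light, f = 7.5e14 Hz, assumed phi = 2.28 eV) ignores it entirely.
    print(amp, classical_absorbed(amp), quantum_absorbed(7.5e14, phi_ev=2.28))
```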

In reality we observe quantization of both matter and radiation.

Jannick

Classical electromagnetism has no way of linking the energy of a wave to its frequency. In classical E&M the energy density is given by $$ u = \frac{1}{2}\left( \epsilon E\cdot E + \frac {B\cdot B}{\mu} \right)$$ As you can see, this has no frequency dependence in it at all. Yet the photoelectric-effect experiment suggests that frequency affects the energy carried by the light.
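
A quick numeric check of this point (my addition): for a vacuum plane wave with $B = E/c$, the energy density above depends on the amplitude alone, and the frequency never enters.

```python
from scipy.constants import epsilon_0, mu_0, c

def energy_density(E_amplitude, frequency_hz):
    """Classical EM energy density u = (eps0*E^2 + B^2/mu0)/2 for a vacuum plane wave."""
    B = E_amplitude / c  # plane-wave relation in vacuum
    # `frequency_hz` never appears below: same u for red and blue light.
    return 0.5 * (epsilon_0 * E_amplitude**2 + B**2 / mu_0)

print(energy_density(100.0, 4.3e14))  # red light
print(energy_density(100.0, 7.5e14))  # blue light -> identical energy density
```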

Ilya Lapan