If we assume that a monochromatic source of light produces waves with uniformly distributed phase differences, then shouldn't the sum of all those waves be zero, and thus the intensity as well?
1 Answer
Short answer: Incoherent light is not a sum of monochromatic waves with randomly chosen fixed phases; it is a random process in which the value of the field varies randomly from moment to moment. A better model is a wave that, at any given point, has a single definite but random phase that changes from moment to moment.
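As a quick numerical check of why the intensity does not vanish: summing many unit-amplitude phasors with independent uniform random phases is a two-dimensional random walk, so the mean intensity equals the number of waves rather than zero. A minimal sketch (all parameter values here are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000       # number of superposed waves
trials = 2000  # independent phase realizations

# Each trial: sum N unit-amplitude phasors with uniform random phases.
phases = rng.uniform(0.0, 2.0 * np.pi, size=(trials, N))
total = np.exp(1j * phases).sum(axis=1)

# The sum is a 2D random walk, so <|total|^2> = N, not zero.
intensity = np.abs(total) ** 2
print(np.mean(intensity) / N)  # close to 1
```

The individual fields do largely cancel (the mean *amplitude* is near zero), but the residual random-walk fluctuations carry a nonzero average intensity.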
Long answer:
The answer boils down to a couple of facts. The first is that no source is truly monochromatic; a "monochromatic" light source is an idealization meaning an infinitesimal bandwidth (which also implies that the source exists for an infinite amount of time).
The second point is that nothing is either perfectly coherent or perfectly incoherent. Instead it is useful to think of everything as partially coherent, with a characteristic scale that defines how coherent a field is. If we consider only a one-dimensional field (i.e. a ray of light moving from a source to some detector or point in space), then that characteristic scale is what we call the coherence time.
Putting some math to this, we can analytically represent a monochromatic electric field at a point by the equation $$E(t) = A\exp(-i\omega t),$$ where $A$ is the complex amplitude of the light ($|A|^2$ is proportional to the energy density and thus the brightness of the light). In reality $A$ is not a constant but must change in time, so $A \to A(t)$ (one way to see this: no source can exist from infinitely far in the past to infinitely far in the future). For a random field, $A(t)$ varies randomly in time. Now any function of time can be represented as a sum of frequency components (computed via the Fourier transform): $$A(t) = \int_{-\infty}^\infty \tilde A(\omega') \exp(-i\omega't)\,d\omega'$$ and thus $$E(t) = \int_{-\infty}^\infty \tilde A(\omega') \exp(-i(\omega+\omega')t)\,d\omega'.$$ Therefore $E(t)$ is not simply monochromatic but has components at every frequency $\omega+\omega'$ for which $\tilde A(\omega')\ne 0$.
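This decomposition can be mimicked on a discrete grid: pick a spectral envelope for $\tilde A(\omega')$, attach independent random phases, and inverse-FFT to obtain $A(t)$. A sketch assuming a Gaussian envelope (the envelope shape, bandwidth, and grid are illustrative choices, not anything fixed by the physics above):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4096
freqs = np.fft.fftfreq(n)  # frequency grid (arbitrary units)

# Gaussian spectral envelope with independent random phases: a discrete
# stand-in for the random amplitudes A~(omega') in the integral above.
bandwidth = 0.02
A_tilde = (np.exp(-(freqs / bandwidth) ** 2)
           * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, n)))

# The Fourier integral for A(t) becomes an inverse FFT on the grid.
A_t = np.fft.ifft(A_tilde)

# The complex amplitude is no longer constant: it wanders randomly in time.
print(np.std(np.abs(A_t)) / np.mean(np.abs(A_t)))
```

Because $A(t)$ is a sum of many independent random-phase components, its magnitude fluctuates strongly (for a Gaussian field, $|A(t)|$ is Rayleigh-distributed) instead of staying fixed as it would for truly monochromatic light.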
Now $|\tilde A(\omega')|^2$ is the spectral energy density, and for stochastic (random) fields the $\tilde A(\omega')$ can be represented by random numbers whose ensemble averages encode the degree of coherence (how coherent/incoherent the field is). For stochastic fields the $\tilde A(\omega')$ are mutually uncorrelated, i.e. $\langle \tilde A^*(\omega_1)\tilde A(\omega_2)\rangle = \langle|\tilde A(\omega_1)|^2\rangle\,\delta(\omega_1-\omega_2)$. What this means is that incoherent fields are made up of a sum of mutually unrelated frequency components, each of which is coherent only with itself (i.e. can only interfere with itself). In addition, the coherence time $\tau$ is simply related to the bandwidth $\Delta\omega = 1/\tau$ of $\tilde A(\omega)$ (i.e. the range of frequencies for which $|\tilde A(\omega)|^2\ne 0$ is roughly $|\omega| \le \Delta\omega$).
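The inverse relation between coherence time and bandwidth can be checked numerically: the autocorrelation $\langle A^*(t)A(t+\tau)\rangle$ of a random field decays on a time scale $\sim 1/\Delta\omega$, so doubling the bandwidth roughly halves the coherence time. A sketch using the Wiener–Khinchin theorem (autocorrelation = inverse Fourier transform of the power spectrum); the Gaussian spectrum and the half-maximum criterion are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 1 << 15
freqs = np.fft.fftfreq(n)

def field(bandwidth):
    """Random field A(t) with a Gaussian spectral envelope (illustrative)."""
    A_tilde = (np.exp(-(freqs / bandwidth) ** 2)
               * np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, n)))
    return np.fft.ifft(A_tilde)

def coherence_time(A):
    """Lag (in samples) at which |autocorrelation| first drops below 1/2."""
    # Wiener-Khinchin: autocorrelation = inverse FT of the power spectrum.
    corr = np.abs(np.fft.ifft(np.abs(np.fft.fft(A)) ** 2))
    return int(np.argmax(corr / corr[0] < 0.5))

# Doubling the bandwidth roughly halves the coherence time.
t1 = coherence_time(field(0.01))
t2 = coherence_time(field(0.02))
print(t1, t2)
```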
Now the coherence time $\tau$ means the field is only coherent over times $\Delta t < \tau$. So, for instance, if you have a very fast detector (small measurement time $\Delta t$), you might be able to see coherence effects from a source that looks incoherent when you average over longer time scales. A fully incoherent source of light means $\tau = 0$, i.e. the light looks incoherent on any time scale $\Delta t > 0$. But $\tau = 0$ means that $|\tilde A(\omega)|^2$ has infinite bandwidth (as for ideal white noise). Now if you filter the light down to some bandwidth $\delta\omega$ you get a longer $\tau$, but the energy is proportional to $\int_{-\delta\omega}^{\delta\omega}|\tilde A(\omega)|^2\,d\omega$, and in the monochromatic limit $\tau\to\infty$ while $$\lim_{\delta\omega\to 0} \int_{-\delta\omega}^{\delta\omega}|\tilde A(\omega)|^2\,d\omega \to 0,$$ so you have no energy (in the limit).
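The filtering trade-off can be illustrated the same way: an ideal bandpass of half-width $\delta\omega$ applied to a flat (white-noise-like) random spectrum passes an energy proportional to $\delta\omega$, which vanishes in the monochromatic limit. A sketch (the flat spectrum and the particular filter widths are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 1 << 14
freqs = np.fft.fftfreq(n)

# White-noise-like spectrum: unit-magnitude amplitudes, random phases.
A_tilde = np.exp(1j * rng.uniform(0.0, 2.0 * np.pi, n))

def filtered_energy(half_width):
    """Energy passed by an ideal bandpass filter of the given half-width."""
    mask = np.abs(freqs) <= half_width
    return np.sum(np.abs(A_tilde[mask]) ** 2)

# Narrowing the filter lengthens the coherence time, but the passed
# energy shrinks toward zero in the monochromatic limit.
for hw in (0.1, 0.01, 0.001):
    print(hw, filtered_energy(hw))
```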
Reference:
"Optical Coherence and Quantum Optics" by Leonard Mandel and Emil Wolf (Cambridge University Press, 1995)