When light is emitted from a faraway star (the effect is most noticeable for distant sources), its wavelength is stretched by the expansion of the universe, so the photons lose energy. I have heard some claim that within GR this is an example of a violation of conservation of energy. Can you explain whether the energy is lost to redshift, or how it is conserved?
2 Answers
The total energy of the photon is conserved in redshift (as total energy is always conserved by Noether's theorem)... maybe.
Redshift occurs for one of two reasons:
- The emitter is moving away relative to the receiver. Imagine a car moving away from an observer while firing a burst of light back at them, and picture the photon as a wave packet (think of a $\mathrm{sinc}$-shaped packet rather than the complex wavefunction): the observed frequency is reduced, but the burst also appears to take longer to finish from the observer's perspective, so the total energy is the same (a short numerical sketch of this follows below). In fact, this stretching of the burst is exactly in line with the relativistic time dilation caused by the emitter's motion relative to the receiver.
This is the case that you're talking about. Universal expansion causes distant emitters to seem to recede at a speed proportional to their distance.
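Here is a minimal numerical sketch of that cycle-counting picture; the recession speed, frequency, and burst duration are arbitrary values chosen purely for illustration:

```python
import math

# Sketch of the wave-packet argument: a receding emitter's burst arrives at a
# lower frequency but stretched over a longer time, so the number of wave
# cycles received equals the number emitted. All values are arbitrary.
c = 3.0e8              # speed of light, m/s
beta = 0.5             # emitter recedes at 0.5 c (arbitrary)

# Relativistic longitudinal Doppler factor for a receding source
D = math.sqrt((1 + beta) / (1 - beta))

f_emit = 5.0e14        # emitted frequency, Hz (roughly visible light)
dt_emit = 1.0e-9       # burst duration in the emitter's frame, s

f_obs = f_emit / D     # observed frequency is lowered ...
dt_obs = dt_emit * D   # ... but the burst is stretched by the same factor

print(f"cycles emitted : {f_emit * dt_emit:.6e}")
print(f"cycles received: {f_obs * dt_obs:.6e}")  # identical to the line above
```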
However, there might be an issue in this argument for energy conservation, because energy conservation entirely depends on Noether's theorem: energy is conserved if and only if physics is invariant under time-translation (how you get between those two things is somewhat convoluted and outside the scope of the question). In other words, if you can do the same experiment at two different times and get an identical result, energy is conserved throughout the system.
But if the Universe itself is expanding, it does matter when you do the experiment, because the scale factor (and thus the spacetime metric) will be different at two different times. This suggests that energy isn't necessarily conserved over cosmological distance and time scales. Still, relativity adequately explains photon redshift through time dilation: if we treat distant emitters as genuinely moving away from us, the argument above goes through and energy ends up being conserved anyway.
Redshift is also caused by something else that I think is relevant to note:
- The emitter is in a deeper gravity well than the receiver. To move anything with mass (or, equivalently, energy) out of a gravity well, you need to do work, which costs energy. The photon has energy to begin with, and it spends some of it climbing out of the well; in doing so, its frequency decreases.
This doesn't violate energy conservation, because the photon, in a sense, gains gravitational potential energy as it leaves the gravity well, insofar as it takes work to remove the photon from the well entirely.
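As a rough weak-field illustration of that bookkeeping (treating the photon as if it had an effective mass $E/c^2$, which is a heuristic rather than a rigorous GR statement): a photon climbing a height $h$ in a uniform field of strength $g$ loses a fraction of its energy $$\frac{\Delta E}{E} \approx \frac{g h}{c^2},$$ which is exactly the potential energy $\Delta U = m g h$ gained by an object of mass $m = E/c^2$ over the same climb, so the energy taken out of the photon's frequency shows up as gravitational potential energy.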
If the light ray from the source is described by the wave vector field $k^a$, then the redshift $z$ measured at the detector is simply $$1+z = \frac{(u^ak_a)_{source}}{(u^ak_a)_{obs}}$$ where $u^a_{source}$ and $u^a_{obs}$ are the four-velocities of the source and the detector, respectively. The energy of the photon measured by an observer with four-velocity $u^a$ is $E=u^ak_a$. Thus the energy of the photon as measured by these observers is not, in general, conserved.
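As a concrete, standard instance of this formula in the cosmological setting of the question: for the spatially flat FRW metric $ds^2 = -dt^2 + a(t)^2\,d\vec{x}^2$ with comoving source and observer, $u^a = (\partial_t)^a$, the measured frequency $\omega = u^ak_a$ scales as $1/a$ along the null geodesic, so $$1+z = \frac{(u^ak_a)_{source}}{(u^ak_a)_{obs}} = \frac{a(t_{obs})}{a(t_{source})},$$ i.e. the photon's measured energy falls in proportion to how much the universe has expanded between emission and observation.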
However, if the spacetime has a time-translation symmetry, i.e. a timelike Killing vector field $K^a$, then the quantity $Q=K^ak_a$ is conserved along the world-lines of photons. That is, for the special observers travelling along the timelike vector $K^a$ there will be no redshift of the photon.
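For completeness, the one-line reason $Q$ is conserved: differentiating along the null geodesic with affine tangent $k^a$, $$k^b\nabla_b\left(K_ak^a\right) = k^ak^b\nabla_bK_a + K_a\,k^b\nabla_bk^a = 0,$$ since the first term vanishes by Killing's equation $\nabla_{(a}K_{b)}=0$ (the antisymmetric $\nabla_bK_a$ is contracted with the symmetric $k^ak^b$) and the second by the geodesic equation $k^b\nabla_bk^a=0$.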