The red-shift of the light of a star in a galaxy, or of a galaxy in a cluster of galaxies, is generally interpreted as a measure of how fast the star or galaxy is moving, i.e. it is interpreted in a purely special-relativistic way. However, general relativity predicts that light produced in a gravitational field gets red-shifted as it climbs out of the field. I wonder why the excess red-shift (over the Hubble red-shift) of stars and galaxies is only interpreted as a measure of how fast the object moves, and not of how strong the ambient gravitational field is.
2 Answers
A galaxy cluster could have $10^{14}$ solar masses within a radius of 5 Mpc.
In this case $GM/Rc^2 \sim 10^{-6}$, equivalent to a velocity shift of less than 1 km/s.
Our own Milky Way has a mass of around $10^{12}$ solar masses within 100 kpc. This gives a gravitational redshift of about 100 m/s.
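Both estimates are easy to reproduce. Here is a minimal Python sketch using the round masses and radii quoted above (so treat the outputs as orders of magnitude only):

```python
# Order-of-magnitude gravitational redshifts for the two cases above.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
pc = 3.086e16        # parsec, m

def grav_redshift(M, R):
    """Weak-field gravitational redshift z ~ GM/(R c^2),
    returned together with the equivalent velocity shift c*z in km/s."""
    z = G * M / (R * c**2)
    return z, z * c / 1e3

# Galaxy cluster: 1e14 solar masses within 5 Mpc
print(grav_redshift(1e14 * M_sun, 5e6 * pc))   # ~1e-6, ~0.3 km/s

# Milky Way: 1e12 solar masses within 100 kpc
print(grav_redshift(1e12 * M_sun, 1e5 * pc))   # ~5e-7, ~0.14 km/s
```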
These are completely negligible compared to cosmological redshifts and the peculiar velocities within clusters or groups of galaxies (the latter are of order 100-1000 km/s).
EDIT: To clarify, prompted by Lubos's comment (see below): because the Earth sits inside the Milky Way, light is gravitationally blueshifted on its way into the Milky Way and also into the "local group" of galaxies. However, that light has emerged from a different galaxy in a different environment, and it is redshifted as it makes its way out of that potential. These two shifts will not in general cancel, because galaxies and galaxy clusters have a variety of masses, sizes and potential depths. Hence the numbers I give are the correct orders of magnitude for the errors introduced by ignoring gravitational redshift, but any exact correction needs to be calculated on a case-by-case basis.
FURTHER EDIT: The comment above is even more apt in light of the literature referred to by Pulsar. For example, Cappi (1995) models the (more realistic) potentials of rich clusters and shows that the redshift is a strong function of where the galaxy is in the cluster: it can be anywhere in the range from less than 1 km/s up to 300 km/s at the centres of the most massive clusters. This is a lot larger than my estimate above because densities in clusters vary more steeply than $r^{-2}$. However, it is still small compared to the intrinsic velocity dispersions within the same clusters, because more massive clusters also have higher velocity dispersions.
The gravitational redshift is only significant for black holes, where the redshift factor may grow arbitrarily large in the vicinity of the horizon, and for neutron stars, where the frequency drops by something comparable to 50%.
For all other celestial objects, the redshift is much smaller than one, and only planets and white dwarfs are objects for which it may be easily detectable. It's helpful to calculate the redshift for the Sun. It is given by the gravitational potential at the surface: $$ \frac{\Delta f}{f} = - \frac{GM}{Rc^2} $$ For the Sun, the relative decrease of the frequency may be calculated e.g. via Wolfram Alpha,
and it is just $2\times 10^{-6}$: two parts per million. Note that the Hubble constant is about $10^{-10}$ per year (more precisely, the inverse of 14 billion years), so the distance needed to reduce the frequency by two parts per million is $2\times 10^{-6}$ times 14 billion light years, which is just 28,000 light years. That's still inside our galaxy!
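For concreteness, here is a quick numerical sketch of both numbers (round constants; nothing here depends on precision):

```python
# The Sun's gravitational redshift, and the cosmological distance that
# produces the same fractional frequency shift.
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m

z_sun = G * M_sun / (R_sun * c**2)
print(z_sun)                       # ~2.1e-6, i.e. two parts per million

# Hubble redshift z ~ d / (c/H0); with 1/H0 ~ 14 Gyr, the Hubble length
# is ~14 billion light years, so the equivalent distance is:
hubble_length_ly = 14e9
print(z_sun * hubble_length_ly)    # ~3e4 light years, matching the quoted
                                   # ~28,000 ly -- well inside the Milky Way
```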
For all other galaxies, the cosmological redshift is much greater than the gravitational redshift from Sun-like stars. The Sun is simply too large in radius, too diluted. This smallness of the gravitational potential becomes even more extreme if you consider "groups" of stars such as galaxies themselves.