In 1676, Rømer determined that the speed of light must be finite.
His experiment consisted of observing the eclipses of Io, one of Jupiter's moons, by Jupiter itself. He timed these eclipses over a period of half a year, starting when the Earth was closest to Jupiter and ending when it was farthest away. Since the orbital period of Jupiter is 11.8 years and that of Io is 1.77 days, the motion of Jupiter can be ignored in the argument.
At the end of this 6-month period, Rømer found that the eclipses were delayed by 22 minutes relative to what they would have been if Io orbited Jupiter at a constant rate. Rømer interpreted this as the time taken by the light from Io to travel the extra distance the Earth had moved away from it over the half year.
What I don't understand about these results is the figure of 22 minutes. The distance the Earth should have moved is twice its distance from the Sun, and the time light takes to travel that distance is around 16.67 minutes, not 22. I cannot think of a physical explanation for such a large delay, because:
- The difference between Earth's perihelion and aphelion is only ~3%, far too small to account for the ~32% difference in times.
- If we do consider the motion of Jupiter, we find that it actually gets closer to Earth, because it orbits in the same direction.
- The orbit of Jupiter is inclined 1.3° w.r.t. Earth's, so the distance should be a little smaller than 16.67 light-minutes, rather than larger.
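For reference, here is the quick sanity check behind my 16.67-minute figure, using modern values for the astronomical unit and the speed of light (which Rømer of course did not have):

```python
# Light travel time across the diameter of Earth's orbit (2 AU),
# using modern constants.
AU = 1.495978707e11      # astronomical unit, in metres
c = 2.99792458e8         # speed of light, in m/s

t_min = 2 * AU / c / 60  # travel time over 2 AU, in minutes
print(f"2 AU in light-minutes: {t_min:.2f}")  # ~16.63

# Relative discrepancy of Rømer's 22 minutes from this value
print(f"discrepancy: {(22 - t_min) / t_min:.0%}")  # ~32%
```

This is where the ~32% figure in my first bullet comes from.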
So my question is: why did Rømer find a figure of 22 minutes? Am I overlooking some part of the mechanics? Or were his measurements simply that imprecise, given that this was almost 350 years ago?
If it is the latter, could I then repeat the experiment with modern equipment and find a reasonable figure for the speed of light? Has anyone done this?