I am trying to replicate Rømer's experiment, in which he determined the speed of light by measuring the timings of Io's eclipses by Jupiter. I'm using Stellarium for this, and no matter what I try I can't seem to land on the correct value.
Here is my setup.
- Using Stellarium, I noted the date and time when Io just begins to emerge from behind Jupiter and took this as my baseline.
- I then predict the next eclipse by adding 42.5 h (roughly Io's orbital period) to the baseline time.
- I use Stellarium to see when the next eclipse will happen and note the date and time.
- I then calculate the distance travelled using data from NASA's Horizons system for the Earth-Io distance at each eclipse time, and subtract the two values, which gives the distance.
- The predicted and actual eclipse timings differ by 104.5 s.
- Using these values, I then divide the distance by the time difference (a short Python sketch of this calculation follows the list).
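To make the procedure concrete, here is a minimal Python sketch of the calculation I'm describing (the function `light_speed_estimate` and the parameter names are just my own illustration of the arithmetic, not anything Stellarium or Horizons provides; the inputs are the timestamps I read off Stellarium and the distances from Horizons):

```python
from datetime import datetime

AU_KM = 1.496e8  # km per au, the conversion factor I use throughout

def light_speed_estimate(predicted: datetime, actual: datetime,
                         dist_baseline_au: float, dist_actual_au: float) -> float:
    """Rømer-style estimate: change in Earth-Io distance divided by the eclipse timing shift."""
    delta_t = abs((predicted - actual).total_seconds())       # timing shift in seconds
    delta_d = abs(dist_baseline_au - dist_actual_au) * AU_KM  # distance change in km
    return delta_d / delta_t                                  # speed in km/s
```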
Here are my values:
- Baseline eclipse - 2023-11-01 07:00:46
- Next predicted eclipse - 2023-11-03 01:28:20
- Actual eclipse - 2023-11-03 01:26:35
- Time difference = 104.5 s
Distance of Io from Earth:
- Baseline eclipse - 3.98457905366747 au = 596093026.42865 km (using 1 au = $1.496 \times 10^8$ km)
- Actual eclipse - 3.98258042749088 au = 595794031.95264 km
So distance travelled = 298994.47602 km
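As a sanity check on the unit conversion and subtraction, here is the same arithmetic in plain Python, just reproducing the numbers above:

```python
AU_KM = 1.496e8  # km per au, same conversion factor as above

d_baseline_km = 3.98457905366747 * AU_KM  # ~ 596093026.43 km
d_actual_km = 3.98258042749088 * AU_KM    # ~ 595794031.95 km

print(d_baseline_km - d_actual_km)        # ~ 298994.48 km
```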
Then to calculate speed:
$$ s = \frac{d}{t} = \frac{298994.47602 \text{ km}}{104.5 \text{ s}} \approx 2861.19 \text{ km/s} $$
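And the final division, again just as an arithmetic check:

```python
distance_km = 298994.47602  # change in Earth-Io distance, from above
delta_t_s = 104.5           # predicted-vs-actual timing shift

print(distance_km / delta_t_s)  # ~ 2861.19 km/s, nowhere near c ~ 299792 km/s
```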
This is obviously incorrect and looks to be off by roughly a factor of 100. I've double- and triple-checked all of my measurements and calculations, but I can't figure out where I've gone wrong.