Veritasium's video explains why we can't measure the one-way speed of light: https://youtu.be/pTn6Ewhb27k?si=60Q0AffVdt09lJSb

However, I still don't completely get why this is the case.

Suppose we have two clocks, one at point A (the starting point) and one at point B, separated by a known distance $d$ that is large enough to allow a proper measurement.

Both clocks initially read 0 and are not running. The moment we send a laser pulse from clock A, clock A starts ticking. The moment the laser reaches clock B, clock B starts ticking. Clock B will then lag clock A, since it takes time for the light to go from A to B; the offset between their readings, $\Delta t = t_A - t_B$, should be exactly the time it takes light to travel that distance.

As an additional requirement, assume there is no relative motion between clocks A and B, so that we don't have to worry about time dilation.

So now, why can't we simply say that $v_{light} = \frac{d}{\Delta t}$? What is wrong with this argument? Doesn't this allow the one-way speed of light to be measured without using a mirror?
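
To make the scheme concrete, here is a minimal numerical sketch (with an arbitrary 3 km separation, and assuming, which is presumably the contentious part, that the light covers A to B at the round-trip value $c$):

```python
# Hypothetical illustration of the proposed two-clock scheme.
# Assumption (the very point in question): light travels A -> B at c.
c = 299_792_458.0   # round-trip speed of light, m/s
d = 3_000.0         # assumed separation between clocks A and B, m

t_emit = 0.0        # clock A starts ticking when the laser leaves A
t_arrive = d / c    # clock B starts ticking when the laser arrives at B

delta_t = t_arrive - t_emit   # offset between the two clocks' readings
v_light = d / delta_t         # the "one-way speed" I hope to infer

print(f"delta_t = {delta_t * 1e6:.3f} microseconds")
print(f"v_light = {v_light:.0f} m/s")  # comes out as c, by construction
```

By construction this returns $c$, but I don't see which step fails when the procedure is carried out physically.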


1 Answer


The problem with that scheme is that you must then compare clock B's reading with clock A's. How do you do that without sending a signal from B back to A and correcting for the delay? The correction requires the very one-way speed you set out to measure, so the procedure is circular.
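
One way to make the circularity concrete is the Reichenbach parametrization (a standard convention, introduced here only for illustration, not something the scheme above specifies): take the outbound one-way speed to be $c/(2\epsilon)$ and the return speed to be $c/(2(1-\epsilon))$, so that every $0 < \epsilon < 1$ yields the same round-trip speed $c$. A minimal sketch, assuming clock B radios its reading back to A the instant it starts:

```python
# Sketch of the clock-comparison protocol under an anisotropic one-way speed.
# Reichenbach convention: outbound speed c/(2*eps), return speed c/(2*(1-eps)).
# eps = 0.5 is the usual isotropic choice; any eps in (0, 1) keeps the
# round-trip speed equal to c.
c = 299_792_458.0
d = 3_000.0

for eps in (0.3, 0.5, 0.7):
    t_out = d / (c / (2 * eps))          # true A -> B travel time: 2*eps*d/c
    t_back = d / (c / (2 * (1 - eps)))   # true B -> A travel time

    # Clock B starts at 0 when the light arrives and immediately radios
    # its reading (0) back to A; the message takes t_back to arrive.
    reading_A_on_receipt = t_out + t_back   # A has been running the whole time
    reading_B_sent = 0.0

    # A corrects for the return delay using the only speed it knows, the
    # round-trip value c. This is the step that assumes the answer.
    inferred_delta_t = reading_A_on_receipt - (reading_B_sent + d / c)
    print(f"eps = {eps}: inferred one-way time = {inferred_delta_t:.9f} s, "
          f"d/c = {d / c:.9f} s")
```

The inferred one-way time comes out as $d/c$ for every $\epsilon$, even though the true A to B travel time is $2\epsilon d/c$; the procedure only ever recovers the round-trip speed.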

But the whole discussion ignores the fact that the very first measurement of the speed of light, by Ole Rømer, was a one-way measurement. He used a remote clock, the orbital motion of Jupiter's moon Io, whose distance from his local clock varied periodically because of Earth's motion around the Sun. He observed a periodic variation in the apparent timing of Io's eclipses, which he attributed to the varying light travel time between Jupiter and Earth as the distance changed. No return signal was required.
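
For scale, here is a back-of-the-envelope version of the Rømer estimate using modern round numbers (his own figures were cruder): the accumulated eclipse delay between Earth's closest and farthest approach to Jupiter is about 16.7 minutes, over an extra light path of roughly one orbital diameter, 2 AU.

```python
# Rough Romer-style estimate with modern round numbers (not his own values).
AU = 1.496e11                 # astronomical unit, m
delay = 16.7 * 60             # accumulated eclipse delay across Earth's orbit, s
c_estimate = 2 * AU / delay   # extra light path is the orbit's diameter, 2 AU
print(f"c ~ {c_estimate:.3e} m/s")   # roughly 3.0e8 m/s
```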
