
Suppose a clock that ticks every second is moving at half the speed of light away from me.

Without special relativity, I would see the ticks every 1.5 seconds, since the light from each tick has to travel 0.5 light-seconds farther than the light from the previous tick.

Using the Lorentz factor, the time between ticks would appear $\frac{1}{\sqrt{1 - (\frac{1}{2})^2}} \approx 1.155$ times longer to me. Now, if I want to include the time it takes for the light to travel to me, I could calculate it in one of two ways:

  1. The clock ticks (in its reference frame) every second, so it travels 0.5 light seconds between ticks, making each tick 0.5 seconds longer for a total of 1.155 + 0.5 = 1.655 seconds between ticks.
  2. The clock ticks (in my reference frame) every 1.155 seconds, so it travels 0.578 light seconds between ticks, making each tick 0.578 seconds longer for a total of 1.155 + 0.578 = 1.733 seconds between ticks.

Which of these is the correct calculation?
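In code, the two candidates look like this (a quick numeric sketch; the variable names are just mine):

```python
import math

beta = 0.5                               # clock recedes at 0.5 c
gamma = 1 / math.sqrt(1 - beta**2)       # Lorentz factor, ≈ 1.155

# Candidate 1: dilated tick plus light delay from the clock-frame distance (0.5 light-seconds)
candidate_1 = 1 * gamma + 0.5            # ≈ 1.655 s

# Candidate 2: dilated tick plus light delay from the distance covered in my frame (0.578 light-seconds)
candidate_2 = 1 * gamma + beta * (1 * gamma)   # ≈ 1.733 s

print(candidate_1, candidate_2)
```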

John

3 Answers


Calculation 2 is correct. That is how you determine the interval between received ticks when working entirely in your own reference frame.

The same calculation can be done in the reference frame of the clock, but you must perform the change of coordinates correctly. In that reference frame, you are moving away from the clock at 0.5c; that is, every second, you move 0.5 light-seconds further away from the clock. When one tick has reached you, the next tick is 1 light-second behind. It takes that light 2 seconds to catch up with you (because you have moved an additional 1 light-second in that time). Thus the ticks reach you every 2 seconds in the clock's reference frame. However, your clock is slowed down because you are moving, so according to your clock, the ticks reach you every 2/1.1547 ≈ 1.732 seconds.

Of course, you get the same answer in both reference frames, as you must.
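As a quick numeric cross-check of both frames (a sketch in Python; I'm assuming one proper-time tick per second and a recession speed of 0.5c, as in the question):

```python
import math

beta = 0.5                              # recession speed as a fraction of c
gamma = 1 / math.sqrt(1 - beta**2)      # ≈ 1.1547

# Your frame: dilate the tick, then add the light-travel delay across the extra distance
tick = gamma                            # proper 1 s tick, dilated
period_your_frame = tick + beta * tick  # ≈ 1.732 s

# Clock frame: each tick's light closes a 1 light-second gap at an effective rate (1 - beta) c,
# so arrivals are 1 / (1 - beta) = 2 s apart in clock time; your slowed clock logs 2 / gamma s
period_clock_frame = (1 / (1 - beta)) / gamma   # ≈ 1.732 s

print(period_your_frame, period_clock_frame)
```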

anon

> 1. The clock ticks (in its reference frame) every second, so it travels $0.5$ light seconds between ticks, making each tick $0.5$ seconds longer for a total of $1.155 + 0.5 = 1.655$ seconds between ticks.

This first method is incorrect because it uses the tick interval of the moving clock ($1 \ \text{second}$), as measured in the clock's own rest frame, to calculate the distance travelled in the frame that sees the clock moving. The distance must be measured in the reference frame of the observer receiving the signal, so we must use the interval between emitted signals as measured by that observer. If observer $\text{A}$ watches signals from clock $\text{B}$, which is moving away from him, the distance $\text{B}$ moves in $\text{A}$'s reference frame must be calculated using the interval measured on $\text{A}$'s clock ($1.155 \ \text{seconds}$).

The second method is the correct one and agrees with the result given by the relativistic Doppler effect:

$${f_r} = {f_s}\sqrt{\frac{1-v/c}{1+v/c}} = \sqrt{\frac{0.5}{1.5}} \approx 0.577,$$

where $v$ is the velocity of the clock receding from the observer, $f_r$ is the frequency of the signal received by the observer, and $f_s$ is the frequency of the signal emitted by the receding clock. Since $f_r$ is a frequency, we take its inverse to find the time interval between received signals: $\Delta T_r = 1/f_r \approx 1.732 \ \text{seconds}$, given that the receding source clock emits signals at a rate of one per second.
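A quick numerical check of the formula above (a sketch; $f_s = 1 \ \text{Hz}$ is taken from the question's one-tick-per-second clock):

```python
import math

v_over_c = 0.5    # recession speed of the clock as a fraction of c
f_s = 1.0         # emitted frequency: one tick per second in the clock's frame

# Relativistic Doppler shift for a receding source
f_r = f_s * math.sqrt((1 - v_over_c) / (1 + v_over_c))   # ≈ 0.577 Hz

delta_T_r = 1 / f_r    # interval between received ticks, ≈ 1.732 s

print(f_r, delta_T_r)
```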

KDP


The right answer is 2), as already pointed out. The mistake in 1) is that you forgot to time-dilate the $0.5 \ \text{s}$.

$$\text{Total period in your reference frame} = 1 \ \text{s} \times \gamma + 0.5 \ \text{s} \times \gamma = 1.5 \times 1.155 \ \text{s} = 1.732 \ \text{s}.$$
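Numerically (a minimal sketch, with $\gamma$ computed from $v = 0.5c$):

```python
import math

beta = 0.5
gamma = 1 / math.sqrt(1 - beta**2)   # ≈ 1.155

# Both the 1 s tick and the 0.5 s light-delay term pick up the same factor of gamma
total_period = (1.0 + 0.5) * gamma   # ≈ 1.732 s

print(total_period)
```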

harry