
I was reading a question involving an ultracentrifuge to test General Relativity. Instead of using an atomic clock, the asker posited using radioactive decay as the metric to evaluate time dilation effects. The answers did a good job of explaining that the acceleration due to rotation is more in the realm of Special Relativity, so I wondered whether the following experiment would be a valid/applicable test of General Relativity.

Take a quantity of a radioactive isotope and split it in half. One half stays on Earth; the other half is sent to the Moon, along with the appropriate detector(s). Since the gravitational field experienced on the Moon is both different from and smaller in total than the field on Earth, would the sample on the Moon decay at a slower rate, and could that difference in rate be attributed to General Relativity? I'm assuming that the field generated by the Moon's own mass is smaller than the reduction in Earth's field due to the increased distance, so the total field at the lunar surface is still the weaker of the two.
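For scale, a rough back-of-the-envelope estimate of the surface gravitational potentials (using standard values for the masses and radii, and ignoring the Earth's residual potential at the lunar distance and the orbital motion) suggests the fractional difference in clock rates would be about:

$$ \frac{GM_{\text{Earth}}}{R_{\text{Earth}}\,c^2} - \frac{GM_{\text{Moon}}}{R_{\text{Moon}}\,c^2} \approx 7.0\times 10^{-10} - 0.3\times 10^{-10} \approx 6.7\times 10^{-10} $$

i.e. of order one part in $10^9$.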

1 Answer


The problem is that radioactivity is a poor way to measure time.

Suppose you are trying to measure the current level of radioactivity from some source. If you measure $N$ counts, the standard (Poisson) error will be of order $\sqrt{N}$. The time dilation effect we are interested in is of order one part in $10^9$, so the relative error on the count has to be at least that small:

$$ \frac{\sqrt{N}}{N} = \frac{1}{\sqrt{N}} = O\left(10^{-9}\right) $$

Giving us:

$$ N = O\left(10^{18}\right) $$

So you need to measure a minimum of 1,000,000,000,000,000,000 radioactive decays to get the accuracy needed, and that makes the experiment impractical.
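To put that in perspective, here is a quick Python sketch of the counting requirement; the 1 GBq source and perfect detection efficiency are arbitrary, optimistic assumptions of mine, just to turn the count into a timescale:

```python
# Back-of-the-envelope check of the counting requirement above.
# The source activity and 100% detection efficiency are illustrative
# assumptions, not realistic experimental parameters.

fractional_effect = 1e-9              # Earth-Moon rate difference to resolve

# Poisson statistics: relative error on N counts is 1/sqrt(N),
# so resolving a 1e-9 effect needs 1/sqrt(N) ~ 1e-9.
required_counts = fractional_effect ** -2
print(f"counts needed: {required_counts:.1e}")     # ~1e18

# Hypothetical 1 GBq source with every decay detected:
detected_rate = 1e9                   # decays detected per second
seconds = required_counts / detected_rate
years = seconds / (365.25 * 24 * 3600)
print(f"counting time: {years:.0f} years")         # roughly 30 years
```

Even with those wildly optimistic numbers, the counting time runs to decades.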

As several of the comments have said, if you want to measure time accurately you'd simply use a caesium clock or some similar device.

John Rennie