52

I can understand that the definition of a second has no uncertainty when it is tied to the transition of the Cs atom: it has no error because it is an absolute reference, and we measure everything else against that physical definition of the second, as atomic clocks do.

But why doesn't the speed of light have uncertainty? Isn't the speed of light something that's measured physically?

See, for example, the definitions of the SI units at NIST.

Qmechanic
  • 220,844

8 Answers

62

The second and the speed of light are precisely defined, and the metre is then specified as a function of $c$ and the second. So when you experimentally measure the speed of light you are effectively measuring the length of the metre, i.e. the experimental error is the error in the measurement of the metre, not the error in the speed of light or the second.

It may seem odd to treat the metre as variable and the speed of light as a fixed quantity, but it's not as odd as you may think. The speed of light is not just some number, it's a fundamental property of the universe and is related to its geometry. By contrast the metre is just a length that happens to be convenient for humans. See What is so special about speed of light in vacuum? for more info.
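A minimal sketch of this point in Python (the time-of-flight and uncertainty figures are invented for illustration): with $c$ fixed exactly by definition, all the uncertainty of a time-of-flight experiment lands on the realized length, never on $c$.

```python
# With c exact by definition (since 1983), a time-of-flight measurement
# calibrates a length; the error budget belongs to the length, not to c.
C = 299_792_458  # m/s, exact by definition

t_round_trip = 6.671e-9   # s, measured round-trip time (invented figure)
u_t = 0.002e-9            # s, timing uncertainty (invented figure)

length = C * t_round_trip / 2    # realized one-way length
u_length = C * u_t / 2           # the uncertainty propagates to the length

print(f"length = {length:.6f} m +/- {u_length:.6f} m")
# c itself carries no uncertainty: it is the definition, not the measurand.
```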

John Rennie
  • 367,598
19

To repeat Wikipedia:

The speed of light in vacuum, commonly denoted c, is a universal physical constant important in many areas of physics. Its value is exactly 299,792,458 metres per second, a figure that is exact because the length of the metre is defined from this constant and the international standard for time.

In other words, it's exact because we have a definition of the second:

the duration of 9,192,631,770 periods of the radiation corresponding to the transition between the two hyperfine levels of the ground state of the caesium 133 atom

and the metre is the distance light travels in $1/299,792,458$ of a second.

That leaves no room for error in the definition of the speed of light.
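The circularity is easy to see numerically. A small sketch (Python, merely restating the two definitions; exact rational arithmetic avoids any float rounding):

```python
from fractions import Fraction

C = Fraction(299_792_458)        # m/s, fixed exactly by definition
t = Fraction(1, 299_792_458)     # s, the defining time interval for the metre

d = C * t                        # distance light covers in that time, in metres
print(d)                         # 1 -- exactly; the definitions close on themselves
```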

hdhondt
  • 11,338
8

As you can read in this Wikipedia article, the SI was recently redefined so that all units are based on seven constants of nature. To make this possible, those constants had to be assigned exact values: they were fixed, with no error margin, at their commonly accepted values, and all other SI units are now derived from these fundamental constants.
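For reference, a sketch listing the seven defining constants of the revised (2019) SI, with the exact values fixed by the redefinition:

```python
# The seven defining constants of the 2019 SI. Each value is exact by
# definition; every SI unit is derived from some combination of these.
SI_DEFINING_CONSTANTS = {
    "delta_nu_Cs": 9_192_631_770,     # Hz     caesium-133 hyperfine transition
    "c":           299_792_458,       # m/s    speed of light in vacuum
    "h":           6.626_070_15e-34,  # J s    Planck constant
    "e":           1.602_176_634e-19, # C      elementary charge
    "k":           1.380_649e-23,     # J/K    Boltzmann constant
    "N_A":         6.022_140_76e23,   # 1/mol  Avogadro constant
    "K_cd":        683,               # lm/W   luminous efficacy of 540 THz light
}
```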

5

In the SI system, a meter is defined to be 1/299,792,458 light-seconds (in other words, the distance traveled by light in vacuum in 1/299,792,458 of a second), and the speed of light in vacuum is therefore defined to be exactly 299,792,458 m/s.

3

The speed of light does in fact fluctuate in vacuum: a single photon can propagate slightly faster or slower than $c$. This can be interpreted as virtual photons appearing ahead of the propagating one, followed by annihilation of the original photon with one of those that appeared. Only statistically is the speed of light constant.

Anixx
  • 11,524
2

The reason is that measurements of the speed of light became very, very precise, far more precise than measurements of the Earth's diameter or of any physical object such as a one-metre rod. So it is better to fix the value of $c$ at some exact number of metres per second. Something has to be fixed, so let it be something we can easily measure in any laboratory.
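A rough sense of the numbers before the 1983 redefinition (a sketch; the commonly quoted 1975 CGPM figure of a relative uncertainty of about $4\times10^{-9}$ is assumed here):

```python
# Before 1983 the metre was defined independently (via a krypton-86 line),
# and c was measured against it.
c_recommended = 299_792_458   # m/s, 1975 CGPM recommended value
rel_u = 4e-9                  # relative uncertainty, dominated by the metre

print(f"u(c) ~ {c_recommended * rel_u:.1f} m/s")
# ~1.2 m/s: the limit was how well the old metre could be realized,
# not how well time intervals could be measured.
```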

1

the definition of a second wouldn't have an uncertainty when related to the transition of the Cs atom,

The definition of the SI unit "second" does not refer to just any given sample of Cs atoms, and specifically not to transitions between the two hyperfine levels of the ground state of just any given sample of caesium-133 atoms;
rather, it refers to an idealization: a caesium atom at rest, at a temperature of 0 K, and free of any perturbation.

Insofar as this idealization is unambiguously defined, such that for given samples of caesium-133 atoms it can be unambiguously measured by how much they differ from the idealization, the SI unit "second" has no uncertainty.

But why doesn't the speed of light have uncertainty?

That's due to our definition of (how to measure) "speed";
and, in the first place, due to our definition of (how to measure) "distance" between participants ("ends") who were and remained at rest with respect to each other, and (therefore also) due to our definition of (how to measure) whether a given pair of participants is "at rest" with respect to each other, or not.

Specifically, within the framework of (special) relativity, and thus of contemporary physics in general, we define the distance between two suitable participants (i.e. who were and remained at rest with respect to each other), say $A$ and $B$, through the ping duration between them, i.e. the duration measured by either participant from indicating a signal until indicating the reception of the corresponding reflection off the other participant. (By the definition of how to measure mutual rest, these mutual ping durations are, trial by trial, equal and constant.)

The distance of $A$ and $B$ with respect to each other is then expressed as $$\ell[~A, B~] = \ell[~B, A~] := \frac{c}{2} ~ \tau_A[~\text{signal}, \circledR B \circledR \text{signal}~] = \frac{c}{2} ~ \tau_B[~\text{signal}, \circledR A \circledR \text{signal}~],$$

where "$c$" is (just) a distinctive symbol (for distinguishing ping durations between a suitable pair of participants from other durations) which is (evidently) not zero; and the factor $\frac{1}{2}$ is included by convention.

Further, using the definition of "average speed of a trip from $A$ to $B$" as the ratio between "distance between start and finish" and "duration of the course having been occupied",
the (average) signal front speed of a signal exchanged between $A$ and $B$ is evaluated as the ratio between $\ell[~A, B~]$ and half the ping duration between $A$ and $B$; explicitly, therefore:

$$\ell[~A, B~] ~/~ \frac{\tau_A[~\text{signal}, \circledR B \circledR \text{signal}~]}{2} = $$ $$\frac{c}{2} ~ \tau_A[~\text{signal}, \circledR B \circledR \text{signal}~] ~/~ \frac{\tau_A[~\text{signal}, \circledR B \circledR \text{signal}~]}{2} = c.$$

So the symbol "$c$", which had been formally introduced in the distance definition, is subsequently identified as the value of (average) signal front speed (or colloquially: the "speed of light in vacuum").
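A tiny sketch of this circularity (Python; the ping values are invented, and the helper name `distance` is just for illustration): whatever ping duration is actually observed, the procedure returns $c$ by construction.

```python
# Ping-based distance definition: distance is *defined* as (c/2) * ping
# duration, so the inferred signal speed is c for any measured ping value.
C = 299_792_458  # m/s; here just the conventional constant from the definition

def distance(ping_duration: float) -> float:
    """Distance between mutually-at-rest participants, by the ping definition."""
    return C / 2 * ping_duration

for ping in (1e-9, 0.1, 42.0):             # arbitrary ping durations (invented)
    speed = distance(ping) / (ping / 2)    # distance over one-way duration
    print(speed)                           # always c (up to float rounding)
```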

Isn't the speed of light something that's measured physically?

No: there is nothing genuinely to be measured; the result is necessarily "$c$", as sketched above, plainly and without any uncertainty. (Therefore "$c$" also lends itself as an "obvious, natural unit of speed". But, of course, speed values are independent of any particular choice of units in which they are expressed.)

What can and should be measured, trial by trial, is instead foremost whether the two particular "ends" under consideration were and remained indeed at rest with respect to each other (or, to quantify, by how much they were not).

user12262
  • 4,360
-1

In order to answer this question, one has to realize that the term "speed of light" has two components. There is the actual physical speed of light (electromagnetic radiation), and the value associated with it.

It should be readily apparent that the value is not a constant, because it depends on the system of units used. It is 186,000 miles per second in one system and 299,792,458 m/s in another; if we redefine the unit of length and/or of time, we obtain a different value.

However, since the actual speed of light depends only on the properties of the medium through which it propagates, the speed of light is "constant" (or absolute) in a homogeneous medium.
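A quick sketch of the units point (Python; using the exact international definition 1 mile = 1609.344 m):

```python
# The numerical value changes with the units; the physical speed does not.
c_m_per_s = 299_792_458        # m/s, exact by definition
METRES_PER_MILE = 1609.344     # m, exact definition of the international mile

print(f"{c_m_per_s / METRES_PER_MILE:,.0f} miles/s")
# ~186,282 miles/s -- the same speed, expressed as a different number
```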

Guill
  • 2,573