I'm trying to build a DIY tool to measure the speed of something moving very fast in a straight line, e.g. a bullet exiting the barrel of a rifle, based on the Doppler effect.
I've looked at several Doppler radar modules online, but most of them state a maximum detectable speed of 150-350 mph (roughly 67 to 156 m/s), which is far below the 700-1000 m/s of a typical rifle bullet.
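(For reference, this is the quick mph-to-m/s conversion I used, based on 1 mile = 1609.344 m:)

```python
# Quick check of the mph -> m/s conversion for the module specs I found
MPH_TO_MS = 1609.344 / 3600  # 1 mile = 1609.344 m, 1 hour = 3600 s

for mph in (150, 350):
    print(f"{mph} mph is about {mph * MPH_TO_MS:.0f} m/s")
```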
Commercial products built specifically for this purpose (measuring the speed of a bullet) specify an operating frequency of ~120 GHz, and I'd like to understand why such a high frequency is needed.
I can understand why a "too low" frequency, let's say 100 Hz, would be an issue:
- The bullet is moving at 1000 m/s
- To detect the reflected wave, the bullet must not be too far from the radar, say 1 m at most
- So if the frequency is 100 Hz, we'll probably miss the bullet: during one period of 0.01 s (1 / 100 Hz), the bullet can already be 10 m away (1000 m/s × 0.01 s), far beyond our assumed 1 m detection range.
However, by the same reasoning, a few kHz should already be more than enough to measure the speed of the bullet.
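To make that reasoning concrete, here is a small Python sketch of the same arithmetic; the 1000 m/s speed and 1 m range are just the numbers I assumed above:

```python
# Sketch of my reasoning above: how far does the bullet travel during one
# period of the transmitted wave? (speed and range are my assumed numbers)
BULLET_SPEED_MS = 1000.0  # m/s, assumed muzzle velocity
MAX_RANGE_M = 1.0         # m, assumed maximum distance for a usable echo

for freq_hz in (100.0, 1_000.0, 10_000.0):
    period_s = 1.0 / freq_hz               # duration of one wave cycle
    travel_m = BULLET_SPEED_MS * period_s  # bullet displacement in that time
    status = "outside" if travel_m > MAX_RANGE_M else "within"
    print(f"{freq_hz:>8.0f} Hz: period {period_s:.4g} s, "
          f"bullet moves {travel_m:.4g} m ({status} the {MAX_RANGE_M:g} m window)")
```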
Why, in reality, do we need much higher frequencies such as 120 GHz to measure the speed of a bullet? Is there a formula to calculate the required frequency?