84

Is there a physical reason behind the frequency and voltage in the mains electricity? I do not want to know why exactly a certain value was chosen; I am rather interested to know why that range/order of magnitude was selected. I.e., why 50 Hz and not 50000 Hz or 0.005 Hz?

For example, is 50 Hz the actual frequency at which a turbine rotates, and is it not practical to build one that rotates much faster or slower?

David Z
  • 77,804
SuperCiocia
  • 25,602

6 Answers

152

Why is mains frequency 50Hz and not 500 or 5?

Engine efficiency, rotational stress, flicker, the skin effect, and the limitations of 19th century material engineering.

50Hz corresponds to 3000 RPM. That range is a convenient, efficient speed for the steam turbine engines which power most generators and thus avoids a lot of extra gearing.

3000 RPM is fast, but not so fast that it puts too much mechanical stress on the rotating turbine or the AC generator. 500Hz would be 30,000 RPM, and at that speed your generator would likely tear itself apart. Here's what happens when you spin a CD at that speed, and for funsies at 62,000 FPS and 170,000 FPS.
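As a quick sanity check on the frequency-to-RPM relation, here is a small Python sketch using the standard synchronous-speed formula for an AC generator; I've assumed a two-pole machine (real generators may have more poles and spin proportionally slower):

```python
# Synchronous speed of an AC generator: N [RPM] = 120 * f / p,
# where f is the electrical frequency in Hz and p is the number of poles.
def sync_speed_rpm(freq_hz: float, poles: int = 2) -> float:
    return 120.0 * freq_hz / poles

for f in (5, 50, 500):
    print(f"{f:>3} Hz, 2-pole generator: {sync_speed_rpm(f):>6.0f} RPM")
# prints 300 RPM at 5 Hz, 3000 RPM at 50 Hz, and 30000 RPM at 500 Hz
```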

Why not slower? Flicker. Even at 40Hz an incandescent bulb cools slightly on each half cycle, reducing brightness and producing a noticeable flicker. Transformer and motor size also scales inversely with frequency: higher frequency means smaller transformers and motors.

Finally there is the skin effect. At higher frequencies AC current tends to flow near the surface of a conductor. This reduces the effective cross-section of the conductor and increases its resistance, causing more heating and power loss. There are ways to mitigate this effect, and they're used in high-tension wires, but they are more expensive and so are avoided in home wiring.
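To put a rough number on this, here is a short Python sketch of the standard skin-depth formula for copper (the resistivity value and the assumption of a non-magnetic conductor are mine, not from the answer):

```python
import math

# Skin depth: delta = sqrt(2 * rho / (omega * mu)), using copper's resistivity
# and the vacuum permeability (copper is essentially non-magnetic).
RHO_CU = 1.68e-8         # resistivity of copper, ohm*m
MU_0 = 4e-7 * math.pi    # permeability of free space, H/m

def skin_depth_mm(freq_hz: float) -> float:
    omega = 2 * math.pi * freq_hz
    return 1e3 * math.sqrt(2 * RHO_CU / (omega * MU_0))

for f in (50, 500, 5000):
    print(f"{f:>4} Hz: skin depth in copper ~ {skin_depth_mm(f):.1f} mm")
# roughly 9.2 mm at 50 Hz, 2.9 mm at 500 Hz, and 0.9 mm at 5 kHz
```

At 50 Hz the skin depth is around 9 mm, so ordinary wires use essentially their full cross-section; at a few kHz much of a thick conductor would sit idle.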

Could we do it differently today? Probably. But these standards were laid down in the late 19th century and they were convenient and economical for the electrical and material knowledge of the time.

Some systems do run at an order of magnitude higher frequency than 50Hz. Many enclosed systems such as ships, computer server farms, and aircraft use 400 Hz. They have their own generator, so the transmission loss due to the higher frequency is of less consequence. At higher frequencies transformers and motors can be made smaller and lighter, of great consequence in an enclosed space.

Why is mains voltage 110-240V and not 10V or 2000V?

Higher voltage means lower current for the same power. Lower current means less loss due to resistance. So you want to get your voltage as high as possible for efficient power distribution and less heating with thinner (and cheaper) wires. For this reason, power is often distributed over long distances in dozens to hundreds of kilovolts.
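As an illustrative example of how strongly this scales, here is a quick Python sketch of the I²R loss for the same delivered power at different distribution voltages; the 10 kW load and 1 Ω line resistance are arbitrary numbers chosen for the sketch:

```python
# Resistive line loss for the same delivered power at different voltages.
POWER_W = 10_000.0            # hypothetical 10 kW load
LINE_RESISTANCE_OHM = 1.0     # hypothetical total line resistance

def line_loss_w(voltage_v: float) -> float:
    current_a = POWER_W / voltage_v            # I = P / V
    return current_a**2 * LINE_RESISTANCE_OHM  # loss = I^2 * R

for v in (240, 2_400, 24_000):
    print(f"{v:>6} V: current {POWER_W / v:7.2f} A, line loss {line_loss_w(v):8.2f} W")
# at 240 V about 1.7 kW is lost in the line; at 24 kV it is well under a watt
```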

Why isn't it lower? For a given current, the power delivered is proportional to the voltage. A 10 volt supply would have trouble running your higher-energy household appliances like lights, heating or a refrigerator's compressor motor; a 2 kW heater at 10 V, for example, would need to draw 200 A. At the time this was being developed, the voltage choice was a compromise among the needs of lights, motors and appliances.

Why isn't it higher? Insulation and safety. High-voltage AC wires need additional insulation to make them safe to touch and to avoid interference with other wiring or radio receivers. The cost of home wiring was a major concern in the early adoption of electricity. Higher voltages would make home wiring bulkier, more expensive and more dangerous.

Schwern
  • 4,534
27

In the end, the choice of a single specific number comes from the necessity to standardize. However, we can make some physical observations to understand why that final choice had to fall in a certain range.

Frequency

Why a standard?

First of all, why do we even need a standard? Can't individual appliances convert the incoming electricity to whatever frequency they want? Well, in principle it's possible, but it's rather difficult. Electromagnetism is fundamentally time invariant and linear; the differential equations we use to describe it (Maxwell's equations) are such that a system driven by a sinusoidal input at frequency $\omega$ responds only at that same frequency. In order to get out a frequency different from $\omega$, the electromagnetic fields have to interact with something else, notably charged matter. This can come in the form of a mechanical gear box or nonlinear electrical elements such as transistors. Nonlinear elements such as the transistor can generate harmonics of the input, i.e. frequencies $2 \omega$, $3 \omega$, etc. In any case, however, frequency conversion introduces efficiency loss, cost, and bulk to the system.
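A quick numerical illustration of that point (my own sketch, with a toy quadratic characteristic $y = x + 0.2x^2$ standing in for a real nonlinear device): drive a linear and a nonlinear element with a 50 Hz sinusoid and look at the frequency content of the response.

```python
import numpy as np

fs = 10_000                       # sample rate, Hz
t = np.arange(0, 1, 1 / fs)       # one second of signal
x = np.cos(2 * np.pi * 50 * t)    # 50 Hz drive

for name, y in [("linear", x), ("nonlinear", x + 0.2 * x**2)]:
    spectrum = np.abs(np.fft.rfft(y)) / len(t)
    freqs = np.fft.rfftfreq(len(t), 1 / fs)
    print(name, "response contains:", freqs[spectrum > 0.01], "Hz")
# the linear case contains only 50 Hz; the nonlinear one also has
# 0 Hz (a DC offset) and a 100 Hz harmonic
```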

In summary, because of the time invariance and linearity of electromagnetism, it is considerably more practical to choose a single frequency and stick to it.

Light flicker

A historical note by E. L. Owen (see references) records that the final decision between 50 and 60 Hz was somewhat arbitrary, but based partly on the consideration of light flicker.

During the lecture, while Bibber recounted Steinmetz's contributions to technical standards, he briefly repeated the story of the frequencies. By his account, "the choice was between 50- and 60-Hz, and both were equally suited to the needs. When all factors were considered, there was no compelling reason to select either frequency. Finally, the decision was made to standardize on 60-Hz as it was felt to be less likely to produce annoying light flicker."

The consideration of light flicker comes up elsewhere in historical accounts and explains why very low frequencies could not be used. When we drive a pure resistance with an ac current $I(t) = I_0 \cos(\omega t)$, the instantaneous power dissipation is proportional to $I(t)^2$. This signal oscillates in time at a frequency $2\omega$ (remember your trig identities; the identity is written out below). Therefore, if $\omega$ is lower than around $40 \, \text{Hz}$$^{[a]}$, the power dissipated varies slowly enough that you can perceive it as a visual flicker. This sets a rough lower limit on the frequency you can use for driving a light source. Note that the arc lamps in use when electrical standards were developed may not have had a purely resistive electrical response (see Schwern's answer, where cooling on each cycle is mentioned), but the source frequency is always present in the output even in nonlinear and filtered systems.
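For concreteness, the identity in question is $$P(t) \propto I_0^2 \cos^2(\omega t) = \frac{I_0^2}{2}\left[1 + \cos(2 \omega t)\right] \, ,$$ so the dissipated power is a constant term plus a ripple at twice the drive frequency; 50 Hz mains therefore makes a purely resistive lamp's output ripple at 100 Hz.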

Reflections / impedance matching

Alternating current signals travelling on a wire obey wave-like behavior. In a rough sense, the higher the frequency the more wave-like the signal. A good rule of thumb is that if the length of the wires is comparable to or much longer than the wavelength of the signal, then you have to worry about wave-like phenomena such as reflection. The wavelength $\lambda$ of an electrical signal is roughly $$\lambda = c / f$$ where $c$ is the speed of light and $f$ is the frequency. Suppose we'd like to transmit the electricity from an electrical substation to a house and we want to keep the wavelength long enough that reflections are not an issue, without having to deal with careful impedance matching. Let's put in a length of $1000 \, \text{m}$ to be conservative. Then we get $$f \leq c / 1000 \, \text{m} = 300 \, \text{kHz} \, .$$
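Plugging in a few numbers (a small Python sketch of the same $\lambda = c/f$ estimate; I use the free-space speed of light, though signals on real cables travel somewhat slower):

```python
# Wavelength lambda = c / f for a few frequencies.
C = 3.0e8  # speed of light, m/s

for f in (50, 400, 300_000):
    print(f"{f:>7} Hz: wavelength ~ {C / f / 1000:,.0f} km")
# 50 Hz -> ~6,000 km, 400 Hz -> ~750 km, 300 kHz -> ~1 km
```

At 50 Hz the wavelength is thousands of kilometres, so reflections are a non-issue for ordinary distribution runs.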

Voltage

We're talking about the voltage inside the building here. Note that power is transmitted at much higher voltage and then stepped down near the end point. The 120 V choice apparently comes from the fact that electricity was originally used for lighting, and the first lamps back in those early days were most efficient at around 110 V. The value 120 V may have been chosen to offset voltage drop in the wires going to the lighting sources.

Further reading

Detailed document by E. L. Owen with references

$[a]$: I'm not an expert in human flicker perception. This number is a rough guess based on personal experience and some literature.

P.S. I consider this answer a work in progress and will add more as I learn more.

DanielSank
  • 25,766
13

The two other answers address the frequency issue. The voltage issue is much simpler.

If the voltage is too high, you run the risk of arcs between conductors. The minimum distance between conductors at which an arc can form grows roughly in proportion to the voltage. At 240V, an arc can bridge a distance of a few millimeters in air, depending on humidity. Much higher voltages quickly become impractical...

If the voltage gets lower, on the other hand, you need more current for a given power. But the heating of wires is proportional to the current squared, which means you need thicker wire with lower resistance. That's cumbersome, expensive and stiff (as an example, 32A-rated wire is barely bendable enough for wall corners).
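To see how quickly the required copper grows at low voltage, here is a small Python sketch; the 2 kW load, 20 m round-trip run and 1% allowed wiring loss are assumptions I picked for illustration, not figures from the answer:

```python
# Copper cross-section needed to keep the wiring loss at a fixed fraction of
# the load power. From I^2 * R = eps * P with I = P / V, the allowed wire
# resistance is R = eps * V^2 / P, so the required area A = rho * L / R
# scales as 1 / V^2.
RHO_CU = 1.68e-8      # resistivity of copper, ohm*m
LENGTH_M = 20.0       # hypothetical round-trip wire length
LOAD_W = 2_000.0      # hypothetical 2 kW load
LOSS_FRACTION = 0.01  # allow 1% of the load power to heat the wires

for v in (12, 120, 240):
    r_allowed = LOSS_FRACTION * v**2 / LOAD_W
    area_mm2 = RHO_CU * LENGTH_M / r_allowed * 1e6
    print(f"{v:>3} V: need roughly {area_mm2:6.1f} mm^2 of copper")
# about 467 mm^2 at 12 V (a bar, not a wire), 4.7 mm^2 at 120 V, 1.2 mm^2 at 240 V
```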

So the chosen 120/240V reflects this balance between arcing concerns (especially around connections) and wire heating.

I have also heard that safety favours a fairly high voltage, so that muscle spasms give you a chance to drop whatever you're touching before you get burnt to the core. I don't know to what extent this is true...

Nicolas
  • 1,627
2

The disadvantage of having too low a frequency is that the mains transformers become very large.

However, there have been lower-frequency standards (25 Hz, 15 Hz, etc.). These are used by railways (mostly legacy systems).

1

Practical reasons include the skin effect (you do not want your frequency to exceed a few kHz unless you are willing to use something akin to Litz wire to carry large currents) and the size of the magnetic cores of transformers, which must be able to magnetically store more than the maximum energy to be transmitted in each cycle, so that their volume grows with the cycle period. However, these physical constraints do not define a sharp optimum; 10 Hz or 500 Hz would be just as reasonable, and similar values are used in practice even today: modern jet planes have 400 Hz power supplies whilst, at least in Germany, the power supply for electric trains is standardized at 16 2/3 Hz.
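To make the core-size argument concrete, here is a tiny Python sketch of the energy a transformer has to pass per cycle, $E = P/f$; the 1 kW figure is an arbitrary example load of my choosing:

```python
# Energy handled per cycle, E = P / f -- a rough proxy for how much magnetic
# core a transformer needs at a given power level.
POWER_W = 1_000.0  # hypothetical 1 kW load

for f, label in [(16.67, "16 2/3 Hz (rail)"),
                 (50.0,  "50 Hz (mains)"),
                 (400.0, "400 Hz (aircraft)")]:
    print(f"{label:<18}: {POWER_W / f:5.1f} J per cycle")
# roughly 60 J at 16 2/3 Hz, 20 J at 50 Hz, and 2.5 J at 400 Hz
```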

There is obviously a similar trade-off between voltage and current, but as long as your chosen frequency allows you to compensate for a lower voltage with thicker wires and for a higher voltage with thicker insulation, you might argue that this is more of an economic or safety trade-off. After all, for long distances we transform to a higher voltage to achieve a better compromise (and must use AC rather than DC to always be able to do that, even with purely passive, historically old techniques). Hence I suspect, without actually knowing, that historical reasons, such as the maximum practical voltage for which light bulbs could be made at the time of standardization, or perhaps accompanying ideas about what might still not be too dangerous for factories and homes, play a role.

SuperCiocia
  • 25,602
0

It seems like 60 Hz may have been selected instead of 55 or 75 simply because there are 60 seconds in a minute and so 60 cycles per second seemed a comfortable number.

During the early days of distributed power transmission the frequencies and voltages would have been all over the place. The limits of what was safe and convenient would have been developed through practical experience.

The core materials available for transformers would have favoured low frequencies, while keeping transformer mass down would have favoured high frequencies. The range of 50-60 Hz was the sweet spot, and 50 and 60 are both 'round' numbers that divide well for timing purposes.

The voltages would have standardised somewhat along with the equipment supplied: light bulbs, motors and the like would have been sold to match the local supply, and vendors' voltage ranges would in turn have pushed generation voltages towards a common optimum.

KalleMP
  • 266