Integrated circuits seem to have standard voltages of 5 V, 3.3 V, 2.5 V, 1.8 V, etc.
- Who decides these voltages?
- Why do smaller devices require lower voltages?
A lower VDD is required as the gate geometry shrinks. This prevents damage to the CMOS gate oxide and minimizes leakage. When the fabs switched from 0.5 µm to 0.35 µm, the thinner gate oxide could only handle potentials up to 3.6 V, which led to supplies of 3.3 V ±10%. With the switch to 0.18 µm, the voltage was reduced further to 1.8 V ±10%. In the latest processes (e.g. 45 nm), the gate dielectric is a high-k material such as hafnium oxide to reduce leakage.
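To put rough numbers on the oxide-stress argument, here is a minimal Python sketch that treats the field across the gate oxide as V/t_ox. The oxide thicknesses are ballpark figures assumed purely for illustration, not values from this answer:

```python
# Rough field across the gate oxide, E = V / t_ox.
# Oxide thicknesses are ballpark figures assumed for illustration only.
processes = [
    # (node, supply voltage in V, assumed gate-oxide thickness in nm)
    ("0.5 um",  5.0, 10.0),
    ("0.35 um", 3.3,  7.0),
    ("0.18 um", 1.8,  3.5),
]

for node, vdd, t_ox_nm in processes:
    field_v_per_cm = vdd / (t_ox_nm * 1e-7)  # 1 nm = 1e-7 cm
    print(f"{node}: {vdd} V across {t_ox_nm} nm -> ~{field_v_per_cm / 1e6:.1f} MV/cm")
```

With these assumed thicknesses the field stays around 5 MV/cm at each node, comfortably below the roughly 10 MV/cm usually quoted as the breakdown field of SiO2, which is only possible because the supply drops along with the oxide thickness.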
That's a combination of several factors:
Recently the picture got more complicated: the supply voltage can't easily scale down further because of the limited intrinsic gain of the transistors. This gain creates a tradeoff (at a given supply voltage) between the "on" resistance of the transistor channel, which limits switching speed, and the "off" resistance, which determines how much current leaks through it. That's why the core supply voltage has settled at around 1 V, causing the speed of new digital ICs to grow more slowly and their power consumption to grow faster than it used to. Things get worse when you consider manufacturing process variability: if you can't position the transistor's switching threshold voltage accurately enough (and as transistors get smaller this becomes very difficult), the margin between the "on" and "off" resistances disappears. The variability is an engineering problem, so at least in theory it is fixable, but the limited gain of MOS transistors is something we have to live with until we get better devices.
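As a hedged illustration of that on/off tradeoff, the sketch below uses the textbook subthreshold-leakage rule of thumb (off-current changes by roughly one decade per 60-100 mV of threshold voltage). The slope value and threshold voltages are assumptions, not figures from the answer:

```python
# Sketch of the on/off tradeoff: lowering Vth (needed to keep "on" drive when
# VDD shrinks) raises the "off" leakage roughly as I_off ~ 10^(-Vth / S),
# where S is the subthreshold slope.  Numbers are illustrative.
S = 0.085  # assumed subthreshold slope, V per decade of current

def leakage_factor(vth):
    """Relative off-current for a given threshold voltage (arbitrary units)."""
    return 10 ** (-vth / S)

for vth in (0.5, 0.4, 0.3, 0.2):
    rel = leakage_factor(vth) / leakage_factor(0.5)
    print(f"Vth = {vth:.1f} V -> leakage x{rel:.0f} relative to Vth = 0.5 V")
```

The exponential blow-up is the reason the margin the answer mentions is so hard to preserve as thresholds are pushed down.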
New voltages have often been chosen to give some degree of compatibility with what came before them.
3V3 CMOS output levels were compatible with 5V TTL inputs, for example.
"Why do smaller devices require lower voltages?" Smaller ICs have less surface to get rid of the heat. Whenever a bit toggles somewhere in an IC, a capacitor has to be charged or discharged (i.e. the gate capacitance of a CMOS transistor). Although the transisotrs in a digital IC are usually very very tiny, there are a lot of them, so the issue is still important. The energy stored in a capacitor is equal to 0.5*C*U^2. Twice the voltage will cause 2^2=4 times the energy that has to be used for every MOSFET's gate. Therefore, even a small step down from, say, 2.5V to 1.8V will bring a considerabe improvement. That's why IC designers didn't just stick to 5V for decades and waited until the technology was ready to use 1.2V, but used all the other funny voltage levels in between.
The voltages appear to follow a pattern:
sqrt(2)/2: each level is roughly sqrt(2)/2 times the previous one. Still not perfect, but within 10%, and it makes a lot more sense than arbitrary fractions :P
– Nick T, Nov 09 '10 at 18:42
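A quick check of that comment against the standard levels listed in the question (5 V, 3.3 V, 2.5 V, 1.8 V):

```python
# Compare consecutive standard supply voltages against sqrt(2)/2 ~ 0.707.
import math

levels = [5.0, 3.3, 2.5, 1.8]
target = math.sqrt(2) / 2

for hi, lo in zip(levels, levels[1:]):
    ratio = lo / hi
    error = (ratio - target) / target * 100
    print(f"{lo} / {hi} = {ratio:.3f}  ({error:+.1f}% from sqrt(2)/2)")
```

All three ratios land within about 7% of sqrt(2)/2, consistent with the "within 10%" claim.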
Short answer: The geeks at TI said so, and everyone else followed suit by making compatible or competing products.
5 volts was chosen for noise immunity. Early chips were power hogs that caused ripple on the power supply every time something switched, which designers would try to overcome by putting a capacitor on the supply pins of every chip. Even so, an extra 2.4 volts of headroom gave them a cushion against drifting into the forbidden region between 0.8 V and 2.2 V. Also, the transistors dropped about 0.4 V just by their operation.
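For illustration, here is a toy classifier built around the input thresholds quoted in this answer (0.8 V and 2.2 V); the thresholds come from the text above, the rest is just a sketch:

```python
# Classify an input voltage against the TTL-style thresholds quoted above:
# logic low below 0.8 V, logic high above 2.2 V, forbidden region in between.
LOW_MAX = 0.8   # V, maximum voltage still read as logic low
HIGH_MIN = 2.2  # V, minimum voltage reliably read as logic high (figure from the answer)

def ttl_input_level(volts: float) -> str:
    """Classify an input voltage against the thresholds above."""
    if volts <= LOW_MAX:
        return "logic 0"
    if volts >= HIGH_MIN:
        return "logic 1"
    return "forbidden region (undefined)"

for v in (0.2, 1.5, 2.4, 5.0):
    print(f"{v:.1f} V -> {ttl_input_level(v)}")
```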
The supply voltages have been dropping to extend battery life, and because chip dies have been shrinking to make your portable devices smaller and lighter. The closer spacing of components on the chip demands lower voltages, both to prevent excessive heating and because a higher voltage could punch through the thinner insulation.
Whoever makes an IC decides on the voltages it needs.
In the olden days someone started using 5V for digital logic and that stuck for a long time, mainly because it's much harder to sell a chip that needs 4V when everybody is designing with a lot of chips that run on 5V.
In other words: the reason everybody tends to use the same voltage is not so much that they all chose the same process as that they don't want to be cursed for using "unusual" voltages by the designers who use their chips.
Switching a signal at a given speed takes more power if the voltage is higher, so at higher speeds you need lower voltages to keep the current down. That's why faster, denser, modern circuits tend to use lower voltages than the old chips.
Many chips even use 3.3V for I/O and a lower voltage, such as 1.8V, for the internal core.
Chip designers know that 1.8V is an oddball voltage and will often have an internal regulator to provide the core voltage for the chip itself, sparing the designer from having to generate the core voltage.
For an example of the dual-voltage situation take a look at the ENC28J60 which runs on 3.3V, but has an internal 2.5V regulator.
The voltages are dictated by the physics of the materials (the semiconductor materials, anyhow) and the processes used to make the chip. (I hope I'm using the right terms here...) Different semiconductors have different bandgap voltages, essentially the voltage that 'activates' them. Designers can also optimize the structure of the chip during layout to let lower voltages work more reliably (I believe).
It's not so much that smaller devices require lower voltages as that they're designed to use lower voltages, because less voltage means less heat dissipation and potentially faster operation. It's easier to have a 10 MHz clock signal if it only has to swing between 0 V and 1.8 V.
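A back-of-the-envelope sketch of that last point: with the same drive current, a smaller swing charges the load capacitance proportionally faster (t = C*dV/I). The capacitance and current values here are assumed purely for illustration:

```python
# Time to slew the load through the full logic swing, t = C * dV / I.
C = 10e-12   # 10 pF load capacitance, assumed
I = 5e-3     # 5 mA drive current, assumed

for swing in (5.0, 3.3, 1.8):
    t_ns = C * swing / I * 1e9
    print(f"{swing} V swing: ~{t_ns:.1f} ns to charge the load")
```

With these assumed values the 1.8 V swing completes in roughly a third of the time of the 5 V swing, which is the sense in which a smaller swing makes a fast clock easier.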