63

I've heard this in many quantum mechanics talks and lectures; nevertheless, I don't seem to grasp the idea behind it.

What I mean is: at which point did our modern understanding of quantum mechanics lead to a technological development so fundamental for today's computers that we could not have got them working any other way?

Why are Maxwell, Bohr, Lorentz (and Liénard) not enough?

Qmechanic
  • 220,844

9 Answers

60

The reason is very simple. Computers depend on electronics. Even the diodes and triodes that the first bulky computers were made of depended on the quantum mechanical nature of matter. The present ones, with chip technology, depend directly on energy levels, conduction bands, and so on in the electronics used. Semiconductivity is a quantum mechanical phenomenon.

Edit after the editing of the question

What I mean is: at which point did our modern understanding of quantum mechanics lead to a technological development so fundamental for today's computers that we could not have got them working any other way?

The crucial point where quantum mechanical calculations became necessary was the adoption of transistor technology, which has since morphed into chip technology. It was with the invention of the transistor that command of quantum mechanical calculations became necessary for the leaps in progress we have made. For the vacuum tube computers it was not necessary, except to explain the tubes' existence. Chip designs have reached the point of even needing to foresee the Casimir effect (a quantum vacuum effect between closely spaced conducting plates).

Why are Maxwell, Bohr, Lorentz (and Liénard) not enough?

Maxwell is not enough because the classical theory cannot explain atoms, molecules, and the solid state. Bohr is not enough because those primitive calculations cannot be used in complicated lattices. Lorentz is irrelevant for solid state physics; the energies of the ions and electrons are low.

xray0
  • 540
anna v
  • 236,935
19

I find this quite an imprecise catchphrase. It is about as correct as saying that without quantum mechanics there would be no atoms, because electrons would have fallen onto the nuclei.

There would be computers, but not like the modern ones. The first (electrical) ones didn't depend on quantum mechanical effects; they used vacuum tubes in place of transistors. Not to mention that you can make mechanical computers, even ones running on water (using it as the signal instead of electrical current). Not very efficient, though.

What they probably meant is that quantum effects lie at the basis of semiconductivity and solid-state transistors, which led to a true electronic revolution. They made computers available much as Ford made cars available: both made production mass-scale and cheap.

EDIT: You added "modern", which is a very vague term to me. Modern as in non-mechanical, i.e. electronic; modern as in using high-integration chips (solid-state transistors); or modern as in today's machines?

I am not sure whether the inventors of the transistor, or of the first microchip, used QM models to explain their work. Maybe they didn't have to; they just needed to find good materials. Nevertheless, how it works cannot be explained without using QM, even though that knowledge is not strictly needed for the devices to work, or to invent and develop them.

Also, I am sure that today QM theories are needed and used to develop better and smaller transistors. These theories are used to simulate and design the most basic building blocks of the most advanced chips being produced nowadays.

luk32
  • 413
12

Adding the word "modern" to the title of the question completely changes it. In modern computers you need semiconductors, and the whole theory of solid state physics (band structures, doping, etc.) is built on a foundation of quantum mechanics, since electrons in semiconducting solids behave in a manner that is more wave-like than particle-like, with each electron occupying its own distinct state. Making a semiconductor work well requires an in-depth understanding of these things, as the rough estimate below illustrates.
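
To make the role of the band gap concrete, here is a minimal back-of-the-envelope sketch (my addition, not part of the original answer; the band-gap figures are standard textbook values). The gap Eg is a quantum mechanical quantity, and even a crude Boltzmann factor exp(-Eg/2kT) shows why materials with different gaps behave so differently at room temperature:

```python
import math

K_B_EV = 8.617e-5   # Boltzmann constant in eV/K
T = 300.0           # room temperature, K

# Approximate band gaps in eV (standard textbook values)
band_gaps = {"germanium": 0.67, "silicon": 1.12, "diamond": 5.5}

for material, e_gap in band_gaps.items():
    # Rough Boltzmann estimate of the fraction of electrons thermally
    # excited across the gap: exp(-Eg / (2 k T))
    fraction = math.exp(-e_gap / (2 * K_B_EV * T))
    print(f"{material:>9}: Eg = {e_gap:4.2f} eV, exp(-Eg/2kT) ~ {fraction:.1e}")
```

Silicon's modest gap leaves a tiny but controllable (dopable) carrier population, while diamond's large gap makes it an insulator; it is exactly this quantum band structure that device design exploits.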

Floris
  • 119,981
9

Quantum mechanics led to a deeper understanding of field electron emission, which was instrumental in developing the theory of electron energy bands and, in particular, an appreciation of the band gap. This let us work out the physics of semiconductors and develop models for selecting and refining semiconductor materials and treatments.

5

The key word that makes the statement approximately true is "modern." There are many computing devices that can be (and have been) created using pre-transistor parts. Pascal and Leibniz constructed adding devices with gears. Babbage designed (but did not build) a programmable computer with gears, linkages and metal plates with holes in them. Completely mechanical calculators that could add were ubiquitous before World War II, and various other devices like tabulators and calculators that could multiply were available (for a relatively large price).

The classic paper that is considered to have started the modern era of computing is Claude Shannon's 1937 master's thesis, which demonstrated that Boolean algebra can be used to design relay circuits (which were widely used in telephone switching networks). The first computers used some combination of electromechanical relays and vacuum tube diodes and triodes, the design of which depends on electron ballistics, which is really a classical model (although it involves quantum particles).
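
As a toy illustration of Shannon's point (my sketch, not taken from the thesis): a relay network is just a Boolean expression, so a circuit such as a half-adder can be written down as pure Boolean algebra and realized as series/parallel switch arrangements.

```python
# Shannon's observation in miniature: each Boolean operator corresponds to a
# switch arrangement (AND = switches in series, OR = switches in parallel).

def half_adder(a: bool, b: bool) -> tuple[bool, bool]:
    carry = a and b                  # two switches in series
    total = (a or b) and not carry   # XOR built from OR, AND and NOT
    return total, carry

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(f"a={int(a)} b={int(b)} -> sum={int(s)} carry={int(c)}")
```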

An area of active research now is biochemical computing, which programs protein feedback loops in E. coli bacteria to perform (extremely simple) Boolean computations. And people fool around building logic parts out of Tinkertoys, Meccano, and hydraulic switches.

But... the transistor, and especially the MOSFET, is the only device we currently have that we can reliably manufacture by the billions or trillions. So almost all modern (say, post-1965 or so) computers are constructed almost exclusively from transistors.

5

The first programmable, stored-memory electronic computers were made with triodes. They do not need quantum mechanics to explain their operation. Lee de Forest invented the triode in 1906 (http://en.wikipedia.org/wiki/Lee_De_Forest). The FET was invented by Lilienfeld in 1926 (http://en.wikipedia.org/wiki/Julius_Edgar_Lilienfeld), and, since it is a majority-carrier device, its operation can be explained to an electrical engineer, who can then design it, and design with it, very well without ever resorting to anything beyond phenomenological Maxwell's equations in the low-frequency limit. When it comes to bipolar transistors, with their holes, minority carriers, etc., then one needs QM.

hyportnex
  • 21,193
4

I've heard this in many quantum mechanics talks and lectures, nevertheless I don't seem to grasp the idea behind it.

There is no “idea” behind it. Just general fluff talk, like saying we wouldn’t have the lightbulb without Ohm’s Law (Fact: Lightbulbs were in existence before Ohm formalised his law, and the first practical Swan-Edison bulbs had one feature—high resistance—derived from Ohm’s law, among many other innovations. Ohmic resistance is not essential for a lightbulb at all).

What I mean is: at which point did our modern understanding of quantum mechanics lead to a technological development so fundamental for today's computers that we could not have got them working any other way?

You’re mixing up the existence of quantum mechanics with our understanding of it. In physics, theory usually follows observation (except in a few dramatic cases). Today’s computers depend on semiconductor transistor action, which is explained by QM theory. QM theory enabled the refinement of transistors (e.g. predicting which doping materials would produce which effects on which substrates). This, while useful, is not fundamental to computing or computers at all. Could we have had it working some other way? Definitely yes. Could non-semiconductor tech compete with today’s computers? That’s a hypothetical question! Could the Allies have lost WW II? Yes, but they didn’t. So it is with semiconductors: they were (and are) the best available mechanism for cheap, ubiquitous computing.

Why are Maxwell, Bohr, Lorentz (and Liénard) not enough?

Newton, Coulomb/Gauss, Faraday, and van der Waals would be enough. It only depends on what your definition of “modern” is. For example, imagine an advanced Babbage machine built with nanoparticles, featuring molecular gears and cogs. Now imagine said machine with electrical linkages (using dynamos and capacitors) to mimic a CPU pinout. Could this machine replace a Core i5 and run Facebook? Absolutely. Do we have such machines? No.

Alex C.
  • 41
1

No. We had 'solid state' devices long before we understood them. The selenium rectifier and the 'cat's whisker' diode were commercially available long before we understood them at the quantum level, in the 19th century if I am correct.

SkipBerne
  • 428
1

Why is it said that without quantum mechanics we would not have modern computers?

I suspect this claim is often made by people who like to stress how important quantum mechanics (and sometimes, their own work) is. Computers are widely recognized as useful machines even by laymen, which makes them an effective hammer in a discussion where somebody expresses doubts about quantum theory.

It is true that quantum theories have been, and are being, used to study materials and develop complicated devices. One should not forget, however, that this use takes the form of simple effective models that are only partly inspired by quantum ideas. Nobody constructs transistors based on a solution of Schroedinger's equation for 6E23 particles.
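
To illustrate what such an effective model looks like in practice (my own sketch, not from the answer; the doping level is hypothetical and the silicon intrinsic concentration is a standard textbook value): device engineers routinely use semi-classical relations such as the mass-action law n·p = n_i², together with the assumption of complete dopant ionization, rather than anything resembling a full many-body calculation.

```python
# A typical "effective model" calculation: carrier concentrations in n-type
# silicon from the mass-action law n * p = n_i**2, assuming complete dopant
# ionization. n_i is a textbook value; the doping level N_D is illustrative.

N_I = 1.0e10    # intrinsic carrier concentration of Si at 300 K, cm^-3 (approx.)
N_D = 1.0e16    # assumed phosphorus donor concentration, cm^-3

n = N_D              # electrons: essentially all donors ionized (N_D >> n_i)
p = N_I**2 / n       # holes from the mass-action law

print(f"electrons n = {n:.1e} cm^-3, holes p = {p:.1e} cm^-3")
print(f"majority-to-minority ratio ~ {n / p:.1e}")
```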

The question of whether the modern computer could have been constructed without knowledge of quantum theory is hard to answer, because the modern computer is the result of decades of evolution and the work of thousands of people. Isolating them from quantum theories in a controlled experiment in which they strove to construct a modern computer would be very hard to achieve.

Thus, so far, the saying is speculation.