Can someone compare the energy efficiency of the human brain as a computer? What is the energy in joules per flop? Maybe some reasonable assumptions about the computational load of common tasks, such as pattern recognition or speech synthesis, can be used.
2 Answers
Human power consumption can be guesstimated at 100 W, similar to the power consumption of an ordinary computer, plus or minus a few orders of magnitude depending on one's idea of "ordinary". A computer can do billions of flops per second, while it would take me many seconds or minutes to perform one flop with pen and paper, and I would make many more errors besides. If we assume that there is some other task stacked the opposite way, i.e. one a human can perform a billion times faster than a computer, and that both of these are in some sense extreme cases, then on some more "fair" test we can say that the ratio of the efficiencies is probably somewhere between $10^{-9}$ and $10^9$.
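The flop-heavy end of that range can be checked with a quick back-of-envelope calculation. The numbers below (100 W for both human and computer, $10^9$ flops/s for the computer, roughly one pen-and-paper flop per minute for the human) are illustrative assumptions taken from the estimates above, not measurements:

```python
# Back-of-envelope joules-per-flop comparison on a flop-heavy task.
# All figures are order-of-magnitude guesses, per the answer above.
human_watts = 100.0        # guesstimated human power draw
computer_watts = 100.0     # an "ordinary" computer, same order of magnitude
computer_flops = 1e9       # flops per second for the computer
human_flops = 1.0 / 60.0   # ~1 flop per minute with pen and paper

e_computer = computer_watts / computer_flops  # 1e-7 J/flop
e_human = human_watts / human_flops           # 6000 J/flop

# On this task the computer wins by ~11 orders of magnitude,
# which is why a task stacked the other way is needed to bound the ratio.
ratio = e_computer / e_human
```

A task where the human is a billion times faster would flip the ratio by roughly the same number of orders of magnitude, which is where the $10^{-9}$-$10^9$ bracket comes from.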
The brain is massively parallel, so it tends to come out looking very good. The OP suggested using joules/flop as the measure of (in)efficiency. This leaves considerable ambiguity. I believe the way neurons typically work is that they form something like a weighted average of their binary inputs, and generate a binary output that is based on a threshold value for that average. I would consider this to be the moral equivalent of a floating-point operation. Of course if a floating-point operation means working out a long-division problem using paper and pencil, then the result is going to be horrible -- a kilojoule per flop for me, or infinitely many joules per flop for a kindergartener who hasn't yet learned the long division algorithm. To me it seems perverse to say that a kindergartener's brain has zero efficiency compared to an x86, so I'm going to equate one neuron's weighted-average operation to one flop.
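The weighted-average-plus-threshold operation described above can be sketched in a few lines. This is only an illustration of the unit being equated to one flop; the function name, weights, and threshold are all made up for the example:

```python
def neuron_step(inputs, weights, threshold=0.5):
    """One neuron 'operation': a weighted average of binary inputs,
    thresholded to produce a binary output. This is the unit the
    answer equates to roughly one floating-point operation."""
    avg = sum(i * w for i, w in zip(inputs, weights)) / sum(weights)
    return 1 if avg >= threshold else 0

# Three binary inputs with unequal weights; weighted average is 0.7,
# which clears the 0.5 threshold, so the neuron fires.
out = neuron_step([1, 0, 1], [0.5, 0.3, 0.2])
```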
I'm not a big fan of Ray Kurzweil, but he does have a good summary of some relevant data in his 2005 book The Singularity is Near. There's a lot of ambiguity in trying to estimate the number of fundamental operations involved in a certain neurological process. Kurzweil refers to "synaptic transactions," and equates one of those to something like $10^3$ "arithmetic" operations, where I imagine he means something roughly similar to my definition of an arithmetic operation above. Anyway, subject to all these ambiguities, the studies he cites give estimates of $10^{14}$-$10^{19}$ operations per second. If the brain draws ~10 W (Dan's link says 20), then this is an energy consumption of $10^{-13}$-$10^{-18}$ joules per operation. Since a desktop computer currently does $\sim10^9$ arithmetic operations per second, this makes the brain more efficient, as measured by joules per operation, by about a factor of $10^6$-$10^{10}$.
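The arithmetic behind those figures is easy to reproduce. The sketch below uses the numbers quoted above plus an assumed ~100 W desktop power draw, which is not stated explicitly in the answer, so the exact endpoints of the final factor shift with that assumption:

```python
# Brain joules-per-operation from the cited estimates.
brain_watts = 10.0
ops_low, ops_high = 1e14, 1e19  # range of estimated brain ops/s

j_per_op_high = brain_watts / ops_low   # 1e-13 J/op (pessimistic for the brain)
j_per_op_low = brain_watts / ops_high   # 1e-18 J/op (optimistic for the brain)

# Desktop comparison: ~1e9 ops/s at an assumed ~100 W.
desktop_j_per_op = 100.0 / 1e9          # 1e-7 J/op

# Ratio of desktop to brain energy cost per operation; spans roughly
# six to eleven orders of magnitude under these assumptions.
advantage_low = desktop_j_per_op / j_per_op_high
advantage_high = desktop_j_per_op / j_per_op_low
```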