
EDIT This question is very badly worded, and I would appreciate it if you could ignore its content. I cut and pasted parts of two related but distinctly different questions into one post and was unable to edit it to my satisfaction for some time. In particular, the Moore's law references are not germane to the question. My apologies for this error.

I have voted to close my own question because of its inherent, unfixable problems, and if anybody else with sufficient reputation wants to vote to close, please feel free to put it out of its misery :) Again, my apologies for posting before checking for errors. END EDIT

There are an estimated 86 billion neurons in the human brain (I am ignoring a further 90 billion non-neuronal support cells). It is difficult to establish a comparison, as regards "packing density", between the number of transistors in an integrated-circuit CPU of a current computer and the neurons in a human brain.

As of 2015, the highest transistor count in a commercially available CPU (in one chip) is over 5.5 billion transistors, in Intel's 18-core Xeon Haswell-EP.

My obvious problem here is that I simply don't know how directly we can compare a human neuron with a presumably much simpler man-made transistor. I will assume they can be directly compared, but I realise this is a completely ad hoc assumption. Human Brain Energy Use may be of relevance here.
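Under that ad hoc one-neuron-per-transistor assumption, the comparison reduces to simple arithmetic. A minimal sketch, using only the 86 billion and 5.5 billion figures quoted above:

```python
import math

NEURONS = 86e9       # estimated neurons in a human brain (figure quoted above)
TRANSISTORS = 5.5e9  # Intel 18-core Xeon Haswell-EP, 2015 (figure quoted above)

# Under a naive 1:1 neuron-to-transistor comparison: how many Moore's-law
# doublings separate the two counts, and how long would that take at
# one doubling roughly every 2 years?
doublings = math.log2(NEURONS / TRANSISTORS)
years = 2 * doublings

print(f"{doublings:.1f} doublings, roughly {years:.0f} years")
```

On these figures the gap is only about four doublings, i.e. under a decade of Moore's-law scaling, which is one reason the raw counts alone say little about relative capability.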

My question is: can we estimate how closely our brains conform to Moore's law? My rationale is that although a neuron differs from a transistor in many ways, both ultimately obey quantum mechanical principles, so I have assumed that both are subject to Moore's law in the sense of "you must keep the wires a certain distance apart".

In particular, assuming a neuron is much more complex than a transistor, it would be interesting to know whether our brain's cells actually follow Moore's law.

3 Answers


Moore's law from the Wikipedia article:

Moore's law is the observation that the number of transistors in a dense integrated circuit doubles approximately every two years (since the invention of integrated circuits).

His prediction concerned the rate at which the density of transistors in circuits increased with time. So I'm not sure it makes sense to apply Moore's law to the human brain. Even if you could compare neurons (or groups of neurons) to a transistor, the human brain evolved over many millions of years and isn't reinvented on a regular basis the way that integrated circuits are.
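For reference, the observation quoted above is just an exponential in time. This sketch restates the doubling rule, taking the conventional two-year doubling period as given:

```python
def moores_law(n0: float, years: float, doubling_period: float = 2.0) -> float:
    """Transistor count projected `years` ahead of a starting count `n0`,
    assuming one doubling every `doubling_period` years."""
    return n0 * 2 ** (years / doubling_period)

# Starting from the 5.5e9-transistor chip mentioned in the question,
# ten years means five doublings, i.e. a 32x increase:
print(f"{moores_law(5.5e9, 10):.2e}")  # 1.76e+11
```

The point of the answer stands, though: the formula describes successive generations of manufactured circuits, not a structure like the brain that is not redesigned every product cycle.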

Siraj R Khan

Although it isn't possible to answer your question as posed, since the human brain doesn't change its physical structure over time spans short enough to compare with Moore's law, I found some information that may help with the gist of your question.

There are many differences between the human brain and a computer that make a comparison between their potentials difficult. Volume occupied by the brain may be the best measure of its potential.

Difference #2 in the above link explains that memory retrieval in the brain is different from byte-addressable memory retrieval in a computer. The brain seems to have a more efficient retrieval system than a computer, IMO.

Difference #7 says the processing that takes place at brain synapses is chemical and spatial, as well as electrical, and is much more complex than the logic switches made of transistors. Synapses are not at all comparable to transistors.

The brain also is a good deal more adept at self-repair and self-change of its circuit utilization and design. It may be in this area that the brain has the most promise for increasing its intellectual capacity in a manner comparable to Moore's law. After all, human brains make Moore's law possible.

The human brain underwent a spurt of rapid development from 800,000 to 200,000 years ago, during a period of volatile climate change. The rapidity of climate changes from cold to hot and back again favored adaptable tool-makers who could adjust their social and material lifestyles quickly. The brain had to "earn its keep", because it uses an enormous amount of energy and leaves humans at a disadvantage in physical ability compared to other animals. This link from the Smithsonian Institution shows the period of rapid development and compares it to the previous 4 million years of relatively slow development (click on the chart comparing brain size with climate change volatility).

Ernie

In addition to the physical constraints on size, one has to consider that the parts of any machine will degrade. When comparing a conventional machine to the brain, you need to take into account self-repair, and also the fact that the system is supposed to be copied to produce the next generation. The system has to be reproducible from scratch.

If you consider a machine consisting of macroscopic parts, then there is a fundamental problem with maintaining the integrity of the system. The system itself needs to contain the information of its own formal description in order to maintain itself or to make copies of itself. However, each macroscopic part of the system can be in an astronomically large number of physical states (the entropy of the part), while the system only keeps track of a few macroscopic parameters. There is then no way for the system to prevent itself from degrading in the long term.

The only way this problem can be averted is if the system is built out of parts whose formal description is encoded in the system, such that there is no missing information that the system cannot act on.

Count Iblis