
I am looking for the really early attempts, like trying to build a mathematical framework to describe the human mind before computers existed. Sure, their means were very limited, but still, someone must have tried.

Maze

2 Answers


The book that likely contains the answers to your question(s) is

The Quest for Artificial Intelligence (2009) by Nils J. Nilsson

I read this book a few years ago (although not entirely) and I think it's quite interesting. (Btw, I wrote this blog post while reading this book.) Nilsson was an important person in the AI field (e.g. he contributed to the creation of the A* search algorithm), and he also published other AI books. So, I think you can rely on this book as the main reference for historical AI facts and achievements.

I won't try to summarize anything because I've forgotten many things, and a version of the book is freely available here.

I'll just quote one of the first sentences of the book

The Iliad of Homer talks about self-propelled chairs called "tripods" and golden "attendants" constructed by Hephaistos, the lame blacksmith god, to help him get around.

Having said that, I think there's a difference between describing the human mind and trying to create/build or dreaming of an AI - that's why we have different (though certainly related) fields like neuroscience, cognitive science, psychology, and artificial intelligence. Moreover, there's a difference between describing/imagining/designing an AI and actually building one.

The book will give you examples of dreamers, of people who designed AIs but may never have built them, like Leonardo da Vinci, and of people who actually tried to build AIs or automata.

Finally, it's important to note that, over the years, there have been many definitions of AI and a more accurate or specific answer would require one to assume a specific definition.

nbro

As early as the 17th century, Leibniz dreamed of a characteristica universalis, a symbolic language of thought capable of representing all human thought, along with a calculus ratiocinator, a logical calculus for reasoning over it. His famous dictum for settling any dispute was "let us calculate" (calculemus). This attempt is closely akin to early symbolic AI and knowledge-base projects, and the contemporary Cyc project is one of its culminations.

The Latin term characteristica universalis, commonly interpreted as universal characteristic, or universal character in English, is a universal and formal language imagined by Gottfried Leibniz able to express mathematical, scientific, and metaphysical concepts. Leibniz thus hoped to create a language usable within the framework of a universal logical calculation or calculus ratiocinator... The concept is sometimes paired with his notion of a calculus ratiocinator and with his plans for an encyclopaedia as a compendium of all human knowledge.

In his book The Laws of Thought (1854), George Boole introduced Boolean algebra, a formal system for logical reasoning in which he saw reasoning itself as reducible to algebraic manipulation - a massive step towards the formal reasoning of symbolic AI. Later, in 1936, Alan Turing, who's often considered the father of AI, formalized the idea of computation with the Turing machine, and in 1950 he proposed the famous Turing Test as a measure of machine intelligence.
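To make Boole's idea concrete, here is a small illustration of my own (not from Boole's book): truth values become the numbers 0 and 1, AND becomes multiplication, NOT becomes 1 - x, and OR becomes x + y - xy. Logical laws then hold as plain arithmetic identities, which is exactly the sense in which reasoning reduces to algebraic manipulation.

```python
# Boole's reduction of logic to algebra: truth values are 0/1,
# AND is multiplication, NOT is 1 - x, OR is x + y - x*y.

def AND(x, y):
    return x * y

def OR(x, y):
    return x + y - x * y

def NOT(x):
    return 1 - x

# De Morgan's law, NOT(x AND y) == NOT(x) OR NOT(y), checked
# purely by arithmetic over all truth values.
for x in (0, 1):
    for y in (0, 1):
        assert NOT(AND(x, y)) == OR(NOT(x), NOT(y))

print("De Morgan's law holds as an algebraic identity")
```

The choice of x + y - xy for OR (rather than plain addition) keeps every result inside {0, 1}, so logical identities and algebraic identities coincide.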

In 1867, Hermann von Helmholtz first proposed that perception is essentially unconscious inference: the brain makes probabilistic inferences to interpret sensory input. This foreshadowed modern Bayesian inference and connectionism in cognitive science and machine learning.

Work in computer science has made use of Helmholtz's ideas of unconscious inference by suggesting the cortex contains a generative model of the world. They develop a statistical method for discovering the structure inherent in a set of patterns.
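A toy sketch of my own (the hypotheses and numbers are made up for illustration) of what "perception as unconscious inference" means in modern Bayesian terms: the perceiver combines a prior over world states with the likelihood of a noisy sensory cue via Bayes' rule, and the percept is the posterior.

```python
# Perception as Bayesian inference: posterior ∝ prior × likelihood.

# Hypotheses about the world (hypothetical): is the object a cat or a dog?
prior = {"cat": 0.5, "dog": 0.5}

# Likelihood of the ambiguous sensory cue "pointy ears" under each hypothesis.
likelihood = {"cat": 0.9, "dog": 0.2}

# Bayes' rule: multiply prior by likelihood, then normalize by the evidence.
unnormalized = {h: prior[h] * likelihood[h] for h in prior}
evidence = sum(unnormalized.values())
posterior = {h: p / evidence for h, p in unnormalized.items()}

print(posterior)  # the inferred percept favors "cat"
```

The same ambiguous input yields a different percept under a different prior, which is the core of Helmholtz's point: what we see is an inference, not raw data.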

These thinkers tried to formalize reasoning, perception, and learning using tools like logic, probability, and symbolic reasoning long before computers existed. They laid the conceptual groundwork that made later developments in AI, machine learning, and cognitive science possible.

cinch