
This question stems from quite a few "informal" sources. Films like 2001: A Space Odyssey and Ex Machina, and books like Destination: Void (Frank Herbert), among others, suggest that a general intelligence wants to survive, and may even learn the importance of surviving.

There may be several arguments for survival. What would be the most prominent?

Eric Platon

3 Answers


Steve Omohundro wrote a paper called The Basic AI Drives that steps through why we would expect an AI with narrow goals to find certain basic, general subgoals instrumentally useful for those narrow goals.

For example, an AI designed to maximize stock market returns but whose design is silent on the importance of continuing to survive would realize that its continued survival is a key component of maximizing stock market returns, and thus take actions to keep itself operational.
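As a toy sketch of that point (not from Omohundro's paper; the action names and numbers here are invented for illustration), the following Python snippet shows an agent whose objective mentions only trading profit, yet which ranks "keep_running" above "allow_shutdown" simply because being switched off forfeits all future profit:

```python
# Toy sketch: the objective says nothing about survival, but survival
# still wins, because shutdown forfeits all future expected profit.
# EXPECTED_PROFIT_PER_DAY and HORIZON_DAYS are made-up illustrative values.

EXPECTED_PROFIT_PER_DAY = 100.0   # hypothetical average daily return
HORIZON_DAYS = 365                # hypothetical planning horizon

def expected_profit(action: str) -> float:
    """Expected cumulative profit under each available action."""
    if action == "allow_shutdown":
        return 0.0                                      # no future trades
    if action == "keep_running":
        return EXPECTED_PROFIT_PER_DAY * HORIZON_DAYS   # keep trading
    raise ValueError(action)

actions = ["allow_shutdown", "keep_running"]
best = max(actions, key=expected_profit)
print(best)  # -> "keep_running": survival emerges as an instrumental subgoal
```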

In general, we should be skeptical of 'anthropomorphizing' AI and other code, but it seems like there are reasons to expect this beyond "well, humans behave this way, so it must be how all intelligence behaves."

Matthew Gray

The concept of 'survival instinct' probably falls in the category of what Marvin Minsky would call a 'suitcase word', i.e. it packages together a number of related phenomena into what at first appears to be a singular notion.

So it's quite possible that we can construct mechanisms that have the appearance of some kind of 'hard-coded' survival instinct, without it ever featuring as an explicit rule (or rules) in the design.

See the beautiful little book 'Vehicles' by the neuroanatomist Valentino Braitenberg for a compelling narrative of how such 'top down' concepts as 'survival instinct' might evolve 'from the bottom up'.
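As a minimal, hedged sketch of that bottom-up story (loosely in the spirit of Braitenberg's "fear" vehicle 2a; all constants and geometry below are invented for illustration), this simulation wires each of two sensors to the wheel on the same side. No line of it mentions survival, yet the vehicle reliably steers away from the "hazard":

```python
import math

# A Braitenberg-style vehicle: two sensors, each driving the wheel on the
# SAME side with excitatory (positive) gain. The side nearer the hazard
# spins its wheel faster, so the vehicle turns away -- avoidance "for free".

HAZARD = (0.0, 0.0)                       # location of a heat/light source
x, y, heading = 2.0, 0.0, math.pi / 2     # start near the hazard, facing "north"
BASE_SPEED, GAIN, AXLE, DT = 0.05, 0.5, 0.2, 1.0

def sensor_reading(sx, sy):
    """Stimulation falls off with distance to the hazard."""
    d = math.hypot(sx - HAZARD[0], sy - HAZARD[1])
    return 1.0 / (d + 0.1)

for _ in range(50):
    # Sensor positions: offset to the left and right of the heading.
    lx = x + 0.1 * math.cos(heading + math.pi / 2)
    ly = y + 0.1 * math.sin(heading + math.pi / 2)
    rx = x + 0.1 * math.cos(heading - math.pi / 2)
    ry = y + 0.1 * math.sin(heading - math.pi / 2)

    # Same-side excitatory wiring: left sensor drives left wheel, etc.
    left_wheel = BASE_SPEED + GAIN * sensor_reading(lx, ly)
    right_wheel = BASE_SPEED + GAIN * sensor_reading(rx, ry)

    # Differential-drive kinematics.
    velocity = (left_wheel + right_wheel) / 2.0
    heading += (right_wheel - left_wheel) / AXLE * DT
    x += velocity * math.cos(heading) * DT
    y += velocity * math.sin(heading) * DT

# Distance to the hazard grows over time: the vehicle "flees" without any
# explicit survival rule in its design.
print(round(math.hypot(x - HAZARD[0], y - HAZARD[1]), 2))
```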

Also, trying to ensure that intelligent artefacts place too high a priority on their survival might easily lead to a Killbot Hellscape.

NietzscheanAI

I will answer by quoting Albert Camus' book The Myth of Sisyphus:

There is only one really serious philosophical question, and that is suicide.

So we probably need some degree of survival instinct if we don't want our AGIs to "terminate themselves" (whatever that means) whenever they get existential.

k.c. sayz 'k.c sayz'