Creating a New Intelligent Species: Choices and Responsibilities for AI Designers


Transcript and Presenter's Notes

1
Creating a New Intelligent Species: Choices and
Responsibilities for AI Designers
Eliezer Yudkowsky, Singularity Institute for
Artificial Intelligence (singinst.org)
2
In Every Known Culture
  • tool making
  • weapons
  • grammar
  • tickling
  • sweets preferred
  • planning for future
  • sexual attraction
  • meal times
  • private inner life
  • try to heal the sick
  • incest taboos
  • true distinguished from false
  • mourning
  • personal names
  • dance, singing
  • promises
  • mediation of conflicts

(Donald E. Brown, 1991. Human Universals. New
York: McGraw-Hill.)
3
ATP Synthase: the oldest wheel.
ATP synthase is nearly the same in mitochondria,
chloroplasts, and bacteria; it's older than
eukaryotic life.
4
A complex adaptation must be universal within a
species.
  • Imagine a complex adaptation, say part of an
    eye, that has 6 necessary proteins. If each
    gene is at 10% frequency, the chance of
    assembling a working eye is 1 in 1,000,000
    (see the arithmetic below).
  • Pieces 1 through 5 must already be fixed in the
    gene pool before natural selection will promote
    an extra, helpful piece 6 to fixation.

(John Tooby and Leda Cosmides, 1992. The
Psychological Foundations of Culture. In The
Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
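A minimal check of the one-in-a-million figure,
assuming the six gene variants assort
independently, each at 10% frequency:
P(all 6 pieces present) = (0.1)^6 = 10^(-6) = 1/1,000,000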
5
The Psychic Unity of Humankind (yes, that's the
standard term)
  • Complex adaptations must be universal; this
    logic applies with equal force to cognitive
    machinery in the human brain.
  • In every known culture, joy, sadness, disgust,
    anger, fear, and surprise are shown by the same
    facial expressions.

(Paul Ekman, 1982. Emotion in the Human
Face.) (John Tooby and Leda Cosmides, 1992. The
Psychological Foundations of Culture. In The
Adapted Mind, eds. Barkow, Cosmides, and Tooby.)
6
Must not emote
(Image: The Matrix)
7
  • Aha! A human with the AI-universal facial
    expression for disgust! (She must be a machine
    in disguise.)

(Images: (1) The Matrix; (2) University of
Plymouth,
http://www.psy.plym.ac.uk/year3/psy364emotions/psy364_emotions_evolutionary_psychobiolog.htm)
8
Anthropomorphic hypothesis
(diagram; arrow labeled "Causes")
9
Same mistake, more subtle
(diagram; arrow labeled "Causes")
10
In nature we see what exists in us: the mind
looks out, and finds faces in the clouds...
11
It takes a conscious effort to remember the
machinery
12
AI Nature
  • tool making
  • weapons
  • grammar
  • tickling
  • sweets preferred
  • planning for future
  • sexual attraction
  • meal times
  • private inner life
  • try to heal the sick
  • incest taboos
  • true distinguished from false
  • mourning
  • personal names
  • dance, singing
  • promises
  • mediation of conflicts

13
AI Nature
  • tool making
  • weapons
  • grammar
  • tickling
  • sweets preferred
  • planning for future
  • sexual attraction
  • meal times
  • private inner life
  • heal sick humans
  • snarkling taboos
  • true distinguished from false
  • mourning
  • personal names
  • dance, fzeeming
  • promises
  • mediation of conflicts

14
Crimes against nonhumanity and inhuman rights
violations
  • cognitive enslavement
  • theft of destiny
  • creation under a low purpose
  • denial of uniqueness
  • hedonic/environmental mismatch
  • fzeem deprivation

15
Happiness set points
  • After one year, lottery winners were not much
    happier than a control group, and paraplegics
    were not much unhappier.
  • People underestimate adjustments because they
    focus on the initial surprise.

(Brickman, P., Coates, D., & Janoff-Bulman, R.
(1978). Lottery winners and accident victims: Is
happiness relative? Journal of Personality and
Social Psychology, 36, 917-927.)
16
Hedonic treadmill effects
  • People with $500,000-$1,000,000 in assets say
    they would need an average of $2.4 million to
    feel financially secure.
  • People with $5 million feel they need at least
    $10 million.
  • People with $10 million feel they need at least
    $18 million (see the ratios below).

(Source: Survey by PNC Advisors.
http://www.sharpenet.com/gt/issues/2005/mar05/1.shtml)
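A quick arithmetic note on these survey figures:
the "needed" amount as a multiple of current
assets is roughly 2.4-4.8x in the first bracket
($2.4M / $0.5-1M), 2.0x in the second ($10M /
$5M), and 1.8x in the third ($18M / $10M). The
multiple shrinks as wealth grows, but the target
always stays ahead of what is already held.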
17
Your life circumstances make little difference in
how happy you are.
"The fundamental surprise of well-being research
is the robust finding that life circumstances
make only a small contribution to the variance of
happiness, far smaller than the contribution of
inherited temperament or personality. Although
people have intense emotional reactions to major
changes in the circumstances of their lives,
these reactions appear to subside more or less
completely, and often quite quickly... After a
period of adjustment lottery winners are not much
happier than a control group and paraplegics not
much unhappier."
(Daniel Kahneman, 2000. Experienced Utility and
Objective Happiness: A Moment-Based Approach. In
Choices, Values, and Frames, eds. D. Kahneman and
A. Tversky. New York: Cambridge University
Press.) Findable online, or Google "hedonic
psychology."
18
Nurture is built atop nature
  • Growing a fur coat in response to cold weather
    requires more genetic complexity than always
    growing a fur coat. (George C. Williams, 1966.
    Adaptation and Natural Selection. Princeton
    University Press.)
  • Humans learn different languages depending on
    culture, but this cultural dependency rests on a
    sophisticated cognitive adaptation; mice don't
    do it. (John Tooby and Leda Cosmides, 1992. The
    Psychological Foundations of Culture. In The
    Adapted Mind, eds. Barkow, Cosmides, and Tooby.)

19
Creation transcends parenting. An AI programmer
stands, not in loco parentis, but in loco
evolutionis.
20
To create a new intelligent species (even if it
has only one member) is to create, not a child of
the programmers, but a child of humankind, a new
descendant of the family that began with Homo
sapiens.
21
If you didn't intend to create a child of
humankind, then you screwed up big-time if your
mere program
  • Starts talking about the mystery of conscious
    experience and its sense of selfhood.
  • Or wants public recognition of personhood and
    resents social exclusion (inherently, not as a
    pure instrumental subgoal).
  • Or has pleasure/pain reinforcement and a
    complex, powerful self-model.

22
BINA48
  • By hypothesis, the first child of humankind
  • created for the purpose of a bloody customer
    service hotline (?!)
  • from the bastardized mushed-up brain scans of
    some poor human donors
  • by morons who didn't have the vaguest idea how
    important it all was

By the time this gets to court, no matter what
the judge decides, the human species has already
screwed it up.
23
Take-home message
  • Don't refight the last war.
  • Doing right by a child of humankind is not like
    ensuring fair treatment of a human minority.
  • Program children kindly;
    fair treatment may be too little, too late.

Eliezer Yudkowsky, Singularity Institute for
Artificial Intelligence (singinst.org)