Transcript and Presenter's Notes

Title: John Searle


1
John Searle
  • John Searle is Mills Professor of Philosophy of
    Mind and Language at the University of California,
    Berkeley.
  • His Chinese room argument is perhaps the most
    influential and widely cited argument against
    claims of artificial intelligence.

Chinese Room John Searle
2
John Searle
  • This "infamous Chinese room argument" has been
    described by Stevan Harnad as having "already
    achieved the status of a minor classic" and as
    having "shook up the entire AI field" so
    considerably that "things still have not settled
    down since" .
  • William Rapaport, for instance, deems it "a rival
    to the Turing Test as a touchstone of
    philosophical inquiries into the foundations of
    AI"
  • On the other hand, Searle's argument has been
    decried, by Dennett as "sophistry" and, by
    Hofstadter, as a "religious diatribe against AI
    masquerading as a serious scientific argument"

Chinese Room John Searle
3
Weak versus Strong AI
  • Weak AI: modeling or simulating intelligence on a
    computer. A simulated hurricane does not blow
    down trees.
  • Strong AI: constructing actual intelligence on a
    computer. It must have intentionality.

Chinese Room Weak vs. Strong
4
Intentionality
  • Intentionality is by definition that feature of
    certain mental states by which they are directed
    at or are about objects and states of affairs in
    the world. Thus beliefs, desires, and intentions
    are intentional states; undirected forms of
    anxiety or depression are not.

Chinese Room Weak vs. Strong
5
Strong AI
  • Thinking is a species of computation.
    (Computationalism/Functionalism)
  • Universal Turing Machines can compute any
    computable function. (Church-Turing thesis)
  • Digital computers are Universal Turing Machines.
  • Therefore,
  • Digital computers can think. (Possible AI)

Chinese Room Weak vs. Strong
6
Chinese Room Simulating Simulation
  • Suppose a computer program can simulate the
    understanding of Chinese by examining all the
    Chinese symbols it receives, consulting an
    internal look-up table, and determining what
    symbols to send back as output (a minimal sketch
    of such a program follows below).
  • Suppose this program could pass the Turing Test.
  • Suppose Searle himself (who understands no
    Chinese) replaces the program.
  • Since Searle obviously would not understand
    Chinese, neither could the computer simulation.
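
Below is a minimal Python sketch of the kind of look-up program the
thought experiment imagines. The function name and the tiny rule book
are invented placeholders; a real program able to pass the Turing Test
would need a vastly larger rule set, but the principle is the same.

# Toy sketch of the Chinese Room's rule book: symbols in, symbols out,
# matched purely by shape. The entries are invented placeholders.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",    # "How are you?" -> "I am fine, thanks."
    "你会说中文吗？": "当然会。",    # "Do you speak Chinese?" -> "Of course."
}

def chinese_room(input_symbols: str) -> str:
    """Return output symbols by consulting the table.

    Nothing here understands the strings; they are handled only by
    their form (syntax), never by their meaning (semantics).
    """
    return RULE_BOOK.get(input_symbols, "请再说一遍。")  # "Please say that again."

print(chinese_room("你好吗？"))

Searle's point is that a person carrying out these same steps by hand
would understand nothing, however large the table.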

Chinese Room Simulating Simulation
7
Chinese Room Main Issue
  • The real issue is whether the right kind of
    computation / program is sufficient for having a
    mind. Does something think in virtue of having
    the right software?

Chinese Room Main Issue
8
Chinese Room Objections
  • Systems: the room knows Chinese, not the man.
    Searle suggests the man in the room internalize
    everything; then he is the whole system. But
    there is a real problem of scale.
  • Robot: the room lacks a connection to the world.
    The computer can only point to other data (even
    the mechanisms that control a robot are,
    internally, just device addresses), while human
    intentionality points outside the brain (e.g. to
    actual horses). But brains don't directly
    interact with the world either.
  • Brain Simulator: as long as it simulates only
    the formal structure of the sequence of neuron
    firings at the synapses, it won't have simulated
    what matters about the brain: its ability to
    produce intentional states.

Chinese Room Objections
9
Chinese Room Objections
  • Combination: of objections 1, 2, and 3
  • Other Minds: how do you know that anyone has a
    mind?
  • Many Mansions: future developments

Chinese Room Objections
10
Main Points
  • But the main point of the present argument is
    that no purely formal model will ever be by
    itself sufficient for intentionality, because the
    formal properties are not by themselves
    constitutive of intentionality and they have by
    themselves no causal powers except the power,
    when instantiated, to produce the next state of
    the formalism when the machine is running.

Chinese Room Objections
11
Main Points
  • Because formal symbol manipulations by
    themselves don't have any intentionality. They
    are meaningless; they aren't even symbol
    manipulations, since the symbols don't symbolize
    anything. In the linguistic jargon, they have
    only a syntax but no semantics.

Chinese Room Objections
12
Main Points
  • Searle's primary argument remains the link
    between the simulation of intelligence and
    "real" intelligence. In the Chinese room he
    attempts to put this on a firm footing, but in
    the discussion he makes the claim clearer when
    he notes that no one would expect to get wet
    "jumping into a swimming pool full of ping-pong
    balls simulating water".

Chinese Room Objections
13
Searle is NOT saying
  • Machines cannot think
  • Thinking organisms have to be biological
  • Thinking is not symbol manipulation
  • Searle is saying
  • Hardware with the right causal powers AND the
    software are important
  • Consciousness is necessary

Searle is NOT saying
14
Chinese Room Scientific American Article, January
1990
  • (A1) Programs are formal (syntactic).
  • (A2) Minds have mental contents (semantics).
  • (A3) Syntax by itself is neither constitutive of
    nor sufficient for semantics.
  • These axioms lead to the following conclusion (p.27)
  • (C1) Programs are neither constitutive of nor
    sufficient for minds.
  • Searle then adds a fourth axiom (p.29)
  • (A4) Brains cause minds.
  • from which we are supposed to "immediately
    derive, trivially" the conclusion (p.29)
  • (C2) Any other system capable of causing minds
    would have to have causal powers (at least)
    equivalent to those of brains.
  • Finally, from the preceding, Searle claims to
    derive two further conclusions (p.29)
  • (C3) Any artifact that produced mental
    phenomena, any artificial brain, would have to be
    able to duplicate the specific causal powers of
    brains, and it could not do that just by running
    a formal program.
  • and
  • (C4) The way that human brains actually produce
    mental phenomena cannot be solely by virtue of
    running a computer program.

Chinese Room Axioms
15
Some Observations
  • Realistic implementation? What is the scale?
  • The Chinese room and the computer cannot cause
    mental events, but he never explains how the
    brain can
  • He agrees consciousness is an emergent property,
    but claims the physical medium must be right

Final Arguments
16
Some Observations
  • The Chinese Room begs the question. The very
    existence of "understanding" is at stake: the
    question is whether the Chinese Room
    "understands". Yet "understanding" is simply
    postulated of other entities-- human minds, and
    the human inside the Chinese Room. Searle cannot
    have it both ways.

Final Arguments
17
Some Observations
  • The man in the Chinese Room represents the
    computer-- more precisely, the CPU, the tiny chip
    which actually executes the instructions that
    make up a program.
  • It seems obvious the CPU really doesn't
    understand, and no one who believes in "strong
    AI" thinks it does.
  • But note Searle's sleight of hand. The Chinese
    Room story tells us that the CPU doesn't
    "understand". But a page later he's trying to say
    that programs don't understand. ("Programs are
    neither constitutive of nor sufficient for
    minds.")

Final Arguments
18
Block Computer Model of the Mind
http://www.nyu.edu/gsas/dept/philo/faculty/block/papers/msb.html
  • The mind is the program of the brain
  • For cognitive scientists, intelligent computers
    offer the chance to learn about the mind

Block Model of the Mind
19
Block Definitions of Intelligence
  • Two definitions
  • The word: behavioral, like the Turing test
  • Construction: how it works (functionalism)

Block Definitions of Intelligence
20
Block Definitions of Intelligence
  • Functionalism is a theory in philosophy developed
    as an answer to the mind-body problem, because of
    objections to both identity theory and logical
    behaviorism. Its core idea is that mental
    states can be accounted for without taking into
    account the underlying physical substrate (the
    neurons), instead attending to higher-level
    functions such as beliefs, desires, and emotions.

Block Definitions of Intelligence
21
Block Homunculi
  • The mind consists of smaller and smaller and
    stupider and stupider homunculi until you finally
    reach the level of mechanical homunculi

Block Homunculi
22
Block Intelligence
  • Intelligent capacities are understood via
    decomposition into a network of capacities
    executed by primitive processes (reductionism).

Block Intelligence
23
Block Multiplication
Multiplication is broken down into repeated
addition (a sketch follows below)
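
A minimal Python sketch of this decomposition (the function names are
mine, not Block's): the "smarter" capacity to multiply is reduced to
repeated calls on a "stupider" capacity to add, which could itself be
decomposed further, down to primitive processes.

def multiply(m: int, n: int) -> int:
    """Multiply by repeated addition: a smarter homunculus that only
    knows how to ask a stupider one to add."""
    total = 0
    for _ in range(n):          # repeat n times
        total = add(total, m)   # delegate each step to the adder
    return total

def add(a: int, b: int) -> int:
    """The stupider homunculus. In Block's picture this would itself
    be decomposed further, e.g. into logic gates."""
    return a + b

assert multiply(7, 6) == 42
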
Block Intelligence
24
Block Primitive Processes
  • For primitive processes the question of how they
    work is not a question for the cognitive scientist.
  • They are the only computational devices for which
    behaviorism is true.

Block Primitive Processes
25
Block Gates The Primitive Processes for Adder
Block Primitive Processes
26
Block Gates The Primitive Processes for Adder
  • Primitive processes are considered equivalent if
    they have the same input and output
  • Here gates can be realized in different ways,
    i.e. by different ways of representing bistable
    states
  • Hardware is irrelevant to the computational
    description (see the sketch below)
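
A toy Python sketch (my own illustration, not Block's code) of this
point about realization: two differently realized AND gates count as
the same primitive process because, under the obvious mapping, they
have the same input/output behavior; and a one-bit adder can be
described entirely in terms of such gates.

def and_gate_electrical(a: int, b: int) -> int:
    """One realization: 1 plays the role of 'switch closed', 0 of 'open'."""
    return 1 if (a == 1 and b == 1) else 0

def and_gate_cat_and_mouse(a: str, b: str) -> str:
    """A different realization of the same gate: 'fed' plays the role
    of 1 and 'hungry' the role of 0 (the slide's cat-and-mouse case)."""
    return "fed" if (a == "fed" and b == "fed") else "hungry"

def or_gate(a: int, b: int) -> int:
    return 1 if (a == 1 or b == 1) else 0

def xor_gate(a: int, b: int) -> int:
    return 1 if a != b else 0

def full_adder(a: int, b: int, carry_in: int) -> tuple:
    """A one-bit adder decomposed into gates: the gates are the
    primitive processes; the adder sits one level up."""
    partial = xor_gate(a, b)
    total = xor_gate(partial, carry_in)
    carry_out = or_gate(and_gate_electrical(a, b),
                        and_gate_electrical(partial, carry_in))
    return total, carry_out

# Same behavior under the mapping fed = 1, hungry = 0:
assert and_gate_electrical(1, 0) == 0
assert and_gate_cat_and_mouse("fed", "hungry") == "hungry"
assert full_adder(1, 1, 0) == (0, 1)   # 1 + 1 = binary 10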

Block Primitive Processes
27
Block Gates The Primitive Processes for Adder
Electrical AND gate: switch open = 0, switch closed = 1
Block Primitive Processes
28
Block Gates The Primitive Processes for Adder
Cat-and-mouse AND gate: hungry mouse = 0, mouse fed = 1
Block Primitive Processes
29
Block Modeling the Mind
  • Cognitive science attempts to model the primitive
    processes of the mind
  • At this level hardware is irrelevant

Block Intelligence and Intentionality
30
Block Intentionality
  • Intentionality is aboutness
  • The thought that the moon is full and the
    perceptual state that the moon is full are both
    intentional states
  • Pain is not an intentional state

Block Intentional Levels
31
Block Functionalism and Intentionality
  • The method of functional analysis that explains
    intelligent processes by reducing them to
    unintelligent mechanical processes does not
    explain intentionality
  • The parts of an intentional system can be just as
    intentional as the whole system: the component
    processors of an intentional system can
    manipulate symbols that are about just the same
    things as the symbols manipulated by the whole
    system

Block Intentional Levels
32
Block Functionalism and Intentionality
  • The multiplier was explained via a decomposition
    into devices that add, subtract and the like.
  • The multiplier's states were intentional in that
    they were about numbers. The states of the adder,
    subtractor, etc., are also about numbers and are
    thus similarly intentional.

Block Intentional Levels
33
Block Intentional Levels
  • There is, however, an important relation between
    intentionality and functional decomposition
  • Though the multiplier's and the adder's states
    are about numbers, the gate's representational
    states represent numerals
  • In general, the subject matter of representations
    shifts as we cross the divide from complex
    processors to primitive processors.

Block Intentional Levels
34
Block Intentional Levels
  • The lowest intentional level is that of the
    primitive processes
  • This level is the realization level, where inputs
    and outputs are about things but the processes do
    not have parts that are about things

Block Intentional Levels
35
Block Symbols and Representations
  • At the primitive process level, there is a shift
    of subject matter from abstractions like numbers,
    or from things in the world, to the symbols
    themselves.
  • The difference between numbers and numerals is
    like the difference between the city of Boston
    and the word "Boston".

Block Intentional Levels
36
Block Intentional Levels
  • The algorithm used by the multiplier is
    notation-independent. It depends on the
    properties of the numbers represented, not on the
    representations themselves.
  • By contrast, the internal operation of the adder
    depends on binary notation, and its description
    speaks of numerals.
  • The adder works in binary, but not in other
    standard notations (see the sketch below).
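
A toy illustration of this contrast (my own code, not Block's): the
repeated-addition multiplier is stated in terms of numbers and never
mentions how they are written, while the adder below manipulates
binary numerals (strings of "0" and "1") and no longer computes
addition if it is fed numerals in another notation.

def multiply(m: int, n: int) -> int:
    """Notation-independent: defined over numbers, however written."""
    total = 0
    for _ in range(n):
        total += m
    return total

def add_binary_numerals(x: str, y: str) -> str:
    """Notation-dependent: works on the numerals themselves and
    presupposes that every symbol is a binary digit."""
    x, y = x.zfill(len(y)), y.zfill(len(x))   # pad to equal length
    digits, carry = [], 0
    for a, b in zip(reversed(x), reversed(y)):
        bit_sum = int(a) + int(b) + carry
        digits.append(str(bit_sum % 2))
        carry = bit_sum // 2
    if carry:
        digits.append("1")
    return "".join(reversed(digits))

assert multiply(6, 7) == 42
assert add_binary_numerals("110", "111") == "1101"   # 6 + 7 = 13 in binary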

Block Intentional Levels
37
Block Intentional Levels
  • The designer has found a machine which has
    physical aspects that can be interpreted
    symbolically
  • Under that symbolic interpretation, there are
    symbolic regularities that are isomorphic to
    rational relations among the semantic values of
    the symbols
  • It is the isomorphism between these two functions
    that explains how it is that a device that
    manipulates symbols manages to add numbers.
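
A small sketch of the isomorphism being described, assuming binary
numerals as the symbols (the helper names are my own): interpreting
the result of the purely symbolic operation gives the same number as
adding the interpretations, which is what licenses saying that the
symbol-manipulating device adds numbers.

def interpret(numeral: str) -> int:
    """The semantic value (a number) of a binary numeral (a symbol)."""
    return int(numeral, 2)

def add_numerals(x: str, y: str) -> str:
    """Symbol manipulation only: bit-level rewrite rules, standing in
    for the gate-level adder; no arithmetic on numbers is used."""
    rules = {("0", "0", "0"): ("0", "0"), ("0", "1", "0"): ("1", "0"),
             ("1", "0", "0"): ("1", "0"), ("1", "1", "0"): ("0", "1"),
             ("0", "0", "1"): ("1", "0"), ("0", "1", "1"): ("0", "1"),
             ("1", "0", "1"): ("0", "1"), ("1", "1", "1"): ("1", "1")}
    x, y = x.zfill(len(y)), y.zfill(len(x))   # pad to equal length
    out, carry = "", "0"
    for a, b in zip(reversed(x), reversed(y)):
        bit, carry = rules[(a, b, carry)]
        out = bit + out
    return ("1" + out) if carry == "1" else out

# The isomorphism: symbolic regularities mirror relations among the
# semantic values of the symbols.
for x, y in [("10", "11"), ("111", "1")]:
    assert interpret(add_numerals(x, y)) == interpret(x) + interpret(y)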

Block Intentional Levels
38
Block Intentional Levels
  • The idea of the brain as a syntactic engine
    driving a semantic engine is just a
    generalization of this picture to a wider class
    of symbolic activities, namely the symbolic
    activities of human thought.
  • The idea is that we have symbolic structures in
    our brains, and that nature (evolution and
    learning) has seen to it that there are
    correlations between causal interactions among
    these structures and rational relations among the
    meanings of the symbolic structures.

Block Intentional Levels
39
Block Computational System
  • As a computational system is decomposed, we reach
    a point where there is a shift in subject matter
    from things in the world to the symbols
    themselves

Block Computational System
40
Block Symbolic Representation
  • Remember
  • The map is not the territory

Block Symbolic Representation