
1
fetzerreply-cpa.ppt
  • version 20120523
  • for Canadian Phil. Assn.
  • based on fetzerreply-ccs.ppt

2
How Cognition Could Be Computing: Semiotic Systems, Computers, & the Mind
  • William J. Rapaport
  • Department of Computer Science & Engineering,
  • Department of Philosophy, Department of Linguistics,
  • and Center for Cognitive Science
  • rapaport@buffalo.edu
  • http://www.cse.buffalo.edu/~rapaport

3
Summary
  • Computationalism: cognition is computable.
  • Mental processes can be the result of algorithmic procedures
  • that can be affected by emotions/attitudes/individual histories.
  • Computers that implement these (cognitive) procedures really exhibit those mental processes.
  • They are semiotic (= sign-using) systems.
  • They really think.
  • Syntactic semantics explains how all this is possible.

4
I. What Is Computationalism?
  • What is AI?
  • Not artificial: it's a computational theory
  • Not about intelligence: it's about cognition
  • Better name:
  • computational cognition
  • cf. computational linguistics, computational statistics, computational geometry, etc.

5
What Is Computationalism?
  • Computationalism =def cognition is computation
  • Hobbes, McCulloch/Pitts, Putnam, Fodor, Pylyshyn, …
  • interesting, worth exploring, possibly true
  • BUT:
  • Not what "computational"/"computable" usually mean!
  • What should computationalism be?
  • Must preserve the crucial insight:
  • cognition is explainable via the mathematical theory of computation

6
Preliminary Definitions
  • Cognition =def whatever cognitive scientists study, including:
  • believing
  • consciousness
  • emotion
  • language
  • learning
  • memory
  • perception
  • planning
  • problem solving
  • reasoning
  • representation
  • including categorization, concepts, mental
    imagery, etc.
  • sensation
  • thought, etc.

7
Preliminary Definitions
  • Algorithm (informal notion)
  • A is an algorithm for executor E to accomplish goal G (informally) =def
  • A is a procedure (finite set/seq. of statements/rules/instructions) such that:
  • Each statement/rule/instruction S is such that:
  • S is composed of a finite # of symbols/marks from a finite alphabet
  • S is unambiguous for E, i.e.:
  • E knows how to execute S
  • E can execute S
  • S can be executed in finite time
  • after executing S, E knows what to do next
  • A halts (= takes finite time)
  • A halts with G accomplished
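By way of illustration (an addition, not part of the original slides): Euclid's algorithm for greatest common divisors satisfies every clause of this informal definition. A minimal Python sketch, with the clauses noted in comments:

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite sequence of unambiguous instructions,
    each drawn from a finite alphabet of symbols and executable in finite
    time, where the executor always knows what to do next."""
    while b != 0:          # each step is executable in finite time
        a, b = b, a % b    # unambiguous: the executor knows what to do next
    return a               # A halts, with goal G (the GCD) accomplished

assert gcd(12, 18) == 6
```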

8
Preliminary Definitions
  • Effective
  • Church: left undefined
  • Rosser:
  • each step is precisely determined
  • method produces an answer
  • in a finite # of steps
  • Markov:
  • process produces an answer
  • Kleene:
  • effective procedure = algorithm
  • Knuth:
  • all operations can be done exactly, in finite time

9
Preliminary Definitions
  • Algorithm (formal notion)
  • A is an algorithm (formally) =def A is (logically equivalent to) a Turing machine
  • Church-Turing Thesis:
  • algorithm_informal = algorithm_formal
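To make the formal notion concrete (an illustrative sketch, not from the original deck): a minimal Turing-machine simulator in Python. The sample machine computes the successor function on unary numerals, appending one "1" to the input:

```python
# Transition table: (state, symbol read) -> (symbol to write, head move, next state)
DELTA = {
    ("scan", "1"): ("1", +1, "scan"),  # move right across the unary numeral
    ("scan", "_"): ("1", 0, "halt"),   # at the first blank, write a 1 and halt
}

def run_tm(tape: str, state: str = "scan") -> str:
    cells = dict(enumerate(tape))      # sparse tape; "_" is the blank symbol
    head = 0
    while state != "halt":
        write, move, state = DELTA[(state, cells.get(head, "_"))]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

assert run_tm("111") == "1111"         # successor of 3, in unary
```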

10
Computable
  • Task / goal / field of study G is computable iff
  • ∃ algorithm(s)_formal for G

11
The Proper Treatment of Computationalism
  • Computationalism ≠ cognition is computation

12
The Proper Treatment of Computationalism
  • Computationalism = cognition is computable
  • i.e., ∃ algorithm(s) that compute cognitive functions
  • Working assumption of computational cognitive science:
  • All cognition is computable
  • Basic research question of computational cognitive science:
  • How much of cognition is computable?

13
Proper Treatment of Computationalism
  • Implementational implication (multiple realization):
  • If cognition is computable, then anything that implements cognitive computations would be cognitive (would really think)
  • even if humans don't do it that way!
  • Turing:
  • brain might be analog, but a digital computer can still pass the TT
  • Piccinini:
  • neural spike trains are not representable as digit strings ⇒ not computational ⇒ brain does not compute
  • BUT:
  • the cognitive functions whose O/P they produce are computable
  • ∴ human cognition is computable, but not computed

14
Proper Treatment of Computationalism
  • 2 Views of Cognitive Science
  • How do humans cognize? (part of CogSci)
  • They might not do so computationally.
  • BUT: we can abstract away from the specifically human to a more general issue:
  • How is cognition possible? (also part of CogSci)
  • Might be computable.

15
The Proper Treatment of Computationalism
  • 2 Views of Computationalism
  • Cognition is computation
  • strong / narrow / nearsighted view
  • the mind or brain is a computer
  • how the M/B does what it does is by computing
  • vs. the proper treatment
  • Cognition is computable
  • weak / wide / farsighted view
  • what the mind or brain does can be described in
    computational terms
  • how it does it is a matter for neuroscience to
    determine

16
Proper Treatment of Computationalism
  • Turing on his test:
  • "the use of words and general educated opinion will alter so much that one will be able to speak of machines thinking without expecting to be contradicted."
  • "general educated opinion"
  • changes when we abstract & generalize
  • "the use of words"
  • changes when reference shifts from a word's initial / narrow application to a more abstract / general phenomenon
  • cf. "fly", "compute", "algorithm"
  • ditto for "cognition" / "think"

17
II. Syntactic Semantics as a theory underlying computationalism
  • Cognition is internal
  • Cognitive agents have direct access only to internal representatives of external objects
  • Semantics is syntactic
  • ⇒ Words, meanings, & semantic relations between them are all syntactic items
  • Understanding is recursive
  • Recursive Case:
  • We understand one thing in terms of another that must already be understood
  • Base Case:
  • We understand something in terms of itself (syntactic understanding)

18
Syntactic Semantics
  • Internalism: Cognitive agents have direct access only to internal representatives of external objects
  • A cognitive agent understands the world by "pushing the world into the mind" (Jackendoff 2002)
  • "Output of sensory transducers is the only contact the cognitive system ever has with the environment" (Pylyshyn 1985)
  • Both words & their meanings (including external objects) are represented internally in a single LOT
  • Humans: biological neural network
  • Computers:
  • artificial neural network
  • symbolic knowledge-representation & reasoning system

19
Syntactic Semantics: Internalism
  • Hume: argument from double vision
  • Kant: noumena vs. phenomena
  • Ayer: argument from illusion
  • Fodor: anti-Putnam methodological solipsism
  • G. Segal: anti-Burge individualism
  • Pylyshyn:
  • "output of sensory transducers is the only contact the cognitive system ever has with the environment"
  • Changizi: argument from time delay in perception

20
Syntactic Semantics
  • (Internalism ⇒) Syntacticism: Words, meanings, & semantic relations between them are all syntactic
  • syntax ≠ grammar

21
Syntactic Semantics
  • (Internalism ⇒) Syntacticism: Words, meanings, & semantic relations between them are all syntactic
  • syntax =def study of relations among members of a single set
  • set of signs / uninterpreted marks / neuron firings / …
  • semantics =def study of relations between members of two sets
  • set of signs / marks / neuron firings / …
  • set of (external) meanings (with its own syntax!)
  • Pushing meanings into the same set as the symbols for them allows semantics to be done syntactically
  • turns semantic relations between 2 sets (internal signs, external meanings) into relations among the marks of a single (internal) LOT, i.e., syntax
  • e.g., truth tables & formal semantics are both syntactic (see the sketch below)
  • e.g., neuron firings representing both signs & external meanings
  • Symbol-manipulating computers can do semantics by doing syntax
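To illustrate the truth-table point (a sketch added here, not from the slides): evaluating a formula by table lookup is pure mark-manipulation, and the "semantic values" T and F are just more marks in the same set as the connectives:

```python
from itertools import product

# The "semantics" of the connectives is itself just a table of marks:
AND = {("T", "T"): "T", ("T", "F"): "F", ("F", "T"): "F", ("F", "F"): "F"}
NOT = {"T": "F", "F": "T"}

# Truth table for not(p and q), computed entirely by symbol lookup:
for p, q in product("TF", repeat=2):
    print(p, q, NOT[AND[(p, q)]])
```

Nothing here reaches outside the symbol system: the "meanings" are members of the same set of marks, which is the point of syntacticism.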

22
[Diagram: a syntactic domain (SYN DOM), with relations among its own members = syntax]


23
[Diagram: the syntactic domain (SYN DOM) plus a separate semantic domain (SEM DOM), with relations between the two = semantics]


24
[Diagram: the semantic domain pushed inside the syntactic domain, so that semantic relations become relations within a single domain = syntactic semantics]

25
Syntactic Semantics: Syntacticism
  • Syntactic semantics underlies the Semantic Web:
  • syntactic info on web pages gains meaning from
  • syntactic info (metadata encoded in RDF) in HTML source files
  • metadata annotates webpage data
  • metadata provides semantic interpretation of webpage data
  • Ceusters & Smith: "Syntactic Web"
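A toy illustration of the point (the triples and names below are invented for the example): RDF-style metadata is itself syntactic, just triples of marks, yet it serves as the semantic interpretation of the page data it annotates:

```python
# A toy triple store: RDF-style metadata as (subject, predicate, object) marks.
triples = [
    ("page:42", "dc:title", "Syntactic Semantics"),
    ("page:42", "dc:creator", "person:wjr"),
    ("person:wjr", "foaf:name", "William J. Rapaport"),
]

def describe(subject: str) -> dict:
    """'Interpret' a subject purely by relating marks to other marks."""
    return {pred: obj for subj, pred, obj in triples if subj == subject}

print(describe("page:42"))  # the page's "meaning", computed syntactically
```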

26
Syntactic Semantics
  • Understanding must be understood recursively
  • Recursive cases:
  • We understand a syntactic domain (SYN-1) indirectly by interpreting it in terms of a semantic domain (SEM-1)
  • e.g., understanding relevance logic in terms of the Routley-Meyer ternary relation on points
  • but SEM-1 must be antecedently understood
  • SEM-1 can be understood by considering it as a syntactic domain SYN-2 interpreted in terms of yet another semantic domain
  • e.g., understanding the RM ternary relation in terms of situation semantics
  • which also must be antecedently understood, etc.
  • Base case:
  • A domain that is understood directly (i.e., not antecedently)
  • in terms of itself (in terms of relations among its symbols)
  • i.e., syntactically & holistically
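The recursion can be caricatured in code (a sketch with an invented chain of domains, added for illustration): understanding a domain means either interpreting it in an antecedently understood semantic domain, or, at the base, understanding it in terms of its own internal relations:

```python
# Invented interpretation chain: each domain's semantic domain, if any.
INTERPRETED_IN = {"SYN-1": "SEM-1", "SEM-1": "SEM-2"}

def understand(domain: str) -> str:
    semantic_domain = INTERPRETED_IN.get(domain)
    if semantic_domain is None:
        # Base case: no further semantic domain, so the domain is
        # understood in terms of relations among its own symbols.
        return f"{domain}, understood syntactically (in terms of itself)"
    # Recursive case: interpret in an antecedently understood domain.
    return f"{domain}, understood via [{understand(semantic_domain)}]"

print(understand("SYN-1"))
```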

27
Syntactic Semantics: Recursiveness
  • Syntactic understanding:
  • "the meaning of an internal state (which may or may not be linked to an external state of affairs) for the system itself is most naturally defined in terms of that state's relations to its other states."
  • Edelman, Shimon (2008), "On the Nature of Minds, or: Truth and Consequences", JETAI 20: 181-196; quote on pp. 188-189.
  • I.e., syntactically

28
III. Rapaport's Thesis
  • Syntax suffices for semantic cognition
  • cognition is computable
  • ∴ computers are capable of thinking
  • James H. Fetzer's Thesis:
  • It doesn't,
  • it isn't,
  • they aren't

29
The Nature of Signs
[Figure 3, The Nature of Signs: something S is a sign of something x for somebody z; sign-user z stands in a causation relation to S; thing x grounds S; x is the interpretant (w.r.t. a context) for z.]
30
Questions about Semiotic Systems
  • What is the causation relation between sign-user z and sign S?
  • Which causes which?
  • What is the grounding relation between sign S and thing x that S stands for?
  • If S is grounded by that (x) which it stands for, does "is grounded by" = "stands for"?
  • If sign S stands for thing x for sign-user z, then do we need a different diagram?

31
  • Sign-user z causes sign S
  • Thing x grounds sign S
  • Sign S stands for thing x for sign-user z

32
Input-Output Systems
[Figure 5, An Input-Output System: the same triad with computer c in place of sign-user z, input i in place of sign S, and output o in place of thing x; i stands in a causation relation (w.r.t. a context) to c, but there is no grounding relation between i and o.]
33
Input-Output Systems
  • Differences:
  • Sign-user z is now computer c
  • Sign S is now input i
  • Thing x is now output o
  • No grounding relation between i & o
  • BUT:
  • The marks by means of which computers operate include more than merely the (external) input
  • Can include internally stored marks
  • What the marks stand for is not necessarily the output
  • What does it mean for sign S to stand for thing x yet not be grounded by x?

34
Fetzer's Thesis
  • Computers differ from cognitive agents in 3 ways:
  • statically (symbolically)
  • dynamically (algorithmically)
  • affectively (emotionally)
  • Simulation is not the real thing

35
Fetzer's Static Difference
ARGUMENT 1: Computers are mark-manipulating systems; minds are not.
Premise 1: Computers manipulate marks on the basis of their sizes, shapes, and relative locations.
Premise 2: (a) These shapes, sizes, and relative locations exert causal influence upon computers but (b) do not stand for anything for those systems.
Premise 3: Minds operate by utilizing signs that stand for other things in some respect or other for them as sign-using (or semiotic) systems.
Conclusion 1: Computers are not semiotic (or sign-using) systems.
Conclusion 2: Computers are not the possessors of minds.
(Figure 10: The Static Difference)
36
The Static Difference
  • Static Premise 1
  • Is computer manipulation of symbols independent of meaning?
  • depends on what "meaning" means
  • Computational symbol-manipulation is independent of external, 3rd-person meaning imposed on the symbols
  • But not independent of internal, 1st-person meaning
  • which arises from syntactic relations among internal symbols
  • = intrinsic meaning

37
The Static Difference
  • Static Premise 2b
  • "The symbols that computers manipulate do not stand for anything for those computers."
  • But:
  • Fetzer's locution allows for the possibility that symbols could stand for something for the computer
  • Insofar as they could, such machines might be capable of thinking
  • He should have said "could not stand for anything"
  • But then he'd be wrong ;-)

38
Fetzer - Computers Are Not Semiotic Systems
  • But:
  • Semiotic systems interpret signs
  • An algorithm's O/P is an interpretation of its I/P
  • Algorithms ground the I/P-O/P relation
  • Computers are algorithm machines
  • ∴ Computers are semiotic systems

39
Fetzer - Computers Are Not Semiotic Systems
  • In a semiotic system (e.g., a mind):
  • something (S) is a sign of something (x) for somebody (z)
  • x grounds sign S
  • x is an interpretant (w.r.t. a context) for sign-user z
  • S is in a causation relation with z

40
Fetzer - Computers Are Not Semiotic Systems
  • In a computer (I/O) system:
  • input i (playing the role of sign S) is in a causation relation with computer c (playing the role of sign-user z)
  • output o (playing the role of thing x) is in an interpretant relation with computer c
  • BUT: no grounding relation between i & o

41
Fetzer - Computers Are Not Semiotic Systems
  • ∴ Computers only have causal relationships, no mediation between I/P & O/P (?!)
  • But semiotic systems require such mediation
  • Peirce: the interpretant is mediately determined by the sign
  • the interpretant is really the sign-user's mental concept of the thing x (!!)
  • ∴ Computers are not semiotic systems
  • But minds are.
  • ∴ Minds are not computers & computers can't be minds.

42
Three Arguments against the Static Difference
  • Incardona: Computers are semiotic systems!
  • X is a semiotic system iff X carries out a process that mediates between a sign & its interpretant
  • Semiotic systems interpret signs
  • Algorithms describe processes that mediate between I/Ps & O/Ps
  • An algorithm's O/P is an interpretation of its I/P
  • Algorithms ground the I/O relation
  • Computers are algorithm machines
  • ∴ Computers are semiotic systems

43
Three Arguments against the Static Difference
  • Argument that computers are semiotic systems, from embedding in the world
  • Fetzer's (counter?)example:
  • "A red light at an intersection stands for applying the brakes and coming to a complete halt, only proceeding when the light turns green, for those who know the rules of the road."
  • Can such a red light stand for applying the brakes, etc., for a computer?
  • It could, if the computer knows the rules of the road
  • But a computer can know those rules
  • if it has those rules stored in a knowledge base
  • and if it uses those rules to drive a vehicle (see the sketch below)
  • cf. the 2007 DARPA Urban Challenge
  • Parisien & Thagard 2008, "Robosemantics: How Stanley Represents the World", Minds & Machines 18
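A toy rule-based sketch (invented rules, not Stanley's actual architecture) of what "knowing the rules of the road" could amount to computationally; the stored rule is what mediates between the sign (the red light) and the action it stands for:

```python
# Toy knowledge base of traffic rules, invented for illustration.
RULES = {
    "red":    ["apply_brakes", "come_to_complete_halt"],
    "green":  ["proceed"],
    "yellow": ["prepare_to_stop"],
}

def respond_to_signal(percept: str) -> list[str]:
    """The red light stands for braking *for this system* insofar as
    a stored rule mediates between the sign and the action."""
    return RULES.get(percept, ["proceed_with_caution"])

print(respond_to_signal("red"))  # ['apply_brakes', 'come_to_complete_halt']
```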

44
Three Arguments against the Static Difference
  • Goldfain: a computer's marks stand for something for it
  • Does a calculator that computes GCDs understand them?
  • Fetzer & Rapaport: No
  • Could a computer that computes GCDs understand them?
  • Fetzer: No
  • Goldfain & Rapaport: Yes, it could
  • as long as it had enough background / contextual / supporting info
  • a computer with a full-blown theory of math, at the level of an algebra student learning GCDs, could understand GCDs as well as the student

45
The Static Difference
  • Goldfain: Computers could be semiotic systems
  • G1: The natural #s that a cognitive agent refers to are denoted by a sequence of unique marks exemplifying a finite initial segment of the natural-# structure.
  • G2: Such a finite initial segment can be generated by a computational cognitive agent (a computer) via perception & action in the world during an act of counting (e.g., using Lisp's gensym)
  • they have a history of how they became marks that signify something for the agent (the computer).
  • G3: These marks (e.g., b4532, b182, b9000) have no meaning for a human user who lacks access to their ordering.
  • G4: Such private marks ("numerons") are associable with publicly meaningful marks ("numerlogs")
  • e.g., b4532 denotes the same number as "1", b182 denotes the same number as "2", etc.
  • G5: A computational cognitive agent (a computer) can do math solely on the basis of its numerons (see Goldfain's dissertation)
  • C1: ∴ These marks stand for something for the computer (the agent).
  • C2: & we can check the math, because of G4.
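A sketch of G1-G5 (Lisp's gensym transposed to Python; the helper below is invented for illustration): counting generates a private, ordered sequence of unique marks (numerons), later associable with public numerals (numerlogs):

```python
import itertools

_fresh = itertools.count(4000)   # arbitrary start, so the marks look opaque

def gensym(prefix: str = "b") -> str:
    """Generate a fresh, otherwise meaningless mark (a numeron)."""
    return f"{prefix}{next(_fresh)}"

# G2: counting three perceived objects yields an ordered private sequence;
# the marks signify something for the agent only via that history/ordering.
numerons = [gensym() for _ in range(3)]

# G4: numerons are associable with publicly meaningful numerlogs,
# which is what lets us check the agent's math (C2).
numerlogs = {mark: str(n + 1) for n, mark in enumerate(numerons)}
print(numerons, numerlogs)       # e.g. ['b4000', 'b4001', 'b4002'] {...}
```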

46
The Static Difference
  • Static Premise 1
  • Computers do manipulate marks on the basis of size, shape, & location
  • but also on the basis of
  • relations of those symbols to other symbols
  • i.e., on the basis of their syntax
  • which Fetzer can safely add to his theory
  • this processing is not independent of their meaning (by Syntactic Semantics, part II)
  • but is independent of (external) reference
  • In this way, such symbols can stand for something for the computer
  • Computers are indeed string-processing systems
  • But meaning can arise from (appropriate) combinations of strings

47
Summary: No Static Differences
  • Both computers & minds manipulate marks
  • The marks can stand for something for both computers & minds
  • Computers (and minds) are semiotic systems
  • Computers can possess minds

48
Fetzer's Dynamic Difference
ARGUMENT 2: Computers are governed by algorithms, but minds are not.
Premise 1: Computers are governed by programs, which are causal models of algorithms.
Premise 2: Algorithms are effective decision procedures for arriving at definite solutions to problems in a finite number of steps.
Premise 3: Most human thought processes, including dreams, daydreams, and ordinary thinking, are not procedures for arriving at solutions to problems in a finite number of steps.
Conclusion 1: Most human thought processes are not governed by programs as causal models of algorithms.
Conclusion 2: Minds are not computers.
(Figure 11: The Dynamic Difference)
49
The Dynamic Difference: A Red Herring
  • Fetzer:
  • If thinking is computing & computing is thinking
  • if computing is algorithmic
  • then thinking is algorithmic
  • but it isn't
  • 2nd conjunct is irrelevant and false:
  • A computer executing a non-cognitive program (e.g., an operating system) is computing but not thinking

50
The Dynamic Difference
  • Premises 1 & 2
  • Def. of "algorithm" is OK
  • But algorithms may be the wrong entity
  • may need a more general notion of procedure (Shapiro)
  • like an algorithm, but:
  • need not halt
  • need not yield correct output
  • can access an external KB (Turing's oracle machine)

51
The Dynamic Difference
  • Premise 3: Most human thinking is not algorithmic
  • Dreams are not algorithms
  • Ordinary stream-of-consciousness thinking is not algorithmic
  • BUT:
  • Some human thought processes may indeed not be algorithms
  • Consistent with the proper treatment of computationalism
  • Real issue is:
  • Could there be algorithms/procedures that produce these (or other mental states or processes)?
  • If dreams are our interpretations of random neuron firings during sleep, as if they were due to external causes,
  • then if non-dream neuron-firings are computable (& there's every reason to think they are), then so are dreams
  • Stream of consciousness might be computable
  • e.g., via spreading activation in a semantic network (see the sketch below)
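A minimal spreading-activation sketch (toy network and weights, invented for illustration) of the kind of mechanism by which a stream of consciousness might be computable: activation flows from the current concept to its neighbors, and the most active node becomes the next "thought":

```python
# Toy semantic network: concept -> weighted associates (invented).
NET = {
    "beach": {"sand": 0.9, "ocean": 0.8, "vacation": 0.6},
    "ocean": {"fish": 0.7, "beach": 0.8},
    "sand":  {"castle": 0.5, "beach": 0.9},
}

def next_thought(current: str, activation: dict) -> str:
    """Spread activation from the current concept and return the
    most active concept as the next thought in the stream."""
    for neighbor, weight in NET.get(current, {}).items():
        activation[neighbor] = activation.get(neighbor, 0.0) + weight
    activation[current] = 0.0           # inhibit dwelling on the same thought
    return max(activation, key=activation.get)

stream, act = ["beach"], {}
for _ in range(4):
    stream.append(next_thought(stream[-1], act))
print(stream)                           # a wandering "stream of thought"
```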

52
The Dynamic Difference
  • Whether a mental state/process is computable is at least an empirical question
  • Must avoid the Hubert Dreyfus fallacy:
  • one philosopher's idea of a non-computable process is another computer scientist's research project
  • what no one has yet written a program for is not thereby necessarily non-computable
  • In fact: Mueller, Erik T. (1990), Daydreaming in Humans & Machines: A Computer Model of the Stream of Thought (Ablex)
  • Cf. Edelman, Shimon (2008), Computing the Mind (Oxford)
  • ∴ burden of proof is on Fetzer!

53
The Dynamic Difference
  • Dynamic Conclusion 2
  • Are minds computers?
  • Maybe, maybe not
  • I prefer to say (with Shimon Edelman, et al.):
  • The (human) mind is a virtual machine, computationally implemented (in the nervous system)

54
Summary: No Dynamic Difference
  • All (human) thought processes are/might be describable by algorithms/procedures
  • = computationalism, properly treated

55
Fetzer's Affective Difference
ARGUMENT 3: Mental thought transitions are affected by emotions, attitudes, and histories, but computers are not.
Premise 1: Computers are governed by programs, which are causal models of algorithms.
Premise 2: Algorithms are effective decisions, which are not affected by emotions, attitudes, or histories.
Premise 3: Mental thought transitions are affected by values of variables that do not affect computers.
Conclusion 1: The processes controlling mental thought transitions are fundamentally different than those that control computer procedures.
Conclusion 2: Minds are not computers.
(Figure 12: The Affective Difference)
56
The Affective Difference
  • Fetzer's definitions:
  • intension of expression E =def
  • conditions that need to be satisfied for something to be an E
  • extension of E =def
  • class of all things that satisfy E's intension
  • denotation of E for agent A =def
  • subset of E's extension that A comes into contact with
  • connotation of E for A =def
  • A's attitudes & emotions in response to A's interactions with E's denotation for A
  • Somewhat non-standard, but useful
  • e.g., meaning of E for A = E's denotation & connotation for A
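These four notions can be rendered as a small sketch (toy data, invented for illustration), which also makes vivid why denotation and connotation are agent-relative while intension and extension are not:

```python
# Toy universe and one agent's history, invented for illustration.
universe = ["rex", "fido", "lassie", "tweety"]

def is_dog(thing: str) -> bool:
    """Intension of "dog": the conditions something must satisfy."""
    return thing != "tweety"

extension = [t for t in universe if is_dog(t)]        # all satisfiers
contacts = {"rex", "lassie"}                          # what agent A has met
denotation = [t for t in extension if t in contacts]  # extension, restricted to A
connotation = {"rex": "fond", "lassie": "admiring"}   # A's attitudes from those interactions

print(extension)    # ['rex', 'fido', 'lassie'] -- agent-independent
print(denotation)   # ['rex', 'lassie']         -- relative to A
```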

57
Contra Affective Premises 2 & 3
  • Programs can be based on (idiosyncratic) emotions, attitudes, & histories
  • Rapaport-Ehrlich contextual vocabulary acquisition program
  • Learns a meaning for an unfamiliar word from
  • the word's textual context
  • integrated with the reader's idiosyncratic
  • denotations, connotations,
  • emotions, attitudes, histories,
  • & prior beliefs (see the sketch below)
  • Sloman, Picard, Thagard:
  • developing computational theories of affect, emotion, etc.
  • Emotions, attitudes, & histories can affect computers that model them.
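The actual Rapaport-Ehrlich system is implemented in the SNePS knowledge-representation framework; the toy sketch below (invented merging rule and data) only illustrates the general idea that the hypothesized meaning is computed from the word's textual context integrated with the reader's idiosyncratic prior beliefs, so different histories yield different meanings:

```python
def hypothesize_meaning(word: str, context: list[str], prior_beliefs: dict) -> set:
    """Toy contextual vocabulary acquisition: merge the textual context
    with whatever the reader already believes about the context words."""
    hypothesis = set(context)
    for w in context:
        hypothesis |= prior_beliefs.get(w, set())
    return hypothesis

context = ["small", "white", "hound"]            # invented example context
reader1 = {"hound": {"dog", "hunting"}}          # one reader's history
reader2 = {"white": {"purity"}}                  # another reader's history

print(hypothesize_meaning("brachet", context, reader1))
print(hypothesize_meaning("brachet", context, reader2))  # different meanings
```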

58
Summary: No Affective Differences
  • Processes controlling mental thought transitions are not fundamentally different from those controlling algorithms/procedures.
  • Algorithms can take emotions/attitudes/histories into account.
  • Both computers & minds can be affected by emotions/attitudes/histories.

59
The Matter of Simulation
ARGUMENT 4: Digital machines can nevertheless simulate thought processes and other forms of human behavior.
Premise 1: Computer programmers and those who design the systems that they control can increase their performance capabilities, making them better and better simulations.
Premise 2: Their performance capabilities may be closer and closer approximations to the performance capabilities of human beings without turning them into thinking things.
Premise 3: Indeed, the static, dynamic, and affective differences that distinguish computer performance from human performance preclude them from being thinking things.
Conclusion: Although the performance capabilities of digital machines can become better and better approximations of human behavior, they are still not thinking things.
(Figure 15: The Matter of Simulation)
60
Argument from Simulation
  • Agreed: A computer that simulates some process P is not necessarily really doing P
  • But what is "really doing P" vs. "simulating P"?
  • What is the scope of a simulation?
  • Computer simulations of hurricanes don't get real people really wet
  • Real people are outside the scope of the simulation
  • BUT a computer simulation of a hurricane could get simulated people simulatedly wet
  • Computer simulation of the daily operations of a bank is not thereby the daily operations of a (real) bank
  • BUT I can do my banking online
  • Simulations can be used as if they were real

61
Argument from Simulation
  • Some simulations of Xs are real Xs
  • scale model of a scale model of X is a scale
    model of X
  • Xeroxed/faxed/PDF copies of documents are those
    documents
  • A computer that simulates an informational
    process is thereby actually doing that
    informational process
  • Because a computer simulation of information is
    information

62
Argument from Simulation
  • Computer simulation of a picture is a picture
  • digital photography
  • Computer simulation of language is language
  • computers really do parse sentences (Woods)
  • IBM's Watson really answers questions
  • Computer simulation of math is math
  • "A simulation of a computation and the computation itself are equivalent: try to simulate the addition of 2 and 3, and the result will be just as good as if you actually carried out the addition; that is the nature of numbers" (Edelman)
  • Computer simulation of reasoning is reasoning
  • automated theorem proving, computational logic, …

63
Argument from Simulation
  • Computer simulation of cognition is cognition
  • "if the mind is a computational entity, a simulation of the relevant computations would constitute its fully functional replica" (Edelman)
  • cf. the implementational implication

64
Argument from Simulation
  • "A simulation of a computation and the computation itself are equivalent: try to simulate the addition of 2 and 3, and the result will be just as good as if you actually carried out the addition; that is the nature of numbers. Therefore, if the mind is a computational entity, a simulation of the relevant computations would constitute its fully functional replica."
  • Shimon Edelman (2008), Computing the Mind

65
Summary: Simulation Can Be(come) the Real Thing
  • Close approximation to human thought processes can turn computers into thinking things
  • actually?
  • only asymptotically?
  • merely conventionally?
  • Turing said:

66
  • "the use of words and general educated opinion will alter so much that one will be able to speak of machines thinking without expecting to be contradicted." (Turing 1950)
  • "general educated opinion"
  • changes when we abstract & generalize
  • "the use of words"
  • changes when reference shifts from a word's initial / narrow application to a more abstract / general phenomenon
  • cf. "fly", "compute", "algorithm"
  • ditto for "cognition" / "think"

67
Summary
  • Computers are semiotic (sign-using) systems.
  • Computationalism, properly treated: cognition is computable
  • not necessarily computational.
  • Any non-computable residue will be negligible
  • Mental processes are describable (governable?) by algorithmic procedures
  • that can be affected by emotions/attitudes/individual histories.
  • Computers that implement these cognitive procedures really exhibit those cognitive behaviors.
  • They really think.
  • Computers can possess minds.
  • Syntactic semantics explains how all this is possible.

68
Any non-computable residue will be negligible
  • On negligible differences
  • cf. music on CDs with music on vinyl
  • discrete/digital vs. continuous/analog
  • Does it matter whether a cognitive computer
    really thinks?
  • on the meaning of really
  • human-specific thinking
  • vs. abstract/general notion of thinking
  • on does it matter
  • an android will need to behave ethically
  • to be treated ethically

69
  • Rapaport, William J. (2012),
  • "Semiotic Systems, Computers, and the Mind: How Cognition Could Be Computing",
  • International Journal of Signs and Semiotic Systems 2(1) (January-June): 32-71.