Knowledge Representation and Reasoning
PowerPoint presentation transcript (Slides: 52; provided by: StuartC93)
1
Knowledge Representation and Reasoning
  • Stuart C. Shapiro
  • Professor, CSE
  • Director, SNePS Research Group
  • Member, Center for Cognitive Science

2
Introduction
3
Long-Term Goal
  • Theory and implementation of a natural-language-competent computerized cognitive agent
  • Supporting research in
    • Artificial Intelligence
    • Cognitive Science
    • Computational Linguistics.

4
Research Areas
  • Knowledge Representation and Reasoning
  • Cognitive Robotics
  • Natural-Language Understanding
  • Natural-Language Generation.

5
Goal
  • A computational cognitive agent that can
  • Understand and communicate in English
  • Discuss specific, generic, and rule-like
    information
  • Reason
  • Discuss acts and plans
  • Sense
  • Act
  • Remember and report what it has sensed and done.

6
Cassie
  • A computational cognitive agent
  • Embodied in hardware, or software-simulated
  • Based on SNePS and GLAIR.

7
GLAIR Architecture
Grounded Layered Architecture with Integrated Reasoning
  • Knowledge Level: SNePS
  • Perceptuo-Motor Level: NL
  • Sensory-Actuator Level: Vision, Sonar, Motion, Proprioception
8
SNePS
  • Knowledge Representation and Reasoning
  • Propositions as Terms
  • SNIP: SNePS Inference Package
  • Specialized connectives and quantifiers
  • SNeBR: SNePS Belief Revision
  • SNeRE: SNePS Rational Engine
  • Interface Languages
  • SNePSUL: Lisp-Like
  • SNePSLOG: Logic-Like
  • GATN for Fragments of English.

9
Example: Cassie's Worlds
10
BlocksWorld
11
FEVAHR
12
FEVAHRWorld Simulation
13
UXO Remediation
[Figure: simulated field with corner flags, a drop-off zone, UXO and non-UXO objects, a battery meter, a recharging station, a safe zone, and Cassie]
14
Crystal Space Environment
15
Sample Research Issues: Complex Categories
16
Complex Categories 1
  • Noun Phrases: <Det> (N | Adj)* N
  • Understanding of the modification must be left to reasoning.
  • Example: "orange juice seat"
  • Representation must be left vague.

17
Complex Categories 2
  • Kevin went to the orange juice seat.
  • I understand that Kevin went to the orange juice
    seat.
  • Did Kevin go to a seat?
  • Yes, Kevin went to the orange juice seat.

18
Complex Categories 3
  • Pat is an excellent teacher.
  • I understand that Pat is an excellent teacher.
  • Is Pat a teacher?
  • Yes, Pat is a teacher.
  • Lucy is a former teacher.
  • I understand that Lucy is a former teacher.

19
Complex Categories 4
  • "former" is a negative adjective.
  • I understand that "former" is a negative adjective.
  • Is Lucy a teacher?
  • No, Lucy is not a teacher.

Also note representation and use of knowledge
about words.
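The dialogue above can be sketched as a tiny membership test that uses knowledge about adjective types. This is hypothetical illustration code, not SNePS; the names `NEGATIVE_ADJECTIVES`, `beliefs`, and `is_member` are invented for the sketch:

```python
# Sketch: adjective-aware category membership (hypothetical code, not SNePS).
# A "negative" adjective such as "former" blocks the usual entailment
# "Adj N implies N"; other adjectives preserve it.

NEGATIVE_ADJECTIVES = {"former", "fake"}

# Beliefs of the form "<individual> is a <Adj> <N>".
beliefs = {
    "Pat": ("excellent", "teacher"),
    "Lucy": ("former", "teacher"),
}

def is_member(individual: str, category: str) -> str:
    """Answer 'Is <individual> a <category>?' using knowledge about adjectives."""
    adj, noun = beliefs[individual]
    if noun != category:
        return "Unknown"
    return "No" if adj in NEGATIVE_ADJECTIVES else "Yes"

print(is_member("Pat", "teacher"))   # Yes
print(is_member("Lucy", "teacher"))  # No
```

The point of the sketch is only that the entailment from "Adj N" to "N" is mediated by reasoning over word knowledge, not hard-wired into the representation.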
20
Sample Research Issues: Indexicals
21
Representation and Use of Indexicals
  • Words whose meanings are determined by occasion
    of use
  • E.g. I, you, now, then, here, there
  • Deictic Center: <I, YOU, NOW>
  • I: SNePS term representing Cassie
  • YOU: the person Cassie is talking with
  • NOW: the current time.

22
Analysis of Indexicals (in input)
  • First-person pronouns → YOU
  • Second-person pronouns → I
  • "here" → location of YOU
  • Present/past relative to NOW.

23
Generation of Indexicals
  • I → first-person pronouns
  • YOU → second-person pronouns
  • NOW: used to determine tense and aspect.
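The analysis and generation mappings above can be sketched with a small deictic-center object. This is a hypothetical illustration of the <I, YOU, NOW> idea, not the actual GATN grammar; the class `DeicticCenter` and its methods are invented for the sketch:

```python
# Sketch of a deictic center for indexical analysis/generation
# (hypothetical illustration, not the GATN grammar used with SNePS).

class DeicticCenter:
    def __init__(self, agent="Cassie", addressee=None):
        self.I = agent          # the agent itself (Cassie)
        self.YOU = addressee    # whoever the agent is talking with

    def analyze(self, word):
        """Map an indexical in *input* to the term it denotes."""
        return {"I": self.YOU,    # the speaker's "I" denotes Cassie's YOU
                "you": self.I,    # the speaker's "you" denotes Cassie herself
                "here": f"location of {self.YOU}"}.get(word, word)

    def generate(self, term):
        """Map a term to the indexical used in *output*."""
        if term == self.I:
            return "I"
        if term == self.YOU:
            return "you"
        return term

dc = DeicticCenter(addressee="Stu")
print(dc.analyze("I"))       # Stu   (the user's "I" denotes the user)
print(dc.generate("Cassie")) # I     (Cassie refers to herself as "I")
```

Reassigning `dc.YOU` when the addressee changes (e.g. "Talk to Bill.") is what makes the same sentence "Who am I?" receive different answers in the dialogues that follow.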

24
Use of Indexicals 1
Come here.
25
Use of Indexicals 2
Come here.
I came to you, Stu. I am near you.
26
Use of Indexicals 3
Who am I?
Your name is Stu and you are a person.
Who have you talked to?
I am talking to you.
Talk to Bill.
I am talking to you, Bill.
Come here.
27
Use of Indexicals 4
Come here.
I found you. I am looking at you.
28
Use of Indexicals 5
Come here.
I found you. I am looking at you.
I came to you. I am near you.
29
Use of Indexicals 6
Who am I?
Your name is Bill and you are a person.
Who are you?
I am the FEVAHR and my name is Cassie.
Who have you talked to?
I talked to Stu and I am talking to you.
30
Current Research Issues: Distinguishing Perceptually Indistinguishable Objects
Ph.D. Dissertation, John F. Santore
31
  • Some robots in a suite of rooms.

32
  • Are these the same two robots?
  • Why do you think so/not?

33
Next Steps
  • How do people do this?
  • Currently doing protocol experiments
  • Getting Cassie to do it.

34
Current Research Issues: Belief Revision in a Deductively Open Belief Space
Ph.D. Dissertation, Frances L. Johnson
35
Belief Revision in a Deductively Open Belief
Space
  • Beliefs in a knowledge base must be revisable (belief revision)
  • Add / remove beliefs
  • Detect and correct errors, conflicts, and inconsistencies
  • BUT
  • Guaranteeing consistency is an ideal concept
  • Real-world systems are not ideal

36
Belief Revision in a DOBS: Ideal Theories vs. the Real World
  • Ideal Belief Revision theories assume
  • No reasoning limits (time or storage)
  • All derivable beliefs are acquirable (deductive
    closure)
  • All belief credibilities are known and fixed
  • Real world
  • Reasoning takes time, storage space is finite
  • Some implicit beliefs might be currently
    inaccessible
  • Source/belief credibilities can change

37
Belief Revision in a DOBS: A Real-World KR System
  • Must recognize its limitations
  • Some knowledge remains implicit
  • Inconsistencies might be missed
  • A source turns out to be unreliable
  • Revision choices might be poor in hindsight
  • After further deduction or knowledge acquisition
  • Must repair itself
  • Catch and correct poor revision choices

38
Belief Revision in a DOBS: Theory Example (Reconsideration)
  • Ranking 1 is more credible than Ranking 2.
  • College A is better than College B. (Source: Ranking 1)
  • College B is better than College A. (Source: Ranking 2)
  • Later: Ranking 1 was flawed, so Ranking 2 is more credible than Ranking 1.
  • Need to reconsider!
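The reconsideration idea above can be sketched in a few lines. This is hypothetical illustration code, not the SNeBR implementation; `base`, `conflict`, and `accepted` are invented names. The key point is that every belief ever acquired is retained, and the accepted set is recomputed from the current source credibilities, so a credibility change reinstates a previously rejected belief:

```python
# Sketch of reconsideration (hypothetical code, not SNeBR).

base = [
    ("College A is better than College B", "Ranking 1"),
    ("College B is better than College A", "Ranking 2"),
]
# Which claim directly contradicts which.
conflict = {base[0][0]: base[1][0], base[1][0]: base[0][0]}

def accepted(credibility):
    """Return the claims accepted under the given source credibilities."""
    result = []
    for claim, source in base:
        rival_sources = [s for c, s in base if c == conflict.get(claim)]
        if all(credibility[source] > credibility[s] for s in rival_sources):
            result.append(claim)
    return result

# Initially Ranking 1 is more credible, so College A wins:
print(accepted({"Ranking 1": 2, "Ranking 2": 1}))
# Ranking 1 turns out to be flawed; reconsideration flips the outcome:
print(accepted({"Ranking 1": 1, "Ranking 2": 2}))
```

A system that discarded the rejected belief outright could not recover it; keeping the full (deductively open) base is what makes reconsideration possible.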
39
Next Steps
  • Implement reconsideration
  • Develop benchmarks for implemented KR&R systems.

40
Current Research Issues: Default Reasoning by Preferential Ordering of Beliefs
M.S. Thesis, Bharat Bhushan
41
Small Knowledge Base
  • Birds have wings.
  • Birds fly.
  • Penguins are birds.
  • Penguins don't fly.

42
KB Using Default Logic
  • ∀x(Bird(x) ⇒ Has(x, wings))
  • Bird(x) : Flies(x) / Flies(x)   (the default "birds fly")
  • ∀x(Penguin(x) ⇒ Bird(x))
  • ∀x(Penguin(x) ⇒ ¬Flies(x))

43
KB Using Preferential Ordering
  • ∀x(Bird(x) ⇒ Has(x, wings))
  • ∀x(Bird(x) ⇒ Flies(x))
  • ∀x(Penguin(x) ⇒ Bird(x))
  • ∀x(Penguin(x) ⇒ ¬Flies(x))
  • Precludes(∀x(Penguin(x) ⇒ ¬Flies(x)), ∀x(Bird(x) ⇒ Flies(x)))
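The effect of the Precludes relation can be sketched as follows. This is hypothetical illustration code, not the thesis implementation; `rules`, `precludes`, and `conclusions` are invented names. A rule's conclusion is drawn only if no applicable rule precludes it:

```python
# Sketch of default reasoning by preferential ordering (hypothetical code).

birds = {"Tweety", "Opus"}
penguins = {"Opus"}          # penguins are birds, so Opus is in both sets

# Rule name -> applicability condition for an individual x.
rules = {
    "has_wings": lambda x: x in birds,
    "flies":     lambda x: x in birds,
    "not_flies": lambda x: x in penguins,
}
# Precludes(penguin rule, bird rule): the specific rule defeats the general one.
precludes = {("not_flies", "flies")}

def conclusions(x):
    applicable = {name for name, cond in rules.items() if cond(x)}
    return {r for r in applicable
            if not any((p, r) in precludes for p in applicable)}

print(sorted(conclusions("Tweety")))  # ['flies', 'has_wings']
print(sorted(conclusions("Opus")))    # ['has_wings', 'not_flies']
```

For Tweety (a non-penguin bird) both bird rules fire; for Opus the penguin rule precludes the "birds fly" rule, so no contradiction is derived.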

44
Next Steps
  • Finish theory and implementation.

45
Current Research Issues: Representation & Reasoning with Arbitrary Objects
Stuart C. Shapiro
46
Classical Representation
  • Clyde is gray.
  • Gray(Clyde)
  • All elephants are gray.
  • ∀x(Elephant(x) ⇒ Gray(x))
  • Some elephants are albino.
  • ∃x(Elephant(x) ∧ Albino(x))
  • Why the difference?

47
Representation Using Arbitrary & Indefinite Objects
  • Clyde is gray.
  • Gray(Clyde)
  • Elephants are gray.
  • Gray(any x Elephant(x))
  • Some elephants are albino.
  • Albino(some x Elephant(x))

48
Subsumption Among Arbitrary & Indefinite Objects
(any x Elephant(x))
(any x Albino(x) ∧ Elephant(x))
(some x Albino(x) ∧ Elephant(x))
(some x Elephant(x))
Each term above subsumes the ones below it. If x subsumes y, then P(x) ⇒ P(y).
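The subsumption chain above can be sketched by modeling each term as its set of restriction predicates. This is hypothetical illustration code, not SNePS 3 itself; the function `subsumes` and the term encoding are invented for the sketch:

```python
# Sketch of subsumption among arbitrary/indefinite terms (hypothetical code).
# Each term is (kind, frozenset of restriction predicates). An arbitrary
# ("any") term with FEWER restrictions is more general, so it subsumes one
# with more; for indefinite ("some") terms the direction reverses.

def subsumes(x, y):
    """Does term x subsume term y?"""
    kx, rx = x
    ky, ry = y
    if kx == ky == "any":
        return rx <= ry          # fewer restrictions = more general
    if kx == ky == "some":
        return ry <= rx          # more restrictions = stronger existential
    if kx == "any" and ky == "some":
        return rx <= ry          # covers the chain above; assumes the
                                 # restriction set is satisfiable
    return False

any_eleph   = ("any",  frozenset({"Elephant"}))
any_albino  = ("any",  frozenset({"Albino", "Elephant"}))
some_albino = ("some", frozenset({"Albino", "Elephant"}))
some_eleph  = ("some", frozenset({"Elephant"}))

# The chain, top to bottom: if x subsumes y, then P(x) => P(y).
print(subsumes(any_eleph, any_albino))    # True
print(subsumes(any_albino, some_albino))  # True
print(subsumes(some_albino, some_eleph))  # True
```

Restriction-set inclusion is only a first approximation of the real subsumption relation, but it reproduces the four-term chain on this slide.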
49
Example (Runs in SNePS 3)
  • Hungry(any x Elephant(x)
           ∧ Eats(x, any y Tall(y)
                     ∧ Grass(y)
                     ∧ On(y, Savanna)))
  • ⇒
  • Hungry(any u Albino(u)
           ∧ Elephant(u)
           ∧ Eats(u, any v Grass(v)
                     ∧ On(v, Savanna)))

50
Next Steps
  • Finish theory and implementation of arbitrary and
    indefinite objects.
  • Extend to other generalized quantifiers
  • Such as most, many, few, no, both, 3 of, …

51
For More Information
  • Shapiro: http://www.cse.buffalo.edu/~shapiro/
  • SNePS Research Group: http://www.cse.buffalo.edu/~sneps/