1
CS G120 Artificial Intelligence
  • Prof. C. Hafner
  • Class Notes Jan 22, 2009

2
Topics for today
  • Probability of success in Wumpus World
  • Finish discussion of propositional logic
  • Horn clauses
  • Example
  • Forward chaining
  • Backward chaining
  • First order logic
  • Unification
  • CNF for first order logic (if time permits)

3
Exploring a wumpus world
[Figure: 4x4 wumpus-world grid, corners labeled (1,1), (4,1), (1,4), (4,4); three unexplored frontier squares are labeled X, Y, Z]
Change the S (stench) in location (2,1) to a B (breeze) and calculate Prob(pit at Z)
4
Important formulas
  • P(A ∨ B) = P(A) + P(B) - P(A ∧ B)
  • P(A | B) = P(A ∧ B) / P(B)
  • P(A ∧ B) = P(A) × P(B)
    if A and B are independent events

5
Breeze in (1,2) = B1, Breeze in (2,1) = B2
Prob(B1) = Prob(PX ∨ PY) = 0.2 + 0.2 - 0.04 = 0.36
Prob(PX | B1) = Prob(PX ∧ B1) / Prob(B1) = 0.2 / 0.36 ≈ 0.555
Prob((PX ∧ PY) | B1) = Prob(PX ∧ PY ∧ B1) / Prob(B1) = 0.04 / 0.36 ≈ 0.111
6
Prob(B1 ∧ B2) = Prob(PY ∨ (PX ∧ PZ)) = Prob(PY) + Prob(PX ∧ PZ) - Prob(PX ∧ PY ∧ PZ)
  = 0.2 + 0.04 - 0.008 = 0.232
Prob(PZ | (B1 ∧ B2)) = Prob(PZ ∧ B1 ∧ B2) / Prob(B1 ∧ B2) = (0.2 × 0.36) / 0.232 ≈ 0.31
Therefore the agent should proceed to Z if Prob(winning) > 0.31
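
A quick numerical check of the two slides above (a minimal Python sketch, assuming each of the squares X, Y, Z independently contains a pit with probability 0.2):

    # Sanity check of the wumpus probability calculations above.
    p = 0.2                                  # prior probability of a pit per square

    p_b1 = p + p - p * p                     # Prob(B1) = Prob(PX v PY)           = 0.36
    p_px_given_b1 = p / p_b1                 # Prob(PX | B1)    = 0.2 / 0.36      ~ 0.555
    p_pxpy_given_b1 = (p * p) / p_b1         # Prob(PX^PY | B1) = 0.04 / 0.36     ~ 0.111

    p_b1b2 = p + p * p - p * p * p           # Prob(B1^B2) = Prob(PY v (PX^PZ))   = 0.232
    p_pz_given_b1b2 = (p * p_b1) / p_b1b2    # Prob(PZ | B1^B2) = 0.072 / 0.232   ~ 0.31

    print(p_b1, p_px_given_b1, p_pxpy_given_b1, p_b1b2, p_pz_given_b1b2)
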
7
Reasoning with Propositional Logic (cont.): Horn Clauses
  • ¬P1 ∨ ¬P2 ∨ ... ∨ ¬Pn ∨ Q
  • Same as P1 ∧ P2 ∧ ... ∧ Pn ⇒ Q

8
Example
  • KB: a database of assertions (sometimes called facts and rules)
  • fruit ⇒ edible
  • vegetable ⇒ edible
  • vegetable ∧ green ⇒ healthy
  • apple ⇒ fruit
  • banana ⇒ fruit
  • spinach ⇒ vegetable
  • spinach ⇒ green
  • edible ∧ healthy ⇒ recommended
  • apple (a fact)

9
Forward chaining: data-driven reasoning triggered by a new percept (fact)
  • Basis of forward chaining:
  • P ⇒ Q is an assertion in the KB
  • P is a new percept
  • ----------
  • Conclude Q
  • Two views of the KB:
  • All implications are made explicit, vs.
  • Reasoning on demand
  • Pure forward chaining suggests the former

10
Example
  • KB: fruit ⇒ edible
  • vegetable ⇒ edible
  • vegetable ∧ green ⇒ healthy
  • edible ∧ healthy ⇒ recommended
  • apple ⇒ fruit
  • banana ⇒ fruit
  • spinach ⇒ vegetable
  • spinach ⇒ green
  • apple
  • Let's make all the implications explicit
  • Now consider this KB without the fact apple, and consider a new percept spinach. What happens?

11
Forward chaining algorithm
  • forwardChain(KB, percept) returns updated KB
  •   new-knowledge ← {percept}
  •   while new-knowledge is not empty
  •     p ← pop(new-knowledge)
  •     add p to KB
  •     for each rule r in KB with p contained in LHS(r)
  •       if all elements of LHS(r) are in KB
  •         push(RHS(r), new-knowledge)
  •   return KB
  • ------------------------------------------------------------------
  • Requires an index of the rule set by LHS symbols (a Python sketch follows)
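
A minimal Python sketch of this procedure, assuming the KB is split into a set of fact symbols and a list of (lhs, rhs) rule pairs; it already includes the duplicate check introduced on the next slide. The names are illustrative only and are not the interface required by Assignment 3.

    from collections import defaultdict

    def forward_chain(facts, rules, percept):
        """Sketch: facts is a set of symbols, rules is a list of
        (lhs_set, rhs_symbol) pairs. Returns the updated fact set."""
        # Index the rules by the symbols appearing in their LHS.
        index = defaultdict(list)
        for lhs, rhs in rules:
            for symbol in lhs:
                index[symbol].append((lhs, rhs))

        new_knowledge = [percept]
        while new_knowledge:
            p = new_knowledge.pop()
            facts.add(p)
            for lhs, rhs in index[p]:
                # Fire a rule only when its whole LHS is known and its
                # conclusion is not already known or already queued.
                if lhs <= facts and rhs not in facts and rhs not in new_knowledge:
                    new_knowledge.append(rhs)
        return facts

    # The KB from the earlier example, triggered by the new percept "spinach":
    rules = [({"fruit"}, "edible"), ({"vegetable"}, "edible"),
             ({"vegetable", "green"}, "healthy"),
             ({"edible", "healthy"}, "recommended"),
             ({"apple"}, "fruit"), ({"banana"}, "fruit"),
             ({"spinach"}, "vegetable"), ({"spinach"}, "green")]
    print(forward_chain(set(), rules, "spinach"))
    # -> {'spinach', 'vegetable', 'green', 'edible', 'healthy', 'recommended'}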

12
Forward chaining (cont.)
  • To make a KB explicit: KB ← makeExplicit(KB)
  •   for each fact f in KB
  •     KB ← forwardChain(KB, f)
  • Note the possibility of an infinite loop; to avoid it, change the inner test of forwardChain to:
  •   for each rule r in KB with p contained in LHS(r)
  •     if all elements of LHS(r) are in KB and RHS(r) is not in KB
  •       push(RHS(r), new-knowledge)
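
Continuing the sketch above (same assumed data structures), makeExplicit can simply forward-chain on every fact already in the KB:

    def make_explicit(facts, rules):
        """Sketch: forward-chain on each existing fact so that every
        derivable conclusion is added to the KB explicitly."""
        for f in list(facts):
            facts = forward_chain(facts, rules, f)
        return facts

    print(make_explicit({"apple"}, rules))   # -> {'apple', 'fruit', 'edible'}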

13
Backward chaining: goal-driven reasoning triggered by a question being asked
  • KB: fruit ⇒ edible
  • vegetable ⇒ edible
  • edible ∧ green ⇒ healthy
  • apple ⇒ fruit
  • banana ⇒ fruit
  • spinach ⇒ vegetable
  • spinach ⇒ green
  • edible ∧ healthy ⇒ recommended
  • apple
  • Consider some queries:
  • ?apple   ?fruit   ?banana   ?edible   ?healthy

14
Sketch of Backward Chaining algorithm
  • backwardChain(KB, query) returns Boolean
  •   if query is in KB, return True
  •   for each rule r in KB such that RHS(r) = query
  •     testing ← True
  •     for each element e of LHS(r)
  •       if backwardChain(KB, e) = False
  •         testing ← False
  •         break
  •     if testing = True, return True
  •   return False
  • NOTE that backward chaining does not update the KB (a Python sketch follows)
  • DISCUSS NEGATION BY FAILURE (example: green comes out false if the KB contains only the fact apple)
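
A minimal Python sketch of this procedure, using the same (lhs, rhs) rule representation as the forward-chaining sketch; a goal that cannot be proved simply comes out False (negation by failure):

    def backward_chain(facts, rules, query):
        """Sketch: goal-driven proof of query from a set of fact symbols and
        (lhs_set, rhs_symbol) rules. The KB is never modified."""
        if query in facts:
            return True
        for lhs, rhs in rules:
            if rhs == query:
                # The goal succeeds if every element of the LHS can be proved.
                if all(backward_chain(facts, rules, e) for e in lhs):
                    return True
        return False   # unprovable goals fail: negation by failure

    # The KB from this slide:
    facts = {"apple"}
    rules = [({"fruit"}, "edible"), ({"vegetable"}, "edible"),
             ({"edible", "green"}, "healthy"),
             ({"apple"}, "fruit"), ({"banana"}, "fruit"),
             ({"spinach"}, "vegetable"), ({"spinach"}, "green"),
             ({"edible", "healthy"}, "recommended")]
    print(backward_chain(facts, rules, "edible"))   # True  (apple -> fruit -> edible)
    print(backward_chain(facts, rules, "healthy"))  # False (green cannot be proved)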

15
Assignment 3
  • Part I. Due date: January 29 (upload to Blackboard)
  • Write a Python program to implement the forward chaining mechanism we defined for propositional logic. The following functions should be implemented:
  • KB ← readKB(filename): returns a knowledge base object
  • KB ← makeExplicit(KB)
  • KB ← forwardChain(KB, fact)
  • You should define a class KB; objects of this class will be returned from these 3 functions. Other classes may be defined as needed.
  • The makeExplicit and forwardChain functions should also produce printed output of the assertions added to the KB.
  • The format of input for the readKB function is one assertion per line:
  • Rule format: p1 p2 q
  • Fact format: q

16
Assignment 3 (cont.)
  • Part 2. Due date: Feb 5
  • Create a model of the Boston subway (T) system
    using first order logic (FOL). Define some
    axioms that would support reasoning about how to
    get from one subway station to another. Be
    prepared to explain your model in class. You can
    find a map of the Boston subway system at the
    MBTA web site or
  • http://www.tuftslife.com/images/map_mbta.gif
  • More detailed information about this assignment
    will be posted on the Blackboard site in the next
    few days.

17
First Order Logic (FOL)
  • Why FOL?
  • Syntax and semantics of FOL
  • Using FOL
  • Wumpus world in FOL
  • Knowledge engineering in FOL

18
Pros and cons of propositional logic
  • + Propositional logic is declarative
  • + Propositional logic allows partial/disjunctive/negated information
    (unlike most data structures and databases)
    (Horn clauses are an intermediate form)
  • + Propositional logic is compositional:
    the meaning of B1,1 ∧ P1,2 is derived from the meanings of B1,1 and of P1,2
  • + Meaning in propositional logic is context-independent
    (unlike natural language, where meaning depends on context)
  • - Propositional logic has very limited expressive power
    (unlike natural language)
  • E.g., cannot say "pits cause breezes in adjacent squares"
    except by writing one sentence for each square

19
First-order logic
  • Whereas propositional logic assumes the world contains facts,
  • first-order logic (like natural language) assumes the world contains:
  • Objects: people, houses, numbers, colors, baseball games, wars, ...
  • Relations: red, round, prime, brother of, bigger than, part of, comes between, ...
  • Functions: father of, best friend, one more than, plus, ...

20
Syntax of FOL Basic elements
  • Constants: KingJohn, 2, NU, ...
  • Predicates: Brother, >, ...
  • Functions: Sqrt, LeftLegOf, ...
  • Variables: x, y, a, b, ...
  • Connectives: ¬, ∧, ∨, ⇒, ⇔
  • Equality: =
  • Quantifiers: ∀, ∃

21
Atomic sentences
  • Atomic sentence = predicate(term1,...,termn) or term1 = term2
  • Term = function(term1,...,termn) or constant or variable
  • E.g., Brother(KingJohn, RichardTheLionheart)
  • >(Length(LeftLegOf(Richard)), Length(LeftLegOf(KingJohn)))
  • Brother(Length(LeftLegOf(Richard)), Length(LeftLegOf(KingJohn)))

22
Complex sentences
  • Complex sentences are made from atomic sentences using connectives:
  • ¬S, S1 ∧ S2, S1 ∨ S2, S1 ⇒ S2, S1 ⇔ S2
  • E.g., Sibling(KingJohn,Richard) ⇒ Sibling(Richard,KingJohn)
  • >(1,2) ∨ ≤(1,2)
  • >(1,2) ∧ ¬>(1,2)

23
Truth in first-order logic
  • Sentences are true with respect to a model and an
    interpretation
  • Model contains objects (domain elements) and
    relations among them
  • Interpretation specifies referents for:
  • constant symbols → objects
  • predicate symbols → relations
  • function symbols → functional relations
  • An atomic sentence predicate(term1,...,termn) is
    true
  • iff the objects referred to by term1,...,termn
  • are in the relation referred to by predicate

24
Models for FOL: Example
25
Universal quantification
  • ∀<variables> <sentence>
  • Everyone at NU is smart:
  • ∀x At(x,NU) ⇒ Smart(x)
  • ∀x P is true in a model m iff P is true with x being each possible object in the model
  • Roughly speaking, equivalent to the conjunction of instantiations of P:
  • At(KingJohn,NU) ⇒ Smart(KingJohn)
  • ∧ At(Richard,NU) ⇒ Smart(Richard)
  • ∧ At(NU,NU) ⇒ Smart(NU)
  • ∧ ...

26
A common mistake to avoid
  • Typically, ⇒ is the main connective with ∀
  • Common mistake: using ∧ as the main connective with ∀:
  • ∀x At(x,NU) ∧ Smart(x)
  • means "Everyone is at NU and everyone is smart"

27
Existential quantification
  • ∃<variables> <sentence>
  • Someone at NU is smart:
  • ∃x At(x,NU) ∧ Smart(x)
  • ∃x P is true in a model m iff P is true with x being some possible object in the model
  • Roughly speaking, equivalent to the disjunction of instantiations of P:
  • At(KingJohn,NU) ∧ Smart(KingJohn)
  • ∨ At(Richard,NU) ∧ Smart(Richard)
  • ∨ At(NU,NU) ∧ Smart(NU)
  • ∨ ...

28
Another common mistake to avoid
  • Typically, ∧ is the main connective with ∃
  • Common mistake: using ⇒ as the main connective with ∃:
  • ∃x At(x,NU) ⇒ Smart(x)
  • is true if there is anyone who is not at NU!
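
A tiny finite model makes the two pitfalls on the last few slides concrete. The sets below are illustrative only; the object NU stands for the university itself, which is in the domain but is not "at NU" and is not smart:

    people = {"KingJohn", "Richard", "NU"}
    at_nu  = {"KingJohn", "Richard"}          # At(x, NU)
    smart  = {"KingJohn", "Richard"}          # Smart(x)

    def implies(p, q):
        return (not p) or q

    # Correct: "Everyone at NU is smart"  =  forall x  At(x,NU) => Smart(x)
    print(all(implies(x in at_nu, x in smart) for x in people))      # True

    # Mistake: forall x  At(x,NU) ^ Smart(x)  ("everyone is at NU and smart")
    print(all((x in at_nu) and (x in smart) for x in people))        # False (fails on NU)

    # Correct: "Someone at NU is smart"   =  exists x  At(x,NU) ^ Smart(x)
    print(any((x in at_nu) and (x in smart) for x in people))        # True

    # Mistake: exists x  At(x,NU) => Smart(x)  (true because NU is not at NU)
    print(any(implies(x in at_nu, x in smart) for x in people))      # True, vacuously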

29
Properties of quantifiers
  • ∀x ∀y is the same as ∀y ∀x
  • ∃x ∃y is the same as ∃y ∃x
  • ∃x ∀y is not the same as ∀y ∃x
  • ∃x ∀y Loves(x,y)
  • "There is a person who loves everyone in the world"
  • ∀y ∃x Loves(x,y)
  • "Everyone in the world is loved by at least one person"
  • Quantifier duality: each can be expressed using the other
  • ∀x Likes(x,IceCream)  ≡  ¬∃x ¬Likes(x,IceCream)
  • ∃x Likes(x,Broccoli)  ≡  ¬∀x ¬Likes(x,Broccoli)

30
Equality
  • term1 = term2 is true under a given interpretation if and only if term1 and term2 refer to the same object
  • E.g., definition of Sibling in terms of Parent:
  • ∀x,y Sibling(x,y) ⇔ [¬(x = y) ∧ ∃m,f ¬(m = f) ∧ Parent(m,x) ∧ Parent(f,x) ∧ Parent(m,y) ∧ Parent(f,y)]

31
Using FOL
  • The kinship domain:
  • Brothers are siblings
  • ∀x,y Brother(x,y) ⇒ Sibling(x,y)
  • One's mother is one's female parent
  • ∀m,c Mother(c) = m ⇔ (Female(m) ∧ Parent(m,c))
  • Sibling is symmetric
  • ∀x,y Sibling(x,y) ⇔ Sibling(y,x)

32
A model M for the kinship domain
  • Individuals: J, K, L, M, N, O, P, Q, R
  • Functions: mom/1, with mom(N) = M
  • Relations (name/arity):
  •   fem/1: {M, Q}
  •   par/2: {<M, N>, <N, R>, ...}
  •   sib/2: {<M, O>, <P, J>, <J, P>}
  • -------------------- Interpretation I --------------------
  • Constants: John, Mary, Sue, Tom, ...
  • I(Mary) = M, I(Sue) = Q, ...
  • Function symbol: Mother, with I(Mother) = mom
  • Relation symbols: Female, Parent, Sibling
  • I(Female) = fem, I(Parent) = par, I(Sibling) = sib

33
Wumpus world in FOL
  • First step: define constants, function symbols, and predicate symbols to express the facts
  • Percept(data, t) means that at step t the agent perceived data, where data is a 5-element vector:
  •   [Stench, Breeze, Glitter, Bump, Scream]
  • Ex: Percept([None, Breeze, None, None, None], 2)
  • At(Agent, s, t) means the agent is at square s at step t
  • Ex: At(Agent, [2,1], 2)

34
Some Wumpus axioms
  • Axiom for interpreting percepts in context:
  • ∀x,t At(Agent, x, t) ∧ Breeze(t) ⇒ Breezy(x)
  • Definitional axiom:
  • ∀a,b,c,d,t Percept([a, Breeze, b, c, d], t) ⇒ Breeze(t)
  • Diagnostic axiom:
  • ∀x Breezy(x) ⇒ ∃z Adjacent(z, x) ∧ Pit(z)
  • Causal axiom:
  • ∀z Pit(z) ⇒ (∀x Adjacent(z, x) ⇒ Breezy(x))
  • World model axioms: Adjacent([1,1], [2,1]), etc.
  • ∀x,y Adjacent(x, y) ⇔ Adjacent(y, x)

35
Interacting with FOL KBs
  • Suppose a wumpus-world agent is using an FOL KB and perceives a smell and a breeze (but no glitter) at t = 5:
  • Tell(KB, Percept([Smell, Breeze, None], 5))
  • Ask(KB, ∃a BestAction(a, 5))
  • I.e., does the KB entail some best action at t = 5?
  • Answer: Yes, {a/Shoot}  ←  a substitution (binding list)
  • Given a sentence S and a substitution σ,
  • Sσ denotes the result of plugging σ into S; e.g.,
  • S = Smarter(x,y)
  • σ = {x/Hillary, y/Bill}
  • Sσ = Smarter(Hillary,Bill)
  • Ask(KB,S) returns some/all σ such that KB ⊨ Sσ

36
Knowledge engineering in FOL
  • Identify the task
  • Assemble the relevant knowledge
  • Decide on a vocabulary of predicates, functions, and constants
  • Encode general knowledge about the domain
  • Encode a description of the specific problem instance
  • Pose queries to the inference procedure and get answers
  • Debug the knowledge base

37
Inference in FOL
  • Rules of Inference
  • Unification
  • Generalized Modus Ponens

38
Universal instantiation (UI)
  • Every instantiation of a universally quantified sentence is entailed by it:
  •   ∀v α
  •   -----------------
  •   Subst({v/g}, α)
  • for any variable v and ground term g
  • E.g., ∀x King(x) ∧ Greedy(x) ⇒ Evil(x) yields:
  • King(John) ∧ Greedy(John) ⇒ Evil(John)
  • King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
  • King(Father(John)) ∧ Greedy(Father(John)) ⇒ Evil(Father(John))
  • ...

39
Existential instantiation (EI)
  • For any sentence α, variable v, and constant symbol k that does not appear elsewhere in the knowledge base:
  •   ∃v α
  •   -----------------
  •   Subst({v/k}, α)
  • E.g., ∃x Crown(x) ∧ OnHead(x,John) yields:
  • Crown(C1) ∧ OnHead(C1,John)
  • provided C1 is a new constant symbol, called a Skolem constant

40
Reduction to propositional inference
  • Suppose the KB contains just the following:
  • ∀x King(x) ∧ Greedy(x) ⇒ Evil(x)
  • King(John)
  • Greedy(John)
  • Brother(Richard,John)
  • Instantiating the universal sentence in all possible ways, we have:
  • King(John) ∧ Greedy(John) ⇒ Evil(John)
  • King(Richard) ∧ Greedy(Richard) ⇒ Evil(Richard)
  • King(John)
  • Greedy(John)
  • Brother(Richard,John)
  • The new KB is propositionalized: proposition symbols are
  • King(John), Greedy(John), Evil(John), King(Richard), etc.

41
Unification
  • We can get the inference immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y)
  • θ = {x/John, y/John} works
  • Unify(α, β) = θ if αθ = βθ
  •   p               q                    θ
  •   Knows(John,x)   Knows(John,Jane)
  •   Knows(John,x)   Knows(y,OJ)
  •   Knows(John,x)   Knows(y,Mother(y))
  •   Knows(John,x)   Knows(x,OJ)
  • Standardizing apart eliminates overlap of variables, e.g., Knows(z17,OJ)

42
Unification
  • We can get the inference immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y)
  • θ = {x/John, y/John} works
  • Unify(α, β) = θ if αθ = βθ
  •   p               q                    θ
  •   Knows(John,x)   Knows(John,Jane)     {x/Jane}
  •   Knows(John,x)   Knows(y,OJ)
  •   Knows(John,x)   Knows(y,Mother(y))
  •   Knows(John,x)   Knows(x,OJ)

43
Unification
  • We can get the inference immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y)
  • θ = {x/John, y/John} works
  • Unify(α, β) = θ if αθ = βθ
  •   p               q                    θ
  •   Knows(John,x)   Knows(John,Jane)     {x/Jane}
  •   Knows(John,x)   Knows(y,OJ)          {x/OJ, y/John}
  •   Knows(John,x)   Knows(y,Mother(y))
  •   Knows(John,x)   Knows(x,OJ)

44
Unification
  • We can get the inference immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y)
  • θ = {x/John, y/John} works
  • Unify(α, β) = θ if αθ = βθ
  •   p               q                    θ
  •   Knows(John,x)   Knows(John,Jane)     {x/Jane}
  •   Knows(John,x)   Knows(y,OJ)          {x/OJ, y/John}
  •   Knows(John,x)   Knows(y,Mother(y))   {y/John, x/Mother(John)}
  •   Knows(John,x)   Knows(x,OJ)

45
Unification
  • We can get the inference immediately if we can find a substitution θ such that King(x) and Greedy(x) match King(John) and Greedy(y)
  • θ = {x/John, y/John} works
  • Unify(α, β) = θ if αθ = βθ
  •   p               q                    θ
  •   Knows(John,x)   Knows(John,Jane)     {x/Jane}
  •   Knows(John,x)   Knows(y,OJ)          {x/OJ, y/John}
  •   Knows(John,x)   Knows(y,Mother(y))   {y/John, x/Mother(John)}
  •   Knows(John,x)   Knows(x,OJ)          fail
  • Standardizing apart eliminates overlap of variables, e.g., Knows(z17,OJ)

46
Unification
  • To unify Knows(John,x) and Knows(y,z):
  • θ = {y/John, x/z}  or  θ = {y/John, x/John, z/John}
  • The first unifier is more general than the second.
  • There is a single most general unifier (MGU) that is unique up to renaming of variables:
  • MGU = {y/John, x/z}

47
The unification algorithm
48
The unification algorithm
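
The UNIFY procedure itself appeared as a figure on these two slides and did not survive the transcript. Below is a minimal Python sketch of recursive unification; the term encoding (tuples for compound terms, lowercase strings for variables) is an assumption of this sketch, and the occurs check is omitted for brevity:

    def is_var(t):
        # Convention (assumed here): variables are lowercase strings, e.g. 'x', 'y'.
        return isinstance(t, str) and t[0].islower()

    def unify(x, y, theta=None):
        """Return a substitution (dict) that unifies x and y, or None on failure.
        Terms are strings (constants/variables) or tuples (functor, arg1, ...)."""
        if theta is None:
            theta = {}
        if x == y:
            return theta
        if is_var(x):
            return unify_var(x, y, theta)
        if is_var(y):
            return unify_var(y, x, theta)
        if isinstance(x, tuple) and isinstance(y, tuple) and len(x) == len(y):
            for xi, yi in zip(x, y):
                theta = unify(xi, yi, theta)
                if theta is None:
                    return None
            return theta
        return None

    def unify_var(var, t, theta):
        if var in theta:
            return unify(theta[var], t, theta)
        if is_var(t) and t in theta:
            return unify(var, theta[t], theta)
        return {**theta, var: t}              # occurs check omitted

    print(unify(('Knows', 'John', 'x'), ('Knows', 'John', 'Jane')))  # {'x': 'Jane'}
    print(unify(('Knows', 'John', 'x'), ('Knows', 'y', 'OJ')))       # {'y': 'John', 'x': 'OJ'}
    print(unify(('Knows', 'John', 'x'), ('Knows', 'x', 'OJ')))       # None (x cannot be both John and OJ)
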
49
Generalized Modus Ponens (GMP)
  •   p1', p2', ..., pn',  (p1 ∧ p2 ∧ ... ∧ pn ⇒ q)
  •   ----------------------------------------------   where pi'θ = piθ for all i
  •   qθ
  • p1' is King(John)          p1 is King(x)
  • p2' is Greedy(y)           p2 is Greedy(x)
  • θ is {x/John, y/John}      q is Evil(x)
  • qθ is Evil(John)
  • GMP is used with a KB of definite clauses (exactly one positive literal)
  • All variables are assumed universally quantified
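
Continuing the unify() sketch above, a single GMP step on this example can be carried out as follows; subst() is a small helper introduced here (not something defined on the slides):

    # One GMP step for the King/Greedy example above.
    p  = [('King', 'x'), ('Greedy', 'x')]          # p1,  p2
    pp = [('King', 'John'), ('Greedy', 'y')]       # p1', p2'
    q  = ('Evil', 'x')

    def subst(theta, t):
        # Apply a substitution to a term, following chains of variable bindings.
        if isinstance(t, tuple):
            return tuple(subst(theta, a) for a in t)
        while is_var(t) and t in theta:
            t = theta[t]
        return t

    theta = {}
    for pi, pi_prime in zip(p, pp):
        theta = unify(pi, pi_prime, theta)         # accumulate bindings

    print(theta)             # {'x': 'John', 'y': 'John'}
    print(subst(theta, q))   # ('Evil', 'John')
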
50
Soundness of GMP
  • Need to show that
  •   p1', ..., pn', (p1 ∧ ... ∧ pn ⇒ q)  ⊨  qθ
  • provided that pi'θ = piθ for all i
  • Lemma: for any sentence p, we have p ⊨ pθ by UI
  • 1. (p1 ∧ ... ∧ pn ⇒ q)  ⊨  (p1 ∧ ... ∧ pn ⇒ q)θ  =  (p1θ ∧ ... ∧ pnθ ⇒ qθ)
  • 2. p1', ..., pn'  ⊨  p1' ∧ ... ∧ pn'  ⊨  p1'θ ∧ ... ∧ pn'θ
  • From 1 and 2, qθ follows by ordinary Modus Ponens (since pi'θ = piθ)

51
Example knowledge base
  • The law says that it is a crime for an American
    to sell weapons to hostile nations. The country
    Nono, an enemy of America, has some missiles, and
    all of its missiles were sold to it by Colonel
    West, who is American.
  • Prove that Col. West is a criminal

52
Example knowledge base contd.
  • ... it is a crime for an American to sell weapons to hostile nations:
  • American(x) ∧ Weapon(y) ∧ Sells(x,y,z) ∧ Hostile(z) ⇒ Criminal(x)
  • Nono has some missiles, i.e., ∃x Owns(Nono,x) ∧ Missile(x):
  • Owns(Nono,M1) and Missile(M1)
  • ... all of its missiles were sold to it by Colonel West:
  • Missile(x) ∧ Owns(Nono,x) ⇒ Sells(West,x,Nono)
  • Missiles are weapons:
  • Missile(x) ⇒ Weapon(x)
  • An enemy of America counts as "hostile":
  • Enemy(x,America) ⇒ Hostile(x)
  • West, who is American:
  • American(West)
  • The country Nono, an enemy of America:
  • Enemy(Nono,America)
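
As a wrap-up, here is a hypothetical sketch of first-order forward chaining over this knowledge base, reusing unify() and subst() from the earlier sketches (the term encoding and all function names are assumptions of those sketches, not anything defined in the lecture). It derives Criminal(West):

    # Definite clauses: (list of premises, conclusion); variables are lowercase.
    crime_rules = [
        ([('American', 'x'), ('Weapon', 'y'), ('Sells', 'x', 'y', 'z'), ('Hostile', 'z')],
         ('Criminal', 'x')),
        ([('Missile', 'x'), ('Owns', 'Nono', 'x')], ('Sells', 'West', 'x', 'Nono')),
        ([('Missile', 'x')], ('Weapon', 'x')),
        ([('Enemy', 'x', 'America')], ('Hostile', 'x')),
    ]
    crime_facts = {('Owns', 'Nono', 'M1'), ('Missile', 'M1'),
                   ('American', 'West'), ('Enemy', 'Nono', 'America')}

    def match_premises(premises, facts, theta):
        # Yield every substitution that matches all premises against known facts.
        if not premises:
            yield theta
            return
        first, rest = premises[0], premises[1:]
        for fact in facts:
            t = unify(subst(theta, first), fact, dict(theta))
            if t is not None:
                yield from match_premises(rest, facts, t)

    def fol_forward_chain(facts, rules):
        while True:
            new = set()
            for lhs, rhs in rules:
                for theta in match_premises(lhs, facts, {}):
                    conclusion = subst(theta, rhs)
                    if conclusion not in facts:
                        new.add(conclusion)
            if not new:                  # fixed point: nothing new can be derived
                return facts
            facts |= new

    print(('Criminal', 'West') in fol_forward_chain(set(crime_facts), crime_rules))  # True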