1
EXPERT SYSTEMS AND KNOWLEDGE REPRESENTATION
Ivan Bratko
Faculty of Computer and Info. Sc.
University of Ljubljana
2
EXPERT SYSTEMS
  • An expert system behaves like an expert in some
    narrow area of expertise
  • Typical examples:
  • Diagnosis in an area of medicine
  • Adjusting the split between economy class and
    business class seats for future flights
  • Diagnosing paper problems in rotary printing
  • A very fashionable area of AI in the period 1985-95

3
SOME FAMOUS EXPERT SYSTEMS
  • MYCIN (Shortliffe, Feigenbaum, late seventies)
  • Diagnosis of infections
  • PROSPECTOR (Buchanan et al., 1980)
  • Ore and oil exploration
  • INTERNIST (Pople, mid-eighties)
  • Internal medicine

4
Structure of expert systems
5
FEATURES OF EXPERT SYSTEMS
  • Problem solving in the area of expertise
  • Interaction with user during and after problem
    solving
  • Relying heavily on domain knowledge
  • Explanation ability: can explain its results to the user

6
REPRESENTING KNOWLEDGE WITH IF-THEN RULES
  • Traditionally the most popular form of knowledge
    representation in expert systems
  • Examples of rules
  • if condition P then conclusion C
  • if situation S then action A
  • if conditions C1 and C2 hold then condition C
    does not hold

7
DESIRABLE FEATURES OF RULES
  • Modularity: each rule defines a relatively
    independent piece of knowledge
  • Incrementability: new rules can be added relatively
    independently of other rules
  • Support for explanation
  • Can represent uncertainty

8
TYPICAL TYPES OF EXPLANATION
  • How explanation
  • Answers the user's questions of the form: How did you
    reach this conclusion?
  • Why explanation
  • Answers the user's questions: Why do you need this
    information?

9
RULES CAN ALSO HANDLE UNCERTAINTY
  • If condition A then conclusion B with certainty F

10
EXAMPLE RULE FROM MYCIN
  • if
  • 1) the infection is primary bacteremia, and
  • 2) the site of the culture is one of the sterile
    sites, and
  • 3) the suspected portal of entry of the organism
    is the gastrointestinal tract
  • then
  • there is suggestive evidence (0.7) that the
    identity of the organism is bacteroides.

11
EXAMPLE RULES FROM AL/X
  • Diagnosing equipment failure on oil platforms
    (Reiter 1980)
  • if
  • the pressure in V-01 reached relief valve
    lift pressure
  • then
  • the relief valve on V-01 has lifted
    N = 0.005, S = 400
  • if
  • NOT the pressure in V-01 reached relief
    valve lift pressure,
  • and the relief valve on V-01 has lifted
  • then
  • the V-01 relief valve opened early (the set
    pressure has drifted)
  • N = 0.001, S = 2000

12
EXAMPLE RULE FROM AL3 (Game playing, Bratko 1982)
  • if
  • 1 there is a hypothesis, H, that a plan P
    succeeds, and
  • 2 there are two hypotheses,
  • H1, that a plan R1 refutes plan P, and
  • H2, that a plan R2 refutes plan P, and
  • 3 there are facts H1 is false, and H2 is
    false
  • then
  • 1 generate the hypothesis, H3, that the combined
    plan 'R1 or R2' refutes plan P, and
  • 2 generate the fact H3 implies not(H)

13
A TOY KNOWLEDGE BASE: WATER IN FLAT
[Figure: flat floor plan]
[Figure: inference network]
14
IMPLEMENTING THE LEAKS EXPERT SYSTEM
  • Possible directly as Prolog rules (a fuller sketch
    follows after this list):

        leak_in_bathroom :-
            hall_wet,
            kitchen_dry.

  • Employ the Prolog interpreter as the inference engine
  • Deficiencies of this approach:
  • Limited syntax
  • Limited inference (Prolog's backward chaining)
  • Limited explanation
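A fuller direct encoding might look as follows. This is a minimal sketch: only the leak_in_bathroom rule appears on the slide, and the remaining proposition names (problem_in_kitchen, no_water_from_outside, leak_in_kitchen, no_rain) are a plausible reconstruction of the inference network, not taken verbatim from it.

    % Hypothetical direct Prolog encoding of the toy water-in-flat knowledge base.
    % Observable propositions are declared dynamic so that missing observations
    % simply fail instead of raising existence errors (SWI-Prolog).
    :- dynamic hall_wet/0, kitchen_dry/0, bathroom_dry/0,
               window_closed/0, no_rain/0.

    leak_in_bathroom :-              % water in the hall but the kitchen is dry
        hall_wet,
        kitchen_dry.

    problem_in_kitchen :-            % water in the hall but the bathroom is dry
        hall_wet,
        bathroom_dry.

    no_water_from_outside :-         % either observation suffices
        window_closed
        ;
        no_rain.

    leak_in_kitchen :-
        problem_in_kitchen,
        no_water_from_outside.

    % Observations for one particular consultation
    hall_wet.
    bathroom_dry.
    window_closed.

With these clauses loaded, the query ?- leak_in_kitchen. succeeds by ordinary Prolog backward chaining.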

15
TAILOR THE SYNTAX WITH OPERATORS
  • Rules
  • if
  • hall_wet and kitchen_dry
  • then
  • leak_in_bathroom.
  • Facts
  • fact( hall_wet).
  • fact( bathroom_dry).
  • fact( window_closed).
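For this rule and fact syntax to be legal Prolog, if, then, and, and or must be declared as operators. The precedence values below are one workable choice, consistent with the interpreters on the next slides; the exact numbers are an assumption, not given on the slide.

    % Operator declarations so that
    %   if hall_wet and kitchen_dry then leak_in_bathroom.
    % reads as a valid Prolog term (an if/1 fact in the database).
    :- op( 800, fx, if).
    :- op( 700, xfx, then).
    :- op( 300, xfy, or).
    :- op( 200, xfy, and).

With these declarations, the rules load as if/1 facts and the observations as fact/1 facts, which is what the rule interpreters below query.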

16
BACKWARD CHAINING RULE INTERPRETER
    is_true( P) :-
        fact( P).

    is_true( P) :-
        if Condition then P,          % A relevant rule
        is_true( Condition).          % whose condition is true

    is_true( P1 and P2) :-
        is_true( P1),
        is_true( P2).

    is_true( P1 or P2) :-
        is_true( P1)
        ;
        is_true( P2).
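As a usage sketch (assuming the operator declarations above, the facts from the previous slide, and the reconstructed toy rule base written in the if-then syntax), a consultation is simply a query to is_true/1:

    % Hypothetical session; the proposition name comes from the reconstruction above.
    ?- is_true( leak_in_kitchen).
    true.

The interpreter chains backward from the queried goal through matching if-then rules down to stored fact/1 facts.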

17
A FORWARD CHAINING RULE INTERPRETER
    forward :-
        new_derived_fact( P),                % A new fact
        !,
        write( 'Derived: '), write( P), nl,
        assert( fact( P)),
        forward                              % Continue
        ;
        write( 'No more facts').             % All facts derived

18
FORWARD CHAINING INTERPRETER, CTD.
    new_derived_fact( Concl) :-
        if Cond then Concl,              % A rule
        not fact( Concl),                % Rule's conclusion not yet a fact
        composed_fact( Cond).            % Condition true?

    composed_fact( Cond) :-
        fact( Cond).                     % Simple fact

    composed_fact( Cond1 and Cond2) :-
        composed_fact( Cond1),
        composed_fact( Cond2).           % Both conjuncts true

    composed_fact( Cond1 or Cond2) :-
        composed_fact( Cond1)
        ;
        composed_fact( Cond2).
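A usage sketch for the forward chainer; the output shown is what one would expect with the reconstructed toy rule base and the facts hall_wet, bathroom_dry, window_closed, so treat it as illustrative. Note that in SWI-Prolog fact/1 must be declared dynamic (e.g. :- dynamic fact/1.) for assert/1 to extend it.

    ?- forward.
    Derived: problem_in_kitchen
    Derived: no_water_from_outside
    Derived: leak_in_kitchen
    No more facts
    true.

Each cycle scans the rules for one conclusion that is not yet a fact but whose condition already holds, asserts it, and repeats until nothing new can be derived.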

19
FORWARD VS. BACKWARD CHAINING
  • Inference chains connect various types of info.
  • data ... goals
  • evidence ... hypotheses
  • findings, observations ...
    explanations, diagnoses
  • manifestations ... diagnoses, causes
  • Backward chaining: goal driven
  • Forward chaining: data driven

20
FORWARD VS. BACKWARD CHAINING
  • Which is better?
  • Checking a given hypothesis: backward chaining is more
    natural
  • Many possible hypotheses: forward chaining is more
    natural (e.g. monitoring: data-driven)
  • Expert reasoning is often a combination of both,
    e.g. medical diagnosis
  • Water in flat: observe hall_wet, infer forward to
    leak_in_bathroom, check backward that kitchen_dry

21
GENERATING EXPLANATION
  • How explanation: How did you find this answer?
  • Explanation = proof tree of the final conclusion
  • E.g. System: There is a leak in the kitchen.
  • User: How did you get this?

22
    % is_true( P, Proof):  Proof is a proof that P is true

    :- op( 800, xfx, <=).

    is_true( P, P) :-
        fact( P).

    is_true( P, P <= CondProof) :-
        if Cond then P,
        is_true( Cond, CondProof).

    is_true( P1 and P2, Proof1 and Proof2) :-
        is_true( P1, Proof1),
        is_true( P2, Proof2).

    is_true( P1 or P2, Proof) :-
        is_true( P1, Proof)
        ;
        is_true( P2, Proof).
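A hypothetical query (proposition names as in the reconstructed toy rule base) then returns the conclusion together with its proof tree:

    ?- is_true( leak_in_kitchen, Proof).
    % Illustrative answer, shown with extra parentheses for readability:
    % Proof = leak_in_kitchen <=
    %             (problem_in_kitchen <= hall_wet and bathroom_dry) and
    %             (no_water_from_outside <= window_closed)

The How explanation is produced by displaying this proof term to the user.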

23
WHY EXPLANATION
  • Exploring a leak in the kitchen, the system asks:
  • Was there rain?
  • User (not sure) asks:
  • Why do you need this?
  • System:
  • To explore whether water came from outside
  • Why explanation: the chain of rules between the current
    goal and the original (main) goal
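A minimal sketch (not from the slides) of how such a dialogue can be produced: the interpreter carries Trace, the chain of rules between the current goal and the main goal, asks the user about propositions declared askable, and prints that chain when the user answers why. All predicate names here (explore/2, askable/1, ask_user/2, show_trace/1) are illustrative.

    % explore( P, Trace): P can be established; Trace is the chain of rules
    % between P and the original question (most recent rule first).
    % The knowledge base declares which propositions may be asked, e.g.
    %   askable( there_was_rain).
    :- dynamic fact/1.

    explore( P, _) :-
        fact( P).
    explore( P, Trace) :-
        if Cond then P,                              % a relevant rule
        explore( Cond, [if Cond then P | Trace]).    % extend the chain
    explore( P1 and P2, Trace) :-
        explore( P1, Trace),
        explore( P2, Trace).
    explore( P1 or P2, Trace) :-
        explore( P1, Trace)
        ;
        explore( P2, Trace).
    explore( P, Trace) :-
        askable( P),                  % P may be asked of the user
        \+ fact( P),
        ask_user( P, Trace).

    ask_user( P, Trace) :-
        write( P), write( '? (yes/no/why) '),
        read( Answer),                % the answer is a term ended by a full stop
        respond( Answer, P, Trace).

    respond( yes, P, _) :-
        assert( fact( P)).            % remember the user's answer
    respond( no, _, _) :-
        fail.                         % the proposition is not confirmed
    respond( why, P, Trace) :-
        show_trace( Trace),           % print the chain of rules
        ask_user( P, Trace).          % then repeat the question

    show_trace( []) :-
        write( 'This was your original question.'), nl.
    show_trace( [Rule | Rules]) :-
        write( 'To explore whether '), write( Rule), nl,
        show_trace( Rules).

    % A consultation about Goal starts with:  ?- explore( Goal, []).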

24
UNCERTAINTY
  • Propositions are qualified with:
  • certainty factors, degrees of belief,
    subjective probabilities, ...
  • Proposition : CertaintyFactor
  • if Condition then Conclusion : Certainty

25
A SIMPLE SCHEME FOR HANDLING UNCERTAINTY
  • c( P1 and P2) = min( c(P1), c(P2))
  • c( P1 or P2) = max( c(P1), c(P2))
  • Rule "if P1 then P2 : C" implies
    c(P2) = c(P1) * C
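A minimal Prolog sketch of this scheme (not from the slides): facts and rules carry a certainty factor, stored here as fact/2 and rule/2 clauses, and certainty/2 propagates them using the min/max/product combination above. The representation, predicate names, and example numbers are assumptions made for illustration.

    % Simple certainty propagation (illustrative only).
    % Facts:  fact( Proposition, Certainty).
    % Rules:  rule( if Condition then Conclusion, Certainty).
    :- op( 800, fx, if).
    :- op( 700, xfx, then).
    :- op( 300, xfy, or).
    :- op( 200, xfy, and).

    fact( hall_wet, 0.9).
    fact( bathroom_dry, 0.8).
    rule( if hall_wet and bathroom_dry then problem_in_kitchen, 0.7).

    certainty( P, C) :-
        fact( P, C).
    certainty( P1 and P2, C) :-
        certainty( P1, C1),
        certainty( P2, C2),
        C is min( C1, C2).           % c(P1 and P2) = min( c(P1), c(P2))
    certainty( P1 or P2, C) :-
        certainty( P1, C1),
        certainty( P2, C2),
        C is max( C1, C2).           % c(P1 or P2) = max( c(P1), c(P2))
    certainty( P, C) :-
        rule( if Cond then P, RuleC),
        certainty( Cond, CondC),
        C is CondC * RuleC.          % c(P2) = c(P1) * C

    % Example:  ?- certainty( problem_in_kitchen, C).
    %           C is about 0.56, i.e. min(0.9, 0.8) * 0.7.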

26
DIFFICULTIES IN HANDLING UNCERTAINTY
  • In our simple scheme, for example:
  • Let c(a) = 0.5, c(b) = 0; then c( a or b) = 0.5
  • Now, suppose that c(b) increases to 0.5
  • Still c( a or b) = 0.5
  • This is counter-intuitive!
  • Proper probability calculus would fix this
  • Problem with probability calculus: needs
    conditional probabilities - too many?!?
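For instance, treating the certainties as probabilities of independent events (an independence assumption made here only for illustration), probability calculus gives p(a or b) = p(a) + p(b) - p(a)p(b) = 0.5 + 0.5 - 0.25 = 0.75, so the combined belief correctly rises as c(b) rises from 0 to 0.5.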

27
TWO SCHOOLS RE. UNCERTAINTY
  • School 1: Probabilities are impractical! Humans don't
    think in terms of probability calculus anyway!
  • School 2: Probabilities are mathematically sound
    and are the right way!
  • Historical development of this debate: School 2
    eventually prevailed - Bayesian networks; see e.g.
    Russell & Norvig 2003 (AI: A Modern Approach),
    Bratko 2001 (Prolog Programming for AI)