Bayesian Networks and Markov Models: User Modeling and Natural Language Processing

Transcript and Presenter's Notes

1
Bayesian Networks and Markov Models:
User Modeling and Natural Language Processing
  • Bayesian networks and Markov models
  • Applications in User Modeling and Natural
    Language Processing

2
BNs' Applications in User Modeling and Natural
Language Processing
  • Discourse planning
  • Plan recognition
  • Dialogue

3
Bayesian Networks in Discourse Planning
  • Explaining BN reasoning
  • Scenario-based and qualitative reasoning
  • Combining BNs with User Modeling

4
Explaining BN Reasoning
  • Probabilistic reasoning is sound
  • ⇒ Expert systems that perform probabilistic reasoning yield correct results
  • But people are not Bayesian (Kahneman and Tversky, 1982)
  • Challenge: explain Bayesian reasoning

5
Bayesian Networks and Discourse Planning
  • Explaining BN reasoning
  • Scenario-based and qualitative reasoning
  • Combining BNs with User Modeling

6
Druzdzel and Henrion (1993)
  • Scenario-based explanations of reasoning in BNs
  • Justified by psychological findings
  • Identify nodes in the BN that are relevant to the
    hypothesis of interest
  • Qualitative probabilistic networks translate
    quantitative and qualitative information in a BN
    into linguistic expressions

7
Example Scenario-based Explanations
  • 2^7 scenarios (one per truth assignment to the seven binary variables), e.g.,
  • cold, sneezing, no-allergy, no-cat, paw-marks,
    dog, barking
  • no-cold, sneezing, allergy, cat, paw-marks, dog,
    barking

8
Generating Scenario-based Explanations
  • The probability of each scenario is computed by multiplying the probabilities of its component events
  • Scenarios are divided into two groups:
  • compatible with the hypothesis
  • incompatible with the hypothesis
  • Explanations are based on the most probable scenarios from these groups that collectively account for X% of the probability of the hypothesis (a minimal selection sketch follows this list)
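
A minimal Python sketch of this selection step, assuming each scenario's probability has already been computed as described above; the dict input format and the coverage parameter (standing in for X) are illustrative, not Druzdzel and Henrion's implementation:

    def explain(scenarios, hypothesis, coverage=0.8):
        # scenarios: {tuple of events: probability}; coverage plays
        # the role of X, the fraction of probability mass to cover.
        pro = {s: p for s, p in scenarios.items() if hypothesis in s}
        con = {s: p for s, p in scenarios.items() if hypothesis not in s}

        def top(group):
            # Most probable scenarios that jointly account for
            # the required fraction of the group's probability.
            total, mass, picked = sum(group.values()), 0.0, []
            for s, p in sorted(group.items(), key=lambda kv: -kv[1]):
                picked.append(s)
                mass += p
                if mass >= coverage * total:
                    break
            return picked

        return top(pro), top(con)  # compatible, incompatible

    # Toy usage with the sneezing example:
    scenarios = {("cold", "sneezing", "no-allergy"): 0.20,
                 ("no-cold", "sneezing", "allergy"): 0.25,
                 ("cold", "sneezing", "allergy"): 0.05}
    print(explain(scenarios, "cold"))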

9
Example
  • Why cold, given sneezing, paw marks and barking?
  • Therefore cold is almost as likely as not (p = 0.43)

10
Qualitative Probabilistic Networks (QPNs)
  • Draw inferences qualitatively
  • The relation between two adjacent nodes is denoted as positive (+), negative (-), null (0) or unknown (?); a sign-algebra sketch follows this list
  • Advantage: simplifies the construction of models
  • Disadvantage: lack of precision in the results
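
A small Python sketch of the standard QPN sign algebra (the Wellman-style sign product and sign sum, not necessarily the exact formulation used here): the sign product chains influences along a path, the sign sum combines parallel paths.

    from functools import reduce

    def sign_prod(a, b):
        # Chain two influences along a path: '+' x '-' = '-', etc.
        if a == '0' or b == '0':
            return '0'
        if a == '?' or b == '?':
            return '?'
        return '+' if a == b else '-'

    def sign_sum(a, b):
        # Combine parallel paths; conflicting signs are unknown.
        if a == '0':
            return b
        if b == '0':
            return a
        if a == '?' or b == '?' or a != b:
            return '?'
        return a

    # The slide-13 chain: greasy block supports oil leak (+), which
    # explains away excessive oil consumption (-), which supports
    # worn piston rings (+); the net influence is negative:
    print(reduce(sign_prod, ['+', '-', '+']))  # '-'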

11
Example
12
Generating Explanations from QPNs
  • Linguistic patterns of inference:
  • Predictive (causal) inference: A can cause B.
  • Diagnostic inference: B is evidence of A.
  • Explaining away: A and B can each cause C. A explains C, so it is evidence against B.

13
Example
Qualitative influence of greasy engine block on worn piston rings: Greasy engine block is evidence of oil leak. Oil leak and excessive oil consumption can each cause low oil level. Oil leak explains low oil level, and so is evidence against excessive oil consumption. Decreased likelihood of excessive oil consumption is evidence against worn piston rings. Therefore, greasy engine block is evidence against worn piston rings.
14
Druzdzel and Henrion Contributions
  • Two types of explanations of reasoning in BNs
  • Scenario based
  • Qualitative probabilistic networks
  • Linguistic patterns of inference

15
Bayesian Networks and Discourse Planning
  • Explaining BN reasoning
  • Scenario-based and qualitative reasoning
  • Combining BNs with User Modeling

16
Zukerman et al. (1998)
  • NAG (Nice Argument Generator): a Bayesian argumentation system that generates nice arguments
  • Nice arguments combine:
  • normative justification of a conclusion
  • persuasiveness
  • Goal: given a goal proposition and target belief ranges, generate an argument for the goal proposition

17
Basic Process: Generation-Analysis Cycle
18
Two Models, Two Views
[Diagram: the normative model and the user model each have two views: a Semantic Net, used for attention, and a BN, used for reasoning]
19
Output of the System
  • Argument Graph: the structural intersection of the user model BN and the normative model BN
  • Output: an Argument Graph that achieves a belief in the goal proposition within the target ranges

20
Generation-Analysis Algorithm
  • Input: goal, initial context, target belief ranges
  1. Use attentional focus to expand the Argument Graph and determine subgoals for investigation
  2. Perform abduction on the subgoals to further expand the Argument Graph
  3. Analyze the Argument Graph by propagating beliefs in the user and normative BNs
  4. If the argument is nice (a concrete niceness test is sketched after this list):
     • Try to simplify it
     • Present it
  5. Else expand the context and go to Step 1
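
To make Step 4 concrete, a minimal sketch of a niceness test, assuming "nice" means the goal belief lies within the target range in both the normative model and the user model; the function name and range values (which echo the asteroid example later in the talk) are illustrative:

    def nice(normative_belief, user_belief, normative_range, user_range):
        # An argument is nice when it is both normatively justified
        # and persuasive, i.e., the goal belief falls inside the
        # target interval in BOTH models.
        (nlo, nhi), (ulo, uhi) = normative_range, user_range
        return nlo <= normative_belief <= nhi and ulo <= user_belief <= uhi

    print(nice(0.85, 0.72, (0.8, 1.0), (0.7, 1.0)))  # True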

21
Attentional Focusing
  • Clamp the items in the current context
  • Spread activation (a toy version is sketched below)
  • Add the propositions activated in the user and normative BNs to the list of subgoals
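
A toy spreading-activation pass, assuming the semantic net is a plain adjacency dict; the decay and threshold values are illustrative:

    def spread_activation(net, clamped, decay=0.5, threshold=0.1):
        # Clamp context items at activation 1.0, pass attenuated
        # activation to neighbours, and return every proposition
        # ending up above threshold (the candidate subgoals).
        activation = {node: 1.0 for node in clamped}
        frontier = list(clamped)
        while frontier:
            node = frontier.pop()
            for nbr in net.get(node, ()):
                a = activation[node] * decay
                if a >= threshold and a > activation.get(nbr, 0.0):
                    activation[nbr] = a
                    frontier.append(nbr)
        return {n for n, a in activation.items() if a >= threshold}

    # Usage on a tiny semantic net:
    net = {"asteroid": ["iridium", "crater"], "crater": ["extinction"]}
    print(spread_activation(net, clamped={"asteroid"}))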

22
Analysis
  • Propagate the part of the Argument Graph that is
    connected to the goal proposition in the
    normative model BN and the user model BN
  • Determine the posterior belief in the goal
    proposition in both models

23
Example
  • Preamble: Approximately 65 million years BC, the dinosaurs, large reptiles that dominated the Earth for many millions of years, became extinct. At about the same time, the number of giant sequoias in California greatly increased.
  • Goal: A large iridium-rich asteroid struck Earth about 65 million years BC.
  • Context: the goal proposition, plus salient concepts and propositions related to the preamble and the goal
  • Belief ranges:
  • Normative model: [0.8, 1]
  • User model: [0.7, 1]

24
Example: Focusing (in the SN)
25
Example: Abduction (in the BN)
26
Example: Next Cycle (in SN and BN)
27
Argument Simplification
  • Iteratively traverse the Argument Graph:
  • Delete a node from the Argument Graph
  • Analyze the resulting Argument Graph to determine the belief in the goal proposition
  • If the argument is no longer nice, reinstate the deleted node (a sketch of this pruning loop follows)
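
A sketch of the pruning loop, with belief propagation abstracted into caller-supplied helpers; analyze and is_nice here are hypothetical stand-ins, not NAG's API:

    def simplify(nodes, goal, analyze, is_nice):
        # Try deleting each non-goal node; keep the deletion only
        # if the simplified argument is still nice, else reinstate.
        kept = set(nodes)
        for node in sorted(kept - {goal}):
            trial = kept - {node}
            if is_nice(analyze(trial, goal)):
                kept = trial
        return kept

    # Toy usage: "nice" = at least two supporting nodes besides the goal.
    analyze = lambda nodes, goal: len(nodes) - 1
    is_nice = lambda support: support >= 2
    print(simplify({"goal", "a", "b", "c"}, "goal", analyze, is_nice))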

28
Example after Simplification
29
NAG Contributions
  • A Bayesian mechanism for argument generation
  • uses a series of focus-generation-analysis cycles
  • combines goal-based and associative reasoning ⇒ significant reduction in generation time
  • relies on a normative model and a model of the user's beliefs

30
Summary
  • Two Bayesian discourse planning systems

31
Bayesian Networks in Plan Recognition
  • Narrative understanding
  • Intention recognition in graphics
  • Discourse interpretation

32
WIMP: Charniak and Goldman (1993)
  • Explain actions in a narrative
  • Combine:
  • plan knowledge
  • associative reasoning
  • probabilistic reasoning
  • The BN is expanded as the story unfolds

33
Plan Knowledge
  • Predicates that represent a plan and its actions, e.g.,

    (inst ?shop shopping) ⇒
        (and (inst (go-stp ?shop) go)
             (= (agent (go-stp ?shop)) (agent ?shop))
             (= (destination (go-stp ?shop)) (store-of ?shop)))

34
Associative Network
  • Associative view of the plan database

35
Plan Recognition Networks (PRNs)
A BN is a pair (N, E), where N is a set of nodes and E a set of edges. PRNs are built incrementally from a basis by construction rules:
  • Basis: the empty BN (∅, ∅)
  • Object evidence: introduces new evidence
  • Up-existential: introduces a hypothesis
  • Slot-filler: integrates actions and entities into plans
  • Other evidence: adds other nodes and arcs to the network

36
Object-evidence Clause Example
Jack went to the liquor store.
37
Up-existential Clause Example
38
Slot-filler Clause Example
Jack went to the liquor store.
39
Cycle of Operation
  • Loop:
  • Read a word and record some propositions about the text
  • Propositions trigger semantic network construction rules: Object evidence, Other evidence
  • Evaluate the resultant network
  • Perform marker passing ⇒ discover paths that link observed and hypothesized entities
  • Paths trigger associative network construction rules: Up-existential, Slot-filler
  • Re-evaluate the network

40
Marker Passing
  • Curbs combinatorial explosion during network construction
  • Marker passer:
  • propagates marks through the plan network, starting from newly inserted nodes
  • uses the information left behind by the marks to reconstruct paths between the endpoints
  • Good paths are used to fire network construction rules (a toy marker passer is sketched below)
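
A toy marker passer over an adjacency-dict plan network; breadth-first propagation leaves a predecessor mark at each node so that paths between endpoints can be read back afterwards (all names are illustrative):

    from collections import deque

    def marker_pass(net, start, max_depth=3):
        # Propagate marks outward from a newly inserted node.
        parent, depth = {start: None}, {start: 0}
        queue = deque([start])
        while queue:
            node = queue.popleft()
            if depth[node] == max_depth:
                continue
            for nbr in net.get(node, ()):
                if nbr not in parent:      # leave a mark behind
                    parent[nbr] = node
                    depth[nbr] = depth[node] + 1
                    queue.append(nbr)
        return parent

    def path_to(parent, end):
        # Reconstruct the path using the recorded marks.
        path = []
        while end is not None:
            path.append(end)
            end = parent[end]
        return path[::-1]

    net = {"liquor-store": ["shopping"], "shopping": ["go-stp"]}
    marks = marker_pass(net, "liquor-store")
    print(path_to(marks, "go-stp"))  # ['liquor-store', 'shopping', 'go-stp']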

41
Gathering the Probabilities
  • Atomic (leaf) terms are deemed equiprobable, drawn from a large space of events
  • Non-leaf events are assigned probabilities derived from the construction rules

42
Competing Hypotheses Example
Jack went to the liquor store.
43
Competing Hypotheses Example
Jack went to the liquor store. He pointed a gun
at the owner.
44
WIMP Contributions
  • A set of rules that translate plan recognition
    problems into BNs
  • A marker passing scheme for focusing attention

45
Elzer et al. (2005)
  • Intention recognition in graphics
  • Recognize an intended message by reasoning about
    the communicative signals in the graphic
  • Extend speech act theory to the understanding of
    information graphics
  • Dynamic construction of a BN

46
Plan Operators
  • Capture knowledge about how a designer's communicative goal can be achieved via the viewer performing certain perceptual or cognitive tasks
  • High-level messages, based on a corpus study (12 categories), e.g.:
  • Get-Rank
  • Trend (Rising, Falling, Stable)
  • Change-Trend
  • Maximum/Minimum

47
Network Structure
  • Top-level node (root node): captures the probability of all of the categories of high-level messages
  • Second level: each individual category of high-level message is represented (again) as a child of the top-level node
  • Alternative instantiations appear in the network as children of the nodes representing the high-level intentions
  • If there are multiple ways for a goal to be achieved, these are captured as children of the instantiated goal node
  • Evidence is incorporated at various levels (a structural sketch follows)
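
A minimal structural sketch of these levels in Python; the Node class and the Perceive-Rank name are hypothetical stand-ins (real nodes would carry conditional probability tables), and the instantiation shown echoes the Get-Rank(BAR1) example on the next slides:

    class Node:
        def __init__(self, name, parent=None):
            self.name, self.children = name, []
            if parent:
                parent.children.append(self)

    root = Node("IntendedMessage")          # distribution over all categories
    for category in ["Get-Rank", "Trend", "Change-Trend", "Maximum", "Minimum"]:
        cat = Node(category, parent=root)   # second level: one node per category
        if category == "Get-Rank":
            # Alternative instantiations, added only when some
            # communicative signal suggests them (dynamic construction):
            inst = Node("Get-Rank(BAR1)", parent=cat)
            # Multiple ways of achieving the instantiated goal:
            Node("Perceive-Rank(BAR1)", parent=inst)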

49
Alternative Instantiations of Get-Rank
50
Alternative Ways of Achieving Get-Rank(BAR1)
51
Subnetwork for Get-Rank(BAR1)
52
Dynamic Construction of the BN
  • Restrict the size of the network by only adding
    nodes representing tasks that we have some reason
    to believe might be part of the inferred plan

53
Communicative Signals
  • Relative effort required for different perceptual
    tasks
  • Captions
  • Salience
  • Noun in caption
  • Highlighting a bar
  • Special annotations
  • Most recent date
  • Height of a bar

54
Exploiting Communicative Signals
  • Selecting perceptual task nodes for insertion into the network
  • Identify:
  • the set of lowest-effort perceptual tasks
  • salient elements
  • This is evidence that will influence the system's hypothesis regarding the graphic designer's intended message

55
Evidence Nodes at Perceptual Task Node Level
56
Evidence Nodes at Top Level (Verbs, Adjectives)
57
Gathering the Probabilities
  • Used a corpus study, and for each graph
  • estimated the perceptual task effort
  • determined which tasks involve salient elements
    (and which type of salience)

58
System Performance: Example 1
  • Perceptual task effort as the only evidence
  • Hypothesis: Rank-All ⇒ 87%

59
System Performance: Example 2
  • U.S. still salient; U.S. and Japan annotated
  • Hypothesis: relative difference between U.S. and Japan ⇒ 87.3%

60
Elzer et al. Contributions
  • Extend plan inference to information graphics
  • Identify communicative signals
  • Exploit the signals in a BN

61
Zukerman et al. (2006)
  • Intention recognition in argument
  • Interpret a speaker's discourse in terms of a system's knowledge representation formalism ⇒ BN
  • Model selection approach

62
Example: User's Simple Argument
Since Mr Green was in the garden at 11, he
probably had the opportunity to murder Mr Body.
63
Interpretation in the context of a BN
  • Since Mr Green was in the garden at 11, he
    probably had the opportunity to murder Mr Body.

[BN fragment for the interpretation: GreenInGardenAt11 and TimeOfDeath11 lead to GreenInGardenAtTimeOfDeath (very probably), which leads to GreenHadOpportunity (probably) and on to GreenMurderedBody; related nodes include GreenVisitBodyLastNight, NeighbourHeardGreenBodyArgueLastNight and GreenLadderAtWindow]
64
What is an Interpretation?
  • IG (Interpretation Graph): a Bayesian subnet that matches the user's argument
  • SC (Supposition Configuration): suppositions attributed to the user to make sense of the argument
  • EE (Explanatory Extensions): shared beliefs incorporated in the interpretation to make it more acceptable to people

65
Selecting an Interpretation
  • Postulate: Given candidate interpretations Int1, ..., Intn, the intended interpretation is the one with the highest posterior probability
66
Selecting an Interpretation
Postulate: Given candidate interpretations ⟨IG1, SC1, EE1⟩, ..., ⟨IGn, SCn, EEn⟩, the intended one is the one with the highest posterior probability
67
Model Selection
  • Trade-off between:
  • the probability of the model (interpretation), and
  • data fit: the probability of the data (discourse) given the model (Bayes' rule below makes the trade-off explicit)
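
In Bayesian terms, the denominator of Bayes' rule is constant across candidate interpretations, so only these two factors matter:

    \Pr(\mathit{Int}_i \mid \mathit{Discourse}) \propto
        \underbrace{\Pr(\mathit{Int}_i)}_{\text{model probability}} \,
        \underbrace{\Pr(\mathit{Discourse} \mid \mathit{Int}_i)}_{\text{data fit}}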

68
BIAS (Bayesian Interactive Argumentation System)
[System diagram: an input Argument goes to a Search module (proposing interpretations), which produces candidate Interpretations; a Probabilistic Reasoning module (selecting an interpretation) then outputs the most probable interpretation; both modules consult the Background Knowledge]
69
Selecting an Interpretation
[Same diagram as the previous slide, with the Probabilistic Reasoning module (selecting an interpretation) highlighted]
70
Pr(IG, SC, EE)
  • The prior probability of the IG, SC and EE in light of the background knowledge:
  • the probability of extracting the IG structure and the EE structure from the background knowledge (the underlying BN)
  • the probability of the beliefs in the IG given the beliefs in the SC and the background knowledge (including people's preferences)
  • the probability of the beliefs in the SC given the beliefs in the background knowledge

71
Pr(Discourse | IG, SC, EE)
  • The probability of the discourse given the IG:
  • the probability of obtaining the discourse structure from the IG
  • the probability of stating the beliefs in the discourse when intending the corresponding beliefs in the IG (obtained from the background knowledge and the SC); the resulting posterior is factorized below
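
Putting the two preceding slides together, the posterior that the system maximizes factorizes as:

    \Pr(IG, SC, EE \mid \mathit{Discourse}) \propto
        \Pr(IG, SC, EE) \times \Pr(\mathit{Discourse} \mid IG, SC, EE)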

72
Zukerman et al. Contributions
  • Casts discourse interpretation as a model selection task ⇒ balances conflicting factors
  • The model incorporates various factors:
  • argument structure (Interpretation Graph)
  • beliefs that are not shared (Suppositions), and
  • implicitly shared beliefs (Explanatory Extensions)

73
Summary
  • Three plan recognition systems based on BNs

74
BNs' Applications in User Modeling and Natural
Language Processing
  • Discourse planning
  • Plan recognition
  • Dialogue

75
Horvitz and Paek (2007)
  • Goal: combine human and automated resources in a spoken dialogue system
  • Employ an automatically generated BN to predict dialogue duration and outcome
  • Apply a decision-theoretic approach to determine when to transfer a call to a human receptionist

76
Predictive Features
  • System and user actions, e.g., whether the dialogue system has asked for confirmation
  • Session summary evidence, e.g., the number of attempts to specify a name
  • N-best list evidence, e.g., features output by the ASR, such as the range of confidence scores and the count of the most frequent name
  • Generalized temporal evidence, e.g., trends across n-best lists, such as whether the top name is the same in two lists

77
Predictive Bayesian Network: Outcome
78
Predictive Bayesian Network: Duration
79
Minimizing Expected Duration
  • Ho: transfer to an operator
  • Ha: successful termination with automation (a plausible form of the decision rule follows)
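
A plausible reading of the resulting decision rule (the paper's exact cost model may differ): transfer the call when the expected dialogue duration under the operator hypothesis is lower than under continued automation, with the expectations weighted by the BN's outcome and duration predictions:

    \text{transfer} \iff
        E[\mathit{duration} \mid H_o] < E[\mathit{duration} \mid H_a]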

80
Results
Legacy requests for an operator at each step (black bars), and the portion of these cases transferred proactively to an operator
81
Summary: BN Applications
  • Two discourse planning systems
  • BN used as a knowledge representation (KR) platform
  • Three plan recognition/interpretation systems
  • BN used as an inference mechanism
  • BN used as a KR platform
  • One dialogue system
  • BN used as a predictive/inference mechanism

82
Challenges: Managing Uncertainty in Discourse
and Dialogue
  • Tasks:
  • Explaining BN reasoning to people
  • Understanding people's reasoning in terms of BNs
  • More work on dialogue: more complex tasks
  • Specific issues:
  • Representing information, obtaining probabilities
  • Curbing combinatorial explosion
  • Combining information from multiple sources