Transcript and Presenter's Notes

Title: CPSC 503 Computational Linguistics


1
CPSC 503 Computational Linguistics
  • Discourse and Dialog
  • Lecture 13
  • Giuseppe Carenini

2
Knowledge-Formalisms Map (including probabilistic formalisms)
[Diagram: linguistic knowledge levels mapped to formalisms used for understanding and generation. Morphology and Syntax map to state machines and rule systems (and their probabilistic versions); Semantics maps to logical formalisms (First-Order Logic); Pragmatics, Discourse and Dialogue map to AI planners.]
3
Today 25/10
  • Brief Intro Pragmatics
  • Discourse
  • Monologue
  • Dialog

4
Semantic Analysis
[Diagram: syntax-driven semantic analysis combines the meanings of words with the meanings of grammatical structures to yield the literal meaning of a sentence; further analysis by inference over common-sense and domain knowledge, discourse structure, and context (pragmatics) yields the intended meaning.]
5
Pragmatics: Example
  • (i) A: So can you please come over here again
    right now?
  • (ii) B: Well, I have to go to Edinburgh today,
    sir.
  • (iii) A: Hmm. How about this Thursday?

What information can we infer about the context
in which this (short and insignificant) exchange
occurred?
6
Pragmatics: Conversational Structure
  • (i) A: So can you please come over here again
    right now?
  • (ii) B: Well, I have to go to Edinburgh today,
    sir.
  • (iii) A: Hmm. How about this Thursday?

Not the end of a conversation (nor the beginning)
  • Pragmatic knowledge: strong expectations about
    the structure of conversations
  • Pairs, e.g., request/response
  • Closing/Opening forms

7
Pragmatics: Dialog Acts
(i) A: So can you please come over here again
right now? (ii) B: Well, I have to go to Edinburgh
today, sir. (iii) A: Hmm. How about this Thursday?
  • A is requesting B to come at the time of speaking,
  • B implies he can't (or would rather not),
  • A repeats the request for some other time.
  • Pragmatic assumptions, relying on
  • mutual knowledge (B knows that A knows that ...)
  • co-operation (must be a response; triggers
    inference)
  • topical coherence (who should do what on Thursday?)

8
Pragmatics: Specific Act (Request)
(i) A: So can you please come over here again
right now? (ii) B: Well, I have to go to Edinburgh
today, sir. (iii) A: Hmm. How about this Thursday?
  • A wants B to come over
  • A believes it is possible for B to come over
  • A believes B is not already there
  • A believes he is not in a position to order B to

Pragmatic knowledge: speaker beliefs and
intentions underlying the act of requesting.
Assumption: A is behaving rationally and sincerely.
9
Pragmatics: Deixis
(i) A: So can you please come over here again
right now? (ii) B: Well, I have to go to Edinburgh
today, sir. (iii) A: Hmm. How about this Thursday?
  • A assumes B knows where A is
  • Neither A nor B is in Edinburgh
  • The day on which the exchange is taking place is
    not Thursday, nor Wednesday (or at least, so A
    believes)

Pragmatic knowledge: references to space and time
are interpreted with respect to the space and time
of speaking
10
Today 25/10
  • Brief Intro Pragmatics
  • Discourse
  • Monologue
  • Dialog

11
Discourse: Monologue
  • Monologues, as sequences of sentences, have
    structure

(like sentences as sequences of words)
  • Key discourse phenomenon: referring expressions
    (what they denote may depend on previous
    discourse)
  • Task: coreference resolution

12
Sample Monologues
House-A is an interesting house. It has a
convenient location. Even though house-A is
somewhat far from the park, it is close to work
and to a rapid transportation stop.

It has a convenient location. It is close to
work. Even though house-A is somewhat far from
the park, house-A is an interesting house. It is
close to a rapid transportation stop.
13
Corresponding Text Structure
[Diagram: rhetorical-structure tree for the first monologue.
CORE-1: "House-A is an interesting house."
EVIDENCE-1: "It has a convenient location", supported by "it is
close to work" and "it is close to a rapid transportation stop".
CONCESSION-1: "Even though house-A is somewhat far from the park".
The tree captures decomposition, ordering, and rhetorical relations.]
14
Discourse/Text Segmentation (1)
  • State of the art:
  • linear (unable to identify hierarchical
    structure)
  • Subtopics, passages
  • UNSUPERVISED
  • Key idea: lexical cohesion (vs. coherence)
  • There is no water on the moon. Andromeda is
    covered by the moon.
  • Discourse segments tend to be lexically cohesive
  • Cohesion score drops on segment boundaries
    (a minimal sketch of this idea follows below)

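To make the cohesion-dip idea concrete, here is a minimal Python sketch in the spirit of TextTiling (not the exact algorithm from the lecture): it compares bags of words in adjacent sentence windows and proposes a boundary wherever the similarity drops below a threshold. The window size, threshold, and toy sentences are illustrative assumptions.

```python
import math
from collections import Counter

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bags of words."""
    num = sum(a[w] * b[w] for w in set(a) & set(b))
    den = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def cohesion_scores(sentences, window=2):
    """Cohesion between the windows of sentences before and after each gap."""
    bags = [Counter(s.lower().split()) for s in sentences]
    return [cosine(sum(bags[max(0, gap - window):gap], Counter()),
                   sum(bags[gap:gap + window], Counter()))
            for gap in range(1, len(bags))]

def boundaries(scores, threshold=0.1):
    """Propose a segment boundary wherever cohesion falls below the threshold."""
    return [gap for gap, score in enumerate(scores, 1) if score < threshold]

sents = ["There is no water on the moon.",
         "The moon has craters.",
         "Andromeda is a galaxy.",
         "Galaxies contain billions of stars."]
print(boundaries(cohesion_scores(sents)))  # gap indices where cohesion dips
```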
15
Discourse/Text Segmentation (2)
  • SUPERVISED
  • Binary classifier (SVM, decision tree, ...)
  • makes a yes/no boundary decision between any two
    sentences
  • Features:
  • Cohesion features (e.g., word overlap, word
    cosine)
  • Presence of (domain-specific) discourse markers
  • News: "good evening", "I am ...", "joining us now is ..."
  • Real-estate ads: is the previous word a phone number?
    (a sketch of such boundary features follows below)

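A hedged sketch of the feature vector such a boundary classifier might see for the gap between two adjacent sentences; the cue-phrase list and feature names are invented for illustration, not taken from any of the systems above.

```python
CUE_PHRASES = {"good evening", "i am", "joining us now is"}  # assumed domain markers

def boundary_features(prev_sent: str, next_sent: str) -> dict:
    """Features for the yes/no decision: is there a segment boundary here?"""
    prev_words = set(prev_sent.lower().split())
    next_words = set(next_sent.lower().split())
    overlap = len(prev_words & next_words)
    last_token = (prev_sent.rstrip(" .").split() or [""])[-1]
    return {
        "word_overlap": overlap,                                   # cohesion feature
        "overlap_ratio": overlap / max(1, len(prev_words | next_words)),
        "next_starts_with_cue": any(next_sent.lower().startswith(c) for c in CUE_PHRASES),
        "prev_ends_with_digits": last_token.isdigit(),             # e.g. phone number in an ad
    }

print(boundary_features("Call 555 0199.", "Good evening, I am here with the news."))
```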
16
Text Relations, Parsing and Generation
  • Rhetorical (coherence) relations
  • different proposals (typically 20-30 relations)
  • Elaboration, Contrast, Purpose
  • Parsing: given a monologue, determine its
    rhetorical structure [Marcu, 2000 and 2002]

Project James
  • Generation: given a communicative goal (e.g.,
    convince the user to quit smoking), generate the
    structure
  • Next class

Project Jackie
17
Reference
Language contains many references to entities
mentioned in previous sentences (i.e., in the
discourse context/model)
  • I saw him
  • I passed the course
  • I'd like the red one
  • I disagree with what you just said
  • That caused the invasion
  • Two tasks
  • Coreference resolution
  • Anaphora/pronominal resolution

18
Reference Resolution
Terminology: Referring expression = NL expression
used to perform reference; Referent = entity that
is referred to
  • Types of referring expressions:
  • Indefinite NPs (a, some, this)
  • Definite NPs (the, ...)
  • Pronouns (he, she, her, ...)
  • Demonstratives (this, that, ...)
  • Names
  • Inferrables
  • Generics

19
Pronominal Resolution: Simple Algorithm
  • Last object mentioned (correct gender/person);
    a sketch of this step appears after this list
  • John ate an apple. He was hungry.
  • "He" refers to John ("apple" is not a "he")
  • Google is unstoppable. They have increased ...
  • Selectional restrictions
  • John ate an apple in the store.
  • It was delicious. (stores cannot be delicious)
  • It was quiet. (apples cannot be quiet)
  • Binding Theory constraints
  • Mary bought herself a new Ferrari
  • Mary bought her a new Ferrari

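As promised above, a minimal sketch of the "last object mentioned" step with a gender/number compatibility check. The tiny lexicon is an assumption made for the example; selectional restrictions and Binding Theory constraints would be extra filters on top of this.

```python
# Toy agreement lexicon (assumed): maps mentions and pronouns to a compatibility class.
GENDER = {"john": "masc", "mary": "fem", "apple": "neut", "google": "org"}
PRONOUN = {"he": "masc", "she": "fem", "it": "neut", "they": "org"}

def resolve(pronoun: str, mentions: list) -> str | None:
    """Walk back through prior mentions, most recent first, and return the
    first one whose class is compatible with the pronoun."""
    wanted = PRONOUN[pronoun.lower()]
    for mention in reversed(mentions):
        if GENDER.get(mention.lower()) == wanted:
            return mention
    return None

# "John ate an apple. He was hungry."  ->  He = John (an apple is not a "he")
print(resolve("He", ["John", "apple"]))
# "Google is unstoppable. They have increased ..."  ->  They = Google
print(resolve("They", ["Google"]))
```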
20
Additional Complications
  • Some pronouns don't refer to anything
  • It rained
  • must check whether the verb has a dummy subject
  • Evaluate "last object mentioned" using the parse
    tree, not literal text position
  • I went to the GAP, which is opposite to BR.
  • It is a big store.

"It" refers to the GAP, not BR
21
Focus
  • John is a good student
  • He goes to all his tutorials
  • He helped Sam with CS4001
  • He wants to do a project for Prof. Gray

He refers to John (not Sam)
22
Supervised ML Approach
Corpus annotated with coreference relations (all
antecedents of each pronoun are marked)
  • What features? (a sketch follows after the example below)

(U1) Joe saw a nice Ferrari in the parking lot
(U2) He showed it to Bob
(U3) He bought it
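One plausible answer, sketched for the Joe/Ferrari example: recency, agreement, and grammatical-role features computed on (pronoun, candidate antecedent) pairs and fed to a binary classifier. The feature names and hand-built mention records are illustrative, not the corpus annotation scheme.

```python
def pair_features(pronoun: dict, candidate: dict) -> dict:
    """Features for one (pronoun, candidate antecedent) pair."""
    return {
        "sentence_distance": pronoun["sent"] - candidate["sent"],  # recency
        "gender_agree": pronoun["gender"] == candidate["gender"],
        "number_agree": pronoun["number"] == candidate["number"],
        "candidate_is_subject": candidate["role"] == "subj",       # role preference
    }

joe     = {"text": "Joe", "sent": 1, "gender": "masc", "number": "sg", "role": "subj"}
ferrari = {"text": "a nice Ferrari", "sent": 1, "gender": "neut", "number": "sg", "role": "obj"}
he_u2   = {"text": "He", "sent": 2, "gender": "masc", "number": "sg", "role": "subj"}

print(pair_features(he_u2, joe))      # strong candidate
print(pair_features(he_u2, ferrari))  # gender mismatch
```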
23
Need World Knowledge
  • The police prohibited the fascists from
    demonstrating because they feared violence.
  • vs
  • The police prohibited the fascists from
    demonstrating because they advocated violence.

Exactly the same syntax!
  • Not possible to resolve "they" without a detailed
    representation of world knowledge about fearing
    violence vs. advocating violence

24
Coreference Resolution
  • Decide whether any pair of NPs co-refer
  • Binary classifier again

[Diagram: an anaphor NPj linked to its candidate antecedents]
  • What features?
  • Same as for anaphora, plus specific ones to deal
    with definites and names, e.g.:
  • Edit distance
  • Alias (based on type; e.g., for PERSON, "Dr." or
    "Chairman" can be removed)
  • Appositive (Mary, the new CEO, ...)
    (a sketch of these string features follows below)

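A sketch of the string-based features above (edit distance and title-stripping aliases); the title list and the plain Levenshtein implementation are assumptions for illustration.

```python
TITLES = {"dr.", "mr.", "ms.", "prof.", "chairman"}  # removable for PERSON aliases (assumed list)

def edit_distance(a: str, b: str) -> int:
    """Plain Levenshtein distance between two mention strings."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
        prev = cur
    return prev[-1]

def alias_match(np1: str, np2: str) -> bool:
    """True if the two mentions are identical once titles are stripped."""
    strip = lambda s: " ".join(w for w in s.lower().split() if w not in TITLES)
    return strip(np1) == strip(np2)

print(edit_distance("Mary Smith", "Mary Smyth"))    # 1
print(alias_match("Dr. Mary Smith", "Mary Smith"))  # True
```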
25
Today 25/10
  • Brief Intro Pragmatics
  • Discourse
  • Monologue
  • Dialog

26
Discourse: Dialog
  • Most fundamental form of language use
  • First kind we learn as children

Dialog can be seen as a sequence of communicative
actions of different kinds, i.e., dialog acts
(DAMSL 1997: 20)
27
Dialog: two key tasks
  • (1) Dialog act interpretation: identify the user's
    dialog act
  • (2) Dialog management: given (1), decide what to say
    and when

28
Dialog Act Interpretation
  • Which dialog act is a given utterance performing?
  • Surface form is not sufficient!

E.g., "I'm having problems with the homework"
  • Statement - prof. should make a note of this,
    perhaps make the homework easier next year
  • Directive - prof. should help the student with the
    homework
  • Information request - prof. should give the student
    the solution

29
Automatic Interpretation of Dialog Acts
[Diagram: the knowledge-formalisms map again, locating the two
approaches. Cue-based interpretation builds on state machines and
rule systems (and their probabilistic versions); plan-inferential
interpretation builds on logical formalisms (First-Order Logic)
and AI planners.]
30
Plan-Inferential (BDI): Pros/Cons
  • Dialog acts are expressed as plan operators
    involving beliefs, desires, and intentions
  • Powerful: uses rich and sound knowledge
    structures - should enable modeling of subtle,
    indirect uses of dialog acts

31
Cue-Based: Key Idea
  • Words and collocations:
  • "Please" and "would you" -> REQUEST
  • "are you" and "is it" -> YES-NO-QUESTION

Prosody: loudness or stress on "yeah" -> AGREEMENT
vs. BACKCHANNEL
  • Conversational structure:
  • "Yeah" following a PROPOSAL -> AGREEMENT
  • "Yeah" following an INFORM -> BACKCHANNEL

32
Cue-Based Model (1)
  • Each dialog act type (d) has its own
    micro-grammar, which can be captured by N-gram
    models

Annotated Corpus
  • Lexical: given an utterance W = w1 ... wn, for each
    dialog act (d) we can compute P(W|d)
  • Prosodic: given an utterance F = f1 ... fn, for each
    dialog act (d) we can compute P(F|d)
33
Cue-Based Model (2)
  • Conversational structure: a Markov chain over
    dialog acts

Annotated Corpus
[Diagram: Markov chain whose transition probabilities between
dialog acts (e.g., .8, .3, .2, .5, .7, ...) are estimated from the
annotated corpus]

Combine all information sources in an HMM whose
observation likelihoods are the N-gram models!
(a decoding sketch follows below)
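A minimal decoding sketch of how the HMM ties the pieces together: dialog acts are the hidden states, the Markov chain supplies the transition probabilities, and a stand-in unigram table plays the role of the per-act N-gram likelihood P(W|d). All probabilities below are invented toy values, not the ones from the slide's diagram, and prosodic likelihoods are omitted.

```python
import math

ACTS = ["PROPOSAL", "AGREEMENT", "BACKCHANNEL", "INFORM"]
START = {"PROPOSAL": 0.5, "INFORM": 0.5, "AGREEMENT": 0.0, "BACKCHANNEL": 0.0}
TRANS = {  # toy Markov chain P(d_t | d_{t-1})
    "PROPOSAL":    {"AGREEMENT": 0.6, "INFORM": 0.3, "PROPOSAL": 0.1, "BACKCHANNEL": 0.0},
    "INFORM":      {"BACKCHANNEL": 0.5, "INFORM": 0.3, "PROPOSAL": 0.2, "AGREEMENT": 0.0},
    "AGREEMENT":   {"INFORM": 0.6, "PROPOSAL": 0.4, "AGREEMENT": 0.0, "BACKCHANNEL": 0.0},
    "BACKCHANNEL": {"INFORM": 0.7, "PROPOSAL": 0.3, "AGREEMENT": 0.0, "BACKCHANNEL": 0.0},
}
UNIGRAMS = {  # stand-in for the per-act micro-grammar P(W | d)
    "AGREEMENT": {"yeah": 0.4, "sure": 0.2},
    "BACKCHANNEL": {"yeah": 0.3, "uh-huh": 0.3},
    "PROPOSAL": {"how": 0.2, "about": 0.2},
    "INFORM": {"i": 0.2, "have": 0.1},
}

def lex_loglik(words, act):
    """log P(W | d) under the toy unigram model, with a small floor for unseen words."""
    return sum(math.log(UNIGRAMS[act].get(w, 0.01)) for w in words)

def viterbi(utterances):
    """Most likely dialog-act sequence for a list of tokenized utterances."""
    best = {d: math.log(START[d] + 1e-12) + lex_loglik(utterances[0], d) for d in ACTS}
    backptrs = []
    for words in utterances[1:]:
        new, ptr = {}, {}
        for d in ACTS:
            prev, score = max(((p, best[p] + math.log(TRANS[p][d] + 1e-12)) for p in ACTS),
                              key=lambda x: x[1])
            new[d], ptr[d] = score + lex_loglik(words, d), prev
        best, backptrs = new, backptrs + [ptr]
    seq = [max(best, key=best.get)]
    for ptr in reversed(backptrs):
        seq.append(ptr[seq[-1]])
    return list(reversed(seq))

print(viterbi([["how", "about", "thursday"], ["yeah"]]))  # PROPOSAL then AGREEMENT
```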
34
Cue-Based Model: Summary
  • Start from an annotated corpus (each utterance
    labeled with the appropriate dialog act)
  • For each dialog act type (e.g., REQUEST), build
    lexical and phonological N-grams
  • Build a Markov chain over dialog acts (to express
    conversational structure)

35
Dialog Managers in Conversational Agents
  • Examples: airline travel info system,
    restaurant/movie guide, email access by phone
  • Tasks:
  • Control flow of dialogue (turn-taking)
  • What to say/ask and when

36
Dialog Managers
[Diagram: the knowledge-formalisms map again, locating four kinds
of dialog managers. FSA and template-based managers build on state
machines and rule systems (and their probabilistic versions); BDI
managers build on logical formalisms (First-Order Logic) and AI
planners; MDP managers build on the probabilistic versions of the
planning formalisms (Markov Decision Processes).]
37
FSA Dialog Manager: system initiative
[Diagram not captured in the transcript]

38
Template-Based Dialog Manager (1)
  • GOAL: to allow more complex sentences that
    provide more than one info item at a time

S: How may I help you?
U: I want to go from Boston to Baltimore on the 8th.

Slot           Optional question
From_Airport   From what city are you leaving?
To_Airport     Where are you going?
Dept-Time      When do you want to leave?
Dept-Day

  • Interpretation: semantic grammars, semi-HMMs,
    Hidden Understanding Models (HUM)
    (a slot-filling sketch follows below)

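A minimal slot-filling sketch for this template: one user utterance may fill several slots, and the manager asks the question for the next empty slot. The regular expressions stand in for the semantic-grammar/HUM interpretation step, and Dept-Day is left out because its question is not in the transcript.

```python
import re

QUESTIONS = {  # slot -> optional question, following the table above
    "From_Airport": "From what city are you leaving?",
    "To_Airport":   "Where are you going?",
    "Dept-Time":    "When do you want to leave?",
}

def interpret(utterance: str, frame: dict) -> None:
    """Very crude slot filling via regular expressions (illustrative only)."""
    if (m := re.search(r"from ([A-Z]\w+)", utterance)):
        frame["From_Airport"] = m.group(1)
    if (m := re.search(r"to ([A-Z]\w+)", utterance)):
        frame["To_Airport"] = m.group(1)
    if (m := re.search(r"on the (\w+)", utterance)):
        frame["Dept-Time"] = m.group(1)

def next_question(frame: dict) -> str:
    """Ask about the first slot that is still empty."""
    for slot, question in QUESTIONS.items():
        if slot not in frame:
            return question
    return "Anything else?"

frame = {}
interpret("I want to go from Boston to Baltimore on the 8th.", frame)
print(frame)                 # three slots filled by a single utterance
print(next_question(frame))  # nothing left to ask in this toy frame
```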
39
Template-Based Dialog Manager (2)
  • More than one template, e.g., car or hotel
    reservation
  • User may provide information to fill slots in
    different templates
  • A set of production rules fills slots depending on
    the input and determines what questions should be
    asked next (a rule sketch follows below)

E.g., IF user mentions a car slot and most of the air
slots are filled THEN ask about the remaining car
slots.
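A hedged sketch of that production rule; the 0.75 threshold for "most of the air slots" and the slot names are assumptions made for the example.

```python
def choose_next_template(air_frame: dict, user_mentioned_car: bool) -> str:
    """IF user mentions a car slot and most air slots are filled THEN switch to car slots."""
    air_filled = sum(v is not None for v in air_frame.values()) / len(air_frame)
    if user_mentioned_car and air_filled >= 0.75:
        return "car"   # rule fires: ask about the remaining car-rental slots
    return "air"       # otherwise keep filling the air-travel template

air = {"From_Airport": "Boston", "To_Airport": "Baltimore", "Dept-Time": "8th", "Dept-Day": None}
print(choose_next_template(air, user_mentioned_car=True))  # -> "car"
```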
40
Markov Decision Processes
  • Common formalism in AI to model an agent
    interacting with its environment
  • States / Actions / Rewards
  • Application to dialog:
  • States: slot in frame currently worked on, ASR
    confidence value, number of questions about the
    slot, ...
  • Actions: question types, confirmation types
  • Rewards: user feedback, task completion rate
    (a toy MDP sketch follows after this list)

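A toy value-iteration sketch of such a dialog MDP, with states distinguishing low and high ASR confidence for the current slot; all transition probabilities and rewards are invented to illustrate the formalism, not taken from any system.

```python
STATES = ["slot_low_conf", "slot_high_conf", "done"]
ACTIONS = ["ask_again", "explicit_confirm", "move_on"]

TRANSITIONS = {  # P(next_state | state, action), toy numbers
    ("slot_low_conf", "ask_again"):         {"slot_high_conf": 0.6, "slot_low_conf": 0.4},
    ("slot_low_conf", "explicit_confirm"):  {"slot_high_conf": 0.8, "slot_low_conf": 0.2},
    ("slot_low_conf", "move_on"):           {"done": 1.0},
    ("slot_high_conf", "ask_again"):        {"slot_high_conf": 1.0},
    ("slot_high_conf", "explicit_confirm"): {"slot_high_conf": 1.0},
    ("slot_high_conf", "move_on"):          {"done": 1.0},
}
REWARDS = {("slot_low_conf", "move_on"): -5.0,    # risk of committing a wrong value
           ("slot_high_conf", "move_on"): 10.0}   # task completed with a confident value
GAMMA = 0.9  # every other (state, action) pair costs one turn (-1)

def value_iteration(iterations=50):
    V = {s: 0.0 for s in STATES}
    for _ in range(iterations):
        for s in STATES:
            if s == "done":          # absorbing terminal state
                continue
            V[s] = max(REWARDS.get((s, a), -1.0)
                       + GAMMA * sum(p * V[s2] for s2, p in TRANSITIONS[(s, a)].items())
                       for a in ACTIONS)
    return V

print(value_iteration())  # the high-confidence state ends up with the higher value
```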
41
BDI Dialog Manager
S1: How may I help you?
U1: I want to go to Pittsburgh in April.
S2: And, what date in April do you want to travel?
U2: Uh hmm, I have a mtg. there on the 12th.
(Dialog acts: REQUEST, ACKNOWLEDGE, REQUEST, INFORM)
  • Sys: to understand U2, it needs a model of the
    preconditions, effects, and decomposition of
  • the meeting event (precondition: be there)
  • the fly-to plan (decomposition: book-flight, take-flight)
  • the take-flight plan (effect: be there)

42
BDI Dialog Manager
S1: How may I help you?
U1: I want to go to Pittsburgh in April.
S2: And, what date in April do you want to travel?
U2: Uh hmm, I have a mtg. there on the 12th.
(Dialog acts: REQUEST, ACKNOWLEDGE, REQUEST, INFORM)
  • Sys: to generate S2, it needs a model of the
    preconditions of
  • the book-flight action (agent knows departure date
    and time)

Integrated with a logic-based planning system:
  • Generating an utterance = plan generation
    (possibly satisfying multiple goals)
  • Understanding an utterance = plan recognition
    (recognizing multiple goals)
    (a plan-operator sketch follows after this list)

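A minimal sketch of the plan-operator knowledge described above (preconditions, effects, decompositions); the representation and the predicate strings are illustrative simplifications, not the actual system's knowledge base.

```python
from dataclasses import dataclass, field

@dataclass
class PlanOperator:
    name: str
    preconditions: list = field(default_factory=list)
    effects: list = field(default_factory=list)
    decomposition: list = field(default_factory=list)

MEETING = PlanOperator("attend-meeting", preconditions=["be-at(Pittsburgh, 12th)"])
FLY_TO  = PlanOperator("fly-to", decomposition=["book-flight", "take-flight"])
TAKE    = PlanOperator("take-flight", effects=["be-at(Pittsburgh, ?date)"])
BOOK    = PlanOperator("book-flight",
                       preconditions=["knows(system, departure-date)",
                                      "knows(system, departure-time)"])

# Plan recognition in miniature: the meeting's precondition matches take-flight's
# effect, so mentioning the meeting on the 12th lets the system infer the date;
# plan generation works backwards from book-flight's preconditions to ask S2.
print(MEETING.preconditions, "is achieved by", TAKE.name, "with effect", TAKE.effects)
```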
43
Designing Dialog Systems: User-Centered Design
  • Early focus on user and task, e.g., interview the
    users
  • Build prototypes: Wizard-of-Oz (WOZ) studies
  • Evaluation

Iterative Design
44
Next Time: Natural Language Generation
  • Read handout on NLG
  • Lecture will be about an NLG system that I
    developed and tested