
1
Tense and Implicit Role Reference
  • Joel Tetreault
  • University of Rochester
  • Department of Computer Science

2
Implicit Role Reference
  • Verb phrases have certain required roles (NPs)
    that are expected
  • For example, take (a data-structure sketch
    follows this list):
  • Something to take (theme)
  • A place to take it from (from-loc)
  • A place to take it to (to-loc)
  • Something to do the taking (agent)
  • Possibly a tool to do the taking (instrument)
  • Very little work has been done
  • (Poesio, 1994; Asher and Lascarides, 1998)
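
A minimal Python sketch of such a role frame, assuming a simple
dictionary representation (slot names follow the list above; the actual
role definitions come from the TRIPS lexicon, and the authors' system
was written in Lisp):

    # Hypothetical role frame for "take"; slot names follow the slide above.
    TAKE_FRAME = {
        "theme": None,       # something to take
        "from-loc": None,    # a place to take it from
        "to-loc": None,      # a place to take it to
        "agent": None,       # something to do the taking
        "instrument": None,  # possibly, a tool to do the taking
    }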

3
Goal
  • Resolving implicit role references (IRRs) is
    important to NLP
  • Investigate how implicit roles work
  • Develop an algorithm for resolving them
  • Evaluate the algorithm empirically
  • Use temporal information and discourse relations
    to improve results

4
Outline
  • Implicit Roles
  • Annotation
  • Algorithm
  • Results
  • Discussion

5
Example
  • (1) Take engine E1 from Avon to Dansville.
  • (2a) Pick up the boxcar and take it to Broxburn.
    (from ?)
  • (2b) And then take the boxcar from Corning.
    (to ?)
  • (3a) Leave E1 there but move the boxcar down the
    road to Evansville. (from ?)
  • (3b) Leave the boxcar there.

6
Statistics: Role Distributions
7
Corpus
  • Annotated a subset of the TRAINS-93 Corpus
    (Heeman and Allen, 1994)
  • An 86-utterance task-oriented dialog between two
    humans
  • Task: move commodities around in a virtual world

8
Annotation
  • Used an SGML-style annotation scheme
  • NPs are annotated with an ID and a class (engine,
    tanker, location, food)
  • VPs are annotated with an ID, event time, and
    roles
  • Roles for each verb are taken from the TRIPS
    lexicon (Allen et al., 2000)

9
Temporal Annotation
  • An event time is assigned to each utterance, such
    as t0, t1, u1, etc.
  • Constraints are then imposed on the event times,
    for example
  • t9 > t1 (t9 comes after t1)
  • t9 < t1 (t9 comes before t1)
  • t9 > t1 & t9 < t2 (t9 falls between t1 and t2)

10
Sample Annotation
  • U1: Take Engine E1 from Avon to Dansville.
  • U2: Pick up the boxcar.
  • <ve time=t0 theme=ne12 from-loc=ne5
    to-loc=ne6>Take <ne id=ne12>engine E1</ne> from
    <ne id=ne5>Avon</ne> to <ne id=ne6>Dansville</ne></ve>.
  • <ve time=t1 theme=ne13 from-loc=ne6(i)>Pick up
    <ne id=ne13>the boxcar</ne></ve>.
  • ((i) marks an implicitly realized role)
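
A minimal sketch of reading such annotations, assuming the tag layout
shown above (the regex and function names are illustrative, not the
project's actual tooling):

    import re

    # Pull out <ne id=...>...</ne> noun-phrase spans from an annotated
    # utterance; the attribute layout is assumed from the sample above.
    NE_PATTERN = re.compile(r"<ne id=(?P<id>\w+)>(?P<text>.*?)</ne>")

    def extract_nps(annotated):
        """Map each NP id (e.g. 'ne5') to its surface text (e.g. 'Avon')."""
        return {m.group("id"): m.group("text")
                for m in NE_PATTERN.finditer(annotated)}

    sample = ("<ve time=t0 theme=ne12 from-loc=ne5 to-loc=ne6>"
              "Take <ne id=ne12>engine E1</ne> from <ne id=ne5>Avon</ne> "
              "to <ne id=ne6>Dansville</ne></ve>")
    print(extract_nps(sample))
    # {'ne12': 'engine E1', 'ne5': 'Avon', 'ne6': 'Dansville'}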

11
Statistics
  • Most implicit roles have antecedents found
    locally (0-2 sentences back over 90% of the time)
  • Instrument: 79% Instr, 10% Theme, 10% ID
  • Theme: 88% Theme, 12% ID
  • From-Loc: 62% From-Loc, 38% To-Loc
  • To-Loc: 57% To-Loc, 29% From-Loc, 14% Theme

12
Algorithm
  • For each utterance u, process u left to right
  • If an NP is encountered, push it onto the
    appropriate focus stack
  • If a VP is encountered
  • place all explicit roles on top of the
    appropriate focus stacks
  • if a role is implicit, resolve it with the
    implicit role algorithm (slide 14); the stack
    bookkeeping is sketched below
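
A minimal Python sketch of the focus-stack bookkeeping, assuming one
stack per role (role names follow the slides; the authors' system was
implemented in Lisp, so this rendering is illustrative only):

    from collections import defaultdict

    class FocusStacks:
        def __init__(self):
            # One stack per role; the most recent entity sits on top.
            self.stacks = defaultdict(list)

        def push(self, role, entity):
            self.stacks[role].append(entity)

        def process_vp(self, explicit_roles):
            # Explicit roles go on top of their stacks; implicit roles
            # would be resolved by the search sketched after slide 14.
            for role, entity in explicit_roles.items():
                self.push(role, entity)

    # Walking through the example on slide 13:
    fs = FocusStacks()
    fs.process_vp({"theme": "engine E1", "from-loc": "Avon",
                   "to-loc": "Dansville"})   # U1
    fs.process_vp({"theme": "boxcar"})       # U2: Also take the boxcar
    print(fs.stacks["theme"])      # ['engine E1', 'boxcar']
    print(fs.stacks["from-loc"])   # ['Avon']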

13
Algorithm Example
U1 Take engine E1 from Avon to Dansville
Avon
Dansville
Engine E1
empty
Theme
From-Loc
To-Loc
Instrument
U2 Also take the boxcar
Avon
Dansville
boxcar
empty
Theme
From-Loc
To-Loc
Instrument
14
Implicit Role Algorithm
  • The role type determines the search method
    (sketched below). If the role is
  • Instrument: search the current utterance first
    for an entity that meets the verb's constraints;
    otherwise go back through past utterances'
    instrument and theme focus lists
  • Theme: same as above, except search theme before
    instrument for each past utterance
  • From/To-Loc: use temporal reasoning to determine
    in what order to search the past To-Loc and
    From-Loc lists for each utterance
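
A hypothetical sketch of the instrument/theme search order just
described; satisfies stands in for the verb's selectional constraints
from the TRIPS lexicon, which are not reproduced here:

    def resolve_implicit(role, current, past, satisfies):
        # current and past are per-utterance dicts of focus lists;
        # past is ordered most recent first.
        order = {"instrument": ["instrument", "theme"],
                 "theme": ["theme", "instrument"]}[role]
        for utt in [current] + past:
            for focus_list in order:
                # Focus lists are recency-ordered, so scan right to left.
                for entity in reversed(utt.get(focus_list, [])):
                    if satisfies(entity):
                        return entity
        return None  # no antecedent found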

15
Temporal Algorithm
  • For two utterances uk and uj,with k j,
    determine rel(uk, uj)
  • If time(uk) time(uj) then rel(uk, uj)
    narrative
  • Else rel(uk, uj) parallel
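
A direct rendering of the rule above, assuming event times are
comparable values (the corpus actually uses symbolic times with
ordering constraints):

    def rel(time_k, time_j):
        """Relation between utterances u_k and u_j, where k < j."""
        return "narrative" if time_k < time_j else "parallel"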

16
Experiment
  • Developed a Lisp system that automates the
    algorithm
  • For each marked implicit role, the system tries
    to find an antecedent
  • Notations
  • R-L: each focus list is searched right to left
    (order of recency)
  • L-R: each focus list is searched left to right
    (sentence order)
  • Time: the algorithm is augmented with the
    temporal algorithm
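
A small illustration of the two search orders (the list contents are
invented for illustration):

    # The same focus list scanned in recency order (R-L) vs. sentence
    # order (L-R); entries are oldest to newest.
    focus_list = ["Avon", "Corning", "Dansville"]

    def candidates(lst, strategy):
        return list(reversed(lst)) if strategy == "R-L" else list(lst)

    print(candidates(focus_list, "R-L"))  # ['Dansville', 'Corning', 'Avon']
    print(candidates(focus_list, "L-R"))  # ['Avon', 'Corning', 'Dansville']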

17
Results
18
Discussion
  • From-Loc: the naive version is better
  • To-Loc: any strategy is better than the naive one
  • Top pronoun resolution algorithms perform at
    around 70-80% accuracy
  • Problems
  • Corpus size: too small to draw concrete
    conclusions or find trends
  • Annotation scheme is basic
  • Need to handle "return" verbs properly
  • Augment the system to identify whether implicit
    roles should be resolved or not (ignore general
    cases)

19
Current Work
  • Building a larger corpus that can be annotated
    automatically using the TRIPS parser
  • Domain is much more varied and has different
    types of verbs