
1
Semantic Annotation Evaluation and Utility
  • Bonnie Dorr
  • Saif Mohammad
  • David Yarowsky
  • Keith Hall

2
Road Map
  • Project Organization
  • Semantic Annotation and Utility Evaluation
    Workshop
  • Focus Area: Informal Input
  • Belief/Opinion/Confidence (modality)
  • Dialog Acts
  • Complex Coreference (e.g., events)
  • Temporal relations
  • Interoperability
  • Current and Future Work

3
Project Organization
  • CMU (Mitamura, Levin, Nyberg): Coreference,
    Entity relations, Committed Belief
  • BBN (Ramshaw, Habash): Temporal Annotation,
    Coreference (complex)
  • UMBC (Nirenburg, McShane): Modality (polarity,
    epistemic, belief, deontic, volitive, potential,
    permissive, evaluative)
  • Columbia (Rambow, Passonneau): Dialogic Content,
    Committed Belief
  • Evaluation: Bonnie Dorr, David Yarowsky, Keith
    Hall, Saif Mohammad
  • Affiliated Efforts: Ed Hovy, Martha Palmer,
    George Wilson (Mitre)
4
Semantic Annotation Utility Evaluation Meeting (Feb 14th)
  • Site presentations included an overview of the
    phenomena covered and utility-motivating
    examples extracted from the target corpus.
  • Collective assessment of what additional
    capabilities could be enabled if a machine could
    annotate these meaning layers with near-human
    performance, relative to applications operating
    on text without such meaning-layer analysis.
  • Compatibility, interoperability, and integration
    into a larger KB environment.
  • How can we automate these processes?

5
Attendees
  • Kathy Baker (DoD)
  • Mona Diab (Columbia)
  • Bonnie Dorr (UMD)
  • Tim Finin (JHU/APL)
  • Nizar Habash (Columbia)
  • Keith Hall (JHU)
  • Eduard Hovy (USC/ISI)
  • Lori Levin (CMU)
  • James Mayfield (JHU/APL)
  • Teruko Mitamura (CMU)
  • Saif Mohammad (UMD)
  • Smaranda Muresan (UMD)
  • Sergei Nirenburg (UMBC)
  • Eric Nyberg (CMU)
  • Doug Oard (UMD)
  • Boyan Onyshkevych (DoD)
  • Martha Palmer (Colorado)
  • Rebecca Passonneau (Columbia)
  • Owen Rambow (Columbia)
  • Lance Ramshaw (BBN)
  • Clare Voss (ARL)
  • Ralph Weischedel (BBN)
  • George Wilson (Mitre)
  • David Yarowsky (JHU)

6
Analysis of Informal Input Unifies Majority of
Annotation Themes
  • Four relevant representational layers
    • Belief/Opinion/Confidence (modality)
    • Dialog Acts
    • Coreference (entities and events)
    • Temporal relations
  • Many relevant applications
    • KB population
    • Social network analysis
    • Sentiment analysis
    • Deception detection
    • Text mining
    • Question answering
    • Information retrieval
    • Summarization
  • Analysis of informal input is dynamic: a first
    analysis may be refined when subsequent informal
    input contributions are processed

7
Representational Layer 1: Committed Belief
  • Committed belief: Speaker indicates in this
    utterance that Speaker believes the proposition.
    • "I know Afghanistan and Pakistan have provided
      the richest opportunity for Al Qaeda to take
      root."
  • Non-committed belief: Speaker identifies the
    proposition as something which Speaker could
    believe, but Speaker happens not to have a
    strong belief in the proposition.
    • "Afghanistan and Pakistan may have provided
      the richest opportunity for Al Qaeda to take
      root."
  • No asserted belief: for Speaker, the proposition
    is not of a type in which Speaker is expressing
    a belief, or could express a belief. Usually,
    this is because the proposition does not have a
    truth value in this world.
    • "Did Afghanistan and Pakistan provide the
      richest opportunity for Al Qaeda to take
      root?"
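To make the three-way distinction concrete, here is a minimal sketch of cue-based labeling. The cue list and rules are illustrative assumptions, not the project's annotation guidelines, which rely on human judgment:

```python
CB, NCB, NA = "CB", "NCB", "NA"

HEDGES = {"may", "might", "could", "perhaps", "possibly"}  # weaken commitment

def classify_belief(sentence: str) -> str:
    """Label a proposition CB, NCB, or NA from crude surface cues."""
    if sentence.strip().endswith("?"):
        return NA  # a question asserts no belief about the proposition
    tokens = set(sentence.lower().rstrip(".!").split())
    if tokens & HEDGES:
        return NCB  # hedged: the speaker could believe it but is not committed
    return CB  # default: the speaker asserts the proposition as believed

for s in [
    "I know Afghanistan and Pakistan have provided the richest opportunity.",
    "Afghanistan and Pakistan may have provided the richest opportunity.",
    "Did Afghanistan and Pakistan provide the richest opportunity?",
]:
    print(classify_belief(s), "|", s)  # CB, NCB, NA respectively
```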

8
Committed Belief is not Factivity
  • CB = committed belief; NA = no asserted belief
  • Committed-belief annotation and factivity
    annotation are complementary.
  • NA cases may lead to detection of current and
    future threats, which are sometimes conditional
    and involve multiple modalities (opinion
    detection):
    • Potential: "Smith might be assassinated if he
      is in power."
    • Obligative: "Smith should be assassinated."

9
Committed Belief is not Tense
  • CB = committed belief; NA = no asserted belief
  • A special feature indicates future tense on CB
    (committed belief) and NCB (non-committed
    belief).

10
Why Is Recognizing Committed Belief Important?
  • Committed-belief annotation distinguishes:
    • Propositions that are asserted as true (CB)
    • Propositions that are asserted but speculative
      (NCB)
    • Propositions that are not asserted at all (NA)
  • Important whenever we need to identify facts
  • IR query: show documents discussing instances of
    peasants being robbed of their land
    • Document found 1: "The people robbing Iraqi
      peasants of their land should be punished."
      RELEVANT: YES
    • Document found 2: "Robbing Iraqi peasants of
      their land would be bad." RELEVANT: NO
  • QA: Did the humanitarian crisis in Iraq end?
    • Text found 1: "He arrived on Tuesday, bringing
      an end to the humanitarian crisis in Iraq."
      ANSWER: YES
    • Text found 2: "He arrived on Tuesday, calling
      for an end to the humanitarian crisis in
      Iraq." ANSWER: I DON'T KNOW
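A minimal sketch of how such belief labels, once attached to propositions, could filter retrieval results. The Proposition structure and the labels assigned below are hypothetical, chosen to mirror the IR example above:

```python
from dataclasses import dataclass

@dataclass
class Proposition:
    text: str
    belief: str  # "CB", "NCB", or "NA" for the embedded robbing event

retrieved = [
    Proposition("The people robbing Iraqi peasants of their land "
                "should be punished.", "CB"),  # robbing is presupposed as real
    Proposition("Robbing Iraqi peasants of their land would be bad.",
                "NA"),                         # robbing is only hypothetical
]

# Only propositions whose robbing event carries committed belief (CB)
# count as reported instances for the IR query.
for p in retrieved:
    print("RELEVANT:" if p.belief == "CB" else "NOT RELEVANT:", p.text)
```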

11
Representational Layer 2: Dialog Acts
  • INFORM
  • REQUEST-INFORMATION
  • REQUEST-ACTION
  • COMMIT
  • ACCEPT
  • REJECT
  • BACKCHANNEL
  • PERFORM
  • CONVENTIONAL
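A minimal sketch encoding this nine-act inventory as an enum, with one hypothetical labeled exchange (the utterances and labels below are illustrative, not from the annotated corpus):

```python
from enum import Enum, auto

class DialogAct(Enum):
    INFORM = auto()
    REQUEST_INFORMATION = auto()
    REQUEST_ACTION = auto()
    COMMIT = auto()
    ACCEPT = auto()
    REJECT = auto()
    BACKCHANNEL = auto()
    PERFORM = auto()
    CONVENTIONAL = auto()

# A hypothetical annotated email exchange:
exchange = [
    ("A: Could you send me the report by Friday?", DialogAct.REQUEST_ACTION),
    ("B: Sure, I'll have it to you Thursday.", DialogAct.COMMIT),
    ("A: Thanks!", DialogAct.CONVENTIONAL),
]
for utterance, act in exchange:
    print(f"{act.name:<20} {utterance}")
```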

12
Why is dialog analysis important?
  • Understanding the outcome of an interaction:
    • What is the outcome?
    • Who prevailed?
    • Why (status of interactants, priority of
      communicative action)?
  • Application of a common architecture to automatic
    analysis of interaction in email, blogs, phone
    conversations, ...
  • Social network analysis: Is the speaker/sender in
    an inferior position to the hearer/receiver?
    • How can we know? (e.g., a REJECT of a REQUEST;
      see the sketch after this list)
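A minimal sketch of the REJECT-of-a-REQUEST cue, assuming turns arrive pre-labeled with speaker and dialog act. The scoring rule is an illustrative assumption about how relative status might be inferred, not the project's method:

```python
from collections import defaultdict

# (speaker, dialog act) turns for one conversation; labels are hypothetical.
turns = [
    ("alice", "REQUEST_ACTION"),
    ("bob", "REJECT"),
    ("alice", "REQUEST_ACTION"),
    ("bob", "COMMIT"),
]

status = defaultdict(int)
for (spk_a, act_a), (spk_b, act_b) in zip(turns, turns[1:]):
    if act_a.startswith("REQUEST") and act_b == "REJECT":
        status[spk_b] += 1  # rejecting a request suggests higher standing
        status[spk_a] -= 1

print(dict(status))  # {'bob': 1, 'alice': -1}: bob is likely not inferior
```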

13
Representational Layer 3: Complex Coreference
(e.g., events)
  • Annotate events beyond the ACE coreference
    definition
  • ACE does not identify events as coreferent when
    one mention refers only to a part of the other
  • In ACE, a plural event mention is not coreferent
    with mentions of the component individual events
  • ACE does not annotate:
    • "Three people have been convicted ... Smith
      and Jones were found guilty of selling guns."
    • "The gunman shot Smith and his son ... the
      attack against Smith."

14
Related Events (and sub-events)
  • Events that happened
    • "Britain bombed Iraq last night."
  • Events which did not happen
    • "Hall did not speak about the bombings."
  • Planned events (planned, expected to happen,
    agreed to do)
    • "Hall planned to meet with Saddam."
  • Sub-event examples (see the sketch after this
    list)
    • drug war (contains sub-events: attacks,
      crackdowns, bullying)
    • attacks (contains sub-events: deaths,
      kidnappings, assassination, bombed)
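A minimal sketch of an event graph covering both phenomena above: part-of links between a containing event and its components, which ACE identity coreference would miss. The event IDs and link format are illustrative assumptions:

```python
events = {
    "E1": "Three people have been convicted",
    "E2": "Smith and Jones were found guilty of selling guns",
    "E3": "drug war",
    "E4": "attacks",
    "E5": "deaths",
}

# (sub-event, relation, containing event): links ACE would not annotate.
links = [
    ("E2", "part-of", "E1"),  # component convictions within the plural event
    ("E4", "part-of", "E3"),  # attacks are sub-events of the drug war
    ("E5", "part-of", "E4"),
]

def sub_events(event_id):
    """All events transitively contained in event_id."""
    direct = [s for s, rel, e in links if e == event_id and rel == "part-of"]
    return direct + [d for s in direct for d in sub_events(s)]

print(sub_events("E3"))  # ['E4', 'E5']: answers "list the drug war's attacks"
```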

15
Why is complex coreference resolution important?
  • Complex question answering:
    • Event questions: "Describe the drug war events
      in Latin America."
    • List questions: "List the events related to
      attacks in the drug war."
    • Relationship questions: "Who is attacking
      whom?"

16
Representational Layer 4: Temporal Relations
  • Baghdad 11/28 -- Senator Hall arrived in Baghdad
    yesterday. He told reporters that he will not be
    visiting Tehran before he left Washington. He
    will return next Monday.

    Time Unit   Type              Relation    Parent
    ----------  ----------------  ----------  --------
    11/28       Specific.Date     After       arrived
    arrived     Past.Event        Before      <writer>
    yesterday   Past.Date         Concurrent  arrived
    told        Past.Say          Before      arrived
    visiting    Neg.Future.Event  After       told
    left        Past.Event        After       told
    return      Future.Event      After       <writer>
    Monday      Specific.Date     Concurrent  return
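A minimal sketch of how these flat records assemble into the parse shown on the next slide. The rows mirror the table above; the grouping code is an illustrative assumption:

```python
from collections import defaultdict

rows = [  # (time unit, type, relation, parent), copied from the table above
    ("11/28", "Specific.Date", "After", "arrived"),
    ("arrived", "Past.Event", "Before", "<writer>"),
    ("yesterday", "Past.Date", "Concurrent", "arrived"),
    ("told", "Past.Say", "Before", "arrived"),
    ("visiting", "Neg.Future.Event", "After", "told"),
    ("left", "Past.Event", "After", "told"),
    ("return", "Future.Event", "After", "<writer>"),
    ("Monday", "Specific.Date", "Concurrent", "return"),
]

children = defaultdict(list)
for unit, _type, relation, parent in rows:
    children[parent].append((unit, relation))

def show(node, depth=0):
    """Print the anchoring tree rooted at node."""
    for unit, relation in children[node]:
        print("  " * depth + f"{unit} ({relation} {node})")
        show(unit, depth + 1)

show("<writer>")  # arrived and return anchor to the writer's time
```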

17
Temporal Relation Parse
[Figure: temporal relation parse, a tree ordered along a TIME axis and
rooted at <writer>: arrived (with 11/28, yesterday, and told attached)
and return (with Monday attached) anchor to <writer>; left and (not)
visiting attach to told.]
18
Temporal Relation Analysis: Inter-annotator Agreement
19
Why is Temporal Analysis Important?
  • Constructing activity schedules from text
  • Question answering (temporal): did/does/will X
    happen before/after/at the same time as Y, where
    X and Y are events, states, dates, or time
    ranges? (See the sketch below.)
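A minimal sketch of the before/after question type as graph reachability over the relations from slide 16. The edge list is an illustrative reading of those annotations, not the project's inference engine:

```python
# x -> events annotated (via a Before/After link) as coming after x
before = {
    "told": {"left", "arrived"},  # told Before arrived; left After told
    "arrived": {"<writer>"},      # arrived Before <writer>
    "<writer>": {"return"},       # return After <writer>
}

def happened_before(x, y):
    """True if a chain of before-edges leads from x to y."""
    stack, seen = list(before.get(x, ())), set()
    while stack:
        node = stack.pop()
        if node == y:
            return True
        if node not in seen:
            seen.add(node)
            stack.extend(before.get(node, ()))
    return False

print(happened_before("told", "return"))  # True: told < arrived < writing < return
print(happened_before("return", "told"))  # False: no supporting chain
```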

20
Interoperability Data
  • Common data model with multiple implementations:
    • based on the same underlying schema (formal
      object model)
    • meeting different goals / requirements
  • Implementation criteria (see the sketch below):
    • Support effective run-time annotation
    • Support an effective user interface with
      query/update
    • Support on-the-fly schema extension
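One possible shape for such an implementation, as a minimal sketch: standoff annotations over shared text, with layer schemas registered at run time. The class and field names are assumptions, not the project's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Annotation:
    layer: str   # e.g., "modality", "dialog-act", "temporal"
    start: int   # character offsets into the shared text (standoff style)
    end: int
    attrs: dict = field(default_factory=dict)

class Document:
    def __init__(self, text):
        self.text = text
        self.schemas = {}      # layer name -> set of allowed attributes
        self.annotations = []

    def register_layer(self, layer, attrs):
        """On-the-fly schema extension: add a new layer at run time."""
        self.schemas[layer] = set(attrs)

    def annotate(self, ann):
        # Reject annotations whose layer or attributes are not in the schema.
        assert set(ann.attrs) <= self.schemas[ann.layer]
        self.annotations.append(ann)

doc = Document("Smith might be assassinated if he is in power.")
doc.register_layer("modality", {"type", "polarity"})
doc.annotate(Annotation("modality", 0, 27, {"type": "potential"}))
print(doc.annotations[0])
```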

21
Example UMBC Modality Annotations
22
Ongoing and Future Work
  • Move to a new genre: informal input.
  • Establish compatibility across levels.
  • Continue examining intra-site and cross-site
    annotation agreement rates.
  • Initial assessment of the computational
    feasibility of machine learning approaches; our
    annotations are intended as fodder for ML
    approaches.
  • Implement a framework for superimposing semantic
    layers on existing objects (e.g., on top of ACE
    types).
  • Move to multiple languages.