Special Topics in Computer Science: Advanced Topics in Information Retrieval. Lecture 11: Natural Language Processing and IR (PowerPoint presentation transcript)

Description:

Electronic Commerce & Internet Application Laboratory. Special ... adjudge a prize, yield the word, confer a degree, deliver a lecture. 'get': attract attention ...

Slides: 49
Provided by: alexande95
Learn more at: http://www.gelbukh.com
Transcript and Presenter's Notes



1
Special Topics in Computer Science: Advanced
Topics in Information Retrieval. Lecture 11:
Natural Language Processing and IR.
Semantics and semantically-rich representations
  • Alexander Gelbukh
  • www.Gelbukh.com

2
Previous Lecture Conclusions
  • Syntactic structure is one of the intermediate
    representations of a text for its processing
  • Helps text understanding
  • Thus reasoning, question answering, ...
  • Directly helps POS tagging
  • Resolves lexical ambiguity of part of speech
  • But not WSD-type ambiguities
  • A big science in itself, with 50 (2000?) years of
    history

3
Previous Lecture Research topics
  • Faster algorithms
  • E.g. parallel
  • Handling linguistic phenomena not handled by
    current approaches
  • Ambiguity resolution!
  • Statistical methods
  • A lot can be done

4
Contents
  • Semantic representations
  • Semantic networks
  • Conceptual graphs
  • Simpler representations
  • Head-Modifier pairs
  • Tasks beyond IR
  • Question Answering
  • Summarization
  • Information Extraction
  • Cross-language IR

5
Syntactic representation
  • A sequence of syntactic trees.

6
Semantic analysis
7
Semantic representation
  • Complex structure of whole text

8
Semantic representation
  • Expresses the (direct) meaning of the text
  • Not what is implied
  • Free of the means of communication
  • Morphological cases (transformed to semantic
    links)
  • Word order, passive/active
  • Sentences and paragraphs
  • Pronouns (resolved)
  • Free of means of expressing
  • Synonyms (reduced to a common ID)
  • Lexical functions

9
Lexical Functions
  • The same meaning expressed by different words
  • The choice of the word is a function of other
    words
  • Few standard meanings
  • Example: Magn = 'much, very'
  • Strong wind, strong tea, strong desire
  • Thick soup
  • High temperature, high potential, high sea;
    highly expensive
  • Hard work; hardcore porno
  • Deep understanding, deep knowledge, deep appreciation
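As a toy illustration, Magn can be stored as a collocation dictionary mapping a noun to its conventional intensifier; the table entries below are illustrative assumptions drawn from the examples above, not a real lexical resource:

```python
# Sketch of a collocation dictionary for the lexical function Magn.
# The word that expresses "much, very" is a function of the noun.
MAGN = {
    "wind": "strong",
    "tea": "strong",
    "desire": "strong",
    "soup": "thick",
    "temperature": "high",
    "work": "hard",
    "understanding": "deep",
}

def magn(noun: str) -> str:
    """Return the intensifier conventionally used with this noun."""
    return MAGN[noun]

print(magn("soup"))   # thick
print(magn("wind"))   # strong
```

The point is that the function name (Magn) is language- and word-independent, while its value is not.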

10
...Lexical Functions
  • 'give'
  • pay attention
  • provide help
  • adjudge a prize
  • yield the word
  • confer a degree
  • deliver a lecture
  • 'get'
  • attract attention
  • obtain help
  • receive a degree
  • attend a lecture

11
...Syntagmatic lexical functions
  • In the semantic representation, they are
    transformed into the function name
  • Magn(wind), Magn(tea), Magn(desire)
  • Magn(soup)
  • Magn(temperature), Magn(potential), Magn(sea);
    Magn(expensive)
  • Magn(work); Magn(porno)
  • Magn(understanding), Magn(knowledge), Magn(appreciation)
  • In different languages, different words are
    used...
  • Russian: 'dense soup'; Spanish: 'loaded tea',
    'lend attention'
  • ...but the same function names.

12
Example Translation
13
...Paradigmatic lexical functions
  • Used for synonymic rephrasing
  • Need to reduce the meaning to a standard form
  • Example: Syn, hyponyms, hypernyms
  • W → Syn(W)
  • complex apparatus → complex mechanism
  • Example: Conv31, Conv24, ...
  • A V B C → C Conv31(V) B A
  • John sold the book to Mary for 5
  • Mary bought the book from John for 5
  • The book cost Mary 5
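The Conv31 rephrasing can be sketched as a small function; the verb table and the three-actant layout below are illustrative assumptions:

```python
# Sketch of a conversive lexical function: Conv31 swaps semantic
# actants 1 and 3 and replaces the verb by its conversive.
# The verb table is a hypothetical toy example.
CONV31 = {"sell": "buy"}

def conv31(actants, verb):
    """Rephrase (A, V, B, C) as (C, Conv31(V), B, A)."""
    a, b, c = actants          # actant 1, actant 2, actant 3
    return (c, CONV31[verb], b, a)

print(conv31(("John", "the book", "Mary"), "sell"))
# ('Mary', 'buy', 'the book', 'John')
```

Reducing all conversive variants to one standard form lets two differently worded sentences match.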

14
Semantic network
  • Representation of the text as a directed graph
  • Nodes are situations and entities
  • Edges are participation of an entity in a
    situation
  • Also a situation in a situation: begin reading a
    book, John died yesterday
  • A situation can be expressed with a noun: Professor
    delivered a lecture to students / Professor
    lectured to students / Lecture on history,
    memorial to heroes
  • A node can participate in many situations!
  • No division into sentences

15
Situations
  • Situations with different participants are
    different situations
  • John reads a book and Mary reads a newspaper. He
    asks her whether the newspaper is interesting.
  • Here there are two different situations of reading!
  • But the same entities John, Mary, newspaper,
    participating in different situations
  • Tense and number are described as situations
  • John reads a book
  • now(reading(John, book)), quantity(book, one)

16
Semantic valencies
  • A situation can have a few participants (up to 5)
  • Their meaning is usually very general
  • They are usually naturally ordered
  • Who (agent)
  • What (patient, object)
  • To whom (receiver)
  • With what (instrument, ...)
  • John sold the book to Mary for 5
  • So, in the network the outgoing arcs of a node
    are numbered
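The numbered outgoing arcs can be sketched as a situation node whose actants are stored under integer keys; the layout is a hypothetical illustration:

```python
# Sketch: a situation node whose outgoing arcs are numbered valencies
# (1 = agent, 2 = patient, 3 = receiver, 4 = e.g. price).
sell = {
    "predicate": "sell",
    1: "John",     # who
    2: "book",     # what
    3: "Mary",     # to whom
    4: "5",        # for what
}

def actant(situation, n):
    """Follow the outgoing arc with the given valency number."""
    return situation[n]

assert actant(sell, 1) == "John"
assert actant(sell, 3) == "Mary"
```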

17
Semantic representation
  • Complex structure of whole text

  (Figure: semantic network with situation/entity nodes Now, Give,
  ATTENTION, GOVERNMENT, IMPORTANT, COUNTRY, Possess, SCIENCE,
  Quantity, WE, linked by numbered valency arcs)
18
Reasoning and common-sense info
  • One can reason on the network
  • If John sold a book, he does not have it
  • For this, additional knowledge is needed!
  • A huge amount of knowledge to reason
  • A 9-year-old child knows some 10,000,000 simple
    facts
  • Probably some of them can be inferred, but not
    (yet) automatically
  • There were attempts to compile such knowledge
    manually
  • There is a hope to compile it automatically...

19
Semantic representation
  • ... and common-sense knowledge

20
Computer representation
  • Logical predicates
  • Arcs are arguments
  • In AI, allows reasoning
  • In IR, can allow comparison even without reasoning

21
Conceptual Graphs
  • A CG is a bipartite graph.
  • Concept nodes represent entities, attributes, or
    events (actions).
  • Relation nodes denote the kinds of relationships
    between the concept nodes.
  • John → (agnt) → love → (ptnt) → Mary
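A minimal sketch of the bipartite structure, with concept nodes and relation links kept separate; the representation is an illustrative assumption, not a data structure from the lecture:

```python
# Sketch of a conceptual graph as a bipartite structure:
# concept nodes on one side, relation nodes on the other,
# with edges only between the two kinds.
# Encodes: John <-(agnt)- love -(ptnt)-> Mary
concepts = {"John", "love", "Mary"}
relations = [
    ("love", "agnt", "John"),   # relation node linking two concepts
    ("love", "ptnt", "Mary"),
]

def neighbors(concept):
    """Concepts reachable from `concept` through one relation node."""
    out = set()
    for head, rel, tail in relations:
        if head == concept:
            out.add(tail)
        elif tail == concept:
            out.add(head)
    return out

assert neighbors("love") == {"John", "Mary"}
```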

22
(No Transcript)
23
Use in IR
  • Restrict the search to specific situations
  • Where John loves Mary, but not vice versa
  • or
  • Soften the comparison
  • Approximate search
  • Look for John loves Mary, get someone loves Mary

24
Obtaining from text
  • Algebraic formulation of flow diagrams
  • Algebraic/JJ formulation/NN of/IN flow/NN
    diagrams/NNS
  • [np, [n, formulation, sg], [adj,
    algebraic], [of, [np, [n, diagram, pl],
    [n_pos, [np, [n, flow, sg]]]]]]
  • algebraically → (manr) → formulate → (ptnt) →
    flow-diagram
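The tagged intermediate step can be split back into (word, tag) pairs with a one-liner; the tag format is the usual Penn-style word/TAG convention:

```python
# Sketch: splitting Penn-style "word/TAG" tokens, the input to the
# syntactic step that precedes conceptual-graph construction.
tagged = "Algebraic/JJ formulation/NN of/IN flow/NN diagrams/NNS"
tokens = [tuple(t.rsplit("/", 1)) for t in tagged.split()]
print(tokens)
# [('Algebraic', 'JJ'), ('formulation', 'NN'), ...]
```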

25
Steps of comparison
  • Determine the common elements (overlap) between
    the two graphs.
  • Based on the CG theory
  • Compatible common generalizations
  • Measure their similarity.
  • The similarity must be proportional to the size
    of their overlap.

26
An overlap
  • Given two conceptual graphs G1 and G2, the set of
    their common generalizations O = {g1, g2, ..., gn}
    is an overlap if
  • all common generalizations gi are compatible, and
  • the set O is maximal.

27
An example of overlap
28
Similarity measure
  • Conceptual similarity indicates the amount of
    information contained in common concepts of G1
    and G2.
  • Do they mention similar concepts?
  • Relational similarity indicates how similar the
    contexts of the common concepts in both graphs
    are.
  • Do they mention similar things about the common
    concepts?

29
Conceptual similarity
  • Analogous to the Dice coefficient.
  • Considers different weights for the different
    kinds of concepts.
  • Considers the level of generalization of the
    common concepts (of the overlap).

30
Relational Similarity
  • Analogous to the Dice coefficient.
  • Considers just the neighbors of the common
    concepts.
  • Considers different weights for the different
    kinds of conceptual relations.

31
Similarity Measure
  • Combines the conceptual and relational
    similarities.
  • Multiplicative combination: a similarity roughly
    proportional to each of the two components.
  • Relational similarity has secondary importance:
    even if no common relations exist, the pieces of
    knowledge are still similar to some degree.
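A minimal sketch of such a combined measure, assuming plain Dice coefficients for both parts and the multiplicative form sim = sc * (a + b*sr); the weighted Dice variants and concept hierarchies of the actual method are omitted:

```python
# Sketch of the combined measure: Dice coefficients for the
# conceptual and relational parts, combined multiplicatively so that
# zero relational overlap still leaves some similarity (a > 0).

def dice(x: set, y: set) -> float:
    """Dice coefficient: 2|X ∩ Y| / (|X| + |Y|)."""
    return 2 * len(x & y) / (len(x) + len(y)) if x or y else 0.0

def similarity(c1, c2, r1, r2, a=0.3, b=0.7):
    sc = dice(c1, c2)              # conceptual similarity
    sr = dice(r1, r2)              # relational similarity
    return sc * (a + b * sr)

# Toy concept/relation sets (illustrative, not from the experiment).
c1 = {"description", "fast", "procedure", "system", "equation"}
c2 = {"description", "fast", "algorithm", "list", "structure"}
print(similarity(c1, c2, {("fast", "attr")}, {("fast", "attr")}))
```

With identical relation sets, sr = 1 and the result reduces to the conceptual Dice score alone.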

32
Flexibility of the comparison
  • Configurable by the user.
  • Use different concept hierarchies.
  • Designate the importance of the different kinds
    of concepts.
  • Manipulate the importance of the conceptual and
    relational similarities.

33
Example of the flexibility
Gore criticizes Bush vs. Bush criticizes Gore
34
An Experiment
  • Use the collection CACM-3204 (articles of
    computer science).
  • We built the conceptual graphs from the document
    titles.
  • Query: Description of a fast procedure for
    solving a system of linear equations.

35
The results
  • Focus on the structural similarity, basically on
    the one caused by the entities and attributes.
  • (a = 0.3, b = 0.7, We = Wa = 10, Wv = 1)
  • One of the best matches:
  • Description of a fast algorithm for copying list
    structures.

36
The results (2)
  • Focus on the structural similarity, basically on
    the one caused by the entities and actions.
  • (a = 0.3, b = 0.7, We = Wv = 10, Wa = 1)
  • One of the best matches:
  • Solution of an overdetermined system of equations
    in the L1 norm.

37
Advantages of CGs
  • Well-known strategies for text comparison (Dice
    coefficient) with new characteristics derived
    from the CG structure.
  • The similarity is a combination of two sources of
    similarity: the conceptual similarity and the
    relational similarity.
  • Appropriate to compare small pieces of knowledge
    (other methods based on topical statistics do not
    work).
  • Two interesting characteristics: uses domain
    knowledge and allows a direct influence of the
    user.
  • Analyze the similarity between two CGs from
    different points of view.
  • Selects the best interpretation in accordance
    with the user interests.

38
Simpler representations
  • Head-Modifier pairs
  • John sold Mary an interesting book for a very low
    price
  • John sold; sold Mary; sold book; sold for price;
    interesting book; low price
  • A paper in CICLing-2004
  • Restrict your semantic representation to only two
    words
  • Shallow syntax
  • Semantics improves this representation
  • Standard form: Mary bought → John sold, etc.
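A sketch of comparing documents as sets of head-modifier pairs with the Dice coefficient; the pair sets below are toy examples built from the sentence above, and normalization (e.g. bought → sold) is assumed done beforehand:

```python
# Sketch: documents reduced to (head, modifier) pairs and compared
# with the Dice coefficient over the two pair sets.

def dice(x: set, y: set) -> float:
    return 2 * len(x & y) / (len(x) + len(y)) if x or y else 0.0

doc = {("sold", "John"), ("sold", "Mary"), ("sold", "book"),
       ("sold", "price"), ("book", "interesting"), ("price", "low")}
query = {("sold", "book"), ("book", "interesting")}

print(dice(doc, query))  # 2*2/(6+2) = 0.5
```

Unlike bag-of-words, the pairs keep who did what: ("sold", "book") does not match ("book", "sold-to").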

39
Tasks beyond IR Question Answering
  • User information need
  • An answer to a question
  • Not a bunch of docs
  • Who won Nobel Peace Prize in 1992? (35500 docs)

40
...QA
  • Answer Rigoberta Menchú Tum
  • Logical methods
  • Understand the text
  • Reason on it
  • Construct the answer
  • Generate the text expressing it
  • Statistical methods (no or little semantics)
  • Look at which word is repeated in the docs
  • Perhaps try to understand something around it

41
...Better QA
  • What if the info is not in a single document?
  • Who is the queen of Spain?
  • King of Spain is Juan Carlos
  • Wife of Juan Carlos is Sofía
  • (Wife of a king is a queen)
  • Logical reasoning may prove useful
  • In practice, the degree of understanding is not
    yet enough
  • We are working to improve it

42
Tasks beyond IR Passage Extraction
  • If the answer is long: a story
  • What do you know on wars between England and
    France?
  • Or if we cannot detect the simple answer
  • Then find short pieces of the text where the
    answer is
  • Can be done even with keywords
  • Find passages with many keywords
  • (Kang et al. 2004): Choose passages with greatest
    vector similarity. Too short: too few keywords;
    too long: penalized by normalization
  • Awful quality
  • Reasoning can help
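A much-simplified sketch of keyword-based passage extraction: score fixed-size windows by length-normalized keyword overlap. This is an illustrative stand-in, not the actual vector-similarity method of Kang et al. (2004):

```python
# Sketch: slide a fixed-size window over the text and keep the
# window with the highest normalized keyword-overlap score.

def best_passage(words, keywords, size=6):
    keywords = set(keywords)
    best, best_score = None, -1.0
    for i in range(max(1, len(words) - size + 1)):
        window = words[i:i + size]
        score = sum(w in keywords for w in window) / len(window)
        if score > best_score:
            best, best_score = window, score
    return " ".join(best)

text = ("the war between England and France lasted many years "
        "other topics follow here").split()
print(best_passage(text, {"war", "England", "France"}))
```

The normalization keeps long keyword-poor windows from winning, which is the "too long: normalized" point above.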

43
Tasks beyond IR Summarization
  • And what if the answer is not in a short passage?
  • Summarize: say the same (without unimportant
    details) but in fewer words
  • Now: statistical methods
  • Reasoning can help

44
Tasks beyond IR Information Extraction
  • Question answering on a massive basis
  • Fill a database with the answers
  • Example: what company bought what company, and
    when?
  • A database of three columns
  • Now (statistical) patterns
  • Reasoning can help

45
Cross-lingual IR
  • Question in one language, answer in another
    language
  • Or question and summary of the answer in
    English, over a database in Chinese
  • Is a kind of translation, but simpler
  • Thus can be done more reliably
  • A transformation into semantic network can
    greatly help

46
Research topics
  • Recognition of the semantic structure
  • Convert text to conceptual graphs
  • All kinds of disambiguation
  • Shallow semantic representations
  • Application of semantic representations to
    specific tasks
  • Similarity measures on semantic representations
  • Reasoning and IR

47
Conclusions
  • Semantic representation gives meaning
  • Language-specific constructions used only in
    the process of communication are removed
  • Network of entities / situations and predicates
  • Allows for translation and logical reasoning
  • Can improve IR
  • Compare the query with the doc by meaning, not
    words
  • Search for a specific situation
  • Search for an approximate situation
  • QA, summarization, IE
  • Cross-lingual IR

48
Thank you! Till June 15? 6 pm Thesis
presentation? Oral test?