1
Transfer Learning With Markov Logic Networks
  • Lilyana Mihalkova and Raymond Mooney
  • University of Texas at Austin

ICML-06 Workshop on Structural Knowledge Transfer
for Machine Learning, June 29, 2006
2
Markov Logic Networks
Richardson & Domingos, 2006
  • Set of first-order formulae, each with a weight
    attached
  • Templates for constructing Markov networks, when
    a set of constants is provided
  • Include a node for each grounding of each
    predicate in the MLN
  • Include a feature for each grounding of each
    formula in the MLN
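
For reference, the joint distribution over possible worlds that an MLN defines (Richardson & Domingos, 2006), where w_i is the weight of formula i and n_i(x) is the number of its true groundings in world x:

    P(X = x) = \frac{1}{Z} \exp\Big( \sum_i w_i \, n_i(x) \Big),
    \qquad Z = \sum_{x'} \exp\Big( \sum_i w_i \, n_i(x') \Big)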

3
MLNs: Inference
Richardson & Domingos, 2006
  • Given a set of unknown query ground literals Q
    and a set of evidence ground literals E, find
    P(Q | E)
  • Done using Gibbs sampling

4
MLNs: Inference (Cont.)
Richardson & Domingos, 2006
  • Recomputing P(X | MB_X) in the basic Gibbs step (formula below)
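
The conditional recomputed at each Gibbs step, following Richardson & Domingos (2006), for a ground atom X_l given its Markov blanket MB_l, where F_l is the set of ground formulas containing X_l:

    P(X_l = x_l \mid MB_l) =
      \frac{\exp\big(\sum_{f_i \in F_l} w_i \, f_i(X_l = x_l,\, MB_l)\big)}
           {\exp\big(\sum_{f_i \in F_l} w_i \, f_i(X_l = 0,\, MB_l)\big)
            + \exp\big(\sum_{f_i \in F_l} w_i \, f_i(X_l = 1,\, MB_l)\big)}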

5
MLN Structure Learning
Kok & Domingos, 2005
  • Can start from scratch or from provided MLN
  • Proceeds by using greedy beam-search
  • Refinements to a clause are generated by:
    • Adding a literal in all possible ways
    • Trying all possible literal deletions (only from clauses of the original source MLN)
    • Trying all possible sign flips of the literals
  • Candidates are scored using a weighted pseudo-log-likelihood measure
  • The best b candidates are kept, and in the next iteration new refinements are generated from them (sketched below)
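
A minimal sketch of this greedy beam search, under the assumption that clauses are opaque objects; add_literal_all_ways, delete_each_literal, flip_each_literal_sign, and score_wpll (weighted pseudo-log-likelihood) are hypothetical helpers, not Alchemy's actual API.

    def beam_search(initial_clauses, data, beam_width=5, max_iters=10):
        """Greedy beam search over clause refinements (illustrative sketch)."""
        beam = list(initial_clauses)
        best = max(beam, key=lambda c: score_wpll(c, data))
        for _ in range(max_iters):
            candidates = []
            for clause in beam:
                candidates += add_literal_all_ways(clause)    # add a literal in all possible ways
                candidates += delete_each_literal(clause)     # try all literal deletions
                candidates += flip_each_literal_sign(clause)  # try all sign flips
            if not candidates:
                break
            candidates.sort(key=lambda c: score_wpll(c, data), reverse=True)
            beam = candidates[:beam_width]                    # keep the best b candidates
            if score_wpll(beam[0], data) > score_wpll(best, data):
                best = beam[0]
        return best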

6
Setting and Assumptions
  • Given:
    • An MLN, S, learned in the source relational domain
    • A mapping between predicates in the source and target domains (illustrated below)
  • Find out which parts of S are still valid in the new domain and which need to be relearned
  • Perform relearning of the incorrect structure
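
As a simple illustration of what a predicate mapping looks like, the names below are hypothetical examples, not the mapping actually used in the experiments:

    # Hypothetical predicate mapping from a source (academic) domain to a
    # target (industrial) domain; these names are illustrative only.
    predicate_mapping = {
        "AdvisedBy": "SupervisedBy",
        "Professor": "Manager",
        "Student":   "Employee",
    }

    def translate_clause(clause, mapping):
        """Rename predicates in a clause given as (predicate, args, sign) triples."""
        return [(mapping.get(pred, pred), args, sign) for pred, args, sign in clause]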

7
Transfer Learning as Revision
  • Regard source MLN as an incorrect model for the
    target task that needs to be accurately and
    efficiently revised
  • Thus our general approach is similar to that taken by revision systems (Richards & Mooney, 1995)
  • Revisions are proposed in a bottom-up fashion

8
New Algorithm
[Algorithm overview diagram; labeled elements include "Relational Data" and "Change in pseudo log-likelihood"]
9
Self-Diagnosis Preliminaries
  • From the perspective of a particular ground literal, a clause can be in one of four diagnostic bins:
    • Applies, good
    • Applies, bad
    • Does not apply, good
    • Does not apply, bad

Example (true ground literals): Student(Ann), !HasJob(Ann,J), Sleepy(Ann), Sociable(Ann), InClass(Ann,C)
  InClass(Ann,C) => Student(Ann)
  Sociable(Ann) => !Student(Ann)    [Too General]
  !Sleepy(Ann) => !Student(Ann)
  HasJob(Ann,J) => Student(Ann)     [Too Specific]
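
A sketch of how a clause might be assigned to one of the four bins for a single ground literal. It relies on a simplified reading of this slide (not necessarily the authors' exact definitions): a clause "applies" when every other literal in it is false in the data, so it effectively concludes a value for the query literal; an applying clause is "good" when that concluded value matches the data; a non-applying clause is "good" when inference nevertheless got the literal right. The clause/literal/world interfaces are hypothetical.

    def diagnostic_bin(clause, query_literal, world, inference_correct):
        """Assign a clause to a diagnostic bin w.r.t. one ground literal (sketch)."""
        others = [lit for lit in clause.literals if lit.atom != query_literal.atom]
        # The clause "applies" if every other literal is false in the data.
        applies = all(world.truth(lit.atom) != lit.sign for lit in others)
        if applies:
            sign_in_clause = next(lit.sign for lit in clause.literals
                                  if lit.atom == query_literal.atom)
            good = (sign_in_clause == world.truth(query_literal.atom))
        else:
            good = inference_correct
        return ("applies" if applies else "does not apply",
                "good" if good else "bad")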
10
Self-Diagnosis
  • Perform Gibbs sampling with each predicate P in
    turn serving as query
  • In addition to recomputing the probability of each ground literal, calculate the bin memberships of the participating clauses (sketched below)
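
A sketch of the overall self-diagnosis pass, reusing the hypothetical diagnostic_bin helper above; gibbs_marginals and the MLN/world interfaces are placeholders, not the authors' implementation.

    from collections import Counter

    def self_diagnose(mln, world, threshold=0.5):
        """Tally diagnostic-bin counts for every clause (illustrative sketch)."""
        counts = {clause: Counter() for clause in mln.clauses}
        for predicate in mln.predicates:
            # Treat this predicate's groundings as query, everything else as evidence;
            # gibbs_marginals returns the estimated P(literal is true) for each one.
            marginals = gibbs_marginals(mln, world, query=predicate)
            for literal, p_true in marginals.items():
                inference_correct = (p_true >= threshold) == world.truth(literal.atom)
                for clause in mln.clauses_containing(literal):
                    counts[clause][diagnostic_bin(clause, literal, world,
                                                  inference_correct)] += 1
        return counts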

11
Structure Updates
  • Using directed beam search (sketched below):
    • Literal deletions are attempted only on clauses marked as too specific
    • Literal additions only on clauses determined to be too general
    • Restrictions carry over to derivatives of the original clauses
  • The search space is constrained by:
    • Limiting the clauses considered for updates
    • Restricting the type of update allowed
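
A sketch of how the diagnosis might gate the refinement operators during the directed beam search; bin_counts comes from the hypothetical self_diagnose above, the operator helpers are the same placeholders used earlier, and min_bad is an assumed threshold.

    def generate_refinements(clause, bin_counts, min_bad=5):
        """Propose only the kinds of refinement the diagnosis calls for (sketch)."""
        candidates = []
        if bin_counts[clause][("applies", "bad")] >= min_bad:          # too general: constrain it
            candidates += add_literal_all_ways(clause)
        if bin_counts[clause][("does not apply", "bad")] >= min_bad:   # too specific: relax it
            candidates += delete_each_literal(clause)
        return candidates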

12
Experiments: Domains
[Diagrams of the two domains: Academic and Industrial]
13
Experiments: Data
  • Each training mega-example is a world
    representing the domain
  • Consists of roughly 50-150 true ground literals
  • Generated by fixing the values of some predicate
    groundings and performing MAP inference over a
    hand-crafted MLN to set the rest

14
Experiments: Details
  • Systems compared:
    • ScratchAlchemy (Kok & Domingos, 2005)
    • TransferAlchemy (Kok & Domingos, 2005)
    • TransferNew
  • Metrics (see the sketch below):
    • Conditional log-likelihood (CLL)
    • Area under the precision-recall curve (AUC)
    • Running time
    • Number of candidate clauses considered
  • Test predicates: SupervisedBy and Secretary
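
A sketch of the two accuracy metrics, computed with NumPy and scikit-learn over per-literal predicted probabilities; these are the standard definitions of CLL and AUC-PR, not the authors' evaluation code.

    import numpy as np
    from sklearn.metrics import auc, precision_recall_curve

    def conditional_log_likelihood(y_true, p_true, eps=1e-9):
        """Average log probability assigned to the true value of each test literal."""
        p = np.clip(np.asarray(p_true, dtype=float), eps, 1 - eps)
        y = np.asarray(y_true)
        return float(np.mean(np.where(y == 1, np.log(p), np.log(1 - p))))

    def pr_auc(y_true, p_true):
        """Area under the precision-recall curve."""
        precision, recall, _ = precision_recall_curve(y_true, p_true)
        return float(auc(recall, precision))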

15
Results: CLL
16
Results: AUC
17
Results: Running Time
[Chart comparing TransferAlchemy and TransferNew]
18
Results: Number of Candidates
[Chart comparing TransferAlchemy and TransferNew]
19
Conclusions
  • Presented a new transfer learning algorithm that:
    • Explicitly takes advantage of similarities between tasks
    • Focuses on relearning only the incorrect portions of the domain
    • Decreases both the search space and the running time