1
Context-Specific Independence in Bayesian Networks
  • Boutilier, Friedman, Goldszmidt, and Koller
  • Presented by Vincent Conitzer for cs282

2
Part 1 -- Setup
  • Context-specific independence introduction
  • Local CSI
  • Tree representation of CPTs
  • Using the tree representation for inferring local
    CSI

3
Context-Specific Independence
  • Where's Waldo example
  • Would like to be able to conclude I(Wet, Rain
    in Paris | WW = London), but not I(Wet, Rain in
    Paris | WW = Paris) -- the independence is
    context-specific

[Figure: Bayesian network with Rain in London, Rain in Paris, and Where's Waldo as parents of Waldo Wet.]
4
Local CSIs
  • Local CSIs indicate which of the parents you
    don't need to know to instantiate the child in a
    context
  • In the context, the edge between such a parent
    and the child is then called vacuous
  • In order to find out about independence in a
    context, just use d-separation on the network
    after removing vacuous edges (CSI-separation)
  • CSI-separation gives the strongest CSI
    conclusions possible given just the local CSIs
    (see the sketch below)
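
A minimal sketch of CSI-separation in Python, assuming the local
CSIs for the current context are already available as a set of
vacuous (parent, child) edges; the names are illustrative, and the
d-separation test is delegated to networkx's d_separated.

    import networkx as nx

    def csi_separated(G, X, Y, Z, vacuous_edges):
        # Remove the edges that are vacuous in the current context,
        # then run ordinary d-separation on what remains.
        H = G.copy()
        H.remove_edges_from(vacuous_edges)
        return nx.d_separated(H, set(X), set(Y), set(Z))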

5
How should we represent CPTs?
  • We want to represent our CPTs in a way that
    allows for easy recognition of local CSIs. The
    authors focus on decision trees, or CPT-trees.
    E.g., the trees for Waldo Wet below (a
    data-structure sketch follows the figure)
  • Note that the structure of the second tree
    intuitively does not seem to represent I(Wet,
    Rain in London | WW = Paris)

[Figure: two alternative CPT-trees for Waldo Wet, with leaf probabilities 0, .8, and .9. The first splits on Where's Waldo at the root and then tests only the matching rain variable; the second splits on Rain in London first, duplicating subtrees and obscuring the local CSI.]
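
A minimal CPT-tree sketch in Python; the class and field names are
illustrative, and the exact placement of the leaf probabilities
(0, .8, .9) is assumed from the figure above.

    class CPTNode:
        def __init__(self, var=None, children=None, prob=None):
            self.var = var                  # decision variable (None at a leaf)
            self.children = children or {}  # value -> subtree
            self.prob = prob                # P(Wet = true | path) at a leaf

    # The first (good) tree: split on Where's Waldo, then test only
    # the rain variable that matters in that context.
    wet_tree = CPTNode('WW', {
        'L': CPTNode('RainLondon', {'N': CPTNode(prob=0.0),
                                    'Y': CPTNode(prob=0.8)}),
        'P': CPTNode('RainParis',  {'N': CPTNode(prob=0.0),
                                    'Y': CPTNode(prob=0.9)}),
    })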
6
Reducing CPTs in a Context
  • To reduce a CPT given a context, we cut out every
    decision node whose variable is known, keeping
    only the subtree corresponding to the
    instantiated value
  • If a variable is eliminated from the tree, its
    edge is vacuous
  • This method is complete given only the tree
    structure (a sketch follows the figure)

[Figure: the same pair of CPT-trees as above, shown being reduced in a context.]
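
A sketch of the reduction, reusing the hypothetical CPTNode class
from above; any variable missing from the reduced tree has a vacuous
edge in that context.

    def reduce_tree(node, context):
        # Leaf: nothing to cut.
        if node.var is None:
            return node
        # Known variable: drop the decision node and keep only the
        # branch matching its instantiated value.
        if node.var in context:
            return reduce_tree(node.children[context[node.var]], context)
        # Unknown variable: keep the node, reduce its subtrees.
        return CPTNode(node.var, {v: reduce_tree(c, context)
                                  for v, c in node.children.items()})

    def tree_vars(node):
        # Variables still mentioned in a (reduced) tree.
        if node.var is None:
            return set()
        return {node.var}.union(*(tree_vars(c) for c in node.children.values()))

E.g., reduce_tree(wet_tree, {'WW': 'L'}) no longer mentions
RainParis, so the edge from Rain in Paris to Waldo Wet is vacuous in
that context.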
7
Part 2 -- Inference
  • Inference through compactifying the network
  • Inference through clustering

8
Compact Representation through CSI (1)
  • We can use the information from the CSIs to come
    up with a more compact representation of the
    network, as in noisy-OR models etc., for faster
    inference
  • Idea: condition on one parent to create
    conditional values, then instantiate the parent
    and pick the appropriate value (see the sketch
    after the figure)

[Figure: transformed network. Rain in Paris is the parent of Wet|WW=P and Rain in London of Wet|WW=L; together with Where's Waldo, these feed a multiplexer node Wet.]
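
A sketch of why the decomposition preserves the distribution, using
the leaf probabilities assumed earlier: each conditional value
depends on a single rain variable, and Where's Waldo
deterministically selects between them.

    def p_wet(ww, rain_london, rain_paris):
        # Conditional values Wet|WW=L and Wet|WW=P, one parent each.
        p_wet_given_L = 0.8 if rain_london == 'Y' else 0.0
        p_wet_given_P = 0.9 if rain_paris == 'Y' else 0.0
        # Deterministic multiplexer: Where's Waldo selects the value.
        return p_wet_given_L if ww == 'L' else p_wet_given_P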
9
Compact Representation through CSI (2)
  • If the previous step does not use all the
    structure from the CPT-tree, we can repeat it.
    E.g., suppose we use the bad tree

[Figure: two-step decomposition for the bad tree. Step 1 conditions on Rain in London, producing Wet|RL=T and Wet|RL=F; step 2 conditions each of these on Where's Waldo, producing nodes such as (Wet|RL=T)|WW=P, all recombined through multiplexers into Wet.]
10
Clustering with CSI (1)
  • The idea of clustering is to instantiate
    variables, remove outgoing edges, and continue.
    But with CSI, we can remove more edges for some
    instantiations
  • Normally, we would need to instantiate two
    variables to get a polytree, but now we only need
    the Where variable (see the sketch after the
    figure)

[Figure: network over Rain in London, Where is couple, Rain in Paris, Waldo Wet, and Wilda Wet.]
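
A sketch of the CSI-aware cutset step, assuming a hypothetical
vacuous_for callback that reports which edges the local CSIs make
vacuous in a context; instantiating a variable removes its outgoing
edges plus the newly vacuous ones, and we then test whether a
polytree remains.

    import networkx as nx

    def instantiate(G, var, value, context, vacuous_for):
        # Extend the context and cut the variable's outgoing edges,
        # plus any edges made vacuous in the new context.
        context = {**context, var: value}
        H = G.copy()
        H.remove_edges_from(list(G.out_edges(var)))
        H.remove_edges_from(vacuous_for(context))
        return H, context

    def is_polytree(G):
        # A polytree's undirected skeleton is a forest.
        return nx.is_forest(G.to_undirected())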
11
Clustering with CSI (2)
  • How do we pick which variable to instantiate
    first? Naïve greedy approach: pick the one that
    on average removes the most edges. But consider:
  • We want to instantiate the left two variables
    first -- this always gives a polytree. But
    instantiating only one of them does not help at
    all.

[Figure: network over Rain in London, Waldo Left Seat, Rain in Paris, Waldo Driver, Waldo Wet, and Wilda Wet.]
12
Clustering with CSI (3)
  • Solution: look at reduced tree sizes (a sketch
    follows the figure).

[Figure: reduced CPT-trees after instantiating Waldo Left Seat or Waldo Driver, versus after instantiating the rain variables.]
Eliminating one of the top two variables leaves a smaller number of
probabilities in the tree, suggesting they are better candidates for
instantiation.
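
A sketch of the tree-size heuristic, reusing the reduce_tree sketch
from earlier; the exact scoring rule here is an assumption, scoring a
candidate by how many probabilities survive across all CPT-trees once
it is instantiated (smaller is better).

    def leaf_count(node):
        # Number of probabilities stored at the leaves of a tree.
        if node.var is None:
            return 1
        return sum(leaf_count(c) for c in node.children.values())

    def score(var, values, trees):
        # Average, over the variable's values, of the total number of
        # probabilities left in all reduced trees.
        return sum(leaf_count(reduce_tree(t, {var: v}))
                   for v in values for t in trees) / len(values)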