1
MAP Estimation Algorithms in
Computer Vision - Part I
  • M. Pawan Kumar, University of Oxford
  • Pushmeet Kohli, Microsoft Research

2
Aim of the Tutorial
  • Description of some successful algorithms
  • Computational issues
  • Enough details to implement
  • Some proofs will be skipped :-(
  • But references to them will be given :-)

3
A Vision Application
Binary Image Segmentation
How?
Cost function
Models our knowledge about natural images
Optimize cost function to obtain the segmentation
4
A Vision Application
Binary Image Segmentation
Graph G = (V, E)
Object - white, Background - green/grey
Each vertex corresponds to a pixel
Edges define a 4-neighbourhood grid graph
Assign a label to each vertex from L = {obj, bkg}
5
A Vision Application
Binary Image Segmentation
Graph G = (V, E)
Object - white, Background - green/grey
Per Vertex Cost
Cost of a labelling f : V → L
Cost of label bkg high
Cost of label obj low
6
A Vision Application
Binary Image Segmentation
Graph G = (V, E)
Object - white, Background - green/grey
Per Vertex Cost
Cost of a labelling f : V → L
Cost of label bkg low
Cost of label obj high
UNARY COST
7
A Vision Application
Binary Image Segmentation
Graph G = (V, E)
Object - white, Background - green/grey
Per Edge Cost
Cost of a labelling f : V → L
Cost of same label low
Cost of different labels high
8
A Vision Application
Binary Image Segmentation
Graph G = (V, E)
Object - white, Background - green/grey
Per Edge Cost
Cost of a labelling f : V → L
Cost of same label high
PAIRWISE COST
Cost of different labels low
9
A Vision Application
Binary Image Segmentation
Graph G = (V, E)
Object - white, Background - green/grey
Problem: Find the labelling f* with minimum cost
10
A Vision Application
Binary Image Segmentation
Graph G = (V, E)
Problem: Find the labelling f* with minimum cost
11
Another Vision Application
Object Detection using Parts-based Models
How?
Once again, by defining a good cost function
12
Another Vision Application
Object Detection using Parts-based Models
[Figure: tree-structured parts model with head H, torso T, limbs L1-L4; candidate position 1]
Graph G = (V, E)
Each vertex corresponds to a part - Head, Torso, Legs
Edges define a TREE
Assign a label to each vertex from L = {positions}
13
Another Vision Application
Object Detection using Parts-based Models
[Figure: the same parts model at candidate position 2]
Graph G = (V, E)
Each vertex corresponds to a part - Head, Torso, Legs
Edges define a TREE
Assign a label to each vertex from L = {positions}
14
Another Vision Application
Object Detection using Parts-based Models
[Figure: the same parts model at candidate position 3]
Graph G = (V, E)
Each vertex corresponds to a part - Head, Torso, Legs
Edges define a TREE
Assign a label to each vertex from L = {positions}
15
Another Vision Application
Object Detection using Parts-based Models
[Figure: parts model at candidate position 3]
Graph G = (V, E)
Cost of a labelling f : V → L
Unary cost: How well does part match image patch?
Pairwise cost: Encourages valid configurations
Find best labelling f*
16
Another Vision Application
Object Detection using Parts-based Models
[Figure: parts model at candidate position 3]
Graph G = (V, E)
Cost of a labelling f : V → L
Unary cost: How well does part match image patch?
Pairwise cost: Encourages valid configurations
Find best labelling f*
17
Yet Another Vision Application
Stereo Correspondence
Disparity Map
How?
Minimizing a cost function
18
Yet Another Vision Application
Stereo Correspondence
Graph G = (V, E)
Vertex corresponds to a pixel
Edges define grid graph
L = {disparities}
19
Yet Another Vision Application
Stereo Correspondence
Cost of labelling f: Unary cost + Pairwise cost
Find minimum cost f*
20
The General Problem
[Figure: graph with vertices a-f, each assigned a label from {1, 2, 3, ...}]
Graph G = (V, E)
Discrete label set L = {1, 2, ..., h}
Assign a label to each vertex: f : V → L
Cost of a labelling Q(f)
Unary Cost + Pairwise Cost
Find f* = arg min Q(f)
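Before the formal development, a minimal code sketch may help make Q(f) concrete. Everything below (the names, the toy numbers) is illustrative, not taken from the slides:

```python
# Minimal sketch: Q(f) = sum of unary costs + sum of pairwise costs.
# All names and numbers are illustrative assumptions, not the tutorial's.

def energy(f, unary, pairwise, edges):
    """f[a] is the label of vertex a; unary[a][i] is the cost of label i at a;
    pairwise[(a, b)][i][k] is the cost of labels (i, k) on edge (a, b)."""
    q = sum(unary[a][f[a]] for a in range(len(f)))
    q += sum(pairwise[(a, b)][f[a]][f[b]] for (a, b) in edges)
    return q

# Tiny example: 2 vertices, 2 labels, one edge.
unary = [[5, 2], [2, 4]]                    # unary[a][i]
pairwise = {(0, 1): [[0, 1], [1, 0]]}       # pairwise[(a, b)][i][k]
edges = [(0, 1)]
print(energy([1, 0], unary, pairwise, edges))  # Q = 2 + 2 + 1 = 5
```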
21
Outline
  • Problem Formulation
  • Energy Function
  • MAP Estimation
  • Computing min-marginals
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing

22
Energy Function
[Figure: chain Va-Vb-Vc-Vd with labels l0, l1 and observed data Da-Dd]
Random Variables V = {Va, Vb, ...}
Labels L = {l0, l1, ...}
Data D
Labelling f : {a, b, ...} → {0, 1, ...}
23
Energy Function
[Figure: chain Va-Vb-Vc-Vd with unary potentials for labels l0 and l1]
Easy to minimize
Q(f) = Σa θa;f(a)
Neighbourhood
Unary Potential
24
Energy Function
[Figure: chain Va-Vb-Vc-Vd with unary potentials for labels l0 and l1]
E: (a, b) ∈ E iff Va and Vb are neighbours
E = {(a, b), (b, c), (c, d)}
25
Energy Function
[Figure: chain with unary potentials and pairwise potentials on the edges]
Pairwise Potential
Q(f) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
26
Energy Function
[Figure: chain with unary and pairwise potentials]
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
Parameter θ
27
Outline
  • Problem Formulation
  • Energy Function
  • MAP Estimation
  • Computing min-marginals
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing

28
MAP Estimation
[Figure: chain Va-Vb-Vc-Vd with unary and pairwise potentials for labels l0, l1]
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
29
MAP Estimation
[Figure: one labelling highlighted on the chain]
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
= 2 + 1 + 2 + 1 + 3 + 1 + 3 = 13
30
MAP Estimation
[Figure: another labelling highlighted on the chain]
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
31
MAP Estimation
[Figure: another labelling highlighted on the chain]
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
= 5 + 1 + 4 + 0 + 6 + 4 + 7 = 27
32
MAP Estimation
[Figure: chain with unary and pairwise potentials]
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
q* = min Q(f; θ) = Q(f*; θ)
f* = arg min Q(f; θ)
33
MAP Estimation
f* = {1, 0, 0, 1},  q* = 13
16 possible labellings:

f(a) f(b) f(c) f(d) | Q(f; θ)        f(a) f(b) f(c) f(d) | Q(f; θ)
  0    0    0    0  |   18             1    0    0    0  |   16
  0    0    0    1  |   15             1    0    0    1  |   13
  0    0    1    0  |   27             1    0    1    0  |   25
  0    0    1    1  |   20             1    0    1    1  |   18
  0    1    0    0  |   22             1    1    0    0  |   18
  0    1    0    1  |   19             1    1    0    1  |   15
  0    1    1    0  |   27             1    1    1    0  |   23
  0    1    1    1  |   20             1    1    1    1  |   16
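A table like the one above can be reproduced by brute force. Here is a hedged sketch with illustrative toy potentials (not the slide's numbers), just to show what the exhaustive enumeration looks like before the next slides rule it out:

```python
from itertools import product

# Brute-force MAP for a 4-variable, 2-label chain: enumerate all 2^4 labellings.
# Potentials are illustrative placeholders, not the slide's numbers.
unary = [[5, 2], [2, 4], [4, 6], [6, 3]]            # unary[a][i]
edges = [(0, 1), (1, 2), (2, 3)]
pairwise = {e: [[0, 1], [1, 0]] for e in edges}     # Potts-like edge costs

def Q(f):
    q = sum(unary[a][f[a]] for a in range(4))
    q += sum(pairwise[(a, b)][f[a]][f[b]] for (a, b) in edges)
    return q

f_star = min(product([0, 1], repeat=4), key=Q)
print(f_star, Q(f_star))
```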
34
Computational Complexity
Segmentation
2^|V| possible labellings
|V| = number of pixels = 320 × 480 = 153600
35
Computational Complexity
Detection
|L|^|V| possible labellings
|L| = number of pixels = 153600
36
Computational Complexity
Stereo
|L|^|V| possible labellings
|V| = number of pixels = 153600
Can we do better than brute force?
MAP Estimation is NP-hard!
37
Computational Complexity
Stereo
|L|^|V| possible labellings
|V| = number of pixels = 153600
Exact algorithms do exist for special cases
Good approximate algorithms for general case
But first, two important definitions
38
Outline
  • Problem Formulation
  • Energy Function
  • MAP Estimation
  • Computing min-marginals
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing

39
Min-Marginals
[Figure: chain Va-Vb-Vc-Vd with unary and pairwise potentials for labels l0, l1]
Min-marginal: qa;i = minf Q(f; θ) such that f(a) = i
Not a marginal (no summation)
40
Min-Marginals
qa;0 = 15 (minimum over all labellings with f(a) = 0)
16 possible labellings:

f(a) f(b) f(c) f(d) | Q(f; θ)        f(a) f(b) f(c) f(d) | Q(f; θ)
  0    0    0    0  |   18             1    0    0    0  |   16
  0    0    0    1  |   15             1    0    0    1  |   13
  0    0    1    0  |   27             1    0    1    0  |   25
  0    0    1    1  |   20             1    0    1    1  |   18
  0    1    0    0  |   22             1    1    0    0  |   18
  0    1    0    1  |   19             1    1    0    1  |   15
  0    1    1    0  |   27             1    1    1    0  |   23
  0    1    1    1  |   20             1    1    1    1  |   16
41
Min-Marginals
qa;1 = 13 (minimum over all labellings with f(a) = 1)
16 possible labellings:

f(a) f(b) f(c) f(d) | Q(f; θ)        f(a) f(b) f(c) f(d) | Q(f; θ)
  1    0    0    0  |   16             0    0    0    0  |   18
  1    0    0    1  |   13             0    0    0    1  |   15
  1    0    1    0  |   25             0    0    1    0  |   27
  1    0    1    1  |   18             0    0    1    1  |   20
  1    1    0    0  |   18             0    1    0    0  |   22
  1    1    0    1  |   15             0    1    0    1  |   19
  1    1    1    0  |   23             0    1    1    0  |   27
  1    1    1    1  |   16             0    1    1    1  |   20
42
Min-Marginals and MAP
  • Minimum min-marginal of any variable
  • = energy of MAP labelling

mini qa;i = mini ( minf Q(f; θ) such that f(a) = i ) = minf Q(f; θ)

Va has to take one label
43
Summary
Energy Function
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
MAP Estimation
f* = arg min Q(f; θ)
Min-marginals
qa;i = min Q(f; θ) s.t. f(a) = i
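The three definitions in this summary translate directly into code. A brute-force sketch of qa;i on the same illustrative toy chain as before (placeholder numbers, not the slides'):

```python
from itertools import product

# Brute-force min-marginals q_{a;i} = min over labellings f with f(a) = i.
# Illustrative toy chain; not the slide's numbers.
unary = [[5, 2], [2, 4], [4, 6], [6, 3]]
edges = [(0, 1), (1, 2), (2, 3)]
pairwise = {e: [[0, 1], [1, 0]] for e in edges}

def Q(f):
    return (sum(unary[a][f[a]] for a in range(4)) +
            sum(pairwise[(a, b)][f[a]][f[b]] for (a, b) in edges))

def min_marginal(a, i):
    return min(Q(f) for f in product([0, 1], repeat=4) if f[a] == i)

# Minimum over labels of a min-marginal equals the MAP energy (slide 42).
print(min_marginal(0, 0), min_marginal(0, 1))
print(min(Q(f) for f in product([0, 1], repeat=4)))
```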
44
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing

45
Reparameterization
f(a) f(b) | Q(f; θ)
  0    0  |   7
  0    1  |  10
  1    0  |   5
  1    1  |   6

[Figure: two-node graph Va-Vb; the constant 2 is added on one side and subtracted on the other]
Add a constant to all θa;i
Subtract that constant from all θb;k
46
Reparameterization
f(a) f(b) | Q(f; θ')
  0    0  |  7 + 2 − 2
  0    1  | 10 + 2 − 2
  1    0  |  5 + 2 − 2
  1    1  |  6 + 2 − 2

[Figure: same two-node graph after adding 2 to all θa;i and subtracting 2 from all θb;k]
Add a constant to all θa;i
Subtract that constant from all θb;k
Q(f; θ') = Q(f; θ)
47
Reparameterization
[Figure: two-node graph; the constant 3 is added to one θb;k and subtracted from the edge]
f(a) f(b) | Q(f; θ)
  0    0  |   7
  0    1  |  10
  1    0  |   5
  1    1  |   6

Add a constant to one θb;k
Subtract that constant from θab;ik for all i
48
Reparameterization
[Figure: same graph after the edge-to-node reparameterization]
f(a) f(b) | Q(f; θ')
  0    0  |   7
  0    1  |  10 − 3 + 3
  1    0  |   5
  1    1  |   6 − 3 + 3

Add a constant to one θb;k
Subtract that constant from θab;ik for all i
Q(f; θ') = Q(f; θ)
49
Reparameterization
[Figure: general reparameterization with message values on nodes and edges]
θ'b;k = θb;k + Mab;k
θ'a;i = θa;i + Mba;i
θ'ab;ik = θab;ik − Mab;k − Mba;i
Q(f; θ') = Q(f; θ)
50
Reparameterization
θ' is a reparameterization of θ, iff
θ' ≡ θ
Q(f; θ') = Q(f; θ), for all f
Kolmogorov, PAMI, 2006
Example: θ'b;k = θb;k + Mab;k
51
Recap
MAP Estimation
f* = arg min Q(f; θ)
Q(f; θ) = Σa θa;f(a) + Σ(a,b) θab;f(a)f(b)
Min-marginals
qa;i = min Q(f; θ) s.t. f(a) = i
Reparameterization
θ' ≡ θ
Q(f; θ') = Q(f; θ), for all f
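Reparameterization is easy to sanity-check numerically. The sketch below (illustrative names and numbers) applies the edge-to-node update from slide 49 and verifies Q(f; θ') = Q(f; θ) for every labelling:

```python
from itertools import product
import copy

# Sketch: reparameterize edge (a, b) by a per-label constant M[k], then
# verify Q(f; θ') = Q(f; θ) for all f. Toy potentials, illustrative only.
unary = [[5, 2], [2, 4]]
pairwise = {(0, 1): [[0, 1], [1, 0]]}

def Q(f, unary, pairwise):
    return (sum(unary[a][f[a]] for a in range(2)) +
            pairwise[(0, 1)][f[0]][f[1]])

M = [3, -1]                                # arbitrary message values
u2, p2 = copy.deepcopy(unary), copy.deepcopy(pairwise)
for k in range(2):
    u2[1][k] += M[k]                       # θ'_{b;k} = θ_{b;k} + M_k
    for i in range(2):
        p2[(0, 1)][i][k] -= M[k]           # θ'_{ab;ik} = θ_{ab;ik} − M_k

assert all(Q(f, unary, pairwise) == Q(f, u2, p2)
           for f in product([0, 1], repeat=2))
print("reparameterization preserves Q(f) for all f")
```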
52
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Exact MAP for Chains and Trees
  • Approximate MAP for general graphs
  • Computational Issues and Theoretical Properties
  • Tree-reweighted Message Passing

53
Belief Propagation
  • Remember, some MAP problems are easy
  • Belief Propagation gives exact MAP for chains
  • Exact MAP for trees
  • Clever Reparameterization

54
Two Variables
[Figure: two nodes Va, Vb with unary potentials θa = (5, 2), θb = (2, 4) and pairwise potentials (0, 1; 1, 0)]
Add a constant to one θb;k
Subtract that constant from θab;ik for all i
Choose the right constant: θ'b;k = qb;k
55
Two Variables
[Figure: same two-node graph]
Mab;0 = min { θa;0 + θab;00, θa;1 + θab;10 } = min { 5 + 0, 2 + 1 }
Choose the right constant: θ'b;k = qb;k
56
Two Variables
[Figure: after adding Mab;0 = 3 to θb;0: θ'b;0 = 5, θ'ab;00 = −3, θ'ab;10 = −2]
Choose the right constant: θ'b;k = qb;k
57
Two Variables
[Figure: red path from f(a) = 1 to f(b) = 0]
θ'b;0 = qb;0
Potentials along the red path add up to 0
Choose the right constant: θ'b;k = qb;k
58
Two Variables
[Figure: same two-node graph]
Mab;1 = min { θa;0 + θab;01, θa;1 + θab;11 } = min { 5 + 1, 2 + 0 }
Choose the right constant: θ'b;k = qb;k
59
Two Variables
[Figure: both labels of Vb now trace back to f(a) = 1]
θ'b;0 = qb;0
θ'b;1 = qb;1
Minimum of min-marginals = MAP estimate
Choose the right constant: θ'b;k = qb;k
60
Two Variables
[Figure: same reparameterized graph]
θ'b;0 = qb;0
θ'b;1 = qb;1
f*(b) = 0, f*(a) = 1
Choose the right constant: θ'b;k = qb;k
61
Two Variables
[Figure: same reparameterized graph]
θ'b;0 = qb;0
θ'b;1 = qb;1
We get all the min-marginals of Vb
Choose the right constant: θ'b;k = qb;k
62
Recap
We only need to know two sets of equations
General form of reparameterization
Reparameterization of (a, b) in Belief Propagation:
Mab;k = mini { θa;i + θab;ik }
Mba;i = 0
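The message equation in this recap is one line of code per label; a small sketch with illustrative names:

```python
# BP message from Va to Vb (slide 62): M_{ab;k} = min_i (θ_{a;i} + θ_{ab;ik}).
# Toy potentials with illustrative names only.
theta_a = [5, 2]                       # θ_{a;i}
theta_ab = [[0, 1], [1, 0]]            # θ_{ab;ik}

def message(theta_a, theta_ab):
    labels = range(len(theta_ab[0]))
    return [min(theta_a[i] + theta_ab[i][k] for i in range(len(theta_a)))
            for k in labels]

print(message(theta_a, theta_ab))      # [min(5+0, 2+1), min(5+1, 2+0)] = [3, 2]
```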
63
Three Variables
[Figure: chain Va-Vb-Vc with unary and pairwise potentials for labels l0, l1]
Reparameterize the edge (a, b) as before
64
Three Variables
[Figure: chain after reparameterizing (a, b); θ'b;k = qb;k, red paths from f(a) = 1]
Reparameterize the edge (a, b) as before
65
Three Variables
[Figure: same chain; red path through f(a) = 1]
Reparameterize the edge (a, b) as before
Potentials along the red path add up to 0
66
Three Variables
[Figure: same chain]
Reparameterize the edge (b, c) as before
Potentials along the red path add up to 0
67
Three Variables
[Figure: chain after reparameterizing (b, c); red paths record f(b) for each label of Vc, with f(a) = 1]
Reparameterize the edge (b, c) as before
Potentials along the red path add up to 0
68
Three Variables
[Figure: chain with θ'c;0 = qc;0 and θ'c;1 = qc;1 after both reparameterizations]
Reparameterize the edge (b, c) as before
Potentials along the red path add up to 0
69
Three Variables
[Figure: backtracking the red path: f*(c) = 0, f*(b) = 0, f*(a) = 1]
Generalizes to any length chain
70
Three Variables
[Figure: same chain and backtracked labelling]
Only Dynamic Programming
71
Why Dynamic Programming?
3 variables ↔ 2 variables + book-keeping
n variables ↔ (n − 1) variables + book-keeping
Start from left, go to right
Reparameterize current edge (a, b):
Mab;k = mini { θa;i + θab;ik }
Repeat
72
Why Dynamic Programming?
Messages ↔ Message Passing
Why stop at dynamic programming?
Start from left, go to right
Reparameterize current edge (a, b):
Mab;k = mini { θa;i + θab;ik }
Repeat
73
Three Variables
[Figure: reparameterized chain Va-Vb-Vc]
Reparameterize the edge (c, b) as before
74
Three Variables
[Figure: chain after reparameterizing (c, b)]
Reparameterize the edge (c, b) as before
θ'b;i = qb;i
75
Three Variables
[Figure: same chain]
Reparameterize the edge (b, a) as before
76
Three Variables
[Figure: chain after reparameterizing (b, a)]
Reparameterize the edge (b, a) as before
θ'a;i = qa;i
77
Three Variables
[Figure: fully reparameterized chain; every unary now equals a min-marginal]
Forward Pass + Backward Pass
All min-marginals are computed
78
Belief Propagation on Chains
Start from left, go to right
Reparameterize current edge (a, b):
Mab;k = mini { θa;i + θab;ik }
Repeat till the end of the chain
Start from right, go to left
Repeat till the end of the chain
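Slides 63-78 compress into a short dynamic program. Here is a hedged Python sketch of min-sum BP on a chain (illustrative names and numbers): a forward pass that reparameterizes each edge in turn, followed by backtracking along the recorded argmins (the "red path"):

```python
# Min-sum belief propagation on a chain, as a sketch.
# unary[a][i]: θ_{a;i}; pairwise[a][i][k]: potential on edge (a, a+1).
# Returns the MAP labelling and its energy. Illustrative names only.

def chain_map(unary, pairwise):
    n, L = len(unary), len(unary[0])
    cost = [row[:] for row in unary]          # running reparameterized unaries
    back = [[0] * L for _ in range(n)]        # argmin book-keeping
    # Forward pass: push messages left to right.
    for a in range(n - 1):
        for k in range(L):
            cands = [cost[a][i] + pairwise[a][i][k] for i in range(L)]
            back[a + 1][k] = min(range(L), key=lambda i: cands[i])
            cost[a + 1][k] += min(cands)      # θ'_{a+1;k} = θ_{a+1;k} + M_k
    # The last node now holds its min-marginals; backtrack the red path.
    f = [0] * n
    f[n - 1] = min(range(L), key=lambda k: cost[n - 1][k])
    for a in range(n - 1, 0, -1):
        f[a - 1] = back[a][f[a]]
    return f, min(cost[n - 1])

unary = [[5, 2], [2, 4], [4, 6]]
pairwise = [[[0, 1], [1, 0]], [[0, 1], [1, 0]]]
print(chain_map(unary, pairwise))             # ([1, 0, 0], 9) for these numbers
```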
79
Belief Propagation on Chains
  • Generalizes to chains of any length
  • A way of computing reparam constants
  • Forward Pass - Start to End
  • MAP estimate
  • Min-marginals of final variable
  • Backward Pass - End to start
  • All other min-marginals

Won't need this, but good to know
80
Computational Complexity
  • Each constant takes O(|L|)
  • Number of constants - O(|E||L|)

Complexity per iteration: O(|E||L|²)

  • Memory required?

O(|E||L|)
81
Belief Propagation on Trees
[Figure: tree with vertices Va, Vb, Vc, Vd, Ve, Vg, Vh]
Forward Pass: Leaf → Root
Backward Pass: Root → Leaf
All min-marginals are computed
82
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Exact MAP for Chains and Trees
  • Approximate MAP for general graphs
  • Computational Issues and Theoretical Properties
  • Tree-reweighted Message Passing

83
Belief Propagation on Cycles
[Figure: 4-cycle Va-Vb-Vc-Vd with unary potentials θa;i, θb;i, θc;i, θd;i]
Where do we start? Arbitrarily
Reparameterize (a, b)
84
Belief Propagation on Cycles
[Figure: cycle; reparameterize the next edge along the red path]
Potentials along the red path add up to 0
85
Belief Propagation on Cycles
[Figure: cycle; the red path extends around the next edge]
Potentials along the red path add up to 0
86
Belief Propagation on Cycles
[Figure: cycle; the red path extends around the next edge]
Potentials along the red path add up to 0
87
Belief Propagation on Cycles
[Figure: cycle; the red path reaches the last edge]
Potentials along the red path add up to 0
88
Belief Propagation on Cycles
[Figure: back at Va after one loop; the figure shows corrections −θa;0, −θa;1]
θ'a;0 − θa;0 = qa;0
θ'a;1 − θa;1 = qa;1
Potentials along the red path add up to 0
89
Belief Propagation on Cycles
[Figure: same cycle]
θ'a;0 − θa;0 = qa;0
θ'a;1 − θa;1 = qa;1
Pick minimum min-marginal. Follow red path.
90
Belief Propagation on Cycles
[Figure: cycle; following the red path from the chosen label]
Potentials along the red path add up to 0
91
Belief Propagation on Cycles
[Figure: cycle; the red path continues]
Potentials along the red path add up to 0
92
Belief Propagation on Cycles
[Figure: cycle; the red path continues]
Potentials along the red path add up to 0
93
Belief Propagation on Cycles
[Figure: same cycle]
θ'a;0 − θa;0 = qa;0
θ'a;1 − θa;1 = qa;1
Potentials along the red path add up to 0
94
Belief Propagation on Cycles
[Figure: same cycle]
θ'a;0 − θa;0 = qa;0
θ'a;1 − θa;1 = qa;1
Problem Solved
95
Belief Propagation on Cycles
[Figure: same cycle]
θ'a;0 − θa;0 = qa;0
θ'a;1 − θa;1 = qa;1
Problem Not Solved
96
Belief Propagation on Cycles
[Figure: same cycle]
Reparameterize (a, b) again
97
Belief Propagation on Cycles
[Figure: same cycle]
Reparameterize (a, b) again
But doesn't this overcount some potentials?
98
Belief Propagation on Cycles
[Figure: same cycle]
Reparameterize (a, b) again
Yes. But we will do it anyway
99
Belief Propagation on Cycles
[Figure: same cycle]
Keep reparameterizing edges in some order
Hope for convergence and a good solution
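That recipe is loopy BP. A minimal sketch of the outer loop, with illustrative names and numbers; note there is no convergence guarantee, exactly as the following slides discuss:

```python
# Loopy min-sum BP: sweep directed edges in a fixed order, hope it settles.
# All names and numbers are illustrative assumptions.

def loopy_bp(unary, pairwise, edges, iters=50):
    L = len(unary[0])
    dir_edges = edges + [(b, a) for (a, b) in edges]
    msg = {e: [0.0] * L for e in dir_edges}

    def theta(a, b, i, k):  # edge potential, symmetric lookup
        return pairwise[(a, b)][i][k] if (a, b) in pairwise else pairwise[(b, a)][k][i]

    def belief(a, exclude=None):
        return [unary[a][i] + sum(m[i] for (c, a2), m in msg.items()
                                  if a2 == a and c != exclude)
                for i in range(L)]

    for _ in range(iters):
        for (a, b) in dir_edges:
            h = belief(a, exclude=b)           # exclude what b sent to a
            m = [min(h[i] + theta(a, b, i, k) for i in range(L)) for k in range(L)]
            lo = min(m)
            msg[(a, b)] = [v - lo for v in m]  # normalize for stability

    return [min(range(L), key=lambda i: belief(a)[i]) for a in range(len(unary))]

unary = [[0, 10], [5, 0], [0, 10], [0, 10]]
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]       # a 4-cycle
pairwise = {e: [[0, 1], [1, 0]] for e in edges}
print(loopy_bp(unary, pairwise, edges))        # expect [0, 1, 0, 0] here
```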
100
Belief Propagation
  • Generalizes to any arbitrary random field
  • Complexity per iteration?

O(|E||L|²)

  • Memory required?

O(|E||L|)
101
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Exact MAP for Chains and Trees
  • Approximate MAP for general graphs
  • Computational Issues and Theoretical Properties
  • Tree-reweighted Message Passing

102
Computational Issues of BP
Complexity per iteration: O(|E||L|²)
Special Pairwise Potentials: θab;ik = wab d(i − k)
Complexity drops to O(|E||L|)
Felzenszwalb & Huttenlocher, 2004
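For the linear cost d(i − k) = |i − k|, the min inside each message becomes a two-pass distance transform, which is where the O(|E||L|) figure comes from. A sketch under that assumption (the Felzenszwalb & Huttenlocher paper also handles quadratic and truncated costs):

```python
# O(|L|) message for θ_{ab;ik} = w · |i − k| (special pairwise potentials).
# h[i] = θ_{a;i} plus incoming messages; returns M[k] = min_i h[i] + w·|i − k|.
def linear_dt_message(h, w):
    m = h[:]
    for k in range(1, len(m)):            # forward pass: allow i < k
        m[k] = min(m[k], m[k - 1] + w)
    for k in range(len(m) - 2, -1, -1):   # backward pass: allow i > k
        m[k] = min(m[k], m[k + 1] + w)
    return m

h = [4.0, 1.0, 5.0, 2.0]                  # illustrative numbers
print(linear_dt_message(h, w=1.0))        # [2.0, 1.0, 2.0, 2.0]
```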
103
Computational Issues of BP
Memory requirements: O(|E||L|)
Half of original BP
Kolmogorov, 2006
Some approximations exist
Yu, Lin, Super and Tan, 2007
Lasserre, Kannan and Winn, 2007
But memory still remains an issue
104
Computational Issues of BP
Order of reparameterization
Randomly
In some fixed order
The one that results in maximum change
Residual Belief Propagation
Elidan et al., 2006
105
Theoretical Properties of BP
Exact for Trees
Pearl, 1988
What about any general random field?
Run BP. Assume it converges.
106
Theoretical Properties of BP
Exact for Trees
Pearl, 1988
What about any general random field?
Choose variables in a tree. Change their labels.
Value of energy does not decrease
107
Theoretical Properties of BP
Exact for Trees
Pearl, 1988
What about any general random field?
Choose variables in a cycle. Change their labels.
Value of energy does not decrease
108
Theoretical Properties of BP
Exact for Trees
Pearl, 1988
What about any general random field?
For cycles, if BP converges then exact MAP
Weiss and Freeman, 2001
109
Results
Object Detection
Felzenszwalb and Huttenlocher, 2004
Labels - Poses of parts
Unary Potentials: Fraction of foreground pixels
Pairwise Potentials: Favour valid configurations
110
Results
Object Detection
Felzenszwalb and Huttenlocher, 2004
111
Results
Szeliski et al., 2008
Binary Segmentation
Labels - foreground, background
Unary Potentials: −log(likelihood) using learnt fg/bg models
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
112
Results
Szeliski et al., 2008
Binary Segmentation
Belief Propagation
Labels - foreground, background
Unary Potentials: −log(likelihood) using learnt fg/bg models
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
113
Results
Szeliski et al., 2008
Binary Segmentation
Global optimum
Labels - foreground, background
Unary Potentials: −log(likelihood) using learnt fg/bg models
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
114
Results
Szeliski et al., 2008
Stereo Correspondence
Labels - disparities
Unary Potentials: Similarity of pixel colours
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
115
Results
Szeliski et al., 2008
Stereo Correspondence
Belief Propagation
Labels - disparities
Unary Potentials: Similarity of pixel colours
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
116
Results
Szeliski et al., 2008
Stereo Correspondence
Global optimum
Labels - disparities
Unary Potentials: Similarity of pixel colours
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
117
Summary of BP
Exact for chains
Exact for trees
Approximate MAP for general cases
Not even convergence guaranteed
So can we do something better?
118
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing
  • Integer Programming Formulation
  • Linear Programming Relaxation and its Dual
  • Convergent Solution for Dual
  • Computational Issues and Theoretical Properties

119
TRW Message Passing
  • A different look at the same problem
  • Convex (not Combinatorial) Optimization
  • A similar solution
  • Combinatorial (not Convex) Optimization

We will look at the most general MAP estimation
No restriction to trees
No assumption on potentials
120
Things to Remember
  • BP is exact for trees
  • Every iteration provides a reparameterization
  • Forward-pass computes min-marginals of root
  • Basics of Mathematical Optimization

121
Mathematical Optimization
min g0(x) subject to gi(x) ≤ 0, i = 1, ..., N

x* = arg min g0(x): Optimal Solution
g0(x*): Optimal Value
  • Objective function
  • Constraints
  • Feasible region: {x | gi(x) ≤ 0}
122
Integer Programming
min g0(x) subject to gi(x) ≤ 0, i = 1, ..., N; xk ∈ Z

x* = arg min g0(x): Optimal Solution
g0(x*): Optimal Value
  • Objective function
  • Constraints
  • Feasible region: {x | gi(x) ≤ 0, xk ∈ Z}
123
Feasible Region
Generally NP-hard to optimize
124
Linear Programming
min g0(x) subject to gi(x) ≤ 0, i = 1, ..., N

x* = arg min g0(x): Optimal Solution
g0(x*): Optimal Value
  • Objective function
  • Constraints
  • Feasible region: {x | gi(x) ≤ 0}
125
Linear Programming
min g0(x) subject to gi(x) ≤ 0, i = 1, ..., N

x* = arg min g0(x): Optimal Solution
g0(x*): Optimal Value
  • Linear objective function
  • Linear constraints
  • Feasible region: {x | gi(x) ≤ 0}
126
Linear Programming
min cᵀx subject to Ax ≥ b

x* = arg min cᵀx: Optimal Solution
cᵀx*: Optimal Value
  • Linear objective function
  • Linear constraints
  • Feasible region: {x | Ax ≥ b}
Polynomial-time Solution
127
Feasible Region
Polynomial-time Solution
128
Feasible Region
Optimal solution lies on a vertex (objective function is linear)
129
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing
  • Integer Programming Formulation
  • Linear Programming Relaxation and its Dual
  • Convergent Solution for Dual
  • Computational Issues and Theoretical Properties

130
Integer Programming Formulation
[Figure: two nodes Va, Vb with unary potentials θa;0 = 5, θa;1 = 2, θb;0 = 2, θb;1 = 4]
Unary Potentials
Labelling: f(a) = 1, f(b) = 0
ya;0 = 0, ya;1 = 1, yb;0 = 1, yb;1 = 0
Any f(·) has equivalent boolean variables ya;i
131
Integer Programming Formulation
[Figure: same two-node example]
Unary Potentials
Labelling: f(a) = 1, f(b) = 0
ya;0 = 0, ya;1 = 1, yb;0 = 1, yb;1 = 0
Find the optimal variables ya;i
132
Integer Programming Formulation
[Figure: same two-node example]
Sum of Unary Potentials: Σa Σi θa;i ya;i
ya;i ∈ {0, 1}, for all Va, li
Σi ya;i = 1, for all Va
133
Integer Programming Formulation
[Figure: two-node example with pairwise potentials θab;00 = 0, θab;01 = 1, θab;10 = 1, θab;11 = 0]
Sum of Pairwise Potentials: Σ(a,b) Σik θab;ik ya;i yb;k
ya;i ∈ {0, 1}
Σi ya;i = 1
134
Integer Programming Formulation
[Figure: same example]
Sum of Pairwise Potentials: Σ(a,b) Σik θab;ik yab;ik
yab;ik = ya;i yb;k
ya;i ∈ {0, 1}
Σi ya;i = 1
135
Integer Programming Formulation
min Σa Σi θa;i ya;i + Σ(a,b) Σik θab;ik yab;ik

ya;i ∈ {0, 1}
Σi ya;i = 1
yab;ik = ya;i yb;k
136
Integer Programming Formulation
min θᵀy

ya;i ∈ {0, 1}
Σi ya;i = 1
yab;ik = ya;i yb;k

θ = [ ... θa;i ... ; ... θab;ik ... ]
y = [ ... ya;i ... ; ... yab;ik ... ]
137
One variable, two labels
y = [ ya;0 ya;1 ]
θ = [ θa;0 θa;1 ]
ya;0 + ya;1 = 1
ya;0 ∈ {0, 1}, ya;1 ∈ {0, 1}
138
Two variables, two labels
ya;0 + ya;1 = 1, ya;0 ∈ {0, 1}, ya;1 ∈ {0, 1}
yb;0 + yb;1 = 1, yb;0 ∈ {0, 1}, yb;1 ∈ {0, 1}
yab;00 = ya;0 yb;0
yab;01 = ya;0 yb;1
yab;10 = ya;1 yb;0
yab;11 = ya;1 yb;1
θ = [ θa;0 θa;1 θb;0 θb;1 θab;00 θab;01 θab;10 θab;11 ]
y = [ ya;0 ya;1 yb;0 yb;1 yab;00 yab;01 yab;10 yab;11 ]
139
In General
Marginal Polytope
140
In General
  • θ ∈ R^(|V||L| + |E||L|²)

y ∈ {0, 1}^(|V||L| + |E||L|²)
Number of constraints: |V||L| + |V| + |E||L|²
ya;i ∈ {0, 1}
Σi ya;i = 1
yab;ik = ya;i yb;k
141
Integer Programming Formulation
min θᵀy

ya;i ∈ {0, 1}
Σi ya;i = 1
yab;ik = ya;i yb;k

θ = [ ... θa;i ... ; ... θab;ik ... ]
y = [ ... ya;i ... ; ... yab;ik ... ]
142
Integer Programming Formulation
min θᵀy

ya;i ∈ {0, 1}
Σi ya;i = 1
yab;ik = ya;i yb;k

Solve to obtain MAP labelling y*
143
Integer Programming Formulation
min θᵀy

ya;i ∈ {0, 1}
Σi ya;i = 1
yab;ik = ya;i yb;k

But we can't solve it in general
144
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing
  • Integer Programming Formulation
  • Linear Programming Relaxation and its Dual
  • Convergent Solution for Dual
  • Computational Issues and Theoretical Properties

145
Linear Programming Relaxation
min θᵀy

ya;i ∈ {0, 1}
Σi ya;i = 1
yab;ik = ya;i yb;k

Two reasons why we can't solve this
146
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0, 1]
Σi ya;i = 1
yab;ik = ya;i yb;k

One reason why we can't solve this
147
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0, 1]
Σi ya;i = 1
Σk yab;ik = Σk ya;i yb;k
148
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0, 1]
Σi ya;i = 1
Σk yab;ik = ya;i Σk yb;k = ya;i · 1
149
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0, 1]
Σi ya;i = 1
Σk yab;ik = ya;i

One reason why we can't solve this
150
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0, 1]
Σi ya;i = 1
Σk yab;ik = ya;i

No reason why we can't solve this
(except memory requirements, time complexity)
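Being a plain LP, the relaxation can be handed to an off-the-shelf solver. A hedged sketch for the two-variable, two-label example using scipy.optimize.linprog (scipy is an assumption of this sketch, not something the tutorial prescribes); on a tree like this the relaxed optimum comes out integral:

```python
import numpy as np
from scipy.optimize import linprog   # assumption: scipy is available

# LP relaxation (slide 150) for 2 variables, 2 labels. Variable order:
# y = [ya0, ya1, yb0, yb1, yab00, yab01, yab10, yab11]. Toy potentials.
theta = np.array([5, 2, 2, 4, 0, 1, 1, 0], dtype=float)

A_eq = np.array([
    [1, 1, 0, 0, 0, 0, 0, 0],    # ya0 + ya1 = 1
    [0, 0, 1, 1, 0, 0, 0, 0],    # yb0 + yb1 = 1
    [-1, 0, 0, 0, 1, 1, 0, 0],   # yab00 + yab01 = ya0
    [0, -1, 0, 0, 0, 0, 1, 1],   # yab10 + yab11 = ya1
    [0, 0, -1, 0, 1, 0, 1, 0],   # yab00 + yab10 = yb0  (local polytope,
    [0, 0, 0, -1, 0, 1, 0, 1],   #  both marginalization directions)
])
b_eq = np.array([1, 1, 0, 0, 0, 0], dtype=float)

res = linprog(theta, A_eq=A_eq, b_eq=b_eq, bounds=[(0, 1)] * 8)
print(res.fun, res.x)            # optimum 5.0 at f(a) = 1, f(b) = 0
```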
151
One variable, two labels
y = [ ya;0 ya;1 ]
θ = [ θa;0 θa;1 ]
ya;0 + ya;1 = 1, ya;0 ∈ {0, 1}, ya;1 ∈ {0, 1}
152
One variable, two labels
y = [ ya;0 ya;1 ]
θ = [ θa;0 θa;1 ]
ya;0 + ya;1 = 1, ya;0 ∈ [0, 1], ya;1 ∈ [0, 1]
153
Two variables, two labels
ya;0 + ya;1 = 1, ya;0 ∈ {0, 1}, ya;1 ∈ {0, 1}
yb;0 + yb;1 = 1, yb;0 ∈ {0, 1}, yb;1 ∈ {0, 1}
yab;00 = ya;0 yb;0
yab;01 = ya;0 yb;1
yab;10 = ya;1 yb;0
yab;11 = ya;1 yb;1
θ = [ θa;0 θa;1 θb;0 θb;1 θab;00 θab;01 θab;10 θab;11 ]
y = [ ya;0 ya;1 yb;0 yb;1 yab;00 yab;01 yab;10 yab;11 ]
154
Two variables, two labels
ya;0 + ya;1 = 1, ya;0 ∈ [0, 1], ya;1 ∈ [0, 1]
yb;0 + yb;1 = 1, yb;0 ∈ [0, 1], yb;1 ∈ [0, 1]
yab;00 = ya;0 yb;0
yab;01 = ya;0 yb;1
yab;10 = ya;1 yb;0
yab;11 = ya;1 yb;1
θ = [ θa;0 θa;1 θb;0 θb;1 θab;00 θab;01 θab;10 θab;11 ]
y = [ ya;0 ya;1 yb;0 yb;1 yab;00 yab;01 yab;10 yab;11 ]
155
Two variables, two labels
ya;0 + ya;1 = 1, ya;0 ∈ [0, 1], ya;1 ∈ [0, 1]
yb;0 + yb;1 = 1, yb;0 ∈ [0, 1], yb;1 ∈ [0, 1]
yab;00 + yab;01 = ya;0
yab;10 = ya;1 yb;0
yab;11 = ya;1 yb;1
θ = [ θa;0 θa;1 θb;0 θb;1 θab;00 θab;01 θab;10 θab;11 ]
y = [ ya;0 ya;1 yb;0 yb;1 yab;00 yab;01 yab;10 yab;11 ]
156
Two variables, two labels
ya;0 + ya;1 = 1, ya;0 ∈ [0, 1], ya;1 ∈ [0, 1]
yb;0 + yb;1 = 1, yb;0 ∈ [0, 1], yb;1 ∈ [0, 1]
yab;00 + yab;01 = ya;0
yab;10 + yab;11 = ya;1
θ = [ θa;0 θa;1 θb;0 θb;1 θab;00 θab;01 θab;10 θab;11 ]
y = [ ya;0 ya;1 yb;0 yb;1 yab;00 yab;01 yab;10 yab;11 ]
157
In General
Local Polytope
Marginal Polytope
158
In General
  • θ ∈ R^(|V||L| + |E||L|²)

y ∈ [0, 1]^(|V||L| + |E||L|²)
Number of constraints: |V||L| + |V| + |E||L|
159
Linear Programming Relaxation
min θᵀy

ya;i ∈ [0, 1]
Σi ya;i = 1
Σk yab;ik = ya;i

No reason why we can't solve this
160
Linear Programming Relaxation
Extensively studied
Optimization
Schlesinger, 1976
Koster, van Hoesel and Kolen, 1998
Theory
Chekuri et al., 2001
Archer et al., 2004
Machine Learning
Wainwright et al., 2001
161
Linear Programming Relaxation
Many interesting Properties
  • Preserves solution for reparameterization
  • Global optimal MAP for trees

Wainwright et al., 2001
But we are interested in NP-hard cases
162
Linear Programming Relaxation
Many interesting Properties - Integrality Gap
  • Large class of problems
  • Metric Labelling
  • Semi-metric Labelling
  • Most likely, provides best possible integrality
    gap

Manokaran et al., 2008
163
Linear Programming Relaxation
Many interesting Properties - Dual
  • A computationally useful dual

Optimal value of dual = Optimal value of primal
Easier-to-solve
164
Dual of the LP Relaxation
Wainwright et al., 2001
[Figure: 3×3 grid MRF over Va-Vi with parameter θ]
min θᵀy
ya;i ∈ [0, 1]
Σi ya;i = 1
Σk yab;ik = ya;i
165
Dual of the LP Relaxation
Wainwright et al., 2001
[Figure: the grid decomposed into trees: row trees θ1, θ2, θ3 and column trees θ4, θ5, θ6]
ρi ≥ 0
Σi ρi θi = θ
166
Dual of the LP Relaxation
Wainwright et al., 2001
[Figure: same tree decomposition with tree energies q(θ1), ..., q(θ6)]
ρi ≥ 0
Σi ρi θi = θ
Dual of LP: max Σi ρi q(θi)
167
Dual of the LP Relaxation
Wainwright et al., 2001
[Figure: same tree decomposition]
ρi ≥ 0
Dual of LP: max Σi ρi q(θi)
Σi ρi θi ≡ θ
168
Dual of the LP Relaxation
Wainwright et al., 2001
max Σi ρi q(θi)
Σi ρi θi ≡ θ
I can easily compute q(θi)
I can easily maintain the reparameterization constraint
So can I easily solve the dual?
169
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing
  • Integer Programming Formulation
  • Linear Programming Relaxation and its Dual
  • Convergent Solution for Dual
  • Computational Issues and Theoretical Properties

170
TRW Message Passing
Kolmogorov, 2006
[Figure: tree decomposition into row trees θ1, θ2, θ3 and column trees θ4, θ5, θ6]
Pick a variable: Va
Σi ρi q(θi)
Σi ρi θi ≡ θ
171
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees containing Va: row tree Va-Vb-Vc with unaries θ1a;i, θ1b;i, θ1c;i and column tree Va-Vd-Vg with unaries θ4a;i, θ4d;i, θ4g;i]
Σi ρi q(θi)
Σi ρi θi ≡ θ
172
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees containing Va]
Reparameterize to obtain min-marginals of Va
ρ1 q(θ1) + ρ4 q(θ4) + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
173
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees after one pass of belief propagation on each]
One pass of Belief Propagation
ρ1 q(θ1) + ρ4 q(θ4) + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
174
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees]
q(θ1) and q(θ4) remain the same
ρ1 q(θ1) + ρ4 q(θ4) + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
175
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees with min-marginals θ1a;0, θ1a;1, θ4a;0, θ4a;1 at Va]
ρ1 min{θ1a;0, θ1a;1} + ρ4 min{θ4a;0, θ4a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
176
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees]
Compute weighted average of min-marginals of Va
ρ1 min{θ1a;0, θ1a;1} + ρ4 min{θ4a;0, θ4a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
177
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees]
θ'a;0 = (ρ1 θ1a;0 + ρ4 θ4a;0) / (ρ1 + ρ4)
θ'a;1 = (ρ1 θ1a;1 + ρ4 θ4a;1) / (ρ1 + ρ4)
ρ1 min{θ1a;0, θ1a;1} + ρ4 min{θ4a;0, θ4a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
178
TRW Message Passing
Kolmogorov, 2006
[Figure: the averaged unary θ'a;i substituted into both trees]
θ'a;0 = (ρ1 θ1a;0 + ρ4 θ4a;0) / (ρ1 + ρ4)
θ'a;1 = (ρ1 θ1a;1 + ρ4 θ4a;1) / (ρ1 + ρ4)
ρ1 min{θ1a;0, θ1a;1} + ρ4 min{θ4a;0, θ4a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
179
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees with averaged unaries]
θ'a;0 = (ρ1 θ1a;0 + ρ4 θ4a;0) / (ρ1 + ρ4)
θ'a;1 = (ρ1 θ1a;1 + ρ4 θ4a;1) / (ρ1 + ρ4)
ρ1 min{θ1a;0, θ1a;1} + ρ4 min{θ4a;0, θ4a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
180
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees with averaged unaries]
θ'a;0 = (ρ1 θ1a;0 + ρ4 θ4a;0) / (ρ1 + ρ4)
θ'a;1 = (ρ1 θ1a;1 + ρ4 θ4a;1) / (ρ1 + ρ4)
ρ1 min{θ'a;0, θ'a;1} + ρ4 min{θ'a;0, θ'a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
181
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees with averaged unaries]
θ'a;0 = (ρ1 θ1a;0 + ρ4 θ4a;0) / (ρ1 + ρ4)
θ'a;1 = (ρ1 θ1a;1 + ρ4 θ4a;1) / (ρ1 + ρ4)
(ρ1 + ρ4) min{θ'a;0, θ'a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
182
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees]
min{p1 + p2, q1 + q2} ≥ min{p1, q1} + min{p2, q2}
(ρ1 + ρ4) min{θ'a;0, θ'a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
183
TRW Message Passing
Kolmogorov, 2006
[Figure: the two trees]
Objective function increases or remains constant
(ρ1 + ρ4) min{θ'a;0, θ'a;1} + K
ρ1 θ1 + ρ4 θ4 + θrest ≡ θ
184
TRW Message Passing
Initialize θi. Take care of the reparameterization constraint
Choose random variable Va
Compute min-marginals of Va for all trees
Node-average the min-marginals
Can also do edge-averaging
REPEAT
Kolmogorov, 2006
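One node-averaging step is a one-liner per label. A sketch for a variable shared by two trees (illustrative names and numbers; a full TRW-S implementation also orders variables monotonically and reuses BP messages), including the slide-182 inequality as an assertion:

```python
# One TRW node-averaging step (slides 172-184), as a sketch.
# q1, q4: min-marginals of Va in trees T1 and T4 (from one BP pass each);
# rho1, rho4: tree weights. Illustrative numbers only.

def node_average(q1, q4, rho1, rho4):
    return [(rho1 * a + rho4 * b) / (rho1 + rho4) for a, b in zip(q1, q4)]

q1, q4 = [5.0, 2.0], [4.0, 6.0]
avg = node_average(q1, q4, rho1=1.0, rho4=1.0)
print(avg)                                      # [4.5, 4.0]

# The slide-182 inequality: the dual bound cannot decrease.
before = 1.0 * min(q1) + 1.0 * min(q4)          # ρ1 min q1 + ρ4 min q4
after = (1.0 + 1.0) * min(avg)                  # (ρ1 + ρ4) min of averages
assert after >= before
```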
185
Example 1
ρ1 = ρ2 = ρ3 = 1
[Figure: cycle over Va, Vb, Vc decomposed into three trees; unary and pairwise potentials for labels l0, l1; tree bounds q(θ1), q(θ2), q(θ3) = 5, 6, 7]
Pick variable Va. Reparameterize.
186
Example 1
ρ1 = ρ2 = ρ3 = 1
[Figure: potentials after reparameterizing at Va; tree bounds 5, 6, 7]
Average the min-marginals of Va
187
Example 1
ρ1 = ρ2 = ρ3 = 1
[Figure: averaged unaries (7.5, 7.5) at Va; tree bounds now 7, 6, 7]
Pick variable Vb. Reparameterize.
188
Example 1
ρ1 = ρ2 = ρ3 = 1
[Figure: potentials after reparameterizing at Vb; tree bounds 7, 6, 7]
Average the min-marginals of Vb
189
Example 1
ρ1 = ρ2 = ρ3 = 1
[Figure: averaged unaries at Vb; tree bounds now 6.5, 6.5, 7]
Value of dual does not increase
190
Example 1
ρ1 = ρ2 = ρ3 = 1
[Figure: same potentials; tree bounds 6.5, 6.5, 7]
Maybe it will increase for Vc
NO
191
Example 1
ρ1 = ρ2 = ρ3 = 1
[Figure: same potentials; tree bounds 6.5, 6.5, 7]
f1(a) = 0, f1(b) = 0
f2(b) = 0, f2(c) = 0
f3(c) = 0, f3(a) = 0
Strong Tree Agreement
Exact MAP Estimate
192
Example 2
ρ1 = ρ2 = ρ3 = 1
[Figure: cycle over Va, Vb, Vc with different potentials; tree bounds q(θ1), q(θ2), q(θ3) = 4, 0, 4]
Pick variable Va. Reparameterize.
193
Example 2
ρ1 = ρ2 = ρ3 = 1
[Figure: potentials after reparameterizing at Va; tree bounds 4, 0, 4]
Average the min-marginals of Va
194
Example 2
ρ1 = ρ2 = ρ3 = 1
[Figure: averaged unaries at Va; tree bounds still 4, 0, 4]
Value of dual does not increase
195
Example 2
ρ1 = ρ2 = ρ3 = 1
[Figure: same potentials; tree bounds 4, 0, 4]
Maybe it will increase for Vb or Vc
NO
196
Example 2
ρ1 = ρ2 = ρ3 = 1
[Figure: same potentials]
f1(a) = 1, f1(b) = 1
f2(b) = 1, f2(c) = 0  or  f2(b) = 0, f2(c) = 1
f3(c) = 1, f3(a) = 1
Weak Tree Agreement
Not Exact MAP Estimate
197
Example 2
ρ1 = ρ2 = ρ3 = 1
[Figure: same potentials]
f1(a) = 1, f1(b) = 1
f2(b) = 1, f2(c) = 0  or  f2(b) = 0, f2(c) = 1
f3(c) = 1, f3(a) = 1
Weak Tree Agreement
Convergence point of TRW
198
Obtaining the Labelling
Only solves the dual. Primal solutions?
[Figure: 3×3 grid; fix the label of Va]
Fix the label of Va
θ' = Σi ρi θi ≡ θ
199
Obtaining the Labelling
Only solves the dual. Primal solutions?
[Figure: 3×3 grid; fix the label of Vb]
Fix the label of Vb
θ' = Σi ρi θi ≡ θ
Continue in some fixed order
Meltzer et al., 2006
200
Outline
  • Problem Formulation
  • Reparameterization
  • Belief Propagation
  • Tree-reweighted Message Passing
  • Integer Programming Formulation
  • Linear Programming Relaxation and its Dual
  • Convergent Solution for Dual
  • Computational Issues and Theoretical Properties

201
Computational Issues of TRW
Basic component is Belief Propagation
  • Speed-ups for some pairwise potentials
Felzenszwalb & Huttenlocher, 2004
  • Memory requirements cut down by half
Kolmogorov, 2006
  • Further speed-ups using monotonic chains
Kolmogorov, 2006
202
Theoretical Properties of TRW
  • Always converges, unlike BP

Kolmogorov, 2006
  • Strong tree agreement implies exact MAP

Wainwright et al., 2001
  • Optimal MAP for two-label submodular problems

θab;00 + θab;11 ≤ θab;01 + θab;10
Kolmogorov and Wainwright, 2005
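The submodularity condition is a per-edge test; a tiny sketch with illustrative names:

```python
# Check the two-label submodularity condition for every edge:
# θ_{ab;00} + θ_{ab;11} ≤ θ_{ab;01} + θ_{ab;10}.
def is_submodular(pairwise):
    return all(t[0][0] + t[1][1] <= t[0][1] + t[1][0]
               for t in pairwise.values())

potts = {(0, 1): [[0, 1], [1, 0]]}          # Potts edge: submodular
antipotts = {(0, 1): [[1, 0], [0, 1]]}      # reversed: not submodular
print(is_submodular(potts), is_submodular(antipotts))   # True False
```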
203
Results
Szeliski et al., 2008
Binary Segmentation
Labels - foreground, background
Unary Potentials: −log(likelihood) using learnt fg/bg models
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
204
Results
Szeliski et al., 2008
Binary Segmentation
TRW
Labels - foreground, background
Unary Potentials: −log(likelihood) using learnt fg/bg models
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
205
Results
Szeliski et al., 2008
Binary Segmentation
Belief Propagation
Labels - foreground, background
Unary Potentials: −log(likelihood) using learnt fg/bg models
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
206
Results
Szeliski et al., 2008
Stereo Correspondence
Labels - disparities
Unary Potentials: Similarity of pixel colours
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
207
Results
Szeliski et al., 2008
Stereo Correspondence
TRW
Labels - disparities
Unary Potentials: Similarity of pixel colours
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
208
Results
Szeliski et al., 2008
Stereo Correspondence
Belief Propagation
Labels - disparities
Unary Potentials: Similarity of pixel colours
Pairwise Potentials: 0, if same labels; 1 − λ exp(|Da − Db|), if different labels
209
Results
Kolmogorov, 2006
Non-submodular problems
[Figure: energy plots for BP and TRW-S on a 30×30 grid, K = 50]
BP outperforms TRW-S
210
Summary
  • Trees can be solved exactly - BP
  • No guarantee of convergence otherwise - BP
  • Strong Tree Agreement - TRW-S
  • Submodular energies solved exactly - TRW-S
  • TRW-S solves an LP relaxation of MAP estimation
  • Loopier graphs give worse results
  • Rother and Kolmogorov, 2006

211
Related New(er) Work
  • Solving the Dual

Weiss et al., 2006
Globerson and Jaakkola, 2007
Komodakis, Paragios and Tziritas 2007
Schlesinger and Giginyak, 2007
  • Solving the Primal

Ravikumar, Agarwal and Wainwright, 2008
212
Related New(er) Work
  • More complex relaxations

Kumar, Kolmogorov and Torr, 2007
Sontag and Jaakkola, 2007
Kumar and Torr, 2008
Sontag et al., 2008
Komodakis and Paragios, 2008
Werner, 2008
213
Questions on Part I ?
Code + Standard Data
http://vision.middlebury.edu/MRF