1
Stanford CS223B Computer Vision, Winter 2008/09
Lecture 12: Segmentation and Grouping
Professor Sebastian Thrun
CAs: Ethan Dreyfuss, Young Min Kim, Alex Teichman
2

Pictures from "Mean Shift: A Robust Approach toward Feature Space Analysis," by D. Comaniciu and P. Meer.
http://www.caip.rutgers.edu/comanici/MSPAMI/msPamiResults.html
3
Intro: Segmentation and Grouping
Segmentation breaks an image into groups over space and/or time.
  • Tokens are the things that are grouped (pixels, points, surface elements, flow, etc.)
  • Top-down segmentation: tokens are grouped because they lie on the same object
  • Bottom-up segmentation: tokens belong together because of some local affinity measure
  • Bottom-up and top-down need not be mutually exclusive
  • Motivation
  • for recognition? for compression? relationship of a sequence/set of tokens
  • always for a goal or application
  • currently, no real theory

4
Outline
  • Segmentation Challenges
  • Segmentation by Clustering
  • Segmentation by Graph Cuts
  • Segmentation by Spectral Clustering
  • Active Contours and Snakes

5
Biological
For humans at least, Gestalt psychology identifies several properties that result in grouping/segmentation.
6
7
Groupings by Invisible Completions
Images from Steve Lehar's Gestalt papers:
http://cns-alumni.bu.edu/pub/slehar/Lehar.html
8
9
Here, the 3D nature of grouping is apparent
Why do these tokens belong together?
10
And the famous invisible dog eating under a tree
11
A Final Segmentation Challenge
12
Outline
  • Segmentation Challenges
  • Segmentation by Clustering
  • Segmentation by Graph Cuts
  • Segmentation by Spectral Clustering
  • Active Contours and Snakes

13
Segmentation as clustering
  • Cluster together tokens (pixels, points, etc.) that belong together
  • Agglomerative clustering
  • attach each point to the cluster it is closest to
  • repeat
  • Divisive clustering
  • split a cluster along its best boundary
  • repeat
  • Point–cluster distance
  • single-link clustering
  • complete-link clustering
  • group-average clustering
  • Dendrograms
  • yield a picture of the output as the clustering process continues

From Marc Pollefeys COMP 256 2003
14
Example
Image
Clusters on intensity
Clusters on color
From Marc Pollefeys COMP 256 2003
15
Simple clustering algorithms
From Marc Pollefeys COMP 256 2003
16
From Marc Pollefeys COMP 256 2003
17
Mean Shift Segmentation
  • One of the most popular techniques

http://www.caip.rutgers.edu/comanici/MSPAMI/msPamiResults.html
18
Mean Shift Algorithm
  1. Choose a search window size.
  2. Choose the initial location of the search window.
  3. Compute the mean location (centroid of the data) in the search window.
  4. Center the search window at the mean location computed in Step 3.
  5. Repeat Steps 3 and 4 until convergence.

The mean shift algorithm seeks the mode, or point of highest density, of a data distribution.
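The five steps above can be sketched in a few lines; a minimal 1-D NumPy sketch with a flat (uniform) kernel window, where `mean_shift_mode`, the window size, and the data are all illustrative:

```python
import numpy as np

def mean_shift_mode(data, start, window=2.0, max_iter=100, tol=1e-6):
    """Seek the mode of `data` by repeatedly re-centering a search
    window at the mean of the points it contains (Steps 3-5 above)."""
    center = float(start)
    for _ in range(max_iter):
        # Step 3: mean of the data inside the current window
        in_window = data[np.abs(data - center) <= window]
        if in_window.size == 0:
            break
        new_center = in_window.mean()
        # Steps 4-5: re-center; stop once the shift is negligible
        if abs(new_center - center) < tol:
            return new_center
        center = new_center
    return center

# Points drawn around a dense cluster near 5.0, plus two outliers
data = np.array([4.8, 5.0, 5.1, 5.2, 4.9, 0.0, 10.0])
mode = mean_shift_mode(data, start=4.0)
```

Starting at 4.0, the window captures only the dense cluster, so the procedure converges to its centroid and ignores the outliers.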
19
Mean Shift Segmentation Results
http://www.caip.rutgers.edu/comanici/MSPAMI/msPamiResults.html
20
K-Means
  • Choose a fixed number of clusters
  • Choose cluster centers and point–cluster allocations to minimize error
  • can't do this by exhaustive search, because there are too many possible allocations
  • Algorithm:
  • fix cluster centers; allocate points to the closest cluster
  • fix allocation; compute the best cluster centers
  • x could be any set of features for which we can compute a distance (careful about scaling)

From Marc Pollefeys COMP 256 2003
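The alternation above (fix centers, allocate points; fix allocation, recompute centers) can be sketched as follows; a minimal 1-D NumPy sketch with illustrative data, not a production implementation:

```python
import numpy as np

def kmeans(x, k, iters=20, seed=0):
    """Alternate the two steps: assign each point to the nearest
    center, then recompute each center as the mean of its points."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)  # init from the data
    for _ in range(iters):
        # Step 1: fix centers, allocate points to the closest cluster
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        # Step 2: fix allocation, recompute the best (mean) centers
        new = np.array([x[labels == j].mean() if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return np.sort(centers), labels

x = np.array([0.1, 0.0, 0.2, 9.8, 10.0, 10.2])
centers, labels = kmeans(x, k=2)
```

On this toy data the two centers converge to the means of the two obvious groups regardless of which points are drawn as the initial centers.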
21
K-Means
22
K-Means
From Marc Pollefeys COMP 256 2003
23
Image Segmentation by K-Means
  1. Select a value of K.
  2. Select a feature vector for every pixel (color, texture, position, or a combination of these, etc.).
  3. Define a similarity measure between feature vectors (usually Euclidean distance).
  4. Apply the K-Means algorithm.
  5. Apply the connected-components algorithm.
  6. Merge any components of size less than some threshold into the most similar adjacent component.

From Marc Pollefeys COMP 256 2003
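The six steps above can be sketched end-to-end on a toy grayscale image; a minimal NumPy-only sketch in which a mean-intensity threshold is a crude stand-in for the K = 2 K-means assignment, and `label_components` is a toy replacement for a real connected-components routine (e.g. scipy.ndimage.label):

```python
import numpy as np

def label_components(mask):
    """Toy 4-connected component labeling via flood fill (a stand-in
    for a real routine such as scipy.ndimage.label)."""
    labels = np.zeros(mask.shape, dtype=int)
    current = 0
    h, w = mask.shape
    for i in range(h):
        for j in range(w):
            if mask[i, j] and labels[i, j] == 0:
                current += 1
                stack = [(i, j)]
                while stack:
                    r, c = stack.pop()
                    if (0 <= r < h and 0 <= c < w
                            and mask[r, c] and labels[r, c] == 0):
                        labels[r, c] = current
                        stack += [(r + 1, c), (r - 1, c),
                                  (r, c + 1), (r, c - 1)]
    return labels, current

# Steps 1-4: per-pixel feature = intensity; mean threshold mimics K=2
img = np.array([[0.1, 0.1, 0.9],
                [0.1, 0.9, 0.9],
                [0.1, 0.9, 0.9]])
bright = img > img.mean()          # crude two-cluster split on intensity
# Step 5: connected components of the "bright" cluster
labels, n = label_components(bright)
```

Here the five bright pixels form a single 4-connected component; step 6 (merging tiny components) would then operate on the resulting label image.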
24
Results of K-Means Clustering
Image
Clusters on intensity
Clusters on color
K-means clustering using intensity alone and
color alone
From Marc Pollefeys COMP 256 2003
25
K-Means
  • is an approximation to EM
  • Model (hypothesis space): mixture of N Gaussians
  • Latent variables: correspondence of data and Gaussians
  • We notice:
  • given the mixture model, it's easy to calculate the correspondence
  • given the correspondence, it's easy to estimate the mixture model

26
Generalized K-Means (EM)
27
Idea
  • Data are generated from a mixture of Gaussians
  • Latent variables: correspondence between data items and Gaussians

28
Learning a Gaussian Mixture (with known covariance)
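The E/M alternation for this known-covariance case can be sketched as follows; a minimal 1-D, equal-weight, two-component NumPy sketch in which `em_means` and the synthetic data are illustrative:

```python
import numpy as np

def em_means(x, mu, sigma=1.0, iters=50):
    """EM for a 1-D Gaussian mixture with known variance and equal
    weights: the E-step soft-assigns points, the M-step re-estimates
    each mean as the responsibility-weighted average."""
    mu = np.array(mu, dtype=float)
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        d = x[:, None] - mu[None, :]
        resp = np.exp(-d ** 2 / (2 * sigma ** 2))
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate means from the soft assignments
        mu = (resp * x[:, None]).sum(axis=0) / resp.sum(axis=0)
    return np.sort(mu)

rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(-3, 1, 200), rng.normal(3, 1, 200)])
mu = em_means(x, mu=[-1.0, 1.0])
```

With hard (0/1) responsibilities instead of soft ones, the same loop is exactly K-means, which is the approximation noted on the previous slide.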
29
Generalized K-Means
  • Converges!
  • Proof: Neal/Hinton, McLachlan/Krishnan
  • The E and M steps never decrease the data likelihood
  • Converges to a local optimum or saddle point of the likelihood

30
EM Clustering Results
http://www.ece.neu.edu/groups/rpl/kmeans/
31
Application: Clustering Flow
ML correspondence
Clustered Flow
32
Outline
  • Segmentation Challenges
  • Segmentation by Clustering
  • Segmentation by Graph Cuts
  • Segmentation by Spectral Clustering
  • Active Contours and Snakes

33
Graph-theoretic clustering
  • Represent tokens (which are associated with each pixel) using a weighted graph
  • affinity matrix W (a pixel has affinity 0 with itself, so the diagonal entries are 0)
  • Cut up this graph to get subgraphs with strong interior links and weaker exterior links

Application to vision originated with Prof. Malik at Berkeley.
34
Graph Representations
[Figure: an example graph on vertices a, b, c, d, e and its adjacency matrix W]
From Khurram Hassan-Shafique CAP5415 Computer
Vision 2003
35
Weighted Graphs
[Figure: an example weighted graph on vertices a, b, c, d, e and its weight matrix W]
From Khurram Hassan-Shafique CAP5415 Computer
Vision 2003
36
Minimum Cut
A cut of a graph G is a set of edges S such that removal of S from G disconnects G. The minimum cut is the cut of minimum weight, where the weight of cut ⟨A, B⟩ is given as
cut(A, B) = Σ_{u ∈ A, v ∈ B} w(u, v)
From Khurram Hassan-Shafique CAP5415 Computer
Vision 2003
37
Minimum Cut and Clustering
From Khurram Hassan-Shafique CAP5415 Computer
Vision 2003
38
Image Segmentation by Minimum Cut
[Figure: image pixels form a graph; each pixel is linked to its neighborhood with edge weight w given by a similarity measure; a minimum cut yields the segmentation]
From Khurram Hassan-Shafique CAP5415 Computer
Vision 2003
39
Minimum Cut
  • There can be more than one minimum cut in a given graph
  • All minimum cuts of a graph can be found in polynomial time [1]

[1] H. Nagamochi, K. Nishimura and T. Ibaraki, "Computing all small cuts in an undirected network," SIAM J. Discrete Math. 10 (1997), 469–481.
From Khurram Hassan-Shafique CAP5415 Computer
Vision 2003
40
Drawbacks of Minimum Cut
  • The weight of a cut is directly proportional to the number of edges in the cut, so minimum cut tends to favor cutting off small, isolated sets of nodes.

[Figure: several cuts with less weight than the ideal cut, versus the ideal cut]
Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
41
Normalized Cuts [1]
  • Normalized cut is defined as
    Ncut(A, B) = cut(A, B)/assoc(A, V) + cut(A, B)/assoc(B, V)
    where assoc(A, V) = Σ_{u ∈ A, t ∈ V} w(u, t) is the total connection from A to all nodes.
  • Ncut(A, B) is a measure of the dissimilarity of sets A and B.
  • Small if:
  • weights between clusters are small
  • weights within a cluster are large
  • Minimizing Ncut(A, B) simultaneously maximizes a measure of similarity within the sets A and B

[1] J. Shi and J. Malik, "Normalized Cuts and Image Segmentation," IEEE Trans. on PAMI, Aug 2000.
Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
42
Finding the Minimum Normalized-Cut
  • Finding the minimum normalized cut is NP-hard.
  • Polynomial-time approximations are generally used for segmentation.

Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
43
Finding Minimum Normalized-Cut
From Khurram Hassan-Shafique CAP5415 Computer
Vision 2003
44
Finding the Minimum Normalized-Cut
  • It can be shown that
    min Ncut(A, B) = min_y (yᵀ(D − W)y) / (yᵀDy)
  • such that yᵀD1 = 0, with y restricted to two discrete values
  • If y is allowed to take real values, then the minimization can be done by solving the generalized eigenvalue system
    (D − W)y = λDy

See Forsyth & Ponce, chapter on segmentation (pages 323–326).
Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
45
Algorithm
  • Compute the matrices W (affinity) and D (diagonal, with D_ii = Σ_j W_ij)
  • Solve (D − W)y = λDy for the eigenvectors with the smallest eigenvalues
  • Use the eigenvector with the second-smallest eigenvalue to bipartition the graph
  • Recursively partition the segmented parts if necessary.

Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
46
Example Results
Figure from "Image and video segmentation: the normalised cut framework," by Shi and Malik, 1998
Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
47
More Results
Figure from "Normalized cuts and image segmentation," Shi and Malik, 2000
Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
48
Drawbacks of Minimum Normalized Cut
  • Huge storage requirements and time complexity
  • Bias towards partitioning into equal segments
  • Problems with textured backgrounds

Slide from Khurram Hassan-Shafique CAP5415
Computer Vision 2003
49
Outline
  • Segmentation Challenges
  • Segmentation by Clustering
  • Segmentation by Graph Cuts
  • Segmentation by Spectral Clustering
  • Active Contours and Snakes

50
Finding Minimal Cuts in Color Space: Spectral Clustering Overview
Data → Similarities → Block-Detection
Slides from Dan Klein, Sep Kamvar, Chris Manning, Natural Language Group, Stanford University
51
Eigenvectors and Blocks
  • Block matrices have block eigenvectors
  • Near-block matrices have near-block eigenvectors [Ng et al., NIPS '02]

[Figure: an eigensolver applied to a block matrix yields eigenvalues λ1 = λ2 = 2, λ3 = λ4 = 0; applied to a near-block matrix, λ1 = λ2 = 2.02, λ3 = λ4 = −0.02]
Slides from Dan Klein, Sep Kamvar, Chris
Manning, Natural Language Group Stanford
University
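The block/near-block claim can be checked numerically; a small NumPy sketch (the matrices are illustrative, and this particular perturbation shifts the eigenvalues to 1.96/2.04 rather than the slide's 2.02/−0.02):

```python
import numpy as np

# Perfectly block-diagonal affinity matrix: two blocks of two items
A = np.array([[1., 1., 0., 0.],
              [1., 1., 0., 0.],
              [0., 0., 1., 1.],
              [0., 0., 1., 1.]])
vals, vecs = np.linalg.eigh(A)           # symmetric, so eigh
# Eigenvalues are (0, 0, 2, 2); the leading eigenspace is spanned by
# the block indicator vectors (1,1,0,0) and (0,0,1,1)
top = vecs[:, np.argsort(vals)[-2:]]     # the two leading eigenvectors

# Slightly couple the two blocks: a near-block matrix
E = np.array([[0., 0., 1., 1.],
              [0., 0., 1., 1.],
              [1., 1., 0., 0.],
              [1., 1., 0., 0.]])
vals_b = np.linalg.eigvalsh(A + 0.02 * E)
# Eigenvalues move only slightly: (0, 0, 1.96, 2.04)
```

Because the leading eigenspace is spanned by the block indicators, thresholding the leading eigenvectors recovers the block membership, and a small perturbation of the matrix perturbs that eigenspace only slightly.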
52
Spectral Space
  • Can put items into blocks by eigenvectors
  • Resulting clusters independent of row ordering

[Figure: items plotted in spectral space, with coordinates given by eigenvectors e1 and e2]
Slides from Dan Klein, Sep Kamvar, Chris
Manning, Natural Language Group Stanford
University
53
The Spectral Advantage
  • The key advantage of spectral clustering is the
    spectral space representation

Slides from Dan Klein, Sep Kamvar, Chris
Manning, Natural Language Group Stanford
University
54
Clustering and Classification
  • Once our data is in spectral space
  • Clustering
  • Classification

Slides from Dan Klein, Sep Kamvar, Chris
Manning, Natural Language Group Stanford
University
55
Measuring Affinity
Affinity is typically a Gaussian of distance in some feature space, e.g. aff(x, y) = exp(−‖f(x) − f(y)‖² / (2σ²)), where the features f can be intensity, position (distance), or texture.
From Marc Pollefeys COMP 256 2003
56
Scale affects affinity
From Marc Pollefeys COMP 256 2003
57
From Marc Pollefeys COMP 256 2003
58
Outline
  • Segmentation Challenges
  • Segmentation by Clustering
  • Segmentation by Graph Cuts
  • Segmentation by Spectral Clustering
  • Active Contours and Snakes

59
Snakes: Introduction
  • The active contour model, or snake, is defined as an energy-minimizing spline.
  • Active contours result from the work of Kass et al. in 1987.
  • Active contour models may be used in image segmentation and understanding.
  • The snake's energy depends on its shape and location within the image.
  • Snakes can be closed or open.

60
Example
61
Introduction (3)
  • First an initial spline (snake) is placed on the image, and then its energy is minimized.
  • Local minima of this energy correspond to desired image properties.
  • Unlike most other image models, the snake is active: it always minimizes its energy functional and therefore exhibits dynamic behavior.
  • Snakes are also suitable for analysis of dynamic data or 3D image data.

62
Kass Algorithm
  • The snake is defined parametrically as v(s) = (x(s), y(s)), where s ∈ [0, 1] is the normalized arc length along the contour. The energy functional to be minimized may be written as
    E_snake = ∫₀¹ [α(s) E_cont + β(s) E_curv + γ(s) E_image] ds
  • E_cont: snake continuity
  • E_curv: snake curvature
  • E_image: image forces (e.g., edge attraction)

63
Internal Energy
  • Continuity (standard discrete form): E_cont = |v_i − v_{i−1}|², or (d̄ − |v_i − v_{i−1}|)² with d̄ the mean point spacing, to discourage shrinking
  • Curvature (standard discrete form): E_curv = |v_{i−1} − 2v_i + v_{i+1}|²

64
Image Forces
  • Dark/bright lines: E_image = ±I(x, y) (attraction to dark or bright lines)
  • Edges: E_image = −|∇I(x, y)|² (attraction to strong edges)

65
Trade-offs
  • α, β, γ determine the trade-off between continuity, curvature, and image forces

66
Numerical Algorithm
  • Select N initial locations p1, …, pN
  • Update until convergence
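One greedy flavor of the update step can be sketched as follows (in the style of Williams & Shah; the curvature term is omitted for brevity, and the function name, weights, and toy energy image are all illustrative):

```python
import numpy as np

def greedy_snake_step(pts, e_image, alpha=1.0, gamma=1.0):
    """One greedy pass: move each point to the 3x3 neighbor that
    minimizes continuity energy + image energy."""
    h, w = e_image.shape
    new_pts = list(pts)
    for i in range(len(pts)):
        prev = new_pts[i - 1]                      # closed contour
        best, best_e = pts[i], float("inf")
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                r, c = pts[i][0] + dr, pts[i][1] + dc
                if not (0 <= r < h and 0 <= c < w):
                    continue
                e_cont = (r - prev[0]) ** 2 + (c - prev[1]) ** 2
                e = alpha * e_cont + gamma * e_image[r, c]
                if e < best_e:
                    best_e, best = e, (r, c)
        new_pts[i] = best
    return new_pts

e_image = np.full((8, 8), 10.0)
e_image[:, 5] = 0.0                 # a low-energy (edge-like) column
pts = [(2, 4), (3, 4), (4, 4)]
new = greedy_snake_step(pts, e_image)
```

With only continuity and image terms, all three points collapse onto a single low-energy pixel after one pass, which is exactly the shrinking degeneracy that the curvature term and added constraints are meant to counter.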

67
Snakes, Done Right
  • Define a spline over p1, …, pN
  • Optimize the criterion for all points on the spline
  • Allow for corners
  • (the optimization becomes tricky and fills an entire literature)

68
Applications
http://www.markschulze.net/snakes/
  • Main applications:
  • Segmentation
  • Tracking
  • Registration

69
Examples (1)
Julien Jomier
70
Examples (2)
Julien Jomier
71
Examples (3)
Heart
Julien Jomier
72
Examples (4)
  • 3D Segmentation of the Hippocampus

Julien Jomier
73
Examples (5)
Julien Jomier
74
Example (6)
Julien Jomier
75
Example (7)
Julien Jomier
76
Problems with Snakes
  • Snakes sometimes degenerate in shape by shrinking and flattening.
  • Stability and convergence of the contour deformation process are unpredictable. Solution: add constraints.
  • Initialization is not straightforward. Solution: manual, learned, or exhaustive initialization.