1
Embedding Gestalt Laws in Markov Random Fields
by Song-Chun Zhu
2
Purpose of the Paper
  • Proposes functions to measure Gestalt features of shapes
  • Adapts the Zhu, Wu, and Mumford FRAME method to shapes
  • Exhibits the effect of the MRF model obtained by putting these together

3
Recall Gestalt Features
  • (à la Lowe, and others)
  • Collinearity
  • Cocircularity
  • Proximity
  • Parallelism
  • Symmetry
  • Continuity
  • Closure
  • Familiarity

4
FRAME
  • Zhu, Wu, Mumford
  • Filters
  • Random fields
  • And
  • Maximum
  • Entropy
  • A general procedure for constructing MRF models

5
Three Main Parts
  • Data
  • Learn MRF models from data
  • Test generative power of learned model

6
Elements of Data
  • A set of images representative of the chosen
    application domain
  • An adequate collection of feature measures or
    filters
  • The (marginal) statistics of applying the feature
    measures or filters to the set of images

7
Data Images
  • Zhu considers 22 animal shapes and their
    horizontal flips
  • The resulting histograms are symmetric
  • More data can be obtained
  • But are there other effects?

8
Sample Animate Images
9
Contour-based Feature Measures
  • Goal is to be generic
  • But generic shape features are hard to find
  • f1 ?(s), the curvature
  • ?(s) 0 implies the linelets on either side of
    G(s) are colinear
  • f2 ?'(s), its derivative
  • ?'(s) 0 implies three sequential linelets are
    cocircular
  • Other contour-based shape filters can be defined
    in the same way
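
A minimal discrete version of f1 and f2, assuming the contour arrives as a closed polygon whose edges play the role of linelets; the turning-angle discretization is an illustrative choice, not necessarily the paper's:

```python
import numpy as np

def curvature_features(points):
    """Approximate f1 = kappa(s) and f2 = kappa'(s) on a closed polygonal contour.

    points: (N, 2) vertices; each edge is treated as a linelet.
    A sketch, not Zhu's exact discretization.
    """
    d = np.roll(points, -1, axis=0) - points       # linelet vectors
    theta = np.arctan2(d[:, 1], d[:, 0])           # linelet orientations
    ds = np.hypot(d[:, 0], d[:, 1])                # linelet lengths

    # Turning angle between consecutive linelets, wrapped to (-pi, pi].
    dtheta = np.angle(np.exp(1j * (np.roll(theta, -1) - theta)))

    kappa = dtheta / ds                            # f1: 0 where linelets are collinear
    dkappa = (np.roll(kappa, -1) - kappa) / ds     # f2: 0 where three linelets are cocircular
    return kappa, dkappa
```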

10
Zhu's Symmetry Function
  • The symmetry function Ψ(s) pairs linelets across medial axes
  • Defined and computed by minimizing an energy functional constructed so that
  • Paired linelets are as close, parallel, and symmetric as possible, and
  • There are as few discontinuities as possible (a hypothetical energy is sketched below)
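
The slide does not give the functional itself, so the following is a hypothetical instantiation: made-up weights and penalty terms for proximity, parallelism, symmetry, and discontinuities in the pairing psi.

```python
import numpy as np

def pairing_energy(pos, ang, psi, w=(1.0, 1.0, 1.0, 1.0)):
    """Energy of a candidate pairing psi, in the spirit of the slide above.

    pos: (N, 2) linelet midpoints; ang: (N,) linelet orientations;
    psi: (N,) index of the linelet paired with linelet i.
    All four terms and their weights are illustrative assumptions.
    """
    w1, w2, w3, w4 = w
    mate = psi.astype(int)
    dist = np.linalg.norm(pos - pos[mate], axis=1)          # proximity of paired linelets
    par = np.abs(np.angle(np.exp(2j * (ang - ang[mate]))))  # deviation from parallel (mod pi)
    sym = np.abs(np.gradient(dist))                         # asymmetry: pair distance should vary slowly
    jumps = np.count_nonzero(np.abs(np.diff(mate)) > 1)     # discontinuities in the pairing
    return w1 * dist.sum() + w2 * par.sum() + w3 * sym.sum() + w4 * jumps
```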

11
Region-based Feature Measures
  • f3(s) = r(s) = dist(s, Ψ(s))
  • Measures proximity of paired linelets across a region
  • f4(s) = f3'(s), the derivative
  • f4(s) = 0 implies paired linelets are parallel
  • f5(s) = f4'(s) = f3''(s)
  • f5(s) = 0 implies paired linelets are symmetric (a finite-difference sketch follows)
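
Given the pairing as an index array psi (same representation as above), the region features reduce to a distance profile and its finite-difference derivatives; a sketch:

```python
import numpy as np

def region_features(pos, psi):
    """f3 = r(s) = dist(s, psi(s)) and its first two derivatives.

    pos: (N, 2) linelet midpoints; psi: (N,) pairing indices from the
    symmetry function.  np.gradient stands in for d/ds, assuming
    roughly uniform arc-length spacing.
    """
    r = np.linalg.norm(pos - pos[psi.astype(int)], axis=1)  # f3: proximity across the region
    r1 = np.gradient(r)    # f4: 0 where the paired linelets are parallel
    r2 = np.gradient(r1)   # f5: 0 where the paired linelets are symmetric
    return r, r1, r2
```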

12
Another Possible Shape Feature
  • f6(s) = 1 where Ψ(s) is discontinuous, 0 otherwise
  • Counts the number of parts a shape has
  • Can Gestalt familiarity be (statistically?) measured?

13
The Statistic
  • The histogram of feature fk over curve Γ is
  • H(z; fk, Γ) = ∫ δ(z − fk(s)) ds
  • δ is the Dirac delta: mass 1 at 0, and 0 otherwise
  • µ(z; fk) denotes the average of these histograms over all images
  • Zhu claims µ is a close estimate of the marginal of the true distribution over shape space, assuming the total number of linelets is small (a binned version is sketched below)
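
In a discrete implementation the Dirac delta becomes binning over z; a sketch, where the bin edges and the per-curve (values, ds) representation are assumptions:

```python
import numpy as np

def feature_histogram(values, ds, bins):
    """Discretized H(z; f, Gamma): bin the feature values, weighting each
    linelet's contribution by its arc length ds."""
    h, _ = np.histogram(values, bins=bins, weights=ds)
    return h / h.sum()  # normalize so the bins sum to 1

def average_histogram(curves, feature, bins):
    """mu(z; f): average of the per-curve histograms over all images.
    curves: iterable of contours; feature(c) returns (values, ds)."""
    return np.mean([feature_histogram(*feature(c), bins) for c in curves], axis=0)
```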

14
Statistical Observations
On 22 images and their flips
[Slide shows the observed histograms: f1 at scales 0, 1, 2, and f3, f4, f5]
15
Construct a Model
  • Ω is the space of shapes
  • F is a finite subset of feature filters
  • We seek a probability distribution p on Ω
  • ∫_Ω p(Γ) dΓ = 1   (1)
  • that reproduces the statistics for all f in F:
  • ∫_Ω p(Γ) H(z; f, Γ) dΓ = µ(z; f)   (2)

16
Construct a Model, 2
  • Idea: choose the p with maximal entropy
  • Seems reasonable and fair, but is it really the best target/energy function?
  • Lagrange multipliers and calculus of variations lead to
  • p(Γ; F, Λ) = exp(Σ_{f∈F} ∫ λ_f(z) H(z; f, Γ) dz) / Z
  • where Z is the usual normalizing factor and
  • Λ = {λ_f : f ∈ F}  (an evaluation sketch follows)
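
Up to the intractable log Z, the log density is a sum of per-feature dot products between λ_f and the histogram of f; a sketch built on feature_histogram outputs from the earlier sketch:

```python
import numpy as np

def log_p_unnormalized(hists, lambdas):
    """log p(Gamma; F, Lambda) up to the constant -log Z.

    hists: list of discretized H(z; f, Gamma) arrays, one per feature in F
    (e.g. from feature_histogram above); lambdas: matching per-bin
    Lagrange multiplier arrays lambda_f(z)."""
    # Sum over f in F of the integral lambda_f(z) H(z; f, Gamma) dz,
    # with each integral taken as a dot product over bins.
    return sum(np.dot(lam, h) for lam, h in zip(lambdas, hists))
```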

17
It's a Gibbs Distribution
  • In other words, it has the form of a Gibbs
    distribution, and therefore determines a Markov
    Random Field (MRF) model.

18
Markov Chain Monte Carlo
  • Too hard to compute the λ's and p analytically
  • Idea: sample Ω according to the distribution p, stochastically update Λ to update p, and repeat until p reproduces all µ(z; f) for f ∈ F (sketched below)
  • Monte Carlo because of the random walk
  • Markov chain in the nature of the loop
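
A sketch of that loop, assuming a sampler (not shown, e.g. Metropolis moves on linelets) that draws shapes from the current model and returns their average histograms; the step size and the update rule λ_f += η(µ − µ') are standard maximum-entropy fitting choices, not details from the slide:

```python
import numpy as np

def learn_lambdas(mu_obs, sample_histograms, lambdas, eta=0.1, steps=100):
    """Stochastically update Lambda until the model reproduces the data statistics.

    mu_obs: dict feature -> observed mu(z; f);
    sample_histograms(lambdas): assumed callback that MCMC-samples shapes
    from p(.; F, Lambda) and returns their average histograms mu'(z; f).
    """
    for _ in range(steps):
        mu_syn = sample_histograms(lambdas)
        for f in lambdas:
            # Raise lambda_f where the model undershoots the data and lower
            # it where it overshoots (signs follow the exp(+sum lam*H) form above).
            lambdas[f] += eta * (mu_obs[f] - mu_syn[f])
    return lambdas
```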

19
Markov Chain Monte Carlo, 2
  • From the sampling produce µ'(z; f)
  • Same as µ(z; f) except based on a random sample of shape space
  • For the purposes of today's discussion, the details are not important
  • For f ∈ F, at convergence
  • µ'(z; f) ≈ µ(z; f)
  • Zhu et al. assume there exists a true underlying distribution

20
The Nonaccidental Statistic
  • For f' not in the set F we expect
  • µ'(z; f') ≠ µ(z; f')
  • µ'(z; f') is the accidental statistic for f'
  • It is a measure of correlation between f' and F
  • The distance (L1, L2, or other) between µ'(z; f') and µ(z; f') is the nonaccidental statistic for f' (see the sketch below)
  • It is a measure of how much additional information f' carries above what is already in F
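
In code the nonaccidental statistic is just a norm of the difference between the two histograms; a minimal sketch using L1:

```python
import numpy as np

def nonaccidental(mu_model, mu_obs):
    """L1 distance between the model's mu'(z; f') and the observed mu(z; f');
    L2 or other norms would do as well."""
    return np.abs(mu_model - mu_obs).sum()
```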

21
The Algorithm (simplified)
  • Enter your set {Γ} of shapes
  • Enter a (large) set of candidate feature measures f
  • Compute µ(z; f) over the shapes for every candidate f
  • Compute µ'(z; f) relative to a uniform distribution on Ω
  • Until the nonaccidental statistic of all unused features is small enough, repeat

22
Algorithm, 2
  • Of the remaining f, add to F one with maximal nonaccidental statistic
  • Update
  • the set of Lagrange multipliers Λ = {λ_f}
  • the probability distribution model p(Γ; F, Λ)
  • the µ'(z; f) for the remaining candidate features f
  • (the full pursuit loop is sketched below)
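
Putting the two algorithm slides together; fit_model is an assumed callback standing in for the Lagrange-multiplier fitting step (e.g. learn_lambdas above), and the stopping threshold eps is illustrative:

```python
import numpy as np

def pursue_features(candidates, mu_obs, mu_prior, fit_model, eps=1e-2):
    """Greedy feature pursuit (a sketch of slides 21-22).

    candidates: set of feature names; mu_obs[f]: observed mu(z; f);
    mu_prior[f]: mu'(z; f) under the uniform distribution on Omega;
    fit_model(F): fits Lambda for the chosen set F and returns
    mu'(z; f) for every candidate f under the fitted model.
    """
    def nonacc(mu_model, f):  # L1 nonaccidental statistic, as above
        return np.abs(mu_model[f] - mu_obs[f]).sum()

    F, mu_model = [], mu_prior
    while candidates:
        # Pick the unused feature whose statistics the current model gets most wrong.
        best = max(candidates, key=lambda f: nonacc(mu_model, f))
        if nonacc(mu_model, best) < eps:
            break                    # remaining features are nearly accidental
        F.append(best)
        candidates.remove(best)
        mu_model = fit_model(F)      # update Lambda, p, and all mu'(z; f)
    return F
```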

23
Experiments and Discussion
  • Let my description of these experiments stimulate
    your thoughts on such issues as
  • Are there better Gestalt feature measures?
  • What is the best possible outcome of a generative
    model of shape?
  • What feature measures should be added to the
    Gestalt ones?
  • How useful were these experiments, and what others might be worth doing?

24
Experiment 1
  • When the only feature used is the curvature κ, the model generated the shapes shown on the slide

25
Experiment 1, continued
  • A Gaussian model (with the same κ-variance) produced the shapes shown on the slide

26
Experiment 2
  • Experiment 2 uses both κ and κ'
  • The nonaccidental statistic of κ' with respect to the model based on κ can be seen here

27
Experiment 2, continued
  • This time the model generated these shapes, purported to be smoother and more scale-invariant

28
Experiment 3
  • The nonaccidental statistics of the three region-based shape features relative to the model produced in Experiment 2 are shown on the slide

29
Experiment 3, continued
  • So r'' (= f5) was omitted; this model has
  • F = {κ, κ', r, r'}

30
Experiment 3, continued
  • This model produced such shapes as those shown on the slide

31
Concluding Discussion
  • Zhu acknowledges that the selection of training shapes might introduce a bias

32
Discussion, continued
  • Zhu acknowledges that the paucity of Gestalt
    features limits the possible neighborhood
    structures used to define an MRF.
  • Zhu acknowledges that these models do not account
    for high-level shape properties, and suggests
    that a composition system might address this
    problem.

33
Questions and Comments
  • Although it is in the nature of an MRF model to
    propagate local properties, I think there needs
    to be a higher-level basis (than linelets) for
    measuring the Gestalt features of a shape!
  • Are there better Gestalt feature measures?
  • What feature measures should be added to the
    Gestalt ones?

34
More Questions for Discussion
  • What is the best possible outcome of a generative
    model of shape? Is such a thing worth pursuing?
  • How useful were Zhu's experiments and what others
    might be worth doing?