Outline - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Outline
  • Statistical Modeling and Conceptualization of
    Visual Patterns
  • S. C. Zhu, "Statistical modeling and conceptualization of visual patterns," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 25, no. 6, pp. 1-22, 2003

2
A Common Framework of Visual Knowledge
Representation
  • Visual patterns in natural images
  • Natural images consist of an overwhelming number
    of visual patterns
  • Generated by very diverse stochastic processes
  • Comments:
  • Any single image normally consists of a few recognizable/segmentable visual patterns
  • Scientifically, given that visual patterns are generated by stochastic processes, should we model the underlying stochastic processes themselves, or the visual patterns present in observations drawn from those processes?

3
A Common Framework of Visual Knowledge
Representation cont.
4
A Common Framework of Visual Knowledge
Representation cont.
  • Image analysis as an image parsing problem
  • Parse generic images into their constituent patterns (according to the underlying stochastic processes)
  • Perceptual grouping when applied to point, line, and curve processes
  • Image segmentation when applied to region
    processes
  • Object recognition when applied to high level
    objects

5
A Common Framework of Visual Knowledge
Representation cont.
6
A Common Framework of Visual Knowledge
Representation cont.
  • Required components for parsing
  • Mathematical definitions and models of various
    visual patterns
  • Definitions and models are intrinsically
    recursive
  • Grammars (also called rules)
  • These specify the relationships among the various patterns
  • Grammars should be stochastic in nature
  • A parsing algorithm

7
Syntactic Pattern Recognition
8
A Common Framework of Visual Knowledge
Representation cont.
  • Conceptualization of visual patterns
  • The concept of a pattern is an abstraction of some properties, determined by certain visual purposes
  • These properties are feature statistics computed from:
  • Raw signals
  • Hidden descriptions inferred from raw signals
  • Mathematically, each pattern is equivalent to a
    set of observable signals governed by a
    statistical model

9
A Common Framework of Visual Knowledge
Representation cont.
  • Statistical modeling of visual patterns
  • Statistical models are intrinsic representations
    of visual knowledge and image regularities
  • Due to noise and distortion in the imaging process?
  • Due to noise and distortion in the underlying
    generative process?
  • Due to transformations in the underlying
    stochastic process?
  • Pattern theory

10
A Common Framework of Visual Knowledge
Representation cont.
  • Statistical modeling of visual patterns -
    continued
  • Mathematical spaces for patterns
  • The space depends on the form of the representation:
  • Parametric
  • Non-parametric
  • Attributed graphs
  • Different models
  • Descriptive models
  • Bottom-up, feature-based models
  • Generative models
  • Hidden variables for generating images in a
    top-down manner

11
A Common Framework of Visual Knowledge
Representation cont.
  • Learning a visual vocabulary
  • Hierarchy of visual descriptions for general
    visual patterns
  • Vocabulary of visual description
  • Learning from an ensemble of natural images
  • A vocabulary alone is far from enough; compare:
  • The rich structures in physics
  • The large vocabulary of speech and language

12
A Common Framework of Visual Knowledge
Representation cont.
  • Computational tractability
  • Computational heuristics for effective inference
    of visual patterns
  • Discriminative models
  • A framework (see the sketch below):
  • Discriminative probabilities are used as proposal probabilities that drive the Markov chain search, for fast convergence and mixing
  • Generative models supply the top-down probabilities, and the hidden variables are inferred from the posterior probabilities
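The coupling of the two model types can be made concrete with a toy sketch (mine, not the paper's): a Metropolis-Hastings independence sampler in which a bottom-up discriminative approximation q plays the proposal role, while the (unnormalized) generative posterior p is the target. All distributions and numbers below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy unnormalized posterior over 5 hypotheses; in DDMCMC this would be
# the generative posterior p(H | I_obs), known only up to a constant.
p_unnorm = np.array([0.05, 0.10, 0.60, 0.20, 0.05])

# Data-driven proposal: a discriminative approximation q(H) computed
# bottom-up from image features; here just a rough guess at p.
q = np.array([0.10, 0.15, 0.40, 0.25, 0.10])

def mh_independence_sampler(n_steps, state=0):
    """Metropolis-Hastings with an independence proposal q."""
    samples = []
    for _ in range(n_steps):
        cand = rng.choice(len(q), p=q)
        # Acceptance ratio for an independence proposal:
        # a = [p(cand) q(state)] / [p(state) q(cand)]
        a = (p_unnorm[cand] * q[state]) / (p_unnorm[state] * q[cand])
        if rng.random() < min(1.0, a):
            state = cand
        samples.append(state)
    return np.array(samples)

samples = mh_independence_sampler(20000)
print(np.bincount(samples, minlength=5) / len(samples))  # approx. normalized p
```

The closer q is to the true posterior, the higher the acceptance rate and the faster the chain mixes, which is exactly the role the discriminative probabilities play here.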

13
A Common Framework of Visual Knowledge
Representation cont.
  • Discussion
  • Images are generated by rendering 3D objects
    under some external conditions
  • All the images from one object form a low
    dimensional manifold in a high dimensional image
    space
  • Rendering can be modeled fairly accurately
  • Describing a 3D object requires a huge amount of
    data
  • Under this setting
  • A visual pattern simply corresponds to the
    manifold
  • A descriptive model attempts to characterize the manifold
  • A generative model attempts to learn the 3D objects and the rendering

14
3D Model-Based Recognition
15
Literature Survey
  • To develop a generic vision system, regularities
    in images must be modeled
  • The study of natural image statistics
  • Ecological influences on visual perception
  • Natural images have higher-order (i.e., non-Gaussian) structure
  • The histograms of Gabor-type filter responses on natural images have high kurtosis (see the sketch below)
  • Histograms of gradient-filter responses are consistent over a range of scales
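As a hedged illustration of the kurtosis claim (my toy example, not the presentation's data), the snippet below measures the excess kurtosis of gradient-filter responses on a piecewise-constant "cartoon" image; flat regions with occasional jumps produce the sharply peaked, heavy-tailed histogram typical of natural images.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy piecewise-constant "cartoon" image: flat regions with occasional
# jumps, mimicking the edge structure of natural images.
blocks = rng.standard_normal((16, 16))
img = np.repeat(np.repeat(blocks, 16, axis=0), 16, axis=1)   # 256 x 256

# Response of a horizontal gradient filter
r = (img[:, 1:] - img[:, :-1]).ravel()

# Excess kurtosis: E[(r - mu)^4] / sigma^4 - 3; zero for a Gaussian.
mu, sigma = r.mean(), r.std()
print("excess kurtosis:", ((r - mu) ** 4).mean() / sigma**4 - 3)
```

With a real photograph loaded into img, the value is likewise far above the Gaussian baseline of zero.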

16
Natural Image Statistics Example
17
Analytical Probability Models for Spectral
Representation
  • Transported generator model (Grenander and Srivastava, 2000):
    I(z) = Σ_i a_i g_i((z − z_i) / r_i), where
  • the g_i are selected randomly from some generator space G
  • the weights a_i are i.i.d. standard normal
  • the scales r_i are i.i.d. uniform on the interval [0, L]
  • the locations z_i are samples from a 2D homogeneous Poisson process with uniform intensity λ, and
  • the parameters are assumed to be independent of each other (see the synthesis sketch below)
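A rough synthesis sketch of this model, under illustrative assumptions of my own (a single Gaussian-bump generator stands in for the generator space G, and scales are clipped to at least one pixel):

```python
import numpy as np

rng = np.random.default_rng(0)
N, L, lam = 128, 20.0, 0.01            # image size, max scale, Poisson intensity

def gaussian_blob(size):
    """One possible generator g: an isotropic Gaussian bump."""
    ax = np.linspace(-2, 2, size)
    xx, yy = np.meshgrid(ax, ax)
    return np.exp(-(xx**2 + yy**2))

img = np.zeros((N, N))
n_gen = rng.poisson(lam * N * N)       # number of Poisson points on the lattice
for _ in range(n_gen):
    a = rng.standard_normal()          # i.i.d. N(0,1) weight a_i
    r = rng.uniform(1.0, L)            # scale r_i, clipped to >= 1 pixel
    z = rng.integers(0, N, size=2)     # location z_i of a homogeneous Poisson point
    g = a * gaussian_blob(int(2 * r) + 1)
    s = g.shape[0]
    x0, y0 = z
    x1, y1 = min(x0 + s, N), min(y0 + s, N)
    img[x0:x1, y0:y1] += g[: x1 - x0, : y1 - y0]   # superpose the generator
print(img.std())
```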

18
Analytical Probability Models - continued
  • Define u as the marginal response of a linear filter F applied to the image I
  • Model u by a scaled Bessel-K density

19
Analytical Probability Models - continued
20
Analytical Probability Models - continued
21
Analytical Probability Models - continued
22
Analysis of Natural Image Components
  • Harmonic analysis
  • Decomposing various classes of functions with respect to different bases
  • Including the Fourier transform, wavelet transforms, edgelets, curvelets, and so on

23
Sparse Coding
From S. C. Zhu
24
Grouping of Natural Image Elements
  • Gestalt laws
  • Gestalt grouping laws
  • Should be interpreted as heuristics rather than
    deterministic laws
  • Nonaccidental property

25
Illusion
26
Illusion cont.
27
Ambiguous Figure
28
Statistical Modeling of Natural Image Patterns
  • Synthesis-by-analysis

29
Analogy from Speech Recognition
30
Modeling of Natural Image Patterns
  • Shape-from-X problems are fundamentally ill-posed
  • Markov random field models
  • Deformable templates for objects
  • Inhomogeneous MRF models on graphs

31
Four Categories of Statistical Models
  • Descriptive models
  • Constructed based on statistical descriptions of
    the image ensembles
  • Homogeneous models
  • Statistics are assumed to be the same for all
    elements in the graph
  • Inhomogeneous models
  • The elements of the underlying graph are labeled, and different features and statistics are used at different sites

32
Variants of Descriptive Models
  • Causal Markov models
  • By imposing a partial ordering among the vertices of the graph, the joint probability can be factorized as a product of conditional probabilities (see the factorization below)
  • Belief propagation networks
  • Pseudo-descriptive models
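In symbols (a standard statement of this factorization, with pa(i) denoting the parents of vertex i under the imposed ordering):

```latex
% Each node x_i depends only on the vertices pa(i) that precede it,
% so the joint density factorizes causally:
p(x_1, \dots, x_n) \;=\; \prod_{i=1}^{n} p\!\left(x_i \mid x_{\mathrm{pa}(i)}\right)
```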

33
Generative Models
  • Use of hidden variables that can explain away
    the strong dependency in observed images
  • This requires a vocabulary
  • Grammars to generate images from hidden variables
  • Note that generative models cannot be separated from descriptive models
  • The description of the hidden variables requires descriptive models

34
Discriminative Models
  • Approximation of posterior probabilities of
    hidden variables based on local features
  • Can be seen as importance proposal probabilities

35
An Example
36
Problem Formulation
  • Input: a set of images

Output: a probability model p(I)
Here, f(I) represents the ensemble of images in a given domain; we shall discuss the relationship between the ensemble and the probability model later.
37
Problem Formulation

The model p approaches the true density f, in the sense of the Kullback-Leibler divergence (see the identity below).
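The sense of "approaches" can be spelled out with a standard identity (notation assumed here: M training images I_m drawn from f): minimizing the divergence from f to p differs from maximizing the expected log-likelihood only by a constant, so MLE on a large sample drives p toward f.

```latex
\mathrm{KL}(f \,\|\, p)
  = \mathbb{E}_f\!\left[\log \frac{f(I)}{p(I)}\right]
  = -\,\mathbb{E}_f\!\left[\log p(I)\right] - H(f),
\qquad
p^{*} = \arg\max_{p} \frac{1}{M} \sum_{m=1}^{M} \log p(I_m)
```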
38
Maximum Likelihood Estimate
39
Model Pursuit
1. What is Ω, the family of models?
2. How do we augment the space Ω?
40
Two Choices of Models
  1. The exponential family -- descriptive models
--- characterize images by features and statistics
  2. The mixture family -- generative models
--- characterize images by hidden variables
41
I Descriptive Models
  • Step 1: extract image features/statistics as transforms

For example: histograms of Gabor filter responses (see the sketch below).
Other features/statistics: Gabors, geometry, Gestalt laws, faces.
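A minimal sketch of such a feature statistic (illustrative parameters of my own; assumes NumPy and SciPy are available):

```python
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(0)
img = rng.standard_normal((128, 128))   # stand-in for an input texture image

def gabor_kernel(ksize=11, theta=0.0, freq=0.2, sigma=3.0):
    """A real-valued Gabor kernel at orientation theta."""
    ax = np.arange(ksize) - ksize // 2
    xx, yy = np.meshgrid(ax, ax)
    xr = xx * np.cos(theta) + yy * np.sin(theta)
    yr = -xx * np.sin(theta) + yy * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * freq * xr)

def marginal_histogram(image, kernel, bins=15, lo=-5, hi=5):
    """Normalized histogram of filter responses: one descriptive statistic."""
    resp = convolve2d(image, kernel, mode="valid")
    h, _ = np.histogram(resp, bins=bins, range=(lo, hi))
    return h / h.sum()

# A small bank over orientations; stacking the histograms gives the
# feature statistics used to constrain the model in Step 2.
feats = [marginal_histogram(img, gabor_kernel(theta=t))
         for t in np.linspace(0, np.pi, 4, endpoint=False)]
print(np.round(feats[0], 3))
```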
42
I.I Descriptive Models
Step 2: use the features/statistics to constrain the model.
Two cases:
  1. On the infinite lattice Z² --- an equivalence class.
  2. On any finite lattice --- a conditional probability model.

(Figure: the image space on Z² and the image space on a finite lattice Λ)
43
I.I Descriptive Model on Finite Lattice
Modeling by maximum entropy:
  p* = arg max_p { − Σ_I p(I) log p(I) }
subject to the constraints E_p[φ_n(I)] = E_f[φ_n(I)] for the chosen statistics.
Remark: p and f then have the same projected marginal statistics.
44
Minimax Entropy Learning
For a Gibbs (maximum-entropy) model p, this leads to the minimax entropy principle (Zhu, Wu, and Mumford, 1996, 1997), stated below.
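One common way to write the principle (notation assumed: Ω(S) is the family of distributions that reproduce the statistics of feature set S):

```latex
% Max step: among all models matching the statistics of S, take the
% least-committed, maximum-entropy one.  Min step: choose the feature
% set whose maxent model is tightest, i.e. has minimum entropy:
p^{*}(I; S) = \arg\max_{p \,\in\, \Omega(S)} \operatorname{entropy}(p),
\qquad
S^{*} = \arg\min_{S} \operatorname{entropy}\!\left(p^{*}(I; S)\right)
```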
45
FRAME Model
  • FRAME model
  • Filtering, random field, and maximum entropy
  • A well-defined mathematical model for textures by
    combining filtering and random field models

46
I.I Descriptive Model on Finite Lattice
The FRAME model (Zhu, Wu, and Mumford, 1996):
  p(I; Λ, S) = (1/Z(Λ)) exp{ − Σ_α ⟨λ^(α), H^(α)(I)⟩ }
where H^(α)(I) is the histogram of responses to filter F^(α) and Λ = {λ^(α)} collects the learned potentials.
This includes all Markov random field models.
Remark: all known exponential-family models arise from maximum entropy, and maximum entropy was proposed in physics (Jaynes, 1957). The nice thing is that it provides a parametric model integrating the features.

47
I.I Descriptive Model on Finite Lattice
Two learning phases:
  1. Choose information-bearing features -- augmenting the probability family.
  2. Compute the parameters Λ by MLE -- learning within a family.
48
Maximum Entropy
  • Maximum entropy
  • An important principle in statistics for constructing a probability distribution on a set of random variables
  • Suppose the available information is the expectations of some known functions φ_n(x), that is, E_p[φ_n(x)] = μ_n for n = 1, ..., N
  • Let Ω be the set of all probability distributions p(x) that satisfy these constraints

49
Maximum Entropy cont.
  • Maximum entropy continued
  • According to the maximum entropy principle, a good choice of probability distribution is the one with maximum entropy:
    p* = arg max_p { − Σ_x p(x) log p(x) }
  • subject to the constraints E_p[φ_n(x)] = μ_n, n = 1, ..., N, and Σ_x p(x) = 1

50
Maximum Entropy cont.
  • Maximum entropy continued
  • By Lagrange multipliers, the solution for p(x) is
    p(x; λ) = (1/Z(λ)) exp{ Σ_n λ_n φ_n(x) }
  • where Z(λ) = Σ_x exp{ Σ_n λ_n φ_n(x) } is the normalizing constant
51
Maximum Entropy cont.
  • Maximum Entropy continued
  • are determined by
    the constraints
  • But a closed form solution is not available
    general
  • Numerical solutions
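One standard numerical route is gradient ascent on the dual (Lagrangian) objective, since the gradient is just the mismatch between target and model expectations. Below is a toy sketch on a four-state space with two invented features; everything here is illustrative.

```python
import numpy as np

# Toy state space x in {0,1,2,3} with two feature functions phi_n(x).
X = np.arange(4)
phi = np.stack([X.astype(float), (X == 2).astype(float)])  # shape (2, 4)
target = np.array([1.8, 0.4])     # given expectations E[phi_n] = mu_n

lam = np.zeros(2)                 # Lagrange multipliers, one per constraint
for _ in range(2000):
    logits = lam @ phi            # log p(x) up to the log-partition term
    p = np.exp(logits - logits.max())
    p /= p.sum()
    grad = target - phi @ p       # mu_n - E_p[phi_n]
    lam += 0.1 * grad             # gradient ascent on the dual
print("lambda:", lam, " model expectations:", phi @ p)
```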

52
Maximum Entropy cont.
  • Maximum Entropy continued
  • The solutions are guaranteed to exist and be
    unique by the following properties

53
Minimax Entropy Learning (cont.)
Intuitive interpretation of minimax entropy.
54
Learning a High-Dimensional Density
55
Toy Example I
56
Toy Example II
57
FRAME Model
  • Texture modeling
  • The features φ_n(x) can be any statistics you choose
  • Histograms of filter responses are a good feature for textures

58
FRAME Model cont.
  • The FRAME algorithm
  • Initialization:
  • Input: a texture image I_obs
  • Select a group of K filters S_K = {F^(1), F^(2), ..., F^(K)}
  • Compute the observed histograms H_obs^(α), α = 1, ..., K
  • Initialize all λ^(α) = 0
  • Initialize I_syn as a uniform white noise image

59
FRAME Model cont.
  • The FRAME algorithm continued
  • The algorithm (a compact sketch follows):
  • Repeat
  • Calculate H_syn^(α), α = 1, ..., K, from I_syn and use it as an estimate of the model expectation E_p[H^(α)]
  • Update each λ^(α) by λ^(α) ← λ^(α) + η (H_syn^(α) − H_obs^(α))
  • Apply the Gibbs sampler to flip I_syn for w sweeps
  • until Σ_α |H_syn^(α) − H_obs^(α)| falls below a threshold
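A compact, illustrative rendering of this loop (a single difference filter stands in for the filter bank, and the lattice is tiny so it runs quickly; this is my sketch, not the original implementation):

```python
import numpy as np

rng = np.random.default_rng(0)
G = 8                                  # grey levels
BINS = np.linspace(-1, 1, 9)           # 8 histogram bins for filter responses

def hist_feature(img):
    """Marginal histogram of one gradient filter (stand-in filter bank)."""
    resp = (img[:, 1:] - img[:, :-1]) / (G - 1)
    h, _ = np.histogram(resp, bins=BINS)
    return h / h.sum()

def gibbs_sweep(img, lam, beta=1.0):
    """One sweep of single-site Gibbs updates under the FRAME-style energy."""
    for x in range(img.shape[0]):
        for y in range(img.shape[1]):
            energies = np.empty(G)
            for v in range(G):         # energy of each candidate grey level
                img[x, y] = v
                energies[v] = lam @ hist_feature(img)
            p = np.exp(-beta * (energies - energies.min()))
            img[x, y] = rng.choice(G, p=p / p.sum())
    return img

I_obs = rng.integers(0, G, (16, 16)).astype(float)
h_obs = hist_feature(I_obs)
lam = np.zeros(len(h_obs))
I_syn = rng.integers(0, G, (16, 16)).astype(float)  # white-noise start
for _ in range(20):
    I_syn = gibbs_sweep(I_syn, lam)
    h_syn = hist_feature(I_syn)
    lam += 0.5 * (h_syn - h_obs)       # update: lambda <- lambda + eta (H_syn - H_obs)
    if np.abs(h_syn - h_obs).sum() < 0.05:
        break
```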

60
FRAME Model cont.
  • The Gibbs sampler
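For concreteness, here is a single-site Gibbs sampler for the simplest MRF, the Ising model, where the conditional distribution at each site is available in closed form (an illustration of the sampler's structure, not the slide's equations):

```python
import numpy as np

rng = np.random.default_rng(0)
N, beta = 64, 0.6                      # lattice size, inverse temperature

# Ising model: p(I) proportional to exp(beta * sum over neighbours I_i I_j).
I = rng.choice([-1, 1], size=(N, N))
for _ in range(100):                   # sweeps over the lattice
    for x in range(N):
        for y in range(N):
            # Sum of the 4 neighbours (toroidal boundary)
            s = (I[(x - 1) % N, y] + I[(x + 1) % N, y]
                 + I[x, (y - 1) % N] + I[x, (y + 1) % N])
            # Closed-form conditional p(I_xy = +1 | neighbours)
            p_plus = 1.0 / (1.0 + np.exp(-2.0 * beta * s))
            I[x, y] = 1 if rng.random() < p_plus else -1
print("mean spin:", I.mean())
```

In FRAME the visitation scheme is the same, but the site conditional is computed from the current potentials λ, as in the sketch above.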

61
FRAME Model cont.
  • Filter selection
  • In practice, we want a small number of good
    filters
  • One way to do this is to choose the filters that carry the most information
  • In other words, the filters that minimize the entropy of the resulting model

62
FRAME Model cont.
  • Filter selection algorithm (a greedy sketch follows)
  • Initialization
63
FRAME Model cont.
64
Descriptive Models cont.
65
Existing Texture Features
66
Existing Feature Statistics
67
Most General Feature Statistics
68
Julesz Ensemble cont.
  • Definition:
  • Given a set of normalized statistics h on a lattice Λ,
  • a Julesz ensemble Ω(h) is the limit of the set {I : H(I) ∈ H} as Λ → Z² and H → {h}, under some boundary conditions

69
Julesz Ensemble cont.
  • Feature selection
  • A feature can be selected from a large set of
    features through information gain, or the
    decrease in entropy

70
Example: 2D Flexible Shapes
71
A Random Field for 2D Shape
The neighborhood: co-linearity, co-circularity, proximity, parallelism, symmetry, ...
72
A Descriptive Shape Model
Random 2D shapes sampled from a Gibbs model.
(Zhu, 1999)
73
A Descriptive Shape Model
Random 2D shapes sampled from a Gibbs model.
74
Example: Face Modeling
75
Generative Models
  • Use of hidden variables that can explain away
    the strong dependency in observed images
  • This requires a vocabulary
  • Grammars to generate images from hidden variables
  • Note that generative models cannot be separated from descriptive models
  • The description of the hidden variables requires descriptive models

76
Generative Models cont.
77
Philosophy of Generative Models
(Diagram: the world structure H generates the observed image, which reaches the observer)
78
Example of a Generative Model: Image Coding
Random variables: the hidden coefficients a_i
Parameters: the wavelets ψ_i
The model: I = Σ_i a_i ψ_i + ε
Assumptions: 1. An overcomplete basis; 2. High kurtosis for the i.i.d. a_i, e.g. a heavy-tailed prior (see the synthesis sketch below)
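A minimal synthesis sketch under these assumptions (random unit-norm columns stand in for the wavelets, and a thresholded Laplacian stands in for the high-kurtosis prior; all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
D, P = 256, 512                        # signal dimension, basis size

# Overcomplete basis Psi: more columns than dimensions (random unit
# vectors standing in for wavelets).
Psi = rng.standard_normal((D, P))
Psi /= np.linalg.norm(Psi, axis=0)

# Heavy-tailed i.i.d. coefficients: a Laplacian has positive excess
# kurtosis, a crude stand-in for the high-kurtosis prior on a.
a = rng.laplace(scale=1.0, size=P)
a[np.abs(a) < 2.0] = 0.0               # most coefficients inactive -> sparse

eps = 0.05 * rng.standard_normal(D)    # additive noise
I = Psi @ a + eps                      # generative model: I = sum_i a_i psi_i + eps
print("active coefficients:", np.count_nonzero(a))
```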
79
A Generative Model

(Zhu and Guo, 2000)
(Figure: occlusion and additive composition of image layers)
80
Example: Texton Map
One layer of hidden variables: the texton map
81
Learning with a Generative Model
82
Learning with a Generative Model
Learning by MLE
83
Stochastic Inference by DDMCMC
Goal: sampling H ~ p(H | I_obs; Θ)
Method: a "symphony" algorithm by data-driven Markov chain Monte Carlo (DDMCMC) (Zhu, Zhang, and Tu, 1999)
84
Example of a Generative Model
An observed image
85
Data Clustering
The saliency maps are used as proposal probabilities
86
(No Transcript)
87
(No Transcript)
88
A Descriptive Model for the Texton Map
89
Example of a Generative Model
90
Data Clustering
91
A Descriptive Model on the Texton Map
92
(No Transcript)
93
(No Transcript)
94
Example of a Generative Model
95
A Descriptive Model for the Texton Map