1
HMM-BASED PATTERN DETECTION
  • Image Processing and Reconstruction
  • Winter 2002

2
Outline
  • Markov Process
  • Hidden Markov Models
  • Elements
  • Basic Problems
  • Evaluation
  • Optimization
  • Training
  • Implementation
  • 2-D HMM
  • Application
  • Simulation and Results

3
Markov Process
  • At any time, the process is in one of N distinct
    states
  • Its probabilistic description requires only a
    fixed specification of the current and previous
    states
  • q_t: the actual state at time t
  • a_ij = P(q_t = S_j | q_t-1 = S_i): the state
    transition probability (see the sketch below)
  • Each state corresponds to a physical (observable)
    event
  • Too restrictive for sophisticated applications

(Figure: a three-state Markov chain with states S1, S2, S3 and
transition probabilities a_ij, e.g. a_31 from S3 to S1.)
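A minimal sketch of such a chain, assuming an illustrative 3x3
transition matrix (the values below are hypothetical, not from the
presentation):

    import numpy as np

    # Hypothetical 3-state transition matrix: A[i, j] = P(q_t = S_j | q_t-1 = S_i).
    # Each row sums to 1.
    A = np.array([[0.7, 0.2, 0.1],
                  [0.1, 0.6, 0.3],
                  [0.3, 0.3, 0.4]])

    rng = np.random.default_rng(0)

    def sample_chain(A, start, T):
        """Sample a state sequence of length T from the Markov chain."""
        states = [start]
        for _ in range(T - 1):
            states.append(rng.choice(len(A), p=A[states[-1]]))
        return states

    print(sample_chain(A, start=0, T=10))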
4
Extension to Hidden Markov Models
  • A conditionally independent process on a Markov
    chain
  • States correspond to clusters of context with
    similar distribution
  • Elements of HMM, λ = (A, B, π):
  • A: the state transition probabilities a_ij
  • B: the observation symbol probabilities b_j(k) in
    each state
  • π: the initial state distribution
    (see the sketch below)
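A minimal sketch of these elements for a discrete-symbol HMM (all
values below are hypothetical placeholders; later sketches reuse
these A, B, pi arrays):

    import numpy as np

    # Hypothetical discrete HMM with N = 2 states and M = 3 observation symbols.
    A = np.array([[0.8, 0.2],        # a_ij = P(q_t = S_j | q_t-1 = S_i)
                  [0.4, 0.6]])
    B = np.array([[0.5, 0.4, 0.1],   # b_j(k) = P(symbol k | state S_j)
                  [0.1, 0.3, 0.6]])
    pi = np.array([0.6, 0.4])        # initial state distribution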

5
Fundamental Problems for HMM
  • Evaluation
  • the probability of the observation sequence
    O = O_1 O_2 ... O_T given the model λ, i.e. P(O | λ)
  • Optimization
  • choosing the optimal state sequence given the
    observations and the model λ
  • Training
  • estimating the model parameters to maximize P(O | λ)

6
Evaluating the Model: Forward-Backward Algorithm
  • The direct calculation is on the order of 2T N^T
    operations
  • The Forward-Backward procedure is on the order of
    N^2 T operations
  • Forward variable: α_t(i) = P(O_1 ... O_t, q_t = S_i | λ)
  • Backward variable: β_t(i) = P(O_t+1 ... O_T | q_t = S_i, λ)
    (see the sketch below)
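A minimal sketch of the forward pass, assuming the hypothetical
discrete A, B, pi defined on slide 4; it computes P(O | λ) in
O(N^2 T) time:

    import numpy as np

    def forward(A, B, pi, obs):
        """Forward algorithm: returns P(O | lambda) for a discrete HMM.

        alpha[t, i] = P(O_1 .. O_t, q_t = S_i | lambda).
        """
        T, N = len(obs), A.shape[0]
        alpha = np.zeros((T, N))
        alpha[0] = pi * B[:, obs[0]]                      # initialization
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]  # induction
        return alpha[-1].sum()                            # termination

    # e.g. forward(A, B, pi, obs=[0, 2, 1, 1]) with the arrays from slide 4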

7
Optimal State Sequence: Solution(s)
  • One solution: choose the states that are
    individually most likely.
  • The optimal solution has to be a valid state
    sequence, which this choice does not guarantee!
  • Viterbi algorithm: find the single best state
    sequence Q that maximizes P(Q | O, λ)
    (see the sketch below)
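A minimal sketch of the Viterbi recursion in log space, again
assuming the hypothetical A, B, pi from slide 4 (zero probabilities
become -inf, which numpy handles):

    import numpy as np

    def viterbi(A, B, pi, obs):
        """Find the single best state sequence maximizing P(Q | O, lambda)."""
        logA, logB, logpi = np.log(A), np.log(B), np.log(pi)
        T, N = len(obs), A.shape[0]
        delta = np.zeros((T, N))           # best log-score ending in state j at time t
        psi = np.zeros((T, N), dtype=int)  # back-pointers
        delta[0] = logpi + logB[:, obs[0]]
        for t in range(1, T):
            scores = delta[t - 1][:, None] + logA   # scores[i, j]: from state i to j
            psi[t] = scores.argmax(axis=0)
            delta[t] = scores.max(axis=0) + logB[:, obs[t]]
        # Backtrack from the most likely final state.
        path = [int(delta[-1].argmax())]
        for t in range(T - 1, 0, -1):
            path.append(int(psi[t, path[-1]]))
        return path[::-1]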

8
Training the Model
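Training is commonly done with the Baum-Welch (EM) re-estimation
formulas. The following is a non-authoritative sketch of a single
re-estimation step for a discrete HMM, reusing the hypothetical
A, B, pi above; it is unscaled and so only suitable for short
sequences (see slide 10 for the scaling issue):

    import numpy as np

    def baum_welch_step(A, B, pi, obs):
        """One Baum-Welch (EM) re-estimation step for a discrete HMM (unscaled sketch)."""
        obs = np.asarray(obs)
        N, M, T = A.shape[0], B.shape[1], len(obs)
        # Forward and backward passes.
        alpha = np.zeros((T, N))
        beta = np.zeros((T, N))
        alpha[0] = pi * B[:, obs[0]]
        for t in range(1, T):
            alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
        beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])
        # gamma[t, i] = P(q_t = S_i | O, lambda)
        gamma = alpha * beta
        gamma /= gamma.sum(axis=1, keepdims=True)
        # xi[t, i, j] = P(q_t = S_i, q_t+1 = S_j | O, lambda)
        xi = (alpha[:-1, :, None] * A[None, :, :] *
              (B[:, obs[1:]].T * beta[1:])[:, None, :])
        xi /= xi.sum(axis=(1, 2), keepdims=True)
        # Re-estimation formulas.
        new_pi = gamma[0]
        new_A = xi.sum(axis=0) / gamma[:-1].sum(axis=0)[:, None]
        new_B = np.array([gamma[obs == k].sum(axis=0) for k in range(M)]).T
        new_B /= gamma.sum(axis=0)[:, None]
        return new_A, new_B, new_pi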
9
Continuous Observation Distributions
  • In most applications (speech, image, ...),
    observations cannot be characterized as discrete
    symbols from a finite alphabet and must instead be
    described by a probability density function (PDF).
  • The most general representation of the PDF is a
    finite mixture of normal distributions with
    different means and variances for each state.
  • Estimate the means and variances instead of the
    discrete probabilities b_j(k)
    (see the sketch below)
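A minimal sketch of such an emission density for a single state,
assuming a diagonal-covariance Gaussian mixture (all parameter
values below are hypothetical):

    import numpy as np

    def gmm_emission(o, weights, means, variances):
        """b_j(o): mixture-of-Gaussians emission density for one state.

        weights: (K,) mixture weights summing to 1
        means, variances: (K, D) per-component mean and variance of the D-dim feature
        """
        o = np.asarray(o)
        # Per-component diagonal Gaussian densities, then the weighted sum.
        norm = np.prod(2 * np.pi * variances, axis=1) ** -0.5
        expo = np.exp(-0.5 * np.sum((o - means) ** 2 / variances, axis=1))
        return np.sum(weights * norm * expo)

    # Hypothetical single-state mixture with K = 2 components over a 3-D feature.
    w = np.array([0.3, 0.7])
    mu = np.array([[0.0, 0.0, 0.0], [1.0, 2.0, 0.5]])
    var = np.array([[1.0, 1.0, 1.0], [0.5, 0.5, 0.5]])
    print(gmm_emission([0.5, 1.0, 0.2], w, mu, var))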

10
Implementation Considerations
  • Scaling: the dynamic range of α and β will exceed
    the precision range of any machine
    (see the sketch below)
  • Multiple observation sequences for training
  • Initial estimation of HMM parameters
  • good initial values of the PDFs help convergence
    considerably
  • Choice of model, number of states, and choice of
    observation PDF
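A minimal sketch of the usual remedy, assuming the discrete forward
pass from slide 6: normalize α at every step and accumulate the log
of the scaling coefficients, so the result is log P(O | λ) rather
than an underflowed probability:

    import numpy as np

    def forward_scaled(A, B, pi, obs):
        """Scaled forward pass: returns log P(O | lambda) without underflow."""
        alpha = pi * B[:, obs[0]]
        c = alpha.sum()                 # scaling coefficient c_1
        alpha = alpha / c
        log_prob = np.log(c)
        for t in range(1, len(obs)):
            alpha = (alpha @ A) * B[:, obs[t]]
            c = alpha.sum()             # scaling coefficient c_t
            alpha = alpha / c
            log_prob += np.log(c)
        return log_prob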

11
Two-Dimensional HMM
  • Set of Markovian states within each super-state
  • Transition probability of a block state depends on
    its left and top neighbors
    (see the sketch below)
  • Useful for segmentation

(Figure: a lattice of sub-states inside a super-state; the sub-state
S_i,j depends on its top neighbor S_i-1,j and left neighbor S_i,j-1.)
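A minimal sketch of such a 2-D transition probability, stored here as
a hypothetical N x N x N tensor indexed by the top neighbor, the left
neighbor, and the current block state (the layout is an assumption
for illustration):

    import numpy as np

    # Hypothetical 2-D HMM transition tensor for N = 3 states:
    # A2[k, l, m] = P(S_i,j = m | S_i-1,j = k, S_i,j-1 = l).
    N = 3
    rng = np.random.default_rng(1)
    A2 = rng.random((N, N, N))
    A2 /= A2.sum(axis=2, keepdims=True)   # normalize over the current state m

    def transition_prob(A2, top, left, current):
        """P(current block state | state above, state to the left)."""
        return A2[top, left, current]

    print(transition_prob(A2, top=0, left=2, current=1))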
12
Application: Pattern Detection
  • SNR = -5
  • SNR = 10

13
Simulations
  • Feature vector: DCT coefficients, or averages over
    subsets of them (see the sketch below)
  • Block size: 16×16
  • Images in both the training set and the test set
    contain differently rotated jincs, but their
    distances and centers are fixed.
  • The K-means clustering algorithm is run for the
    initial estimation
  • Compared with template matching and Learning
    Vector Quantization

(Figure: average of the absolute values of the DCT coefficients.)
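A minimal sketch of the feature extraction described above, assuming
a 2-D DCT over each 16×16 block and keeping the absolute values of
the low-frequency coefficients (the `keep` parameter and coefficient
selection are illustrative, not from the presentation):

    import numpy as np
    from scipy.fft import dctn

    def block_dct_features(image, block=16, keep=8):
        """Per-block features: 2-D DCT of each non-overlapping 16x16 block.

        Returns one feature vector per block: the absolute values of the
        lowest-frequency keep x keep DCT coefficients, flattened.
        """
        H, W = image.shape
        feats = []
        for r in range(0, H - block + 1, block):
            for c in range(0, W - block + 1, block):
                coeffs = dctn(image[r:r + block, c:c + block], norm='ortho')
                feats.append(np.abs(coeffs[:keep, :keep]).ravel())
        return np.array(feats)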
14
Results and Conclusion!
  • Detection Error