Tutorial on Particle Filters - Jan 2001

1
Tutorial on Particle Filters, assembled and
extended by Longin Jan Latecki, Temple University,
latecki@temple.edu, using slides from:
Keith Copsey, Pattern and Information Processing
Group, DERA Malvern; D. Fox, J. Hightower, L. Liao,
D. Schulz, and G. Borriello, Univ. of Washington,
Seattle; Honggang Zhang, Univ. of Maryland, College
Park; Miodrag Bolic, University of Ottawa, Canada;
and Michael Pfeiffer, TU Graz, Austria.
2
Outline
  • Introduction to particle filters
  • Recursive Bayesian estimation
  • Bayesian Importance Sampling
  • Sequential Importance Sampling (SIS)
  • Sampling-Importance Resampling (SIR)
  • Improvements to SIR
  • On-line Markov chain Monte Carlo
  • Basic Particle Filter algorithm
  • Example for robot localization
  • Conclusions

3
Particle Filters
  • Sequential Monte Carlo methods for on-line
    learning within a Bayesian framework.
  • Also known as:
    • Particle filters
    • Sequential sampling-importance resampling (SIR)
    • Bootstrap filters
    • Condensation trackers
    • Interacting particle approximations
    • Survival of the fittest

4
History
  • First attempts: simulations of growing polymers
    • M. N. Rosenbluth and A. W. Rosenbluth, Monte
      Carlo calculation of the average extension of
      molecular chains, Journal of Chemical Physics,
      vol. 23, no. 2, pp. 356-359, 1956.
  • First application in signal processing: 1993
    • N. J. Gordon, D. J. Salmond, and A. F. M. Smith,
      Novel approach to nonlinear/non-Gaussian
      Bayesian state estimation, IEE Proceedings-F,
      vol. 140, no. 2, pp. 107-113, 1993.
  • Books
    • A. Doucet, N. de Freitas, and N. Gordon, Eds.,
      Sequential Monte Carlo Methods in Practice,
      Springer, 2001.
    • B. Ristic, S. Arulampalam, N. Gordon, Beyond the
      Kalman Filter: Particle Filters for Tracking
      Applications, Artech House Publishers, 2004.
  • Tutorials
    • M. S. Arulampalam, S. Maskell, N. Gordon, and T.
      Clapp, A tutorial on particle filters for online
      nonlinear/non-Gaussian Bayesian tracking, IEEE
      Transactions on Signal Processing, vol. 50, no.
      2, pp. 174-188, 2002.

5
Problem Statement
  • Tracking the state of a system as it evolves over
    time
  • Sequentially arriving (noisy or ambiguous)
    observations
  • We want to know: the best possible estimate of
    the hidden variables

6
Solution: Sequential Update
  • Storing and processing all incoming measurements
    is inconvenient and may be impossible.
  • Recursive filtering:
    • Predict the next state pdf from the current
      estimate.
    • Update the prediction using sequentially
      arriving new measurements.
  • Optimal Bayesian solution: recursively
    calculate the exact posterior density.

7
Particle filtering ideas
  • A particle filter is a technique for implementing a
    recursive Bayesian filter by Monte Carlo sampling.
  • The idea: represent the posterior density by a set
    of random particles with associated weights,
  • and compute estimates based on these samples and
    weights (see the formula below).

(Figure: weighted particles approximating the posterior density over the sample space)
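In the notation used later in the deck, the pictured idea can be written as
(a standard reconstruction; the original slide shows only a figure):

    p(x_k \mid z_{1:k}) \approx \sum_{i=1}^{N} w_k^i \, \delta(x_k - x_k^i),
    \qquad \sum_{i=1}^{N} w_k^i = 1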
8
Global Localization of Robot with Sonar
http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/animations/global-floor.gif
9
(No Transcript)
10
Tools needed
Recall the law of total probability (or
marginalization) and Bayes' rule:
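Written out (the original slide renders these as an image):

    p(x) = \int p(x \mid y)\, p(y)\, dy                 (law of total probability)

    p(x \mid y) = \frac{p(y \mid x)\, p(x)}{p(y)}       (Bayes' rule)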
11
Recursive Bayesian estimation (I)
  • Recursive filter, built from three ingredients
    (see below):
  • System model
  • Measurement model
  • Information available

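A standard reconstruction of the three missing formulas, in the notation of
the Arulampalam et al. (2002) tutorial cited on slide 4:

    x_k = f_k(x_{k-1}, v_{k-1})          (system model; v_{k-1} is process noise)
    z_k = h_k(x_k, n_k)                  (measurement model; n_k is measurement noise)
    z_{1:k} = \{ z_1, \ldots, z_k \}     (information available at time k)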
12
Recursive Bayesian estimation (II)
  • Seek p(x_{k+i} | z_{1:k}):
    • i = 0: filtering.
    • i > 0: prediction.
    • i < 0: smoothing.
  • Prediction step (see below); the integrand
    simplifies since the state evolution is a Markov
    process.

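The prediction equation, in its standard form:

    p(x_k \mid z_{1:k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid z_{1:k-1})\, dx_{k-1}

since the Markov property gives p(x_k \mid x_{k-1}, z_{1:k-1}) = p(x_k \mid x_{k-1}).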
13
Recursive Bayesian estimation (III)
  • Update step (see below),
  • where the denominator is the normalising constant
    (the predictive density of z_k),
  • which simplifies since the measurement z_k depends
    only on the current state x_k.

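The update equation, in its standard form:

    p(x_k \mid z_{1:k}) = \frac{p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})}{p(z_k \mid z_{1:k-1})}

where

    p(z_k \mid z_{1:k-1}) = \int p(z_k \mid x_k)\, p(x_k \mid z_{1:k-1})\, dx_k

since p(z_k \mid x_k, z_{1:k-1}) = p(z_k \mid x_k).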
14
Bayes Filters (second pass)
Estimating system state from noisy observations
15
(No Transcript)
16
Assumptions: Markov Process
Predict
Update
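The two Markov assumptions, written out in standard form (the originals were
images):

    p(x_k \mid x_{0:k-1}, z_{1:k-1}) = p(x_k \mid x_{k-1})      (state evolution)
    p(z_k \mid x_{0:k}, z_{1:k-1}) = p(z_k \mid x_k)            (measurements)

The predict and update steps are then exactly the two equations on slides 12
and 13.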
17
(No Transcript)
18
Example 1
19
Example 1 (continued)
20
Classical approximations
  • Analytical methods:
    • Extended Kalman filter
    • Gaussian sums (Alspach et al. 1971)
    • These perform poorly in numerous cases of
      interest.
  • Numerical methods:
    • Point-mass approximations
    • Splines (Bucy 1971; de Figueiro 1974)
    • These are very complex to implement and not
      flexible.

21
Perfect Monte Carlo simulation
  • Recall that random samples are
    drawn from the posterior distribution,
  • and represent the posterior distribution by a set
    of samples, or particles.
  • Expectations of the form E[f(x)] are then easy to
    approximate by sample averages, as shown below.

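In standard form:

    E[f(x)] = \int f(x)\, p(x \mid z)\, dx \;\approx\; \frac{1}{N} \sum_{i=1}^{N} f(x^i),
    \qquad x^i \sim p(x \mid z)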
22
Random samples and the pdf (I)
  • Take p(x) = Gamma(4,1)
  • Generate some random samples
  • Plot a histogram and a basic approximation to the
    pdf

200 samples
23
Random samples and the pdf (II)
500 samples
1000 samples
24
Random samples and the pdf (III)
200000 samples
5000 samples
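The experiment on slides 22-24 is easy to reproduce. The following is a
minimal sketch in Python; NumPy, SciPy, and Matplotlib are my choice of
tools, not part of the original deck. As the sample count grows, the
histogram approaches the true pdf, which is the point of these slides.

    import numpy as np
    from scipy import stats
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(0)
    xs = np.linspace(0, 15, 300)

    for n in (200, 500, 1000, 5000):
        # Draw n samples from Gamma(shape=4, scale=1) and histogram them.
        samples = rng.gamma(shape=4.0, scale=1.0, size=n)
        plt.hist(samples, bins=30, density=True, alpha=0.4,
                 label=str(n) + " samples")

    # Overlay the true Gamma(4,1) density for comparison.
    plt.plot(xs, stats.gamma.pdf(xs, a=4.0, scale=1.0), "k",
             label="Gamma(4,1) pdf")
    plt.legend()
    plt.show()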
25
Importance Sampling
  • Unfortunately it is often not possible to sample
    directly from the posterior distribution, but we
    can use importance sampling.
  • Let p(x) be a pdf from which it is difficult to
    draw samples.
  • Let x^i ~ q(x), i = 1, ..., N, be samples that are
    easily generated from a proposal pdf q, which is
    called an importance density.
  • Then an approximation to the density p is given by
    the weighted sum below.
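In standard form:

    p(x) \approx \sum_{i=1}^{N} w^i\, \delta(x - x^i)

where the weights are the normalized importance ratios

    w^i \propto \frac{p(x^i)}{q(x^i)}, \qquad \sum_{i=1}^{N} w^i = 1.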
26
Bayesian Importance Sampling
  • By drawing samples from a known, easy-to-sample
    proposal distribution q(x_{0:k} | z_{1:k}),
    we obtain the posterior approximation below,
  • where the weights are the normalized importance
    ratios.
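In standard form:

    p(x_{0:k} \mid z_{1:k}) \approx \sum_{i=1}^{N} w_k^i\, \delta(x_{0:k} - x_{0:k}^i),
    \qquad w_k^i \propto \frac{p(x_{0:k}^i \mid z_{1:k})}{q(x_{0:k}^i \mid z_{1:k})}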
27
(No Transcript)
28
Sequential Importance Sampling (I)
  • Factorizing the proposal distribution,
  • and remembering that the state evolution is
    modeled as a Markov process,
  • we obtain a recursive estimate of the importance
    weights (see below).
  • The factorization is obtained by recursively
    applying the one-step decomposition of q.

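The factorization and the resulting weight recursion, in the standard form of
the Arulampalam et al. (2002) tutorial cited on slide 4:

    q(x_{0:k} \mid z_{1:k}) = q(x_k \mid x_{k-1}, z_k)\, q(x_{0:k-1} \mid z_{1:k-1})

    w_k^i \propto w_{k-1}^i \, \frac{p(z_k \mid x_k^i)\, p(x_k^i \mid x_{k-1}^i)}{q(x_k^i \mid x_{k-1}^i, z_k)}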
29
Sequential Importance Sampling (SIS) Particle
Filter
SIS Particle Filter Algorithm
for i = 1:N
    draw a particle from the proposal
    assign it a weight
end
(k is the index over time and i is the particle index)
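A runnable sketch of this loop for a hypothetical 1-D model (random-walk
state, Gaussian likelihood, proposal equal to the prior, as slide 34 later
recommends); the model and all names are illustrative, not from the deck.
Note there is no resampling here, so running many steps exhibits exactly the
weight degeneracy discussed on slides 35-37.

    import numpy as np

    rng = np.random.default_rng(1)
    N = 500
    particles = rng.normal(0.0, 1.0, size=N)   # x_0^i drawn from a Gaussian prior
    weights = np.full(N, 1.0 / N)

    def sis_step(particles, weights, z):
        # Draw x_k^i from the proposal q = p(x_k | x_{k-1}) (prior proposal).
        particles = particles + rng.normal(0.0, 0.5, size=N)
        # With the prior as proposal, the recursion reduces to
        # w_k^i proportional to w_{k-1}^i * p(z_k | x_k^i).
        weights = weights * np.exp(-0.5 * ((z - particles) / 0.7) ** 2)
        return particles, weights / weights.sum()

    for z in [0.2, 0.5, 0.4, 0.9]:              # made-up measurements
        particles, weights = sis_step(particles, weights, z)
    print("posterior mean estimate:", np.sum(weights * particles))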
30
Derivation of SIS weights (I)
  • The main idea is factorizing both the target
    density p and the proposal q.
  • Our goal is to expand p and q in time.
31
Derivation of SIS weights (II)
32
Derivation of SIS weights (III)
and, under Markov assumptions, the expressions reduce
to the weight recursion shown on slide 28.
33
SIS Particle Filter Foundation
  • At each time step k:
  • Random samples x_k^i are drawn from the
    proposal distribution for i = 1, ..., N.
  • They represent the posterior distribution by a
    set of weighted samples, or particles,
  • since the weights are given by the importance
    ratios,
  • and q factorizes recursively as shown on the
    previous slides.

34
Sequential Importance Sampling (II)
  • Choice of the proposal distribution:
  • Choose the proposal to minimize the variance of
    the importance weights (Doucet et al. 1999).
  • A common choice, however, is the prior
    distribution.
  • We then obtain the simplified weight update below.

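With the prior as proposal, the transition density cancels in the weight
recursion:

    q(x_k \mid x_{k-1}^i, z_k) = p(x_k \mid x_{k-1}^i)
    \;\Rightarrow\; w_k^i \propto w_{k-1}^i\, p(z_k \mid x_k^i)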
35
Sequential Importance Sampling (III)
  • Illustration of SIS (figure omitted).
  • Degeneracy problem: the variance of the
    importance ratios increases stochastically over
    time (Kong et al. 1994; Doucet et al. 1999).
  • In most cases, after a few iterations all
    but one particle will have negligible weight.

36
Sequential Importance Sampling (IV)
  • Illustration of degeneracy

37
SIS - why the variance increase hurts
  • Suppose we want to sample from the posterior, and
  • choose a proposal density very close to the
    posterior density.
  • Then the importance ratios are approximately
    constant, and
  • the variance of the weights is close to 0, which
    is what we need for reasonable estimates (see
    below);
  • thus a variance increase has a harmful effect on
    accuracy.

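In symbols (a standard form of the argument):

    q(x_{0:k} \mid z_{1:k}) \approx p(x_{0:k} \mid z_{1:k})
    \;\Rightarrow\; w \propto \frac{p(x_{0:k} \mid z_{1:k})}{q(x_{0:k} \mid z_{1:k})} \approx \text{const}
    \;\Rightarrow\; \operatorname{Var}[w] \approx 0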
38
(No Transcript)
39
Sampling-Importance Resampling
  • SIS suffers from degeneracy problems, so we don't
    want to use it as-is!
  • Introduce a selection (resampling) step to
    eliminate samples with low importance ratios and
    multiply samples with high importance ratios.
  • Resampling maps the weighted random measure
    {x_k^i, w_k^i} onto the equally weighted random
    measure {x_k^j, 1/N}
  • by sampling uniformly with replacement from
    {x_k^i} with probabilities {w_k^i}.
  • The scheme generates N_i children for particle i
    such that the N_i sum to N and satisfy
    E[N_i] = N w_k^i (see the sketch below).

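A minimal Python sketch of this multinomial ("sampling uniformly with
replacement") scheme; the function and argument names are mine. The
systematic and residual variants on slide 45 differ only in how the indices
are drawn.

    import numpy as np

    def resample(particles, weights, rng):
        """Map the weighted measure {x^i, w^i} onto the equally
        weighted measure {x^j, 1/N} by sampling with replacement."""
        N = len(particles)
        # Index i is drawn with probability w^i, so the number of
        # children N_i of particle i satisfies E[N_i] = N * w^i.
        idx = rng.choice(N, size=N, replace=True, p=weights)
        return particles[idx], np.full(N, 1.0 / N)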
40
Basic SIR Particle Filter - Schematic
(Schematic: initialisation, then for each incoming measurement an
importance sampling step, a resampling step, and extraction of the estimate)
41
Basic SIR Particle Filter algorithm (I)
  • Initialisation: k = 0
    • For i = 1, ..., N: sample an initial particle
      from the prior,
    • and set k = 1.
  • Importance Sampling step:
    • For i = 1, ..., N: sample a new particle from
      the proposal.
    • For i = 1, ..., N: compute the importance
      weights w_k^i.
    • Normalise the importance weights.
42
Basic SIR Particle Filter algorithm (II)
  • Resampling step:
    • Resample with replacement N particles from the
      current weighted set according to the
      normalised importance weights.
    • Set k = k + 1 and proceed to the Importance
      Sampling step as the next measurement arrives.

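Putting steps (I) and (II) together, a runnable sketch for a hypothetical
1-D random-walk model with Gaussian noise; the model and all names are my
own, chosen to mirror the algorithm above, not taken from the deck.

    import numpy as np

    rng = np.random.default_rng(2)
    N = 1000
    particles = rng.normal(0.0, 1.0, size=N)   # Initialisation: sample from the prior

    def sir_step(particles, z):
        # Importance Sampling step: propagate through the (prior) proposal ...
        particles = particles + rng.normal(0.0, 0.5, size=N)
        # ... compute the importance weights p(z_k | x_k^i) and normalise.
        w = np.exp(-0.5 * ((z - particles) / 0.7) ** 2)
        w /= w.sum()
        estimate = np.sum(w * particles)        # extract estimate (posterior mean)
        # Resampling step: draw N particles with replacement; weights reset to 1/N.
        particles = particles[rng.choice(N, size=N, replace=True, p=w)]
        return particles, estimate

    for z in [0.1, 0.4, 0.8, 1.2]:              # made-up measurements
        particles, estimate = sir_step(particles, z)
        print("measurement", z, "-> estimated state", round(estimate, 3))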
43
Resampling
(Figure: resampling illustration)
44
Generic SIR Particle Filter algorithm
M. S. Arulampalam, S. Maskell, N. Gordon, and T.
Clapp, A tutorial on particle filters, IEEE
Trans. on Signal Processing, 50(2), 2002.
45
Improvements to SIR (I)
  • A variety of resampling schemes exists, with
    varying performance in terms of the variance of
    the particles:
    • Residual sampling (Liu & Chen, 1998).
    • Systematic sampling (Carpenter et al., 1999).
    • Mixture of SIS and SIR: only resample when
      necessary (Liu & Chen, 1995; Doucet et al.,
      1999).
  • Degeneracy may still be a problem:
    • During resampling, a sample with a high
      importance weight may be duplicated many times.
    • Samples may eventually collapse to a single
      point.

46
Improvements to SIR (II)
  • To alleviate numerical degeneracy problems,
    sample smoothing methods may be adopted:
    • Roughening (Gordon et al., 1993):
      adds an independent jitter to the resampled
      particles.
    • Prior boosting (Gordon et al., 1993):
      increase the number of samples drawn from the
      proposal distribution to M > N,
      but in the resampling stage only draw N
      particles.

47
Improvements to SIR (III)
  • Local Monte Carlo methods for alleviating
    degeneracy:
    • Local linearisation: using an EKF (Doucet, 1999;
      Pitt & Shephard, 1999) or UKF (Doucet et al.,
      2000) to estimate the importance distribution.
    • Rejection methods (Müller, 1991; Doucet, 1999;
      Pitt & Shephard, 1999).
    • Auxiliary particle filters (Pitt & Shephard,
      1999).
    • Kernel smoothing (Gordon, 1994; Hürzeler &
      Künsch, 1998; Liu & West, 2000; Musso et al.,
      2000).
    • MCMC methods (Müller, 1992; Gordon & Whitby,
      1995; Berzuini et al., 1997; Gilks & Berzuini,
      1998; Andrieu et al., 1999).

48
Improvements to SIR (IV)
  • Illustration of SIR with sample smoothing

49
Ingredients for SMC
  • Importance sampling function:
    • Gordon et al. → the prior p(x_k | x_{k-1})
    • Optimal → p(x_k | x_{k-1}, z_k)
    • UKF → pdf from UKF
  • Redistribution scheme:
    • Gordon et al. → SIR
    • Liu & Chen → Residual
    • Carpenter et al. → Systematic
    • Liu & Chen, Doucet et al. → Resample when
      necessary
  • Careful initialisation procedure (for efficiency)

50
Particle filters
  • Also known as sequential Monte Carlo methods.
  • Represent belief by sets of samples, or particles,
    whose nonnegative weights are called importance
    factors.
  • The updating procedure is sequential importance
    sampling with re-sampling.

51
Example 2: Particle Filter
52
Example 2: Particle Filter (continued)
Particles are more concentrated in the region
where the person is more likely to be
53
Compare Particle Filter with Bayes Filter with
Known Distribution
(Figures: the updating and predicting steps, shown for Example 1 and Example 2)
54
Particle Filters
55
Sensor Information: Importance Sampling
56
Robot Motion

57
Sensor Information: Importance Sampling
58
Robot Motion
59
(No Transcript)
60
(No Transcript)
61
(No Transcript)
62
(No Transcript)
63
(No Transcript)
64
Application Examples
  • Robot localization
  • Robot mapping
  • Visual tracking, e.g., of human motion (body
    parts)
  • Prediction of (financial) time series, e.g.,
    mapping gold price to stock price
  • Target recognition from single or multiple images
  • Guidance of missiles
  • Contour grouping:
    http://www.cis.temple.edu/latecki/Papers/PFgrouping_ICCV09_final.pdf
  • Matlab PF software for tracking:
    http://www.gatsby.ucl.ac.uk/fwood/pf_tutorial/
  • Nice video demos:
    http://www.cs.washington.edu/ai/Mobile_Robotics/mcl/