1
An Optimal Filtering Algorithm for Non-Parametric
Observation Models in Robot Localization
  • Jose-Luis Blanco, Javier González, Juan-Antonio
    Fernández-Madrigal

ICRA 2008, May 19-23, Pasadena, CA (USA)
2
Outline of the talk
1. Introduction
2. The proposed method
3. Experimental results
4. Conclusions
3
Outline of the talk
1. Introduction
2. The proposed method
3. Experimental results
4. Conclusions
4
1. Introduction
The addressed problem: Bayesian filtering
p(x): prior belief
Two choices determine the tools suitable to solve
this problem:
  • The representation of the prior/posterior
    densities: Gaussian vs. samples.
  • Assumptions about the form of the likelihood.
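For reference, the recursive Bayes filter underlying this problem is the standard prediction/update pair (a textbook formulation, not specific to this talk):

    p(x_t \mid y_{1:t-1}) = \int p(x_t \mid x_{t-1}) \, p(x_{t-1} \mid y_{1:t-1}) \, dx_{t-1}
    p(x_t \mid y_{1:t}) \propto p(y_t \mid x_t) \, p(x_t \mid y_{1:t-1})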

5
1. Introduction
In this work:
Representation of pdfs? Samples (a particle filter).
Observation likelihood? Non-parametric: no Gaussian assumption is made.
6
1. Introduction
The role of the proposal distribution in particle
filters
The basic particle filtering algorithm
p(x): prior belief
What happens to each particle?
7
1. Introduction
The role of the proposal distribution in particle
filters
The basic particle filtering algorithm
What happens to each particle?
8
1. Introduction
The role of the proposal distribution in particle
filters
The basic particle filtering algorithm
  • Weights are updated, depending on:
    • the observation likelihood, and
    • the proposal distribution.

What happens to each particle?
9
1. Introduction
The role of the proposal distribution in particle
filters
The basic particle filtering algorithm
  • Weights are updated, depending on:
    • the observation likelihood, and
    • the proposal distribution.

p(y|x): observation likelihood
What happens to each particle?
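For reference, the generic importance-weight update that combines these two ingredients is the standard sequential importance sampling rule (textbook form):

    w_t^{(i)} \propto w_{t-1}^{(i)} \, \frac{p(y_t \mid x_t^{(i)}) \, p(x_t^{(i)} \mid x_{t-1}^{(i)})}{q(x_t^{(i)} \mid x_{t-1}^{(i)}, y_t)}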
10
1. Introduction
The role of the proposal distribution in particle
filters
The basic particle filtering algorithm
p(x|y): posterior belief
The goal: to approximate the posterior as well as possible
How much does the choice of the proposal
distribution matter?
11
1. Introduction
The role of the proposal distribution in particle
filters
The basic particle filtering algorithm
q(·): proposal distribution
p(x|y): posterior belief
How much does the choice of the proposal
distribution matter?
12
1. Introduction
The role of the proposal distribution in particle
filters
The basic particle filtering algorithm
For a large mismatch between proposal and
posterior, the particles represent the density
very poorly
q(·): proposal distribution
p(x|y): posterior belief
How much does the choice of the proposal
distribution matter?
13
1. Introduction
The role of the proposal distribution in particle
filters
The proposal distribution q(·) is the key to the
efficiency of a particle filter!
[Doucet et al. 2000] introduced the optimal
proposal.
14
1. Introduction
Relation of our method to other Bayesian
filtering approaches
Non-linear | Observation model | Optimal proposal | Algorithms
no  | Gaussian  | -   | Kalman Filter
yes | Gaussian  | -   | EKF, UKF
yes | Arbitrary | no  | SIR, APF, FastSLAM
yes | Gaussian  | yes | FastSLAM 2.0, [Grisetti et al. 2007]
yes | Arbitrary | yes | This work
15
Outline of the talk
1. Introduction
2. The proposed method
3. Experimental results
4. Conclusions
16
2. The proposed method
Our method
  • A particle filter based on the optimal proposal
    [Doucet et al. 2000].
  • Can deal with non-parameterized observation
    models, using rejection sampling to approximate
    the actual densities (see the sketch after this list).
  • Integrates KLD-sampling [Fox 2003] for a dynamic
    sample size (optional: it is not fundamental to
    the approach).
  • The weights of all the samples are always equal.
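As an illustration of the rejection-sampling idea mentioned above, here is a minimal, hypothetical Python sketch of drawing one sample from the optimal proposal p(x_t | x_{t-1}, y_t) ∝ p(y_t | x_t) p(x_t | x_{t-1}) when the likelihood can only be evaluated point-wise. The function names (transition_sample, likelihood) and the scalar state are illustrative assumptions, not the authors' MRPT implementation:

    import random

    def sample_optimal_proposal(x_prev, y, transition_sample, likelihood, like_max,
                                max_tries=1000):
        # Rejection sampling: propose from the motion model p(x_t | x_prev) and
        # accept with probability p(y | x) / like_max, where like_max is an upper
        # bound on the likelihood. Accepted samples follow the optimal proposal.
        x = transition_sample(x_prev)            # candidate from the motion model
        for _ in range(max_tries):
            if random.uniform(0.0, like_max) <= likelihood(y, x):
                return x                         # accepted w.p. p(y|x) / like_max
            x = transition_sample(x_prev)        # rejected: draw a new candidate
        return x                                 # fallback if acceptance is very unlikely

Because accepted samples already follow the optimal proposal, no per-sample importance correction is needed, which is consistent with the equal-weights property listed above.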

17
2. The proposed method
The theoretical model for each step of our method
is this sequence of operations:
Duplication → SIR with optimal proposal →
Fixed/Dyn. sample-size resampling
18
2. The proposed method
The theoretical model for each step of our method
is this sequence of operations:
Duplication → SIR with optimal proposal →
Fixed/Dyn. sample-size resampling
19
2. The proposed method
The theoretical model for each step of our method
is this sequence of operations:
Duplication → SIR with optimal proposal →
Fixed/Dyn. sample-size resampling
20
2. The proposed method
The theoretical model for each step of our method
is this sequence of operations:
Duplication → SIR with optimal proposal →
Fixed/Dyn. sample-size resampling
21
2. The proposed method
Illustrative example of how our method works
[Figure: four particles (1-4) at time t, to be propagated to time t+1]
22
2. The proposed method
Illustrative example of how our method works
[Figure: particle 1 propagated from t to t+1, producing Group 1 of candidate samples]
Each particle propagates in time probabilistically;
this is the reason for the duplication.
23
2. The proposed method
Illustrative example of how our method works
[Figure: particles 1 and 2 propagated from t to t+1, producing Groups 1 and 2]
Each particle propagates in time probabilistically;
this is the reason for the duplication.
24
2. The proposed method
Illustrative example of how our method works
[Figure: particles 1-3 propagated from t to t+1, producing Groups 1-3]
Each particle propagates in time probabilistically;
this is the reason for the duplication.
25
2. The proposed method
Illustrative example of how our method works
[Figure: all four particles propagated from t to t+1, producing Groups 1-4]
Each particle propagates in time probabilistically;
this is the reason for the duplication.
26
2. The proposed method
Illustrative example of how our method works
[Figure: Groups 1-4 at t+1, overlaid with the observation likelihood]
Too distant particles do not contribute to the
posterior!
The observation likelihood states which particles
are really important.
27
2. The proposed method
Illustrative example of how our method works
[Figure: Groups 1-4 at t+1 and the observation likelihood]
We can predict which groups will be more
important before actually generating the new
samples!
28
2. The proposed method
The optimal proposal distribution [Doucet et al. 2000]:
q(x_t | x_{t-1}^{(i)}, y_t) = p(x_t | x_{t-1}^{(i)}, y_t)
With this choice, the importance weights update as:
w_t^{(i)} ∝ w_{t-1}^{(i)} p(y_t | x_{t-1}^{(i)})
29
2. The proposed method
Illustrative example of how our method works
[Figure: Groups 1-4 and the observation likelihood, with the predicted group weights:
Group 1 → 55%, Group 2 → 0%, Group 3 → 45%, Group 4 → 0%]
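The predicted weight of each group corresponds to p(y_t | x_{t-1}^{(i)}). One standard way to approximate it (whether this matches the paper's exact estimator is not stated in the slides) is a Monte Carlo average over auxiliary samples drawn from the transition model:

    p(y_t \mid x_{t-1}^{(i)}) = \int p(y_t \mid x_t) \, p(x_t \mid x_{t-1}^{(i)}) \, dx_t
      \approx \frac{1}{M} \sum_{j=1}^{M} p(y_t \mid x_t^{[j]}), \qquad x_t^{[j]} \sim p(x_t \mid x_{t-1}^{(i)})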
30
2. The proposed method
Illustrative example of how our method works
[Figure: Groups 1-4 and the observation likelihood]
Given the predictions, we draw particles
according to the optimal proposal, only for those
groups that really contribute to the posterior.
A fixed or dynamic number of samples can be
generated in this way.
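Putting the pieces together, a minimal, hypothetical Python sketch of one such step (fixed sample size, scalar state; transition_sample, likelihood and like_max are illustrative placeholders, not the authors' API):

    import random

    def filter_step(particles, y, transition_sample, likelihood, like_max, n_out):
        # 1) Predict each group's contribution ~ p(y | x_{t-1}) using a few
        #    auxiliary samples from the motion model.
        M = 20
        w = []
        for x_prev in particles:
            aux = [transition_sample(x_prev) for _ in range(M)]
            w.append(sum(likelihood(y, x) for x in aux) / M)
        total = sum(w)
        w = [wi / total for wi in w] if total > 0 else [1.0 / len(w)] * len(w)

        # 2) Draw n_out new particles: choose a group in proportion to its
        #    predicted weight, then rejection-sample its successor from the
        #    optimal proposal p(x_t | x_prev, y) ∝ p(y | x_t) p(x_t | x_prev).
        out = []
        for _ in range(n_out):
            x_prev = random.choices(particles, weights=w, k=1)[0]
            x = transition_sample(x_prev)
            for _ in range(1000):
                if random.uniform(0.0, like_max) <= likelihood(y, x):
                    break
                x = transition_sample(x_prev)
            out.append(x)
        return out  # all output weights are equal, as stated above

Groups whose predicted weight is near zero are effectively never selected, so no samples are wasted on them; a KLD-sampling criterion could replace the fixed n_out to obtain a dynamic sample size.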
31
2. The proposed method
Comparison to basic Sequential Importance
Resampling (SIR)
32
2. The proposed method
Comparison to basic Sequential Importance
Resampling (SIR)
[Figure: standard SIR propagation of particles 1-4 and the observation likelihood]
1 particle → 1 particle: prone to
particle depletion!
33
2. The proposed method
Comparison to Auxiliary Particle Filter (APF)
[Pitt & Shephard, 1999]
34
2. The proposed method
Comparison to Auxiliary Particle Filter (APF)
[Pitt & Shephard, 1999]
[Figure: APF propagation of particles 1-4 and the observation likelihood]
1 particle → a variable number of particles.
Propagation does not use the optimal proposal!
35
Outline of the talk
1. Introduction
2. The proposed method
3. Experimental results
3.1. Numerical simulation
3.2. Robot localization
4. Conclusions
36
3.1. Results
Numerical simulations: a Gaussian model for both
the filtered density and the observation model.
We compare the closed-form optimal solution
(Kalman filter) to:
  • PF using the standard proposal distribution.
  • Auxiliary PF method [Pitt & Shephard, 1999].
  • This work (optimal PF).
(Fixed sample size for these experiments)
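For intuition, a minimal sketch of this kind of linear-Gaussian test case with a scalar Kalman filter as the closed-form reference. The model and noise values below are arbitrary assumptions for illustration; the parameters actually used in the experiments are not given in the slides:

    import math, random

    Q, R = 0.1, 0.5              # process / observation noise variances (assumed)
    x_true, mu, P = 0.0, 0.0, 1.0

    for t in range(50):
        # simulate the true state and a noisy observation
        x_true += random.gauss(0.0, math.sqrt(Q))
        y = x_true + random.gauss(0.0, math.sqrt(R))
        # scalar Kalman filter: the optimal closed-form estimate for this model
        P_pred = P + Q                   # predict variance
        K = P_pred / (P_pred + R)        # Kalman gain
        mu = mu + K * (y - mu)           # update mean
        P = (1.0 - K) * P_pred           # update variance

    print("KF estimate:", mu, "true state:", x_true)

Any of the particle filters listed above can then be run on the same simulated sequence and its estimation error compared against this reference.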
37
3.1. Results
Results from the numerical simulations, and
comparison to 1D Kalman filter
38
3.1. Results
Results from the numerical simulations, and
comparison to 1D Kalman filter
39
Outline of the talk
1. Introduction
2. The proposed method
3. Experimental results
3.1. Numerical simulation
3.2. Robot localization
4. Conclusions
40
3.2. Results
Localization with real data: path of the
robot; ground truth from a RBPF with a large
number of particles.
41
3.2. Results
Localization with real data
Average errors in tracking (the particles are
approximately at the right position from the
beginning).
42
3.2. Results
Localization with real data:
Ratio of convergence success from global localization.
[Plot: convergence success ratio vs. initial sample size (particles/m²),
comparing our method against SIR with the standard proposal]
43
3.2. Results
44
Outline of the talk
1. Introduction
2. The proposed method
3. Experimental results
4. Conclusions
45
Conclusions
  • A new particle filter algorithm has been
    introduced.
  • It can cope with non-parameterized observation
    likelihoods and a dynamic number of particles.
  • Compared to standard SIR, it provides more
    robust global localization and pose tracking
    for similar computation times.
  • It is a generic algorithm that can be applied to
    other problems in robotics, computer vision, etc.

46
Finally
  • Source code (MRPT C++ libs), datasets, slides and
    instructions to reproduce the experiments are
    available online:

http://mrpt.sourceforge.net/
(papers → ICRA 08)
47
An Optimal Filtering Algorithm for Non-Parametric
Observation Models in Robot Localization
  • Jose-Luis Blanco, Javier González, Juan-Antonio
    Fernández-Madrigal

Thanks for your attention!