1
Evaluation of EKF and FastSlam Algorithms for
Bearing-Only Visual SLAM
  • Proposal of Term Project
  • CMPUT 631
  • Winter 2008
  • By Kiana Hajebi and Parisa Mosayebi

2
Main Steps of this term project
  • Mapping
  • Constructing the environment map from an image
    sequence and almost-perfect odometry data.
  • Visual SLAM
  • Localization and mapping are done simultaneously.
  • Modifying Tim Bailey's simulator for a
    bearing-only visual system.
  • Research
  • A comparative study of EKF-SLAM and FastSLAM in
    the visual context.

3
Outline
  • Bearing-Only Visual SLAM
  • EKF-SLAM
  • Feature Extraction and Matching
  • Data Association
  • Feature Initialization
  • Mapping
  • Visual Simulation
  • Performance Measurement
  • New Research

4
Bearing Only Visual SLAM system
  • Main Block Diagram

6
EKF-SLAM
  • SLAM systems are specific instances of the
    general Bayes filter
  • Two main steps of Bayes Filters
  • Prediction
  • Correction

7
State Definition
  • The system tracks the robot pose along with the
    landmark positions.
  • The state vector x contains
  • the robot position and heading
  • the position of the i-th map landmark
  • Covariance matrix
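The symbols on this slide were images in the original; in the usual EKF-SLAM notation (a reconstruction on our part, with 3-D point landmarks as needed later for image projection) the state and its covariance are:

```latex
\mathbf{x} =
\begin{bmatrix} \mathbf{x}_v^{T} & \mathbf{m}_1^{T} & \cdots & \mathbf{m}_n^{T} \end{bmatrix}^{T},
\quad
\mathbf{x}_v = (x, y, \theta)^{T},
\quad
\mathbf{m}_i = (x_i, y_i, z_i)^{T},
\quad
\mathbf{P} =
\begin{bmatrix} P_{vv} & P_{vm} \\ P_{vm}^{T} & P_{mm} \end{bmatrix}
```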

8
1.Prediction Step
  • In each time step k, a new predicted state is
    generated from the previous state and a motion
    model f,
  • where the control input u is assumed to be a
    Gaussian random variable with covariance matrix Q.
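The prediction equations themselves appeared as images in the slides; in standard EKF notation (the symbols here are a reconstruction) they read:

```latex
\hat{\mathbf{x}}_k = f(\mathbf{x}_{k-1}, \mathbf{u}_k),
\qquad
\hat{\mathbf{P}}_k = F_x \mathbf{P}_{k-1} F_x^{T} + F_u Q F_u^{T}
```

where F_x and F_u are the Jacobians of f with respect to the state and the control, respectively.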

9
Motion Model
  • The motion model used in this project is the
    odometry model for a differential-drive robot.
  • The odometry data given in this project is the
    robot pose at each time step.

10
Motion Model
  • The control data u is obtained from this odometry
    data.

11
Motion Model, cont.
  • The EKF state update function is defined by
    applying the motion model to the robot pose.
  • Landmarks are assumed to be stationary, so their
    predicted positions equal their previous ones.
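The update equation was an image in the original slides; a standard odometry-based state update consistent with the description (robot pose updated by the motion model, landmarks unchanged) is shown below. The control parameterization (δθ₁, d, δθ₂), recovered from consecutive odometry poses, is an assumption borrowed from the standard odometry motion model:

```latex
\begin{bmatrix} x_k \\ y_k \\ \theta_k \end{bmatrix}
=
\begin{bmatrix}
x_{k-1} + d\cos(\theta_{k-1} + \delta\theta_1) \\
y_{k-1} + d\sin(\theta_{k-1} + \delta\theta_1) \\
\theta_{k-1} + \delta\theta_1 + \delta\theta_2
\end{bmatrix},
\qquad
\mathbf{m}_{i,k} = \mathbf{m}_{i,k-1}
```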

12
Motion Model, cont.
  • Defining the covariance Q of the control-vector
    noise.
  • The noise parameter values are given as input
    data.

13
Correction Step
  • The EKF uses the difference between a measurement
    z and a predicted measurement based on the
    current state to correct the state estimate.
  • Feature extraction and matching on the current
    camera image generates the observation.
  • The observation model h is generated by
    projecting 3-D landmark position to a 2-D image
    point, given the current robot position.
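The projection itself was not shown in the transcript; a standard pinhole model consistent with this description is sketched below. The intrinsics f_x, f_y, c_x, c_y come from the given camera calibration; the remaining symbols are our reconstruction:

```latex
\begin{bmatrix} u \\ v \end{bmatrix}
= h(\mathbf{x}) =
\begin{bmatrix} f_x\, x_c / z_c + c_x \\ f_y\, y_c / z_c + c_y \end{bmatrix},
\qquad
(x_c, y_c, z_c)^{T} = R_{CW}\,(\mathbf{m}_i - \mathbf{t}_{C})
```

where R_CW and t_C describe the camera pose implied by the current robot position.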

14
Correction Step, cont.
  • The innovation is the difference between the
    actual and predicted measurements; its
    corresponding covariance S is defined using the
    Jacobians of h with respect to the robot pose and
    the observed map landmark.
  • R is the observation noise covariance matrix
    mapped into the image-space.
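In standard EKF notation (a reconstruction of the equations that were images on the slide), the innovation and its covariance are:

```latex
\nu_k = \mathbf{z}_k - h(\hat{\mathbf{x}}_k),
\qquad
S_k = H_k \hat{\mathbf{P}}_k H_k^{T} + R
```

where H_k stacks the Jacobians of h with respect to the robot pose and the observed landmark.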

15
Correction Step, cont.
  • Finally, the Kalman gain W is calculated and a
    corrected estimate of new state and its
    covariance is defined.
  • This new estimate is now used as the starting
    point for the next iteration.
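The gain and update equations were images in the original slides; the standard EKF forms, consistent with the innovation ν and covariance S defined above, are:

```latex
W_k = \hat{\mathbf{P}}_k H_k^{T} S_k^{-1},
\qquad
\mathbf{x}_k = \hat{\mathbf{x}}_k + W_k \nu_k,
\qquad
\mathbf{P}_k = \hat{\mathbf{P}}_k - W_k S_k W_k^{T}
```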

17
SLAM System
  • [Block diagram] The camera supplies images to
    Feature Extraction; extracted features go to
    Feature Matching and, as new features with their
    landmark descriptions, to Feature Initialization
    (together, the visual system). Robot odometry
    supplies the control u to EKF Predict, and
    matched features supply the observation z to EKF
    Correct.
18
Feature Extraction and Matching
  • In each time step, consider the current image and
    possibly the two previous ones.
  • Undistort the image.
  • Run SIFT/Harris on the current image to extract
    the features and their corresponding descriptors.
  • Match the features of this new image with the
    matched features of the previous two images.
  • Check manually whether these matches are correct.
  • Note: these five steps will be done offline.
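The matching step above can be sketched in plain NumPy as a nearest-neighbour search with Lowe's ratio test (a hypothetical stand-in for the actual SIFT matcher; the 0.8 threshold is Lowe's suggested value, not taken from the slides):

```python
import numpy as np

def match_features(desc_new, desc_prev, ratio=0.8):
    """Nearest-neighbour descriptor matching with Lowe's ratio test.

    desc_new: (N, D) descriptors from the current image.
    desc_prev: (M, D) descriptors from a previous image (M >= 2).
    Returns a list of (new_index, prev_index) pairs.
    """
    matches = []
    for i, d in enumerate(desc_new):
        # Euclidean distance to every previous descriptor
        dists = np.linalg.norm(desc_prev - d, axis=1)
        order = np.argsort(dists)
        best, second = order[0], order[1]
        # keep only unambiguous matches (best clearly beats runner-up)
        if dists[best] < ratio * dists[second]:
            matches.append((i, int(best)))
    return matches
```

In the project these correspondences would then be checked manually, as the slide notes.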

20
Data Association
  • Project all the 3-D landmarks into the 2-D camera
    coordinate frame using the observation model h.
  • If a new feature is very close to any existing
    projected landmark, delete it from the list of
    new features.
  • Compare the descriptors of the remaining new
    features with the map descriptors; if one is very
    close to an existing descriptor, delete the
    corresponding feature from the list.
  • Pass the remaining new features to the feature
    initialization block.
  • Use the 2-D positions of features corresponding
    to map landmarks as the observation z for the EKF
    correction step.
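The two gating steps above can be sketched as follows (a minimal sketch; the pixel and descriptor thresholds are hypothetical placeholders, not values from the project):

```python
import numpy as np

def associate(new_feats, new_descs, proj_landmarks, map_descs,
              px_gate=5.0, desc_gate=0.3):
    """Gate new features against the projected map.

    new_feats: (N, 2) image points; new_descs: their descriptors.
    proj_landmarks: (M, 2) map landmarks projected through h.
    map_descs: descriptors stored with the map landmarks.
    Returns the indices of features that survive both tests and
    should go on to feature initialization.
    """
    keep = []
    for i, (p, d) in enumerate(zip(new_feats, new_descs)):
        # 1. drop features too close to an existing projected landmark
        if np.any(np.linalg.norm(proj_landmarks - p, axis=1) < px_gate):
            continue
        # 2. drop features whose descriptor matches a map descriptor
        if np.any(np.linalg.norm(map_descs - d, axis=1) < desc_gate):
            continue
        keep.append(i)
    return keep
```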

22
Feature Initialization
  • Run the Unscented Transform on each new feature
    to find the new landmark position and covariance
    [1].
  • Landmark Validity
  • If the ratio of the magnitude of the depth to the
    uncertainty in the depth is less than 30, the new
    landmark is accepted.
  • SLAM Map Augmentation
  • Transform the landmark from robot frame R to the
    world frame W.
  • Add the new transformed landmark state and
    covariance to the SLAM state vector obtained from
    Correction step.
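A minimal sketch of the validity test and the frame transform (the test is exactly as stated on the slide; the planar 2-D transform is a simplification of the project's 3-D case, and the helper names are ours):

```python
import numpy as np

def accept_landmark(depth, depth_sigma, max_ratio=30.0):
    """Landmark validity test as stated on the slide: accept the
    landmark when |depth| / sigma_depth is less than the threshold
    (30 in this project)."""
    return abs(depth) / depth_sigma < max_ratio

def to_world(landmark_r, robot_pose):
    """Transform a landmark from robot frame R to world frame W
    using the robot pose (x, y, theta); planar 2-D sketch."""
    x, y, theta = robot_pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])        # rotation robot -> world
    return R @ np.asarray(landmark_r) + np.array([x, y])
```

After the transform, the landmark state and covariance are appended to the SLAM state vector as described above.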

23
Given Data
  • Almost-perfect odometry data, which must be
    corrupted by noise in order to run SLAM.
  • The sequence of input images.
  • The ground-truth data.
  • Camera intrinsic and extrinsic parameters.
  • Robot motion-model parameters.

25
Mapping
  • Mapping is the first phase of this term project
    and consists of the following steps
  • Run feature extraction and matching on the new
    image.
  • Do data association.
  • Do feature initialization.
  • Use the (almost perfect) odometry data
  • In each time step, add local noise to the
    odometry data instead of running the EKF steps.
  • Do the other feature-initialization steps on the
    new features using this noisy odometry data.
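The noise-corruption step might look like this (a sketch; the noise magnitudes here are placeholders for the given motion-model parameters):

```python
import numpy as np

def corrupt_odometry(poses, sigma_xy=0.05, sigma_theta=0.01, seed=0):
    """Add zero-mean Gaussian noise to each (x, y, theta) odometry
    pose. Returns a new array; the clean poses are left untouched."""
    rng = np.random.default_rng(seed)
    poses = np.asarray(poses, dtype=float)
    # per-column standard deviations: (x, y, theta)
    noise = rng.normal(0.0, [sigma_xy, sigma_xy, sigma_theta],
                       size=poses.shape)
    return poses + noise
```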

27
Visual Simulation
  • We will show, in different colors on the current
    image
  • The newly accepted landmarks
  • The newly rejected landmarks
  • The previously projected landmarks
  • Their associated covariances
  • We will also try to build a 3-D view showing the
    robot pose, the landmarks, and their estimated
    covariances in the global frame.

29
Performance Measurement
  • Testing the consistency of the estimates against
    ground truth.
  • Consistency is tested by calculating the
    normalized estimation error squared (NEES).
  • To improve statistical significance, we calculate
    the average NEES over multiple runs.
  • For a 3-D vehicle pose, the consistency region is
    bounded by the interval [2.36, 3.72]. Above this
    interval the filter is optimistically
    inconsistent; below it, the filter is
    conservatively inconsistent.
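A minimal NEES sketch (the helper names are ours; the averaged value is what gets compared against the consistency interval quoted on the slide):

```python
import numpy as np

def nees(x_est, x_true, P):
    """Normalized estimation error squared for one time step:
    e^T P^{-1} e. For a consistent filter and a 3-D vehicle pose
    the expected value is 3 (the state dimension)."""
    e = np.asarray(x_est, dtype=float) - np.asarray(x_true, dtype=float)
    return float(e @ np.linalg.solve(P, e))

def average_nees(errors, covs):
    """Average NEES over N Monte Carlo runs; `errors` are the pose
    errors (estimate minus ground truth) and `covs` the filter
    covariances at the same time step."""
    return float(np.mean([nees(e, np.zeros_like(e), P)
                          for e, P in zip(errors, covs)]))
```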

31
Research
  • Implementing the visual SLAM system using Tim
    Bailey's FastSLAM simulator.
  • Try different numbers of particles.
  • Compare the performance of the two filters, and
    their differences in the visual SLAM context, on
    our datasets
  • Consistency
  • Efficiency (e.g. FastSLAM seems to handle data
    association better)
  • Accuracy (the RMS vehicle pose error)
  • Computational time

32
References
  • J. Klippenstein, H. Zhang, and X. Wang, "Feature
    Initialization for Bearing-Only Visual SLAM Using
    Triangulation and the Unscented Transform,"
    Proceedings of the 2007 IEEE International
    Conference on Mechatronics and Automation.
  • J. M. M. Montiel, J. Civera, and A. J. Davison,
    "Unified inverse depth parameterization for
    monocular SLAM," in Proc. of Robotics Science
    and Systems, Philadelphia, USA, August 2006.
  • Tim Bailey, et al., "Consistency of the EKF-SLAM
    Algorithm", Proceedings of 2006 IEEE/RSJ
    International Conference on Intelligent Robots
    and Systems (IROS), pp. 3562-3568, October, 2006.
  • Tim Bailey, et al., "Consistency of the FastSLAM
    Algorithm", Proceedings of 2006 IEEE
    International Conference on Robotics and
    Automation (ICRA), pp. 424- 429, May 2006.

33
Thank You
Questions?