Toward image-based localization for AIBO using wavelet transform
A. Pretto, E. Menegatti, E. Pagello, Y. Jitsukawa, R. Ueda, T. Arai
Centre for Engineering, University of Tokyo, Japan
Department of Information Engineering, University of Padua, Italy
Abstract
This paper describes a similarity measure for images to be used in image-based localization for autonomous robots with low computational resources. We propose a novel signature which allows memory saving and fast similarity calculation. The signature is based on the calculation of the 2-D Haar Wavelet Transform of the grey-level image. We present experiments showing the effectiveness of the proposed image similarity measure. The images used were collected with the AIBO ERS-7 robots of the RoboCup team Araibo of the University of Tokyo on a RoboCup field; however, the proposed image similarity measure does not use any information on the structure of the environment and does not exploit the peculiar features of the RoboCup environment.
  • Wavelet signature
  • We use as image signature the 2-D Haar Wavelet Transform of the grey-level values of the image. We decided to stop at the 4th decomposition level and to characterize images only by the detail coefficients (horizontal, vertical and diagonal) of this level (a minimal sketch of this computation is given after this box).
  • Why the Haar Wavelet?
  • very effective in detecting discontinuities (one of the most important features in image-based localization)
  • easily implemented and very fast to compute
  • if one is interested in an image reconstruction phase, Haar Wavelets are not a good choice, because they tend to produce a lot of blocky (squared) artifacts
  • Memory saving
  • a 720x160 grey-scale omnidirectional image: 115.2 Kbyte → only 1.3 Kbyte (the three 4th-level detail sub-bands contain 3 x 45 x 10 = 1350 coefficients).
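The following is a minimal sketch (not the authors' code) of how such a signature can be computed with NumPy: one unnormalized 2-D Haar decomposition step is applied four times and only the horizontal, vertical and diagonal detail sub-bands of the last level are kept. The function names, the normalization and the placeholder image are assumptions.

import numpy as np

def haar_step(a):
    # One unnormalized 2-D Haar decomposition level:
    # pairwise averages/differences along rows, then along columns.
    lo = (a[:, 0::2] + a[:, 1::2]) / 2.0
    hi = (a[:, 0::2] - a[:, 1::2]) / 2.0
    ll = (lo[0::2, :] + lo[1::2, :]) / 2.0   # approximation
    lh = (lo[0::2, :] - lo[1::2, :]) / 2.0   # horizontal details
    hl = (hi[0::2, :] + hi[1::2, :]) / 2.0   # vertical details
    hh = (hi[0::2, :] - hi[1::2, :]) / 2.0   # diagonal details
    return ll, (lh, hl, hh)

def haar_signature(gray, levels=4):
    # Keep only the detail sub-bands of the deepest (4th) level.
    approx = np.asarray(gray, dtype=np.float64)
    for _ in range(levels):
        approx, details = haar_step(approx)
    h, v, d = details
    return np.concatenate([h.ravel(), v.ravel(), d.ravel()])

# A 720x160 grey-level panorama (115,200 pixels, about 115.2 Kbyte at one byte
# per pixel) reduces to 3 sub-bands of 45x10 coefficients, i.e. 1350 values
# (about 1.3 Kbyte if each coefficient is stored in one byte).
image = np.random.randint(0, 256, size=(160, 720))  # placeholder image
signature = haar_signature(image)
print(signature.shape)  # (1350,)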
  • Introduction
  • There are three main problems that any image-based localization technique has to solve:
  • how to reduce the number of images necessary to fully describe the environment
  • how to efficiently store a large data set of reference images (it is common to have several hundred reference images for typical environments)
  • how to calculate the similarity between images in a fast and efficient way
  • Each of the works cited in the references addresses one or more of these problems.
  • The problem
  • How can an image-based localization approach be implemented on a robot with limited storage memory and limited computational resources, such as the Sony AIBO ERS-7?
  • How can the reference images be stored in a memory-saving way?
  • How can they be compared efficiently with the input images?
  • Which features should we rely on, for a general approach not targeted to a particular environment?

Experiments
We tested the system on a RoboCup Four-Legged League 540x360 cm soccer field, using a grid of 13 by 9 reference images. The proposed system does not rely on RoboCup-specific features.
  • Our approach
  • To minimize the number of reference images to be stored, we keep as reference images two 180-degree panoramic views of the environment at every reference location (an illustrative comparison of signatures against the stored references is sketched after this box).
  • We developed an algorithm that allows the ERS-7 robot to autonomously build two 180-degree panoramic images using its standard camera and to stitch them together.
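As an illustration only: the poster does not reproduce the actual similarity measure, so the Euclidean distance used in the sketch below is an assumption, not the measure proposed in the paper. The sketch ranks the stored reference signatures by distance to the signature of the current view; the 13 x 9 grid matches the experimental setup, and the row-major ordering is an assumption.

import numpy as np

def closest_reference(query_sig, reference_sigs):
    # Rank stored reference signatures by Euclidean distance to the query
    # signature; the paper's actual similarity measure may differ.
    dists = np.linalg.norm(reference_sigs - query_sig, axis=1)
    return int(np.argmin(dists)), dists

# Placeholder database: one 1350-element signature per grid cell (13 x 9 = 117).
reference_sigs = np.random.rand(117, 1350)
query_sig = np.random.rand(1350)        # signature of the current panorama
idx, dists = closest_reference(query_sig, reference_sigs)
row, col = divmod(idx, 13)              # assumed row-major grid with 13 columns
print(f"closest reference location: row {row}, column {col}")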
  • References
  • 1. R. Cassinis, D. Duina, S. Inelli, and A. Rizzi. Unsupervised matching of visual landmarks for robotic homing using Fourier-Mellin transform. Robotics and Autonomous Systems, 40(2-3), August 2002.
  • 4. M. N. Do and M. Vetterli. Wavelet-based texture retrieval using generalized Gaussian density and Kullback-Leibler distance. IEEE Transactions on Image Processing, 11(2):146-158, February 2002.
  • 6. E. Frontoni and P. Zingaretti. An efficient similarity metric for omnidirectional vision sensors. Robotics and Autonomous Systems, 54(9):750-757, 2006.
  • 7. J. Gaspar, N. Winters, and J. Santos-Victor. Vision-based navigation and environmental representations with an omnidirectional camera. IEEE Transactions on Robotics and Automation, 16(6), December 2000.
  • 8. H.-M. Gross, A. Koenig, Ch. Schroeter, and H.-J. Boehme. Omnivision-based probabilistic self-localization for a mobile shopping assistant continued. In IEEE/RSJ Int. Conference on Intelligent Robots and Systems (IROS 2003), pages 1505-1511, Las Vegas, USA, October 2003.
  • 11. B. J. A. Kröse, N. Vlassis, R. Bunschoten, and Y. Motomura. A probabilistic model for appearance-based robot localization. Image and Vision Computing, 19(6):381-391, April 2001.
  • 12. E. Menegatti, T. Maeda, and H. Ishiguro. Image-based memory for robot navigation using properties of the omnidirectional images. Robotics and Autonomous Systems, 47(4):251-267, July 2004.
  • 16. M. Vetterli and J. Kovacevic. Wavelets and Subband Coding. Prentice Hall Signal Processing Series, 1995.
  • 17. J. Wolf, W. Burgard, and H. Burkhardt. Robust vision-based localization by combining an image retrieval system with Monte Carlo localization. IEEE Transactions on Robotics, 21(2):208-216, 2005.