1
Sensor
  • Heesang Shin

2
What is a sensor?
  • A sensor is a type of transducer
  • Converts one form of energy into another form of
    energy
  • Antenna
  • converts electromagnetic waves into electric
    current and vice versa
  • Microphone
  • converts changes in air pressure into an
    electrical signal
  • LCD
  • Converts electrical energy into light

3
What is a sensor?
  • In robotics
  • Acoustic
  • Sonar
  • Motion
  • Turn coordinator
  • Light
  • CCD
  • Etc

4
Passive / Active Sensor
  • Passive
  • Rely on the environment to provide the medium for
    observation
  • e.g. a camera requires a certain amount of
    ambient light to produce a usable picture
  • Active
  • Put energy out into the environment to either
    change the energy or enhance it
  • e.g. A sonar sends out sound, receives the echo,
    and measures the time of flight
  • Although a camera is a passive device, a camera
    with a flash is active

5
Active Sensing
  • Active sensing connotes a system that uses an
    effector to dynamically position a sensor for a
    "better look"
  • An active sensor is not the same as active sensing
  • A camera on a pan/tilt head with algorithms to
    direct the camera to turn to get a better view is
    using active sensing

6
Modality
  • Sensors which measure the same form of energy and
    process it in similar ways form a sensor modality
  • A sensor modality refers to what the raw input to
    the sensor is: sound, pressure, temperature,
    light, and so on
  • A modality can be further subdivided
  • e.g. vision can be decomposed into visible light,
    infrared light, X-rays

7
Logical Sensors
  • Abstraction of sensors
  • Introduced by Henderson and Shilcrat
  • A unit of sensing or module that supplies a
    particular percept
  • Similar to an Abstract Data Type in C, e.g.
  • there is a logical sensor called range_360
  • data type
  • range_360 provides an object specifying the polar
    plot
  • public data / method
  • range_360 may use sonar, a laser, or stereo vision
  • private data / method
  • each sensor may use a different way to process its
    signals (see the sketch below)
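
A minimal Python sketch (hypothetical names and placeholder data) of the
range_360 logical sensor treated as an abstract data type: the public
"method" is the polar-plot percept, while each module hides how its own
hardware produces that percept.

from abc import ABC, abstractmethod
from typing import List, Tuple

class Range360(ABC):
    """Logical sensor: a 360-degree polar plot of (bearing_deg, range_m)."""
    @abstractmethod
    def polar_plot(self) -> List[Tuple[float, float]]: ...

class SonarRing360(Range360):
    def polar_plot(self) -> List[Tuple[float, float]]:
        # private processing: convert each transducer's time of flight to a range
        tof_s = [0.01] * 12                              # placeholder echo times
        return [(i * 30.0, 343.0 * t / 2.0) for i, t in enumerate(tof_s)]

class LaserRing360(Range360):
    def polar_plot(self) -> List[Tuple[float, float]]:
        # private processing: down-sample a dense scan to one reading per degree
        scan = [2.0] * 1080                              # placeholder scan, 3 beams/degree
        return [(float(deg), scan[deg * 3]) for deg in range(360)]

# Any behaviour that needs range_360 can use either module interchangeably.
def nearest_obstacle(sensor: Range360) -> Tuple[float, float]:
    return min(sensor.polar_plot(), key=lambda br: br[1])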

8
Logical Equivalence
  • Sensor modules are logically equivalent
  • as long as each logical sensor returns the SAME
    percept data structure
  • they are not necessarily equivalent in performance
    or update rate
  • but they still fall under the same logical sensor

9
Sensing Model
10
Behavioural Sensor Fusion
  • Combines information from multiple sensors into a
    single percept
  • Redundant (Competing)
  • Physical redundancy
  • the sensors both return the same percept
  • one sensor may return a better percept than the
    other in different situations
  • Logically redundant
  • return identical percepts but use different
    modalities or processing algorithms
  • Why? To overcome false positives and false
    negatives (a fusion sketch follows below)
  • False Positive
  • when a sensor leads the robot to believe that a
    percept is present, but it is not
  • False Negative
  • when the robot misses a percept that is present
  • e.g. missing a glass wall with stereo vision, or
    missing thin obstacles such as chair legs with
    sonar
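
A toy sketch (not from the book) of fusing two logically redundant obstacle
detectors: requiring both to agree suppresses false positives, while accepting
either one suppresses false negatives; which rule to favour is a design choice.

def fuse_redundant(sonar_says_obstacle: bool, vision_says_obstacle: bool,
                   suppress: str = "false_positives") -> bool:
    if suppress == "false_positives":
        return sonar_says_obstacle and vision_says_obstacle   # both must agree
    return sonar_says_obstacle or vision_says_obstacle        # either is enough

print(fuse_redundant(True, False))                      # False: treated as a false alarm
print(fuse_redundant(True, False, "false_negatives"))   # True: don't risk missing an obstacle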

11
Behavioural Sensor Fusion
  • Complementary
  • Complementary sensors provide disjoint types of
    information about a percept
  • e.g. urban search and rescue robot
  • search for survivors by fusing observations from
    a thermal sensor for body heat with a camera
    detecting motion
  • neither the thermal sensor nor the camera alone
    provides a complete view
  • Coordinated
  • uses a sequence of sensors
  • cueing or providing focus-of-attention
  • A predator might see motion, causing it to stop
    and examine the scene more closely for signs of
    prey

12
Sensor Fission
  • each sensor reading supports a specific behaviour,
    which leads to an intermediary action
  • the intermediary actions are combined into the
    resulting action
  • mainly a competitive method

13
Action-Oriented Sensor Fusion
  • sensor data is transformed into a
    behaviour-specific representation in order to
    support a particular action
  • e.g. if a cat hears a noise and sees a movement,
    it will react more strongly than if it has only a
    single stimulus
  • covers competing and complementary sensing

14
Sensor Fashion
  • changing sensors as circumstances change
  • covers coordinated sensing

15
Sensor Fission, Fusion and Fashion
16
Sensor Suites: Reference Frames
  • Proprioception
  • relative to internal frame of reference
  • e.g. shaft encoder
  • records wheel revolutions
  • the wheel might slip
  • Exteroception
  • relative to robot frame of reference
  • e.g. camera
  • the environment supplies the current position
  • the camera may be heading in the wrong direction

17
Sensor Suites: Reference Frames
  • Exproprioception
  • relative to external frame of reference
  • e.g. GPS
  • satellites transmit positioning information
  • only available in open-sky environments
  • Sensor suite: the set of sensors for a particular
    robot
  • always has some type of exteroceptive sensor
  • otherwise the robot cannot be considered
    reactive: there would be no stimulus from the
    world to generate a reaction

18
Sensor Suites: Sensor Attributes
  • Field of view (FOV)
  • an exteroceptive sensor has a region of space to
    cover
  • usually expressed in degrees: a wide-angle lens
    will often cover up to 70 degrees, while a regular
    lens may only have a field of view of around 27
    degrees
  • the distance that the field extends is called the
    range
  • Horizontal FOV
  • Vertical FOV
  • FOV and range are critical in matching a sensor
    to an application (see the sketch below)
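
A small sketch of why FOV and range matter when matching a sensor to a task:
the width of the area covered at a distance d is w = 2 * d * tan(FOV / 2), so
a narrow lens sees far less of the scene at the same range.

import math

def coverage_width(fov_deg: float, distance_m: float) -> float:
    # width of the field of view at the given distance
    return 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)

print(round(coverage_width(70.0, 3.0), 2))   # wide-angle lens at 3 m: ~4.2 m wide
print(round(coverage_width(27.0, 3.0), 2))   # regular lens at 3 m:   ~1.44 m wide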

19
Sensor Suites: Sensor Attributes
  • Accuracy-repeatability
  • how correct the reading from the sensor is
  • poor accuracy means little repeatability
  • sensor resolution: how precise the sensor reading is
  • Responsiveness
  • some sensors function poorly in particular
    environments
  • sonar is often unusable for navigating in an
    office foyer with large amounts of glass
  • a normal camera is often unusable in a dark room

20
Sensor Suites: Sensor Attributes
  • Power consumption
  • always a concern for robots
  • the less power they consume, the longer they run
  • amount of power on a mobile robot required to
    support a sensor package is sometimes called the
    hotel load
  • the power needed to move the robot is called the
    locomotion load
  • Hardware reliability
  • physical limitations on how well they work
  • e.g. a Polaroid sonar will produce incorrect range
    readings when the voltage drops below 12V

21
Sensor Suites: Sensor Attributes
  • Size
  • the size and weight of a sensor do affect the
    overall design
  • Computational Complexity
  • an estimate of how many operations an algorithm or
    program performs
  • expressed in big-O notation
  • Interpretation reliability
  • Interpretation algorithms must be reliable
  • errors are hard for an untrained human to catch
  • the robot may "hallucinate" and do the wrong thing
    if a sensor is providing incorrect information

22
Sensor Suite Attributes
  • Simplicity
  • simple sensors are more desirable than complex
    ones that require constant maintenance
  • Modularity
  • designer must be able to remove one sensor and/or
    its perceptual schema without impacting any other
    sensing capability
  • Redundancy (Physical / Logical)
  • a faulty sensor can cause the robot to
    "hallucinate"
  • sensor suites may offer some sort of redundancy
  • Physical redundancy: multiples of the same sensor
  • Logical redundancy: another sensor using a
    different sensing modality can produce the same
    percept or releaser
  • Fault tolerance
  • physical redundancy introduces fault tolerance
  • Robots can be programmed in most cases to
    tolerate faults as long as they can identify when
    they occur

23
Proprioceptive Sensors
  • Shaft Encoder
  • remembers motor activity
  • only an estimate; the wheel could slip on the
    surface
  • Inertial navigation systems (INS)
  • accelerometers measure acceleration
  • integrate the readings to get velocity and
    position (see the sketch below)
  • not suitable for jerky, bumping motion
  • still expensive
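
A minimal sketch (with made-up numbers) of why inertial navigation drifts:
acceleration is integrated twice, so even a tiny accelerometer bias
accumulates steadily in the position estimate.

def dead_reckon(accels_mps2, dt_s):
    v = x = 0.0
    for a in accels_mps2:
        v += a * dt_s          # integrate acceleration -> velocity
        x += v * dt_s          # integrate velocity -> position
    return x

true_accel = [0.0] * 100                    # robot actually standing still for 10 s
biased     = [0.01] * 100                   # small 0.01 m/s^2 accelerometer bias
print(dead_reckon(true_accel, 0.1))         # 0.0 m
print(dead_reckon(biased, 0.1))             # ~0.5 m of phantom travel in only 10 s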

24
Exproprioceptive Sensors
  • Global Positioning System (GPS)
  • Selective Availability, which reduced accuracy,
    used to be a problem
  • On May 2, 2000 "Selective Availability" was
    discontinued as a result of the 1996 executive
    order, allowing users to receive a non-degraded
    signal globally.
  • no more need for position averaging or
    differential GPS techniques to obtain a precise
    position
  • however, receivers exported from the US are still
    restricted from operating above 18 km (60,000 ft)
    altitude and over 515 m/s (1,000 knots)

25
Exproprioceptive Sensors
  • Radio triangulation
  • triangulation is the process of finding
    coordinates and distance to a point by
    calculating the length of one side of a triangle,
    given measurements of angles and sides of the
    triangle formed by that point and two other known
    reference points, using the law of sines (see the
    sketch below)
  • GPS uses trilateration, not triangulation
  • triangulation uses angles
  • trilateration uses distances from known reference
    positions
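
A hedged sketch of triangulation with the law of sines: given the baseline
between two known reference points and the bearing angles each measures to a
target, recover the distance from the first reference point to the target.

import math

def triangulate_range(baseline_m: float, angle_a_deg: float, angle_b_deg: float) -> float:
    # the third angle of the triangle (at the target) follows from 180 degrees
    angle_target = math.radians(180.0 - angle_a_deg - angle_b_deg)
    # law of sines: the side opposite angle B is the range from reference A
    return baseline_m * math.sin(math.radians(angle_b_deg)) / math.sin(angle_target)

print(triangulate_range(100.0, 60.0, 60.0))   # equilateral case: 100.0 m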

26
Proximity Sensors
  • Sonar / Ultrasonic
  • refers to any system for using sound to measure
    range
  • different applications operate at different
    frequencies
  • active sensors
  • time of flight: the time from emission until the
    echo bounces back
  • the speed of sound and the time of flight are
    sufficient to compute the range of the object
    (see the sketch after this list)
  • the speed of sound varies with the environment
  • Infrared (IR)
  • emit near infrared energy and measure whether any
    significant amount of the IR light is returned
  • range of inches to several feet
  • often fail in practice because the light emitted
    is often "washed out" by bright ambient lighting
    or is absorbed by dark materials
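
A small sketch of the time-of-flight calculation described above: the sound
travels out and back, so the range is half of speed times time; the speed of
sound (roughly 343 m/s in dry air at 20 C) varies with the environment.

def sonar_range_m(time_of_flight_s: float, speed_of_sound_mps: float = 343.0) -> float:
    # one-way range: the pulse covers the distance twice (out and back)
    return speed_of_sound_mps * time_of_flight_s / 2.0

print(sonar_range_m(0.01))   # a 10 ms echo corresponds to roughly 1.7 m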

27
Proximity Sensors
  • Contact Sensors
  • Tactile, or touch, done with bump and feeler
    sensors
  • feelers or whiskers can be constructed from
    sturdy wires
  • bump sensors are usually a protruding ring around
    the robot consisting of two layers
  • useful for protecting the robot body from collisions

28
Sonar regions and problems
  • Sonar (ultrasonic) sensor regions are defined as
    four areas
  • Region I: the area associated with the range
    reading; the width of region I is the tolerance
  • Region II: the area that is empty; otherwise the
    range reading would have been shorter
  • Region III: the area theoretically covered by the
    sonar beam
  • Region IV: outside of the beam and not of
    interest

29
Sonar regions and problems
30
Sonar regions and problems
  • Problems
  • Foreshortening
  • sound is broadcast in a cone shape, so it is
    possible that some sound may bounce back before
    other sound reaches the object
  • Specular reflection
  • when the wave form hits a surface at an acute
    angle and the wave bounces away from the
    transducer.
  • glass in particular induces serious specular
    reflection
  • Cross-talk
  • a wave that bounces away may return to another
    sonar and give an erroneous reading
  • if each sonar uses a different frequency, then
    sophisticated aliasing techniques can be applied

31
Computer Vision
  • This is a big chapter on its own (a machine vision
    paper is dedicated to it)
  • Let's go briefly through the book
  • CCD cameras
  • a video camera uses CCD (charge-coupled device)
    technology to detect visible light
  • two ways to "grab" an image from a camera
  • digital camera (USB / FireWire, etc.)
  • frame grabber

32
Computer Vision
  • Colour spaces
  • Binary (Monochrome)
  • on / off (white / black)
  • Gray scale
  • levels of grey
  • RGB
  • Red / Green / Blue components (triple grey scale)
  • HSI / HSV
  • Hue / Saturation / Intensity (value)
  • Colour histogramming
  • count the total number of pixels of each colour
    used (see the sketch below)
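
A minimal sketch of colour histogramming on a tiny hypothetical RGB image:
count how many pixels fall into each coarsely quantised colour bin.

from collections import Counter

def colour_histogram(pixels, bins_per_channel=4):
    step = 256 // bins_per_channel
    # quantise each channel, then count pixels per (r, g, b) bin
    return Counter((r // step, g // step, b // step) for r, g, b in pixels)

image = [(255, 0, 0), (250, 10, 5), (0, 255, 0), (0, 0, 255)]   # made-up pixels
print(colour_histogram(image))   # the two reddish pixels land in the same bin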

33
Computer Vision
  • Visual Erosion
  • degradation in segmentation quality is called
    visual erosion
  • object appears to erode with changes in lighting
  • Region segmentation
  • identify a region in the image with a particular
    colour
  • Thresholding
  • separate the image using a given threshold value
    (see the sketch below)
  • Foreground / Background
  • separate the object of interest into the
    foreground and everything else into the background
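
A minimal sketch of thresholding a grey-scale image (values made up): pixels
above the threshold become foreground (1), the rest become background (0).

def threshold(gray_image, value):
    return [[1 if px > value else 0 for px in row] for row in gray_image]

gray = [[12, 200, 40],
        [180, 255, 30]]
print(threshold(gray, 128))   # [[0, 1, 0], [1, 1, 0]]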

34
Computer Vision
  • Range from Vision
  • Stereopsis / optic flow
  • the way humans perceive the range and depth of an
    object
  • the reason we have two eyes
  • Stereo camera pairs
  • mimic human eyes (see the sketch below)
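
A hedged sketch of how a calibrated stereo camera pair recovers range: depth
is focal length times baseline divided by disparity, so features that shift
less between the two images are farther away (example numbers are made up).

def stereo_depth_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    # depth = f * B / d for a rectified, calibrated stereo pair
    return focal_length_px * baseline_m / disparity_px

print(stereo_depth_m(700.0, 0.1, 35.0))   # a 35-pixel disparity is about 2 m away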

35
Computer Vision
  • Light stripers
  • projecting a line, grid or pattern of dots on the
    environment
  • a regular vision camera observes how the pattern
    is distorted in the image
  • Laser ranging
  • emits light the way sonar emits sound waves
  • very accurate (a range of 30 metres and an
    accuracy of a few millimetres)
  • expensive
  • Texture
  • detects patterns or colour (e.g. carpet)
  • if something is standing on the carpet, it reads
    as occupied
  • strong shadows could create ghost objects

36
References
  • R. Murphy, Introduction to AI Robotics, MIT Press,
    2000.
  • Slides from Dr. Chris Messom.
  • Wikipedia, the free encyclopedia.
  • J. Allocca and A. Stuart, Transducers: Theory and
    Application, Reston, 1984.

37
Questions?
38
Thank You!