Minimum Audible Movement Angle as a Function of the Azimuth and Elevation of the Source

1
Minimum Audible Movement Angle as a Function of
the Azimuth and Elevation of the Source
  • Strybel, Manligas, and Perrott
  • 1992

2
  • In the past, the locations of sounds in a cockpit
    were not very informative.
  • Headphone technology now allows for 3-D
    auditory cues.
  • In 1986, Doll et al. pointed out that auditory
    directional cues could be used to enhance
    situational awareness.

3
Utility of auditory spatial cues depends on 2
factors
  • Quality of the 3-D auditory simulation.
  • Pilot's ability to use the information
    (sensitivity, discernment, etc.).

4
Study
  • Auditory localization: pilot's ability to
    determine the location of a sound.
  • Spatial acuity: pilot's ability to discriminate
    between the positions of two concurrent sources.

5
  • Past research focused on static sounds that were
    real (not simulated), presented on the horizontal
    plane.

6
  • MAA (Minimum Audible Angle): the smallest
    detectable change in the angular position of a
    static sound source.

7
The focus on stationary stimuli is of limited
use in cockpit situations where acoustic events
are moving.
  • Pilot's head is moving
  • Aircraft is moving
  • Auditory stimulus could also be moving

8
  • Relatively few studies about dynamic listening
    conditions.
  • Most studies looked only at 0º azimuth.
  • MAMA (Minimum Audible Movement Angle) is an index
    used to measure dynamic acoustical events.

9
  • Like the MAA, the MAMA increases (i.e., acuity
    worsens) as azimuth angle increases.
  • MAMA is affected by the spectral content of the
    signal and the velocity of the moving sound. (It
    is smallest for broadband stimuli moving at low
    velocities.)

10
  • 2 studies showed that MAMA acuity is somewhat
    poorer than MAA acuity.

11
Purpose of Study
  • To examine auditory spatial conditions more
    representative of those encountered in
    head-coupled display systems.
  • To measure MAMAs for sound sources located
    between 80º and -80º azimuth and between 0º and
    87.5º elevation.

12
How they did it
13
MAMAs were measured at 16 azimuth/elevation
combinations
  • 9 horizontal azimuths at 0º elevation.
  • 5 elevations at 0º azimuth.
  • 2 azimuth locations (-40º, -80º) at 80º
    elevation.
  • The loudspeaker traveled 20 degrees/second,
    clockwise or counterclockwise.
  • Adaptive psychophysical procedure.
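The slide does not spell out which adaptive procedure was used. As a hedged illustration only, here is a minimal 2-down/1-up staircase (a common adaptive psychophysical rule that converges near the 70.7%-correct point), with an idealized noise-free observer standing in for a real listener; all names and parameters below are placeholders, not the study's actual values:

```python
def staircase_mama(true_threshold_deg, start_deg=10.0, step_deg=1.0,
                   max_reversals=8):
    """Minimal 2-down/1-up staircase with an idealized observer.

    The observer is 'correct' whenever the tested movement angle is at
    least its true threshold (real listeners respond probabilistically).
    The angle steps down after 2 consecutive correct trials and up after
    each error; the threshold estimate is the mean of the angles at the
    reversal points.
    """
    angle = start_deg
    consecutive_correct = 0
    last_direction = 0            # -1 descending, +1 ascending, 0 start
    reversals = []
    while len(reversals) < max_reversals:
        correct = angle >= true_threshold_deg   # idealized observer
        if correct:
            consecutive_correct += 1
            if consecutive_correct == 2:        # 2-down rule
                consecutive_correct = 0
                if last_direction == +1:
                    reversals.append(angle)     # peak reversal
                last_direction = -1
                angle = max(angle - step_deg, 0.1)
        else:
            consecutive_correct = 0             # 1-up rule
            if last_direction == -1:
                reversals.append(angle)         # trough reversal
            last_direction = +1
            angle += step_deg
    return sum(reversals) / len(reversals)
```

With `staircase_mama(3.0)` the track oscillates around 3º and the estimate lands near the threshold; the actual study's rule, step sizes, and stopping criterion may well differ.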

14
Results
15
Conclusion
  • MAMA ranges from 1-3 degrees for sources located
    in front of the subject.
  • For sources outside this semi-elliptical area,
    MAMA increases to 3º-10º.
  • No reason to believe that asymmetries exist
    across left/right quadrants, but it's likely that
    differences would exist below the horizon due to
    the differential shadowing effect of the torso.

16
Cockpit Applications
  • Since moving events in the periphery must travel
    further (200-300%) than centrally located events,
    displays of moving acoustic stimuli in the
    cockpit must take into account the location of
    the event in order to ensure that the pilot can
    actually detect the direction of movement within
    a critical period.

17
Spatial Intercoms for Air Battle Managers: Does
Visually Cueing Talker Location Improve Speech
Intelligibility?
  • Bolia, 2003

18
  • Air battle managers sometimes monitor as many as
    8 communication channels simultaneously.
  • Spatial intercoms have been suggested as a way of
    improving intelligibility and situation
    awareness.
  • Situation Awareness (SA) is the idea that by
    localizing a virtual sound that mimics the
    location of the real-world sound, it's easier for
    the listener to keep track of where
    communications are coming from.

19
What if the reverse were true?
  • If a spatialized audio cue can help a listener
    identify a spatial location, can a visual cue be
    used to help a listener attend more to an audio
    cue?

20
To investigate this question, Bolia used a form
of the Coordinate Response Measure (CRM).
  • Call sign
  • Color/number combination
  • "Ready Baron, go to blue five now."

21
  • For this experiment, there were 2 talkers (same
    sex) located at 20º or 90º of azimuth (0º
    elevation).
  • 2 target call signs were assigned.
  • "Baron" in 87% of the trials
  • "Charlie" in 10% of the trials
  • Both "Baron" and "Charlie" were used in 3% of
    the trials
  • "Charlie" was considered higher priority in the
    case of conflict.

22
3 Conditions
  • Diotic
  • Binaural uncued
  • Binaural cued

A modified version of the NASA Task Load Index was
used to rate workload after the trials.
23
Results
  • Binaural conditions were significantly better
    than the diotic condition.
  • But cued conditions did not improve performance
    over uncued conditions.
  • However, cued conditions did not degrade
    performance, were rated lower in task load,
    and were perceived by participants as helpful.

24
Effects of Spatial Intercoms and Active Noise
Reduction Headsets on Speech Intelligibility in
an AWACS Environment
  • Bolia, 2003

25
USAF AWACS: United States Air Force Airborne
Warning and Control System
  • Confusing environment
  • Noisy: sometimes more than 85 dB
  • 8 simultaneous communication channels.

26
2 technologies proposed to ameliorate the
situation
  • Spatial intercoms (shown to improve speech
    intelligibility)
  • Use spatial cues from interaural time
    differences (ITDs), effective below 1500 Hz, and
    interaural level differences (ILDs), effective
    above 3000 Hz.
  • ANR: Active Noise Reduction
  • Attenuates external sound by measuring it, then
    adding a waveform with the same spectral
    characteristics but opposite phase. Good below
    1000 Hz.
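The anti-phase cancellation idea can be shown numerically. This toy sketch (my illustration, not Bolia's model) sums a sinusoidal "noise" with the ANR system's inverted copy: perfect gain and phase give zero residual, while an error in either leaves residual energy.

```python
import math

def anr_residual(freq_hz=200.0, sample_rate_hz=8000, duration_s=0.05,
                 gain_error=1.0, phase_error_rad=0.0):
    """Toy model of active noise reduction.

    The external tone is measured and a copy with the same spectrum but
    opposite phase is added. Returns the mean residual energy per
    sample: zero for a perfect anti-phase copy, positive when the
    cancelling waveform has gain or phase errors.
    """
    n = int(sample_rate_hz * duration_s)
    residual_energy = 0.0
    for i in range(n):
        t = i / sample_rate_hz
        noise = math.sin(2 * math.pi * freq_hz * t)
        # anti-phase waveform produced by the ANR system
        anti = -gain_error * math.sin(2 * math.pi * freq_hz * t
                                      + phase_error_rad)
        residual_energy += (noise + anti) ** 2
    return residual_energy / n
```

In practice the measurement-and-playback loop has latency, which is why cancellation is only achievable at low frequencies, where the wavelength is long relative to that delay.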

27
Both technologies had been studied extensively,
but Bolia wanted to find out how they interact.
The purpose of the study was to determine,
within the context of an AWACS noise environment,
the extent to which spatial intercoms and ANR
contribute to improving speech intelligibility in
a multi-channel listening task, and the degree
to which a facilitative interaction exists.
28
Apparatus
  • Used speech phrases from the CRM (Coordinate
    Response Measure) convolved with Head-Related
    Transfer Functions (HRTFs).
  • HRTFs are digital representations of the spatial
    cues used for sound localization.
  • They capture both ITDs and frequency-dependent
    ILDs for various measured locations.
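As a sketch of what "convolved with HRTFs" means: each ear's signal is the mono source convolved with that ear's head-related impulse response (HRIR, the time-domain form of the HRTF). The HRIRs below are made-up toy filters (a pure delay plus attenuation mimicking ITD and ILD); real HRTFs are measured, location-specific filters, not simple delays.

```python
def convolve(signal, kernel):
    """Direct-form FIR convolution, full output length."""
    out = [0.0] * (len(signal) + len(kernel) - 1)
    for i, s in enumerate(signal):
        for j, k in enumerate(kernel):
            out[i + j] += s * k
    return out

def spatialize(mono, hrir_left, hrir_right):
    """Convolve a mono signal with a left/right HRIR pair to get a
    binaural (stereo) signal, as in HRTF-based spatial rendering."""
    return convolve(mono, hrir_left), convolve(mono, hrir_right)

# Toy HRIRs for a source on the listener's left: the left ear gets the
# sound immediately and at full level, the right ear 3 samples later
# (ITD) and attenuated by head shadow (ILD).
hrir_left = [1.0, 0.0, 0.0, 0.0]
hrir_right = [0.0, 0.0, 0.0, 0.4]

left, right = spatialize([1.0, 0.5, 0.25], hrir_left, hrir_right)
```

Played over headphones, the interaural delay and level difference in `left`/`right` are the cues that make the source appear displaced toward the left.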

29
Experiment
  • Participants assigned a call sign
  • Responded by clicking on color/number matrix
  • Listened to 2 simultaneous phrasesone with a
    call sign, one withoutpresented at 20º and 90º
    of azimuth. Phrases were always in different
    hemifields.

In the non-spatial condition, both phrases were
convolved with the same pair of HRTFs (0º azimuth
and 0º elevation)
30
Asymmetry of Sound Field
[Chart: speech intelligibility for left vs. right
targets, diotic vs. spatial, with ANR on and off;
the sound field was louder in the participant's
right ear.]
31
Results
  • Spatial conditions: 70% correct
  • Diotic: 40% correct
  • Left hemifield: 89% correct
  • Right hemifield: 53% correct
  • With ANR, performance increased 20%

Enhanced intelligibility with ANR may be due to
the increased audibility in the frequency range
in which ITDs operate, i.e. below 1500 Hz
32
Conclusion
  • Generally, ANR doesn't improve speech
    intelligibility.
  • In certain high noise environments, ANR can
    improve intelligibility.
  • Might be applicable to the cockpit environment.