Errors Perceiving Depth with Stereoscopic Mixed Reality Displays

1
Errors Perceiving Depth with Stereoscopic Mixed Reality Displays
  • David Drascic
  • Ergonomics in Teleoperation Control Lab
  • Mechanical Industrial Engineering Dept.
  • University of Toronto
  • http://etclab.mie.utoronto.ca

2
Overview
  • Classification of Mixed Reality Displays
  • Practical Context / Motivation
  • Review of Perceptual Issues
  • Experimental Investigation

Context: Ergonomics
  • what makes mixed reality useful?
  • what are its limitations?
3
Three Modes of Reality
  • DV: direct view
  • directly or
  • through an optical system (e.g. half-silvered mirror)
  • SV: stereoscopic video
  • local (mediated reality) or
  • remote (telepresence)
  • SG: stereoscopic graphics
  • a.k.a. CGI (computer-generated images)

4
What is Mixed Reality?
  • MR combines views of the real world with CGI in a
    unified display.

[Figure: the Reality-Virtuality Continuum, running from the real world (viewed directly or mediated, locally or remotely) to a fully virtual environment (simulation or fantasy). Mixed Reality (MR) spans the middle: Augmented Reality (AR) is mostly real with some CGI elements; Augmented Virtuality (AV) is mostly CGI with some real elements.]
5
Classes of Mixed Reality Displays
  • Monitor Based (Window-on-World)
  • Dynamic Head Mounted Virtual Window
  • Optical See-Through HMD
  • Video See-Through HMD
  • Large Screen Projection

6
Example 1: Offline Teleoperation
  • operator controls local simulation (red lines)
  • commands sent in batch mode to remote robot
  • local simulation super-imposed on static remote
    view
  • stereoscopic displays required for accurate
    spatial perception

7
Example 2: Remote Mining

[Figure: the operator at a local computer controls a virtual robot (graphics) superimposed on the real-world scene at the robot site, where the real robot clears jammed rocks from a crushing machine.]
8
Example 3: Virtual Tape Measure for Minimally Invasive Surgery (video + graphics)

[Figure: endoscopic video with a virtual tape measure overlay reading 8.26 mm.]
9
Example 4: Large Screen Projection Displays
  • high immersion via large screen size
  • most of what you see is virtual graphics, with some Direct Viewing (DV) overlays
  • high-resolution reality (DV) versus low-resolution graphics
  • head tracking
  • viewpoint dependent imaging
  • real-world registration is critical

10
Some Ergonomic Issues with MR Stereo Displays
  • Two Different Functions of MR
  • output: displaying info to user
  • input: getting info from user
  • What are requirements for each use?
  • task dependent

11
MR as Output Device
  • system displays info to user
  • task determines requirements
  • monocular, biocular, or stereoscopic
  • resolution
  • precision / accuracy
  • update rate, lag
  • many useful tasks need only low fidelity

12
MR as Input Device
  • user indicates spatial information by manipulating virtual objects
  • user relies on relative depth perception to
    communicate absolute spatial information
  • simple, quick, effective
  • mixed reality can bridge the communication gap regarding space between computers and people
  • need accurate visual alignment of virtual and
    real objects

13
Problems with Mixed Reality
  • Implementation Problems
  • calibration, measurements
  • differences between appearance of DV, SV, and SG
  • Technological Problems
  • registration, lag, field of view, etc.
  • Perceptual Problems

14
Implementation Problems 1
  • Video Calibration Errors
  • errors measuring stereo camera and/or stereo display parameters
  • these errors warp visual space and distort perceived velocities
  • MR using direct view must be orthoscopic
  • for other MR systems, it depends on the task and circumstances (see the sketch below)
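To make the space-warping point concrete, here is a minimal sketch (not the ARGOS calibration code) of how a mis-measured zero-parallax setting distorts reconstructed depth, assuming a simple parallel-axis stereo model Z = f * b / d; the focal length, baseline, and 2-pixel offset are illustrative values, not the experiment's parameters.

    # A minimal sketch, assuming parallel-axis stereo with depth Z = f * b / d
    # (f = focal length in pixels, b = camera baseline, d = disparity in pixels).
    # All numbers are illustrative; they are not the ARGOS parameters.

    def perceived_depth(d_px, f_px, baseline_m, zero_parallax_error_px=0.0):
        # Depth reconstructed from disparity; a mis-measured zero-parallax
        # (convergence) setting adds a constant pixel offset to the disparity,
        # so the depth error grows with distance and space is warped rather
        # than uniformly scaled.
        return f_px * baseline_m / (d_px + zero_parallax_error_px)

    f_px, baseline_m = 800.0, 0.065              # assumed camera parameters
    for true_z in (0.8, 1.0, 1.3, 1.8):          # target distances later used in Exp 1
        d = f_px * baseline_m / true_z           # disparity the cameras actually produce
        print(f"true {true_z:.1f} m -> perceived {perceived_depth(d, f_px, baseline_m, 2.0):.2f} m")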

15
Implementation Problems 2
  • Video/Graphic Mismatches
  • differences in video and graphics parameters
  • registration errors
  • affects overlay compatibility
  • reduces effective stereo-acuity
  • User Visual Measurement Errors
  • left and right pupil and centre of rotation
  • affects perception of absolute and relative
    distances, warps space and velocity

16
Current Technology Limitations
  • static and dynamic registration mismatches (tracking)
  • restricted fields of view
  • display resolution limitations and mismatches
  • display luminance limitations and mismatches
  • contrast mismatches
  • depth resolution limitations
  • vertical alignment mismatches

17
Fundamental Perceptual Problems
  • Interposition Inconsistencies
  • Accommodation Vergence Conflicts
  • Accommodation Mismatches
  • Absence of Shadow Cues
  • Image Quality Differences

18
Interposition Inconsistencies
  • video-based displays
  • SV never occludes SG (can be faked; see the sketch below)
  • see-through HMD displays
  • reflected SG always transparent
  • DV never occludes SG (can be faked)
  • SG never occludes DV (new tech?)
  • large-screen displays
  • DV always occludes SG (hybrid?)
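The "(can be faked)" note above refers to using knowledge of the real scene's depth to mask the graphics. A minimal sketch of that idea, assuming a per-pixel depth map of the real scene is available (the presentation does not specify how such a map would be obtained):

    # A minimal sketch, not the authors' system: composite graphics over video
    # only where the virtual surface is nearer than the real one, so that real
    # objects appear to occlude virtual ones.  Arrays stand in for per-pixel buffers.
    import numpy as np

    def composite(video_rgb, video_depth, graphic_rgb, graphic_depth, graphic_mask):
        show_graphic = graphic_mask & (graphic_depth < video_depth)
        out = video_rgb.copy()
        out[show_graphic] = graphic_rgb[show_graphic]
        return out

    # Illustrative 1x2-pixel example: the virtual object covers both pixels but is
    # behind the real surface at pixel (0, 1), so only pixel (0, 0) shows it.
    video_rgb     = np.zeros((1, 2, 3), dtype=np.uint8)
    video_depth   = np.array([[2.0, 0.5]])        # metres to the real surface
    graphic_rgb   = np.full((1, 2, 3), 255, dtype=np.uint8)
    graphic_depth = np.array([[1.0, 1.0]])        # metres to the virtual object
    graphic_mask  = np.array([[True, True]])
    print(composite(video_rgb, video_depth, graphic_rgb, graphic_depth, graphic_mask))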

19
Interposition Surface Effects
  • when virtual object moves behind the surface of a real object
  • SG image might break down (be hard to fuse)
    behind real surface
  • real SV objects might appear transparent
  • displacement in apparent depth?
  • role of texture? object complexity?

20
Accommodation Vergence Conflict
  • accommodation distance to screen differs from convergence distance to fused image
  • common problem with most stereo displays
  • likely cause of eye strain
  • perceived distance different from intended

[Figure: the eye accommodates to the screen but converges on the fused image at a different distance, leaving the perceived distance uncertain.]
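A worked sketch of the geometry behind this conflict, assuming a 6.5 cm inter-pupillary distance and the 60 cm viewing distance later used in Exp 1 (the disparities below are illustrative):

    # Accommodation stays at the screen, but vergence follows the fused image,
    # whose distance is set by the on-screen disparity (crossed = in front of
    # the screen, uncrossed = behind).  Geometry from similar triangles.
    from math import atan, degrees

    IPD    = 0.065    # assumed inter-pupillary distance (m)
    screen = 0.60     # viewing distance to the display (m)

    def vergence_deg(distance_m):
        # Convergence angle of the two eyes for a point at this distance.
        return degrees(2 * atan(IPD / (2 * distance_m)))

    def fused_image_distance(disparity_m):
        # Distance of the fused image: D = screen * IPD / (IPD + disparity).
        return screen * IPD / (IPD + disparity_m)

    for disp_mm in (+10, 0, -10):
        d = fused_image_distance(disp_mm / 1000)
        print(f"disparity {disp_mm:+3d} mm: image at {d:.2f} m "
              f"(vergence {vergence_deg(d):.2f} deg), accommodation at "
              f"{screen:.2f} m ({vergence_deg(screen):.2f} deg equivalent)")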
21
Accommodation Mismatches
  • example: real hand touching virtual object
  • accommodation to graphic box (f_SG) ≠ accommodation to real hand (f_DV) on screen
  • if box originally perceived to be at position p, where will hand point?
  • role of depth of field?

[Figure: accommodation distances f_SG (graphic box) and f_DV (real hand at the screen), the perceived box position p, and the resulting disparity.]
22
Image Quality Differences
  • Real objects appear sharp, in focus, with high
    contrast
  • Virtual objects often appear fuzzy, with poor
    focus and low contrast
  • Problems (with all else being equal)
  • Fuzzy images appear further away
  • Low contrast images appear further away
  • Darker images appear further away
  • Accidental Depth Cues?

23
Hypothesized Perceptual Process
  • Computer draws object at Position 1 (Vergence)
  • User perceives object at Position 2 (Accommodation-Vergence conflict)
  • User reaches for object, but perceives error due to binocular disparity (Disparity)
  • User adjusts hand to Position 3, balancing disparity with other cues (Accommodation Mismatch)
  • As hand grasps object, perceived position moves to Position 4 (Occlusion)

Where is the object?
24
Ergonomic Implications
  • Perceptual limitations of MR must influence
    design of interactive systems
  • Accurate perception impossible?
  • errors may be predictable
  • system must take them into account
  • MR using SV and SG easier?
  • both components equally affected by most display
    limitations

25
Our Plan: Measure MR Error
  • measure bias
  • is distance consistently wrong?
  • measure variance
  • different variance for different display
    modalities?
  • Exp 1: MR = video + graphics
  • ARGOS: Augmented Reality through Graphic Overlays on Stereo-Video
  • Exp 2: MR = graphics + direct view
  • MARS: Multipurpose Augmented Reality System

26
Overview of Experiments
  • task: align pointer with target
  • Real Pointer - Virtual Target (RP-VT)
  • Real Pointer - Real Target (RP-RT)
  • Virtual Pointer - Virtual Target (VP-VT)
  • Virtual Pointer - Real Target (VP-RT)
  • two measures of performance
  • bias (mean error is non-zero)
  • variance
  • only one degree of freedom: the depth axis

27
Experimental Hypotheses
  • Hypothesis 1: all means are equal
  • no systematic bias
  • Hypothesis 2: all mean errors are zero
  • Pointer is aligned with the Target
  • Hypothesis 3: all variances are the same
  • no performance difference between real and virtual Pointers and Targets

[Figure: hypothetical frequency distributions of pointer position, centred on the target position.]
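The slides do not show the statistical machinery; as a rough illustration, here is a sketch of how each hypothesis could be checked in Python with scipy on placeholder data (the original analysis may well have used a repeated-measures design, which this does not reproduce):

    # Hypothetical per-condition arrays of signed depth-alignment errors.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    errors = {c: rng.normal(0.0, 1.0, 40)        # placeholder data only
              for c in ("RP-RT", "RP-VT", "VP-RT", "VP-VT")}

    # Hypothesis 1: all condition means are equal (one-way ANOVA).
    f_means, p_means = stats.f_oneway(*errors.values())

    # Hypothesis 2: each condition's mean error is zero (one-sample t-tests).
    p_zero = {c: stats.ttest_1samp(e, 0.0).pvalue for c, e in errors.items()}

    # Hypothesis 3: all condition variances are equal (Levene's test).
    f_var, p_var = stats.levene(*errors.values())

    print(p_means, p_zero, p_var)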
28
Exp 1: Video + Graphics
  • 2 pointers: real and virtual
  • 2 targets: real and virtual
  • 4 target distances: 0.8, 1.0, 1.3, 1.8 metres from camera, viewed on monitor
  • 10 subjects
  • 10 replications per cell
  • subjects sat 60 cm from screen
  • 14 inch monitor

29
Exp 1: Video and Graphics
  • align pointer (real or virtual) with target (real or virtual)

[Figure: the user's view of the pointer and target, and a top view of the apparatus showing the real pointer, real target, and target holders.]
30
Exp 1, Hyp 3: Variances Equal
  • Virtual Pointer has higher variance

[Figure: standard deviation of target position error (arc seconds) versus subject convergence angle to the target on the display (degrees) for the VP-VT, VP-RT, RP-VT, and RP-RT conditions; real target distances of 0.821 to 1.81 m correspond to disparities of roughly -15 to 40 pixels.]
31
Exp 1, Hyp 3: Variances Equal
  • reject
  • only relevant factor: Pointer Type
  • StdDev(Virtual) ≈ 1.6 × StdDev(Real)
  • used different input devices
  • rope for real, trackball for virtual
  • had very different levels of realism
  • update rate
  • accuracy of positioning (static aliasing)
  • smoothness of motion (dynamic aliasing)
  • lack of Target effect suggests that SG images are perceived in desired location

32
Exp 1, Hyp 1: All Errors Equal
  • position error is function of target distance
  • no other significant effects

[Figure: pointer error (metres, roughly -0.005 to 0.020) versus distance of target from cameras (0.75 to 2.00 metres) for the VP-RT, VP-VT, RP-RT, and RP-VT conditions.]
33
Exp 1, Hyp 1: All Errors Equal
  • no factors significant, so cannot reject

[Figure: pointer disparity error (arc seconds, roughly -150 to 100) versus convergence angle to the image of the target (4.50 to 7.00 degrees) for the four conditions.]
34
Exp 1, Hyp 2: Mean Error Zero
  • small negative bias of mean: -25 arc sec
  • 1 pixel ≈ 114 arc sec

[Figure: histogram of position errors for the real and virtual pointers (angular error in seconds of arc); the user is more likely to put the pointer too close to self rather than too far away.]
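For reference, the pixel-to-angle figure follows from the viewing geometry; here is a quick sketch under an assumed monitor width and resolution (neither is stated on the slides), which lands near the quoted 114 arc seconds:

    # Angle subtended by one pixel at the Exp 1 viewing distance.
    from math import atan, degrees

    viewing_distance_m = 0.60    # subjects sat 60 cm from the screen
    screen_width_m     = 0.27    # assumed visible width of a 14 inch monitor
    pixels_across      = 800     # assumed horizontal resolution

    pixel_m = screen_width_m / pixels_across
    arc_sec = degrees(atan(pixel_m / viewing_distance_m)) * 3600
    print(f"one pixel subtends about {arc_sec:.0f} arc seconds")   # ~116 with these assumptions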
35
Exp 1 Summary
  • found no bias due to mixed reality
  • larger variance with SG pointer due to input mode
    and much shorter task time
  • confounded with input device

36
Exp 2: Direct View and Stereo Graphics

[Figure: experimental setup showing the monitor, the pointer (real or virtual), the target (real or virtual), and the viewing angle ß.]
37
Exp 2: Direct View + Graphics
  • same CoRD (Computerised Rope Device) input device for all conditions
  • 2 pointers: real and virtual
  • 2 targets: real and virtual
  • 4 target distances: 0.5, 0.7, 0.9, 1.1 m from screen
  • 5 subjects
  • 12 replications per cell
  • subjects sat 2 m from screen, head fixed
  • 19 inch monitor, 11 degrees field of view (see the check below)
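As a quick consistency check (not from the slides), the quoted field of view follows from the monitor size and viewing distance, assuming roughly 38 cm of visible width for a 19 inch 4:3 monitor:

    from math import atan, degrees

    screen_width_m = 0.38    # assumed visible width of a 19 inch 4:3 monitor
    viewing_m      = 2.0     # Exp 2 viewing distance
    fov_deg = degrees(2 * atan(screen_width_m / (2 * viewing_m)))
    print(f"horizontal field of view = {fov_deg:.1f} degrees")   # about 10.9, close to the 11 on the slide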

38
Exp 2, Hyp 3: Variances Equal
  • Mixed Reality has higher variance: F(1,4) = 25.925, p = 0.007
  • Virtual Pointer more variable than Virtual Target (n.s.)
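The reported F statistic and p-value are consistent; a one-line check using the F-distribution survival function (illustrative, not part of the original analysis):

    from scipy import stats
    print(stats.f.sf(25.925, dfn=1, dfd=4))   # about 0.007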

Direct View + Stereo Graphics: Standard Deviations

[Figure: standard deviation of target position error versus distance of target from screen (0.50 to 1.10 m, subject at 2 m) for the VP-RT, RP-VT, VP-VT, and RP-RT conditions.]
39
Exp 2, Hyp 1: All Errors Equal
  • main effects of Pointer and Target
  • interaction effects of both with Distance

[Figure: pointer position error (metres, roughly -0.04 to 0.06) versus target distance from monitor (0.50 to 1.10 m) for the VP-RT, RP-RT, VP-VT, and RP-VT conditions.]
40
Exp 2, Hyp 1: All Errors Equal
  • main effects of Pointer and Target
  • interaction effects of both with Distance

[Figure: the same errors plotted as pointer position error (arc minutes, roughly -5 to 5) versus target distance from monitor (0.50 to 1.10 m).]
41
Exp 2, Hyp 2: Mean is Zero
  • Unmixed Reality
  • real pointer and target: 12 arc seconds
  • virtual pointer and target: 11 arc seconds
  • 1 pixel is ~28 arc seconds

42
Exp 2 Summary
  • mixed reality shows more variability
  • twice that of virtual reality
  • up to 4 times that of direct view
  • found significant bias due to mixed reality
  • in the direction expected by accommodation vergence conflict
  • role of depth of field still to be determined
  • small bias towards the user for the unmixed
    reality conditions

43
Implications
  • MR without direct view shows greater potential
    for accurate spatial perception
  • MR with direct view shows strong perceptual
    biases that may be linked to accommodation
    vergence conflict and accommodation mismatch
  • MR systems must be designed to take the
    perceptual limitations into account