Title: Visualisation, Animation and Virtual Reality


1
Visualisation, Animation and Virtual Reality
  • Lecture 9
  • Tracking And Interaction

2
Introduction
  • Bob Hobbs
  • K210
  • r.g.hobbs_at_staffs.ac.uk

3
How does VR work?
  • We live in a 3D world
  • We have developed many methods to make sense of
    the world around us
  • VR techniques have to try to recreate these
    methods

4
2D images can be confusing
5
shadows and highlighting create the illusion of
3D
6
but which is closer?
7
occlusion
8
A simple shape
Defined as a set of points, or vertices, with
surfaces (facets) bounded by splines created by
joining points with lines
9
More complex shapes
  • Shape defined as polygons (triangles)
  • Rounded surfaces created by more polygons

10
Defining the surface
  • Colour
  • variation over surface
  • Texture
  • rough, smooth, etc
  • Lighting
  • creates shadowing
  • Reflectance
  • dependent on texture and colour

11
Realism added by surface mapping
12
Lighting and reflectance
  • Exhibits shadowing and shading
  • Gouraud shading
  • Ray-tracing used to calculate light paths based
    on reflectance values
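
A minimal sketch (not the lecture's own code) of the per-vertex
lighting value that Gouraud shading then interpolates across a
facet; the diffuse and specular terms, and the shininess exponent,
are illustrative assumptions, with all vectors given as NumPy arrays:

    import numpy as np

    def shade_vertex(position, normal, light_pos, view_pos,
                     diffuse_colour, reflectance, shininess=32.0):
        # Per-vertex lighting; Gouraud shading interpolates these
        # intensities across the triangle instead of re-evaluating
        # them at every pixel.
        n = normal / np.linalg.norm(normal)
        l = light_pos - position
        l = l / np.linalg.norm(l)
        v = view_pos - position
        v = v / np.linalg.norm(v)

        # Diffuse: proportional to the cosine between normal and light.
        diffuse = max(np.dot(n, l), 0.0) * np.asarray(diffuse_colour, float)

        # Specular: reflect the light about the normal and compare with
        # the view direction; this is what produces highlights.
        r = 2.0 * np.dot(n, l) * n - l
        specular = reflectance * max(np.dot(r, v), 0.0) ** shininess

        return np.clip(diffuse + specular, 0.0, 1.0)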

13
Perspective and Z-buffering
  • Objects appear smaller further away
  • Zero-point
  • Uses Z co-ordinate to compute
  • Relative position
  • Occlusion
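
A minimal sketch, assuming a camera looking down the +z axis, of
how the Z co-ordinate drives both effects: the perspective divide
makes distant objects smaller, and a z-buffer test resolves
occlusion (all names here are illustrative, not from the lecture):

    import numpy as np

    def project(point, focal_length=1.0):
        # Perspective divide: screen position shrinks as z (depth)
        # grows, so objects appear smaller further away.
        x, y, z = point
        return focal_length * x / z, focal_length * y / z, z

    def draw_fragment(zbuffer, image, px, py, z, colour):
        # Z-buffer test: keep this fragment only if it is nearer than
        # whatever has already been drawn at this pixel (occlusion).
        if z < zbuffer[py, px]:
            zbuffer[py, px] = z
            image[py, px] = colour

    # Start each frame with the z-buffer set to "infinitely far".
    zbuffer = np.full((480, 640), np.inf)
    image = np.zeros((480, 640, 3))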

14
Perspective
15
Depth of field
  • Further away objects become hazier
  • Focus attention on nearer objects
  • Occurs naturally but must be added to virtual
    environment

16
Blue hazing with distance
  • look at a distant hill or building
  • fuzzy, less contrast, bluish tinge
  • scattering effect of air
  • our brains get used to it
  • blue objects seem further away
  • red ones closer
  • use in visualisation and VR (also used in garden
    design!)
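
A small sketch of how this aerial perspective can be added to a
virtual environment by blending surface colours towards a pale blue
haze with distance; the haze colour and fog density below are
illustrative values, not from the lecture:

    import numpy as np

    def aerial_perspective(surface_colour, distance,
                           haze_colour=(0.55, 0.65, 0.85), density=0.02):
        # Exponential fog: nearby objects keep their own colour,
        # distant ones fade towards a bluish, low-contrast haze.
        surface_colour = np.asarray(surface_colour, dtype=float)
        haze_colour = np.asarray(haze_colour, dtype=float)
        visibility = np.exp(-density * distance)
        return visibility * surface_colour + (1.0 - visibility) * haze_colour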

17
Anti-aliasing
  • Sharp contrast looks unreal
  • Curved lines appear stepped
  • Blurring edges removes the stepping
  • More natural appearance

18
Underlying geometry
  • Vertices
  • Lines
  • Faces
  • Transforms

19
Geometry pipeline
  • Animation/Interaction (time)
  • Modeling (shapes)
  • Shading (reflection and lighting)
  • Transformation (viewing)
  • Hidden Surface Elimination
  • feeds into the Imaging Pipeline
20
Imaging pipeline
  • fed from the Geometry Pipeline
  • Rasterization and Sampling
  • Texture Mapping
  • Image Composition
  • Intensity and Colour Quantization
  • Framebuffer/Display
  • Computer Monitor
21
Example
22
Wireframe model - Orthographic views
23
Perspective View
24
Depth Cue
25
Hidden Line Removal - add colour
26
Constant Shading - Ambient
27
Faceted Shading - Flat
28
Gouraud shading, no specular highlights
29
Specular highlights added
30
Phong shading
31
Texture Mapping
32
Texture Mapping
33
Reflections, shadows - Bump mapping
34
Basic Analysis
  • 1 Define points in 3D space
  • 2 Define lines and facets which join points
    together
  • 3 Define light sources to generate shadows and
    shading
  • 4 Apply texture to facets
  • 5 Define reflectance properties and colour of
    surface
  • 6 Redraw image as viewpoint changes, applying
    perspective and occlusion to induce reality
    (sketched below)
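
The six steps above can be sketched roughly as follows; this is a
toy Python example with made-up vertices, surface values and
viewpoint, not the lecture's own code:

    import numpy as np

    # 1-2: points in 3D space and the facets (triangles) joining them
    vertices = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]], float)
    facets = [(0, 1, 2), (0, 2, 3)]

    # 3-5: a light source, a texture, and reflectance/colour of the surface
    light_pos = np.array([2.0, 2.0, 5.0])
    surface = {"texture": "brick", "reflectance": 0.4,
               "colour": np.array([0.8, 0.3, 0.2])}

    def redraw(viewpoint, focal_length=1.0):
        # 6: re-project every vertex relative to the current viewpoint,
        # applying the perspective divide; occlusion would then be
        # handled per facet with a z-buffer during rasterisation.
        relative = vertices - viewpoint
        return [(focal_length * x / z, focal_length * y / z)
                for x, y, z in relative]

    frame = redraw(viewpoint=np.array([0.5, 0.5, -3.0]))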

35
Viewing Frustum
  • Controls visibility, depth and perspective of
    scene
  • Linked to virtual camera
  • Stereo has a frustum for each eye
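
A minimal sketch of what the frustum controls: a point in camera
co-ordinates is visible only if it lies between the near and far
planes and inside the field of view (the parameter values below are
illustrative assumptions):

    import math

    def in_frustum(x, y, z, fov_y_deg=60.0, aspect=4/3, near=0.1, far=100.0):
        # Camera looks down +z; reject points outside the depth range,
        # then test against the widening rectangular cross-section.
        if not (near <= z <= far):
            return False
        half_h = z * math.tan(math.radians(fov_y_deg) / 2.0)
        half_w = half_h * aspect
        return abs(y) <= half_h and abs(x) <= half_w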

36
Recap
  • Initialize world
  • Calculate Geometry
  • Draw Wire Frame
  • Render Surfaces
  • Enhance Surfaces and lighting
  • Sensor input and output
37
Stereo
  • Human visual system gets two slightly different
    images, one from each eye
  • Two new camera attributes: distance to the zero
    parallax plane and eye separation
  • Zero parallax - set by the distance of the
    projection plane - objects appear at the screen
    depth
  • Positive parallax - projected objects are on the
    same side as the corresponding eye - objects
    appear behind the screen
  • Negative parallax - projected objects are on the
    opposite side to the corresponding eye - objects
    appear in front of the screen

38
(No Transcript)
39
Symmetric frustum and trim
  • Computing stereo pairs with asymmetric frustum
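
A sketch of how the two asymmetric frusta can be derived from the
eye separation and zero-parallax distance, following the usual
off-axis projection construction (not code from the lecture): each
eye gets the same frustum window shifted sideways so that both eyes
project onto the same zero-parallax plane.

    import math

    def stereo_frusta(fov_y_deg, aspect, near, zero_parallax, eye_sep):
        # Symmetric window at the near plane...
        top = near * math.tan(math.radians(fov_y_deg) / 2.0)
        half_w = top * aspect
        # ...skewed horizontally; the skew is half the eye separation
        # scaled from the zero-parallax distance back to the near plane.
        shift = 0.5 * eye_sep * near / zero_parallax
        left_eye = dict(left=-half_w + shift, right=half_w + shift,
                        bottom=-top, top=top, near=near)
        right_eye = dict(left=-half_w - shift, right=half_w - shift,
                         bottom=-top, top=top, near=near)
        return left_eye, right_eye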

40
Toe-in camera (symmetric frustum)
41
Projection Mechanisms
  • Goal: presenting left and right images
    independently to each eye - main determinant of
    quality is the degree of cross-talk or
    interference
  • Requires "perfect" synchronisation of left and
    right images
  • Active Stereo
  • Passive Stereo

42
Active stereo
  • 120Hz (frame sequential stereo), 60Hz per eye
  • Flicker becomes objectionable for most people
    around 110Hz
  • Project onto any surface
  • Good quality glasses cost upwards of 300
  • Works with monitors for personal viewing

43
Passive Stereo
  • 60-80 Hz per eye
  • Optionally circular polarisers
  • Most suited to public exhibitions (cheap glasses)

44
  • Z-Screen
  • 120Hz (frame sequential stereo)
  • Projector or monitor

45
Anaglyphic Stereo
Like polarised 3D glasses, but the left and right
images are separated by coloured filters
46
Sensing position
  • Tracking

47
Head tracking
48
Head tracking
49
Head tracking
50
Head tracking
51
Head tracking
52
Different sensing methods
53
Accelerometer
Phantom
Fast Track
54
How the tracker works
  • Distance detection

Receiver
Transmitter
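
For an ultrasonic tracker, the distance between transmitter and
receiver follows directly from the time of flight of the pulse;
with three or more fixed receivers these distances constrain the
transmitter's position. A tiny illustrative sketch (the speed of
sound value is an assumption about room conditions):

    SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

    def ultrasonic_distance(time_of_flight_s):
        # One-way time of flight from the transmitter on the tracked
        # object to a fixed receiver.
        return SPEED_OF_SOUND * time_of_flight_s

    # e.g. a pulse arriving 2.9 ms after emission is about 1 m away
    print(ultrasonic_distance(0.0029))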
55
How the tracker works
  • Orientation detection

Receiver
Transmitter
56
Head tracking
57
Head tracking
  • Latency
  • Filtering, to keep readings steady
  • Transients of sound
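
Filtering to keep readings steady is often just simple smoothing of
successive tracker samples; the exponential filter below is an
illustrative sketch (the smoothing factor trades jitter rejection
against extra latency):

    class TrackerFilter:
        def __init__(self, alpha=0.2):
            # alpha near 1: responsive but jittery;
            # alpha near 0: steady but adds lag.
            self.alpha = alpha
            self.state = None

        def update(self, sample):
            # Exponentially weighted average of the raw samples.
            if self.state is None:
                self.state = list(sample)
            else:
                self.state = [self.alpha * s + (1 - self.alpha) * old
                              for s, old in zip(sample, self.state)]
            return self.state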

58
(No Transcript)
59
Human Dynamics
  • Users are described as participants
  • Basic interaction involves control of the camera
    (viewpoint)
  • Exploratory navigation / locomotion
  • Walk-through systems
  • More advanced environments allow interaction
  • Touch, selection, manipulation
  • Referred to as direct manipulation

60
Components of interaction
  • VR model
  • Simulation of body
  • Interaction with virtual body
  • Object pair collision
  • General collision detection

61
VR Model
  • Goal of Being There
  • Presence or Telepresence
  • Held and Durlach 1992, Draper 1998
  • Must model expectations -> realism
  • Ideal VR model must immerse the participant in
    visual, audio, touch, smell and taste
  • Humans can process several audio streams and can
    focus on and segregate one - Wenzel 1992

62
VR model - Immersion
  • Surrounds body
  • fills visual field
  • extensive
  • inclusive (replaces reality)
  • Vivid
  • human body
  • in CAVE actual body can obscure projection of
    virtual objects
  • In HMD body must be represented

63
VR model - HCI
  • Mouse and keyboard have two problems
  • gulf of execution
  • gulf of evaluation
  • Hutchins 1986
  • Direct Manipulation paradigm
  • Tracked HMD is the simplest form - 1 to 1 mapping,
    low cognitive overhead
  • Using a mouse - must map actions to different
    translations

64
VR Model - Interaction
  • Immersion and tracking rely on registration
  • Registration implies that the motion of limbs is
    accurate
  • Better appreciation of the 3D environment
  • Cannot lose interaction - reduces gulf of
    execution
  • Gulf of evaluation reduced when the whole virtual
    body is used - Slater and Usoh 1994, Mine 1997

65
Simulation of Body
  • Body model is the description of the interface
  • eyes are visual interface, ears are audio
    interface
  • geometric description drawn from egocentric point
    of view
  • description of hand and fingers forms basis of
    grasping simulation for picking up objects
    (Boulic 1996)

66
Simulation of Body - Building the body
  • The more points representing the body, the more
    realistic the movement
  • Up to 90 points for motion capture in animation
  • Standard for the human skeleton (H-Anim 1999)
  • More typically head, torso and both hands
  • Movement inferred from these limited points
  • Inverse kinematics problem - infinitely many
    possible limb configurations in the virtual
    environment, so a consistent constraint is needed
    (sketched below)
  • Elbow position in a 4-tracker system (Badler, 1993)
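
As a rough illustration of the inverse kinematics problem (not the
Badler formulation itself), the sketch below recovers one consistent
elbow configuration for a planar two-link arm from tracked shoulder
and wrist positions; the names and the choice of one solution are
assumptions for the example:

    import math

    def two_link_ik(shoulder, wrist, upper_len, forearm_len):
        # Vector from shoulder to wrist, clamped to the reachable range.
        dx, dy = wrist[0] - shoulder[0], wrist[1] - shoulder[1]
        dist = max(min(math.hypot(dx, dy), upper_len + forearm_len), 1e-9)

        # Law of cosines gives the interior elbow angle.
        cos_elbow = (upper_len**2 + forearm_len**2 - dist**2) / (2 * upper_len * forearm_len)
        elbow = math.acos(max(-1.0, min(1.0, cos_elbow)))

        # Shoulder angle: direction to the wrist minus the triangle
        # offset, picking one of the two possible solutions (i.e. a
        # consistent constraint on the otherwise ambiguous pose).
        cos_off = (upper_len**2 + dist**2 - forearm_len**2) / (2 * upper_len * dist)
        shoulder_angle = math.atan2(dy, dx) - math.acos(max(-1.0, min(1.0, cos_off)))
        return shoulder_angle, elbow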

67
H-Anim
Joint hierarchy (from the H-Anim skeleton): Humanoid root -
Sacroiliac, vl5 (spine), Skullbase; left and right legs - Hip,
Knee, Ankle, Midtarsal; left and right arms - Shoulder, Elbow,
Wrist
68
Simulation of Body - Tracking the participant
  • Choice of system depends on 5 factors
  • accuracy, resolution, range, lag, update rate
  • Many different tracking technologies
  • Meyer 1992
  • frequency and time
  • ultrasonic time-of-flight measurement
  • Pulsed Infra-red
  • GPS
  • Optical Gyroscopes
  • Phase difference

69
Simulation of Body - Tracking the participant
  • Spatial Scan
  • Outside-in
  • Inside-out
  • Inertial sensing
  • mechanical gyroscope
  • Accelerometer
  • Mechanical Linkages
  • Direct - Field Sensing

70
Interaction with virtual Body
  • Limitations mean reliance on metaphors for
  • object manipulation (grasping and moving)
  • locomotion (movement)
  • Limitations in haptics place constraints on what
    the virtual environment can simulate

71
  • Sensors in joints detect position
  • 3D viewer updates
  • Robot applies force to joints
  • Force is felt on hand

72
Object Manipulation
Scene graph before and after: when grasping, Object P is
re-parented from the World to Hand H (World - Body B - Hand H -
Object P), so it follows the hand; when releasing, Object P is
re-parented back under the World; Object O remains under the
World throughout
73
Object Manipulation
  • Hand posture may not be tracked - makes grasping
    difficult
  • Must establish a point at which union is deemed
    to have taken place
  • Moved by repositioning in the scene graph, as
    sketched below
  • Robinett and Holloway 1992
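
Repositioning in the scene graph amounts to re-parenting the
grasped object under the hand node, and back again on release; a
minimal sketch of the idea (a real system would also adjust the
object's transform so it does not jump when re-parented):

    class Node:
        # Minimal scene-graph node: a name, a parent and children.
        def __init__(self, name, parent=None):
            self.name, self.parent, self.children = name, None, []
            if parent is not None:
                parent.attach(self)

        def attach(self, child):
            # Re-parent: detach from the old parent, attach under self.
            if child.parent is not None:
                child.parent.children.remove(child)
            child.parent = self
            self.children.append(child)

    world = Node("World")
    body = Node("Body B", parent=world)
    hand = Node("Hand H", parent=body)
    obj = Node("Object P", parent=world)

    hand.attach(obj)    # grasping: object now follows the hand
    world.attach(obj)   # releasing: object returned to the world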

74
Locomotion
  • Tracker has a limited range
  • Must use a locomotion metaphor to move greater
    distances
  • Locomotion is on an even plane, but the virtual
    terrain may not be
  • Collision detection can be employed to raise or
    lower the participant accordingly
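
Raising or lowering the participant can be as simple as looking up
the terrain height beneath the new horizontal position after each
locomotion step; a small sketch, where terrain_height is an assumed
height-field lookup and the eye height is an illustrative value:

    def follow_terrain(position, terrain_height, eye_height=1.7):
        # position is (x, y, z); sample the ground under (x, z) and
        # place the viewpoint eye_height above it, raising or
        # lowering the participant as needed.
        x, _, z = position
        ground_y = terrain_height(x, z)
        return (x, ground_y + eye_height, z)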

75
Directions of locomotion
  • Fly in direction of aim
  • Fly in direction of pointing
  • Fly in direction of gaze
  • Fly in direction of torso
76
Books and Articles
  • The Handbook of Virtual Environments (2002), Kay
    Stanney (ed), Lawrence Erlbaum.
  • Isdale, J., 1998, What is VR?
    http://www.isdale.com/jerry/VR/WhatIsVR.html
  • Kalawsky, R., 1993, The Science of Virtual
    Reality and Virtual Environments, Addison Wesley.
  • Rheingold, H., 1991, Virtual Reality, Secker and
    Warburg, London.
  • Wilson, J.R., D'Cruz, M., Cobb, S. and Eastgate,
    R., 1996, Virtual Reality for Industrial
    Applications, Nottingham University Press.

77
Resources
  • www.vrweb.com (VR Solutions Company)
  • www.barco.com/projection_systems/virtual_and_augmented_reality/
  • www.sgi.com (VR Solutions Company)
  • www.ptc.com (free modelling program)
  • www.sense8.com (trial VR program)
  • www.crystalspace.com (free Games Engine)