Transcript and Presenter's Notes

Title: Physical HCI


1
Physical HCI
  • Beyond the Point-N-Click/Keyboard/Screen
  • Human Computer Interface

2
Some key concepts
  • The computer stores and manipulates information
  • Useful information reflects the physical state of
    the external world
  • Especially the world of the human user
  • To make a computer more effective in interacting
    with the world, provide ways to get information
    into and out of the computer
  • Input: sense the physical world (sensors)
  • Output: change the physical world (actuators);
    a toy sense-and-act loop is sketched below
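As a rough illustration of the sense-and-act loop listed above, here is a minimal Python sketch; the thermostat-style "world", sensor and actuator are entirely made up for illustration.

import random

def read_sensor(world):
    """Hypothetical sensor: measure the world's state, with a little noise."""
    return world["temperature"] + random.gauss(0.0, 0.1)

def drive_actuator(world, heater_power):
    """Hypothetical actuator: heating changes the physical state."""
    world["temperature"] += 0.1 * heater_power - 0.05   # heat in, losses out

def control_loop(setpoint=22.0, steps=200):
    # Sense -> decide -> act: information about the physical world comes in
    # through a sensor, and a decision goes back out through an actuator.
    world = {"temperature": 18.0}             # stand-in for the external world
    for _ in range(steps):
        reading = read_sensor(world)          # input: sense the physical world
        power = max(0.0, setpoint - reading)  # trivial proportional decision
        drive_actuator(world, power)          # output: change the physical world
    return world["temperature"]

if __name__ == "__main__":
    print(f"final temperature ~ {control_loop():.1f} C")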

3
  • Most of this material is from notes on a course
    on Human Interface Technologies taught at the
    University of British Columbia, Canada
  • http://www.ece.ubc.ca/elec596/
  • Instructor: Sidney Fels

4
Course (ELEC596) Basis
  • The communication of human experience is central
    to the future of computing
  • Techniques needed for
  • sensing, encoding, transmitting, storing,
    indexing, retrieving, compressing, recognizing
    and synthesizing
  • Human body has many I/O channels
  • Integrate Cognitive, Physical and Emotional
    aspects of interaction
  • Interface should disappear.

5
Human Information Processing
  • Input: ways to send information to humans
  • Visual channel
  • Auditory channel
  • Position and Motion Sensing Channel
  • Somatic (touch) Channel
  • Taste and Smell Channels
  • Output: ways to receive information from humans
  • Intentional
  • neuromuscular, movable, verbal
  • Non-intentional
  • Galvanic Skin Response (GSR), Heart Rate, Brain,
    Muscle, other
  • Decisions
  • Tracking
  • Memory
  • Learning
  • Individuals vs. Groups

6
Visual Channel
  • Light/dark, color, depth (binocular and
    apparent), size, motion
  • Types of eye movement (six muscles)
  • compensatory (must have target)
  • pursuit (must have target)
  • Tremor, flick and drift
  • saccadic (jump from one fixation to another)
  • Perceiving Motion
  • 8Hz gives sensation of motion
  • 5 ways to make a light move
  • Familiarity helps interpret movement
  • Movement implies life
  • Movement links images (strongly)

7
Five ways to make a spot of light appear to move
Goldstein, E. Bruce, Sensation and Perception, 3rd
ed., Wadsworth
8
Auditory Channel
  • Senses vibration of air molecules
  • Frequency and amplitude
  • Localization (two ears)
  • Well-adapted to speech

9
Position and Motion Sensing
  • Inner ear has mechanisms for angular position and
    acceleration
  • vestibular sensing system
  • like a biological gyroscope
  • three semicircular canals per ear (six in total),
    oriented in roughly orthogonal planes
  • head movement and eye movement coordinated
    instantaneously
  • Body has proprioceptors
  • embedded in muscles, joints and tendons
  • provide kinesthetic sensation for position
    information
  • important for balance

10
Somatic channel
  • Heat and cold (separate sensors)
  • Touch (pressure)
  • Rate is very important
  • light touch quickly applied produces sensation
  • Hair acts as lever
  • same as proprioceptors
  • Pain sensing
  • mechanical, chemical, thermal or electrical
    sensitive
  • Critical feedback channel for manual tasks
  • Considerable work with touch and force feedback
  • haptic feedback

11
Taste and Smell
  • Chemical senses
  • Taste buds for
  • sensations of sour, salty, bitter and sweet
  • extremely complex and poorly understood
  • Olfactory cells for
  • different theories: chemical, infrared
    absorption,
  • different perceptual mappings
  • smell prism
  • four odors: fragrant, acrid, burnt and caprylic
    (a fatty acid; "an unpleasant odor like that of
    goats or sweat", Dictionary.com)
  • Acuity is great - 10,000 times more sensitive
    than taste
  • Negative adaptation occurs

12
Summary of Input Channels
  • Usually combination of senses active
  • such as hand-eye coordination: co-interpreting
    visual and motion channels
  • We also can sense
  • time
  • protensity: "the attribute of a mental process
    characterised by its temporality or movement
    forward in time" (cancerweb.com)
  • probability
  • intensity
  • Break-off phenomenon
  • Sensory detachment, shutdown

13
Intentional Output: Neuromuscular
  • Motor control associated with cerebral cortex
  • Volitional and non-volitional
  • can see in facial expression
  • Muscles contract when stimulated by nerves

14
Intentional Output: Movable Controls
  • Affordances
  • keyboards, touch pads, phone dials, etc.
  • verbal control/non-verbal control
  • tongue movement
  • breath control
  • facial control
  • gait
  • hand motion

15
Non-intentional Output
  • GSR
  • Surface conductance of skin changes
  • Related to mental activity
  • "A change in the ability of the skin to conduct
    electricity, caused by an emotional stimulus,
    such as fright" (dictionary.com)
  • Heart response
  • Resting rate around 72 beats/min
  • varies from 45 to 90 normally
  • Change related to mental state
  • Measure electrical change during beating
  • electrocardiogram (EKG)
  • signal processing of the EKG is correlated with
    stress for human input (Rowe, 1998)
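The slide above mentions deriving information from the EKG. As a small, hedged illustration (peak detection itself is omitted, and this is not Rowe's stress analysis), average heart rate can be computed from the times of successive R peaks:

def heart_rate_bpm(r_peak_times_s):
    """Average heart rate from the times (in seconds) of successive R peaks."""
    intervals = [b - a for a, b in zip(r_peak_times_s, r_peak_times_s[1:])]
    mean_rr = sum(intervals) / len(intervals)   # mean R-R interval, seconds
    return 60.0 / mean_rr                       # beats per minute

# Peaks about 0.83 s apart correspond to a typical resting rate of ~72 beats/min.
print(round(heart_rate_bpm([0.0, 0.83, 1.66, 2.49, 3.32])))   # -> 72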

16
Non-intentional Output
  • Brain Response
  • Brain produces electrical activity under various
    conditions
  • electroencephalogram (EEG)
  • difficult to interpret what is going on
  • Muscle Response
  • Nerves electrically stimulate muscles
  • electromyogram (EMG)
  • At rest, 3-4 pulses; thinking about moving or
    actually moving will increase this
  • Reliable measure of fatigue cost (Inman et al.)
  • Gradient may be useful

17
Summary
  • Multitude of input/output channels
  • all active at once
  • I/O mechanisms usually depend upon
  • cognitive context
  • emotional contexts
  • All these channels are available to assist humans
  • Human Interface Technology is about finding ways
    to manipulate and/or measure these channels for
  • improved performance (cognitive, physical or
    emotional)
  • entertainment and expression

18
Driving Trends for Human Interface Tech.
  • Virtual Reality, Immersive Environments,
    Augmented Reality
  • Ubiquitous Computing/Intelligent Environments
  • Wearable Computing, Tangible Bits
  • Games, Arts, Interactive Theatre, Interactive Art
  • WWW, Agents, Collaborative work

19
Wearable Computing
  • Wearable examples
  • video camera (glasses)
  • heads up display (glasses)
  • compute device (shoes)
  • body monitoring devices
  • communication devices
  • tracking devices
  • audio devices
  • etc.

20
Wearable Computing
  • Applications
  • altered realities
  • freeze frame, color
  • augmented realities
  • extra information such as people id tags
  • prosthetics visual, audio, memory
  • Social implications?
  • New protocols possibly needed
  • security

21
Virtual Reality/Environments
  • Real-time, interactive graphics with 3D models;
    display technology that gives the user immersion
    in the model world with direct manipulation
  • Popular in late 80s and early 90s
  • is changing to interactive information
    visualization
  • Drove a lot of HIT
  • 3D graphics, trackers, gloves, head-mounted
    displays and more

22
VR/VE
  • Applications
  • entertainment
  • vehicle simulation
  • airplanes, cars, expensive machinery
  • physical data visualization
  • planet surfaces
  • NMR data
  • information visualization
  • chemical models
  • mathematical relationships

23
VR/VE
  • Research problems
  • Visual displays
  • field of view, resolution
  • Audition (speech and non-speech, input and
    output)
  • Haptics (force feedback and tactile feedback)
  • Tracking
  • Emotion
  • Motion sickness
  • Software tools and models
  • Evaluation

24
VR/VE
  • Depends on
  • high speed computing
  • high speed rendering
  • low latency
  • good engineering design

25
NCSA CAVE
  • The CAVE (Cave Automatic Virtual Environment) is
    an integral part of the research activities of
    the NCSA Visualization and Virtual Environments
    Group.
  • Its true stereoscopic capabilities, coupled with
    its uniquely immersive design, enable scientists
    and researchers to interact with their data in
    ways never before possible.
  • An atmospheric scientist, for example, can
    actually "climb inside" of a hurricane and
    visualize its complex and chaotic elements from
    any angle or visual perspective.
  • A biological researcher, examining a tightly
    coiled strand of DNA, can virtually "unravel"
    this strand and manipulate it in an environment
    that preserves the critical depth information of
    the data.

26
NCSA CAVE
  • The CAVE works by reproducing many of the visual
    cues that your brain uses to decipher the world
    around you
  • differing perspectives presented by your eyes
  • depth occlusion
  • parallax, etc.
  • The CAVE provides true stereoscopic imagery
    through the use of four rear-projected screens
    using an active stereo system (flicker glasses)
  • The CAVE has an extremely advanced tracking
    system that enables it to constantly track the
    position and orientation of the special tracked
    glasses and the CAVE Wand.
  • The person wearing the tracked glasses controls
    the viewpoint of the CAVE.
  • They can look around the corner of an object,
    step behind it, look underneath it, or anything
    else that they could do in real life.

27
Using Elumens VisionDome to interact with an
architecture design
http://www.mediarelations.ksu.edu/WEB/News/Webzine/0102/visiondome.html
http://www.elumens.com/
28
World Wide Web/Info. Spaces
  • Cyberspace, Information Space
  • its own reality
  • mediates human-human interaction
  • Must use Human Interface Technology to access
    this space
  • intelligent agents
  • mobile, goal oriented, user context awareness
  • Computer application interfaces
  • Enabling Technologies
  • browsers, GUIs, direct manipulation devices
  • email agents, meeting scheduling agents
  • face recognition/synthesis
  • speech synthesis/recognition

29
Entertainment, Art, Music
  • Music led the push for many alternate controllers
  • Keyboards, wah-wah pedals, pitch benders
  • Electronic sound synthesizers
  • Artists often push boundaries of technology to
    explore
  • human emotion
  • concepts and philosophy
  • expression
  • Video games drive H.I.T.

30
Early synthesizers
  • Theremin
  • Besides looking like no other instrument, the
    theremin is unique in that it is played without
    being touched. Two antennas protrude from the
    theremin - one controlling pitch, and the other
    controlling volume. As a hand approaches the
    vertical antenna, the pitch gets higher.
    Approaching the horizontal antenna makes the
    volume softer.
  • http://www.thereminworld.com/
  • Sackbut
  • First voltage-controlled synthesizer: an
    instrument in which the operator would control
    three aspects of sound through operations on the
    keyboard in three co-ordinates of space. Vertical
    pressure was to correspond to volume, lateral
    pressure to pitch change, and pressure away from
    the performer to timbre.
  • http://www.hughlecaine.com/en/sackbut.html

31
Entertainment, Art, Music
  • Technologies
  • video processing and integration
  • gesture sensing and recognition
  • air guitar
  • wireless applications
  • robotics
  • image processing
  • high speed graphics
  • alternate controllers of all shapes and sizes

32
Technologies to stimulate and respond to human
information channels
33
Visual Display Technologies
  • Cathode ray tube (CRT)
  • Liquid Crystal Display (LCD)
  • Head Mounted Displays (HMD)
  • Projectors
  • CRTs
  • LCDs
  • Virtual Retinal Display (VRD)
  • Scan light directly onto the retina; no screens!
  • Stereo (3D)
  • Various technical approaches

34
Head Mounted Displays (HMDs)
  • Put one screen on each eye
  • Typical for VR applications
  • Trades off field-of-view with resolution
  • Advantages
  • cost, size
  • very immersive
  • Disadvantages
  • resolution
  • comfort
  • rotational error
  • motion sickness

35
Audio
  • Audio Displays (output)
  • Sound and voice are important at the interface
  • notice if it is absent, mismatched in time or
    appearance, or too loud/soft
  • Spatialization (3D audio)
  • Speech synthesis
  • Music synthesis
  • MIDI (Musical Instrument Digital Interface)
  • Audio Input
  • speech recognition
  • partially working, limited application
  • emotion recognition
  • some work being done here
  • environmental monitoring
  • position and range sensing

36
Audio Displays
  • Benefits
  • eyes free
  • rapid detection
  • alerting
  • backgrounding
  • parallel listening
  • acute temporal resolution
  • affective response
  • auditory gestalt formation
  • trend spotting
  • Disadvantages
  • low resolution
  • limited spatial resolution
  • lack of absolute values
  • lack of orthogonality
  • audio parameters not perceived independently
  • annoyance
  • interference with speech
  • not bound by line of sight
  • absence of persistence
  • no printout
  • user limitations

37
Audio Technologies: MIDI
  • Standard instrument style controllers
  • keyboard
  • wind instruments
  • guitars (no strings attached!)
  • drums
  • Typical controllers added to instruments
  • Wheels, Aftertouch, Switches, Pedals
  • Ribbons, Joysticks, Breath Input, Other?
  • Weird controllers
  • Lightning, Thunder
  • Radio Baton (Max Mathews, CCRMA)
  • BioMuse (EMG based system)
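Whatever the controller's physical form, what it ultimately sends are ordinary MIDI messages. A minimal sketch of the raw bytes of a note-on/note-off pair, constructed by hand rather than with any particular MIDI library:

def note_on(note, velocity=100, channel=0):
    """Three raw MIDI bytes: status (0x90 | channel), note number, velocity."""
    return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

def note_off(note, channel=0):
    """Status 0x80 | channel, note number, release velocity 0."""
    return bytes([0x80 | (channel & 0x0F), note & 0x7F, 0])

# Middle C (note 60) on MIDI channel 1 (encoded as 0): a keyboard, a breath
# controller or a "weird" controller such as Lightning or BioMuse all end up
# emitting messages like these.
print(note_on(60).hex(), note_off(60).hex())   # 903c64 803c00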

38
Lightning
http://www.buchla.com
  • LIGHTNING II is a specialized MIDI controller
    that senses the position and movement of handheld
    wands and transforms this information to MIDI
    signals for expressive control of electronic
    musical instrumentation
  • Basically, LIGHTNING II senses the horizontal and
    vertical position of each hand, for a total of
    four independent coordinates. From this
    information, LIGHTNING's digital signal processor
    computes instantaneous velocity and acceleration,
    and performs detailed analysis of gesture. An
    easily mastered, musically oriented interface
    language allows the user to define relationships
    between various gestures and potential musical
    responses.
  • Performance gestures can be analyzed for
    direction and velocity and can be used to
    generate a variety of notes as well as other
    musical events. Multi-dimensional zoning
    capability can be used to create different
    musical responses in different regions.
    Everything you need to create the conceptual
    ensemble (an invisible, acoustic virtual
    reality).
  • User definable scale and tuning tables allow one
    to determine the range and selection of notes
    occurring along a horizontal or vertical axis.
    Pitches can be in any order, and the boundaries
    can be set wherever desired, facilitating the
    creation of spatial instruments and imaginary
    orchestras
  • LIGHTNING II features a conducting facility that
    can analyze a conductor's gestures, display
    deviations from a preset tempo, and signal errors
    such as missed beats. Simultaneously, LIGHTNING
    can transmit a synchronous MIDI clock for
    controlling external sequencers and output
    programmed note data to accompany specific beats
    within a measure.
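Lightning's actual gesture analysis is proprietary; purely to illustrate the general idea of deriving velocity and acceleration from successive position samples, here is a finite-difference sketch:

def derivatives(positions, dt):
    """Estimate velocity and acceleration from evenly sampled positions."""
    velocity = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    acceleration = [(b - a) / dt for a, b in zip(velocity, velocity[1:])]
    return velocity, acceleration

# One axis of one wand, sampled every 10 ms, during an accelerating sweep.
v, a = derivatives([0.00, 0.01, 0.03, 0.06, 0.10], dt=0.01)
print([round(x, 2) for x in v])   # [1.0, 2.0, 3.0, 4.0]   units per second
print([round(x, 1) for x in a])   # [100.0, 100.0, 100.0]  units per second^2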

39
Thunder
http://www.buchla.com
  • THUNDER IS A SPECIALIZED MIDI CONTROLLER that
    senses various aspects of the touch of hands on
    its playing surface, and transmits the resultant
    gestural information via MIDI (Musical Instrument
    Digital Interface) to responsive electronic
    instrumentation.
  • THUNDER is an alternative controller. All of
    THUNDER's keys respond to pressure; some sense
    position as well; others incorporate
    light-emitting diodes used to indicate key status
    or currently selected options.
  • Called STORM, THUNDER's built-in operating
    language performs the essential function of
    defining the potential interaction between a
    musician and his instrument. Designed for use by
    musicians (programming experience not required),
    STORM is both user friendly and musically
    powerful.
  • Editing procedures are simple and consistent;
    menu-labeled "softkeys" provide immediate and
    efficient access to data. Instrument setups
    ("configurations") can be stored in internal
    memory or on plug-in data cards.
  • In addition to configuring itself, THUNDER can
    direct auxiliary equipment to follow suit.
    Program change messages can be routed to any
    combination of MIDI channels and instruments.
    System exclusive messages can be captured by
    THUNDER, stored with configurations, and
    transmitted on command.
  • THUNDER's effects can generate a MIDI bombardment
    of gesture-controlled multiple echoes combined
    with various sorts of transpositions, fades and
    conditional branches

40
Radio-Baton
The Radio-Baton is a device which tracks the
motions of the tips of 2 batons in
3-dimensional space.
41
BioMuse
  • In 1992 BioControl introduced the BioMuse, a
    powerful, 8-channel "biocontroller" that acquires
    and analyzes any type of human bioelectric
    signal, and then outputs code to control other
    processor-based devices. It allows users to
    control computer functions directly from muscle,
    eye movement, or brainwave signals, bypassing
    entirely the standard input hardware, such as a
    keyboard or mouse.
  • It receives data from four main sources of
    electrical activity in the human body:
  • muscles (EMG signals),
  • eye movements (EOG signals),
  • the heart (EKG signals),
  • and brain waves (EEG signals).
  • These signals are acquired using standard
    non-invasive transdermal electrodes.
  • http://www.biocontrol.com/biomuse.htm

42
  • In a nutshell, neural interface refers to a
    direct data link between a computer and the human
    nervous system.
  • Ideally, the user can control the activities of
    the computer directly from nerve or muscle
    signals without the need for conventional
    interface devices such as a keyboard or mouse.

http://www.biocontrol.com/biomuse.htm
43
Audio Displays
  • Audification
  • converting time series into sound
  • seismic data, stock data, temperature, pressure
  • want to leverage off familiarity
  • Sonification
  • "a mapping of numerically represented relations in
    some domain under study to relations in an
    acoustic domain for the purposes of interpreting,
    understanding or communicating relations in the
    domain under study" (Scaletti, 1994)
  • state transitions mapped to sounds, data set
    comparisons
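As a small, hedged illustration of audification/sonification (the mapping, scale and file name are arbitrary choices, not taken from Scaletti), the following standard-library Python maps each value of a data series onto a pitch and writes one short tone per value:

import math, struct, wave

def sonify(values, wav_path="sonified.wav", rate=8000, note_s=0.2,
           f_lo=220.0, f_hi=880.0):
    """Map each data value linearly onto a pitch and write one tone per value."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0
    samples = []
    for v in values:
        freq = f_lo + (v - lo) / span * (f_hi - f_lo)   # data value -> frequency
        for n in range(int(rate * note_s)):
            samples.append(int(20000 * math.sin(2 * math.pi * freq * n / rate)))
    with wave.open(wav_path, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)          # 16-bit PCM
        w.setframerate(rate)
        w.writeframes(struct.pack("<" + "h" * len(samples), *samples))

sonify([3.1, 3.4, 2.9, 5.0, 7.2, 6.8])   # e.g. a short temperature series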

44
Audio Technologies: Sample Applications
  • Whisper: A Wristwatch-Style Wearable Handset
  • finger in your ear
  • play sound through bone by
  • speaker mounted on ring
  • speaker mounted on boom
  • speaker mounted on wrist
  • solves
  • size
  • noisy environment
  • speakers volume

Wrist handset used by inserting a fingertip in
the ear
Input by rhythmically touching fingertips
together
http://www.lab.nttdocomo.co.jp/english/kenkyu/medhia1.html
45
Audio Technologies: Sample Applications
  • Interface for the Blind (Earcons), E. Mynatt
  • Stephen Brewster: "Earcons were first proposed by
    Meera Blattner in 1989. They are abstract,
    musical tones that can be used in structured
    combinations to create auditory messages.
    Blattner defines earcons as 'non-verbal audio
    messages that are used in the computer/user
    interface to provide information to the user
    about some computer object, operation or
    interaction'. They are based on musical sounds."
  • http://www.dcs.gla.ac.uk/stephen/research.shtml
  • Using and Creating Auditory Icons (Gaver)
  • Everyday sounds that convey information about
    events in the computer or in remote environments
    by analogy with everyday sound producing events.
  • Synthesize
  • impact, bouncing, breaking, scraping, and machine
    sounds
  • parameterized
  • depend upon metaphor to everyday sounds

46
Audio Technology: Earcons
  • Basic conclusions (Brewster, Wright and Edwards,
    1994)
  • large perceptual differences should be used to
    ensure recognition
  • Use musical instrument timbres
  • multiple harmonics
  • Don't use pitch on its own
  • Register: use a two or three octave difference
  • Rhythm: use a different number of notes in each
    rhythm pattern
  • don't use very short notes
  • Intensity: not too loud, not too soft
  • Combinations: leave a delay of 0.1 s between
    earcons (see the sketch below)
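A toy sketch of composing earcons along the lines of these guidelines; the earcon names, note choices and durations below are invented for illustration and are not taken from Brewster et al.:

GAP_S = 0.1   # recommended silence between consecutive earcons

def earcon(base_midi_note, rhythm):
    """rhythm: note lengths in seconds; returns a list of (midi_note, length) pairs."""
    return [(base_midi_note + 2 * i, d) for i, d in enumerate(rhythm)]

# Two families kept distinguishable by rhythm (note count) and register
# (two octaves apart), as the guidelines suggest.
file_earcon  = earcon(48, [0.30, 0.30, 0.60])         # low register, 3 notes
error_earcon = earcon(72, [0.15, 0.15, 0.15, 0.60])   # two octaves up, 4 notes

def sequence(*earcons):
    """Concatenate earcons, inserting the 0.1 s gap between them."""
    out = []
    for e in earcons:
        if out:
            out.append((None, GAP_S))   # None = rest
        out.extend(e)
    return out

print(sequence(file_earcon, error_earcon))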

47
Audio Technologies: Sample Applications
  • LiveWire (Interval Research)
  • Sound and motion
  • "While working on the sound mix for We Were
    Soldiers, I wished out loud for some way to give
    the audience the full impact of helicopters
    blasting overhead, artillery shells crashing
    down, jets screaming by, bullets whizzing through
    the air directly above" (Randall Wallace)
  • Listening to the Earth Sing (C. Hayward)
  • investigating the use of sonification techniques
    in seismic interpretation for oil exploration.

48
Human Measurement Technologies
  • Tracking Technologies
  • magnetic
  • optical
  • mechanical
  • video based
  • other
  • Primary user input
  • head tracking
  • eye tracking
  • hand tracking
  • pointing and selecting
  • other

49
Input/Data Acquisition System Design for Human
Computer Interfacing, William Putnam and R.
Benjamin Knapp
http://www-ccrma.stanford.edu/CCRMA/Courses/252/sensors/sensors.html
50
Introduction
  • In this course we have divided the Human Computer
    Interface (HCI) into three parts: the input/data
    acquisition, the computer recognition and
    processing, and the output/display (see
    Figure 1).

Figure 1: The Human Computer Interface Structure
http://www-ccrma.stanford.edu/CCRMA/Courses/252/sensors/sensors.html
51
2 General Overview
  • Figure 2 shows how information from the
    human is passed to the computer. It separates the
    process into three parts: sensors, signal
    conditioning, and data acquisition. The choices
    made in the design of these systems ultimately
    determine how intuitive, appropriate, and
    reliable the interaction is between human and
    computer.

Figure 2: The Path from Human to Computer
http://www-ccrma.stanford.edu/CCRMA/Courses/252/sensors/sensors.html
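A minimal sketch of one sample travelling the three stages of Figure 2; the sensor, gain, voltage range and resolution below are invented, stand-in values:

import random

def sensor():
    """Stage 1, sensor: a physical quantity becomes an electrical value (volts)."""
    return 0.8 + random.gauss(0.0, 0.05)            # pretend bend-sensor output

def condition(volts, gain=2.5, offset=-1.0):
    """Stage 2, signal conditioning: amplify and offset into the ADC's range."""
    return gain * volts + offset

def acquire(volts, full_scale=3.3, bits=10):
    """Stage 3, data acquisition: quantize the conditioned voltage to counts."""
    volts = min(max(volts, 0.0), full_scale)
    return round(volts / full_scale * (2 ** bits - 1))

counts = acquire(condition(sensor()))
print(counts)      # the integer the software finally sees, roughly 300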
52
Physical values to sense
  • electrical
  • voltage
  • resistance
  • impedance
  • optical
  • colour
  • intensity
  • magnetic
  • induced current
  • field direction
  • mechanical force

53
Some sensors (and examples)
  • Force (piezo-electric, force-sensitive resistor,
    etc.)
  • Acceleration (recall F = ma)
  • Biopotential
  • The human body's nervous system uses the ebb and
    flow of ions to communicate. This ionic transport
    within and along the nerve fibers can be measured
    on the surface of the skin using a specific type
    of electrochemical sensor commonly referred to as
    the surface recording electrode (sometimes just
    called the electrode).
  • Acoustic energy (sound) (microphones)
  • Light (photodetectors, cameras)
  • Electric fields
  • Physical proximity (sonar, IR reflection, radar,
    EMF)
  • Temperature (thermocouple)
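As a hedged example of turning raw readings from such a sensor into physical units (the zero-g voltage and sensitivity below are made-up values, loosely typical of an analog accelerometer), ADC counts can be converted to acceleration and then, via F = ma, to force:

def accel_from_adc(counts, bits=10, v_ref=3.3, zero_g_v=1.65, sens_v_per_g=0.3):
    """Hypothetical analog accelerometer: ADC counts -> volts -> g -> m/s^2."""
    volts = counts / (2 ** bits - 1) * v_ref
    g = (volts - zero_g_v) / sens_v_per_g
    return g * 9.81                                  # m/s^2

def force_newtons(mass_kg, accel_ms2):
    """Recall F = m a."""
    return mass_kg * accel_ms2

a = accel_from_adc(700)                              # one raw reading
print(round(a, 2), "m/s^2,", round(force_newtons(0.05, a), 2), "N on a 50 g mass")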

54
Tracking Interfaces
  • Tracking user location is very important for many
    applications
  • want 3 degrees-of-freedom (dof) position
  • X, Y, and Z
  • want 3 dof orientation
  • roll, pitch, yaw (Euler angles), quaternions,
    rotation matrix
  • Trackers used to measure 3-6 dof typically
  • What to track
  • Head: viewpoint tracking, motion parallax
  • Eyes: probably don't want to use them as a
    point-and-select device
  • attention tracking rather than a precise pointing
    device
  • Hand: pointing and selecting; finger
    configuration for manipulation
  • Body
  • Other
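For the 3-dof orientation mentioned above, one common convention composes roll, pitch and yaw into a rotation matrix as R = Rz(yaw) Ry(pitch) Rx(roll). A small sketch follows; conventions differ between trackers, so treat this as one choice among several:

import math

def rot_x(a):
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def euler_to_matrix(roll, pitch, yaw):
    """R = Rz(yaw) Ry(pitch) Rx(roll): one common aerospace-style convention."""
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))

# 90 degrees of yaw turns the tracker's x axis onto the world y axis.
R = euler_to_matrix(0.0, 0.0, math.pi / 2)
print([round(R[i][0], 3) for i in range(3)])   # first column: [0.0, 1.0, 0.0]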

55
Hand/Arm/Body Tracking Applications
  • 2D and 3D input devices
  • Remote control manipulators
  • puppetry and computer animation
  • musical performance
  • surgical simulation
  • scientific simulation
  • gestural interface
  • sign language recognition, finger spelling
  • music controller
  • gesture mappings

56
CyberGlove, CyberGrasp
http://www.immersion.com/
  • The CyberGlove is a fully instrumented glove
    that provides up to 22 high-accuracy joint-angle
    measurements.
  • CyberGrasp is an innovative force feedback
    system for your fingers and hand. It lets you
    reach into your computer and grasp
    computer-generated or tele-manipulated objects.
  • The CyberGrasp is a lightweight, force-reflecting
    exoskeleton that fits over a CyberGlove and adds
    resistive force feedback to each finger. With the
    CyberGrasp force feedback system, users are able
    to feel the size and shape of computer-generated
    3D objects in a simulated virtual world.

57
Pinch Glove
http://www.fakespacelabs.com/
  • The PINCH glove system provides a reliable and
    low-cost method of recognizing natural gestures.
  • Recognizable gestures have natural meaning to the
    user
  • a pinching gesture can be used to grab a virtual
    object,
  • a finger snap between the middle finger and thumb
    can be used to initiate an action.
  • A hand-gesture interface system that allows
    developers and users of immersive applications to
    use hand interaction to work within the virtual
    environment.
  • The PINCH system uses cloth gloves with
    electrical sensors in each fingertip. Contact
    between any two or more digits completes a
    conductive path, and a complex variety of actions
    based on these simple "pinch" gestures can be
    programmed into applications.
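A toy sketch of how such contact events might be mapped to actions; the fingertip names and gesture vocabulary below are invented for illustration, and the real PINCH software will differ:

# Hypothetical mapping from fingertip-contact sets to application actions.
GESTURES = {
    frozenset({"R_thumb", "R_index"}):  "grab_object",
    frozenset({"R_thumb", "R_middle"}): "trigger_action",
    frozenset({"L_thumb", "L_index"}):  "open_menu",
}

def classify(contacts):
    """contacts: the set of fingertips currently in electrical contact."""
    return GESTURES.get(frozenset(contacts), "no_gesture")

print(classify({"R_index", "R_thumb"}))   # -> grab_object
print(classify({"R_index"}))              # -> no_gesture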

58
Instrumented Footwear for Interactive Dance
  • Joseph Paradiso, Eric Hu, Kai-yuh Hsiao, MIT
    Media Laboratory

The Performance Shoe
59
Cybernet UseYourHead Gesture recognition
  • Use a webcam and simple software (only 9.95!) to
    tie head motions to keyboard macros
  • http://www.gesturecentral.com/useyourhead/index.html

60
Cybernet EyeTracker
  • Cybernet's EyeTracker System reproduces an image
    of the real world marked by the user's gaze
    location, and provides a means to observe the
    pupil with a sophisticated system at a low cost.
  • Allows hands-free control of computer interface
    systems and eliminates the need for a mouse or
    joystick

61
Cybernet Firefly
http://www.gesturecentral.com/firefly/index.html
  • The Firefly by Cybernet Systems Corporation is a
    high-speed real-time optical tracking system
    designed for full body motion capture. The system
    consists of a factory calibrated camera array
    that tracks the position of active tags, infrared
    LEDs wired to a small controller box that can be
    worn wirelessly by the user for unencumbered and
    free movement.

62
  • Next Generation Interface...
  • Biocontrol Systems (BCS) has recently introduced
    a low cost controller for personal computers. BCS
    believes the time is right to change the way we
    interface with PCs at home, business and school.
    Among the huge installed base of PCs, there is
    growing demand for alternative controllers. With
    that in mind, BCS has developed a "hands free",
    wireless pointing device to replace or enhance
    the computer mouse and joystick.
  • Headband
  • The headband is a small and lightweight unit that
    tracks the position of the head and monitors eye
    blinks. This enables the user to control the
    computer cursor with head motion and use eye
    blinks to activate button clicks. Thus, a user
    can "point and click" without lifting a finger
    from the computer keyboard.
  • Armband
  • The armband resembles a watchband and can track
    the position of the arm and monitor the tension
    of the muscles under the band. As an example
    application, a user can play a video game with
    the gesture of pointing a "virtual" gun, and then
    fire the gun with the actual motion of the
    trigger finger.
  • HFC Benefits
  • Cursor controller for any "hands free" computer
    application, such as data entry work.
  • Laptop mouse replacement for mobile computing.
  • Helps prevent repetitive strain injury for the PC
    user. Every parent should be concerned with their
    children engaging in repetitive mouse clicking
    action, one of the main causes of carpal tunnel
    syndrome.
  • Gives access to physically challenged PC users.
  • Provides exciting new interactive control for
    video games.

63
Haptics: Touch and Kinesthesia
  • Sense of touch and kinesthesia
  • tactile sensing
  • proprioception
  • "The unconscious perception of movement and
    spatial orientation arising from stimuli within
    the body itself" (dictionary.com)
  • Somatosensory system
  • "Of or relating to the perception of sensory
    stimuli from the skin and internal organs"
    (dictionary.com)
  • Bidirectional
  • sense environment
  • temperature, vibration, weight, etc.
  • manipulate environment
  • push, pull, pinch, hit, rotate, etc.

64
Haptics: Tactile Sensing
  • Can sense
  • texture/vibration
  • temperature of object or environment
  • slip detection
  • surface compliance
  • elasticity
  • viscosity
  • electrical/thermal conductivity
  • vibration (other than for texture)
  • initial contact detection
  • gauging force required for manipulation

65
Introduction to Haptic Display: Tactile display,
by Robert Howe, Harvard University
  • Skin sensation is essential for many manipulation
    and exploration tasks.
  • To handle flexible materials like fabric and
    paper, we sense the pressure variation across the
    finger tip.
  • In precision manipulation, perception of skin
    indentation reveals the relationship between the
    hand and the grasped tool.
  • We perceive surface texture through the
    vibrations generated by stroking a finger over
    the surface.
  • Tactile sensing is also the basis of complex
    perceptual tasks like medical palpation, where
    physicians locate hidden anatomical structures
    and evaluate tissue properties using their hands.
  • Tactile display devices stimulate the skin to
    generate these sensations of contact.
  • The term "tactile display" is sometimes used to
    describe any apparatus that provides haptic
    feedback
  • The skin responds to several distributed physical
    quantities; the most important are perhaps
    high-frequency vibrations, small-scale shape or
    pressure distribution, and thermal properties.
  • Vibrations can relay information about phenomena
    like surface texture, slip, impact, and puncture.
  • Small-scale shape or pressure distribution
    information is much more difficult to convey. The
    most common design approach is an array of
    closely-spaced pins that can be individually
    raised and lowered against the finger tip to
    approximate the desired shape.
  • Thermal display is a relatively new area of
    research. Because human fingers are often warmer
    than the "room temperature" objects in the
    environment, thermal perceptions are based on a
    combination of thermal conductivity, thermal
    capacity, and temperature. This allows us to
    infer material composition as well as temperature
    difference.

66
Haptics: Kinesthetic Interfaces
  • Awareness of position and movement and forces on
    body parts
  • Force feedback has been shown to be important
  • Teleoperation
  • Molecular docking
  • Grasping tasks

67
Haptics: Kinesthetic Interfaces
  • Where is it useful?
  • Virtual reality/Augmented reality
  • medicine
  • surgery
  • diagnosis
  • scientific visualization
  • data manipulation
  • interactive art
  • situations where auditory and visual feedback are
    limited
  • aids to disabled
  • peripheral tasks
  • 3D manipulation

68
Haptics: Kinesthetic Interfaces
  • Basic idea
  • Measure movement and forces exerted by user
    (fingers, hand, arm, body)
  • Calculate effect of forces on manipulated objects
    and resulting forces on user
  • virtual or real objects
  • Present forces to the user's fingers, wrist, arms,
    etc., as appropriate (a one-dimensional sketch of
    this loop follows below)
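A minimal sketch of one step of this loop for a one-degree-of-freedom device touching a virtual wall; the stiffness and wall position are arbitrary, and real haptic rendering runs at roughly 1 kHz and is considerably more involved:

STIFFNESS = 800.0      # N/m, stiffness of the virtual surface
WALL_X = 0.05          # m, where the virtual wall sits

def render_force(probe_x_m):
    """Measured probe position comes in; the force to display goes back out."""
    penetration = probe_x_m - WALL_X
    if penetration <= 0.0:
        return 0.0                        # free space: no force
    return -STIFFNESS * penetration       # spring push-back out of the wall

for x in (0.030, 0.049, 0.052, 0.060):
    print(f"x = {x:.3f} m -> force {render_force(x):+.1f} N")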

69
Images from the Haptics Photo Gallery
http://haptic.mech.northwestern.edu/intro/gallery/
Manipulation of a virtual cube
  • Carnegie Mellon's MagLev Wrist
  • The user grasps a levitated tool handle to
    interact with computed environments.
  • The dynamics of the handle are controlled so that
    the user feels the motion, shape, resistance, and
    surface texture of simulated objects.
  • 6-DOF motion (x,y,z,roll,pitch,yaw) with one
    moving part
  • Noncontact actuation and sensing
  • Carnegie Mellon's WYSIWYF Display
  • What you see is what you feel
  • A Visual/haptic interface to virtual environments
  • Some skills such as pick-and-place can be
    regarded as visual-motor skills, where visual
    stimuli and kinesthetic stimuli are tightly
    coupled.

70
CyberTouch
  • CyberTouch is a tactile feedback option for
    Immersion's popular CyberGlove instrumented
    glove.
  • It features small vibrotactile stimulators on
    each finger and the palm of the CyberGlove.
  • Each stimulator can be individually programmed to
    vary the strength of touch sensation.
  • The array of stimulators can generate simple
    sensations such as pulses or sustained vibration,
    and they can be used in combination to produce
    complex tactile feedback patterns.
  • Software developers can design their own
    actuation profiles.

71
Braille Displays
http://www.freedomscientific.com/
  • Used with the JAWS for Windows screen reader,
    the refreshable Braille cells act as a tactile
    monitor that allow users to navigate and read
    information in dynamic Braille.

72
Impulse Engine
  • The Impulse Engine 2000 is a research quality
    force feedback joystick.
  • It allows applications to track the movements of
    the user and convey high fidelity tactile
    sensations to the user through force feedback.
  • The result is unmatched realistic simulation of
    surfaces, textures, liquids, gravitational
    fields, and biological materials, to name a few
    possibilities.
  • Two or more Impulse Engine 2000's can even be
    interfaced within a single system to create
    exciting multi-user force feedback experiences
    ranging from a ball toss to joint participation
    in simulated surgical procedures.

http://www.immersion.com/products/custom/impulseengine.shtml
73
Full body interfaces
  • Passive and active chair, cabin, cockpit,
    centrifuge
  • Used in
  • Flight simulators
  • Arcade games

74
Olfactory Interfaces
  • One of the least developed
  • applications
  • poorly understood
  • social mores
  • Useful for
  • fire fighting
  • surgical training
  • immersion
  • manipulate mood
  • increase vigilance
  • decrease stress
  • retention and recall of material

75
Olfactory displays
  • Smell Enhance Experience System
  • seven odors in liquid form
  • delivered with small tube
  • DIVEpak (Southwest Research Institute, 1993)
  • 8 odors
  • contained in cartridge
  • heated and dissolved in air
  • blown at user
  • E. Piaggio BioRobotic Lab (U. of Pisa)
  • smell camera
  • Artificial Reality Corporation
  • developing odorants
  • Marketing Aromatics, Ltd.
  • Aromatic oils vaporized

76
Olfactory displays
  • Storage media
  • liquid, gel or waxy solids
  • microencapsulate odorants
  • scratch and sniff
  • drops of liquids encapsulated in gelatin
  • placed using silk screening
  • Display
  • air dilution
  • breathable membranes
  • liquid injection with air flow control

77
Research Topics
From the National Science Foundation's request
for proposals on Human Computer Interaction
(http://www.cise.nsf.gov/div/iis/index.html)
  • The program's ultimate objective is to transform
    the human-computer interaction experience so that
    the computer is no longer a distracting focus of
    attention but rather an invisible tool that
    empowers the individual user and facilitates
    natural and productive human-human collaboration.
  • Human language is an important emphasis of this
    program
  • natural language and speech processing
  • gesture, gaze and emotive processing
  • the total multi-modal human-to-human
    communication experience.
  • HCI research topics also include
  • multi-modal environments
  • graphical and multimedia interfaces
  • use of gesture, movement, touch, and sound in the
    interface
  • highly interactive intelligent interfaces
  • virtual and augmented reality
  • immersive environments
  • wearable, mobile, and ubiquitous computing
  • new input and output devices

78
  • Immersive Art

79
Iamascope
  • The Iamascope is an interactive multimedia
    artwork.
  • The Iamascope combines computer video, graphics,
    vision, and audio technology enabling performers
    to create striking imagery and sound.
  • The result is an aesthetically uplifting
    interactive experience.
  • At an installation, the user takes the place of a
    colourful piece of floating glass inside a
    computer generated kaleidoscope, and
    simultaneously views the kaleidoscopic image of
    themselves on a huge screen in real time.
  • By applying image processing to the kaleidoscopic
    image, the performer's body movements directly
    control music in a beautiful dance of symmetry
    with the image.
  • The image processing uses simple intensity
    differences over time which are calculated in
    real-time.
  • The responsive nature of the whole system allows
    users to have an intimate, engaging, satisfying,
    multimedia experience.

http://hct.ece.ubc.ca/research/iamascope/
80
(No Transcript)
81
The Living Web
  • This CAVE-based interactive installation
    explores the extraordinary potential of the
    world-wide web as a data and information medium.
    Today information on the Internet is presented in
    a standard fashion, as defined by the currently
    available image browsers. In "The Living Web"
    users can immerse themselves physically into this
    image and sound information streamed "live" from
    the Internet. Microphones pick up the users'
    conversations and use them to generate and
    download corresponding image and sound file from
    the Web. Users can furthermore interact with
    these data through intuitive interfaces and
    explore their content in more detail. "The Living
    Web" presents a novel system for intuitive,
    immersive and entertaining information creation
    and retrieval.

http://www.art-of-immersion.com/projects.html
82
SonoMorphis: INTERACTIVE INSTALLATION WITH GENETIC
GRAPHICS AND SOUND
  • An organic object is projected in front of the
    visitors. By means of a control mechanism the
    user can rotate the object in all directions and
    observe it from various perspectives. Control
    sliders allow the viewer to vary diverse
    parameters of the object. The graphics and sound
    are inseparably linked to each other. In this way
    the space is always filled with new audiovisual
    bodies.
  • SonoMorphis is an interactive installation
    presenting genetic graphic and sound. The concept
    is based on the idea of creating an instrument
    with graphic and sonic dimensions whose variety
    and flexibility are capable of responding
    precisely and subtly to the technique of the
    instrumentalist. Inside the CAVE, when the sound
    immerses into the depth of three-dimensional
    representation, the corresponding visual
    component moves away from the viewer and vice
    versa. You see and hear moving forms activated
    solely by sound. With SonoMorphis the artists
    created new sonic experiences that are
    simultaneously futuristic and historic, simple
    and monumental, phenomenological and
    mind-altering.

http://www.art-of-immersion.com/projects.html
83
  • conFiguring the CAVE provides the visitor of the
    CAVE with pictorial insights into seven worlds:
    materiality, language, the macrocosm,
    association, union, person and emergence.
  • A mechanical doll that can be moved in space on a
    screen facilitates navigation through these
    geographical, cultural, historical, and
    body-related worlds.
  • The presentation involves all the senses,
    fascinating the participant through hyper-real
    3D effects; the simulations of space and body
    almost seem to cut through the viewer's own body,
    a sensation which is intriguing.

http://www.art-of-immersion.com/projects.html
84
Piano - as image media
  • Toshio Iwai
  • Interactive audio visual installation
  • 1995
  • Produced at ZKM / the Institute for Visual Media,
    Karlsruhe, Germany
  • Use the trackball and button to position and
    place 'dots' on a moving grid on the lower
    projection screen. These dots constitute a
    virtual score which triggers the piano keys,
    which in turn project computer-generated images
    on the upper screen. The patterns you create with
    these dots generate simple melodies and related
    visual formations.
  • In an age where digital technologies begin to
    replace the physical world with virtual forms,
    this work tries to combine the physical and the
    virtual into a new interactive experience. It
    makes an aesthetic conjunction of sound and
    image, as well as a functional conjunction of a
    mechanical object (the piano) with the digital
    media (the projected score and computer generated
    imagery).
  • In this way the piano itself seems to become
    transformed into image media - a flow of image
    depresses the piano's keys, which as a
    consequence release yet another flight of images.

http://www.iamas.ac.jp/iwai/artworks/piano.html
85
Piano - as image media
86
Piano - as image media
87
Resonance of 4
  • Toshio Iwai
  • Interactive audio visual installation
  • 1994
  • Resonance of 4 is an interactive audio-visual
    installation which allows four people to create
    one musical composition in cooperation with each
    other. In this installation, four players are
    given different tones with which they can compose
    their own melodies. Each person uses a mouse to
    place dots on four grid images projected onto the
    floor. My hope is that each player listens to the
    melodies which are being created by the other
    players, and then tries to change their own
    melody to make better harmony. In this way, the
    installation will not only generate a resonance
    of sounds, but will create a resonance of minds
    between the four players.

http://www.iamas.ac.jp/iwai/artworks/resonance.html
88
Resonance of 4
89
Tangible Bits
  • http://tangible.media.mit.edu/projects.htm

90
Tangible Bits
  • People have developed sophisticated skills
  • for sensing and manipulating our physical
    environment
  • Tangible Bits seeks to build upon these skills by
    giving physical form to digital information,
    seamlessly coupling the dual worlds of bits and
    atoms.
  • We are designing "tangible user interfaces" which
    employ physical objects, surfaces, and spaces as
    tangible embodiments of digital information.
  • Foreground interactions with graspable objects
    and augmented surfaces, exploiting the human
    senses of touch and kinesthesia
  • Background information displays which use
    "ambient media" -- ambient light, sound, airflow,
    and water movement.
  • Communicate digitally-mediated senses of activity
    and presence at the periphery of human awareness

91
SandScape
  • SandScape is a tangible interface for designing
    and understanding landscapes through a variety of
    computational simulations using sand. Users view
    these simulations as they are projected on the
    surface of a sand model that represents the
    terrain. The users can choose from a variety of
    different simulations that highlight either the
    height, slope, contours, shadows, drainage or
    aspect of the landscape model.
  • The users can alter the form of the landscape
    model by manipulating sand while seeing the
    resultant effects of computational analysis
    generated and projected on the surface of sand in
    real-time.
  • SandScape has been exhibited at "Get in Touch"
    exhibition at the Ars Electronica Center in Linz,
    Austria since September 2002.
    http://www.aec.at/center2/ausstellungskatalog/OG2/OG2_e.html

92
Illuminating Clay
  • This interface allows users to explore and
    analyze free form spatial models. Using this
    platform we explore the domain of landscape
    design, where the relationship between form and
    computational simulations is of particular
    relevance.
  • Landscape models are constructed using a ductile
    clay support. Three-dimensional geometry is
    captured in real time using a laser scanner. From
    this information simulations such as shadow
    casting, land erosion, visibility and travelling
    time are calculated. Finally, the results are
    projected back onto the clay model.
  • This allows combining the advantages of
    physical interaction with the dynamic qualities
    of graphical displays.

93
ComTouch
  • The ComTouch project explores interpersonal
    communication by use of haptic technology.
  • We expect that touch as a communication medium
    will allow for more personal communication, and
    perhaps even open up remote communication to
    deaf-blind users.
  • We hope to develop a haptic communication device
    that will enable users to transmit thoughts,
    feelings, and concepts to each other remotely.
  • The basic concept is a handheld device that
    allows the squeeze under each finger to be
    represented as vibration.
  • Through this research, we aim to describe more
    accurately the language of touch-based
    communication.

94
Sensetable
  • Sensetable is a system that wirelessly tracks the
    positions of multiple objects on a flat display
    surface quickly and accurately.
  • The tracked objects have a digital state, which
    can be controlled by physically modifying them
    using dials or tokens.
  • We have developed several new interaction
    techniques and applications on top of this
    platform.
  • Our current work focuses on business supply chain
    visualization using system dynamics simulation.

95
Audiopad
  • Audiopad is an instrument for electronic musical
    performance that aims to combine the modularity
    of knob based musical controllers with the
    expressive character of multidimensional tracking
    interfaces.
  • The performer's manipulations of physical pucks
    on the Sensetable control a real-time synthesis
    process.
  • The system projects graphical information on and
    around the pucks to give the performer
    sophisticated control over the synthesis process.

96
Tangible query interfaces
  • This project is developing new means for querying
    relational databases and live datastreams through
    the manipulation of physical objects.
  • Parameterized query fragments are embodied as
    physical tokens ("parameter wheels").
  • These tokens are manipulated, interpreted, and
    graphically augmented on a series of sliding
    racks.
  • Token rotation maps to parameter selection
  • Token adjacencies maps to Boolean AND/OR
    operations
  • Token ordering maps to result sorting
  • Individual racks map to parenthetical groupings.
  • The interface should support querying to a wide
    range of relational databases
  • Initial target domains include media databases,
    network management, and bioinformatics.
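To make the rotation/adjacency/ordering mappings above concrete, here is a toy translation of one rack of tokens into a parameterized SQL string; the table, columns and token structure are invented for illustration and are not the project's actual software:

# Hypothetical "parameter wheel" tokens: each carries a column plus the value
# selected by its rotation; tokens sitting on the same rack are ANDed here.
tokens_on_rack = [
    {"column": "genre", "value": "documentary"},   # wheel 1, rotated to a genre
    {"column": "year",  "value": 1998},            # wheel 2, rotated to a year
]
sort_order = ["year", "genre"]                     # left-to-right token order

def rack_to_sql(tokens, order, table="media"):
    where = " AND ".join(f"{t['column']} = :{t['column']}" for t in tokens)
    params = {t["column"]: t["value"] for t in tokens}
    sql = f"SELECT * FROM {table} WHERE {where} ORDER BY {', '.join(order)}"
    return sql, params

query, params = rack_to_sql(tokens_on_rack, sort_order)
print(query)
print(params)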

97
Actuated Workbench
  • The Actuated Workbench is a device that uses
    magnetic forces to move objects on a table in two
    dimensions.
  • It is intended for use with existing tabletop
    tangible interfaces
  • providing an additional feedback loop for
    computer output
  • helping to resolve inconsistencies that otherwise
    arise from the computer's inability to move
    objects on the table

98
Dolltalk
  • In order to give the child the impression that a
    character is listening to their stories, we have
    invented a clever mechanism that captures the
    motions and speech of a child using sensors and
    audio processing.
  • Continuing in the vein of research on
    story-listening toys, we have built Dolltalk to
    encourage children to tell and act out original
    stories.
  • Dolltalk consists of two dolls and a system that
    records and plays back the stories.
  • The playback alters the pitch of the child's
    voice higher or lower depending on which doll is
    supposed to be speaking.
  • Children play with their customized dolls and
    tell their stories, which are recorded and played
    back with the same content but with different
    voices.
  • Dolltalk gives the illusion that it can truly
    listen by analyzing the child's physical gestures
    and speech.

99
CADcast
  • CADcast is a system for augmenting physical
    workspaces with projected computer graphics
  • It supports users in constructing three
    dimensional structures with greater efficiency
    and more accuracy.
  • The system also supports improved coordination
    between the design and construction teams
    involved in architectural scale building
    projects.

100
inTouch
  • inTouch is a project to explore new forms of
    interpersonal communication through touch.
  • Force-feedback technology is employed to create
    the illusion that people, separated by distance,
    are interacting with a shared physical object.
  • The "shared" object provides a haptic link
    between geographically distributed users, opening
    up a channel for physical expression over
    distance

101
Pegblocks
  • Pegblocks are networked tactile transducers. As
    users manipulate the array of pegs, sliding them
    back and forth, motion is converted to
    electricity and converted back into motion
    throughout the rest of the network. The
    resulting movements of the pegs are not determined
    by any one individual input but by the networked
    group as a whole.
  • Each peg is coupled to an electric dynamo/motor.
    The dynamo/motor can act in two modes: it can
    generate electricity from motion and convert
    electricity into motion. As the peg is moved back
    and forth, electric energy is generated and
    converted back into motion throughout the rest
    of the network.
  • Pegblocks extend the notion of Distributed Shared
    Physical Objects of inTouch to explore
    a re-configurable haptic communication network,
    integrating representations using peg patterns
    that were impossible in inTouch. They allow
    simultaneous touch of multiple human hands to be
    extended over space, as well as asynchronous
    exchange of peg patterns.

102
LumiTouch
  • The Lumitouch system consists of a pair of
    interactive picture frames.
  • When one user touches her picture frame, the
    other picture frame lights up.
  • This touch is translated to light over an
    Internet connection.
  • We introduce a semi-ambient display that can
    transition seamlessly from periphery to
    foreground in addition to communicating emotional
    content.
  • In addition to enhancing the communication
    between loved ones, people can use LumiTouch to
    develop a personal emotional language.
  • LumiTouch explores emotional communication in
    tangible form.

103
Senseboard
  • Senseboard allows the user to arrange small
    magnetic pucks on a grid, where each puck
    represents a piece of information to be
    organized, such as a message, file, bookmark,
    citation, presentation slide, movie scene, or
    newspaper story.
  • As the user manipulates the physical puck, the
    corresponding digital information is projected
    onto the board.
  • Special pucks may be placed on the board to
    execute commands or request additional
    information.
  • We seek to combine the benefits of physical
    manipulation (natural, fluid, rapid, two-handed,
    multi-person interaction) with the benefits we
    can get from computer augmentation (interactive
    commands, functions, queries, operations,
    importing and exporting data, and remote
    collaboration).
  • We believe this type of interface is thus more
    effective for tasks involving organizing,
    grouping, and manipulating types of information
    that have no inherent physical representation,
    and it provides an example of a tangible
    interface for a typical "knowledge worker" task.

104
genieBottles
  • The genieBottles system presents a story that is
    told by three genies that live in glass bottles.
  • When a bottle is opened, the genie contained
    inside is released and begins to talk to the
    user.
  • If several genies are released at once, they
    converse with each other.
  • The physical bottles can be seen as graspable
    "containers" and "controls" for the digital story
    information.
  • The genieBottles use a simple state trans