1
MENTAL WORKLOAD & SITUATION AWARENESS
  • Nantida Wisawayodhin, MSc, M.Erg.S.

Module: Cognition and Information Processing
Course: Human Centered Design (HCD)
SOAD, KMUTT
2
OVERVIEW
  • BTS redesign
  • Trade-offs between principles and requirements
  • Air crash error identification and solution
  • Relationship between MWL and SA
  • Situation Awareness (SA)
  • Mental Workload (MWL)
  • Case Study: Three Mile Island
  • Mid-term project

3
BTS TICKETING MACHINE
  • INTERFACE REDESIGN
  • User types: Novice, Occasional, Expert
  • Users: Tourists, daily users, blind, deaf
  • Objectives
  • Buy the correct ticket for the right destination

4
BTS TICKETING MACHINE
  • Daily users (Experts)
  • They are likely to use cards.
  • They are likely to know the zone number and cost
    of their regular destinations
  • The group of least concern
  • Occasional users
  • Possibly use cards
  • Likely to be familiar with the system
  • Novice users
  • Unlikely to know how the ticketing system works
  • The user group to take into consideration the
    most when designing self-service interfaces
  • Blind and deaf users
  • Information may need to be presented in an
    alternative mode

5
TASK SEQUENCE
6
MOVEMENT PATTERN
7
PRACTICAL Issues
  • Appearance
  • Not attention-catching; no clear indication that
    it is a ticketing machine
  • Design blends in with the environment
  • Looks like another advertisement board
  • Task sequence
  • Not clear where to start
  • Too many sub-tasks
  • Does not fit user expectations (one-stop shop,
    accepting banknotes)
  • Designed around system convenience rather than
    user convenience
  • Memory Aid
  • Step 1 is not the starting point of the
    ticket-buying process

8
EXAMPLES OF APPLICABLE GUIDELINES
  • Visually displayed memory aids should be used to
    replace a memory store with a physical store
    wherever this is necessary.
  • The different problem-solving strategies employed
    by experts and novices must be taken into account
    when designing for both types of users.
  • Consider designing in guidance for the transition
    from novice to expert by making expert paths
    visible to novices.
  • Tasks that use verbal working memory are served
    best by auditory inputs and vocal outputs.
  • Auditory presentation is not very effective when
    the message is long (>5 unrelated words or
    letters). In such cases, text should be used as
    well as, or instead of, speech (see the sketch
    after this list).
  • Exploit existing mental models to help users
    understand the product more intuitively.

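A minimal Python sketch of the modality guideline above, assuming a simple item-count threshold; the function name and return values are illustrative, not taken from the source guidelines.

```python
def choose_modality(num_unrelated_items: int, uses_verbal_wm: bool) -> list[str]:
    """Suggest presentation modalities per the guideline: speech for short
    verbal-WM messages, text as well as (or instead of) speech for long ones."""
    if uses_verbal_wm and num_unrelated_items <= 5:
        return ["auditory"]            # short verbal message: speech works well
    if uses_verbal_wm:
        return ["text", "auditory"]    # long message: add or substitute text
    return ["text"]                    # non-verbal-WM task: visual by default


print(choose_modality(3, uses_verbal_wm=True))   # ['auditory']
print(choose_modality(8, uses_verbal_wm=True))   # ['text', 'auditory']
```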
9
PRACTICAL Solutions
  • Appearance
  • Attention-catching: a clear information sign to
    indicate the location of the ticket machine
  • Make it stand out and visible at the entrances
  • Task sequence
  • Reduce number of steps
  • Make it a one-stop shop
  • Design to take bank notes
  • Design with user in mind
  • Memory Aid
  • Start step 1 where it should be: at the map
  • Blind users
  • Use handrail or floor texture to indicate where
    to go
  • Make use of vocal-auditory modality
  • Use Braille on buttons
  • Provide staff assistance

10
MOVEMENT PATTERN
11
TRADE OFFS
  • Between design principles and HF principles
  • Aesthetics/trend vs. ease of use
  • Between engineering requirements and HF
    principles
  • Optimum conditions for the machine and its
    performance vs. optimum conditions for the
    operators and their performance
  • Between safety requirements and HF principles
  • Protection vs. visibility or ease of movement
  • Between HF principle and HF principle
  • Providing situation awareness vs. avoiding
    distraction

12
TRADE OFFS Design Principles
13
TRADE OFFS Design Principles
14
TRADE OFFS Safety requirement

15
TRADE OFFS Safety requirements
16
TRADE OFFS HF Principles
17
MIDLAND AIR CRASH
  • The crash of British Midland Flight 092 in
    January 1989 was initiated by a mechanical
    failure in the left engine, which escalated due
    to a misdiagnosis of the problem by the pilot
    and co-pilot, resulting in the shutdown of the
    good engine. A total of 47 lives were lost, with
    numerous injuries.

18
MIDLAND AIR CRASH
  • Sequence of events
  • Please see the printed table provided

19
MIDLAND AIR CRASH
  • Active failure
  • KB and RB mistakes: misdiagnosis of the engine
    failure
  • SB lapse: failure to complete the situation-check
    procedure
  • Latent failure
  • Faulty design of the left engine
  • Poor display design
  • The lack of communication of the changes made to
    the Boeing 737
  • Inadequate training
  • Lack of a communication procedure between pilots
    and air crew during emergencies
  • Human error is just the tip of the iceberg!

20
MIDLAND AIR CRASH
  • Changes made to the Boeing 737
  • Please see the printed table provided

21
MIDLAND AIR CRASH
  • Loss of situation awareness (SA)
  • Applying incorrect mental model
  • KB and RB mistakes
  • Fast-moving, dynamic and complex system
  • Errors difficult to detect and correct
  • Increase in mental workload (MWL)
  • Can no longer make use of available knowledge
  • KB performance: heavy demand on WM and mental
    resources
  • More difficult to follow the situation and
    predict the outcome of actions taken
  • The new design of the Boeing 737 did not take
    human behaviour into consideration

22
MIDLAND AIR CRASH
  • Issue: Engine never tested in flight, only
    bench-tested in the laboratory; flight testing
    was not mandatory
  • Solution: Manufacturers must now flight-test all
    new engines
  • Issue: No flight-simulator training, so when
    pilots meet problems it is for real
  • Solution: Flight-simulator training is now
    mandatory for all new designs and upgrades
  • Issue: Cockpit displays were small, difficult to
    use and difficult to interpret
  • Solution: Boeing redesigned the cockpit displays
    for ease of interpretation by pilots
  • Issue: Lack of communication between pilots and
    air crew during emergencies
  • Solution: Full communication between pilots and
    air crew

23
MIDLAND AIR CRASH
  • Difficult issue: bad timing of
  • Communication between ATC and pilot
  • Communication between train driver and signaller
  • Difficult to solve, but one solution:
  • Role swap, to become aware of the other's tasks
  • Encourage communication and exchange of knowledge
    between the roles

24
MWL & SA
  • The two concepts are intertwined.
  • The interaction between external cues and
    internal knowledge
  • Internal: skill, expertise, experience, schemas,
    mental models
  • External: available information from the external
    world, needed to access the correct mental model
    and make informed decisions
  • Mentally maintaining the dynamics of the
    situation requires mental resources
  • Accessing a correct mental model reduces the
    competition for the same mental resources between
    real-time decision making, problem solving and SA
  • In novel situations, or for novices, existing
    knowledge (mental models) cannot be used; people
    resort to KB processing to maintain SA and to
    make decisions → high competition for the same
    mental resources → high MWL (see the sketch
    below)

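A hypothetical sketch of the point above: when external cues match a stored mental model, the situation can be handled with cheap rule-based processing; when no model matches, the person falls back on costly knowledge-based processing. The cue names, models and "modes" below are illustrative assumptions only.

```python
# Illustrative stored mental models: patterns of cues mapped to a diagnosis.
MENTAL_MODELS = {
    frozenset({"vibration", "burning_smell"}): "engine failure",
    frozenset({"pressure_low", "valve_open_light"}): "coolant leak",
}


def diagnose(cues: set) -> tuple:
    """Return (diagnosis, processing mode); fall back to KB processing when
    no stored model matches the available external cues."""
    for pattern, diagnosis in MENTAL_MODELS.items():
        if pattern <= cues:  # every cue of a known model is present
            return diagnosis, "rule-based (low MWL)"
    return "unknown - reason from first principles", "knowledge-based (high MWL)"


print(diagnose({"vibration", "burning_smell", "noise"}))
print(diagnose({"strange_smell"}))
```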
25
INFORMATION PROCESSING & SA
  • The combined operations in perception, working
    memory, and long-term memory, that enable the
    decision maker to entertain hypotheses about the
    current and future state of the world

26
MODEL OF HUMAN INFORMATION PROCESSING
27
SITUATION AWARENESS
  • Closer to home examples
  • Doctor appointment scenario
  • Indicating when wanting to turn or change lanes

28
SITUATION AWARENESS
  • Why do we concern ourselves with SA?
  • To better understand the causes of accidents and
    disasters
  • To design displays and systems to support SA

29
BREAK
  • 15 MINUTES

30
SA INTRODUCTION
  • Studies have found that people are generally able
    to physically perform complex tasks and decide on
    appropriate actions
  • But they tend to find it difficult to understand
    what is going on in the situation
  • Developing and maintaining situation awareness is
    the most critical component of effective decision
    making
  • A vast proportion of our everyday problem-solving
    and decision-making effort is spent on developing
    SA and keeping it up to date in a rapidly
    changing environment

31
SA INTRODUCTION
  • All of the incoming data from the many systems,
    the outside environment, fellow team members and
    others must be brought together into an
    integrated whole. This integrated picture forms
    the central organising feature from which all
    decision making and action takes place.
  • The key to coping in the information age is
    developing systems that support this process.
    Presenting a ton of data will do no good unless
    the data are transmitted, absorbed and
    assimilated successfully, and in a timely
    fashion, by the human in order to form SA
  • Salvendy, 2000

More data ≠ more information
32
SA INTRODUCTION
More data ≠ more information
33
SA INTRODUCTION
  • Loss of SA is the leading causal factor in
  • Military aviation mishaps (Hartel et al., 1991)
  • Nuclear power operation (Hogg et al., 1993)
  • Accidents among major air carriers: 88% of those
    involving human error could be attributed to
    problems with SA (Endsley, 1995)
  • ATC errors (Rodgers et al., 2000)

34
SA DEFINITION
  • The perception of the relevant elements in the
    environment, the comprehension of their meaning,
    and the projection of their status in the near
    future
  • Endsley, 1995

35
SA FORMING PROCESS
  • The process of forming SA has three levels
  • Level 1: Perception of the Elements in the
    Environment
  • First, it is necessary to perceive the critical
    factors in the environment
  • Level 2: Comprehension of the Current Situation
  • Understanding what the perceived factors mean,
    particularly in relation to the goal
  • For novices this may be taxing on mental
    resources, resulting in high MWL
  • Level 3: Projection of the Future Status
  • An understanding of what will happen with the
    system/product situation in the near future
  • Experts spend more time anticipating possible
    future situations, giving them the knowledge and
    time necessary to decide on the most favourable
    course of action (a minimal sketch of the three
    levels follows this slide)
  • (Endsley, 1995)

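A minimal sketch (not from the slides) of Endsley's three levels as a data structure: perceived elements feed comprehension, which in turn feeds projection. The field names and example values are illustrative assumptions.

```python
from dataclasses import dataclass, field


@dataclass
class SituationAwareness:
    perceived_elements: dict = field(default_factory=dict)  # Level 1: raw cues and their salience
    comprehension: str = ""                                  # Level 2: what the cues mean for the goal
    projections: list = field(default_factory=list)          # Level 3: expected near-future states


sa = SituationAwareness(
    perceived_elements={"left_engine_vibration": 0.9, "burning_smell": 1.0},
    comprehension="Left engine is failing",
    projections=["vibration worsens if the left engine is kept at power"],
)
print(sa.comprehension, "->", sa.projections[0])
```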
36
DESIGN TO SUPPORT SA
  • Present clear and unambiguous critical
    information required for making well-informed
    decisions in a coherent manner
  • Avoid presenting irrelevant information (this
    will only increase noise and information
    overload)
  • Exploit the way we organise information (grouping
    and hierarchy) wherever possible
  • Information presented should also include the
    future state wherever possible
  • For novel situations and novices, care should be
    taken not to exceed WM capacity in realistic
    situations (2-3 steps or items); see the sketch
    below

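A toy sketch of the last two display guidelines, under assumed item names and priorities: keep only goal-relevant items, rank them, and cap what is shown at once at a conservative working-memory budget.

```python
WM_BUDGET = 3  # illustrative 2-3 chunk limit for novices


def select_for_display(items, relevant_to_goal, priority):
    """Filter out irrelevant items, then show only the top WM_BUDGET by priority."""
    relevant = [item for item in items if relevant_to_goal(item)]
    relevant.sort(key=priority, reverse=True)
    return relevant[:WM_BUDGET]


alerts = ["coolant pressure low", "cabin light flicker", "relief valve open", "printer offline"]
shown = select_for_display(
    alerts,
    relevant_to_goal=lambda a: "coolant" in a or "valve" in a,
    priority=lambda a: {"relief valve open": 2, "coolant pressure low": 1}.get(a, 0),
)
print(shown)  # ['relief valve open', 'coolant pressure low']
```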
37
SA EXAMPLE
Good support for SA: coherent presentation of
critical information
38
SA GOOD OR BAD
39
SA GOOD OR BAD
40
SA GOOD OR BAD
41
SA GOOD OR BAD
42
SA GOOD OR BAD
43
SA GOOD OR BAD
44
SA GOOD OR BAD
45
SA GOOD OR BAD
46
SA GOOD OR BAD
47
SA GOOD OR BAD
48
SA GOOD OR BAD
Hong Kong MTR in-cab route map
49
SA GOOD OR BAD
  • The intelligent signs on Thai roads providing
    information on the current situation for the
    related roads and junctions

50
SITUATION AWARENESS
  • How do we know which piece of information is
    relevant to SA and good decision making?
  • SA requirement analysis methods
  • Written material and document review
  • Expert consultation
  • User observation, verbal protocol, interviews

51
SA COMPLEX SYSTEMS
52
LOSS OF SA
  • How do we lose situation awareness?
  • The critical information necessary to make a well
    informed decision is not available
  • An incorrect mental model is activated due to
    inappropriate, absent or ambiguous cues,
    resulting in RB and KB mistakes
  • Lack of expertise or domain-specific knowledge
    necessary to understand the situation (absence of
    correct mental model)

53
LOSS OF SA
  • What happens if we lose SA?
  • High utilisation of KB performance
  • Heavy demand on mental resources
  • High or unacceptable MWL
  • Significant increase in RB and KB mistakes
  • Reduced ability to detect and correct errors
  • Accidents and disasters

54
LOSS OF SA
  • Factors contributing to loss of SA
  • Endsley identified 8 SA demons
  • Factors that work to undermine SA and may cause
    loss of SA

55
SA EIGHT DEMONS
  • 1. Attentional Tunneling
  • Focusing on particular aspects or features of the
    environment, to the detriment of the perception
    of other aspects that are also relevant to the
    goal
  • Failure to divide attention, particularly between
    within-modality stimuli
  • One of the most significant challenges to SA
  • Example: a driver concentrating on changing lanes
    does not see the motorbike on the left-hand side

56
SA EIGHT DEMONS
  • 2. Requisite Memory Trap
  • The limited capacity and duration of retention in
    WM pose a limit on information processing and the
    development of integrated information
  • Cannot take in all critical information and
    comprehend the situation → poor or lost SA and
    high MWL
  • System designs that require people to remember
    information, even short term, increase the
    likelihood of SA error
  • Example: retaining route information in memory
    and mapping it to the real-world picture

57
SA EIGHT DEMONS
  • 3. Workload, Anxiety, Fatigue and other Stressors
  • Stressors: time pressure, anxiety, uncertainty,
    noise, vibration, excessive heat/cold, poor
    lighting, physical fatigue, personal factors,
    alarms and alerts
  • Reduce information-gathering capability (people
    become more disorganised) and WM capacity (pay
    less attention)
  • Increase the tendency for attentional tunneling
  • Often lead to premature closure (making a
    decision without taking into account all
    available information)
  • Example: the Three Mile Island disaster

58
SA EIGHT DEMONS
  • 4. Data Overload
  • Complex and rapidly changing environments →
    constant input-processing-output
  • Limited capacity of WM (serial processing,
    capacity limits, modality interference, etc.)
  • Data arriving at an unmanageable speed, in an
    unmanageable number of items → data overload →
    loss of SA and high MWL
  • Coherent and effective presentation of data
    reduces the risk of data overload
  • Example: finding one particular piece of
    information in a 50-page document
    (a well-structured document vs. 50 pages of
    plain text); the Three Mile Island disaster

59
SA EIGHT DEMONS
  • 5. Misplaced Salience
  • Salience catches attention
  • Good: draws attention to critical or highly
    important information
  • Bad: overuse or misuse can be misleading,
    distracting and/or overwhelming
  • Interrupts the SA formation process and increases
    MWL
  • Examples: alerts and alarms in complex systems;
    the constantly lit yellow light at some level
    crossings

60
SA EIGHT DEMONS
  • 6. Complexity Creep
  • Systems/products may become more and more complex
    over time
  • The more features, the more complexity and the
    more branching of rules, the less able a person
    is to comprehend and form an internal
    representation of how the system works in order
    to predict its behaviour
  • Increases the risk of incorrectly interpreting
    the information presented and of mispredicting
    what is likely to happen
  • Example: high-end cameras, camcorders, video and
    CD recorders, stereos, microwaves and computer
    software such as Microsoft Office

61
SA EIGHT DEMONS
  • 7. Errant Mental Models
  • Mental models are important mechanisms for
    building and maintaining SA
  • Use of incorrect or incomplete mental models
  • Poor comprehension and projection
  • Difficult to realise and break out of
  • Mode error (thinking the system is in one mode
    when it is actually in another) is the most
    dangerous and problematic form of this demon
  • Example: the British Midland Flight 092 crash
    near the M1

62
SA EIGHT DEMONS
  • 8. Out-of-the-Loop Syndrome
  • Automation lowers SA by putting people out of the
    loop
  • People become less involved and less alert
  • But humans are still the decision makers and
    problem solvers when things go wrong
  • Complacency and overtrust → less likely to
    monitor the job the automated system is doing →
    loss of SA of the evolving state of the system →
    less able to deal with the problem appropriately
  • Example: the return of control to the pilot when
    the autopilot suddenly and unexpectedly fails;
    various complex systems

63
SA & MWL SUMMARY
  • MWL refers to the interaction between task
    requirements and human capabilities or resources.
    The higher the MWL, the more demanding the tasks
    and the fewer resources available to perform
    another task (a toy illustration follows this
    slide)
  • SA is the awareness and understanding of the
    current situation and ability to formulate future
    states
  • The development and maintenance of SA compete for
    the same resources as other information
    processing tasks
  • Expertise and skill reduce the competition for
    resources
  • Mental models are important mechanisms for
    building and maintaining SA
  • Provide the critical information necessary to
    make a well-informed decision, in a coherent
    manner, so the user can comprehend the situation
    and project the near future

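A toy illustration (not from the slides) of MWL as the ratio of total task demands to available mental resources: as demand approaches capacity, little spare resource remains for an additional task such as maintaining SA. The numbers and names are hypothetical.

```python
def workload_ratio(task_demands, capacity=1.0):
    """Return total demand as a fraction of capacity (near or above 1.0 means overload)."""
    return sum(task_demands) / capacity


demands = [0.4, 0.3, 0.2]          # e.g. monitoring, communication, diagnosis
mwl = workload_ratio(demands)
spare = max(0.0, 1.0 - mwl)        # resources left over for maintaining SA
print(f"MWL = {mwl:.2f}, spare capacity for SA = {spare:.2f}")
```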
64
Any Questions?
65
BREAK
  • 15 MINUTES

66
CASE STUDY: THREE MILE ISLAND
  • Summary of the incident at TMI
  • A partial meltdown of the reactor core at the
    Three Mile Island nuclear power plant in
    Pennsylvania on 28th March 1979, due to water
    leakage in the cooling system as a result of
    routine maintenance
  • The reactor shut down within 13 seconds
  • Sixteen hours later, the emergency was finally
    brought under control.
  • A small amount of radioactive material was
    released into the atmosphere
  • No direct loss of life, but around a billion
    dollars of damage and lasting public mistrust
  • The Nuclear Regulatory Commission has not
    reviewed an application to build a new nuclear
    power plant in the United States since
  • Permanent closure of TMI-2

67
CASE STUDY: THREE MILE ISLAND
  • Loss of SA due to:
  • Lack of indication of the malfunction (leakage as
    a result of maintenance)
  • Poor mechanical design of the PORV display
  • Delayed printout of the system state from the
    computer monitoring the system
  • Rapidly changing states of a complex system in
    which activities are highly automated
  • Automation resulting in humans being designed
    out of the loop
  • Leads to RB and KB mistakes
  • Delays in detection and recovery