Afdelingsvergadering (Department Meeting) - PowerPoint PPT Presentation

About This Presentation
Title:

Afdelingsvergadering

Description:

Costs of 'real world' data are high. New technologies often include new ... Volitional control. Accommodation. Stress. Impulsiveness. Taste. Alcohol level ... – PowerPoint PPT presentation

Number of Views:35
Avg rating:3.0/5.0
Slides: 45
Provided by: RvG
Category:


Transcript and Presenter's Notes



1
Human Performance Metrics for ATM Validation
Brian Hilburn NLR Amsterdam, The Netherlands
2
Overview
  • Why consider Human Performance?
  • How / When is HumPerf considered in validation?
  • Difficulties in studying HumPerf
  • Lessons Learnt
  • Toward a comprehensive perspective

(example data)
3
Traffic Growth in Europe
4
Accident Factors
5
Why consider HUMAN metrics?
  • Unexpected human (ab)use of equipment etc.
  • New types of errors and failures
  • Costs of real world data are high
  • New technologies often include new hidden risks
  • Operator error vs Designer error
  • Transition(s) and change(s) are demanding
  • Implementation (and failure) is very
    expensive!

6
Famous Human Factors disasters
  • Titanic
  • Three Mile Island
  • Space shuttle
  • Bhopal
  • Cali B-757
  • Paris A-320
  • FAA/IBM ATC

7
When human performance isn't considered...
8
...!!!!!!
9
What is being done to cope? Near- and medium-term
solutions
  • RVSM
  • BRNAV
  • FRAP
  • Civil Military airspace integration
  • Link 2000
  • Enhanced surveillance
  • ATC tools

10
ATM: The Building Blocks
Operational concepts (eg Free Flight)
Procedures (eg FF-MAS Transition)
Tools (eg CORA)
Displays (eg CDTI)
11
Monitoring in Free Flight: the Ops Con drives the
ATCo's task!
12
NLR Free flight validation studies
  • Human factors design & measurements
  • Ops Con: displays, procedures, algorithms
  • Retrofit automation & displays
  • TOPAZ: no safety impairment
  • No pilot workload increase with 3 times
    present en-route traffic
  • Delay, fuel and emission savings
  • ATC controller impact(s)
  • Collaborative workload reduction
  • Info at the NLR website

13
The aviation system test bed
(Diagram: two-way radio and data links between simulation components; the Experiment Scenario Manager injects scenario 'events')
14
Evaluating ATCo Interaction with New Tools
  • Human Factors trials
  • ATCos & Pilots
  • Real-time sim
  • Subjective data
  • Objective data also
15
Objective Measures
Integrated with subjective instruments...
  • Scan pattern, pupil diameter, blink rate, scan randomness
  • Heart rate, respiration
  • HEART Analysis Toolkit
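The "scan randomness" listed among these measures can be operationalised as the Shannon entropy of gaze transitions between display regions. A minimal sketch, assuming fixations have already been mapped to named regions (a preprocessing step not shown here):

```python
import math
from collections import Counter

def scan_entropy(fixation_regions):
    """Shannon entropy (bits) of gaze transitions between display
    regions -- one simple index of scan randomness."""
    transitions = list(zip(fixation_regions, fixation_regions[1:]))
    counts = Counter(transitions)
    total = sum(counts.values())
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A rigid, repetitive scan yields low entropy; an erratic scan yields more.
rigid   = ["radar", "strip", "radar", "strip", "radar", "strip", "radar"]
erratic = ["radar", "clock", "strip", "radar", "strip", "clock", "radar"]
```

Entropy-style indices are only one option; the HEART toolkit mentioned above may compute scan randomness differently.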
16
Correlates of Pupil Diameter
Emotion, age, relaxation/alertness, habituation, binocular summation, incentive (for easy problems), testosterone level, political attitude, sexual interest, information processing load
Light reflex, dark reflex, lid closure reflex, volitional control, accommodation, stress, impulsiveness, taste, alcohol level
17
Pupil Diameter by Traffic Load
18
Automation: assistance or burden? Conflict
detection & resolution tools
19
Low Traffic
Visual scan trace, 120 sec.
20
High Traffic
Visual scan trace, 120 sec
21
Positive effect of automation on heart rate
variability
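Heart rate variability is typically derived from inter-beat intervals; a minimal sketch of two standard indices (SDNN and RMSSD), not the HEART toolkit's own implementation:

```python
import statistics

def sdnn(ibi_ms):
    """Standard deviation of inter-beat intervals (ms): overall HRV."""
    return statistics.stdev(ibi_ms)

def rmssd(ibi_ms):
    """Root mean square of successive differences (ms): short-term HRV."""
    diffs = [b - a for a, b in zip(ibi_ms, ibi_ms[1:])]
    return (sum(d * d for d in diffs) / len(diffs)) ** 0.5

# Illustrative values only: variability is suppressed under high load.
high_load = [800, 798, 802, 799, 801]   # inter-beat intervals, ms
low_load  = [820, 780, 830, 770, 840]
```

In this toy example both indices come out lower for the high-load series, mirroring the suppressed HRV the slide associates with unassisted work.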
22
Positive effect of automation on pupil size
23
Better detection of unconfirmed ATC data
up-links
24
No (!) positive effect on subjective workload
25
Objective vs Subjective Measures
The Catch-22 of introducing automation: "I'll
use it if I trust it. But I cannot trust it until
I use it!"
26
Automation & Traffic Awareness
27
Converging data: The VINTHEC approach
  • Team Situation Awareness

EXPERIMENTAL: correlate behavioural markers with physiology
vs
ANALYTICAL: Game Theory predictive model of teamwork
28
Free Routing: Implications and challenges
Implications: airspace definition, automation & tools, training, ATCo working methods, ops procedures
Challenges: operational, technical, political, Human Factors
FRAP
29
Sim 1: Monitoring for FR Conflicts
  • ATS Routes
  • Direct Routing: airways plus direct routes
  • Free Routes
  • Structure across sectors

30
Sim 1: Conflict Detection Response Time
(Chart: response time in seconds)
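Aggregating such response-time data per route-structure condition can be sketched as follows; the condition names echo the simulation, but the numbers are invented for illustration:

```python
from statistics import mean

# Hypothetical conflict-detection response times (secs) per condition;
# the values below are illustrative, not the study's results.
rt = {
    "ATS Routes":     [8.2, 9.1, 7.8],
    "Direct Routing": [10.5, 11.2, 9.8],
    "Free Routes":    [13.0, 12.4, 14.1],
}

condition_means = {cond: mean(times) for cond, times in rt.items()}
slowest = max(condition_means, key=condition_means.get)
```

Comparing condition means like this is the basic step behind the chart; a real analysis would add per-controller aggregation and significance testing.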
31
Studying humans in ATM validation
  • Decision-making biases -- ATC is skilled,
    routine, stereotyped
  • Reluctance -- organisational / personal (job
    threat)
  • Operational rigidity -- unrealistic scenarios
  • Transfer problems -- skills hinder interacting with
    the system
  • Idiosyncratic performance -- system is strategy
    tolerant
  • Inability to verbalise skilled performance --
    automaticity

32
Moving from CONSTRUCT to CRITERION: Evidence
from CTAS Automation Trials
Time-of-flight estimation error, by traffic load
and automation level.
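The construct-to-criterion move can be made concrete: time-of-flight estimation error is simply the gap between a controller's estimate and the aircraft's actual time over a fix. A minimal sketch (function names and probe data are mine, not from the CTAS trials):

```python
def estimation_error_s(estimated_s, actual_s):
    """Absolute time-of-flight estimation error in seconds."""
    return abs(estimated_s - actual_s)

def mean_abs_error(pairs):
    """Mean absolute error over (estimate, actual) pairs -- a criterion
    measure standing in for the SA construct."""
    return sum(estimation_error_s(e, a) for e, a in pairs) / len(pairs)

# Invented probe data: (controller estimate, actual time over fix), secs.
probes = [(120, 130), (90, 85), (200, 200)]
```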
33
Controller Resolution Assistant (CORA)
  • EUROCONTROL Brétigny (F); POC: Mary Flynn
  • Computer-based tools (e.g. MTCD, TP, etc.)
  • Near-term operational
  • Two phases
  • CORA 1: identifies conflicts, controller solves
  • CORA 2: system provides advisories

34
CORA: The Challenges
  • Technical challenges
  • Ops challenges
  • HF challenges
  • Situation Awareness
  • Increased monitoring demands
  • Cognitive overload
  • Mis-calibrated trust
  • Degraded manual skills
  • New selection / training requirements
  • Loss of job satisfaction

35
CORA Experiment
  • Controller preference for resolution order
  • Context specificity
  • Time benefits (Response Time) of CORA

36
Synthesis of results
Construct -- Operationalised definition -- Result
  • SA -- ATA-ETA -- Auto x Traf
  • Workload -- PupDiam(TX) - PupDiam(base) -- Datalink display reduces WL
  • Dec Making / Strategies -- Response bias -- Intent benefits
  • Vigilance -- RT to Alerts -- FF CF
  • Attitude -- Survey responses -- FF OK, but need intent info
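The workload operationalisation here (task-period pupil diameter minus baseline pupil diameter) is a baseline correction; a minimal sketch, with invented sample values:

```python
def baseline_corrected_pupil(task_mm, baseline_mm):
    """Mean pupil diameter during the task minus resting baseline (mm);
    positive values index increased processing load."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(task_mm) - mean(baseline_mm)

# Invented samples (mm): task-period vs resting-baseline pupil diameter.
delta = baseline_corrected_pupil([4.2, 4.4, 4.3], [3.9, 4.1, 4.0])
```

Subtracting a per-participant baseline is what lets pupil diameter be compared across individuals, whose resting pupil sizes differ.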
37
Validation strategy
  • Full Mission Simulation
  • Address human behaviour in the working context
  • Converging data sources (modelling, sim (FT,RT),
    etc)
  • Comprehensive data (objective and subjective)
  • Operationalise terms (SA, WL)
  • Assessment of strategies
  • unexpected behaviours, or covert Dec Making
    strategies

38
Human Performance Metrics: Potential Difficulties
  • Participant reactivity
  • Cannot probe infrequent events
  • Better links sometimes needed to operational
    issues
  • Limits of some (eg physiological) measures
  • intrusiveness
  • non-monotonicity, task dependence w.r.t.
    reliability, sensitivity
  • time-on-task, motor artefacts
  • Partial picture
  • motivational, social, organisational aspects

39
Using HumPerf Metrics
  • Choose correct population
  • Battery of measures for converging evidence
  • Adequate training / familiarisation
  • Recognise that behaviour is NOT inner process
  • More use of cog elicitation techniques
  • Operator (ie pilot / ATCo) preferences
  • Weak experimentally, but strong organisationally?

40
Validation metrics: Comprehensive and
complementary
  • Subj measures easy, cheap, face valid
  • Subj measures can tap acceptance (wrt new tech)
  • Objective and subjective can dissociate
  • Do they tap different aspects (eg of workload)?
  • Eg training needs identified
  • Both are necessary, neither sufficient
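Whether objective and subjective measures dissociate can be checked with a simple correlation between, say, a pupil-based workload index and subjective ratings: a near-zero or negative r flags dissociation. A minimal sketch (plain Pearson r; the variable names are illustrative):

```python
def pearson_r(xs, ys):
    """Pearson correlation between an objective index (e.g. pupil delta)
    and subjective workload ratings, per experimental condition."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)
```

When the two measure families track each other, r approaches 1; the interesting cases for validation are precisely those where it does not.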

41
Operationalise HF validation criteria
  • HF world (SA, Workload) vs
  • Ops world (Nav accuracy, efficiency)
  • Mismatch limits dialogue between HF and Ops worlds
  • Moving from construct (SA)
  • to criterion (traffic prediction accuracy)

42
Summing Up: Lessons Learnt
  • Perfect USER versus perfect TEST SUBJECT
    (experts?)
  • Objective vs Subjective Measures
  • both necessary, neither sufficient
  • Operationalise terms: pragmatic, bridge worlds
  • Part-task testing in design; full-mission
    validation
  • Knowledge elicitation: STRATEGIES

43
Summing Up (2)...
  • Why consider Human Performance?
  • New ATM tools etc needed to handle demand
  • Humans are essential link in system
  • How / When is HumPerf considered in validation?
  • Often too little too late
  • Lessons Learnt
  • Role of objective versus subjective measures
  • Choosing the correct test population
  • Realising the potential limitations of experts
  • Toward a comprehensive perspective
  • Bridging the experimental and operational worlds

44
Thank You...
  • for further information
  • Brian Hilburn
  • NLR Amsterdam
  • tel +31 20 511 36 42
  • hilburn@nlr.nl
  • www.nlr.nl