Transcript and Presenter's Notes

Title: CARE/ASAS Validation Framework Guidelines


1
CARE/ASAS Validation Framework Guidelines: Case Studies
  • Mark Watson
  • NATS

2
Contents
  • WP4 MAEVA VGH
  • The Validation Framework
  • The Case Studies

3
Work Package 4
  • Align previous work packages to the MAEVA VGH
  • Update the EMERALD RTD Plan (presented later)
  • Write guidelines and include the Activity 3 case studies
  • Guideline Report
4
Master ATM European Validation Plan (MAEVA)
  • European Commission funded 5th framework project
  • Promote a common framework for validation of 5th
    FP ATM projects
  • Proposes top-down approach rather than
    enabler-targeted bottom-up approach
  • Describes lifecycle of ATM steps from concept to
    operational implementation
  • Wider intended adoption throughout Europe

5
WP1 Initial Validation Framework
WP2 System Performance Metrics
WP3 Human Performance Metrics
Compare To MAEVA VGH
Include Activity 3 Case Studies
6
CARE/ASAS Validation Framework: five steps to enlightenment!
  • Step 1 Identification of Validation Aims, Objectives and Hypotheses
  • Step 2 Validation Design - Plan and Prepare the Validation Exercise
  • Step 3 Conduct of Validation Exercise Runs
  • Step 4 Analysis of the Results
  • Step 5 Develop and Report Conclusions and Recommendations
...but with 16 actions!
7
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
  • Action 1. Understanding the ATM problem
  • Action 2. Selection of the ASAS application
  • Action 3. Identification of stakeholders
  • Action 4. Identification of validation aims
  • Action 5. Definition of the high level objectives
    (HLO)
  • Action 6. Definition of the low level objectives
    (LLO)
  • Action 7. Establishing validation platform
    requirements and selection of the validation
    technique
  • Action 8. Selection of system performance and
    human performance metrics and hypotheses
  • Action 9. Definition of the high level
    experimental design
  • Action 10. Operational and statistical
    significance

8
  • Step 2 Validation Design - Plan and Prepare the Validation Exercise
  • Action 11 Selection of the Validation Platform/Tool
  • Action 12 Scenario Definition
  • Action 13 Production of Detailed Experiment Design
  • Step 3 Conduct of Validation Exercise Runs
  • Action 14 Execution
  • Step 4 Analysis of the Results
  • Action 15 Results Analysis
  • Step 5 Develop and Report Conclusions and Recommendations
  • Action 16 Conclusions and Recommendations

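For reference, the sixteen actions map onto the five steps as restated below. This is only a reading aid in Python, echoing the step and action names from the two preceding slides; it is not part of the guidelines themselves.

# Illustrative reading aid: the five steps and sixteen actions of the
# CARE/ASAS Validation Framework, as listed on the preceding slides.
VALIDATION_FRAMEWORK = {
    "Step 1 Identification of Validation Aims, Objectives and Hypotheses": [
        "Action 1 Understanding the ATM problem",
        "Action 2 Selection of the ASAS application",
        "Action 3 Identification of stakeholders",
        "Action 4 Identification of validation aims",
        "Action 5 Definition of the high-level objectives (HLO)",
        "Action 6 Definition of the low-level objectives (LLO)",
        "Action 7 Establishing validation platform requirements and "
        "selection of the validation technique",
        "Action 8 Selection of system and human performance metrics and hypotheses",
        "Action 9 Definition of the high-level experimental design",
        "Action 10 Operational and statistical significance",
    ],
    "Step 2 Validation Design - Plan and Prepare the Validation Exercise": [
        "Action 11 Selection of the validation platform/tool",
        "Action 12 Scenario definition",
        "Action 13 Production of detailed experiment design",
    ],
    "Step 3 Conduct of Validation Exercise Runs": ["Action 14 Execution"],
    "Step 4 Analysis of the Results": ["Action 15 Results analysis"],
    "Step 5 Develop and Report Conclusions and Recommendations": [
        "Action 16 Conclusions and recommendations",
    ],
}

# Sanity check: five steps, sixteen actions in total.
assert len(VALIDATION_FRAMEWORK) == 5
assert sum(len(a) for a in VALIDATION_FRAMEWORK.values()) == 16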
9
MAEVA VGH vs. CARE/ASAS VF
10
Case Study Examples
  • Time-based sequencing in approach
  • Airborne Separation category
  • Sequencing and merging operations from Top of Descent until the Final Approach Fix
  • Time is the separation criterion
  • Limited separation responsibility delegated to the pilot
  • Airborne separation minima may be lower than ATC separation minima
  • Mixed levels of ADS-B equipage
  • Example airspace: Madrid

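Because time rather than distance is the separation criterion here, a spacing check reduces to comparing estimated times of arrival at a common fix. The Python sketch below illustrates the idea only; the 90-second minimum, the callsigns and the ETA values are invented for the example and are not taken from the case study.

def check_time_based_spacing(etas_at_fix, min_separation_s=90.0):
    """Flag consecutive aircraft whose ETA gap at a shared fix (e.g. the
    Final Approach Fix) is below the required time separation.
    etas_at_fix: list of (callsign, eta_seconds) pairs."""
    ordered = sorted(etas_at_fix, key=lambda pair: pair[1])
    violations = []
    for (lead, t_lead), (trail, t_trail) in zip(ordered, ordered[1:]):
        gap = t_trail - t_lead
        if gap < min_separation_s:
            violations.append((lead, trail, gap))
    return violations

# Hypothetical traffic sample: three arrivals converging on the same fix.
print(check_time_based_spacing([("IBE101", 0.0), ("BAW202", 75.0), ("AFR303", 180.0)]))
# -> [('IBE101', 'BAW202', 75.0)]  only the first pair is closer than 90 s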
11
Case Study Examples
  • Airborne Self-Separation in Segregated En-route Airspace
  • Airborne self-separation category
  • Free flight in segregated airspace
  • Aircraft fly their preferred route between entry and exit
  • Flight crews are responsible for self-separation from all aircraft
  • Example airspace: Mediterranean

12
Order Of The VF Presentations
  • Scenario Template Database - Juan Alberto
    Herreria, ISDEFE
  • System Performance Metrics - Mike Sharples,
    QinetiQ
  • Case Study (Time based sequencing) Actions part 1
    - Mark Watson, NATS
  • Discussion Forum
  • Lunch
  • Human Performance Metrics and Experimental Design - Brian Hilburn, NLR
  • Case Study Actions part 2 - Mark Watson, NATS

13
Coffee Break
14
Case Study of the Validation Framework
  • Time Based Sequencing In Approach

15
The Validation Framework
  • Step 1 Identification Of Validation Aims,
    Objectives And Hypotheses (10 actions)
  • Step 2 Validation Design - Plan and Prepare the Validation Exercise (3 actions)
  • Step 3 Conduct of Validation Exercise Runs (1 action)
  • Step 4 Analysis of the Results (1 action)
  • Step 5 Develop and Report Conclusions and Recommendations (1 action)

16
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
  • Action 1. Understanding the ATM problem
  • Action 2. Selection of the ASAS application
  • Action 3. Identification of stakeholders
  • Action 4. Identification of validation aims
  • Action 5. Definition of the high level objectives (HLO)
  • Action 6. Definition of the low level objectives (LLO)
  • Action 7. Establishing validation platform requirements and selection of the validation technique
  • Action 8. Selection of system performance and human performance metrics and hypotheses
17
  • Action 9. Definition of the high level experimental design
  • Action 10. Operational and statistical significance
Step 2 VALIDATION DESIGN - PLAN AND PREPARE THE VALIDATION EXERCISE
  • Action 11. Selection of the Validation Platform/Tool
  • Action 12. Scenario Definition
  • Action 13. Production of Detailed Experiment Design
18
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 1. Understanding the ATM problem
  • Constrained capacity on approach within TMA
    airspace.

Action 2. Selection of the ASAS application
  • Increase capacity on approach by having aircraft fly at the minimum aircraft separation.
  • Selected ASAS application is Time Based
    Sequencing In Approach.
  • Separation responsibility should be delegated to
    the pilot to decrease controller workload.
  • Maintain present level of safety.

19
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 3. Identification of stakeholders
  • Airline operator
  • Pilot
  • ATSP
  • Airport Operator
  • ATCO

Action 4. Identification of validation aims
  • Assess the application for its effect on capacity
    in TMA on approach.
  • Assess the impact on controller and pilot
    workload and TMA capacity.

20
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 5 Definition of the High-Level Objectives
(HLO)
  • Safety
  • Capacity
  • Economics

Action 6 Identification of Low-Level Objectives
(LLO)
  • Airspace throughput
  • Controller and Pilot Workload
  • Voice Communications
  • Conflicts
  • Traffic densities

21
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 7 Establishing Validation Platform
Requirements and selection of validation technique
  • Scope of ATM system
  • Fidelity/Resolution
  • Geography
  • Time-based Requirements

22
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 8 Identification of System Performance and
Human Performance Metrics and Hypotheses (1 of 4)
  • SYSTEM
  • Planned versus Actual Flight Profiles
  • Sector Entry/Exit
  • Conflicts
  • Workload per controller
  • Number of Time Based Clearances

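To make two of these system metrics concrete, the sketch below shows how planned-versus-actual flight profile deviation and workload per controller might be computed from fast-time simulation output; the record layouts are assumptions made for illustration, not the output format of any particular tool.

from statistics import mean

def profile_deviation_s(planned_times, actual_times):
    """Mean absolute difference (seconds) between planned and actual
    waypoint crossing times for one flight - one way of expressing the
    'planned versus actual flight profiles' metric."""
    common = planned_times.keys() & actual_times.keys()
    return mean(abs(actual_times[w] - planned_times[w]) for w in common)

def workload_events_per_hour(events):
    """Count logged workload events (clearances, coordinations, ...) per
    controller per hour of simulated time. 'events' is an assumed logging
    format: a list of (controller_id, timestamp_seconds) tuples."""
    horizon_h = max(t for _, t in events) / 3600.0
    counts = {}
    for controller, _ in events:
        counts[controller] = counts.get(controller, 0) + 1
    return {c: n / max(horizon_h, 1e-9) for c, n in counts.items()}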
23
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 8 Identification of System Performance and
Human Performance Metrics and Hypotheses (2 of 4)
Safety Perspective (capacity and efficiency)
24
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 8 Identification of System Performance and
Human Performance Metrics and Hypotheses (3 of 4)
ATSP Perspective (capacity and efficiency)
25
(No Transcript)
26
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 8 Identification of System Performance and
Human Performance Metrics and Hypotheses (4 of 4)
- IF IT WERE A REAL-TIME SIMULATION!
  • Pilot metric to assess peak workload
  • Various performance-based and physiological objective measures are available
  • Physiological measures of EEG potentials (brainwaves) are dismissed as too intrusive; pupil diameter is therefore chosen

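As a purely illustrative note on how the pupil-diameter measure could be turned into a peak-workload indicator: smooth the recorded diameter and express its peak relative to a resting baseline. The smoothing window and normalisation below are assumptions, not a method prescribed by the guidelines.

from statistics import mean

def peak_pupil_workload(samples_mm, baseline_mm, window=30):
    """Peak of the moving-average pupil diameter, expressed as a fraction
    above a resting baseline - a simple proxy for peak pilot workload.
    samples_mm: pupil diameters (mm) at a fixed sample rate;
    baseline_mm: resting diameter recorded before the run;
    window: smoothing window in samples (assumed value)."""
    window = min(window, len(samples_mm))
    smoothed = [mean(samples_mm[i:i + window])
                for i in range(len(samples_mm) - window + 1)]
    return max(smoothed) / baseline_mm - 1.0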
27
Step 1 IDENTIFICATION OF VALIDATION AIMS,
OBJECTIVES AND HYPOTHESES
Action 9 Definition of the High-Level Experimental Design
  • Initial 2005 baseline sample with no ASAS
    application to prove representativeness.
  • Three measured runs - 2005, 2010, 2015.
  • Three levels of separation delegation for each
    run.

Action 10 Operational and Statistical Significance
  • The 2005 measured run with minimum separation delegation decreased controller and communications workload by 5%.
  • All other measured runs must improve on this.
  • 95% statistical significance is required.

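One way the Action 10 criterion could be checked, sketched for illustration only: test whether the mean workload reduction across the exercise runs exceeds the 5% reference at the 95% confidence level, here with a one-sided, one-sample t-test. The choice of test and the sample figures below are assumptions, not part of the guidelines.

from math import sqrt
from statistics import mean, stdev

def meets_criterion(reductions_pct, threshold_pct=5.0, t_crit=1.833):
    """One-sided, one-sample t-test: is the mean workload reduction
    significantly greater than the 5% reference? t_crit = 1.833 is the
    95% one-sided critical value for 9 degrees of freedom (10 runs);
    adjust it for the actual number of runs."""
    n = len(reductions_pct)
    t = (mean(reductions_pct) - threshold_pct) / (stdev(reductions_pct) / sqrt(n))
    return t > t_crit

# Hypothetical controller-workload reductions (%) from ten measured runs.
print(meets_criterion([6.1, 5.8, 7.0, 6.4, 5.2, 6.8, 7.3, 5.9, 6.5, 6.2]))  # True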
28
Step 2 Plan and Prepare the Validation Exercise
Action 11 Selection Of Platform/Tool
  • MAEVA VGH describes available European platforms
    and their capability.
  • TAAM is suitable for addressing the HLOs of Economics and Capacity through fast-time simulation.
  • Adaptable to the airspace of this validation
    exercise.
  • Safety can be addressed through analysis of
    results.

Action 12 Scenario Definition
  • Use the scenario template as an aide-mémoire
  • Helps develop a detailed scenario definition document

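A minimal sketch of the kind of information a scenario-template entry can capture as an aide-mémoire; the field names and values below are invented for illustration and are not the actual CARE/ASAS scenario template.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ScenarioDefinition:
    """Illustrative scenario-template fields for one validation exercise."""
    name: str
    airspace: str              # e.g. "Madrid TMA"
    timeframe: str             # e.g. "2010"
    asas_application: str      # e.g. "Time Based Sequencing In Approach"
    delegation_level: str      # e.g. "limited separation delegation"
    adsb_equipage_pct: float   # assumed mixed-equipage level
    traffic_sample: str        # reference to the traffic sample used
    metrics: List[str] = field(default_factory=list)

# Hypothetical entry for one measured run of this case study.
madrid_2010 = ScenarioDefinition(
    name="TBS approach, medium delegation (hypothetical)",
    airspace="Madrid TMA",
    timeframe="2010",
    asas_application="Time Based Sequencing In Approach",
    delegation_level="limited separation delegation",
    adsb_equipage_pct=60.0,
    traffic_sample="madrid_2010_sample.csv",
    metrics=["airspace throughput", "controller workload", "conflicts"],
)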
29
Step 2 Plan and Prepare the Validation Exercise
Action 13 Production Of Detailed Experimental
Design
  • Detailed planning of the exercise runs.
  • Preparation of the Measurement and Analysis
    Specification.

30
and finally Conclusions
  • CARE/ASAS VALIDATION FRAMEWORK closely aligns
    with MAEVA VGH
  • (some interim steps differ in order or are
    tailored)
  • Step by Step route map for the creation of
    validation exercises for any ASAS application
  • An iterative process of design
  • sufficient detail for organisations with limited
    ASAS or validation experience
  • Will encourage uniformity of ASAS validations

31
Forum Discussion