E132 Unit - PowerPoint PPT Presentation

1
UNIT 16 Exercise Evaluation
2
Objectives
  • Describe the Evaluation Process
  • Identify objective-based evaluation concepts
  • Describe an evaluation team
  • Describe an evaluator checklist, complete with
    objectives and points of review
  • Identify evaluator skills, attributes, and
    pitfalls
  • Describe the After Action Report process
  • Describe the Improvement Planning process

3
DEFINITION OF EXERCISE EVALUATION
The act of reviewing or observing and
recording exercise activity or conduct, applying
the behavior or activity against exercise
objectives, and noting strengths, areas for
improvement, deficiencies, and other
observations.
4
DEFINITION OF EXERCISE EVALUATOR
An evaluator is an exercise participant who
observes and documents those actions and
decisions of the players that are directly
related to the exercise objectives.
5
THE CASE FOR SYSTEMATIC EXERCISE EVALUATION
  • Exercise Evaluation must be addressed
    consistently and systematically in emergency
    management exercises.
  • It is the cost-vs-benefit factor

6
RATIONALE FOR A SYSTEMATIC EXERCISE EVALUATION
DESIGN (K. Lerner ANL)
7
SYSTEMATIC, OBJECTIVE-BASED EVALUATION DESIGN
PROCESS
  • Formulation of an Evaluation Plan
  • Based on definition and tracking of exercise
    objectives
  • Produces written evaluations for each stated
    objective
  • Provides a basis for assessment and upgrading of
    the emergency management system
  • Includes evaluation forms that are based on the
    objectives and correspond with the exercise
    design criteria
  • Depends on the use of qualified evaluators

8
EVALUATION TASKS IN THE OVERALL EXERCISE
DEVELOPMENT PROCESS
9
THE EXERCISE EVALUATION PROCESS
10
PRE-EXERCISE TASKS OF THE EVALUATION TEAM
DIRECTOR
  • Select the evaluation methodology, and if
    necessary, develop the appropriate evaluation
    forms
  • Develop the Evaluation Plan
  • Select and train evaluation team members

11
EXERCISE PHASE TASKS OF THE EVALUATION TEAM
DIRECTOR
  • DURING THE EXERCISE, ENSURES EVALUATORS ARE
  • In the right locations
  • Equipped with the proper documentation
    equipment
  • Observing and documenting player actions
  • Provided with backup, if needed

12
POST-EXERCISE TASKS OF THE EVALUATION TEAM
DIRECTOR
  • Ensures evaluators assess achievement of exercise
    objectives
  • Coordinates involvement of evaluators in
    post-exercise meetings
  • Coordinates and reviews drafting of written
    reports
  • May play a role in long-term follow-up

13
EVALUATION METHODOLOGY DEFINITION
  • EVERY EXERCISE EVALUATION METHODOLOGY IS DEFINED
    BY
  • Objectives to be demonstrated
  • Evaluator team size, critique process, and
    scheduling
  • Forms, checklists, and points of review for use
    by the evaluators
  • Information and logistics support, including
    evaluator packets
  • After-action follow-up

14
EXERCISE OBJECTIVES
Serve as the basis for evaluation of the exercise
Standards & Criteria (EOP, SOP) → Exercise
Objectives → Systematic Exercise Evaluation
15
DISCUSSION QUESTION
Where can standard lists of emergency
management exercise objectives be obtained?
16
EXERCISE OBJECTIVES
  • Should be demonstrable with a reasonable
    commitment of resources
  • Should be defined so that related actions occur
    in one location

17
EXERCISE OBJECTIVES
Should divide the exercise into discrete functions
Direction & Control
Fire/Rescue
Evacuation/Shelter & Mass Care
Communication
Warning
Alert/Notification
18
EXAMPLES
  • EMS: Medical Command will triage, treat, and
    transport victims within 45 min
  • Hazmat: will identify the product and initiate
    offensive containment within 90 min
  • Pub Info: PIO will coordinate interagency
    information releases per protocols
  • Res Mgmt: CP/EOC will coordinate resources

19
SAMPLE OBJECTIVES
20
TYPES OF OBSERVATIONAL DATA
  • DESCRIPTIVE
  • Reporting everything that is related to the
    assigned function
  • Usually reliable because it requires little
    inference

21
TYPES OF OBSERVATIONAL DATA
  • INFERENTIAL
  • Requires the evaluator to make inferences before
    recording data
  • Harder to collect reliable information

22
TYPES OF OBSERVATIONAL DATA
  • EVALUATIVE
  • Requires evaluative judgments as well as
    inferences
  • Most complex and difficult to collect

23
EVALUATOR PACKETS & FORMS
  • Remind the evaluator what to look for
  • Prompt the evaluator to gather specific
    information in an organized manner

24
POINTS OF REVIEW
Are used by the evaluators to determine whether
or not objectives are successfully demonstrated
25
DISCUSSION QUESTION
What actions or conditions would an evaluator
look for to determine if this objective has been
met?
Demonstrate communications capabilities with
all appropriate emergency response locations,
organizations, and personnel.
26
EVALUATION FORM DESIGN
Keep the questions short and simple
  • What time did responders arrive?

27
EVALUATION FORM DESIGN
KEEP THE FORM MANAGEABLE
  • Try to limit to a maximum of 15-20 Points of
    Review
  • May be more extensive for complex exercises
  • Indicates points of review for the exercise
    objectives

28
EVALUATION FORM DESIGN
Keep layout and typeface (font) simple; use bold
for emphasis.
Did the EMS units arrive at the scene within 10
minutes of being dispatched?
29
EVALUATION FORM DESIGN
Do not ask for known information
Were the participants aware of the location of
the EOC?
30
EVALUATION FORM DESIGN
Do not ask questions the evaluator cannot answer
If an evaluator is in the field, do not ask what
time the call came into the dispatch center.
31
EVALUATION FORM DESIGN
Objective specific narrative summaries give
substance to basic information on the checklist
portion of the form.
32
EVALUATION FORM DESIGN
References help the evaluator in later assessment
of the exercise objectives.
33
SAMPLE EVALUATION FORM
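The sample form above appeared as an image in the original slides. As a rough sketch only, a form of this kind (one per objective, holding its points of review) could be modeled as follows; the objective and questions here are hypothetical examples, not taken from the manual:

```python
from dataclasses import dataclass, field

@dataclass
class EvaluationForm:
    """One form per exercise objective, holding its points of review."""
    objective: str
    points_of_review: list[str] = field(default_factory=list)

    def validate(self) -> None:
        # Keep the form manageable: 15-20 points of review at most.
        if len(self.points_of_review) > 20:
            raise ValueError("Form exceeds the recommended 20 points of review")

# Hypothetical example form for a single objective.
form = EvaluationForm(
    objective="EMS units arrive on scene within 10 minutes of dispatch",
    points_of_review=[
        "What time were the EMS units dispatched?",
        "What time did the first EMS unit arrive on scene?",
    ],
)
form.validate()
```

The validation step mirrors the "keep the form manageable" guidance from the form-design slides.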
34
STATE OF OHIO TERRORISM EXERCISE AND EVALUATION
MANUAL
  • Developed by the State of Ohio with Direct
    Support from the Office for Domestic Preparedness
    (ODP)
  • Nearly 18 months from concept to completion
  • Based on HSEEP Volume II
  • Cross-walked to HSEEP Volume II Tasks and EEGs
  • Utilized in-state for over 75 exercises in 2004
  • Utilized for ODP Direct Support Exercises in
    Cincinnati (MLB) and Cleveland (UASI)

35
EVALUATOR PITFALLS
Sometimes evaluators are not objective
36
MINIMIZING EVALUATOR FATIGUE
How can evaluator fatigue be minimized?
37
EVALUATOR EFFECT
The presence of evaluators may influence the
behavior of the players because they know what
the evaluators are looking for.
38
REDUCING EVALUATOR EFFECT
Ensure players know the evaluation report will
not reflect unfavorably on individuals.
39
REDUCING EVALUATOR EFFECT
Evaluators should be at the appropriate place when
the players arrive.
40
EVALUATOR BIAS
This type of bias refers to errors traceable
to characteristics of the evaluator or the
observed situation.
What are some ways to avoid evaluator bias?
41
RATING ERRORS
Data can be influenced by certain readily
identifiable errors.
42
ERROR OF LENIENCY
What a great job by all!!!
Some evaluators will rate all actions positively.
43
ERROR OF CENTRAL TENDENCY
An evaluator who describes all activities as
average to avoid making any difficult decisions
is committing the Error of Central Tendency.
44
HALO EFFECT
The Halo Effect is the tendency for an
evaluator to form an early impression of an
individual or an operation and permit this
impression to influence his or her observations.
45
HYPERCRITICAL EFFECT
I know something's wrong here.
When an evaluator believes that it is his or
her job to find something wrong regardless of
the players performance.
46
CONTAMINATION
CONTAMINATION HAS TO DO WITH THE INFLUENCE OF THE
EVALUATOR'S KNOWLEDGE OR EXPECTATIONS ABOUT
CERTAIN ASPECTS OF THE EXERCISE
Same animals as last year
47
DISCUSSION QUESTIONS
  • If evaluators need additional information, what
    sources other than direct questioning might be
    available?
  • What are some reasons why it would be beneficial
    to allow evaluators to intervene in the exercise?
  • Why would you not want evaluators to intervene?

48
EFFECTIVE QUESTIONING
Intervene only when necessary
What alternative questions might be asked to
obtain the information sought?
49
EFFECTIVE QUESTIONING
Avoid leading questions
Was the notification process completed within the
appropriate timeframe?
50
EFFECTIVE QUESTIONING
Avoid prompting questions
Have you begun evacuation of the affected area
yet?
51
EFFECTIVE QUESTIONING
Avoid the role of advisor
When an evaluator must question a player, it may
set up an environment where the player looks to
the evaluator for guidance.
52
PRECAUTIONS TO MINIMIZE EVALUATOR EFFECTS AND
ERRORS
1. Structure the situation.
2. Be familiar with rating errors.
3. If unable to document all data, contact the
   team leader and ask for assistance.
4. Avoid making evaluations and judgments during
   the exercise.
53
PRECAUTIONS TO MINIMIZE EVALUATOR EFFECTS AND
ERRORS
5. Avoid conversations that could influence your
   impressions of the exercise.
6. Complete evaluator training and retrain for
   each exercise.
7. Familiarize yourself thoroughly with evaluation
   checklists and report forms.
8. Report obvious evaluator bias to the team
   leader.
54
THE EXERCISE EVALUATION PROCESS
55
EVALUATION TEAM STRUCTURE
SMALL EXERCISE
Evaluation Team Leader
Functional Evaluator- Fire
Functional Evaluator- Health Medical
Functional Evaluator- Law Enforcement
Functional Evaluator- Resource Management
56
EVALUATION TEAM STRUCTURE
MORE FORMAL/ COMPLEX
Evaluation Team Director
Team Leader Fire Service
Team Leader Law Enf.
Team Leader EMS
Team Leader Public Works
Group Leaders (one or more under each Team Leader)
Evaluators (several under each Group Leader)
57
RECRUITING EVALUATORS
After the Evaluation Team Leader/Director has
determined
  • The number and type of evaluators needed
  • The types of skills required
  • The attributes sought

58
DISCUSSION QUESTION
Desired Evaluator Skills & Attributes
  • SKILLS
  • Technical ability, knowledge, or expertise
  • ATTRIBUTE
  • A quality or characteristic, something a person
    may have been born with or developed as a part of
    their life experiences

59
RECRUITING EVALUATORS
  • Setting expectations: evaluators must be
    available for
  • pre-exercise training and briefing
  • pre-exercise site visit
  • the entire exercise (hours to days)
  • post-exercise hot-wash
  • post-exercise data analysis (1 day)
  • contribution to the draft AAR

60
EVALUATOR TEAM ORIENTATION
  • CONTENT
  • The Evaluation Methodology
  • Structure of the evaluation team
  • Objectives to be demonstrated
  • Contents and use of the evaluator packet,
    including evaluator forms

61
EVALUATOR TEAM ORIENTATION
  • CONTENT
  • Exercise Scenario and Rules of Play
  • Exercise Scenario
  • Controller Inputs
  • Exercise Ground Rules
  • Exercise Schedule

62
EVALUATOR TEAM ORIENTATION
  • CONTENT
  • Exercise Evaluation Assignments
  • Specific objectives to be demonstrated
  • Parts of the emergency plan that pertain to
    the assignment

63
EVALUATOR TEAM ORIENTATION
  • CONTENT
  • Information about the jurisdiction
  • Organization and location
  • Logistics

64
EVALUATOR TEAM ORIENTATION
  • CONTENT
  • Exercise Evaluation Assignments (Contd)
  • Location specific protocols
  • Local geography including potential problems
  • Local response structure

65
DISCUSSION QUESTION
  • When is the best time to train evaluators?
  • As part of a regular training program?
  • As early as possible before the exercise?
  • At the exercise?

66
THE EVALUATION PLAN
The Evaluation Plan (EvalPlan) explains
the procedures that will be used to assess the
policies, plans, procedures and resources
employed in the response community during the
exercise.
Evaluation Plan
67
THE EVALUATION PLAN
  • FOR THIS CLASS YOUR EVALUATION PLAN SHOULD
    CONSIST OF
  • Objectives
  • Evaluators
  • Evaluation forms w/Points of Review
  • Positioning of evaluators

68
DEVELOPING THE AFTER ACTION REPORT
69
AAR SOURCES OF INFORMATION
  • Evaluators' observations
  • Players' debriefing comments
  • Hot Wash
  • Players' written comments
  • Controller/Simulator comments
  • Subsequent clarification or discussions with
    Players and Evaluators

70
AAR SOURCES OF INFORMATION
CONSISTENT WITH THE EXERCISE OBJECTIVES AND
PERFORMANCE CRITERIA, WITH A FOCUS ON PERFORMANCE
AND CAPABILITIES
71
DATA ANALYSIS
  • Conduct Hotwash
  • Develop Timeline and Narrative Summaries
  • Analyze Performance
  • Individuals
  • Teams/Functions
  • Outcomes

72
HOTWASH
  • Player Hotwash
  • Usually held immediately following exercise play
  • Typically facilitated by the evaluator
  • Provides opportunity for
  • Player self-assessment
  • An interactive discussion
  • Clarification of observed events
  • Assessment of exercise simulations

73
TIMELINE DEVELOPMENT
  • Make a team timeline of actions from your notes
    and collected data.

Avoid insignificant details.
74
TIMELINE DEVELOPMENT
  • Identify the appropriate objective for each
    activity

75
DATA ANALYSIS
  • Analysis of activities
  • What tasks were to be accomplished
  • Which tasks went well and which need improvement
  • Root causes
  • Recommendations

76
ROOT CAUSE ANALYSIS
1. Why did it happen?
2. Why did that happen?
3. Why was that?
4. And why was that?
5. And why was that?
6. And so on
Root Cause
Each step must completely explain the step above
down to the basic underlying causal factor.
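The "why" chain above can be sketched as a simple loop in which each recorded answer becomes the subject of the next "why". The observation and answers below are hypothetical examples, not taken from an actual exercise:

```python
# Minimal "5 whys" sketch: each answer must explain the step above it.
def five_whys(observation: str, answers: list[str]) -> str:
    """Walk the why-chain; the last answer is treated as the root cause."""
    step = observation
    for depth, answer in enumerate(answers, start=1):
        print(f"{depth}. Why did '{step}' happen? -> {answer}")
        step = answer
    return step  # the basic underlying causal factor

# Hypothetical chain for illustration only.
root = five_whys(
    "Evacuation order was delayed",
    [
        "The EOC did not receive the field report in time",
        "Radio traffic overwhelmed the primary channel",
        "No backup channel was assigned in the communications plan",
    ],
)
```

As the slide notes, the chain is only valid if each step completely explains the step above it.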
77
RECOMMENDATIONS FOR IMPROVEMENT
  • Questions for identifying recommendations for
    improvement
  • What training and/or equipment is needed?
  • What changes need to be made to plans and
    procedures, or organization structures?
  • What changes could be made to the management
    processes?

Note: These initial recommendations will evolve
into the AAR recommendations.
78
INTEGRATED ANALYSIS
  • Allows further identification of
  • Successes and best practices
  • New gaps and problems
  • Root causes
  • Recommendations for improvement
  • Conducted by more limited group
  • Compares observations from different locations
    and functions

79
INTEGRATED ANALYSIS
Diagram: the analysis zooms out from a single
function/discipline/location (one evaluator per key
function or location, e.g. Fire, LE, and EOC at
Locations 1 and 2) to mission outcomes (e.g. Hazard
Mitigation) assessed across many evaluators,
functions, and locations.
80
INTEGRATED ANALYSIS
  • How to do integrated analysis
  • Steps
  • 1. Reduce evaluation team size to a core team
  • 2. Integrate timelines
  • 3. Look for performance disconnects
  • 4. Discuss disconnects and identify new
     improvement areas and strengths
  • 5. Discuss overall outcomes and the degree
     achieved

81
INTEGRATED TIMELINE DEVELOPMENT
82
INTEGRATED TIMELINE DEVELOPMENT
Timeline across locations
Example of Linked Events
Sorted by time, outcome, and location
83
INTEGRATED TIMELINE DEVELOPMENT
Timeline across locations
Finding what's missing
Sorted by time, outcome, and location
84
INTEGRATED TIMELINE DEVELOPMENT
  • Timeline across locations

Root Cause Analysis
Sorted by time, outcome, and location
85
INTEGRATED ANALYSIS
  • Techniques for timeline building
  • Index cards
  • Excel Spreadsheet
  • Post-it notes

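The index-card or spreadsheet step above amounts to merging each evaluator's event notes and sorting by time and location. A minimal sketch, with hypothetical event entries:

```python
# Sketch of an integrated timeline: merge each evaluator's event list
# and sort by time, then location. All entries are hypothetical.
from operator import itemgetter

fire_events = [("10:05", "Fire", "First engine on scene")]
eoc_events = [("10:01", "EOC", "EOC activated"),
              ("10:20", "EOC", "Evacuation order issued")]
le_events = [("10:12", "LE", "Perimeter established")]

integrated = sorted(fire_events + eoc_events + le_events,
                    key=itemgetter(0, 1))
for time, location, action in integrated:
    print(f"{time}  {location:5} {action}")
```

Once merged, gaps and performance disconnects between locations become visible in a way they are not on separate per-evaluator lists.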
86
INTEGRATED ANALYSIS
  • What if you don't have time to complete the
    integrated timeline?
  • Conduct analysis without the timeline
  • Facilitated discussion
  • Degree mission outcomes accomplished
  • Strengths and weaknesses
  • Review of overall exercise objectives
  • AAR drafting team completes the integrated
    timeline after the Exercise Evaluation Conference

87
THE AFTER-ACTION REPORT (AAR)
  • Drafted by small group
  • Serves as feedback tool
  • Summarizes what happened and recommendations for
    improvement
  • May include lessons learned to share with other
    jurisdictions

88
AFTER-ACTION REPORT
  • Prepared in two stages
  • Draft AAR completed immediately after the
    exercise for review
  • Community adds improvement steps/corrective
    actions
  • Final AAR

89
AFTER ACTION REPORT FORMAT (HSEEP)
  • Executive Summary
  • Part 1: Exercise Overview
  • Part 2: Exercise Goals and Objectives
  • Part 3: Exercise Events Synopsis
  • Part 4: Analysis of Mission Outcomes
  • Part 5: Analysis of Critical Task Performance
  • Part 6: Conclusion
  • Appendix A: Improvement Plan Matrix

90
AAR Format: Executive Summary
  • Summarizes exercise
  • Exercise design
  • Key successes and recommendations
  • Intended for
  • policymakers
  • senior managers
  • 1-2 pages
  • Prepared last

Stair steps to improvement
91
AAR Format: Part 1 - Exercise Overview
  • Exercise Name
  • Duration
  • Exercise Date
  • Sponsor
  • Type of Exercise
  • Funding Source
  • Program
  • Focus
  • Classification
  • Scenario
  • Location
  • Jurisdiction(s) Exercised
  • Participating Organizations
  • Overview/design
  • Evaluation method

92
AAR Format: Part 2 - Goals and Objectives
  • Simple listing of goals and objectives
  • Summarizes the exercise architecture
  • Can be used as cross-check of the report for
    completeness

93
AAR Format: Part 3 - Exercise Events Synopsis
Starting Scenario
  • Summary of precipitating scenario (e.g. STARTEX
    scenario)
  • Chronological synopsis
  • Key scenario events and player responses
  • A non-participant should be able to understand
    what happened
  • May be organized by exercise location, function,
    or discipline
  • Sticks to the facts; avoids judgment

Event 1 → Response
Event 2 → Response
Event n → Response
ENDEX
94
AAR Format: Part 4 - Analysis of Mission Outcomes
  • Analyzes how well participating organizations
    addressed the mission outcomes
  • Discusses all mission outcomes exercised
  • Narratives use critical tasks (EEGs) as the
    framework for discussion
  • Focuses on cross-functional, inter-agency, or
    inter-jurisdictional activities
  • Explicitly makes judgments

95
AAR Format: Part 5 - Analysis of Critical Task
Performance
  • Examines each critical task exercised
  • For each task performed as expected:
  • Task and description (from EEG)
  • Example: Task III-14, Provide Emergency Public
    Information to the Media and Public
  • Discussion
  • Briefly describe how the task was performed as
    expected (1-2 paragraphs).
  • No recommendations

96
AAR Format: Part 5 - Analysis of Critical Task
Performance (continued)
  • For each task with improvement opportunities:
  • Task and description (from EEG)
  • Issue and short description
  • Discussion
  • What happened
  • How it was different from what was expected
  • Consequences
  • Root cause analysis
  • Recommendations What should be done to improve
    preparedness

97
AAR Format: Writing Recommendations
  • Number for identification
  • Single sentence
  • Use action verbs
  • Don't be afraid to make honest recommendations
  • Specific and measurable
  • Indicate who (agency) should take action
  • Recommendations should flow from analysis
  • Make each recommendation stand-alone
  • Check for consistency; don't leave conflicting
    recommendations
  • Indicate where performance was good or adequate
    and no recommendations are needed.

98
AAR Format: Part 6 - Conclusions
  • Wraps up the report
  • Summarizes major findings
  • Outcomes accomplished
  • Strengths and improvement areas
  • Recommendations for follow-on exercises

99
DRAFT AAR
Standard Format for Issue Write-ups

Task: ___________________________ (Number and
title of EEG)
Issue: ____ (A consecutive number, to be assigned
later by the Evaluation Team Leader, and a
one-sentence title)
Reference: (EEG and local plan, procedure, MOU,
etc. citation)
Summary: (One-paragraph description of the issue)
Consequence: (The effects of the issue on the
ability of the community to respond, with
reference, as appropriate, to effects on the
overall Mission, Function or Discipline, or Task)
Analysis: (Why this issue occurred; the root cause)
Recommendations: (What can be done to resolve the
issue, including changes in plans or procedures,
additional training, additional resources, MOUs,
etc.)
100
RECOMMENDATION TEMPLATE
  • Use these templates as suggestions for refining
    recommendations. You need not match the wording
    exactly.
  • (Who?) should prepare/revise ________ plan to
    (correct what?) by (when).
  • (Who?) should prepare/revise ________ policy or
    procedure to (correct what?) (when).
  • (Who?) will conduct training for (group) in
    (what?) so that ________ by (when).
  • (Who?) will obtain _______ equipment/facilities
    so as to __________ by (when).
  • (Who) will conduct _________ study/analysis to
    __(action required)___ so as to _______________.
  • (Who) will convene a working group of
    ___(people/agencies)___ to __(action required)__
    so as to correct ___(what)___.

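When an AAR contains many recommendations, fill-in-the-blank templates like those above can be filled programmatically to keep wording consistent. The who/group/what/outcome/when values below are hypothetical:

```python
# Filling the training-recommendation template with hypothetical values.
template = ("{who} will conduct training for {group} in {what} "
            "so that {outcome} by {when}.")

recommendation = template.format(
    who="County EMA",
    group="EOC staff",
    what="WebEOC message handling",
    outcome="all incoming field reports are logged within 5 minutes",
    when="Q3 2006",
)
print(recommendation)
```

As the slide notes, the templates are suggestions; the wording need not match exactly.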
101
Exercises of Increasing Complexity
DISCUSSION-BASED EXERCISES: seminars, workshops,
games
OPERATIONS-BASED EXERCISES: drills, functional
exercises, full-scale exercises
Improvement cycle: Hotwash → Data Collection →
Analysis → After Action Report → Improvement
Plan → Training → Planning → Improvement Plan
Implementation
102
DISCUSSION-BASED EXERCISES
103
IMPROVEMENT PLANNING PROCESS
  • Developed by local jurisdiction
  • Identifies how recommendations will be addressed
  • What actions
  • Who is responsible
  • Timeline for completion

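The what/who/when structure described above can be sketched as rows of an improvement-plan matrix; all entries below are hypothetical examples:

```python
# A minimal improvement-plan matrix: one row per recommendation,
# tracking the action, the responsible party, and the deadline.
# All entries are hypothetical.
improvement_plan = [
    {"action": "Assign a backup radio channel in the comms plan",
     "responsible": "County EMA",
     "due": "2006-06-30",
     "status": "open"},
    {"action": "Retrain EOC staff on evacuation procedures",
     "responsible": "EOC Manager",
     "due": "2006-09-30",
     "status": "open"},
]

# Follow-up monitoring: list the actions still open.
open_items = [row["action"] for row in improvement_plan
              if row["status"] == "open"]
```

A structure like this supports the follow-up step on the next slides, where the status of each improvement is monitored after the final AAR/IP is submitted.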
104
IMPROVEMENT PLAN FORMAT
105
MONITOR IMPLEMENTATION
  • The final AAR/IPs are sent to ODP through SAA
  • ODP will follow-up on status of improvements
    through SAA
  • Identify need for technical support to implement

106
UNIT SUMMARY
  • IF THE EVALUATION PROCESS IS BASED ON SOUND
    OBJECTIVES FROM THE EARLIEST STAGES, THE RESULTS
    WILL BE
  • An effective evaluation operation
  • An organized presentation of meaningful
    evaluation results
  • Results documented in a formal AAR are utilized
    to facilitate the development of an Improvement
    Plan

107
QUESTIONS?