Homeland Security Exercise and Evaluation Program (HSEEP) SAL Briefing, 11/12/08

Transcript and Presenter's Notes

1
Homeland Security Exercise and Evaluation
Program (HSEEP) SAL Briefing
11/12/08
2
HSEEP CYCLE: Focus on Evaluation
3
Objectives Based on UTL
(Diagram: Agency Function → Universal Target List → Objective → Scenario Element → EEG)
4
Evaluation and Improvement Process
5
Levels of Criteria for Analysis
6
Mission-Level Analysis
  • All capabilities contribute to one or more of the
    overarching homeland security missions
  • Prevention
  • Protection
  • Response
  • Recovery

7
Capability-Level Analysis
Answers the question: How prepared is the
community to prevent, protect against, respond
to, and/or recover from a natural or manmade
disaster?
  • Assesses if the participants, as a whole,
    achieved the expected capability outcomes
  • Focus is on outcomes instead of processes

8
Activity-Level Analysis
Answers the question: Did the larger team
adequately perform all tasks in accordance with
approved plans, policies, procedures, and
agreements?
  • Performance measures and tasks that demonstrate
    proficiency in part of a capability
  • Useful for assessing
  • Plans
  • How the members worked together
  • How team members communicated

9
Task-Level Analysis
Answers the question: Did the individual(s) or
team(s) carry out the tasks as expected, and did
completing the tasks contribute to achieving the
activities?
  • A specific required duty that individuals or teams
    are able to perform or recall during an exercise
  • Helps determine if personnel, training, and
    equipment are sufficient
  • Linked to objectives
  • Tasks
  • Performance measures
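
Taken together, these four levels nest: tasks roll up into activities, activities into capabilities, and capabilities into the four homeland security missions. A minimal Python sketch of that nesting follows; the class and field names are invented for illustration, since HSEEP defines the levels conceptually rather than as a data schema.

from dataclasses import dataclass, field

@dataclass
class Task:                          # Task level: individual or team duties
    name: str
    completed_as_expected: bool

@dataclass
class Activity:                      # Activity level: tasks grouped by purpose
    name: str
    tasks: list[Task] = field(default_factory=list)

    def performed_adequately(self) -> bool:
        # Activity-level analysis asks whether the team performed all
        # tasks in accordance with plans, policies, and procedures.
        return all(t.completed_as_expected for t in self.tasks)

@dataclass
class Capability:                    # Capability level: focus on outcomes
    name: str
    mission: str                     # Prevention, Protection, Response, or Recovery
    activities: list[Activity] = field(default_factory=list)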

10
Step 1: Plan and Organize the Evaluation
  • Part of the exercise design and development
    process
  • Exercise Planning Team determines
  • What information is collected
  • Who collects it
  • How it is collected
  • Evaluators are identified, recruited, and trained

11
Evaluation Plan
  • Lead Evaluator takes charge of planning the
    evaluation
  • An evaluation plan should consider
  • Exercise-specific information
  • Plans, policies, and procedures
  • Evaluator assignments
  • Evaluator instructions
  • Evaluation tools (e.g., EEGs)

12
Evaluation Team
  • Chosen for their knowledge of a particular
    functional area
  • Should be familiar with the jurisdiction's plans,
    policies, procedures, and agreements
  • Identified early in the exercise planning process

13
Evaluators
  • Observe and record the discussions or actions of
    players
  • Participate in data analysis and help draft the
    AAR
  • Are generally drawn from nonplaying members of
    participating organizations

Lessons Learned It helps if a new evaluator is
paired with a more experienced evaluator during
his or her first evaluation experience. The
experienced evaluator can mentor the new
evaluator, which improves the quality of the
evaluation and increases the number of
experienced evaluators in the jurisdiction.
14
Evaluator Time Requirements
  • Evaluators may need to be available for
  • Pre-exercise training
  • Briefing and/or site visit
  • Exercise conduct
  • Postexercise Hot Wash
  • Controller and Evaluator Debriefing
  • After-Action Conference

15
Evaluator Training
  • Evaluators should be trained and prepared before
    the exercise
  • Training should address
  • Exercise goals and objectives
  • Scenario
  • Participants
  • Evaluator roles and responsibilities

16
Evaluator Training
  • Training should include guidance on
  • Observing the exercise
  • What to look for
  • What to record
  • How to use EEGs
  • How to analyze data
  • During training, evaluators should be provided
    with
  • Exercise documents
  • Jurisdictional plans
  • Evaluation materials, EEGs, schedule, and
    assignment

17
Exercise Evaluation Guides
  • EEGs are only guides to help evaluators document
    exercise activities and determine if objectives
    are met
  • EEGs are not a report card
  • Generally, one packet for each of the
    capabilities in the Target Capabilities List
    (TCL)

18
Exercise Evaluation Guides
  • EEGs include
  • Activities
  • Tasks Observed
  • Performance Measures
  • Activity Map
  • Overview of typical activity flow to be
    accomplished for each capability

19
Activities
  • Defined in the TCL for each capability
  • Grouping of responder Tasks with similar
    purpose
  • Organizing framework against which Tasks and
    their associated Performance Measures are
    evaluated during an operations-based exercise

20
Performance Measures
  • Apply to aspects of a Task that can be assessed
    by a number or time limit
  • Quantitative or qualitative
  • Yes/no, percentages, or continuous
  • Local tasks and performance measures can be
    added, but existing content cannot be deleted or
    altered
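
All three measure forms named above (yes/no, percentage, continuous) can be recorded against a target the same way. The sketch below is a hypothetical illustration rather than an HSEEP schema; the higher_is_better flag is this sketch's own device for choosing the comparison direction.

from dataclasses import dataclass
from typing import Optional, Union

@dataclass
class PerformanceMeasure:
    description: str
    target: Union[bool, float]         # True/False, 90.0 (percent), or 8.0 (minutes)
    higher_is_better: bool = True      # set False for time limits
    observed: Union[bool, float, None] = None

    def met(self) -> Optional[bool]:
        if self.observed is None:
            return None                            # not yet observed
        if isinstance(self.target, bool):
            return self.observed == self.target    # yes/no measure
        if self.higher_is_better:
            return self.observed >= self.target    # e.g., percent of victims triaged
        return self.observed <= self.target        # e.g., minutes until first unit arrives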

21
Tasks
  • Observable response actions that indicate the
    level of success
  • Measured against the jurisdiction's existing
    plans and procedures and the necessary response
    actions
  • The pre-populated tasks within an EEG cannot be
    changed or deleted; however, jurisdiction-specific
    tasks can be added

22
Observation Keys
  • Provided for each Task as an evaluator aid
  • Type of actions typically taken for the
    accomplishment of the Task
  • Are not exhaustive

23
How to Use Exercise Evaluation Guides
  • Identify the capabilities from the TCL that are
    being exercised
  • Select EEGs associated with the Capabilities and
    Activities being exercised
  • Tailor the EEG
  • Locate and complete the section for
    jurisdiction-specific tasks
  • EEGs are located in a searchable, online library
  • http://hseep.dhs.gov
  • Locked fields
  • Unlocked fields (where jurisdiction can make
    additions)
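
The locked/unlocked split amounts to a simple rule: pre-populated TCL content is immutable, while a jurisdiction may append its own tasks. A hypothetical sketch of that rule (class and method names invented for illustration):

class EEG:
    """Illustrative model of locked vs. unlocked EEG fields."""

    def __init__(self, capability: str, tcl_tasks: list[str]):
        self.capability = capability
        self._tcl_tasks = tuple(tcl_tasks)   # locked: cannot be changed or deleted
        self.local_tasks: list[str] = []     # unlocked: jurisdiction additions

    def add_local_task(self, task: str) -> None:
        self.local_tasks.append(task)

    @property
    def all_tasks(self) -> list[str]:
        return list(self._tcl_tasks) + self.local_tasks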

24
Why Use Exercise Evaluation Guides?
  • Analysis is based on data collection using the
    EEGs
  • Data is compiled in the After-Action Report
    (AAR)/Improvement Plan (IP)
  • Trends and improvements can be tracked using
    consistent evaluation standards
  • Identify the activities the evaluator should be
    observing
  • Provide consistency in tasks across multiple
    exercises
  • Link individual tasks to disciplines and outcomes
  • Guide data collection as a reference for
    evaluators/data collectors

25
Step 2 Observe the Exercise and Collect Data
  • Expert evaluators collect data
  • Record observations during exercise
  • Collect additional data from records and logs
  • Attend Player Hot Wash

26
How to Use the Exercise Evaluation Guides
  • Record the name, time, and exercise
  • More detailed notes are taken on notebooks,
    tablets, other devices, etc.
  • Capture who, what, where, and how
  • If exercise artificialities are affecting the
    observed task, they should be noted
  • Log times accurately
  • Evaluators should synchronize their timekeeping
    devices before the exercise

27
How to Use the Exercise Evaluation Guides
  • EEGs are used for both discussion- and
    operations-based exercises
  • More detailed in operations-based
  • Must be tailored to support discussion-based
    exercises
  • For example, "Coordinate rescue efforts with law
    enforcement to ensure safety of rescuers while
    law enforcement secures the incident site" is
    tailored to a discussion-based exercise as
    follows
  • Who should initiate coordination with law
    enforcement to ensure the safety of rescue
    workers?
  • How are rescue workers made aware that the
    appropriate coordination has occurred?
  • How are Incident Command, WMD/HazMat, and law
    enforcement personnel trained on this
    requirement?

28
Operations-Based Data Collection
  • Evaluators record actions as they occur
  • Who performed the action or made the decision
  • What occurred
  • Where the action or decision took place
  • When the action took place
  • Why the action took place or decision was made
  • How they performed the action or made the
    decision
  • Evaluators do not interfere with play
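
The who/what/where/when/why/how items above map naturally onto a single observation record. The sketch below is hypothetical (evaluators typically capture the same fields on paper EEGs or note-taking devices); timestamps come from a shared UTC clock, mirroring the earlier guidance to synchronize timekeeping devices before the exercise.

from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Observation:
    who: str            # who performed the action or made the decision
    what: str           # what occurred
    where: str          # where the action or decision took place
    when: datetime      # when the action took place
    why: str            # why the action or decision was made
    how: str            # how it was performed
    artificiality: Optional[str] = None   # exercise artificialities, if any

def record(who: str, what: str, where: str, why: str, how: str,
           artificiality: Optional[str] = None) -> Observation:
    # Stamp each entry as it is written so the timeline can be
    # reconstructed accurately during analysis.
    return Observation(who, what, where, datetime.now(timezone.utc),
                       why, how, artificiality)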

29
Operations-Based Data Collection
  • Sometimes evaluators should prioritize the data
    to collect
  • Essential information includes
  • Message in/message out
  • Discussion
  • Decision
  • Directive
  • Movement
  • What Happened
  • Inject

Tips for Successful Observation
  • Do
  • Be in position before players arrive
  • Get a good view of the action
  • Focus on critical activities
  • Take detailed/legible notes
  • Do Not
  • Leave your post
  • Prompt players
  • Get in the way
  • Answer players' questions

30
Operations-Based Player Hot Wash
  • Usually occurs the day of the exercise
  • If an FSE has several sites, a Player Hot Wash
    may occur at each location
  • Most effective if led by an experienced
    facilitator
  • Opportunity to distribute Participant Feedback
    Forms

Sample Hot Wash Ground Rules
  • Short time duration
  • Facilitated discussion format
  • Constructive comments only
  • Identify things that
  • Went well
  • Need improvement

31
Following the Exercise
  • Evaluators should
  • Review notes for gaps
  • Collect additional data to fill in gaps
  • Sources include
  • Records produced by automated systems
  • Duty logs and message forms
  • Status boards
  • Evaluation/Participant Feedback Forms

32
Analysis Section
  • Used after the exercise to compile and analyze
    information collected on various tasks
  • Activities
  • Tasks Observed
  • Activity Narrative
  • Detailed Evaluator Observations

33
Analyzing Data
  • Identify issues
  • Tasks that were not completed as expected
  • Determine root cause
  • Source for an identified issue
  • Action toward which an improvement is directed
  • Develop recommendations for improvement
  • What should be done
  • Who should do it
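
Each finding produced by this analysis pairs an issue with its root cause and a recommendation. A minimal, hypothetical record of that pairing (field names are this sketch's own):

from dataclasses import dataclass

@dataclass
class Finding:
    issue: str            # a task that was not completed as expected
    root_cause: str       # the source of the identified issue
    recommendation: str   # what should be done
    assigned_to: str      # who should do it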

34
Analyzing Operations-Based Exercises
During analysis, evaluators try to answer the
following questions
  • What happened? What did the observation team
    see?
  • Plans, policies, and procedures: was there a
    difference, and why? What is the root cause?
  • Impact?
  • Lessons learned?
  • Recommendations: fixes to the root cause?

35
Analyzing Operations-Based Exercises
During analysis, evaluators try to answer the
following questions
  • What was the impact? Were the consequences of
    the action (or inaction or decision) positive,
    negative, or neutral?
  • What should be learned and what are the
    recommendations for improvements? What are the
    fixes to the root cause?

36
How to Determine Root Cause
  • Root cause is the source of an identified issue
  • Evaluators should ask why each causal event
    happened or did not happen
  • A root cause with an actionable solution should
    be determined for each issue
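
The "ask why repeatedly" drill is essentially the classic five-whys technique, sketched below as an illustration. HSEEP asks evaluators to trace causal events but does not prescribe this exact procedure; the function name and the example chain are invented.

from typing import Callable, Optional

def find_root_cause(issue: str,
                    explain_why: Callable[[str], Optional[str]],
                    max_depth: int = 5) -> str:
    """Keep asking why until no deeper cause is offered."""
    cause = issue
    for _ in range(max_depth):
        deeper = explain_why(cause)   # the analyst supplies the next cause, or None
        if deeper is None:
            break
        cause = deeper
    return cause   # deepest cause reached; verify it has an actionable solution

# Hypothetical causal chain for illustration:
chain = {"triage delayed": "staging area unstaffed",
         "staging area unstaffed": "no staffing assignment in the plan"}
root = find_root_cause("triage delayed", chain.get)
# root == "no staffing assignment in the plan"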

37
How to Develop Recommendations
  • Recommendations should
  • Both sustain and improve
  • Be forthright
  • Be specific and measurable
  • Use the active voice
  • Link to observations and analysis
  • Be consistent with other recommendations
  • Note: The tip sheet in the manual gives best
    practices for developing recommendations for
    discussion- and operations-based exercises

38
Developing Recommendations
  • An Evaluation Team recommendation is one
    possible way to remedy a problem
  • The participating jurisdiction/organization is
    responsible for developing the recommendation
    that will address the problem appropriately
  • Action items within the Improvement Plan should
    address the problem, not the recommendation

39
Example Recommendations from Evaluator
  • The chief of plans should attend the EMI course
    on developing an Incident Action Plan.
  • The nine counties should develop a regional CBRNE
    task force.
  • The city and county should sustain the Unified
    Command that integrates their response to a
    disaster.

40
HSEEP WEBPAGE
https://hseep.dhs.gov/pages/1001_HSEEP7.aspx
Bill Webb, FEMA Region X, bill.webb@dhs.gov,
(425) 487-4605