ASW METOC Metrics: Metrics Overview, Goals, and Tasks

Transcript and Presenter's Notes
1
ASW METOC Metrics: Metrics Overview, Goals, and Tasks

Tom Murphree, Naval Postgraduate School (NPS), murphree@nps.edu
Bruce Ford, Clear Science, Inc. (CSI), bruce@clearscienceinc.com
Paul Vodola, Matt McNamara, and Luke Piepkorn, Systems Planning and Analysis (SPA), pvodola@spa.com
CAPT(s) Mike Angove, OPNAV N84, michael.angove@navy.mil

Brief for ASW METOC Metrics Symposium Two, 02-04 May 2007
2
ASW METOC Metrics: Key Concepts
  • Definition
  • Measures of the performance and operational impacts of the CNMOC products provided to ASW decision makers.
  • Uses
  • Improve product generation and delivery processes, product quality, assessments of uncertainty and confidence in products, product usage, and outcomes of ASW operations.
  • Allow CNMOC to participate more effectively in ASW fleet synthetic training, reconstruction and analysis, campaign analysis, and other modeling, simulation, and assessment programs.
  • Evaluate new products, including new performance layer and decision layer products.

3
ASW METOC Metrics: Key Concepts
  • Metrics System
  • Capable of collecting and analyzing data on actual products, verifying observations, decisions made by users of the products, and outcomes of user operations.
  • Minimal manpower impacts through extensive use of automation.
  • Metrics delivered in near real time and in formats that allow CNMOC managers to use the metrics effectively in their decision making.
  • Includes an operations analysis modeling capability to simulate the operational impacts of products. Model metrics complement those derived from data. Modeling simulates events that are difficult to represent with actual data (e.g., rare or difficult-to-observe events) and allows experimentation with different scenarios (e.g., different levels of product accuracy, different CONOPS for METOC support). A minimal sketch of the collect-verify-compute loop follows below.
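As a concrete illustration of the automated collection-and-verification step described above, here is a minimal Python sketch. The record fields, the match-by-ID scheme, and mean absolute error as the performance metric are illustrative assumptions, not a specified design.

```python
from dataclasses import dataclass
from statistics import mean
from typing import Optional

@dataclass
class ProductRecord:
    """One issued METOC product, paired with its verifying observation."""
    product_id: str
    predicted_value: float                   # e.g., a predicted sonic layer depth (ft)
    observed_value: Optional[float] = None   # filled in when verification arrives

def attach_verification(records, observations):
    """Automated step: match verifying observations to products by ID (assumed scheme)."""
    for rec in records:
        if rec.product_id in observations:
            rec.observed_value = observations[rec.product_id]

def mean_absolute_error(records):
    """Performance metric computed over all verified products."""
    errors = [abs(r.predicted_value - r.observed_value)
              for r in records if r.observed_value is not None]
    return mean(errors) if errors else None
```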

4
METOC Metrics and Battlespace on Demand Tiers
METOC metrics address all three tiers --- in
particular, the performance and operational
impacts of each of the three layers.
5
  • Key Questions
  • In what ways does the METOC community impact the ASW fight?
  • How can we measure those impacts?
  • How can we use those measures to improve our impacts?

6
  • Other Questions Metrics Can Help Answer
  • Where are the gaps in METOC support?
  • Are we good enough?
  • Is a product really worth generating?
  • Is there a more efficient way to produce our
    products?
  • What difference do we make to our customers?
  • How could we improve our impacts on customers?
  • How much confidence should we and our customers
    have in our products?

7
  • High Priority Metrics for ASW Directorate
  • Operational impact metrics: the main metric
  • Product performance metrics: a stepping stone to the main metric

8
Process for Developing METOC Metrics (or Other Products)
[Slide shows a flow diagram of the development process.]
9
Operational Modeling: Overall Intent
  • An operational model of ASW scenarios is a laboratory for experiments
  • Similar to the strike / NSW application of WIAT
  • Investigate the effect of METOC products and services across a wide range of ASW scenarios
  • Effect of increased accuracy
  • Effect of enhanced timeliness
  • Develop METOC support benchmarks to evaluate real-world performance and establish goals (a simulation sketch follows the matrix below)

[Matrix: METOC products A-D (rows) versus scenarios (columns): Pre-Hostilities Area Clearance, MPRA Area Search, SURTASS Cuing of MPA Pouncers, SLOC Transit Protection. Caption: Multiple combinations of METOC products and services support ASW.]
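To make the "laboratory for experiments" idea concrete, the following is a minimal Monte Carlo sketch of estimating a mission payoff (here, detection rate) as a function of METOC product accuracy. The scenario, the probabilities, and the function names are illustrative assumptions, not values from the study.

```python
import random

def simulate_search(product_accuracy, n_trials=10_000,
                    p_detect_good=0.6, p_detect_bad=0.25):
    """Notional model: when the METOC product correctly characterizes the
    environment, the searcher detects with p_detect_good; otherwise with
    p_detect_bad. Returns the detection rate over n_trials missions."""
    detections = 0
    for _ in range(n_trials):
        product_correct = random.random() < product_accuracy
        p_detect = p_detect_good if product_correct else p_detect_bad
        if random.random() < p_detect:
            detections += 1
    return detections / n_trials

# Payoff from increased product accuracy, across notional accuracy levels
for accuracy in (0.5, 0.7, 0.9):
    print(f"accuracy={accuracy:.1f} -> detection rate={simulate_search(accuracy):.3f}")
```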
10
Operational Modeling: Development
  • Early stages in model development
  • Identify the scope of the process to be modeled
  • Identify the METOC data flow / mission planning
    for the desired scope
  • Identify the end-user of the model and desired
    outputs
  • Later stages in model development
  • Develop a simulation of the METOC data flow and
    mission planning
  • Incorporate real-world metrics as available to
    improve model fidelity and accuracy
  • Develop output to describe the impact of improved
    METOC accuracy and/or timeliness

11
METOC Support Chain for ASW
12
METOC Support Chain for ASW: Small Scale Model
13
METOC Support Chain for ASW: Medium Scale Model
14
METOC Support Chain for ASW: Large Scale Model
15
Operational Modeling: Notional Modeling Output
  • For each combination of METOC support product and scenario, display the payoffs from increasing accuracy and/or timeliness, to:
  • Determine the level of METOC support that meets ASW requirements
  • Enable decision makers to balance the cost of product improvement against its benefit

[Matrix of resulting METOC metrics: METOC products A-D (rows) versus scenarios (columns): Pre-hostilities Area Clearance, MPRA Area Search, SURTASS Cuing of MPA Pouncers, SLOC Transit Protection. Cell legend: no ASW payoff from additional accuracy/timeliness; ASW performance is improved; UNSAT (no impact above historical).]
16
Operational Modeling: Notional Metrics Assessment, Example: SLD Prediction Accuracy
[Notional scatter plot of predicted versus actual sonic layer depth (SLD); the diagonal marks perfect prediction. Actual data come from metrics collection processes instituted for VS07; predicted data come from METOC products distributed by the RBC.]
17
Operational Modeling: Notional Metrics Assessment, Example: SLD Prediction Accuracy
[The same predicted-versus-actual plot with acceptable tolerance lines added (no significant impact to operations). Between the tolerance lines and the region of unacceptable performance, some impact to operations becomes a risk management decision. The performance thresholds are generated by modeling and simulation.]
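A minimal sketch of the accuracy assessment shown in the plot: compare predicted and observed SLD values and report mean absolute error plus the fraction of predictions inside the tolerance band. The 50 ft tolerance and all sample values are illustrative assumptions.

```python
def sld_accuracy(predicted, actual, tolerance_ft=50.0):
    """Mean absolute error of SLD predictions and the fraction that fall
    within the acceptable tolerance band (no significant operational impact)."""
    errors = [abs(p - a) for p, a in zip(predicted, actual)]
    mae = sum(errors) / len(errors)
    frac_within = sum(e <= tolerance_ft for e in errors) / len(errors)
    return mae, frac_within

# Illustrative values: predictions from RBC products, actuals from VS07 verification
predicted = [120.0, 200.0, 80.0, 150.0]
actual = [100.0, 230.0, 90.0, 140.0]
mae, frac = sld_accuracy(predicted, actual)
print(f"MAE = {mae:.1f} ft; {frac:.0%} within tolerance")
```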
18
Capstone Metrics: Errors in environmental depiction lead to errors in tactical decisions.
Question: What level of environmental sensing is needed to sufficiently enable decisions?
Slide provided by CDR Mike Angove, N84
19
Capstone Metrics: ROI for Environmental Knowledge Study, Notional Output Curve
[Notional output curve, e.g., for CZ or surface layer presence.]
Deliverable: the analysis will fit the POM-08 LBSFI/PR-09 altimeter investment to this curve.
Slide provided by CDR Mike Angove, N84
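A minimal sketch of fitting a notional ROI curve to investment-versus-payoff data. The saturating (diminishing-returns) functional form and all data points are illustrative assumptions; the actual study's curve form is not specified in this brief.

```python
import numpy as np
from scipy.optimize import curve_fit

def roi_curve(investment, plateau, rate):
    """Assumed diminishing-returns form: payoff rises with investment
    and levels off at a plateau."""
    return plateau * (1.0 - np.exp(-rate * investment))

# Illustrative (investment, operational payoff) pairs -- not real program data
x = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
y = np.array([0.00, 0.30, 0.50, 0.70, 0.80])
(plateau, rate), _ = curve_fit(roi_curve, x, y, p0=(1.0, 0.5))
print(f"fitted plateau={plateau:.2f}, rate={rate:.2f}")
```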
20
Metrics Steps
  1. Determine what we want to know and be able to do once we have a fully functioning metrics system.
  2. Determine what metrics we need in order to know and do these things.
  3. Determine what calculations need to be done in order to come up with the desired metrics.
  4. Determine what data needs to be collected in order to do the desired calculations (i.e., data analyses).
  5. Determine the process to use to collect and analyze the needed data.
  6. Implement the data collection and analysis process.
     a. If data can be collected, go to step 7.
     b. If data can't be collected, repeat steps 1-5 until you can.
  7. Use the metrics obtained from steps 1-6.
  8. Assess the results of steps 1-7.
  9. Make adjustments to steps 1-8.
  10. Repeat steps 1-9 until satisfied with the process and the outcomes from the process.

21
Metrics Steps and Symposium Two Tasks
(Metrics steps 1-10 as listed on the Metrics Steps slide.)
Steps completed in approximate form in Symposium One and in committee reports.
22
Metrics Steps and Symposium Two Tasks
(Metrics steps 1-10 as listed on the Metrics Steps slide.)
Task 1 for Symposium Two: Review and revise the results for these steps.
23
Metrics Steps and Symposium Two Tasks
(Metrics steps 1-10 as listed on the Metrics Steps slide.)
Task 2 for Symposium Two: Outline a plan for steps 1-6 for VS07.
24
Metrics Steps and Symposium Two Tasks
(Metrics steps 1-10 as listed on the Metrics Steps slide.)
Task 3 for Symposium Two: Outline a plan for steps 1-10 for the next several years.
25
  • Conceptual Helpers
  • Fence and Gates Analogy
  • Hierarchy of Metrics
  • Bricks and House Analogy

26
Fence and Gates Analogy: Overall Concept
[Diagram: immediate goals sit inside the fence, with gates leading to future capabilities at priorities 1, 2, and 3.]
27
Fence and Gates Analogy: An Example
[Diagram: immediate goals inside the fence:]
  • VS07 capability and methods experiment
  • RBC data collection system
  • NOAT data collection system
[Gates to future capabilities: real-time metrics display (pri 1); MPRA data collection system (pri 2); exercise-level data collection (pri 3).]
28
  • Hierarchy of Metrics
  • Metrics are most useful when they provide information to multiple levels of the organization:
  • Individual forecaster
  • SGOT/OA Chief/Officer
  • METOC activity CO/XO
  • Directorate
  • CNMOC
  • Fact-based metrics are best developed from data from the lowest levels of the organization
  • It is critical to collect data on the smallest unit of support (e.g., a forecast or recommendation)
  • Higher level metrics (directorate, CNMOC) rely on lower level data collection and metrics
  • Operational modeling is enhanced by quality real-world information (e.g., significant numbers of mission data records); a roll-up sketch follows this list
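A minimal sketch of the bottom-up roll-up described above: verified forecast-level records are aggregated into metrics at successively higher organizational levels. Field names, the grouping keys, and all values are illustrative assumptions.

```python
from collections import defaultdict

def roll_up(records, level_key):
    """Aggregate smallest-unit records (one verified forecast each) into a
    mean-absolute-error metric per group at the requested level."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[level_key]].append(abs(rec["predicted"] - rec["observed"]))
    return {group: sum(errs) / len(errs) for group, errs in groups.items()}

records = [
    {"noat": "NOAT-1", "directorate": "ASW", "predicted": 120, "observed": 100},
    {"noat": "NOAT-1", "directorate": "ASW", "predicted": 150, "observed": 145},
    {"noat": "NOAT-2", "directorate": "ASW", "predicted": 90, "observed": 130},
]
print(roll_up(records, "noat"))          # lower-level metrics
print(roll_up(records, "directorate"))   # higher-level metric from the same data
```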

29
Hierarchy of Metrics
[Diagram: the symposium focus space within a space of metrics spanned by level (lower level, e.g., a NOAT's SLD accuracy, to higher level, e.g., Navy-wide SLD accuracy), by spatial and/or temporal scale (smaller, e.g., a point forecast location, to larger, e.g., an exercise forecast location), and by type, from performance (temperature and salinity accuracy) to impacts (number of positively identified submarines).]
30
Hierarchy of Metrics
[The same diagram, with organizational levels overlaid from bottom to top: individual forecast metrics, NOAT metrics, exercise metrics, NOAC metrics, directorate metrics, CNMOC/fleet metrics.]
Bottom-up approach to developing higher level, larger scale metrics: bottom metrics support the development of fact-based top metrics.
31
Hierarchy of Metrics
[The same diagram.]
Lower-level metrics can be input into operational models, which can provide higher-level metrics (e.g., a mission model).
32
Hierarchy of Metrics
[The same diagram.]
Lower-level metrics can be input into operational models, which can provide higher-level metrics (e.g., a mission model).
Top-down approach: higher level, larger scale metrics can also provide useful feedback for improving lower level, smaller scale metrics.
33
Bricks and House Analogy
Impact on METOC Customers (higher-level metrics)
  • Each brick represents a different warfare support area or a subset of an area (e.g., MPRA, NOAT, RBC)
  • It takes many records to make good high-level metrics
  • Each record must be well constructed to make quality high-level metrics
  • Support Unit Record (one brick; a sketch follows below):
  • Forecast data
  • Verification data
  • Customer plans
  • Customer outcomes
  • Recommendations
  • Other data
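A minimal sketch of one "brick", i.e., a single Support Unit Record, using the fields listed above. The class name and field types are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SupportUnitRecord:
    """One brick: the smallest unit of METOC support, well constructed so
    that many such records can build quality high-level metrics."""
    forecast_data: dict                  # values from the issued product
    verification_data: Optional[dict]    # verifying observations, when available
    customer_plans: str                  # what the customer intended to do
    customer_outcomes: str               # what actually happened
    recommendations: List[str] = field(default_factory=list)
    other_data: dict = field(default_factory=dict)
```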

34
Where Does the Effort Belong: Early Stages
[Diagram: data analysis effort apportioned among capacity, capability, algorithms, display, archival, and modeling.]
35
Where Does the Effort Belong: Later Stages
[Diagram: data analysis effort apportioned among capacity, capability, algorithms, display, archival, and modeling.]