Navy ERM Sustainment Strategy: METRICS. Charley Barth, Director of Navy Records, (202) 433-2434; Matthew Staden, Navy Records Manager, (202) 433-4217.




Title: Navy Records Management Metrics. Author: James B Jordan. Last modified by: Owen Ambur. Created: 8/1/2006 2:22:40 PM.

Slides: 13


Transcript and Presenter's Notes


Navy ERM Sustainment Strategy: METRICS
Charley Barth, Director of Navy Records, (202) 433-2434
Matthew Staden, Navy Records Manager, (202) 433-4217
What is a Metric?
  • A calculated term or enumeration representing
    some aspect of a biological assemblage, function,
    or other measurable aspect; a characteristic of
    the biological data that changes in some
    predictable way with increased human influence.
    A multimetric approach involves combinations of
    metrics to provide an assessment of the status
    of ...
  • A random variable x representing a quantitative
    measure accumulated over a period.
    (www.tsl.state.t)

Again, What is a Metric?
  • The standard of measurement of a contract
    requirement to which a quality standard can be
    applied. For example, if there is a contract
    requirement to maintain an accurate inventory of
    widgets, the metric is the number of widgets
    accounted for. The accuracy of the widget
    inventory can be compared to a standard to
    determine the quality of the inventory. A unique
    identifier of performance.

Example of NARA ERA Metrics
Key Elements of Metrics
  • Accurate: Accuracy is self-evident but,
    unfortunately, is often in the eye of the
    beholder. It is critical to gain consensus on
    the accuracy of a metric so that once it is in
    place, everyone agrees on its value, authority,
    and relevance.
  • Relevant: A key attribute that links the metric
    with a collection of relevant information that
    may be unique to the user.
  • Transparent: There is no mystery about how the
    metric is computed, what sources are used, how
    often measurement is performed, who uses it, and
    how it is used.
  • Tied to a business need: Don't just collect
    data for the fun of it.

Metrics Development
  • Define Strategic Intent (ERMS Strategy)
    • Records Management
    • Business Process Management
    • Information Visibility
    • Legal Compliance
    • Training/Marketing/Awareness
  • Sources of Data for Metrics
    • Interviews with DRMs and Admins (completing first ...)
      • Subjective (anecdotes)
      • Refining Pulse Points
    • Data pulls from TRIM and from EDS (working with Tower and EDS)
      • Objective (raw data from systems)
      • Investigating what we can access from datasets and NMCI
  • Identify key data points
  • Develop a method to capture trends and not just snapshots (data mining)
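The last point, capturing trends rather than snapshots, can be sketched as follows. All names and figures here are hypothetical, not taken from the presentation:

```python
# Sketch of "trends, not snapshots": keep every periodic measurement
# with its date, so change over time can be computed, instead of
# overwriting a single current value.
from datetime import date

snapshots = []  # (as_of_date, records_in_trim) pairs

def record_snapshot(as_of: date, records_in_trim: int) -> None:
    """Store a dated measurement rather than replacing the last one."""
    snapshots.append((as_of, records_in_trim))

def trend(points):
    """Return period-over-period deltas for a series of dated snapshots."""
    ordered = sorted(points)
    return [(b[0], b[1] - a[1]) for a, b in zip(ordered, ordered[1:])]

# Hypothetical monthly counts of records in TRIM:
record_snapshot(date(2006, 6, 1), 12000)
record_snapshot(date(2006, 7, 1), 12500)
record_snapshot(date(2006, 8, 1), 13200)
print(trend(snapshots))
```

A one-time data pull would yield only the latest count; retaining the dated series is what makes the month-over-month deltas (and any further data mining) possible.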

Interview/Visit Reports (Metrics Collection)
  • The interviewer shall periodically conduct an
    interview/visit with the DRM, using a series of
    topics and questions as a guide.
  • This interview gives the DRM the opportunity to
    request assistance or ask questions regarding
    ERMS issues they may be experiencing.
  • The interviewer will address and document
    specific issues regarding dataset operations, as
    well as collect metric points that will help in
    making overall DON ERMS sustainability decisions.
  • The results of this interview will be forwarded
    to OPNAV DNS-5 in a timely manner and captured
    in a knowledge base, both for the specific
    dataset deployment and for the overall DON.
  • Benefits:
    • History and visibility of the health of the dataset
    • Data points for enterprise ERM decisions

Data Collection
  • Determine what can be captured directly from
    TRIM via saved searches and statistics within TRIM.
  • Determine related data pulls from NMCI that will
    assist in developing useful data points.
  • Develop working database(s) (in MS Access) to
    collect/store trend-related historical data for
    trend analysis and data mining.
    • Identify appropriate data points/fields
    • Working with Tower and EDS to determine the
      scope of data available
    • Working with sustainment agents/team to
      determine the methodology/governance to
      maintain the data (issue for DRM discussion)
  • Collect existing data/information:
    • Business cases
    • Tips
    • Lessons Learned
    • Snapshots from WSRs
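As a rough illustration of the working-database idea, the sketch below uses Python's built-in sqlite3 in place of MS Access (the slides specify Access; sqlite3 is only a stand-in here), with assumed table and column names:

```python
# Illustrative store for dated pulse-point values, supporting trend
# analysis later. Table and column names are assumptions, not from
# the presentation; sqlite3 stands in for the MS Access database.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE pulse_points (
        dataset TEXT,    -- which dataset/command the value came from
        as_of   TEXT,    -- ISO date of the interview or data pull
        point   TEXT,    -- e.g. 'new_records', 'users_trained'
        value   INTEGER
    )
""")
rows = [
    ("OPNAV", "2006-07-01", "new_records", 450),
    ("OPNAV", "2006-08-01", "new_records", 610),
]
conn.executemany("INSERT INTO pulse_points VALUES (?, ?, ?, ?)", rows)

# Keeping one row per period (instead of one overwritten value)
# is what makes historical trend queries possible:
history = conn.execute(
    "SELECT as_of, value FROM pulse_points "
    "WHERE dataset = ? AND point = ? ORDER BY as_of",
    ("OPNAV", "new_records"),
).fetchall()
print(history)
```

One narrow table keyed by dataset, date, and pulse-point name can absorb new pulse points as they mature and evolve, without schema changes for each new metric.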

Interview Pulse Points
  • Initial set of Pulse Points (expected to mature
    and evolve):
  • Policy: Have business rules and policy been
    established regarding electronic records
    management? What are you putting into TRIM?
  • DRM Reach-back: Is the DRM aware of and involved
    in DRM Community activities? Best practices?
    Lessons Learned?
  • Migration: Have documents/records been migrated
    to TRIM? What is the relationship between the
    share drive and TRIM? Between the portal and TRIM?
  • Storage: What is the status of the CLIN16AA
    ordering process?
  • Records: New records in TRIM? Total records in
    TRIM?
  • Training: Do you have a local Training Plan for
    Records Management/TRIM Context? Number of new
    users trained? Number of command members who
    have taken the RM CBTs?
  • Locations: New locations in TRIM? Total
    locations in TRIM?

Metrics Key Performance Indicators
  • Proposed Metric Categories:
  • Records Management
    • Retention schedules, disposition execution, and ...
    • Content stored in TRIM (email, logs, command ...)
    • Storage profile (TRIM vs. share drive vs.
      paper records at FRC)
    • Locations/records distribution (internal/external)
  • Business Process Management/Improvement
    • Number of process areas migrated to TRIM
    • Workflows/actions created
    • Process improvement metrics (time, number of
      steps, cost)
    • Information access time (knowledge work to
      find the right information)

Metrics Key Performance Indicators
  • Information Visibility
    • Corporate memory utilization (using the IC
      stored in TRIM)
    • SME/author identification (collaborative work
      from TRIM)
    • Human Capital Mgmt (users using TRIM records
      as turnover history files)
    • # of external locations
  • Legal Compliance
    • Case preparation time
    • What is captured as a record (official
      correspondence to emails)?

Metrics Key Performance Indicators
  • Training/Marketing/Awareness
    • # of trained TRIM Administrators
    • # of users trained (CBT, command training)
    • Locations vs. training audit?
    • Training plan in place?
    • DRM/Admin rotation/PRD (military DRM/RM
      transfer every 3 years)
    • Command Campaign Plan?
    • RM command awareness (Plan of the Week/New ...)
    • DRM active on the Navy Knowledge Online
      collaboration site
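One of the training indicators above, the percentage of users trained, is simple to compute. The counts below are made up for illustration; they do not come from the presentation:

```python
# Illustrative computation of one training KPI from the slide:
# percentage of command users trained, from assumed counts.

def pct_trained(users_trained: int, total_users: int) -> float:
    """Return the percentage of users trained; 0.0 if no users."""
    if total_users == 0:
        return 0.0
    return 100.0 * users_trained / total_users

# e.g. 180 of 240 command members have taken the RM CBTs:
print(pct_trained(180, 240))  # 75.0
```

Tracking this percentage per command and per period (rather than a raw count) makes the locations-vs.-training audit comparable across commands of different sizes.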