Process Metrics EEE493 2000
1
Process Metrics EEE493 2000
Royal Military College of Canada, Electrical and
Computer Engineering
  • Major Greg Phillips
  • greg.phillips@rmc.ca
  • 1-613-541-6000 ext. 6190
  • Dr. Scott Knight
  • knight-s@rmc.ca
  • 1-613-541-6000 ext. 6190
2
Refs
  • Judy Clapp, "Getting Started on Software Metrics,"
    IEEE Software, Jan. 1993

3
Teaching Points
  • The Mitre Software Management Metrics
  • Principles behind the metrics
  • The metrics set

4
Review
  • What are some size metrics we have looked at
    already?
  • How did we measure productivity (this is a
    process metric)?
  • What were some problems with the productivity
    metric?

5
Motivation
  • In the USAF in the late 1980s, 7 out of every 10
    electronic systems had problems related to
    software development
  • USAF leadership asked for a small set of data
    that would tell whether a program was healthy or
    heading for trouble
  • In response, Mitre Corp. produced the Software
    Management Metrics

6
Six Principles (mainly just good engineering
principles)
  • A successful software-development project is one
    that meets its cost, schedule and quality goals
  • Development plans should set quantitative goals
    so that you can tell if you are meeting them
  • Plans should be compared with actual performance
    throughout development to detect potential
    problems early

7
Six Principles
  • Data trends over time are often better indicators
    of potential problems than actual values, because
    they can show whether deviations from the plans
    are temporary, fluctuating, growing, or diminishing
  • There are many explanations, good and bad, for
    the same set of data; metrics indicate not
    problems, but data values that should be
    investigated
  • How metrics are presented can obscure or clarify
    their message

8
Metrics Set
  • Size
  • The planned size and the current estimated size
    or actual size, measured in lines of code. Used
    to measure total effort and schedule, and to
    measure productivity
  • Personnel
  • The number of staff members planned and the
    actual number currently doing development and
    documentation. Used to measure productivity and
    predict schedule delays or cost overruns due to
    understaffing or overstaffing
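
As a minimal sketch (in Python, with made-up numbers, since the slides give
none), the size and personnel metrics above reduce to a few arithmetic checks
against the plan:

    # Hypothetical project snapshot; all figures are illustrative.
    planned_kloc, actual_kloc = 50.0, 62.0    # size in thousands of lines of code
    planned_staff, actual_staff = 12, 9       # developers planned vs. actually on board
    effort_person_months = 40                 # development effort expended so far

    productivity = actual_kloc * 1000 / effort_person_months   # LOC per person-month
    size_growth = (actual_kloc - planned_kloc) / planned_kloc  # growth hints at scope creep
    staffing_gap = actual_staff - planned_staff                # shortfall suggests schedule risk

    print(f"Productivity: {productivity:.0f} LOC/person-month")  # -> 1550
    print(f"Size growth: {size_growth:+.0%}")                    # -> +24%
    print(f"Staffing gap: {staffing_gap:+d} developers")         # -> -3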

9
Metrics Set
  • Computer Use
  • The estimated and actual percentage of the target
    system's hardware capacity (CPU, storage, and
    communications) being used.
  • Unit Progress
  • The planned versus actual units designed, tested,
    and integrated. Used to measure work planned and
    done in terms of products that complete some
    phase of development.
  • Schedule Progress
  • The ratio of total schedule elapsed to the ratio
    of the budget spent (see the sketch below)
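
One plausible reading of this ratio, sketched with invented figures (the
slide does not define the terms precisely):

    # Illustrative: compare the fraction of schedule elapsed to the
    # fraction of budget spent.
    schedule_elapsed = 10 / 24   # months elapsed out of months planned
    budget_spent = 0.55          # fraction of the budget consumed so far

    schedule_progress = schedule_elapsed / budget_spent
    # Well below 1.0: money is being spent faster than calendar time is
    # passing, an early warning of cost overrun or understated scope.
    print(f"Schedule progress ratio: {schedule_progress:.2f}")  # -> 0.76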

10
Metrics Set
  • Volatility
  • The number of specified functional requirements
    and unresolved requirements issues. Used as an
    indication of unstable requirements that may
    cause rework.
  • Design Complexity
  • The design's complexity, as measured using
    McCabe's method (see the sketch after this list).
    Used to determine parts of the design that may be
    error prone and hard to maintain.
  • Requirements and Design Progress
  • The number of requirements that have been
    scheduled to be documented and actually documented
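
The slide does not show McCabe's method itself; as a reminder, cyclomatic
complexity is V(G) = E - N + 2P over the control-flow graph. A minimal sketch
with a hypothetical module:

    # McCabe's cyclomatic complexity: V(G) = E - N + 2P, where E is the number
    # of edges and N the number of nodes in the control-flow graph, and P the
    # number of connected components (1 for a single routine).
    def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
        return edges - nodes + 2 * components

    # Hypothetical module whose control-flow graph has 11 edges and 9 nodes.
    print(cyclomatic_complexity(edges=11, nodes=9))  # -> 4
    # Values above roughly 10 are conventionally flagged as error prone.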

11
Metrics Set
  • Testing Progress
  • The planned and actual configuration-item and
    system tests completed, the number of new problem
    reports, and open or unresolved problem reports.
    Used to measure progress in completing testing,
    the number of potential defects found, and
    efficiency in resolving them. Can be used to
    estimate quality and the time needed to complete
    tests. (See the sketch after this list.)
  • Incremental Release Content
  • The estimated and actual release date and the
    estimated and actual components in each release.
    Used to measure progress and the requirements
    changes made to meet schedule.
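
A minimal sketch of the testing-progress bookkeeping, using invented weekly
figures:

    # Hypothetical weekly test status; all figures are illustrative.
    tests_planned, tests_completed = 120, 84
    reports_opened, reports_closed = 57, 41

    test_progress = tests_completed / tests_planned   # fraction of planned tests run
    open_reports = reports_opened - reports_closed    # unresolved problem reports
    fix_efficiency = reports_closed / reports_opened  # resolution rate so far

    print(f"Test progress: {test_progress:.0%}")    # -> 70%
    print(f"Open reports: {open_reports}")          # -> 16
    print(f"Fix efficiency: {fix_efficiency:.0%}")  # -> 72%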

12
Use of Metrics: only use the ones you need
  • The most frequently used metrics are:
  • Size
  • Personnel
  • Computer Use
  • Unit Progress
  • Testing Progress

13
Use of Metrics: you must examine trends
  • Periodic comparisons of planned values and the
    later estimated/actual values can show deviations
    that indicate problems
  • Trends: how the deviations change over time
    (sketched below)

(Chart: planned values vs. actuals plotted over time)
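
A minimal sketch of this periodic plan-versus-actual comparison, using
invented monthly counts:

    # Illustrative: track how the plan/actual deviation evolves each period.
    plan   = [10, 20, 30, 40, 50]   # planned units complete, cumulative
    actual = [ 9, 17, 24, 30, 35]   # actual units complete, cumulative

    deviations = [a - p for p, a in zip(plan, actual)]
    print(deviations)  # -> [-1, -3, -6, -10, -15]
    # The gap widens every period: the slippage is a growing trend,
    # not a temporary fluctuation.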
14
Use of Metrics: you must cross-correlate metrics
  • One metric alone seldom gives you enough
    information to understand a trend
  • Use cross-correlation to help understand why a
    trend is occurring
  • Examples (the first is sketched after this list):
  • low number of problem reports and testing
    progress
  • schedule progress and release content
  • unit progress and schedule progress (burn rate)
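
A minimal sketch of the first example, with hypothetical thresholds:

    # Illustrative: a low problem-report count is only good news if testing
    # progress is actually high; cross-correlate before drawing conclusions.
    test_progress = 0.25       # fraction of planned tests completed
    new_reports_per_week = 2   # new problem reports opened this week

    if new_reports_per_week < 5 and test_progress < 0.50:
        print("Low report count likely reflects little testing, not high quality.")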

15
Next Class: Process Metrics Analysis