Transcript and Presenter's Notes

Title: SOFTWARE METRICS


1
SOFTWARE METRICS
  • MN3309 Session 6
  • Prof. Mark Nissen

2
Agenda
  • Web Software Example
  • Mythical Man-Month
  • Software Metrics
  • Metrics Exercises
  • Summary

3
Web Software Example
  • Web site has many 'pages' (IOC)
  • New addition (FOC)
  • Create using word processor, ftp to site
  • 10 pages (static) - 10 IF, 10 EQ
  • 1 image per page (x10) - 10 IF
  • 10 links to external sites per page (x10) - 100 EQ
  • 10 links from IOC app (x10) - 0 EIF
  • Consistency
  • Important to be consistent in counting
  • Ensure common interpretation/measure (counts tallied in the sketch below)
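A minimal sketch of the tally above, assuming standard IFPUG low-complexity weights (IF = 7, EIF = 5, EQ = 3); the slides do not state which weight table was used, so the weights and the resulting total are illustrative only:

```python
# Hypothetical function-point tally for the web-site addition above.
# Weights are assumed IFPUG low-complexity values; the slides do not
# give the actual weight table used in class.
WEIGHTS = {"IF": 7, "EIF": 5, "EQ": 3}

counts = {
    "IF": 10 + 10,   # 10 static pages + 10 images
    "EQ": 10 + 100,  # 10 page inquiries + 10 external links per page x 10 pages
    "EIF": 0,        # links from the IOC app count as 0 EIF here
}

ufp = sum(WEIGHTS[t] * n for t, n in counts.items())
print(f"Unadjusted function points: {ufp}")  # 470 with these assumed weights
```

Counting consistently matters more than the particular weights: applying the same rules to the past project and to the new work is what makes the calibration on the next slides valid.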

4
Counting Example
[Counting worksheet, figure not transcribed; visible tallies include 10, 100, and 20, matching the counts on slide 3]
5
Counting Example
[figure not transcribed]
6
Calibration
  • Past project
  • Same environment, counting rules
  • 25 AFP/page
  • 4 SLOC/AFP
  • 320 hours for 15 new pages
  • Parametrics
  • 0.85 hours/AFP × (15 × 25)^1.0 (productivity? scale?)
  • Ec = A × (KSLOC)^1.0 → A = Ec/KSLOC = 213
  • Ep = 213 × 1.0 = 213 hours
  • Reasonable? Concerns? (arithmetic sketched below)
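The calibration arithmetic above, as a sketch; the linear 1.0 exponent is read off the slide, and the 1.0 KSLOC size for the new addition is an assumption implied by Ep = 213 hours:

```python
# Calibrate a linear parametric model Ec = A * KSLOC**1.0 from the past
# project, then project effort Ep for the new work (exponent 1.0 assumed).
PAGES, AFP_PER_PAGE, SLOC_PER_AFP = 15, 25, 4
HOURS_PAST = 320

afp = PAGES * AFP_PER_PAGE              # 375 AFP
ksloc_past = afp * SLOC_PER_AFP / 1000  # 1.5 KSLOC
hours_per_afp = HOURS_PAST / afp        # ~0.85 hours/AFP (productivity)
A = HOURS_PAST / ksloc_past ** 1.0      # ~213 hours/KSLOC

ksloc_new = 1.0                         # assumed size of the new addition
ep = A * ksloc_new ** 1.0               # ~213 hours projected
print(f"{hours_per_afp:.2f} h/AFP, A = {A:.0f} h/KSLOC, Ep = {ep:.0f} h")
```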

7
Mythical Man-Month
  • Programmers are optimists?
  • Man-months mythical?
  • 2 components?
  • 30%/year max manpower buildup
  • Rule of thumb (ROT) for scheduling (Brooks' split sketched below)
  • Coding = x?
  • Planning = y? Test & integration = z?
  • Gutless estimating?
  • Sharp milestones?
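The x/y/z prompts above refer to Brooks' published scheduling rule of thumb from The Mythical Man-Month (1/3 planning, 1/6 coding, 1/4 component test, 1/4 system test); the 12-month schedule below is an invented example:

```python
# Brooks' scheduling rule of thumb (The Mythical Man-Month):
# 1/3 planning, 1/6 coding, 1/4 component test, 1/4 system test.
from fractions import Fraction

ROT = {
    "planning": Fraction(1, 3),
    "coding": Fraction(1, 6),
    "component test": Fraction(1, 4),
    "system test": Fraction(1, 4),
}
assert sum(ROT.values()) == 1  # the shares cover the whole schedule

total_months = 12  # hypothetical schedule length
for phase, share in ROT.items():
    print(f"{phase:>15}: {float(share * total_months):.1f} months")
```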

8
Estimate Bootstrapping
  • Perform a sample of work, record time
  • Time by task types (plan, meet, code, etc.)
  • Use to make calibrated estimates
  • 1 new Web page
  • 2 hours to create (multiply by 6?)
  • How long to plan? Test? Integrate?
  • Time associated with old Web pages?
  • Time not associated with Web pages?
  • Triangulate with other methods (sample-based sketch below)
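One way to code the bootstrapping above; the task times are invented, chosen so that the full per-page cost works out to 6x the 2-hour creation sample, matching the "multiply by 6?" hint:

```python
# Bootstrap an estimate from a timed work sample (all times hypothetical).
# Record hours by task type for one new page, then scale to the whole job.
sample_hours = {"plan": 3.0, "meet": 2.0, "code": 2.0,
                "test": 3.0, "integrate": 2.0}

per_page = sum(sample_hours.values())  # 12 h/page = 6 x the 2 h coding sample
overhead = 4.0                         # hours not tied to any page (assumed)
new_pages = 10

estimate = per_page * new_pages + overhead
print(f"{per_page:.1f} h/page -> {estimate:.1f} h for {new_pages} pages")
# Triangulate: compare against the parametric Ep (~213 h) from slide 6.
```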

9
Software Measurement
  • Measurement Motivation
  • Measurement Life Cycle
  • Metrics Usage Groundrules
  • Examples of Metrics
  • Cautions About Metrics

10
Measurement Motivation
  • Key to process improvement
  • "Can't manage what you can't measure"
  • Monitor risk areas, before crisis
  • Basis for rewards & incentives
  • Key: how to measure tech progress?
  • C-SAWS?

11
Measurement Life Cycle
[figure not transcribed]
12
Metrics Usage Groundrules
  • Metrics must be
  • Understandable?
  • Economical?
  • Field tested
  • Highly-leveraged
  • Timely
  • Improvement-oriented
  • Applied to all life cycle phases
  • Useful at multiple levels
  • Metrics related to estimates?

13
Typical Software Metrics
  • Quality - user satisfaction, Rome Labs
  • Size - SLOC, function/feature points
  • Complexity - McCabe, Halstead
  • Requirements - Stability, traceable
  • Effort & productivity - MM/SLOC or FP
  • Cost & schedule - $/MM, mo, phased
  • Scrap & rework - defect/correction rates
  • Support - track characteristics (size); several of these are computed in the sketch below
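To make the units concrete, here is a sketch computing several of the metrics listed above from invented project numbers (nothing here comes from a real program):

```python
# Illustrative computation of some of the listed metrics (inputs invented).
sloc, function_points = 15_000, 130
man_months, cost = 24, 480_000
defects_found, defects_fixed = 85, 80

productivity = sloc / man_months                # SLOC per man-month
fp_productivity = function_points / man_months  # FP per man-month
cost_per_mm = cost / man_months                 # $ per man-month
defect_density = defects_found / (sloc / 1000)  # defects per KSLOC
correction_rate = defects_fixed / defects_found

print(f"{productivity:.0f} SLOC/MM, {fp_productivity:.1f} FP/MM, "
      f"${cost_per_mm:,.0f}/MM, {defect_density:.1f} defects/KSLOC, "
      f"{correction_rate:.0%} of defects corrected")
```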

14
Army STEP Metrics
  • Schedule & cost
  • Computer resource utilization
  • Contractor SEE rating
  • Design & requirements stability
  • Fault profile
  • Complexity
  • Breadth & depth of testing
  • Reliability

15
Army STEP Metrics
[figure not transcribed]
16
F-22 Time-Phased Metrics
[figure not transcribed]
17
Program Feedback Control
[figure not transcribed]
18
Cautions About Metrics
  • Use as indicators, not absolutes
  • Only as good as underlying data
  • CDRL items & program tracking
  • Do not measure everything
  • Some metrics universal
  • Many program-idiosyncratic
  • Evolve with program
  • Use multiple metrics, track estimates
  • Tie metrics to risk areas & problems

19
Change Point Tracking
[figure not transcribed]
20
ERROR SCRs - DISCOVERY & CORRECTION
[Chart: cumulative SCR discovery vs. correction, current status; figure not transcribed]
A key positive indicator will be when the discovery line flattens out as the rate of new errors encountered approaches zero. The correction rate for Priority 1 & 2 SCRs continues to keep pace with the discovery rate.
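The flattening test described on this chart reduces to simple arithmetic over the cumulative counts; the weekly series and the thresholds below are invented for illustration:

```python
# Track cumulative SCR discovery vs. correction (weekly counts invented).
discovered = [10, 25, 45, 60, 70, 75, 77, 78]  # cumulative discoveries
corrected  = [ 5, 18, 35, 52, 64, 71, 75, 77]  # cumulative corrections

# Discovery "flattens" when the week-over-week rate approaches zero.
rates = [b - a for a, b in zip(discovered, discovered[1:])]
flattening = rates[-1] <= 2                    # threshold is an assumption
keeping_pace = discovered[-1] - corrected[-1] <= 5

print(f"latest discovery rate: {rates[-1]}/week, flattening: {flattening}")
print(f"open SCRs: {discovered[-1] - corrected[-1]}, "
      f"correction keeping pace: {keeping_pace}")
```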
21
Program Stretch-out Effects 1
Baseline Program
[Staffing-profile chart, not transcribed; axes P (person-months, P-Mo) vs. Mo (months); baseline (B/L) = 140 P-Mo, 5 Mo; phase labels C, T, D, A, I]
22
Program Stretch-out Effects 2
Baseline Program
[Chart, not transcribed; B/L plan = 140 P-Mo, 5 Mo; stretch-out plan (S-P) = 140 P-Mo, 7 Mo]
23
Program Stretch-out Effects 3
Baseline Program
[Chart, not transcribed; B/L = 140 P-Mo, 5 Mo; stretch-out plan (S-P) = 140 P-Mo, 7 Mo; stretch-out actual (S-A) = 215 P-Mo, 8 Mo; core staff level marked]
24
Single vs. Incremental Builds
[Chart comparing staffing profiles, not transcribed; single build (SB) = 140 P-Mo, 5 Mo; incremental build (IB) = 140 P-Mo, 7 Mo; phase labels C, C2, T, D, A, I]
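The legend numbers on these four charts can be read as average staffing levels (person-months divided by months); this sketch shows only that arithmetic, not the phase-by-phase staffing curves drawn on the slides:

```python
# Average staffing implied by each program profile (P-Mo / Mo),
# using the legend values from slides 21-24.
profiles = {
    "baseline (B/L)":           (140, 5),
    "stretch-out plan (S-P)":   (140, 7),
    "stretch-out actual (S-A)": (215, 8),
    "single build (SB)":        (140, 5),
    "incremental build (IB)":   (140, 7),
}
for name, (pmo, months) in profiles.items():
    print(f"{name:>26}: {pmo} P-Mo / {months} Mo = {pmo / months:.1f} avg staff")
```

Note the lesson the legends carry: the stretch-out plan held effort constant at 140 P-Mo while adding two months, but the actual came in at 215 P-Mo over eight months.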
25
Summary
  • Measurement is important
  • Key to performance improvement
  • ID problems in advance of crisis
  • Metrics must be tailored
  • Some metrics universal
  • Many program-idiosyncratic
  • Evolve with program