The Software Project Manager's Key To Project Visibility - PowerPoint PPT Presentation
1
The Software Project Manager's Key To Project Visibility
Software Project Tracking and Oversight (Metrics)
2
Objectives
  • Learn what metrics are and why they are important
    to the success of a project
  • Become familiar with the Practical Software
    Measurement Guide (PSM) and PSM Process
  • Understand how to select and apply metrics on a
    project
  • Become familiar with the SEI's standard set of
    software metrics
  • Understand the basic principles to be followed in
    implementing a successful metrics program

3
Applying Software Project Tracking and Oversight (PTO)
Intent: Provide visibility into actual progress
and oversight to enable corrective action
Use SDP to track activities
Track actual size, cost, and effort against
estimates
Track actual progress against the schedule
Take corrective action as necessary
4
What Are Metrics?
Metric - a quantitative measurement (used for
tracking purposes) that provides feedback to
improve the process and productivity

Process Metrics - used to track and improve the development process
Project Metrics - used to track project progress
Product Metrics - used to track the quality of the product
5
Why Collect Metrics?
6
Practical Software Measurement Guide (PSM) V3.1
April 17, 1998  http://www.psmsc.com/
  • Produced by the Joint Logistics Commanders' Joint
    Group on Systems Engineering
  • PSM Principles
  • Program Issues and Objectives drive measurement
    requirements
  • Developer's process determines how the software is
    actually measured
  • Collect and analyze data at a level of detail
    sufficient to identify and isolate software
    development process problems
  • PSM is a PROCESS and is tied directly to a
    Project's Risk Management Process

7
PSM Process Diagram
[Diagram: the PSM process within the Software Program Team]
- Implement Measurement Program Process: Obtain Organizational
  Support; Define Measurement Responsibilities; Provide Measurement
  Resources; Initiate Measurement Process
- Tailor Measurement Process: Identify and Prioritize Project Issues;
  Select and Specify Project Measures; Integrate into the Software
  Process (inputs: issues, objectives, software process
  characteristics; output: Measurement Plan)
- Measurement Application Process: Collect Data; Analyze Issues;
  Make Decisions (inputs: measurement needs, data; output: actions)
8
Implement Measurement Program Process
9
Step 1 Obtain Organizational Support
  • Generate support for software measurement at all
    levels within the organization
  • - briefings
  • - training classes
  • - involve stakeholders in developing
    measurement program
  • Management-mandated measurement without
    organizational buy-in and multilevel support will
    seldom succeed
  • All levels of the organization need to understand
    how measurement will directly benefit their
    projects and their own work processes
  • Gaining support usually requires cultural change

10
Step 2 Define Measurement Responsibilities
Executive Manager - Establishes high-level performance objectives;
uses measurement results to make organizational and enterprise-level
decisions
Project Manager - Identifies and manages project issues; uses
measurement results to make program decisions
Measurement Analyst - Tailors measures to address program issues;
collects and analyzes measurement data and reports results
Development Team - Uses measurement results in software engineering
efforts; provides measurement data
11
Step 3 Provide Measurement Resources
- Provide funding for the measurement effort
- Provide measurement tools
Estimated Measurement Program Costs (Data Collection & Analysis)
- Cost of management usually estimated at 10-15% of the project's
  total cost
- Not an additional cost once management starts using data to
  assist decisions

Examples:
Source       Data Collected       Cost (%)
NASA/SEL     Research             8 to 25
NEC (Japan)  Quality Factors      10-20
STC (UK)     Program Management   3 to 5
NUWC         Program Management   2 to 3
NASA/SEL     Program Management   2.5
STRICOM      Program Management   2
AT&T         Program Management   1.5
Motorola     Program Management   1
12
Step 4 Initiate Measurement Process
  • Properly implemented measurement becomes part of
    the way an organization does business
  • The most effective measurement process is one
    that is used and understood at all levels within
    the organization
  • Measurement must reflect existing software
    acquisition and engineering capabilities

13
Measurement Tailoring Process
14
Step 1 Identify and Prioritize Project Issues
  • Issues - areas of concern that may impact the
    achievement of a project objective
  • Issues can be problems, risks, or lack of
    information
  • Issues should be defined at the onset of the
    project
  • - Issues can be identified by performing a risk
    assessment
  • (implement project risk management process
    and develop risk management plan) or by
    relying on past experience
  • Prioritize by ordering issues in terms of their
    expected impact and the probability of occurrence

15
Step 2 Select and Specify Measures
  • Map issues to the 6 common issues in the PSM
  • - Schedule and Progress
  • - Resources and Cost
  • - Growth and Stability
  • - Product Quality
  • - Development Performance
  • - Technical Adequacy
  • Map the common issues to measurement
    categories
  • Map measurement categories into measures
  • Specify data requirements
  • - determine level of detail of data to be
    collected

16
Common Issues - Measurement Categories
  • Schedule and Progress
  • Milestone Performance
  • Work Unit Progress
  • Incremental Capability
  • Growth and Stability
  • Product Size and Stability
  • Functional Size and Stability
  • Technical Adequacy
  • Technology Impacts
  • Target Computer Resource Utilization
  • Technical Performance
  • Resources and Cost
  • Personnel
  • Financial Performance
  • Environment Availability
  • Development Performance
  • Process Maturity
  • Productivity
  • Product Quality
  • Defects
  • Complexity
  • Rework

17
Software Issues - Categories - Measures Mapping
(excerpt from PSM Page 32)

Issue: Resources and Cost
  Category: Personnel - Measures: Effort; Staff Experience;
    Staff Turnover
  Category: Financial Performance - Measures: Earned Value; Cost
  Category: Environment Availability - Measures: Resource
    Availability Dates; Resource Utilization

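The excerpt above can be kept as a simple lookup table. A minimal sketch: the names come from the slides, but the data structure and the `measures_for_issue` helper are assumptions, not part of PSM itself.

```python
# Lookup table for the PSM issue -> category -> measure excerpt above.
# The structure is an illustrative assumption; names come from the slides.
PSM_MAPPING = {
    "Resources and Cost": {
        "Personnel": ["Effort", "Staff Experience", "Staff Turnover"],
        "Financial Performance": ["Earned Value", "Cost"],
        "Environment Availability": ["Resource Availability Dates",
                                     "Resource Utilization"],
    },
}

def measures_for_issue(issue):
    """Return every measure mapped to a common issue (empty if unmapped)."""
    return [m for measures in PSM_MAPPING.get(issue, {}).values()
            for m in measures]

assert "Earned Value" in measures_for_issue("Resources and Cost")
```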
18
Metrics Phase Coverage
[Chart omitted: coverage of the metrics across development phases]
19
Step 3 Integrate Measures into the Software
Process
  • Characterizing the software environment
  • - use the developer's current practices and existing
    data collection mechanisms
  • - do not force process changes on the developer
  • Identifying measurement opportunities
  • - potential candidates include all of the
    products and processes used by the developer
  • - take advantage of any measurement mechanisms
    already in place (e.g. financial reports,
    defect counts, SLOC counts)
  • Specifying implementation requirements
  • - describe how measurements will be collected
    and reported (can be documented in the SDP or a
    project measurement plan)

20
Suggested Project Measurement Plan Contents
  • Issues and measures (list of issues)
  • Data elements (structures, attributes, and data
    items for all measures)
  • Data definitions
  • Data sources
  • Level of measurement (level of detail of data
    collected)
  • Aggregation structure (structure by which data
    items will be combined)
  • Frequency of data collection
  • Methods of delivery (e.g. databases, electronic
    media)
  • Communication and interfaces (POCs for data
    sources, reports)
  • Frequency of analysis and reporting
  • Working document
  • Sample Measurement Plan included in the Software
    Project Tracking and Oversight Process
    (http://sepo.spawar.navy.mil/docs.html, under SOFTWARE
    PROJECT TRACKING AND OVERSIGHT)

21
Measurement Application Process
22
Step 1 Collect and Process Data
  • Access data
  • Data available from SDP, status reports, and
    engineering databases
  • Baseline the planned values and collect the actual
    data values (include all versions of the planning
    data: Plan 1, 2, 3, 4, ...)
  • Collect data at least monthly
  • Develop clear and concise definitions to guide
    data collection - know what the data means
  • Collect data at the appropriate level to localize
    problems
  • Verify Data
  • Question unusual trends and inconsistencies
  • Check for data ambiguities (numbers too large or
    too small)
  • Does the data look too regular?
  • Ensure the same units of measure (e.g. hrs, days,
    SLOC) are being used across all sources
  • Normalize data (if applicable)
  • Used when comparing or combining measurement data
    from different activities or from software
    components with different characteristics
  • Do not expect perfect data

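The "Verify Data" checks above can be partly automated. A minimal sketch, where the function name, the range limits, and the effort series are all illustrative assumptions:

```python
# Sketch of the "Verify Data" step: flag values outside an expected range
# and series that look suspiciously regular. Thresholds are assumptions.
def verify_series(name, values, lo, hi):
    """Return a list of warnings for one measurement series."""
    warnings = []
    for i, v in enumerate(values):
        if not (lo <= v <= hi):
            warnings.append(f"{name}[{i}] = {v}: outside expected range")
    # Data that never varies may be estimated rather than measured
    if len(values) > 3 and len(set(values)) == 1:
        warnings.append(f"{name}: values look too regular")
    return warnings

# Monthly effort data in labor-hours (range limits are illustrative)
print(verify_series("effort_hrs", [520, 560, 5400, 540], lo=0, hi=2000))
# -> ['effort_hrs[2] = 5400: outside expected range']
```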
23
Step 2 Analyze Issues (define and generate indicators)
  • Indicator A measure or combination of measures
    that provides insight into a software issue.
    Usually compares planned (baseline) versus actual
    values. Often displayed graphically.
  • Basic indicators produced regularly
  • Decomposition of indicators to localize problems
  • New indicators that respond to new questions
  • Sets of related indicators

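A basic planned-versus-actual indicator can be computed directly from the baselined and collected values. A sketch with illustrative size data (the function and numbers are assumptions, not from PSM):

```python
# Sketch of a basic indicator: percent deviation of actual from planned
# (baseline) values for one measure, per reporting period.
def indicator(planned, actual):
    """Percent deviation of actual from planned for each period."""
    return [round(100 * (a - p) / p, 1) for p, a in zip(planned, actual)]

planned_sloc = [10000, 20000, 30000]   # cumulative planned size by month
actual_sloc  = [9000, 21000, 36000]    # cumulative actual size by month
print(indicator(planned_sloc, actual_sloc))   # [-10.0, 5.0, 20.0]
```

Plotted over time, a series like this is the graphical plan-vs-actual comparison the slide describes.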
24
Step 3 Make Decisions
  • Report Results
  • The measurement results must be clearly
    understood by the decision-maker; include all
    alternatives
  • Measurement should be used for communication and
    understanding, not punishment!
  • Make results available throughout the
    organization
  • Select Alternative
  • Take action
  • Action must be taken to realize any benefit from
    measurement

25
Actions
  • Actions may include
  • - Extend schedule to maintain quality
  • - Add resources to stay on schedule
  • - Delete capabilities to control costs
  • - Change the process to improve performance
  • - Reallocate resources to support key
    activities
  • - Invoke Risk management contingency plan
  • Update software plans (SDP) to reflect actions
    taken
  • Effects of actions should be tracked

26
Core Program Metrics (as recommended by the SEI)

- Size (How large is the job? Does my process provide valid
  estimates?) - measured as actual vs. planned function points,
  SLOC, CSCIs, SUs, etc.
- Effort (Have we planned for sufficient resources?) - measured as
  actual vs. planned labor costs, labor hours, etc.
- Cost and Schedule (Are we progressing as planned? On budget? On
  schedule?) - measured as actual vs. planned cost to date, Gantt
  charts, milestones, reviews, deliverables, etc.
- Quality (How good is the product?) - measured as defects,
  changes, fixes, etc.

Source: Carleton, et al., "Software Management for DoD Systems:
Recommendations for Initial Implementation", SEI Technical Report,
Sept 1992
27
Quality
[Chart: quantity of discrepancies found, discrepancies fixed, and
open discrepancy reports, plotted over time]
28
Cost and Schedule Metrics (Earned Value example)
[Chart: cumulative cost over contract months 15-30, against the
contract budget base and the projected actual end cost; curves for
the actual cost of work performed, the planned cost of work
scheduled, and the planned cost of work performed; the cost variance
and schedule variance are read at the "now" line, with CDR and TRR
milestones marked]
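The two variances in the chart follow directly from the three quantities it plots. A sketch (the acronyms BCWS, BCWP, and ACWP are standard earned-value terms, not named on the slide; the dollar figures are illustrative):

```python
def earned_value_variances(bcws, bcwp, acwp):
    """Compute earned-value variances.

    bcws: planned cost of work scheduled (the baseline budget)
    bcwp: planned cost of work performed (the earned value)
    acwp: actual cost of work performed
    """
    cost_variance = bcwp - acwp        # negative -> over budget
    schedule_variance = bcwp - bcws    # negative -> behind schedule
    return cost_variance, schedule_variance

# Illustrative month: $120K earned, $100K scheduled, $150K spent
cv, sv = earned_value_variances(bcws=100, bcwp=120, acwp=150)
print(cv, sv)   # -30 20  (over budget, but ahead of schedule)
```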
29
Earned Value Interim Milestones Method
  • Work that spans more than 2 accounting months and
    cannot be broken down into similar units
  • Select milestones that represent justifiable
    divisions of work and weight them according to
    the amount of effort needed to complete the
    milestone
  • Example:
    Management Plan Rough Draft    50% earned value
    Review Complete                15% earned value
    Comments Incorporated          20% earned value
    Final Document Delivered       15% earned value
    Total                         100% earned value

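The interim-milestones method above amounts to summing the weights of the milestones completed so far. A sketch using the slide's example weights (the identifier names are illustrative):

```python
# Milestone weights from the slide's example, as percent of the task's value.
MILESTONES = {
    "rough_draft": 50,
    "review_complete": 15,
    "comments_incorporated": 20,
    "final_delivered": 15,
}

def earned_percent(completed):
    """Earned value (% of the task) for the completed milestones."""
    return sum(MILESTONES[m] for m in completed)

# After the draft and review are done, 65% of the task's value is earned
assert earned_percent(["rough_draft", "review_complete"]) == 65
```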
30

Guidance on Metrics
Set clearly defined goals (Goal-Question-Metri
c (GQM) paradigm) Focus on project issues (r
isk management) Begin by only collecting a few
key measurements Develop a Project Measureme
nt Plan Must have management commitment and
buy-in from project personnel -
training, briefs - allocate adequate resources
metrics analyst - set up a metrics working gro
up Provide timely feedback to project personn
el on metrics analysis Automate measurement c
ollection as much as possible - minimize impact
on project personnel Measurement results sh
ould remain anonymous - do not measure individu
als - all projects are different, do not compar
e - focus on the process and the product
31
Guidance on Metrics (continued)
  • Reward participation in the measurement program
  • Measurements that are not analyzed and acted upon
    are of no value
  • Measurement program must be institutionalized and
    repeatable
  • Measurement program should be part of an overall
    software process improvement effort
  • Understand that metrics are not an end in themselves
    - metrics cannot identify, explain, or predict
      everything
    - most issues/problems require more than one
      measurement to characterize and understand
    - metrics should augment, not replace, good
      management and engineering judgment
  • Encourage feedback from measurement program
    participants
  • Plan for changes to the measurement program and the
    types of measurements collected
32
Measurement Tools and References
  • PSM Insight (http://www.psmsc.com/insight.html)
  • SPMN's Software Project Control Panel
    (http://www.spmn.com/pcpanel.html)
  • MS Excel
  • MS Access
  • FileMakerPro
  • Project management tools (e.g. MS Project)
  • D87 Metrics Procedures
  • SEPO Weekly status reporting process
  • http://sepo.spawar.navy.mil/docs.html (under SEPO
    INTERNAL PROCESSES AND PROCEDURES)
  • NAVSSI status reporting procedures

33
SPMN Project Control Panel
34
What are the consequences of not
collecting metrics?
35
Web Resources for Metrics
  • Joint Logistics Commanders' Practical Software
    Measurement (PSM): http://www.psmsc.com
  • Department of the Army Pamphlet 73-7, Section 10:
    http://www.army.mil/swmetrics/ddown.htm
  • NASA Software Measurement Guide Book
    (NASA-GB-001-94): http://www.ivv.nasa.gov/SWG/resources/
  • NASA Software Measurement Guidebook
    (SEL-94-102): http://fdd.gsfc.nasa.gov/selprods.html
  • Software Project Control Panel:
    http://www.spmn.com/pcpanel.html

36
Getting Started!
  • Read the Joint Logistics Commanders' Practical
    Software Measurement
  • Select a few key metrics; start with simple ones
  • Develop written organizational policy and procedures
    for collecting and databasing process, product, and
    project metrics
  • Establish responsibility for collecting and
    analyzing metrics data
  • Modify the contractor's monthly status report to
    include end-product completion percentage
  • Condition the staff to use weekly status reports
    focusing on work accomplishments
  • Acquire the tools (e.g., MS Project, MS Excel, PSM
    Insight, Software Project Control Panel) to support
    metric collection and reporting
  • Plan to review your program to add or replace
    metrics to continually improve your metrics process

Please fill out your evaluation form for this section
37
References
- Grady, Robert B., and Caswell, Deborah L., Software Metrics,
  Prentice-Hall, 1987
- Practical Software Measurement, Joint Logistics Commanders,
  Version 3.1, 17 April 1998
- Practical Software Measurement - PSM Overview Seminar, Fall 1997
- NAWC China Lake Software Project Management Course, Dec 1996
- Pressman, Roger, Software Engineering - A Practitioner's Approach
  (3rd Edition), McGraw-Hill, Inc., 1992
- Charette, Robert N., Software Engineering Risk Analysis and
  Management, McGraw-Hill, 1989
- Whitten, Neal, Managing Software Development Projects, John
  Wiley & Sons, Inc., 1990
- Software Management Indicators, Air Force Systems Command, AFSC
  Pamphlet 800-43, 31 Jan 1986
- Acquisition Management - Software Risk Abatement, Air Force
  Systems Command, AFSC Pamphlet 800-45, 30 Sep 1988
- Procedures Guide for Software Management Indicators, Software
  Quality Improvement (SQI) Program, NAVSEA Code 06D2Q,
  15 August 1990
- Recommended Approach to Software Development, Rev. 3, NASA,
  Software Engineering Laboratory Series, SEL-81-305, June 1992