Transcript and Presenter's Notes

Title: Team Software Project TSP


1
  • Team Software Project (TSP)
  • June 26, 2006
  • System Test

2
Outline
  • Remaining Session Plan Discussion
  • System Test Plan Discussion
  • Mythical Man Month
  • System Test Plan Recap
  • Metrics Presentations
  • More on Measurement
  • Next Phases
  • Cycle 1 Test
  • Cycle 1 Post-Mortem Presentations
  • Cycle 2 Plan Strategy

3
Due Today
  • Key Metrics Presentation (10-15 minutes)
  • All Implementation Quality Records (LOGD, CCRs,
    etc.)
  • Final code (source & executable)
  • Updated Products (code components, SRS, HLD, User
    Documentation)
  • Intermediate Products (e.g. Unit Test Plans)
  • Configuration Management Plan
  • Release CD
  • Application
  • User Guide
  • Release Letter
  • No class on July 3

4
Project Performance Discussion
5
Remaining Lectures Plan/Discussion
  • July 10: Cycle 1 Test Complete & Post-Mortem
  • Cycle 1 Results Presentation & Discussion
  • Cycle 1 Reports & Post-Mortem
  • Measurement
  • Team audit
  • July 17: Cycle 2 Launch
  • Cycle 2 Launch, Project Measurement Planning
  • Peopleware topics: Management, Teams, Open
    Kimono, Quality, Hiring/Morale
  • July 24: Cycle 2 Requirements Complete
  • Cycle 2 Requirements
  • Death March Projects
  • July 31: Cycle 2 Implementation Complete
  • System Test Plan Baselined
  • Cycle 2 Design & Implementation
  • Process topics: CMMI, TL-9000, ISO
  • August 7: Cycle 2 Test Complete
  • Cycle 2 Test Complete
  • Cycle 2 Post-Mortem Complete
  • August 14 - Course Review

6
Remaining Course Topics Discussion
7
System Test Schedule
  • Note: Assumes the system has already passed
    Integration Test
  • Full feature delivery to system test and
    instructor by COB June 25, including:
  • Test environment
  • Executable
  • User documentation (note: CCRs can be filed
    against user documentation)
  • Source code
  • Tester generates CCRs for all finds & fills out
    LOGTEST
  • Email to instructor when generated (see below)
  • Development team updates LOGD referencing CCRs
  • Required turn-around times for fixes
  • 80% within 24 hours
  • 99% within 48 hours
  • Required test coverage (short of blocking issues)
  • 80% First Pass Test Complete by June 28
  • 100% First Pass Test Complete by July 1
  • Regression Test Complete by July 3
  • Daily test reports to instructor detailing test
    cases executed, results & CCRs

8
System Test Plan Recap
  • Areas to cover
  • Installation
  • Start-up
  • All required functions available & working as
    specified
  • Diabolical (e.g. power failures, corner cases,
    incorrect handling)
  • Performance
  • Usability
  • Includes
  • Test cases you plan to run (numbered / named)
  • Expected results
  • Ordering of testing & dependencies
  • Supporting materials needed
  • Traceability to requirements

9
Release Letters
  • Purpose
  • What's in it?
  • Version Information
  • Release contents
  • Examples
  • All functionality defined in Change Counter
    Requirements v0.6 except GUI
  • Phase 1 features as defined in project plan x.y
  • Feature 1, Feature 2, Feature 3 as defined by
  • Known Problems
  • Change Request IDs w/ brief customer-oriented
    description
  • Fixed Problems
  • Upgrade Information
  • Other?

10
Implementation Status
  • Implementation experience
  • Unit/Integration experience
  • Problems / Rework?
  • PIP forms

11
Implementation Test Discussion
  • Sample topics
  • Obstacles to success?
  • Things that went well?
  • Things to avoid?
  • Biggest surprises?
  • How did you do vs. plan?
  • Crises handled?
  • Team dynamics in crisis?

12
Team Presentation
13
Project Measurement
  • Source: Practical Software Measurement, John
    McGarry et al.

14
Measurement
  • If you can't measure it, you can't manage it
  • - Tom DeMarco

15
Fundamentals
  • Don't try to measure everything
  • Align measures with
  • Project goals & risks (basic survival mode)
  • Process improvement areas (continual improvement
    mode)
  • Define measurement program up front
  • Monitor continuously & take action where needed

16
Applications
  • Improve accuracy of size & cost estimates
  • Improve quality
  • Understand project status
  • Produce more predictable schedules
  • Improve organizational communication
  • Faster, better informed management decisions
  • Improve software processes

17
Basic In-Process Measurement Examples
  • Schedule
  • Earned Value vs. Planned Value
  • Schedule Variance (see the sketch after this list)
  • Development
  • Task completion
  • Actual code completed vs. planned
  • Project End Game
  • Defect Creation vs. Closure
  • Variations & severity
  • System Test
  • Testing Complete
  • Variations: passed, failed, blocked
  • Test Time / Defect
  • Test Coverage (vs. requirements, white box code
    coverage)
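A minimal sketch (not from the course slides) of how the schedule indicators above can be computed; the task list and hour values are hypothetical.

    # Earned-value style schedule tracking (hypothetical data).
    tasks = [
        # (task, planned_value_hours, earned_value_hours)
        ("parser", 20, 20),    # complete
        ("gui", 30, 12),       # partially complete
        ("reports", 15, 0),    # not started
    ]

    planned_value = sum(pv for _, pv, _ in tasks)
    earned_value = sum(ev for _, _, ev in tasks)
    schedule_variance = earned_value - planned_value           # negative => behind plan
    schedule_performance_index = earned_value / planned_value  # < 1.0 => behind plan

    print(f"PV={planned_value}h EV={earned_value}h "
          f"SV={schedule_variance}h SPI={schedule_performance_index:.2f}")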

18
Process Improvement Measurement Examples
  • Quality
  • Defect density (see the sketch after this list)
  • Post-deployment defect density
  • Inspection Effectiveness
  • Defects / inspection hour
  • Estimation Accuracy
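A minimal sketch (hypothetical values) of the process-improvement measures named above: defect density, inspection effectiveness, and size estimation accuracy.

    # Process-improvement measures from hypothetical cycle data.
    defects_found = 42
    post_deployment_defects = 3
    kloc = 6.5                      # thousands of lines of code
    inspection_hours = 18.0
    estimated_size_kloc = 5.0

    defect_density = defects_found / kloc                        # defects per KLOC
    post_deploy_density = post_deployment_defects / kloc
    inspection_effectiveness = defects_found / inspection_hours  # defects per inspection hour
    size_estimation_error = (kloc - estimated_size_kloc) / estimated_size_kloc

    print(f"defect density: {defect_density:.1f}/KLOC, "
          f"post-deployment: {post_deploy_density:.2f}/KLOC, "
          f"inspection effectiveness: {inspection_effectiveness:.1f} defects/hr, "
          f"size estimation error: {size_estimation_error:+.0%}")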

19
Why Measure?
  • Support short & long term decision making
  • Mature software organization (CMMI level?) uses
    measurement to
  • Plan & evaluate proposed projects
  • Objectively track actual performance against plan
  • Guide process improvement decisions
  • Assess business & technical performance
  • Organizations need the right kind of information,
    at the right time, to make the right decisions

20
Measurement in Software Lifecycle
  • Plan
  • Do: carry out change
  • Check: observe effects of change
  • Act: decide on additional areas for improvement
  • Repeat
  • Considerations: Cost, schedule, capability,
    quality

21
Measurement Psychological Effects
  • Measurement as measures of individual performance
  • Hawthorne Effect
  • Measurement Errors
  • Conscious: rounding, pencil whipping (i.e. false
    data entry)
  • Unintentional: inadvertent, technique (i.e.
    consistent)

22
Use of Measures
  • Process Measures: time oriented; includes defect
    levels, events & cost elements. Used to improve the
    software development & maintenance process
  • Product Measures: deliverables & artifacts such
    as documents; includes size, complexity, design
    features, performance & quality levels
  • Project Measures: project characteristics and
    execution; includes # of developers, cost,
    schedule, productivity
  • Resource Measures: resource utilization; includes
    training, costs, speed & ergonomic data

23
Measurement Uses
  • Objective information to help
  • Communicate effectively
  • Track specific project objectives
  • Identify & correct problems early
  • Make key trade-off decisions
  • Justify decisions

24
Glossary
  • Entity - object or event (e.g. personnel,
    materials, tools & methods)
  • Attribute - feature of an entity (e.g. LOC
    inspected, defects found, inspection time)
  • Measurement - numbers and symbols assigned to
    attributes to describe them
  • Measure - quantitative assessment of a
    product/process attribute (e.g. defect density,
    test pass rate, cyclomatic complexity)
  • Measurement Reliability - consistency of
    measurements assuming no change to method/subject
  • Software validity - proof that the software is
    trouble free & functions correctly (i.e. high
    quality)
  • Predictive validity - accuracy of model estimates
  • Measurement errors - systematic (associated with
    validity) & random (associated w/ reliability)
  • Software Metrics - approach to measuring some
    attribute
  • Defect - product anomaly
  • Failure - termination of a product's ability to
    perform a required function

25
PSM Measurement Process
  • Measurement Plan
  • Information need, e.g.
  • What is the quality of the product?
  • Are we on schedule?
  • Are we within budget?
  • How productive is the team?
  • Measurable Concept
  • Measured entities to satisfy need (abstract
    level, e.g. productivity)
  • Measurement Construct
  • What will be measured? How will data be
    combined? (e.g. size, effort)
  • Measurement Procedure
  • Defines mechanics for collecting and organizing
    data
  • Perform Measurement
  • Evaluate Measurement

26
Measurement Construct
  • (Diagram: the layered elements of a measurement
    construct - Decision Criteria, Analysis Model,
    Measurement Function, and Measurement Methods)
27
Attributes
  • Attribute
  • Distinguishable property or characteristic of a
    software entity
  • (Entities: processes, products, projects and
    resources)
  • Qualitative or Quantitative measure

28
Base Measure
  • Measure of an attribute (one to one relationship)
  • Measurement method
  • Attribute quantification with respect to a scale
  • Method type
  • Subjective (e.g. high, medium, low), Objective
    (e.g. KLOC)
  • Scale
  • Ratio
  • Interval
  • Ordinal
  • Nominal
  • Unit of measurement
  • e.g. hours, pages, KLOC

29
Derived Measure / Indicator
  • Derived Measure
  • Function of 2 or more base measures
  • Measurement Function
  • Algorithm for deriving data (e.g. productivity =
    KLOC / developer hours; see the sketch after this
    list)
  • Indicator
  • Estimate or Evaluation
  • Analysis Model
  • Algorithm / calculation using 2 or more base
    and/or derived measures
  • Decision Criteria
  • Numerical thresholds, targets, limits, etc. used
    to determine need for action or further
    investigation
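A minimal sketch (with hypothetical numbers and thresholds) of how these pieces fit together: two base measures feed a measurement function, and a decision criterion turns the derived measure into an indicator.

    # Base measures (hypothetical values).
    size_kloc = 4.2        # Size, from a KLOC counter
    effort_hours = 310.0   # Effort, total developer hours

    # Measurement function: productivity = size / effort.
    productivity = size_kloc * 1000 / effort_hours   # LOC per developer hour

    # Decision criterion (hypothetical threshold): flag for further
    # investigation if productivity drops below 10 LOC/hour.
    needs_investigation = productivity < 10.0
    print(f"productivity = {productivity:.1f} LOC/hr, "
          f"investigate: {needs_investigation}")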

30
Measurement Construct Examples
  • Productivity
  • Attributes: Hours, KLOC
  • Base Measures: Effort (count total hrs), Size
    (KLOC counter)
  • Derived Measure: Size / Effort = Productivity
  • Analysis Model: Compute mean, compute std
    deviation
  • Indicator: Productivity mean w/ 2-sigma confidence
    limits (see the sketch after this list)
  • Quality
  • Attributes: Defects, KLOC
  • Base Measures: Defects (count defects), Size
    (KLOC counter)
  • Derived Measures: Defects / Size = Defect Rate
  • Indicator: Defect rate control (baseline mean,
    control limits & measured defect rate)
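A minimal sketch of the two analysis models above, using hypothetical per-cycle data: a productivity mean with 2-sigma limits, and a measured defect rate compared against control limits derived from a baseline.

    from statistics import mean, stdev

    # Productivity indicator: mean with 2-sigma confidence limits.
    productivity_by_cycle = [11.8, 9.4, 12.6, 10.1]   # LOC/hour, hypothetical
    p_mean, p_sd = mean(productivity_by_cycle), stdev(productivity_by_cycle)
    print(f"productivity {p_mean:.1f} LOC/hr "
          f"(2-sigma limits {p_mean - 2*p_sd:.1f}..{p_mean + 2*p_sd:.1f})")

    # Defect-rate indicator: measured rate vs. control limits from a baseline.
    baseline_rates = [6.2, 5.8, 7.1, 6.5]             # defects/KLOC, prior cycles
    b_mean, b_sd = mean(baseline_rates), stdev(baseline_rates)
    measured_rate = 9.3                                # defects/KLOC, this cycle
    out_of_control = abs(measured_rate - b_mean) > 2 * b_sd
    print(f"defect rate {measured_rate}/KLOC vs baseline {b_mean:.1f} "
          f"(out of control: {out_of_control})")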

31
More Measurement Construct Examples
  • Coding
  • Base Measure: Schedule (w.r.t. coded units)
  • Derived Measure: Planned units, actual units
  • Analysis Model: Subtract units completed from
    planned units
  • Indicator: Planned versus actual units complete
    (variance; see the sketch after this list)
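A minimal sketch of this coding-progress construct, with hypothetical unit counts.

    planned_units = 12    # units planned to be code-complete by this date
    actual_units = 9      # units actually complete

    # Analysis model: subtract units completed from planned units.
    variance = planned_units - actual_units
    print(f"{actual_units}/{planned_units} units complete, "
          f"variance = {variance} units behind plan")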

32
Class Measurement Construct Examples
  • Coding
  • Base Measure
  • Derived Measure
  • Analysis Model
  • Indicator

33
Measurement Planning
  • Identify Candidate Information Needs
  • Project Objectives
  • Cost, schedule, quality, capability
  • Risks
  • Prioritize
  • One approach: probability of occurrence x project
    impact = project exposure (see the sketch after
    this list)
  • e.g.
  • Schedule
  • Budget
  • Reliability
  • Dependencies
  • Product Volatility
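A minimal sketch of that prioritization, exposure = probability x impact, using the candidate areas above with hypothetical values.

    risks = [
        # (risk, probability of occurrence, impact in schedule-days)
        ("schedule slip", 0.6, 10),
        ("budget overrun", 0.3, 15),
        ("reliability shortfall", 0.2, 20),
        ("dependency late", 0.5, 8),
        ("product volatility", 0.4, 12),
    ]

    # Project exposure = probability x impact; measure the highest exposures first.
    for name, prob, impact in sorted(risks, key=lambda r: r[1] * r[2], reverse=True):
        print(f"{name:25s} exposure = {prob * impact:.1f}")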

34
PSM Common Information Categories
  • Schedule & Progress
  • Resources & Cost
  • Product Size & Stability
  • Product Quality
  • Process Performance
  • Technology Effectiveness
  • Customer Satisfaction

35
PSM Common Information Categories: Measurement
Concepts
  • Schedule & Progress - milestone dates/completion,
    EV/PV
  • Resources & Cost - staff level, effort, budget,
    expenditures
  • Product Size & Stability - KLOC/FP,
    requirements, interfaces
  • Product Quality - defects, defect age, MTBF,
    complexity
  • Process Performance - productivity, rework
    effort, yield
  • Technology Effectiveness - requirements coverage
  • Customer Satisfaction - customer feedback,
    satisfaction ratings, support requests, support
    time, willingness to repurchase

36
Select & Specify Measures
  • Considerations
  • Utilize existing data collection mechanisms
  • As invisible as possible
  • Limit categories & choices
  • Use automated methods over manual
  • Beware of accuracy issues (e.g. timecards)
  • Frequency needs to be enough to support ongoing
    decision making (alternative: gate processes)

37
Measurement Construct
38
Project Measurement Plan Template
  • (from PSM figure 3-10, p 56)
  • Introduction
  • Project Description
  • Measurement Roles, Responsibilities &
    Communications
  • Description of Project Information Needs
  • Measurement Specifications (i.e. constructs)
  • Project Aggregation Structures
  • Reporting Mechanisms & Periodicity

39
Team Project Postmortem
  • Why
  • Insanity
  • Continuous improvement
  • Mechanism to learn & improve
  • Improve by changing processes or better following
    current processes
  • Tracking process improvements during project
  • Process Improvement Proposals (PIP)
  • Post-Mortem
  • Areas to consider
  • Better personal practices
  • Improved tools
  • Process changes

40
Cycle 2 Measurement Plan
  • Identify cycle 2 risks & information needs
  • Review & revise measures, create measurement
    constructs
  • Document in a measurement plan

41
Postmortem process
  • Team discussion of project data
  • Review & critique of roles

42
Postmortem process
  • Review Process Data
  • Review of cycle data including SUMP & SUMQ forms
  • Examine data on team & team member activities &
    accomplishments
  • Identify where the process worked & where it
    didn't
  • Quality Review
  • Analysis of team's defect data
  • Actual performance vs. plan
  • Lessons learned
  • Opportunities for improvement
  • Problems to be corrected in future
  • PIP forms for all improvement suggestions
  • Role Evaluations
  • What worked?
  • Problems?
  • Improvement areas?
  • Improvement goals for next cycle / project?

43
Cycle Report
  • Table of contents
  • Summary
  • Role Reports
  • Leadership: leadership perspective
  • Motivational & commitment issues, meeting
    facilitation, required instructor support
  • Development
  • Effectiveness of development strategy, design &
    implementation issues
  • Planning
  • Team's performance vs. plan, improvements to
    planning process
  • Quality / Process
  • Process discipline, adherence, documentation,
    PIP analysis, inspections
  • Cross-team system testing planning & execution
  • Support
  • Facilities, CM & Change Control, change activity
    data & change handling, ITL
  • Engineer Reports: individual assessments

44
Role Evaluations Peer Forms
  • Consider & fill out PEER forms
  • Ratings (1-5) on work, team & project
    performance, roles & team members
  • Additional role evaluations & suggestions
  • Constructive feedback
  • Discuss behaviors or the product, not the person
  • Team leaders fill out TEAM EVALUATION form

45
Cycle 1 Project Notebook Update
  • Updated Requirements & Design documents
  • Conceptual Design, SRS, SDS, System Test Plan,
    User Documentation
  • Updated Process descriptions
  • Baseline processes, continuous process
    improvement, CM
  • Tracking forms
  • ITL, LOGD, Inspection forms, LOGTEST
  • Planning & actual performance
  • Team Task, Schedule, SUMP, SUMQ, SUMS, SUMTASK,
    CCR

46
Due July 10 Class
  • Cycle 1 Reports / Post-Mortem
  • Cycle 1 Results Presentation
  • Cycle 2 Project Plan
  • Cycle 2 Measurement Plan

47
Cycle 1 Audit