1
Introduction to project management topics
2
[Diagram: attributes clustered around the Project Manager]
  • Persuasive Communicator
  • Good Controller
  • Effective Planner
  • Effective Manager
  • Good Trainer
  • Technically Competent
  • Sensitive to problems of change
  • Organisationally Powerful
3
DEVELOPMENT MODELS
  • Waterfall - clear deliverables
  • Spiral Model - risk assessment
  • Evolutionary - staged delivery
  • RAD - fast development environment
  • Package based - using a foundation

4
Waterfall Model
Feasibility Study
Define Requirements
System Design
Programming and Testing
Implementation
5
V Model
[Diagram: V model from concept to system, with development activities on the descending arm mirrored by QA activities on the ascending arm]
6
Incremental
Initial Requirements and Design
Increment 1: detailed design, programming and testing
Increment 2: detailed design, programming and testing
Increment 3: detailed design, programming and testing
7
Spiral Model
[Diagram: spiral model - each cycle passes through four quadrants:
1. Determine objectives, alternatives, constraints (project plan, quality plan, test plan)
2. Evaluate alternatives, identify and resolve risks (risk analysis, prototype)
3. Develop and verify (requirements / design / code, review)
4. Plan for next phase]
8
PRINCE -Overview
  • A project management methodology
  • Other similar methodologies: METHOD/1, SDM
  • Public domain / standard
  • Separates management activities from technical activities

9
PRINCE - Organisation
Project Board
Project Assurance Team
Project Manager
Stage Manager(s)
Team
Supplier(s)
10
PRINCE - Planning/ Deliverables
Project Deliverables
Managerial
Technical
Quality
11
PRINCE - Planning / Tolerances
[Diagram: tolerance box - a target with permitted plus/minus tolerance in both cost and time]
12
PRINCE - Control
Stages and phases deliver products
Management Control - health of the project
Technical Control - health of the product
13
Rapid Application Development
  • Martin - 1992
  • Initially synonymous with hacking
  • DSDM - mid-1990s
  • Iterative approach
  • Minimum documentation
  • Scientific (objective) philosophy
  • Further reading / sources
  • Stapleton, J. (1997) Dynamic Systems Development Method, Addison-Wesley
  • Beynon-Davies et al. (1999) Rapid Application Development (RAD): an empirical review, European Journal of Information Systems, vol. 8, pp. 211-223

14
DSDM Principles (1)
  • Active user involvement
  • Team empowered to make decisions
  • Frequent delivery of products
  • Fitness for purpose is the acceptance criterion
  • Collaborative and co-operative approach

15
DSDM (2)
  • Iterative incremental development
  • All changes are reversible
  • Requirements frozen at high level
  • Testing integrated throughout life cycle

16
Components of RAD
  • Joint Application Design (JAD)
  • Clean rooms
  • Time boxing
  • Rapid development tools
  • Highly interactive, low complexity projects
  • Intensive phased projects

17
Time Boxing
[Diagram: successive time boxes, each ending in a user review]
Time box 1 → user review → Time box 2 → user review → Time box 3 → user review
18
DSDM Life Cycle
[Diagram: DSDM life cycle]
Feasibility → Business Study → Functional Prototype Iteration (identify functional prototype, agree schedule, create functional prototype, review prototype) → Design Prototype Iteration (identify design prototype, agree schedule, create design prototype, review design prototype) → Implementation (implement, train users, user approval and user guidelines, business review)
19
Project planning topics
20
REASONS FOR PLANNING
  • Target Verification
  • Resource Planning
  • Commitment
  • Basis for What-if
  • Enforces Pre-thinking
  • Step in Delegation
  • Basis for Control

21
ELEMENTS OF A PLAN
  • Project Activities
  • Tasks / Deliverables
  • Dependencies
  • Resource Estimates
  • e.g. man-days, Mb
  • Activity Schedules
  • Resource Budgets
  • By Task / Person
  • Risk Analysis
  • Risks - reductions - contingency
  • Quality Plan

22
ACHIEVES THE GOAL
  • Meets the project's objectives
  • Deliverables based
  • NOT "80% of program XYZ done"
  • SHOULD BE "program XYZ signed off by team leader as complete"
  • All activities catered for
  • otherwise risk of major cost overrun
  • Checklists / methodologies help ensure coverage

[Diagram: the plan as a contract between the project manager, the programmers and the customer]
23
REALISTIC TARGETS
  • Short Activities
  • e.g. 1 week
  • Real Estimates
  • NOT time sliced
  • Dependencies
  • all clearly defined
  • Calculated Contingency
  • from risk analysis
  • per task, not the whole project
  • covers risk of wrong estimates, OSINTOTS, rework

24
PURPOSE OF CONTROLLING THE PLAN
[Diagram: control cycle - plan, obtain agreement, proceed, reassess and replan]
  • To monitor progress
  • To provide motivation
  • Input to formal estimating technique
  • Input to staff performance reviews
  • Basis of lessons learnt
  • Mainly to trigger replanning

25
PROCESS OF DECOMPOSITION
Project WBS
[Diagram: the project decomposes into Program Design, Program Coding and Program Testing, each covering programs A, B and C]
  • Guidelines
  • Method of evaluating task completeness
  • Tasks clearly defined
  • Tasks should be assigned to very few people
  • Time duration short enough to monitor
  • Task cohesion (same type of work)
  • Minimise task coupling (interdependency)

26
DEPENDENCIES
  • Types
  • Finish-Start
  • Start-Start
  • Finish-Finish
  • Partial Finish-Start
  • A predecessor task has other tasks dependent on it
  • A successor task is dependent on another task

27
SCHEDULING
  • Gantt (bar) Chart
  • What to be done
  • Who will do it
  • When it will be done
  • Activity Network
  • Inter-dependencies
  • Estimated Effort
  • Resource Matrix
  • Skills/resources required
  • Under/over utilisation
  • Used to smooth team size
  • Loss Factors
  • Non-productive time
  • Absences
  • Increases with team size
  • e.g. 4 productive days a week

28
NETWORK ANALYSIS
  • Steps
  • Establish sequence
  • Timetable for each job
  • Analysing spare time
  • Identifying most critical jobs
  • Advantages
  • Attention can be given to most critical jobs
  • Spare time can be utilised
  • Resources can be balanced
  • Project Completion dates advanced
  • Updating/Revision of plans easier
  • Diagrams
  • Activity on the arrow
  • Activity on the node

29
NETWORK CONVENTIONS
  • Start and Finish Events
  • Single start activity and completion
    activity
  • Left to right Dependencies
  • No Dangling Events
  • No Looping
  • Avoid Redundancy
  • Example Node

[Examples: sequential dependencies, a dangling event, and redundant dependencies]
Node notation (see the sketch below):
EST = Earliest Start Time
LST = Latest Start Time
DUR = estimated DURation
SLK = SLacK time (LST - EST)
Each node shows the activity together with its EST, DUR, LST and SLK
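A minimal Python sketch of the conventions above: a forward pass gives each activity its EST, a backward pass its LST, and slack follows as LST - EST (zero slack marks the critical path). The four-activity network and its durations are hypothetical.

```python
# activity: (duration, [predecessors]) - listed in dependency order,
# which the forward pass below relies on.
network = {
    "A": (3, []),
    "B": (5, ["A"]),
    "C": (2, ["A"]),
    "D": (4, ["B", "C"]),
}

# Forward pass: EST = max over predecessors of (their EST + DUR).
est = {}
for act in network:
    dur, preds = network[act]
    est[act] = max((est[p] + network[p][0] for p in preds), default=0)

project_end = max(est[a] + network[a][0] for a in network)

# Backward pass: latest finish = min LST of successors; LST = finish - DUR.
lst = {}
for act in reversed(list(network)):
    dur, _ = network[act]
    succs = [s for s in network if act in network[s][1]]
    latest_finish = min((lst[s] for s in succs), default=project_end)
    lst[act] = latest_finish - dur

for act in network:
    slack = lst[act] - est[act]
    print(f"{act}: EST={est[act]} LST={lst[act]} SLK={slack}"
          + ("  <- critical" if slack == 0 else ""))
```

Activities with zero slack (A, B, D here) form the critical path that the slide says deserves the most attention.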
30
WHY ESTIMATING IS IMPORTANT
  • To determine feasibility
  • Calculate when and at what cost a system can be
    delivered
  • Deciding on staffing policies and how to carry
    out a project
  • If our estimates are inaccurate then our ability
    to control and deliver the system on time and to
    budget is affected
  • Affects all subsequent stages if we get it wrong

31
HOW ESTIMATES FIT INTO OVERALL PROCESS
  • It is one element of an iterative process which
    includes
  • information gathering
  • breaking down the job
  • estimating of effort and skills
  • scheduling
  • monitoring and control
  • reporting

[Chart: estimate accuracy rises towards 100% as the information available increases through the project]
32
WHY THERE ARE PROBLEMS
  • Inappropriate expectations of accuracy
  • Lack of experience and methods
  • Different methods used by different people
  • Poor record in the past
  • Failure to take into account changes in
    development environment
  • Lack of separation of estimating from scheduling
  • Skill level assumed is not clear
  • Elapsed time between original estimate and the
    event
  • No estimate for dealing with problems

33
METHODS
[Diagram: known factors are fed into a model to produce an estimate]
  • Function Point Analysis
  • COCOMO - lines of code based
  • PERT - weighted average: (A + 4B + C) / 6, where A = optimistic, B = most likely and C = pessimistic (see the sketch after this list)
  • Ratios
  • High Level (phases)
  • Low Level (tasks)
  • Standard Estimates
  • Fixed
  • Multiplied by number of items
  • Based on complexity / size
  • Multiple Estimate
  • Compare and contrast
  • Evolution of a local model

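A minimal sketch of the PERT weighted average named in the list above; the three task estimates are hypothetical.

```python
def pert_estimate(a: float, b: float, c: float) -> float:
    """Weighted-average effort estimate: (A + 4B + C) / 6."""
    return (a + 4 * b + c) / 6

# Hypothetical task: optimistic 4, most likely 6, pessimistic 11 man-days.
print(pert_estimate(4, 6, 11))  # -> 6.5 man-days
```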
34
FUNCTION POINT ANALYSIS
  • Steps in performing FPA
  • Defining the system boundary
  • Defining the parameters
  • external inputs
  • external outputs
  • internal logical data groups
  • external logical data groups
  • inquiries
  • Identifying unique occurrences of parameters
  • Complexity assessment
  • low, average, high
  • General System Characteristics
  • VAF = (TDI × 0.01) + 0.65
  • Final FP score = unadjusted score × VAF (see the sketch below)

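A minimal sketch of the adjustment step above. The 14 general-system-characteristic ratings (0 to 5 each) and the unadjusted count are hypothetical figures.

```python
gsc_ratings = [3, 2, 4, 1, 0, 3, 5, 2, 2, 1, 3, 4, 0, 2]  # 14 characteristics
tdi = sum(gsc_ratings)             # Total Degree of Influence (0..70)
vaf = (tdi * 0.01) + 0.65          # Value Adjustment Factor (0.65..1.35)

unadjusted_fp = 250                # hypothetical count of inputs, outputs, files...
final_fp = unadjusted_fp * vaf
print(f"TDI={tdi}, VAF={vaf:.2f}, FP={final_fp:.1f}")
```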
35
COCOMO
  • Basic level
  • Effort = a × size^b, with size in KLOC (see the sketch below)
  • Organic (small scale): Effort = 2.4 × size^1.05
  • Embedded (large, formal): Effort = 3.6 × size^1.2
  • Semi-detached (medium): Effort = 3.0 × size^1.12
  • Intermediate
  • takes other factors into account
  • Extended
  • breaks the project down into phases / tasks
36
PROJECT CONTROL
  • Planning alone does not ensure the success of a
    project
  • The project must be controlled
  • monitoring progress
  • evaluating performance
  • making adjustments
  • reacting to changes

[Diagram: control loop - the plan informs the activity; feedback from the activity updates the plan]
37
CONTROL PROCESS
INPUT INFORMATION: timesheets, change requests
QUESTIONS: Are we on schedule?
OUTPUTS: progress report, revised plan, new resource requirements
38
SLIPPAGE / VARIANCE
  • Common Causes
  • Calculating Slippage / Variance
  • Earned Value (see the sketch after this list)
  • Use of network chart
  • Actions
  • Extra Manpower ?
  • Active management
  • Recovery plans
  • Work longer / harder
  • Training
  • Examine all possible solutions
  • Hustle
  • Review critical path
  • Smooth the way ahead
  • Remove / Reduce tasks
  • phase system in?

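Earned value is named above as one way to calculate slippage / variance. A minimal sketch using the classic BCWS / BCWP / ACWP quantities follows; all figures are hypothetical.

```python
bcws = 40_000  # Budgeted Cost of Work Scheduled (planned value to date)
bcwp = 32_000  # Budgeted Cost of Work Performed (earned value)
acwp = 36_000  # Actual Cost of Work Performed

schedule_variance = bcwp - bcws   # negative -> behind schedule
cost_variance = bcwp - acwp       # negative -> over budget
spi = bcwp / bcws                 # Schedule Performance Index
cpi = bcwp / acwp                 # Cost Performance Index
print(f"SV={schedule_variance}, CV={cost_variance}, SPI={spi:.2f}, CPI={cpi:.2f}")
```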
39
Quality management topics
40
DEFINITION OF QUALITY
  • Crosby defines quality as
  • "conformance to requirements"
  • not goodness or elegance
  • Quality is implemented through prevention of
    defects
  • not through post-manufacturing inspection
  • Performance standard must be zero defects
  • not "that's near enough"
  • The measurement of quality is the price of
    non-conformance

41
COST OF QUALITY
  • Prevention
  • Planning
  • Standards
  • Training
  • Improving the process
  • Doing
  • Cost of Production
  • eg production of program design
  • Appraisal
  • Reviews, Testing
  • Failure
  • Reworked / Scrapped Work
  • Data Corrections
  • Retests
  • User Discussions
  • Cost of Quality = price of non-conformance

42
CUSTOMERS
  • Identify your customers
  • Internal / External
  • Senior / Junior
  • Involve throughout the project
  • Mere perception
  • "The individual perceives service in his or her
    own way" - Arch McGill (IBM VP)
  • Tom Peters adds "...in his or her own unique,
    idiosyncratic, human, emotional, end-of-the-day,
    irrational, erratic way."

43
QUALITY ETHOS
  • Quality comes from people
  • It happens because YOU want it to happen
  • Quality staff = quality work?
  • Avoid the "we're no worse than anyone else"
    syndrome
  • Ownership is motivation for quality
  • Difficult to improve 1 thing by 100%; it is easier
    to improve 100 things by 1%
  • Book: A Passion for Excellence, Peters & Austin

44
Accumulating Errors
[Diagram (source: Wallmüller, Fig 1.3): errors accumulate through the life cycle.
Requirements definition turns ideas, wishes and requirements into correct and faulty requirements;
the system specification adds specification errors plus errors induced from the requirements;
design adds design errors plus errors induced from requirements and specification;
coding adds program errors plus errors induced from requirements, specification and design;
testing and integration corrects some errors but leaves known uncorrected and unknown errors,
giving software with known and unknown faults]
45
Percentages of fault costs compared with
development costs
[Chart: fault costs compared with development costs, broken down by phase (requirements, system specification, detailed design, coding, integration and system test); corrections account for 39% of total cost, management 9%, quality assurance and configuration management 6%. Adapted from Möller, in Software Quality and Reliability, ed. Ince]
46
QUALITY PLANNING
Quality Management System
Methodology
Development Environment
Documentation Practices
Standards
Configuration Management
CASE tools
Review Procedures
Quality Plan
Test Plan
Project Plan
47
CONTENTS OF THE PLAN
  • Purpose
  • Reference Documents
  • Management
  • Documentation
  • Standards, Practices, Conventions, and metrics
  • Reviews and Audit
  • Test
  • Problem Reporting and Correction
  • Tools, techniques, and methodologies
  • Code Control
  • Media Control
  • Supplier Control
  • Record Keeping
  • Training
  • Risk Management

48
USE OF SOFTWARE METRICS
  • Types
  • Effort Used
  • Productivity Rate
  • Lines of Code
  • Defect Data
  • Complexity
  • Mean Time To Failure
  • Change Requests
  • Uses
  • Compare with Plan/Expectations
  • Identify Problem Areas
  • Identify Areas for Using Tools
  • Improved Information for Future Projects
  • Implementation Decision

49
RISKS
  • System will never be delivered
  • System will be delivered late
  • System will exceed budget
  • Project will divert user resources to an
    unacceptable extent
  • System will lack functionality
  • System will contain errors
  • System will present difficulties to users in
    using it
  • System will be difficult / costly to support /
    enhance

Any one of these risks could lead to project
failure
50
COUNTER MEASURES
  • Objective
  • to reduce / eliminate risk that something will
    cause the project to fail
  • Examples
  • Extra tasks
  • eg develop prototype to check user interface is
    okay
  • Contingency
  • eg extra time available
  • Ensure controls in place
  • eg contracts signed off
  • Avoid assumptions
  • eg excellent programmer from last project will
    join you on new project
  • Have options available
  • eg if performance problems then have extra
    machine time available
  • Avoid promises
  • eg use windows for implementation dates

51
QUESTION / METRIC
  • Where are the errors found?
  • Error location
  • What type of errors are they?
  • Error classification
  • What is our productivity Rate?
  • Effort
  • Deliverables Signed Off
  • Number of lines of code
  • How many more test runs do we need?
  • Number of outstanding errors
  • Error location
  • How successful are our inspections?
  • Effort
  • Number of defects found
  • How can we improve the development process or
    what training is required?
  • Origin of Error

52
Analysis of Defect Metrics
Defect Density = No. of Defects / Size of program (see the sketch below)
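A one-line illustration of the ratio above; the defect count and the choice of KLOC as the size measure are hypothetical.

```python
defects = 46
size_kloc = 12.5  # program size in thousands of lines of code
print(f"Defect density: {defects / size_kloc:.1f} defects per KLOC")
```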
53
ORIGIN OF DEFECTS
54
RELIABILITY
[Charts: number of defects outstanding and cumulative defects against time; failure intensity falling against time]
55
TARGETING MAIN PROBLEMS
Pareto Chart
56
PRINCIPLES FOR USING METRICS
  • Pragmatism and Compromise
  • Measuring People - Don't!
  • Modelling Simplification
  • Ask not for whom the bell tolls - ask why?
  • The sum of the whole is greater than the
    constituent parts
  • Culture Shock!

57
REVIEW OF DELIVERABLES
  • Team Leader Review
  • Eg. Programming Team Leader
  • Check for conformity, consistency and
    completeness
  • Peer Review
  • Eg. Another designer within team
  • As with team leader review
  • Walkthrough
  • Eg. Designer → Programmer
  • To confirm understanding and allow customer to
    ask questions
  • Inspections
  • Several people
  • (Eg. Moderator, Author, Customer, Specialist,
    Peer, User)
  • To trap and correct defects
  • To review against requirements

58
INSPECTION PROCESS
Input Documents
Exit Criteria
Deliverables
Checklists
INSPECTION
Correct Material
Defect and Process Data
59
INSPECTION CYCLE
Produce deliverable → moderator checks suitability → choose inspection team → overview meeting → preparation → inspection meeting → rework → moderator sign-off
Reinspection if >25% changed
60
Walkthrough Vs Inspection
Criterion                                  Walkthrough   Inspection
Precise entry / exit criteria              None          Compulsory
Checklists used                            Sometimes     Essential
Rework stage                               Implied       Explicit
Defects discussed / corrected in meeting   Usually       Never
Leader of meeting                          Author        Independent
Defects formally recorded and analysed     No            Yes
Defects found per review                   10            25
Defects found per man hour                 7             1
61
CLEANROOM APPROACH
  • More effective than debugging
  • 90% of defects found before test
  • Total defect count drops
  • Forces problems out early
  • Based on statistical quality control and
    mathematical verification
  • Error reduction based on areas of most frequent
    usage

62
TEST STAGES
[Diagram (source: Wallmüller): test stages pair each specification level with a test level -
Requirements Specification ↔ Acceptance Testing (black box)
Systems Specification ↔ System Testing (black box)
(Sub)system Design ↔ Integration Testing (white box)
Module Design ↔ Module Testing (white box and black box)
Coding sits at the base]
63
TEST DOCUMENTATION
Quality Plan
System Documentation
Project Plan
Test Plan
Test Design Specification
Test Cases Procedures
Test execution
Test Log
Test Incident Report
64
TEST PLAN
  • Overall strategy for testing system
  • shows how tests will prove the system,
    including such things as
  • stress/performance testing
  • recovery
  • regression testing
  • Objective for each kind of test
  • avoid overlaps
  • Criteria for completion
  • how to decide when to stop
  • Test schedule
  • tasks and dependencies
  • Responsibilities
  • team organisation
  • Resources
  • eg. automation tools
  • Test procedures and documentation to be produced
  • test details
  • eg. conditions, data, expected results,
    instructions, tools
  • evidence

65
AUTOMATING TESTING
  • Capture Playback
  • data input
  • Comparator
  • file/DB
  • program code
  • output vs expectation
  • Input Generation
  • for stress testing
  • use of random numbers
  • Verification (static analysis)
  • eg compilers, spell checkers, style checker
  • Simulator
  • simulates real world
  • Test Harness / Test Driver
  • enables tests to be run unattended
  • Test Coverage Measurement
  • checks if all conditions/paths tested
  • Debugging
  • observation of variable states

66
Software management topics
67
People Management (1)
  • In software development, talented programmers
    are known to be ten times more productive than
    the less talented members of a team
  • Cusumano (1997) How Microsoft makes large teams
    work like small teams, Sloan Management Review

68
LEADERSHIP STYLES
  • Autocratic (Boss-Centred)
  • Manager makes decision and announces it
  • Manager sells decision
  • Manager presents ideas and invites questions
  • Manager presents tentative decision subject to
    change
  • Manager presents problem, gets suggestions, makes
    decision
  • Manager defines limits, asks group to make
    decision
  • Manager permits sub-ordinates to function within
    limits defined by superior
  • Democratic (sub-ordinate centred)

69
TEAM BUILDING
  • Ensure commitment to goal
  • Team loyalty to one another
  • Obtain team agreement
  • Encourage contributions and views
  • Listen to views - take action!
  • Organise team to give clear responsibilities and
    structure
  • Optimise team size
  • 5 - 10 people per team
  • Breakdown of team should minimise need for
    co-ordination
  • Realistic but positive leadership
  • Work closely with customers
  • Manage conflict

70
COMMUNICATION
  • Within Team
  • Team Meetings
  • Newsletters
  • Noticeboard / Wallchart
  • Top-down / bottom-up
  • Peer to Peer
  • Formal System Information
  • eg change requests, new CASE tool
  • External to Team

[Diagram: the team communicates externally with management, users, other teams and suppliers]
71
PRODUCTIVITY
[Charts, after Brooks: months against people for a perfectly partitionable task (no communication), an unpartitionable task, and a task with complex interrelationships]
Number of communication links = n(n-1)/2 (see the sketch below)
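A quick illustration of the links formula above, showing how communication overhead grows quadratically with team size.

```python
# links = n(n-1)/2 for a team of n people
for n in (2, 5, 10, 20):
    print(f"{n:2d} people -> {n * (n - 1) // 2:3d} links")
```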
72
GOAL SETTING
  • SHORT TERM
  • Use of work plans
  • Work Plan for Ian
  • LONG TERM
  • Use of job reviews / Appraisals
  • frequency?
  • major objectives
  • development of staff
  • review objectives
  • No shocks


Task                          Deliverable               Estimate   Target
Prepare for interview         Questionnaire             8 hours    23/11
Interview Marketing Manager   Completed questionnaire   4 hours    24/11
Write up interview            Minutes                   6 hours    25/11
Update data model             Data model                2 hours    25/11
Update DFDs                   DFD                       8 hours    27/11
73
HOW DO WE MANAGE THE CUSTOMER?
  • Know their business
  • Set clear responsibilities
  • Communicate frequently
  • Involve them throughout
  • Set low expectations (under-promise)
  • Use contracts
  • Plan for changeover

74
COMMUNICATION
  • Plan Involvement
  • Users involved throughout project
  • Inform of changes to plan or design
  • Demonstrate system early
  • Report progress
  • Ensure users report problems
  • Communicate information on new releases
  • Use of contracts / service level agreements
  • Measure satisfaction
  • Get to know the customers

75
MANAGING EXPECTATIONS
  • The user's views of what to expect develop
    throughout the project
  • 20% of programming will give 80% of functionality
    - code this first!
  • Some Guidelines
  • Tell them the worst
  • Keep the good news until it is certain
  • Prepare users for problems
  • Be clear on essential vs. desirable
  • Do not assume functions should be computerised

76
MANAGING THE IMPLEMENTATION
  • Preparation for live running
  • user procedures
  • operation procedures
  • training
  • control procedures
  • conversion planning
  • regression plan
  • Going Live
  • Conversion of data, etc.
  • Technical environment
  • Software release
  • version control
  • Hardware installed
  • Customer Acceptance
  • sign off contract

77
WHY DO IMPLEMENTATIONS FAIL?
  • System does not meet requirements
  • Lack of planning
  • Lack of management action after implementation
  • No success factors to judge by
  • Lack of training
  • Poor user morale / no desire
  • Over expectations
  • Not making full use of the system
  • Benefits not achieved
  • Resistance to change
  • Fear
  • absenteeism
  • avoidance of system
  • redundancies
  • Result: internal effect on the business

78
COST / BENEFIT ANALYSIS
  • COSTS
  • Installation
  • Running
  • BENEFITS
  • Tangible
  • measurable now
  • Indeterminate
  • measurable afterwards
  • Intangible
  • not measurable
  • Simple payback equation (see the sketch after this list)
  • Payback period = Installation costs / (Benefits p.a. - Running costs p.a.)
  • eg. 100,000 / (30,000 - 5,000) = 4 years

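A minimal sketch of the simple payback equation above, reproducing the slide's example figures.

```python
def payback_years(installation: float, benefits_pa: float, running_pa: float) -> float:
    """Years to recover installation cost from net annual benefit."""
    return installation / (benefits_pa - running_pa)

print(payback_years(100_000, 30_000, 5_000))  # -> 4.0 years
```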
79
PRESENTATION OF COST/BENEFIT ANALYSIS
  • Simple Presentation
  • eg graphical
  • eg charts
            Year 1     Year 2    Year 3
Costs       10,000      5,000     3,000
Benefits         0     12,000    15,000
Cash flow  -10,000      7,000    12,000
  • Put detail in back up documents
  • Agree management requirements first

[Chart: cumulative cash flow dips during implementation, then climbs through the break-even point; the sketch below reproduces the calculation]
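A minimal sketch of the break-even calculation behind the chart above, using the yearly costs and benefits from the table.

```python
costs = [10_000, 5_000, 3_000]
benefits = [0, 12_000, 15_000]

cumulative = 0
for year, (c, b) in enumerate(zip(costs, benefits), start=1):
    cumulative += b - c
    print(f"Year {year}: cash flow {b - c:+,}, cumulative {cumulative:+,}")
    if cumulative >= 0:
        print(f"Break even during year {year}")
        break
```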
80
ENSURING THE REWARDS ARE REAPED
  • Maximising Benefits
  • Winning over the users
  • Training / Procedures
  • Plan to gain benefits (and rigorously
    manage accordingly)
  • eliminate functions
  • regroup staff
  • Contingency plans
  • Selling to multiple customers or other companies
  • Minimising Costs
  • Good planning and control
  • Being effective in development
  • Controlling suppliers
  • Managing risks
  • Reducing maintenance
  • minimising cost of making change
  • minimise number of changes

81
Why the productivity paradox?
  • Amount of money spent
  • Benefits not matching investment
  • IS over hyped?
  • Competitive advantage - disaster dichotomy
  • Difficulty in estimating costs and measuring
    benefits
  • Timing of returns

82
Types of justification
  • Cost / benefit
  • Payback period
  • RoI
  • Cashflow
  • Business Value
  • RoM
  • Information Economics
  • Strategic Value
  • CSF

83
Changing nature of IS justification
[Diagram: IS justification moving over time from efficiency (saving money) through effectiveness to differentiation and repositioning (making money). Source: Lincoln (1990)]
84
Moving from a rational to interpretive view
[Diagram: from an objective / rational viewpoint (efficiency zone) through an effectiveness zone to an understanding zone seen from a subjective / political viewpoint]
85
Transforming the IT function
[Diagram: transformation of the IT function at British Petroleum (Cross, Earl and Sampler, 1997), grouped by purpose, people and process]
From Monopoly Supplier to Mixed Sourcing
From Systems Analysts to Business Consultants
From Craftsmen to Project Managers
From Business to Industry Standards
From Decentralized Bias to Centralized Topsight
From Large Function to Lean Teams
From System Provider to Infrastructure Planner
86
The Software Process
Plans
Tools
Requirement
Product
Activities
People Management
Standards Procedures
87
The Problem ...
  • "For every six new large-scale software systems
    that are put into operation, two others are
    cancelled. The average software development
    project overshoots its schedule by half; larger
    projects generally do worse."
  • W. Wayt Gibbs

88
Software Process Improvement
  • SEI believes the quality of software is largely
    determined by the quality of the software
    development and maintenance processes used to
    build it
  • ESI focuses on the organisational and management
    challenges of producing software, as it is
    increasingly recognised that purely technological
    solutions yield benefits that are difficult to
    sustain

89
The Capability Maturity Model
  • Maturity levels indicate process capability and contain key process areas
  • Key process areas achieve goals and are organised by common features
  • Common features address implementation or institutionalisation and contain key practices
  • Key practices describe activities or infrastructure
90
The Capability Maturity Model
CMM Level        Focus
1 - Initial      Competent people and heroics
2 - Repeatable   Project management processes
3 - Defined      Engineering processes and organisational support
4 - Managed      Product and process quality
5 - Optimised    Continuous process improvement
91
Process Improvement Cycle
Build executive support
Implement action plan
Build improvement infrastructure
Develop improvement action plan
Assess the organisation's software process
92
Process Improvement Infrastructure
Steering Committee
Division Executive
Program Manager
S/w Engineering Process Group
Software Manager
Technical Working Groups
Software Staff
93
Software Process Improvement and Capability
dEtermination
  • SPICE is an attempt to draw together the CMM,
    Bootstrap and other approaches to date.
  • The SPICE project has three goals
  • to develop a working draft for a standard for
    software process assessment
  • to conduct industry trials of the emerging
    standard
  • to promote the technology transfer of software
    process assessment into the software industry
    world-wide

94
Process Improvement - the Benefits
  • According to a survey by SEI
  • Productivity gains/year: 35%
  • Time to market (reduction/year): 15-23%
  • Post-release defects (reduction/year): 39%
  • Business value ratio: 5:1 (benefits:costs)
  • Savings at Raytheon
  • Rework cost reduction: 16m
  • Productivity gains: 130% over 5 years

95
The impact of maturing
[Chart: percentage of companies rating customer satisfaction, productivity and staff morale as good / excellent rises with maturity level (Initial → Repeatable → Defined)]
96
Process Improvement - the Pitfalls
  • Cost and time exceed expectations
  • Most difficult aspect appears to be project
    planning
  • Keeping the people on board
  • "I'm a programmer, I don't want all this!"
  • Using the metrics to improve
  • Organisational politics
  • Turf guarding

97
Current Status
  • SEI
  • 1991: 81% at level 1, none at 4 or 5
  • 1999: approximately 100 at level 4 or 5
  • ESI benchmark of European companies
  • 66% have common coding standards
  • Only 56% track actual project costs
  • 75% had a post-implementation review
  • UK is one of the leading European countries in
    software management practice

98
Some Key Aspects of Improving (1)
  • Senior management need to actively monitor SPI
    progress
  • Set clear SPI goals
  • Staff time dedicated to improvement
  • Clear, compensated assignment of responsibility
  • SEPG staffed by respected people
  • Technical staff involved in improvement

99
Some Key Aspects of Improving (2)
  • Team work
  • Learn from own experience
  • Start from where you are, not where you would
    like to be
  • Introduce new ideas in order of hit rate
  • Start to measure the processes as soon as
    possible
  • Focus on early removal of defects

100
A Typical Approach to SPI
  • Single champion
  • Motivated by current problems
  • Diffusion is ad-hoc
  • No corporate view
  • Will often fail
  • cost > expected
  • champion leaves
  • benefits take time

101
IDEAL model
Initiating → Diagnosing → Establishing → Acting → Leveraging
102
Product Management Issues
  • Rapid changes in the market place
  • First product often wins significant market share
  • IS function needs to be seen to be profitable
  • Applications are developed as one-off activities
  • Reuse limited to "cut and paste" approaches or
    low-level components

103
Software Maintenance
  • Types of maintenance
  • corrective
  • adaptive
  • perfective
  • Significant cost of maintenance
  • Poor perception
  • Lacks good staff continuity from development
    team
  • Patchwork changes not strategic re-engineering

104
Moving from Backroom to Boardroom
  • Build a responsive IT infrastructure to enable
    change
  • Use business level components as well as low
    level components
  • helps executives relate to reuse
  • Components
  • clear interfaces and behaviour make them reusable
  • Manage the risk of poor components

105
A Packaged Software Organisation
  • Market leader in demographic data analysis
    software
  • Multiple products targeted at different markets
  • Previous experience with OO and technical level
    components
  • Key changes in the business leading to multiple
    products and high levels of reuse

106
Product Portfolio
[Diagram: product portfolio arranged by level of sophistication, from market profiling packages through vertical market products to bespoke solutions; products include customer relationship management, data mining, sales management, new prospect finder, local area marketing, relationship marketing and market analysis]
107
Component Architecture
  • Applications: packaged applications, bespoke applications
  • Business components: business portfolio, reporting, mapping, profiling, multimedia
  • Technical components: databases, statistical models, KBS models, internet browser
108
Organisational Structure
[Diagram: sources of software (application teams, component teams, external sources) mapped to what they deliver (application portfolio, business components, technical components) and to ownership (executive strategy group, technical steering group)]
109
A Flexible Approach
  • Build for change
  • Components reduce repetitious work
  • Reduces lead time to market
  • Avoids mass legacy systems
  • Involve executives in maintenance decisions for
    added value of current products

110
Conclusion
  • We need to manage people, process and product
  • We need to reconsider some traditional views of
    project management
  • The future: chaos not panaceas (Baskerville and
    Smithson, 1995)