1
EEE492.25 Capability Maturity Model (CMM)
Royal Military College of Canada Electrical and
Computer Engineering
Reference HvV 6.6
  • Dr. Terry Shepard
  • shepard@rmc.ca
  • 1-613-541-6000 ext. 6031

Major JW Paul, Jeff.Paul@rmc.ca, 1-613-541-6000
ext. 6091
With contributions from SEG members
2
What is the purpose of 492A?
  • Software Process and Quality
  • The implication is that by following an effective
    process one will produce better software
  • We have looked at many discrete processes, but
  • how do we know if an organization is using these
    concepts, or just using the buzzwords?
  • One answer: CMM
  • trained appraisers for CMM
  • Standard CMMI Appraisal Method for Process
    Improvement (SCAMPI)
  • note - there are other standards
  • ISO 9000
  • also CMM is being adapted to agile processes

3
Some Definitions
  • Software Process
  • set of activities, methods, practices, and
    transformations used to develop and maintain
    software and associated products
  • project plans, design documents, code, test
    cases, user manuals
  • Software Process Capability
  • knowing what the above will give you
  • Software Process Performance
  • actual results from the above
  • Software Process Maturity
  • how well a specific process is explicitly
    defined, managed, measured, controlled, and
    effective
  • also includes consistency and change

4
Software Process Improvement
  • Six steps to improve software capabilities
    through process improvements (one method)
  • 1. Understand the current development process or
    processes
  • 2. Develop a vision of the desired process
  • 3. List the required process improvement actions
    in order of priority
  • 4. Produce a plan to implement the actions
  • 5. Commit the resources and execute the plan
  • 6. Start over at step 1

5
Immature Software Organization
  • Does not mean they produce poor code
  • Characterized by
  • ad hoc/improvised processes (project dependent)
  • processes are not rigorously followed (if
    specified)
  • reactionary to the immediate crisis (fighting
    fires)
  • quality and function compromised to meet schedule
  • quality-related activities often eliminated due
    to schedule pressures
  • schedules and budgets routinely exceeded
  • no objective basis for measuring quality
  • hard to predict future events...

6
Mature Software Organizations
  • Does not mean they produce good code, but
  • Characterized by
  • organization wide ability for managing software
    development and maintenance
  • process is integral to the organization
  • communicated to staff; staff follow the process
  • process is useable and useful
  • process is not static (evolves in controlled
    manner)
  • fit for use; updated as necessary
  • objective and quantitative quality metrics
  • schedules and budgets based on historical data
  • and thus usually achieved

7
The Birth of CMM
  • 1986 - the Software Engineering Institute (SEI)
    began investigating a process maturity framework
    (Watts Humphrey)
  • 1987 - questionnaires
  • software capability evaluation
  • maturity questionnaire
  • 1991 - CMM
  • a set of key processes and recommended practices
  • guidance on how organizations can gain control of
    their processes and evolve toward a culture of
    software engineering and management excellence

8
Observations that motivated CMM
  • productivity and quality gains from methodologies
    and technologies were not near what was expected
    in the early 80s
  • difficult to do better in a chaotic process
  • in undisciplined organizations, most projects
    produce poor results
  • in undisciplined organizations, some projects
    produce excellent results
  • usually the result of heroic effort
  • repeating the result means repeating the heroics

9
Capability Maturity Model (CMM)
  • used as a standard for appraising the current
    state of an organization's software process
  • used as a guide for identifying and prioritizing
    the actions comprising the software process
    improvement effort
  • Made up of 5 levels and 18 key process areas
    (KPAs)
  • a CMM-based maturity questionnaire may be used
    to assess the software capability of a particular
    organization
  • government may use this to assess the capability
    of potential software development contractors

10
CMM Levels
  • 1 Initial - Competent People and Heroics
  • 2 Repeatable - Project Management Process
  • 3 Defined - Engineering Process and Org Support
  • 4 Managed - Product and Process Quality
  • 5 Optimizing - Continuous Process Improvement

11
Basic descriptions of CMM levels
  • Initial (1) - software process is ad hoc, maybe
    even chaotic; few processes are defined; success
    depends on individual effort and heroics
  • Repeatable (2) - basic project management
    practices to track cost, schedule, functionality;
    the necessary process discipline is in place to
    repeat earlier successes on projects with similar
    applications
  • Defined (3) - the software process for both
    management and engineering activities is
    documented, standardized, and integrated into a
    standard software process; all projects use an
    approved, tailored version of the organization's
    standard software process for developing and
    maintaining software
  • Managed (4) - detailed measures of the software
    process and product quality are collected; both
    the software process and products are
    quantitatively understood and controlled
  • Optimizing (5) - continuous process improvement
    is enabled by quantitative feedback from the
    process and from piloting innovative ideas and
    technologies
Reference www.sei.cmu.edu/cmm/cmm_sum.html
12
SW-CMM Structure
13
Common Features
  • Commitment to Perform (CO)
  • groups all generic practices related to creating
    policies and securing sponsorship for process
    improvement efforts.
  • Ability to Perform (AB)
  • groups all generic practices related to ensuring
    that the project and/or organization has the
    resources it needs to pursue process improvement.
  • Directing Implementation (DI)
  • groups the generic practices related to
    collecting, measuring, and analyzing data related
    to processes. The purpose of these activities is
    to provide insight into the performance of
    processes.
  • Verifying Implementation (VE)
  • groups all generic practices related to verifying
    that the project's and/or organization's
    activities conform to requirements, processes,
    and procedures.

14
Structure of CMM
  • Maturity levels indicate process capability and
    contain key process areas
  • Key process areas achieve goals and are organized
    by common features
  • Common features address implementation /
    institutionalization and contain key practices
  • Key practices describe activities and
    infrastructure
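
As an aside, the structure described above can be captured in a few lines of code. The sketch below is illustrative only; the class and field names are ours, not taken from the CMM documents.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class KeyPractice:
    """Describes an activity or piece of infrastructure."""
    description: str

@dataclass
class CommonFeature:
    """Addresses implementation/institutionalization; contains key practices."""
    name: str                      # e.g. "Commitment to Perform (CO)"
    key_practices: List[KeyPractice] = field(default_factory=list)

@dataclass
class KeyProcessArea:
    """Achieves goals; organized by common features."""
    name: str
    goals: List[str] = field(default_factory=list)
    common_features: List[CommonFeature] = field(default_factory=list)

@dataclass
class MaturityLevel:
    """Indicates process capability; contains key process areas."""
    number: int                    # 1..5
    name: str                      # e.g. "Repeatable"
    key_process_areas: List[KeyProcessArea] = field(default_factory=list)
```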
15
Key Process Areas (KPAs)
  • each maturity level (except 1) is decomposed into
    several key process areas that indicate the areas
    an organization should focus on to improve its
    software process
  • a cluster of related activities which
    collectively achieve a set of important goals
  • when the goals are accomplished on a continuing
    basis, the KPA is said to be institutionalized

16
Key Process Areas (contd)
  • KPAs are enhanced in succeeding levels
  • to achieve a maturity level, the KPAs for that
    level must be satisfied
  • there are other processes deemed not key to
    achieving a maturity level; they are not
    addressed by the model

17
Key Process Areas (KPAs)
  • Level 1 (Initial) - individual effort and heroics
    (no KPAs)
  • Level 2 (Repeatable) - Requirements Management,
    SW Project Planning, SW Project Tracking and
    Oversight, SW Subcontract Management, SW Quality
    Assurance, SW Configuration Management
  • Level 3 (Defined) - Organization Process Focus,
    Organization Process Definition, Training
    Program, Integrated SW Management, SW Product
    Engineering, Intergroup Coordination, Peer
    Reviews
  • Level 4 (Managed) - Quantitative Process
    Management, SW Quality Management
  • Level 5 (Optimizing) - Defect Prevention,
    Technology Change Management, Process Change
    Management

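To make the staged rating rule concrete, here is a minimal sketch using the KPA names above. It assumes the standard staged convention (quoted on slides 16 and 44) that a level is reached only when every KPA at that level, and at every level below it, is satisfied; failing a single KPA means failing the level.

```python
# KPAs required at each maturity level (level 1 has none).
KPAS_BY_LEVEL = {
    2: {"Requirements Management", "SW Project Planning",
        "SW Project Tracking and Oversight", "SW Subcontract Management",
        "SW Quality Assurance", "SW Configuration Management"},
    3: {"Organization Process Focus", "Organization Process Definition",
        "Training Program", "Integrated SW Management",
        "SW Product Engineering", "Intergroup Coordination", "Peer Reviews"},
    4: {"Quantitative Process Management", "SW Quality Management"},
    5: {"Defect Prevention", "Technology Change Management",
        "Process Change Management"},
}

def maturity_level(satisfied_kpas):
    """Highest level whose KPAs, and those of every lower level, are all satisfied."""
    level = 1
    for lvl in (2, 3, 4, 5):
        if KPAS_BY_LEVEL[lvl] <= set(satisfied_kpas):  # fail one KPA, fail the level
            level = lvl
        else:
            break
    return level
```
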
18
SW-CMM Maturity Levels
Optimizing (5) - Continuously Improving Process
  (reached via Process Control)
Managed (4) - Predictable Process
  (reached via Process Measurement)
Defined (3) - Standard, Consistent Process
  (reached via Process Definition)
Repeatable (2) - Disciplined Process
  (reached via Basic Management Control)
Initial (1) - Heroics
19
The Initial Process (no KPA)
  • Risk of Total Chaos
  • No management mechanism in place to plan and
    track the work of individuals
  • If procedures are established they are abandoned
    during a crisis
  • PM = Panic Management (make the biggest fire
    smaller)
  • tends to be continuous
  • capability of the org is a characteristic of its
    individuals

Basic Management Control
Initial (1)
20
To Improve to Repeatable Process
  • Understand the difference between speed and
    progress
  • Basic project control
  • Project management
  • project size estimation
  • Management oversight
  • quarterly review of process
  • Quality assurance
  • establish a QA organization (roughly 5-6% of the
    development org)
  • Change control

21
The Repeatable Process
  • Provides control over the way the organization
    establishes its plan and commitments
  • basic software management controls exist for
    tracking cost, schedule and functionality
  • Experienced at doing similar work
  • realistic project commitments based upon previous
    results

Process Definition
Repeatable (2)
Basic Management Control
22
Repeatable (level 2) KPAs
  • Software configuration management
  • Software quality assurance
  • Software subcontract management
  • Software project tracking and oversight
  • Software project planning
  • Requirements management
  • major risks to the organization at this level
  • introduction of new tools will affect process
  • entering new territory, by trying new products
  • can be developing new types of products
  • major organizational changes can be disruptive

23
Getting to the Defined Process
  • Establish a process group
  • roughly 1-3% of the development org
  • Establish a development process architecture
  • describes the technical and management activities
    for proper execution of the development process
  • Introduce a family of software engineering
    methods and technologies
  • design and code inspection
  • formal/semi-formal design methods
  • library control systems
  • comprehensive testing methods

24
The Defined Process
  • Processes for development and maintenance of
    software are standardized across the corporation
  • software engineering is integrated into the
    larger engineering management processes
  • The Acid-test
  • When faced with a crisis they will continue to
    use the process that has been defined (might
    happen at level 2 as well)
  • But only qualitative
  • Little data to support the effectiveness of the
    process
  • need to move to a quantitative process

Process Measurement
Defined (3)
Process Definition
25
Defined (level 3) KPAs
  • Organization Software process definition
  • Organization Software process focus
  • Training program
  • Integrated software management
  • Peer reviews (including inspections)
  • Intergroup coordination
  • Software product engineering

26
To Improve to the Managed Process
  • Establish basic process measurements to identify
    quality and cost of each process step
  • cost-benefit analysis aimed at continuous
    improvement of the process in a generic way
  • Establish a process database and provide
    resources to maintain it (gathering of data)
  • Assess the quality of the product at each step
  • monitor and revise product quality goals as needed
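
As an illustration of what such a process database might look like, here is a minimal sketch using SQLite. The table and column names are hypothetical; the CMM does not prescribe a schema, only that quality and cost be measured for each process step.

```python
import sqlite3

# Hypothetical schema: one row per process step per project,
# recording cost (effort) and a quality indicator for that step.
conn = sqlite3.connect("process.db")
conn.execute("""
    CREATE TABLE IF NOT EXISTS step_measurements (
        project      TEXT,
        process_step TEXT,     -- e.g. 'design', 'code', 'test'
        effort_hours REAL,     -- cost of the step
        defects      INTEGER   -- quality indicator for the step
    )
""")
conn.execute("INSERT INTO step_measurements VALUES (?, ?, ?, ?)",
             ("Project A", "design", 120.0, 7))
conn.commit()

# Average cost and quality per process step, across projects,
# as input to the cost-benefit analysis mentioned above.
for row in conn.execute("""
        SELECT process_step, AVG(effort_hours), AVG(defects)
        FROM step_measurements GROUP BY process_step"""):
    print(row)
```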

27
The Managed Process
Process Control
  • Productivity and quality are quantitatively
    measured across the organization
  • Key software processes are instrumented
  • statistical quality control is employed
  • Definition of the measured data is key
  • only gather what you need
  • process data must not be used to compare projects
    or evaluate individuals

Managed (4)
Process Measurement
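
A minimal sketch of the kind of statistical quality control this implies, using 3-sigma control limits on a hypothetical defect-density measure (both the metric and the limits are illustrative choices, not mandated by the CMM):

```python
from statistics import mean, stdev

# Hypothetical defect densities (defects per KLOC) from recent projects;
# the last value is the project currently being checked.
densities = [4.1, 3.8, 4.5, 3.9, 4.2, 4.0, 6.3]

mu, sigma = mean(densities[:-1]), stdev(densities[:-1])  # baseline from history
upper, lower = mu + 3 * sigma, max(0.0, mu - 3 * sigma)  # 3-sigma control limits

latest = densities[-1]
if not (lower <= latest <= upper):
    # Out-of-control signal: investigate the process,
    # not the projects or individuals (see the note above).
    print(f"Defect density {latest} outside [{lower:.2f}, {upper:.2f}]")
```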
28
Managed (level 4) KPA
  • Quality management
  • Quantitative process management

29
To Improve to the Optimizing Process
  • Causal analysis
  • eliminate the causes of defects
  • Orderly transition of new technologies into the
    organization
  • Use process data to analyze and change the
    process
  • continuous improvement
  • of process to improve product quality
  • of productivity
  • of time needed to develop

30
The Optimizing Process
Optimizing (5)
  • Organizational focus is on continuous process
    improvement, supported by quantitative trend
    analysis of process strengths and weaknesses
  • Process innovations and new technologies are
    introduced when supported by cost benefit
    analysis
  • Data is available to tune the process itself
  • Ability to put the resources where it counts

Process Control
31
Optimizing KPA
  • Process change management
  • Technology change management
  • Defect prevention

32
CMM vs XP
KPAs by CMM level, compared against XP coverage
  • Level 1 - competent people and heroics (no KPAs)
  • Level 2 - Requirements Management, SW Project
    Planning, SW Project Tracking and Oversight, SW
    Subcontract Management, SW Quality Assurance, SW
    Configuration Management
  • Level 3 - Organization Process Focus,
    Organization Process Definition, Training
    Program, Integrated SW Management, SW Product
    Engineering, Intergroup Coordination, Peer
    Reviews
  • Level 4 - Quantitative Process Management, SW
    Quality Management
  • Level 5 - Defect Prevention, Technology Change
    Management, Process Change Management
(the per-KPA XP markings are not preserved in the transcript)

33
CMM representations
  • two representations
  • provide alternative approaches to process
    improvement for organizations familiar with
    either approach
  • represent two different philosophical approaches
    to process improvement

Reference Menezes, CrossTalk
www.stsc.hill.af.mil/cmmi/more_cmmi.asp
34
CMM Representations (contd)
  • Representation 1. focus on the organization as a
    whole and provide a road map of successive stages
    aimed at improving the organization's ability to
    understand and control its process
  • staged view (comparable to SW-CMM)
  • Representation 2. focus on individual processes,
    allowing the organization to choose which process
    or set of processes need to have more capability
  • continuous view (comparable to systems
    engineering and IPD models)

Reference Menezes, CrossTalk
35
Representations (contd)
  • each representation is a 600-page document
  • equivalent staging
  • sometimes desirable to convert an organization's
    capability level achievements into a maturity
    level
  • can't translate from a maturity level back to
    capability levels
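
To see why the translation only works one way, the sketch below uses a deliberately simplified stand-in for the real equivalent-staging rules (which are defined in the CMMI documents): many different capability profiles collapse onto the same maturity level, so the reverse mapping is lost.

```python
# Hypothetical capability profiles: process area -> capability level (0-5).
profile_a = {"REQM": 3, "PP": 3, "PMC": 2}
profile_b = {"REQM": 5, "PP": 2, "PMC": 2}

def maturity_from_profile(profile):
    """Simplified stand-in rule: the maturity level is capped by the
    weakest process area (not the actual equivalent-staging rules)."""
    return max(1, min(profile.values()))

# Both profiles yield the same maturity level, so the maturity level
# alone cannot be translated back into a capability profile.
assert maturity_from_profile(profile_a) == maturity_from_profile(profile_b) == 2
```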

Reference Menezes, CrossTalk
www.stsc.hill.af.mil/cmmi/more_cmmi.asp
36
Continuous Representation
  • groups process areas into
  • process management
  • project management
  • engineering
  • support
  • for each group, assigns a rating from 0 to 5,
    according to an organization's performance on
    process areas in that group

37
Staged Representation
  • groups process areas by maturity level
  • allows an overall assessment of the maturity
    level observed in an organization

38
Some Differences between Representations
Reference www.sei.cmu.edu/cmmi/adoption/cmmi-faq.html
39
Issues with the CMM
  • key process areas (KPAs) focus mostly on
    activities and supporting artifacts associated
    with a conventional waterfall process
  • requirements specifications, documented plans,
    quality assurance audits and inspections, and
    documented processes and procedures
  • very few of the KPAs address the evolving results
    (i.e., the software product) and associated
    engineering artifacts (use case models, design
    models, source code, or executable code)

Reference Royce, CMM vs. CMMI
40
Issues (contd)
  • no emphasis on the architecting/design process,
    assessment process, or deployment process
  • which have proven to be key discriminators for
    project success
  • also overemphasizes peer reviews, inspections and
    traditional Quality Assurance policing methods
  • although manual reviews and inspections may be
    capable of uncovering 60% of errors, they rarely
    uncover the architecturally significant flaws

Reference Royce, CMM vs. CMMI
41
Issues (contd)
  • most implementations of CMM drive organizations
    to produce more documents, more checkpoints, more
    artifacts, more traceability, more reviews, and
    more plans
  • thicker documents, more detailed information, and
    longer meetings are considered better
  • however, agile methods may be able to be mapped
    onto CMM - stay tuned!

Reference Royce, CMM vs. CMMI
42
State of the Industry
Carnegie Mellon Software Engineering Institute,
Software CMM - CBA IPI, SPA and SCAMPI Appraisal Results
(bar chart: percentage of organizations at each maturity
level, Initial through Optimizing)
  • August 2001 data - based on most recent assessment
    since 1997, of 1018 organizations
  • September 2005 data - based on most recent
    assessment since 2001, of 1612 organizations
    reporting a maturity rating
  • Goal for most organizations is to achieve level 3
    (Royce)
43
(No Transcript)
44
Short-comings and Future of CMM
  • Surprise, surprise - CMM is not a silver bullet
  • a mature process is no guarantee of a quality
    product
  • Not well suited for smaller companies / projects
  • Personal Software Process (PSP) is one attempt to
    address this need
  • Crude and harsh 5-point scale
  • if you fail just one of the KPAs, you fail the
    level
  • CMMs now exist for software, people, software
    acquisition, systems engineering and integrated
    product development
  • latest initiative CMM Integration (CMMI)

45
Evolution of CMM
  • initial Capability Maturity Model was developed
    specifically to address software process maturity
  • it was successfully adopted and used in many
    domains
  • other CMMs were developed for other disciplines
    and functions
  • CMMs now exist for software, people, software
    acquisition, systems engineering, and integrated
    product development

46
Not Just Software CMM
47
Decoding abbreviations
  • SCE - Software Capability Evaluation
  • SDCE - Software Development Capability Evaluation
  • SA-CMM - Software Acquisition CMM
  • P-CMM - People CMM
  • SE-CMM - Systems Engineering CMM
  • SSE-CMM - Systems Security Engineering CMM
  • IPD-CMM - Integrated Product Development CMM
  • CMMI - CMM Integration

48
Process Standards we mostly don't have time to
look at
DoD-Std 2167A
SW CMM
EIA/IEEE J-Std-016
Mil-Std 498
IEEE/EIA Std 12207
ISO 9000 series
ISO/IEC 12207
Reference http://www.software.org/quagmire/
49
Recall our six steps
  • Software Capability Maturity Model is not the
    only method
  • Other options are available
  • ISO 9000 certification
  • ISO 15504 (SPICE)
  • Software Process Improvement and Capability
    dEtermination - http://www.sqi.gu.edu.au/spice/
  • introduction of an agile method
  • Also www.spin.org Ottawa Software Process
    Improvement Network
  • many other SPINs exist

50
Supplemental References
  • Capability Maturity Model for Software (Version
    1.1). Carnegie Mellon Software Engineering
    Institute, 1993.
  • www.sei.cmu.edu/publications/documents/93.reports/93.tr.024.html