Practical Software Measurement Challenges and Strategies
1
Practical Software Measurement - Challenges and
Strategies
Krishna Arul, GSG Scotland
2
Objectives
  • Provide an overview of the challenges faced by
    software engineering teams in the Software
    Measurement arena
  • Outline some of the strategies that are employed
    to meet these challenges in GSG

3
Overview
  • About Motorola
  • Organisation
  • Global Software Group
  • GSG Scotland
  • CMM
  • Concept of Six Sigma
  • Metrics
  • Define / Measure
  • Data Set
  • Analysis
  • Tools / Methodology
  • Improvements
  • Conclusions
  • Control
  • Future Plans

4
Motorola's Global Software Group
Premier provider of Software Systems Solutions,
Software Products, and Software Technologies to
Motorola Businesses and their customers worldwide
  • Idea conceived in 1990 to support rapidly increasing
    software demand in Motorola products
  • First Center established in Bangalore, India
  • Currently >20 city locations around the world
  • Independent organization partnering with Motorola
    business units and their customers
  • Domain-focused centers of excellence
  • Skills orientation
  • Process focus
  • Process/methodology improvement to control costs
    and cycle-time
  • Current world-wide headcount > 5000

5
GSG Locations
(World map of centres) Nanjing, St. Petersburg, Livingston, Chicago, Montreal,
Hyderabad, Beijing, Seoul, Krakow, Chengdu, Kuala Lumpur, Perth, Turin,
Cordoba, Phoenix, Ft Lauderdale, Bangalore, Singapore, Adelaide
6
GSG Scotland - Background
  • Centre started in Dec 2000.
  • Why Scotland?
  • Large talent pool of qualified Engineers from
    Scottish Universities.
  • Synergy with (Motorola) Semiconductors and ISLI
    (Institute of System Level Integration)
  • Proximity to, and ease of doing business with,
    main European markets
  • Scotland is now the lead Automotive centre for
    GSG.
  • Focussed on embedded software applications,
    primarily automotive.
  • System-on-Chip (SoC) design team focussed on
    Motorola design needs post Freescale split
  • Achieved CMM L3 certification in May 2004,
    currently working towards Level 5 later this year

7
Six Sigma
(Diagram: normal distribution with +/-3 sigma limits)
8
Software Six Sigma
  • An overall business improvement methodology that
    drives consistently excellent products, services,
    designs and processes.
  • Metric: 3.4 DPMO
  • Process Improvement
  • Management System: E-Training, E-Processes,
    E-Tools, E-Tracking, E-Visibility
  • Typical software processes operate at between 2.3
    and 3.0 Sigma
  • The best software processes operate at 4 to 5
    Sigma
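The 3.4 DPMO target maps to a sigma level through the normal distribution with the conventional 1.5-sigma long-term shift. A minimal sketch of that conversion (Python with SciPy; the shift and the example DPMO values are the standard Six Sigma table figures, not numbers taken from these slides):

```python
from scipy.stats import norm

def sigma_level(dpmo: float, shift: float = 1.5) -> float:
    """Convert defects-per-million-opportunities to a sigma level,
    using the conventional 1.5-sigma long-term shift."""
    defect_free_fraction = 1.0 - dpmo / 1_000_000
    return norm.ppf(defect_free_fraction) + shift

print(round(sigma_level(3.4), 1))      # 6.0 -> the Six Sigma target
print(round(sigma_level(66_807), 1))   # 3.0 -> roughly a typical software process
print(round(sigma_level(6_210), 1))    # 4.0 -> among the best software processes
```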

9
Strategies - Process Improvement
(Diagram: CMM Model, Metrics, More Metrics, Analysis)
10
DMAIC - Define
  • DEFINE what is important to the Organization?
  • But what is of paramount importance to GSG?
  • Parameters chosen for Measurement and Analysis
    (Scorecard):
  • CSS (Customer Satisfaction Survey)
  • COQ and COPQ (Cost of Quality and Cost of Poor
    Quality)
  • Productivity
  • Estimation Accuracy
  • Effort
  • Schedule
  • Size

11

DMAIC - Measure
  • Data Source
  • Annual release data from Motorola's Global
    Software Centres is used for analysis
  • Tools
  • WISE-TCS (Web based Total Customer Satisfaction
    Tool)
  • EPMS (Primavera)
  • IQMEn / E-IQMEn (Quality Metrics Environment)
  • JMP Tool / Minitab for Statistical Analysis
  • JMP - The Statistical Discovery Software

12
Customer Satisfaction Surveys
  • Pre and Post Project Surveys
  • Criteria of satisfaction and importance
  • Scorecard goal: 8.86 average, and 75% of
    projects with all high-importance areas rated at
    8 or above
  • Measured against a Baseline
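As an illustration of how this compound goal could be evaluated (a minimal Python sketch; the field names and survey data are hypothetical, not the WISE-TCS schema):

```python
def meets_scorecard_goal(projects, avg_target=8.86, fraction_target=0.75):
    """Check the CSS scorecard goal: an average rating of 8.86 overall, and
    at least 75% of projects with every high-importance area rated 8 or above.
    Each project is a dict with 'overall' and 'high_importance_scores' keys
    (hypothetical field names)."""
    overall_avg = sum(p["overall"] for p in projects) / len(projects)
    all_high_ok = sum(
        1 for p in projects if all(s >= 8 for s in p["high_importance_scores"])
    )
    return overall_avg >= avg_target and all_high_ok / len(projects) >= fraction_target

surveys = [
    {"overall": 9.1, "high_importance_scores": [9, 8, 9]},
    {"overall": 8.7, "high_importance_scores": [8, 9, 8]},
    {"overall": 8.9, "high_importance_scores": [7, 9, 9]},  # one area below 8
    {"overall": 9.0, "high_importance_scores": [9, 9, 8]},
]
print(meets_scorecard_goal(surveys))  # True: average 8.925 >= 8.86, 3/4 = 75% pass
```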

13
COQ and COPQ
  • Cost Of Quality (COQ) is the sum of effort due to
    appraisal, prevention, internal failure and
    external failure, expressed as a percentage of
    total project effort.
  • Cost Of Poor Quality (COPQ) is the sum of effort
    due to internal failure and external failure,
    expressed as a percentage of total project
    effort.
  • COQ = COPQ + Appraisal Effort + Prevention Effort
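A minimal sketch of these two percentages in Python (the effort categories follow the definitions above; the figures are illustrative assumptions, not GSG data):

```python
def coq_copq(appraisal, prevention, internal_failure, external_failure, total_effort):
    """Return (COQ, COPQ) as percentages of total project effort."""
    copq = (internal_failure + external_failure) / total_effort * 100
    coq = copq + (appraisal + prevention) / total_effort * 100
    return coq, copq

# Hypothetical project with 1000 staff-hours of total effort
coq, copq = coq_copq(appraisal=120, prevention=60,
                     internal_failure=30, external_failure=10, total_effort=1000)
print(f"COQ = {coq:.1f}%, COPQ = {copq:.1f}%")  # COQ = 22.0%, COPQ = 4.0%
```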

14
COQ and COPQ
  • Current Baseline based on Scorecard
  • Cost of Quality <25% (+/-10% for each project)
  • Cost of Poor Quality <5%

15
Productivity
  • Productivity is the ratio of Delta Code released
    to the customer (New + Deleted + Modified +
    Reused + Tool-generated) to the Total Project
    Effort.
  • Productivity increase 10% → 0.62 KAELOC/SM
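A minimal sketch of the ratio in Python, assuming KAELOC means thousands of assembly-equivalent lines of code and SM means staff-months (our reading of the abbreviations; the figures are made up to land on the 0.62 target):

```python
def productivity_kaeloc_per_sm(new, deleted, modified, reused, tool_generated,
                               total_effort_sm):
    """Delta code released (in KAELOC) divided by total project effort (staff-months)."""
    delta_kaeloc = (new + deleted + modified + reused + tool_generated) / 1000.0
    return delta_kaeloc / total_effort_sm

# Hypothetical release: 18,600 delta lines delivered with 30 staff-months of effort
print(f"{productivity_kaeloc_per_sm(12_000, 1_000, 3_000, 2_000, 600, 30):.2f}")  # 0.62
```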

16
Estimation Accuracy
  • Estimation Accuracy is the ratio of Actual value
    to the Estimated value of a parameter.
  • Size, Effort and Schedule estimates are not
    stand-alone metrics but should be analysed in
    context with each other.
  • Deviations of all three estimates outside the
    limits indicate that the combined estimation,
    planning and control processes are not performing
    well.

17
Estimation Accuracy - Size
  • The size estimation accuracy metric (ZEA)
    provides an insight into the project's ability to
    estimate the size of the project
  • ZEA is critical for embedded applications where
    size is constrained by the target device
  • Size Estimation Accuracy: ZEA = 100% +/- 15%

18
Estimation Accuracy - Effort
  • The effort estimation accuracy (EEA) metric
    provides an insight into the project's ability to
    estimate the effort of the project
  • EEA is critical for accurate cost estimation
  • Effort Estimation Accuracy: EEA = 100% +/- 15%

19
Estimation Accuracy - Schedule
  • The schedule estimation accuracy (SEA) metric
    provides an insight into the project's ability to
    estimate the schedule of the project
  • SEA is critical for on-time delivery
  • Schedule Estimation Accuracy: SEA = 100% +/- 15%
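A minimal sketch combining the three accuracy checks against the 100% +/- 15% band (Python; the project numbers are hypothetical):

```python
def estimation_accuracy(actual: float, estimated: float) -> float:
    """Estimation accuracy as a percentage: actual value / estimated value * 100."""
    return actual / estimated * 100.0

def within_limits(accuracy_pct: float, target: float = 100.0, tolerance: float = 15.0) -> bool:
    """Check the 100% +/- 15% band used for ZEA, EEA and SEA."""
    return abs(accuracy_pct - target) <= tolerance

# Hypothetical project: size in KAELOC, effort in staff-days, schedule in weeks
zea = estimation_accuracy(actual=52.0, estimated=48.0)
eea = estimation_accuracy(actual=310.0, estimated=290.0)
sea = estimation_accuracy(actual=26.0, estimated=22.0)
for name, value in (("ZEA", zea), ("EEA", eea), ("SEA", sea)):
    print(f"{name} = {value:.1f}%, within limits: {within_limits(value)}")
# SEA falls outside the band, flagging the schedule estimation process for review
```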

20
DMAIC - Analyze
  • E-IQMEn, PSD and SMSC
  • Project Summary Database (PSD) is the output from
    E-IQMEn, which serves as the GSG organizational
    data repository.
  • Software Metrics Summary Charts (SMSC) are
    integrated with the PSD, with no additional data
    input required by users.

21
Solution Design
22
DMAIC - Improve: Software Metrics Summary
Charts - Organizational Health
23
DMAIC - Improve: Software Metrics Summary
Charts - Organizational Health
24
DMAIC - Future Plans (1)
  • Concentrate on ensuring that the data in E-IQMEn
    is
  • Accurate: The data should provide a true
    reflection of the status of GSG projects, both
    completed and active.
  • Complete: The data should span all major
    metrics areas (effort, size and faults).
  • Up to date: Monthly updates to keep data current.
  • Inclusive: E-IQMEn should contain as much GSG
    data as possible, for all types of GSG work.

25
DMAIC - Future Plans (2)
  • Continuing work in this area:
  • Redefinition of project categories to simplify
    (and orthogonalise) the schema.
  • Building traceability through intermediate tools
    if required, to allow for better tracking of GSG
    effort usage.
  • Development of interfaces from E-IQMEn to other
    tools to facilitate once-only data entry.

26
DMAIC - Conclusions
  • The Return On Investment of such analysis is
    very high to the organization relative to the
    effort spent to perform it, e.g. in the selection
    of domains and types of projects.
  • A strong Measurement System for Software Data is
    established to ascertain whether any correlation
    exists between domains, effort, errors and other
    parameters.
  • Any cost avoidance is a cost saving
  • Nobody said it was going to be easy

27
Thank You
  • Krishna Arul
  • Motorola