The Software Quality Challenge
1
Chapter 1
  • The Software Quality Challenge

2
The uniqueness of software quality assurance
  • Do you think there is such a thing as bug-free software?
  • Can software developers warrant that their software
    applications and documentation are free of any bugs or
    defects?
  • What are the essential differences between software and
    other industrial products such as automobiles, washing
    machines, etc.?

3
The essential differences between software and
other industrial products can be categorized as
follows
  1. Product complexity - the number of operational modes the
    product permits.
  2. Product visibility - SW products are invisible.
  3. Product development and production process.

4
The phases at which defects can be detected in industrial
products and in software products
  • SW products do not benefit from the opportunities for
    detection of defects at all three phases of the production
    process
  • Industrial products
  • Product development: QA -> product prototype
  • Product production planning: production line
  • Manufacturing: QA procedures applied
  • Software products
  • Product development: QA -> product prototype
  • Product production planning: not required
  • Manufacturing: copying the product and printing copies

5
Factors affecting the detection of defects in SW products
vs. other industrial products
Characteristic | SW products | Other industrial products
Complexity | Usually very complex, allowing a very large number of operational options | Degree of complexity much lower
Visibility | Invisible; impossible to detect defects or omissions by sight (e.g., a diskette or CD storing the product) | Visible, allowing effective detection of defects by sight
Nature of development and production process | Opportunities to detect defects arise in only one phase, namely product development | Opportunities to detect defects arise in all phases of development and production
6
Important Conclusion
  • The great complexity and invisibility of software, among
    other product characteristics, make the development of SQA
    methodologies and their successful implementation a highly
    professional challenge

7
The environment for which SQA methods are
developed
  • Pupils and students
  • Hobbyists
  • Engineers, economists, managers and professionals in other
    fields
  • SW development professionals
  • All these SW developers are required to deal with SW
    quality problems (bugs)

8
The SQA environment - the main characteristics of this
environment
  • Contractual conditions
  • Subjection to customer-supplier relationship
  • Required teamwork
  • Cooperation and coordination with other SW teams
  • Interfaces with other SW systems
  • The need to continue carrying out a project
    despite team member changes.
  • The need to carry out SW maintenance for an extended
    period.

9
Contractual conditions
  • The activities of SW development and maintenance need to
    cope with
  • A defined list of functional requirements
  • The project budget
  • The project timetable

10
Subjection to customer-supplier relationship
  • The SW developer must cooperate continuously with the
    customer
  • To consider his requests for changes
  • To discuss his criticisms
  • To get his approval for changes

11
Required teamwork
  • Factors motivating the establishment of a project
    team
  • Timetable requirements
  • The need for a variety of expertise
  • The wish to benefit from mutual professional support and
    review for enhancement of project quality

12
Cooperation and coordination with other SW teams
  • Cooperation may be required with
  • Other SW dev. teams in the same org.
  • HW dev. teams in the same org.
  • SW and HW dev. teams of other suppliers
  • The customer's SW and HW dev. teams that take part in the
    project's development

13
Interfaces with other SW Systems
  • Input interfaces
  • Output interfaces
  • I/O interfaces to a machine's control board, as in medical
    and laboratory control systems

14
The need to continue carrying out a project
despite team member changes.
  • During the project development period we might face
  • Team members leaving
  • Employees switching jobs
  • Transfers to another city

15
The need to continue out SW maintenance for
extended period.
  • From 5 to 10 years , customers need continue to
    utilizing their systems
  • Maintenance
  • Enhancement
  • Changes ( Modification )

16
Chapter 2
  • What is Software Quality ?

17
What is Software ?
  • IEEE Definition
  • Software Is
  • Computer programs, procedures, and possibly
    associated documentation and data pertaining to
    the operation of a computer system.

18
The IEEE definition is almost identical to the ISO
definition (ISO/IEC 9000-3)
  • Computer programs (code)
  • Procedures
  • Documentation
  • Data necessary for operating the SW system

19
To sum up
  • Software quality assurance always includes
  • Code quality
  • The quality of the documentation
  • The quality of the necessary SW data

20
SW errors, faults and failures
  • Questions arising from the HRM conference - page 16.
  • An error can be a grammatical error in one or more of the
    code lines, or a logical error in carrying out one or more
    of the client's requirements.
  • Not all SW errors become SW faults.
  • It is SW failures that disrupt our use of the software.

21
The relationship between SW faults and SW failures
  • Do all SW faults end with SW failures?
  • The answer is: not necessarily
  • A SW fault becomes a SW failure only when it is activated.
  • See the example on pages 17-18

22
Classification of the causes of SW errors
  • SW errors are the cause of poor SW quality
  • SW errors can be
  • Code errors
  • Documentation errors
  • SW data errors
  • The cause of all these errors is human

23
The nine causes of software errors
  • Faulty requirement definition
  • Client-developer communication failures
  • Deliberate deviation from SW requirements
  • Logical design errors
  • Coding errors
  • Non-compliance with documentation and coding
    instructions
  • Shortcomings of the testing process
  • Procedure errors
  • Documentation errors

24
Faulty requirement definition
  • Erroneous definition of requirements
  • Absence of vital requirements
  • Incomplete definition of requirements
  • Inclusion of unnecessary requirements

25
Client-developer communication failures
  • Misunderstandings resulting from defective
    client-developer communications.
  • Misunderstanding of the client's requirement changes
    presented to the developer
  • In written form
  • Orally
  • Misunderstanding of the client's responses to design
    problems
  • Others

26
Deliberate deviation from SW requirements
  • The developer reuses SW modules taken from an earlier
    project
  • Due to time and budget pressures
  • Due to unapproved "improvements"

27
Logical design errors
  • These errors are made by systems architects, system
    analysts, and SW engineers, and include
  • Erroneous algorithms
  • Process definitions that contain sequencing
    errors
  • Erroneous definition of boundary conditions
  • Omission of required SW system states
  • Omission of definitions concerning reactions to
    illegal operations

28
Coding errors
  • Misunderstanding the design documentation
  • Linguistic errors in the programming language
  • Errors in the application of CASE and other development
    tools
  • Etc.

29
Non-compliance with documentation and coding instructions
  • Team members who need to coordinate their own code with
    code modules developed by non-complying team members
  • Individuals replacing a non-complying team member will
    find it difficult to fully understand his work.
  • Design reviews of the work of a non-complying team member
    become more difficult

30
Shortcomings of the testing process
  • Incomplete testing plans
  • Failures to document and report errors and faults
  • Failures to promptly correct detected SW faults
    as a result of inappropriate indications of the
    reasons for the fault.
  • Incomplete correction of detected errors.

31
Procedure errors and documentation errors
  • See the example on page 22

32
Software quality - Definition IEEE
  1. The degree to which a system, component, or
    process meets specified requirements.
  2. The degree to which a system, component, or
    process meets customer or user needs or
    expectations.

33
Software quality - Pressman's definition
  • Conformance to explicitly stated functional and
    performance requirements, explicitly documented
    standards, and implicit characteristics that are
    expected of all professionally developed
    software.

34
Software Quality Assurance The IEEE Definition
  • SQA is
  • A planned and systematic pattern of all actions
    necessary to provide adequate confidence that an
    item or product conforms to established technical
    requirements.
  • A set of activities designed to evaluate the
    process by which the products are developed or
    manufactured. Contrast with quality control.

35
The IEEE SQA definition excludes maintenance, timetable,
and budget issues.
  • The author adopts the following
  • SQA should not be limited to the development process. It
    should be extended to cover the long years of service
    subsequent to product delivery, adding the software
    maintenance functions into the overall conception of SQA.
  • SQA actions should not be limited to the technical aspects
    of the functional requirements; they should include
    activities that deal with scheduling, the timetable, and
    the budget.

36
SQA - Expanded Definition
A systematic, planned set of actions necessary to provide
adequate confidence that the software development process
or the maintenance of a software system product conforms
to established functional technical requirements, as well
as to the managerial requirements of keeping the schedule
and operating within the budgetary confines.

This definition corresponds strongly with the concepts at
the foundation of ISO 9000-3 (1997) and also corresponds
to the main outlines of the CMM for software. See Table
2.2, page 27.
37
Software Quality Assurance vs. Software Quality Control
  • Quality control - a set of activities designed to evaluate
    the quality of a developed or manufactured product. It
    takes place before the product is shipped to the client.
  • Quality assurance - the main objective is to minimize the
    cost of guaranteeing quality by a variety of activities
    performed throughout the development process; these
    activities prevent the causes of errors, and detect and
    correct them early in the development process.

38
The objectives of SQA activities (see page 29)
  • Software development (process-oriented)
  • Software maintenance (product-oriented)

39
SQA Vs Software Engineering
  • SW engineering (IEEE definition)
  • The application of a systematic, disciplined, quantifiable
    approach to the development and maintenance of SW; that
    is, the application of engineering to software.

40
Chapter 3
  • Software Quality Factors

41
SQ Factors
  • From the previous chapters we have already established
    that the requirements document is one of the most
    important elements for achieving SQ.
  • What is a good SW quality requirements document?

42
The need for comprehensive SQ requirements
  • Our sales IS seems very good, but it frequently fails - at
    least twice a day, for 20 minutes or more (the SW house
    claims no responsibility).
  • The local product contains SW and everything is OK, but
    when we began planning the development of a European
    version, it turned out that almost all the design and
    programming would have to be new.
  • Etc. - see page 36.

43
There are some characteristics common to all these "buts"
  • All the SW projects satisfactorily fulfilled the basic
    requirements for correct calculations.
  • All the SW projects suffered from poor performance in
    important areas such as maintenance, reliability, SW
    reuse, or training.
  • The cause of the poor performance of the developed SW
    projects in these areas was the lack of predefined
    requirements to cover these important aspects of SW
    functionality.
  • The solution: a comprehensive definition of requirements
    (SW quality factors)

44
Classification of SW requirements into SW quality
factors.
  • McCall's factor model
  • This model classifies all SW requirements into 11 SW
    quality factors, grouped into 3 categories
  • Product operation: Correctness, Reliability, Efficiency,
    Integrity, Usability
  • Product revision: Maintainability, Flexibility,
    Testability
  • Product transition: Portability, Reusability,
    Interoperability
  • See McCall's model of SW quality factors tree on page 38;
    a small illustrative sketch of the classification follows
    below.
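As an illustration only, the classification above can be captured in a small lookup structure; the Python sketch below is not from the book, just a convenient way to record the 11 factors in their 3 categories.

```python
# McCall's 11 SW quality factors grouped into the three categories listed above.
MCCALL_FACTORS = {
    "product operation": ["correctness", "reliability", "efficiency",
                          "integrity", "usability"],
    "product revision": ["maintainability", "flexibility", "testability"],
    "product transition": ["portability", "reusability", "interoperability"],
}

def category_of(factor):
    """Return the McCall category a given quality factor belongs to."""
    for category, factors in MCCALL_FACTORS.items():
        if factor in factors:
            return category
    return None

print(category_of("testability"))  # -> "product revision"
```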

45
Product operation SW quality factors
  • Correctness - output specifications are usually
    multidimensional; some common dimensions include
  • The output mission
  • The required accuracy
  • The completeness
  • The up-to-dateness of the information
  • The availability of the information (the reaction time)
  • The standards for coding and documenting the SW system
  • See the example on page 39.

46
Product operation SW quality factors
  • Reliability
  • Reliability requirements deal with failures to provide
    service. They determine the maximum allowed SW system
    failure rate, and can refer to the entire system or to one
    or more of its separate functions.
  • See the example on page 39 (heart-monitoring unit)

47
Product operation SW quality factors
  • Efficiency
  • Deals with the HW resources needed to perform all the
    functions of the SW system in conformance with all other
    requirements.
  • See the examples on page 40 (CPU speed, etc.)
  • Integrity
  • Deals with SW system security, that is, requirements to
    prevent access by unauthorized persons.
  • See the examples on page 40

48
Product operation SW quality factors
  • Usability
  • Deals with the scope of staff resources needed to
    train a new employee and to operate the SW
    system.
  • See examples page 41

49
Product revision SW quality factors
  • Maintainability
  • Maintainability requirements determine the
    efforts that will be needed by users and
    maintenance personnel to identify the reasons for
    SW failures, to correct the failure, and to
    verify the success of the corrections.
  • Example - typical maintainability requirements
  • The size of a SW module will not exceed 30
    statements
  • The programming will adhere to the company coding
    standards and guidelines.

50
Product revision SW quality factors
  • Flexibility
  • The capabilities and efforts required to support adaptive
    maintenance activities are covered by flexibility
    requirements. This factor's requirements also cover
    perfective maintenance activities, such as changes and
    additions to the SW in order to improve its service and
    adapt it to changes in the firm's technical or commercial
    environment.
  • See the example on page 42

51
Product revision SW quality factors
  • Testability
  • Testability requirements deal with the testing of an IS as
    well as with its operation, for example:
  • Providing predefined intermediate results and log files.
  • Automatic diagnostics performed by the SW system prior to
    starting the system, to find out whether all components of
    the SW system are in working order.
  • Obtaining a report about detected faults.
  • See the examples on pages 42-43

52
Product transition SW quality factors
  • Portability
  • Portability requirements pertain to the adaptation of a SW
    system to other environments consisting of
  • Different HW
  • Different OS
  • Example: SW designed to work under a Windows 2000
    environment is required to allow low-cost transfer to
    Linux.

53
Product transition SW quality factors
  • Reusability
  • Deals with the use of SW modules originally designed for
    one project in a new SW project currently being developed.
  • The reuse of SW is expected to save resources, shorten the
    project period, and provide higher quality modules. These
    benefits of higher quality are based on the assumption
    that most SW faults have already been detected by the SQA
    activities performed previously on the reused SW.

54
Product transition SW quality factors
  • Interoperability
  • Focuses on creating interfaces with other SW systems or
    with other equipment firmware.
  • Example
  • The firmware of medical laboratory equipment is required
    to process its results according to a standard data
    structure that can then serve as input to a number of
    standard laboratory information systems.

55
Alternative Models Of SW Quality Factors
  • Two other models of SQ factors
  • Evans and Marciniak (1987) - 12 factors
  • Deutsch and Willis (1988) - 15 factors
  • Five new factors were suggested
  • Verifiability
  • Expandability
  • Safety
  • Manageability
  • Survivability

56
Alternative Models Of SW Quality Factors
  • Five new factors were suggested
  • Verifiability - defines design and programming features
    that enable efficient verification of the design and
    programming (modularity, simplicity, adherence to
    documentation and programming guidelines).
  • Expandability - refers to the future efforts that will be
    needed to serve larger populations, improve services, or
    add new applications in order to improve usability.
  • Safety - meant to eliminate conditions hazardous to
    equipment as a result of errors in process control SW.
  • Manageability - refers to the administrative tools that
    support SW modification during the SW development and
    maintenance periods.
  • Survivability - refers to the continuity of service. These
    requirements define the minimum time allowed between
    failures of the system, and the maximum time permitted for
    recovery of service.

57
Who is interested in the definition of quality
requirements ?
  • The client is not the only party interested in defining
    the requirements that assure the quality of the SW
    product.
  • The developer is often also interested, especially in
  • Reusability
  • Verifiability
  • Portability
  • Any SW project will be carried out according to two
    requirements documents
  • The client's requirements document
  • The developer's additional requirements document.

58
Chapter 4
  • The Components of the SQA System - Overview

59
The SQA system - an SQA architecture
  • SQA system components can be classified into 6 classes
  • Pre-project components
  • Components of project life cycle activities assessment
  • Components of infrastructure for error prevention and
    improvement
  • Components of SW quality management
  • Components of standardization, certification, and SQA
    system assessment
  • Organizing for SQA - the human components

60
Pre-project Components
  • To assure that
  • The project commitments have been adequately
    defined considering the resources required, the
    schedule and budget.
  • The development and quality plans have been
    correctly determined.

61
Components of project life cycle activities
assessment
  • The project life cycle is composed of two stages
  • The development life cycle stage
  • Detects design and programming errors
  • Its components are divided into
  • Reviews
  • Expert opinions
  • Software testing
  • Assurance of the quality of subcontractors' work and
    customer-supplied parts
  • The operation-maintenance stage
  • Includes specialized maintenance components as well as
    development life cycle components, which are applied
    mainly to functionality-improving maintenance tasks.

62
Components of infrastructure for error prevention and
improvement
  • The main objectives of these components, which are applied
    throughout the entire organization, are
  • To eliminate, or at least reduce, the rate of errors,
    based on the organization's accumulated SQA experience.

63
Components of software quality management
  • This class of components is geared toward several goals
  • The major ones are the control of development and
    maintenance activities and the introduction of early
    managerial support actions that mainly prevent or minimize
    schedule and budget failures and their outcomes.

64
Components of standardization, certification, and
SQA system assessment
  • The main objectives of this class are
  • Utilization of international professional knowledge
  • Improvement of coordination of the organizational quality
    system with other organizations
  • Assessment of the achievements of quality systems
    according to a common scale
  • The various standards are classified into two groups
  • Quality management standards
  • Project process standards

65
Organizing for SQA- the human components
  • The SQA organizational base includes
  • Managers
  • Testing personnel
  • The SQA unit and practitioners interested in SQ
  • The main objectives are
  • To initiate and support the implementation of SQA
    components
  • To detect deviations from SQA procedures and methodology
  • To suggest improvements

66
Part II: Pre-project SQ components - Chapter 5
  • Contract Review

67
Contract Review
  • Contract review is the software quality element that
    reduces the probability of undesirable situations like the
    one in the CFV project.
  • Contract review is required by the ISO 9001 standard and
    the ISO 9000-3 guidelines.

68
The Contract review process and its stages
  • Several situations can lead a SW company to sign a
    contract with a customer, such as
  • Participation in a tender
  • Submission of a proposal according to the customer's RFP
  • Receipt of an order from a company's customer
  • Receipt of an internal request or order from another
    department in the organization

69
The Contract review process and its stages
  • Contract review
  • Is the SQA component devised to guide the review of drafts
    of proposal and contract documents.
  • If applicable, it also provides oversight (supervision) of
    the contracts carried out with potential project partners
    and subcontractors.

70
The Contract review process itself is conducted
in two stages
  • Stage 1: Review of the proposal draft prior to submission
    to the potential customer (proposal draft review). Reviews
    the final proposal draft and the proposal's foundations
  • The customer's requirement documents
  • The customer's additional details and explanations of the
    requirements
  • Cost and resource estimates
  • Existing contracts or contract drafts of the supplier with
    partners and subcontractors.

71
The Contract review process itself is conducted
in two stages
  • Stage 2: Review of the contract draft prior to signing
    (contract draft review)
  • Reviews the contract draft on the basis of the proposal
    and the understandings (including changes) reached during
    the contract negotiation sessions.
  • The individuals who perform the review thoroughly examine
    the draft while referring to a comprehensive range of
    review subjects; a checklist is very helpful for assuring
    full coverage of the relevant subjects.
  • See appendices 5A and 5B

72
Contract Review objectives
  • Proposal draft review objectives (assure the following)
  • Customer requirements have been clarified and documented
  • Alternative approaches for carrying out the project have
    been examined
  • Formal aspects of the relationship between the customer
    and the SW firm have been specified
  • Identification of development risks
  • Adequate estimates of project resources and timetable have
    been prepared
  • Examination of the customer's capacity to fulfill his
    commitments
  • Definition of partners' and subcontractors' participation
    conditions
  • Definition and protection of proprietary rights

73
Contract Review objectives
  • Contract draft review objectives (assure the following)
  • No unclarified issues remain in the contract draft
  • All the understandings reached between the customer and
    the firm are fully and correctly documented
  • No changes, additions, or omissions that have not been
    discussed and agreed upon are introduced into the contract
    draft

74
Factors affecting the extent of a contract review
  • Project magnitude, usually measured in man-month
    resources.
  • Project technical complexity
  • Degree of staff acquaintance with and experience
    in the project area.
  • Project organizational complexity - the greater the number
    of organizations (partners, subcontractors, and customers)
    taking part in the project, the greater the contract
    review effort required.

75
Who performs a contract review?
  • The leader or another member of the proposal team
  • The members of the proposal team
  • An outside professional or a company staff member
    who is not a member of the proposal team.
  • A team of outside experts.

76
Implementation of a contract review of a major
proposal
  • The characteristics of the major proposal
  • Very large-scale project
  • Very high technical complexity
  • New professional area for the company
  • High organizational complexity
  • The difficulties of carrying out contract
    reviews for major proposals
  • Time pressures
  • Proper contract review requires substantial
    professional work
  • The potential contract review team members are
    very busy.

77
Implementation of a contract review of a major
proposal
  • Recommended avenues ( approaches ) for
    implementing major contract reviews
  • The contract review should be scheduled.
  • A team should carry out the contract review
  • A contract review team leader should be appointed
  • The activities of the team leader include
  • Recruitment of the team members
  • Distribution of review tasks
  • Coordination between members
  • Coordination between the review team and the
    proposal team
  • Follow-up of activities, especially compliance
    with the schedule
  • Summarization of the findings and their delivery
    to the proposal team.

78
Contract review for internal projects
  • See table 5.1, page 86
  • The main point here is the internal relationship.
  • Loose relationships are usually characterized by
    insufficient examination of the project's requirements,
    its resources and development risks.
  • To avoid these problems, we have to apply contract review
    to internal projects just as to external projects, by
    implementing procedures that define
  • An adequate proposal for the internal project
  • A proper contract review process
  • An adequate agreement between the internal customer and
    the internal supplier.

79
Chapter 6 Development and quality plans
  • Development plans and quality plans are the major elements
    needed for project compliance with the ISO 9000-3
    standards (ISO/IEC, 2001) and with IEEE 730.
  • They are also important elements in the Capability
    Maturity Model (CMM) for assessment of SW development
    organization maturity.
  • The project needs development and quality plans that
  • Are based on proposal materials that have been re-examined
    and thoroughly updated
  • Are more comprehensive than the approved proposal,
    especially with respect to schedules, resource estimates,
    and development risk evaluations
  • Include additional subjects, absent from the approved
    proposal
  • Others

80
Development plan and quality plan objectives
  1. Scheduling development activities that will lead
    to successful and timely completion of the
    project, and estimating the required manpower
    resources and budget.
  2. Recruiting team members and allocating
    development resources.
  3. Resolving development risks.
  4. Implementing required SQA activities
  5. Providing management with the data needed for project
    control.

81
Elements of the development plan
  • Project products
  • Project interfaces
  • Project methodology and development tools
  • SW development standards and procedures
  • The mapping of the development process (project management
    Gantt chart)
  • Project milestones (documents, code, reports)
  • Project staff organization (organizational structure,
    professional requirements, number of team members, names
    of team leaders)
  • Development facilities (SW and HW tools, space, period
    required for each use)
  • Development risks (see next slide)
  • Control methods
  • Project cost estimation

82
Development risks
  • A development risk is a state or property of a development
    task or environment which, if ignored, will increase the
    likelihood of project failure, such as
  • Technological gaps
  • Staff shortages
  • Interdependence of organizational elements - the
    likelihood that suppliers or specialized HW or SW
    subcontractors, for example, will not fulfill their
    obligations on schedule.

83
Elements of Quality Plan
  • Quality goals (quantitative measures - see the example on
    page 102)
  • Planned review activities; for each review activity
  • The scope of the review activity
  • The type
  • The schedule (priorities)
  • The specific procedure to be applied
  • Who is responsible for carrying out the review activity
  • Planned SW tests (a complete list of planned SW tests
    should be provided); for each test
  • The unit, integration or the complete system to be tested
  • The type of testing activities to be carried out
  • The planned test schedule
  • The specific procedure
  • Who is responsible

84
Elements of Quality Plan
  • Planned acceptance tests for externally developed SW
  • Configuration management - configuration management tools
    and procedures, including the change-control procedures
    meant to be applied throughout the project

85
Development and quality plans for small projects and
internal projects
  • See pages 105-106

86
Chapter 7: Integrating quality activities in the
project life cycle
  • Classic and other SW development methodologies
  • SDLC (requirements definition, analysis, design, coding,
    system tests, installation and conversion, operation and
    maintenance)

87
Integrating Quality activities in the project
life cycle
  • Prototyping

88
Integrating Quality activities in the project
life cycle
  • The spiral model - see page 128
  • It is an improved methodology for overseeing large and
    more complex projects
  • Combines SDLC and prototyping
  • At each iteration of the spiral the following activities
    are performed
  • Planning
  • Risk analysis and resolution
  • Engineering activities
  • Customer evaluation, comments, changes, etc.

89
Integrating Quality activities in the project
life cycle
  • The object-oriented model
  • Easy integration of existing SW modules (objects) into
    newly developed SW systems.
  • A SW component library serves this purpose by supplying SW
    components for reuse. See page 130
  • Advantages of library reuse
  • Economy
  • Improved quality
  • Shorter development time
  • The advantages of OOP will grow as the store of reusable
    SW components grows (examples: Microsoft and Unix)

90
Factors affecting intensity of quality assurance
activities in the development projects
  • Quality assurance activities will be integrated into a
    development plan that implements one or more SW
    development models
  • Quality assurance planners for a project are required to
    determine
  • The list of QA activities needed for the project
  • For each QA activity
  • Timing
  • Who performs it (team members or an external QA body) and
    the resources required
  • Resources required for removal of defects and introduction
    of changes.

91
Factors affecting intensity of quality assurance
activities in the development projects
  • Project factors
  • Magnitude of the project
  • Technical complexity and difficulty
  • Extent of reusable SW components
  • Severity of failure outcome if the project fails
  • Team factors
  • Professional qualification of team members
  • Team acquaintance with the project and its
    experience in the area
  • Availability of staff members who can professionally
    support the team
  • Familiarity with the team members, in other words the
    percentage of new staff members in the team
  • See the example on page 132

92
Verification, Validation and Qualification
  • Three aspects of quality assurance of the SW product are
    examined under the headings of verification, validation,
    and qualification (IEEE Std 610.12-1990)
  • Verification - the process of evaluating a system or
    component to determine whether the products of a given
    development phase satisfy the conditions imposed at the
    start of that phase.
  • It examines the consistency of the products being
    developed with products developed in the previous phases.
  • The examiner can thus assure that the development phases
    have been completed correctly

93
Verification, Validation and Qualification
  • Validation - the process of evaluating a system or
    component during or at the end of the development process
    to determine whether it satisfies specified requirements.
  • It represents the customer's interest by examining the
    extent of compliance with his original requirements.
  • Comprehensive validation reviews tend to improve customer
    satisfaction with the system

94
Verification, Validation and Qualification
  • Qualification - the process used to determine whether a
    system or component is suitable for operational use.
  • It focuses on operational aspects, where maintenance is
    the main issue
  • Planners are required to determine which of these aspects
    should be examined in each quality assurance activity.

95
A model for SQA defect removal effectiveness and cost
  • The model deals with two quantitative aspects
  • Effectiveness in removing project defects
  • The cost of removal
  • See page 135

96
Defect removal effectiveness
  • It is assumed that any SQA activity filters (screens) a
    certain percentage of the existing defects.
  • In most cases, the percentage of removed defects is
    somewhat lower than the percentage of detected defects, as
    some corrections are ineffective or inadequate.
  • The next SQA activity will face both the remaining defects
    and the new defects created in the current development
    phase.
  • It is assumed that the filtering effectiveness of each QA
    activity with respect to accumulated defects is not less
    than 40%.
  • Table 7.4, page 136, lists the average filtering
    effectiveness of QA activities.

97
Cost of defect removal
  • The cost of defect removal varies by development phase;
    costs rise substantially as the development process
    proceeds.
  • Example: removal of a design defect detected in the design
    phase may require an investment of 2.5 working days;
    removal of the same defect may require 40 days during the
    acceptance tests.
  • Defect-removal costs based on several surveys are shown in
    table 7.5, page 137.

98
The Model
  • The model is based on the following assumptions
  • The development process is linear and sequential,
    following the waterfall model.
  • A number of new defects are introduced in each development
    phase (see table 7.3, page 135).
  • Review and test SQA activities serve as filters, removing
    a percentage of the entering defects and letting the rest
    pass to the next phase. If we have 30 defects and the
    filtering effectiveness is 60%, then 18 defects will be
    removed and 12 will pass on to the next phase.
  • At each phase, the incoming defects are the sum of the
    defects not removed so far and the new defects introduced
    (created) in the current development phase.
  • The cost is calculated for each QA activity by multiplying
    the number of defects removed by the relative cost of
    removing a defect (table 7.5).
  • The remaining defects, passed on to the customer, will be
    detected by him.

99
The model uses the following notation (a small numerical
sketch follows below)
  • POD - phase-originated defects (table 7.3)
  • PD - passed defects
  • FE - filtering effectiveness (table 7.4)
  • RD - removed defects
  • CDR - cost of defect removal (table 7.5)
  • TRC - total removal cost
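A minimal numerical sketch of the model's bookkeeping, in Python. The phase names, defect counts (POD), filtering rates (FE) and relative removal costs (CDR) below are assumed example values, not the figures from tables 7.3-7.5.

```python
# Illustrative defect-removal model: each phase adds new defects (POD),
# its QA activity removes a fraction of the accumulated defects (FE),
# and each removed defect costs CDR relative cost units.
phases = [
    # (phase name, POD, FE, CDR) - assumed example values
    ("requirements", 15, 0.50, 1.0),
    ("design",       30, 0.60, 2.5),
    ("coding",       40, 0.65, 6.5),
    ("system tests",  5, 0.70, 16.0),
]

passed = 0          # PD: defects passed on from the previous phase
total_cost = 0.0    # TRC: total removal cost accumulated so far

for name, pod, fe, cdr in phases:
    incoming = passed + pod          # defects entering this phase's QA filter
    removed = round(incoming * fe)   # RD: defects removed by this QA activity
    passed = incoming - removed      # defects escaping to the next phase
    cost = removed * cdr
    total_cost += cost
    print(f"{name:13s} incoming={incoming:3d} removed={removed:3d} "
          f"passed={passed:3d} cost={cost:6.1f}")

print(f"Defects passed on to the customer: {passed}; TRC = {total_cost:.1f}")
```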

100
Chapter 8 Reviews
  • IEEE definition of a review process
  • A process or meeting during which a work product or set of
    products is presented to project personnel, managers,
    users, customers, or other interested parties for comment
    or approval.

101
Methodologies for reviewing documents
  • Reviews acquire special importance in the SQA process
    because they provide early direction and prevent the
    passing of design and analysis errors downstream, to
    stages where error detection and correction are much more
    complicated and costly
  • The methodologies for reviewing documents
  • Formal design reviews
  • Peer reviews (inspections and walkthroughs)
  • Expert opinions
  • Standards for SW reviews are the subject of IEEE Std 1028
    (IEEE, 1997).

102
Reviews Objectives ( Direct Objectives )
  • To detect analysis and design errors as well as subjects
    where corrections, changes and completions are required
    with respect to the original specifications and approved
    changes.
  • To identify new risks likely to affect completion of the
    project.
  • To locate deviations from templates, style procedures and
    conventions. Correction of these deviations is expected to
    contribute to improved communication and coordination
    resulting from greater uniformity of methods and
    documentation style.
  • To approve the analysis or design product. Approval allows
    the team to continue to the next development phase.

103
Reviews Objectives ( Indirect Objectives )
  • To provide an informal meeting place for exchange
    of professional knowledge about development
    methods, tools, and techniques.
  • To record analysis and design errors that will serve as a
    basis for future corrective actions. The corrective
    actions are expected to improve development methods by
    increasing effectiveness and quality, among other things.

104
Formal design reviews ( DRs )
  • Formal design reviews are also called
  • Design reviews (DRs)
  • Formal technical reviews (FTRs)
  • Without this approval, the development team cannot
    continue to the next phase of the SW development project.
  • A formal design review can be conducted at any development
    milestone requiring the completion of an analysis or
    design document, whether that document is a requirement
    specification or an installation plan.

105
A list of common Formal design reviews
  • DPR - development plan review
  • SRSR - software requirement specification review
  • PDR - preliminary design review
  • DDR - detailed design review
  • DBDR - database design review
  • TPR - test plan review
  • STPR - software test procedure review
  • VDR - version description review
  • OMR - operator manual review
  • SMR - support manual review
  • TRR - test readiness review
  • PRR - product release review
  • IPR - installation plan review

106
The Formal Design Review will focus on
  • The participants
  • The prior preparations
  • The DR session
  • The recommended post-DR activities

107
The participants in a DR
  • All DRs are conducted by
  • A review leader
  • A review team
  • The review leader's characteristics
  • Knowledge and experience in development of projects of the
    type reviewed
  • Seniority at a level similar to, if not higher than, that
    of the project leader
  • A good relationship with the project leader and his team
  • A position external to the project team.
  • Small development departments and software houses
    typically have difficulties finding an appropriate
    candidate to lead the review team. One possible solution
    is the appointment of an external consultant.

108
The Review Team
  • It is desirable for non-project staff to make up the
    majority of the review team.
  • The review team should have three to five members to be
    efficient

109
Preparation for a DR
  • Preparations for a DR session are to be completed by all
    three main participants in the review
  • The review leader
  • The review team
  • The development team.
  • Each one is required to focus on distinct aspects of the
    process.
  • Review leader preparations (main tasks)
  • To appoint the team members
  • To schedule the review sessions
  • To distribute the design document among team members (hard
    copy, electronic copy, etc.)

110
Preparation for a DR
  • Review team preparations (main tasks)
  • Review the design document and list comments prior to the
    review session
  • Team members may use review checklists - see chapter 15
    (checklists)
  • Development team preparations (main tasks)
  • Prepare a short presentation of the design document
  • The presentation should focus on the main professional
    issues awaiting approval rather than wasting time on a
    description of the project in general.

111
The DR session
  • A typical DR session agenda includes
  • A short presentation of the design document
  • Comments made by members of the review team
  • Verification and validation, in which each of the comments
    is discussed to determine the required actions
    (corrections, changes and additions) that the project team
    has to perform.
  • Decisions about the design product (document), which
    determine the project's progress. These decisions take one
    of the following three forms

112
Decisions forms
  • Full approval - enables immediate continuation to the next
    phase. It may be accompanied by demands for some minor
    corrections to be performed by the project team.
  • Partial approval - approval of immediate continuation to
    the next phase for some parts of the project, with major
    action items demanded for the remainder of the project.
  • Denial of approval - demands a repeat of the DR

113
The DR report - see appendix 8A
  • One of the review leader's responsibilities is to issue a
    DR report immediately after the review session.
  • An early report lets the development team perform the
    corrections earlier and minimize the attendant delays to
    the project schedule.
  • The report's major sections contain
  • A summary of the review discussion
  • The decision about continuation of the project
  • A full list of the required actions (corrections, changes,
    additions) and the anticipated completion dates
  • The name(s) of the review team member(s) assigned to
    follow up performance of the corrections.

114
The follow-up process
  • The review leader himself is required to determine whether
    each action item has been satisfactorily accomplished as a
    condition for allowing the project to continue to the next
    phase.
  • Follow-up should be documented to enable clarification.

115
Pressman (2000, chapter 8)
  • Pressman's 13 golden guidelines for a successful design
    review
  • See page 157

116
Peer reviews - two review methods (inspection and
walkthrough)
  • The major difference between formal design reviews and
    peer review methods is rooted in the participants'
    authority.
  • In peer reviews, as expected, the participants are the
    project leader's equals - members of his department and of
    other units.
  • The other differences lie in the degree of authority and
    the objective of each review method.
  • The main objective of peer reviews lies in detecting
    errors and deviations from standards.
  • The appearance of CASE tools has reduced the value of
    manual reviews such as inspections and walkthroughs.
  • Research has found that peer reviews are a highly
    efficient as well as effective review method.

117
Inspection Walkthrough
  • What differentiates a walkthrough from an inspection is
    the level of formality; inspection is the more formal of
    the two.
  • Inspection emphasizes the objective of corrective actions.
  • A walkthrough's findings are limited to comments on the
    document reviewed.

118
Inspection Walkthrough
  • Inspection is usually based on a comprehensive
    infrastructure, including
  • Inspection checklists developed for each type of design
    document as well as for each coding language and tool,
    which are periodically updated.
  • Typical defect-type frequency tables, based on past
    findings, that direct inspectors to potential defect
    concentration areas.
  • Periodic analysis of the effectiveness of past inspections
    to improve the inspection methodology.
  • Introduction of scheduled inspections into the project
    activity plan and allocation of the required resources,
    including resources for correction of detected defects.

119
Participants of peer reviews
  • A review leader
  • Main tasks and qualifications - page 161
  • The author
  • Invariably a participant in each type of peer review.
  • Specialized professionals
  • For inspections
  • A designer
  • A coder or implementer
  • A tester
  • For walkthroughs
  • A standards enforcer
  • A maintenance expert
  • A user representative.

120
Team assignments
  • The presenter
  • The Scribe

121
Preparations for a peer review session
  • The leader's preparation
  • The team's preparation

122
Session Documentation
  • Inspection session findings report
  • Prepared by the scribe
  • Inspection session summary report
  • Prepared by the leader
  • See appendices 8B and 8C

123
Post-Peer review activities
  • Post-inspection activities are conducted to
    attest to
  • The prompt, effective correction and reworking of
    all errors by the designer/author and his team,
    as performed by the inspection leader in the
    course of the assigned follow-up activities.
  • Transmission of the inspection reports to the
    internal Corrective Action Board ( CAB ) for
    analysis.
  • See Fig. 8.2 - a comparison of the peer review methods
    (page 166)

124
The efficiency of peer reviews
  • Some of the more common metrics applied to estimate the
    efficiency of peer reviews (a small illustrative
    calculation follows below)
  • Peer review detection efficiency (average hours worked per
    defect detected)
  • Peer review defect detection density (average number of
    defects detected per page of the design document)
  • Internal peer review effectiveness (defects detected by
    peer reviews as a percentage of the total defects detected
    by the developer).
125
Comparisons
  • See the tables on pages 167-169

126
Expert opinions ( external )
  • External expert opinions are useful in the following
    situations
  • Insufficient in-house professionals
  • A temporary lack of in-house professionals
  • Disagreements
  • In small organizations

127
Chapter 9: Software testing - strategies
  • Testing definition
  • Testing is the process of executing a program with the
    intention of finding errors.
  • IEEE definitions
  • The process of operating a system or component under
    specified conditions, observing or recording the results,
    and making an evaluation of some aspect of the system or
    component.
  • The process of analyzing a software item to detect the
    differences between existing and required conditions (that
    is, bugs) and to evaluate the features of the software
    item.

128
Software testing - Definition
  • Software testing is a formal process (with a SW test plan)
    carried out by a specialized, independent testing team, in
    which a software unit, several integrated software units,
    or an entire software package are examined by running the
    programs on a computer. All the associated tests are
    performed according to approved test procedures on
    approved test cases.

129
Software testing objectives
  • Direct objectives
  • To identify and reveal as many errors as possible
    in the tested SW.
  • To bring the tested SW, after correction of the
    identified errors and retesting, to an acceptable
    level of quality.
  • To perform the required tests efficiently, within
    budgetary and scheduling limitations.
  • Indirect objectives
  • To compile a record of SW errors for use in error
    prevention (by corrective and preventive actions)

130
Software testing Strategies
  • To test the SW in its entirety, once the completed package
    is available; this is otherwise known as "big bang"
    testing.
  • To test the SW piecemeal, in modules, as they are
    completed (unit tests); then to test groups of tested
    modules integrated with newly completed modules
    (integration tests). This process continues until the
    entire package is tested as a whole (system test). This
    testing strategy is usually termed incremental testing.

131
Incremental testing is also performed according to two
basic strategies
  • Bottom-up (4 stages) - see the figure on page 183
  • Top-down (6 stages)
  • The incremental paths
  • Horizontal sequence (breadth-first)
  • Vertical sequence (depth-first)
132
Stubs and drivers for incremental testing
  • Stubs and drivers are SW replacement simulators required
    for modules not yet available when performing a unit test
    or an integration test.
  • A stub (often termed a "dummy module") replaces an
    unavailable lower-level module, subordinate to the module
    being tested.
  • It is required for top-down testing of incomplete systems.
    See the example in Fig. 9.2

133
Stubs and drivers for incremental testing
  • A driver is a substitute for the upper-level module that
    activates the module being tested.
  • The driver passes the test data on to the tested module
    and accepts the results calculated by it.
  • It is required in bottom-up testing until the upper-level
    modules are developed.
  • See the example in Fig. 9.2 and the sketch below.
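A minimal sketch of the two ideas in code. The module under test, the stub and the driver below are hypothetical, just to show how a stub stands in for a missing lower-level module and a driver stands in for a missing upper-level module.

```python
# Hypothetical module under test: computes an order total by calling a
# lower-level pricing module that may not be available yet.
def compute_order_total(quantity, get_unit_price):
    unit_price = get_unit_price()   # call into the (possibly missing) lower-level module
    return quantity * unit_price

# Stub ("dummy module"): replaces the unavailable lower-level pricing module
# during top-down testing; it simply returns a fixed, known value.
def unit_price_stub():
    return 10.0

# Driver: substitutes for the not-yet-written upper-level module during
# bottom-up testing; it feeds test data to the tested module and checks the result.
def driver_test_compute_order_total():
    result = compute_order_total(quantity=3, get_unit_price=unit_price_stub)
    assert result == 30.0, f"unexpected total: {result}"
    print("compute_order_total passed:", result)

if __name__ == "__main__":
    driver_test_compute_order_total()
```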

134
Bottom-up vs. top-down strategies
  • Main advantage of bottom-up
  • The relative ease of its performance
  • Main disadvantage
  • The lateness at which the program as a whole can be
    observed (at the stage following testing of the last
    module)
  • Main advantage of top-down
  • The possibility it offers to demonstrate the entire
    program's functions shortly after the activation of the
    upper-level modules has been completed. This
    characteristic allows for early identification of analysis
    and design errors related to algorithms, functional
    requirements, and the like.
  • Main disadvantages
  • The relative difficulty of preparing the required stubs,
    which often require very complicated programming.
  • The relative difficulty of analyzing the results of the
    tests.

135
Big bang vs. incremental testing
  • The main disadvantages of big bang testing
  • Identification of errors becomes difficult
  • All corrections have to be made at the same time.
  • Estimation of the required error-correction resources,
    testing resources and testing schedule is a rather fuzzy
    endeavor.
  • Incremental testing advantages
  • Usually performed on relatively small SW modules, as unit
    or integration tests, which makes error detection more
    effective.
  • Identification and correction of errors is much simpler
    and requires fewer resources because it is performed on a
    limited volume of SW.

136
Software test classification - classification according to
testing concept
  • Two testing classes have been developed
  • Black box (functionality) testing
  • Identifies bugs only according to SW malfunctions as they
    are revealed in its erroneous outputs.
  • In cases where the outputs are found to be correct, black
    box testing disregards the internal paths of calculations
    and processing performed.
  • White box (structural) testing
  • Examines internal calculation paths in order to identify
    bugs.
  • The term "white" is meant to emphasize the contrast
    between the two methods.
137
Software test classification - classification according to
requirements
  • See table 9.1, page 188

138
White Box Testing
  • The white box testing concept requires verification of
    every program statement and comment.
  • As shown in table 9.2, white box testing enables the
    performance of
  • Data processing and calculation correctness tests
  • SW qualification tests
  • Maintainability tests
  • Reusability tests
  • Every computational operation in the sequence of
    operations created by each test case (path) must be
    examined.
  • This type of verification allows us to decide whether the
    processing operations and their sequences were programmed
    correctly for the path in question, but not for other
    paths.

139
Data processing and calculation correctness tests - path
coverage and line coverage
  • Path coverage - covering the total number of possible
    paths; for example, 10 consecutive if-then-else statements
    yield 2^10 = 1024 paths.
  • Line coverage - for full line coverage, every line of code
    must be executed at least once during the process of
    testing. The line coverage metric for the completeness of
    a line-testing plan is defined as the percentage of lines
    actually executed during the tests. A flow chart and a
    program flow graph are used.
  • See the example on page 191 and the sketch below.
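A small illustrative sketch of the difference between the two coverage notions; the function and its test inputs are hypothetical. Two test cases achieve full line coverage of this function, yet they exercise only 2 of its 4 possible paths.

```python
# Hypothetical function with two independent decisions -> 2 x 2 = 4 possible paths.
def shipping_cost(weight_kg, express):
    if weight_kg > 10:      # decision 1
        cost = 20.0
    else:
        cost = 8.0
    if express:             # decision 2
        cost += 15.0
    return cost

# Two test cases are enough for full line coverage (every line runs at least once)...
assert shipping_cost(12, express=True) == 35.0    # path: heavy + express
assert shipping_cost(5, express=False) == 8.0     # path: light + regular

# ...but only 2 of the 4 paths were exercised; full path coverage also needs:
assert shipping_cost(12, express=False) == 20.0   # path: heavy + regular
assert shipping_cost(5, express=True) == 23.0     # path: light + express
```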

140
McCabe's cyclomatic complexity metric
  • Measures the complexity of a program or module and, at the
    same time, determines the maximum number of independent
    paths needed to achieve full line coverage of the program.
  • The measure is based on graph theory, using the program
    flow graph.
  • An independent path is defined with reference to the
    succession of independent paths accumulated so far.
  • An independent path is any path on the program flow graph
    that includes at least one edge that is not included in
    any of the former independent paths.
  • See table 9.5 and the sketch below.
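A minimal sketch of how the metric can be computed from a program flow graph, using the standard formula V(G) = E - N + 2 for a single connected graph (E = edges, N = nodes). The flow graph below is hypothetical.

```python
# Cyclomatic complexity of a program flow graph: V(G) = E - N + 2
# (number of edges minus number of nodes plus 2, for one connected component).
def cyclomatic_complexity(edges):
    nodes = {n for edge in edges for n in edge}
    return len(edges) - len(nodes) + 2

# Hypothetical flow graph of a module with one if-then-else and one loop:
# nodes are numbered 1..6, edges follow the possible control-flow transfers.
flow_graph = [
    (1, 2),
    (2, 3), (2, 4),   # if-then-else branches out of node 2
    (3, 5), (4, 5),   # the branches rejoin at node 5
    (5, 2),           # loop back to node 2
    (5, 6),           # exit
]

print(cyclomatic_complexity(flow_graph))  # 7 edges - 6 nodes + 2 = 3 independent paths
```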