1
Inspections and Reviews
2
Peer Reviews
  • Peer reviews remove defects from the software
    product early.
  • They involve a methodical examination of the
    software product by the producers' peers to
    identify defects and areas where changes are
    needed.
  • The specific products that will undergo a peer
    review are identified in the project's software
    process and scheduled as part of the software
    project planning activities.

3
  • CMM defines the goals of peer reviews to be
  • (1) planning review activities, and
  • (2) identifying and removing software defects
  • Resources and funding must be provided for
    performing peer reviews on each software product
    to be reviewed.
  • Reviewers receive training in the objectives,
    principles, and methods of peer reviews.
  • Peer reviews are planned, and they are performed
    according to a documented procedure.

4
  • Data on the conduct and results are recorded.
  • Measurements are made and used to determine the
    status of the peer review activities.
  • Peer reviews are often costly and time-consuming,
    and as a result are sometimes considered by the
    project not to be worth the effort.
  • They are slow to evolve with a company's software
    process and thus are reactive instead of
    proactive.

5
Types of Peer Reviews
  • There are several types of peer reviews in use.
  • Some involve significant investment from the
    project teams; others are simple inspections of
    code.
  • The key to obtaining payback on a peer review is
    to select the appropriate technique, depending on
    the magnitude of the project.
  • The most common peer review technique is the
    traditional "code walk-through," where the
    programmer walks into a room with his/her peers
    and defends his/her process and data flow
    creation.
  • This technique, though better than none, has not
    yielded ideal results for many reasons.

6
  • The formality of such meetings is costly and
    generally involves players who need not attend,
    while omitting players who should.
  • An adequate peer review should include
    technologists with relevant expertise from
    outside the organization, and exclude those who
    have no value to add or gain.
  • Below are five of the major peer review
    techniques in use.

7
  • The five methods proposed are
  • inspections,
  • structured walkthroughs,
  • hybrid inspection/walkthroughs,
  • desk-checking, and
  • round-robin reviews.
  • The amount of effort (cost) to implement each one
    varies significantly, as it should, since the
    effort (cost) of each software development
    project also varies significantly.

8
Inspections
  • A rigorous static analysis technique that
    requires training of the inspection team,
    well-defined roles that include a facilitator and
    scribe, and the complete measurement of defects
    encountered.
  • Advantages/disadvantages
  • Highest measured efficiency of any known form of
    defect removal, the only technique to achieve
    efficiencies of 75 percent or more under field
    conditions, and it may be tailored to the
    project.
  • It is also the most costly and time-consuming.

9
Structured walkthroughs
  • A static analysis technique
  • A designer or programmer leads members of the
    development team and other interested parties
    through documentation or code.
  • Participants ask questions and make comments
    about possible problems.
  • Defects should be identified and logged during
    the review with action items assigned.
  • Upon conclusion of the review, a determination of
    how to proceed is made.

10
  • Advantages/disadvantages
  • The second most efficient method for removing
    defects, it may be tailored to the project
  • Not as costly as full inspection and usually
    requires a moderate-sized group of reviewers

11
Hybrid inspection/walkthrough
  • A modified approach in which a group of
    participants, consisting of author, moderator,
    reviewers (minimum of two) and scribe, perform
    the review.
  • Combinations of roles are allowable, e.g.,
    moderator/scribe and/or author/reviewer.
  • Reviews are structured so the anticipated benefit
    exceeds the minimum necessary support required of
    the individuals who are asked to participate.
  • Software review metrics are collected and
    monitored so as to determine the effectiveness of
    the reviews.

12
Desk-checking
  • Hybrid inspection/walkthrough
    advantages/disadvantages
  • Requires a smaller group of reviewers, the roles
    may be combined, it concentrates on finding
    defects
  • Could lose effectiveness if too much analysis is
    done on defect evaluation.
  • Desk-checking is a private review and debugging
    carried out by individual programmers and
    analysts not involved in the software product.
  • Defects are identified and logged during the
    review.
  • Defect resolution, status tracking and
    communication take place after the review.

13
Round-robin review
  • Desk-checking advantages/disadvantages
  • The least expensive method, it is easy to
    schedule and complete
  • It is typically the least effective review
    method, and its effectiveness depends largely on
    the reviewer.
  • Round-robin review is a process of desk-checking
    by multiple peers in a sequential manner.
  • A checker reviews, identifies and logs defects,
    then passes the folder to the next reviewer, who
    reviews it and logs any additional defects.
  • This continues until all the reviewers have
    participated and the folder is returned to the
    author.
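
A minimal sketch of this round-robin flow, in
Python, with invented reviewer names and defect IDs:
each checker contributes a set of defect
identifiers, and the folder accumulates the union of
all findings before returning to the author.

    # Hypothetical round-robin review: each reviewer in turn logs the
    # defects they find that are not already in the folder, then passes
    # the folder on. Names and defect IDs are invented for illustration.
    def round_robin_review(reviewers):
        defect_log = set()
        for name, findings in reviewers:
            new = findings - defect_log      # only previously unlogged defects
            defect_log |= new
            print(f"{name} logged {len(new)} new defect(s); "
                  f"total = {len(defect_log)}")
        return defect_log                    # folder returned to the author

    reviewers = [
        ("Ann",  {"D1", "D2", "D5"}),
        ("Ben",  {"D2", "D3"}),              # D2 was already logged by Ann
        ("Cara", {"D4", "D5", "D6"}),
    ]
    final_log = round_robin_review(reviewers)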

14
  • Advantages/disadvantages
  • More efficient than simple desk-checking,
    multiple reviewers are involved, roles can be
    assigned, typically a lower cost than other
    review techniques
  • Not as efficient as inspections.

15
Benchmarking Peer Review Proficiency
  • Figure 1 shows defect counts from 10 different
    peer reviews.
  • The data come from a study in which 10 peer
    review teams (4 software engineers per team)
    evaluated the same software requirements
    document.
  • The peer review results in Figure 1 have no
    variables except defect detection proficiency.

16
(Slide image: Figure 1, defect counts from the 10
peer reviews)
17
  • There are large variations in the number of
    defects detected by each peer review.
  • The chart does not, however, display their
    proficiency.
  • Peer review proficiency is the percentage of the
    original defects that were detected by the
    review.
  • In this case, 92 unique defects were found by all
    participants through the design phase.
  • So, there were at least 92 defects.
  • Assuming a total of 100 defects existed, peer
    review proficiency will look like Figure 2.
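
The proficiency computation itself is
straightforward. A minimal Python sketch, using
invented per-team defect counts in place of the
Figure 1 data and the assumed total of 100 defects:

    # Peer review proficiency = defects detected / total defects x 100.
    # The team counts below are invented placeholders, not the Figure 1
    # data; the total of 100 defects is the assumption stated above.
    TOTAL_DEFECTS = 100   # at least 92 unique defects were actually found

    team_defects = {"Team 1": 25, "Team 2": 41, "Team 3": 12}

    for team, found in team_defects.items():
        proficiency = 100.0 * found / TOTAL_DEFECTS
        print(f"{team}: {found} defects found -> "
              f"{proficiency:.0f} percent proficient")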

18
(Slide image: Figure 2, peer review proficiency
assuming 100 total defects)
19
what's acceptable?
  • What is an acceptable proficiency?
  • A (proficient) requirements review will find
    about 40 percent of the pre-existing requirements
    defects.
  • Semi-proficient reviewers may omit verification
    of applicable criteria.
  • Common causes for poor proficiency
  • 1) non-proficient reviewers,
  • 2) insufficient review time,
  • 3) vague requirements,
  • 4) inadequate training, and
  • 5) lack of management commitment to quality and
    process improvement.

20
Finding Defects
  • Figure 3 depicts a number of defects in a
    document representing requirements, design, code,
    test, etc.

21
  • Defects, represented by an 'x', are shown to have
    attributes of
  • 1) difficulty to find and
  • 2) association with some capability.
  • A major risk in relying on peer reviews is that
    they may be superficial, detecting only
    easy-to-find defects.
  • Defects found in the sample are more
    representative of all defects than those found
    during a superficial review.
  • Stratified defects are defects that show up in
    all samples, indicating a systemic process
    problem.

22
  • For example, suppose data dictionary entries are
    generally incomplete, omitting some required
    characteristic of the data.
  • These are stratified defects that will not be
    detected by most superficial reviews because they
    are more difficult to find.
  • Since sample reviews evaluate software an inch
    wide and a yard deep, as opposed to a yard wide
    and an inch deep, they are more likely to
    identify incomplete data definitions.

23
Errors of Chance
  • The proficiency metric, based on sample findings,
    is inaccurate when only a few results are
    recorded, but it gets better when additional
    results are added.
  • Sources of metric error due to chance include
  • 1) the document under review may not be typical
    of all documents,
  • 2) the control evaluator's sample may not be
    typical of other parts of the same document, and
  • 3) the peer review measured may not be typical of
    all peer reviews.

24
(Slide image not transcribed)
25
The Keno Test
  • Keno is a game of chance.
  • The Keno operator draws 20 numbers at random from
    the 80 numbers available.
  • Players bet on which numbers will be drawn.
  • If you think of the 80 numbers as defects in a
    software document, and the Keno operator as the
    peer review, the Keno operator is always exactly
    25 percent proficient.
  • Since the proficiency of the Keno operator is
    known, we can test our ability to predict that
    proficiency when only some of the defects are
    found.

26
  • Figure 5 shows the daily 'hits' (one game per
    day) in the first 20 Keno numbers.
  • In this instance, the first 20 numbers represent
    the proficiency (25 percent) rather than a sample
    of the document.
  • All of the numbers have an equal chance of being
    selected and they are equally important, so we
    can use any 20 numbers to represent the findings.
  • This test simulates a control evaluator, who is
    25 percent proficient, assessing a peer review
    (the Keno operator), which is also 25 percent
    proficient.

27
  • The result is an example of how control
    evaluators can predict peer review proficiency
    when they find only some of the defects.
  • As can be seen, the daily accuracy is not
    particularly good.
  • It was accurate only twice in 12 evaluations.
  • But the estimate of their proficiency obtained by
    combining the results of several evaluations is
    much better, as the simulation sketched below
    illustrates.
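
The Keno test is easy to reproduce in simulation.
The following Python sketch draws 20 of the 80
numbers each "day" as the peer review's findings,
scores hits against a fixed 20-number sample (the
25-percent-proficient evaluator's view), and
compares the noisy daily estimates with the pooled
estimate; the 12 evaluations match the text,
everything else is illustrative.

    import random

    # Monte Carlo version of the Keno test: the operator (a peer review
    # that is exactly 25% proficient) draws 20 of 80 numbers each day;
    # a control evaluator holding a fixed 20-number sample estimates
    # the operator's proficiency from the overlap ("hits").
    NUMBERS = range(1, 81)
    SAMPLE = set(range(1, 21))    # the evaluator's fixed 20-number sample
    DAYS = 12                     # matches the 12 evaluations in the text

    total_hits = 0
    for day in range(1, DAYS + 1):
        draw = set(random.sample(NUMBERS, 20))   # the day's "findings"
        hits = len(draw & SAMPLE)
        total_hits += hits
        print(f"Day {day:2d}: {hits} hits -> "
              f"daily estimate {100.0 * hits / len(SAMPLE):.0f}%")

    pooled = 100.0 * total_hits / (DAYS * len(SAMPLE))
    print(f"Pooled estimate over {DAYS} days: {pooled:.1f}% (true: 25%)")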

28
(Slide image: Figure 5, daily hits in the first 20
Keno numbers)
29
Fundamentals of Software Inspections
  • What is the Inspection Method?
  • A technique for finding and removing defects in
    specifications, software, documentation, and
    other deliverables as early as possible.
  • Inspection applies the concepts of statistical
    process control to produce high-quality
    deliverables at minimum cost.
  • Inspections can be used on ANY written document
    -- specifications, source code, contracts, test
    plans, test cases, etc.

30
Method
  • Inspection was developed by Michael Fagan of IBM
    and has undergone continuous quality improvement.
  • Inspections have been used at IBM, Bell-Northern
    Research, Tandem, and many other corporations to
    find defects faster, and hence at lower cost.

31
How Do Inspections Differ From Reviews?
  • Statistical quality control on the document
  • Inspections track a number of metrics designed to
    improve the inspection process itself!
  • Emphasis on earliest possible defect detection
  • Emphasis on immediate and controlled correction
  • Corrections are also tracked to ensure they are
    made
  • Trained and certified inspection leaders
    ("moderators")
  • Moderator leadership ("Chief Moderator" concept)
  • Specialized "roles" to increase defect total find
    rate for team
  • Specialized checklists for each document type
  • Formal "entry" criteria for inspections start-up
  • Formal "exit" criteria for inspection completion

32
  • Measurable criteria for repeating bad inspections
  • Pareto analysis identifying error-prone
    components
  • Specific devices/techniques for avoiding
    individual "blame"
  • Author does not lead, read, explain, or defend
    docs.
  • Only the author has control over correction
  • Team size computed based upon efficiency needs
  • All important documents are subject to inspection
  • Peer inspection - everyone learns new things on
    the job
  • Maximum meeting 2 hours (tiredness factor)

33
What is an Inspector?
  • An inspector is a person looking for defects in
    documents.
  • Inspections involve other people.
  • An "author" creates the document that is
    inspected.
  • A "moderator" recruits a team of inspectors and
    organizes inspection activities.
  • A "scribe" records the defects found by
    inspectors.
  • A person may have one or more roles during
    inspections.
  • Contrast this to "reviews" and other previous
    methodologies, which do not necessarily assign
    roles in any formal fashion.

34
What Happens During an Inspection?
  • Inspection activities follow this cycle
  • An author gives a document to a moderator asking
    for inspection.
  • The moderator recruits a team of inspectors and
    gives them the document and all other documents
    necessary for the inspection.
  • Inspectors prepare by reading and noting all the
    defects found.
  • Each inspector prepares IN PRIVATE for at least
    the amount of time recommended by the moderator.
  • After preparation, the inspection team conducts a
    "defect logging meeting."

35
  • The moderator is responsible for arranging the
    defect logging meeting.
  • After logging, the author takes the defect log
    and "reworks" their document by fixing all the
    logged defects.
  • The moderator tallies up the statistics from the
    meeting, and gives them to Quality Assurance for
    tracking and analysis.
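
As one hypothetical way to make the cycle concrete,
the Python sketch below encodes the stages named on
these two slides as an ordered sequence that may not
be skipped; the class is illustrative, not part of
any published inspection standard.

    from enum import IntEnum

    # The inspection cycle described above, as ordered stages.
    class Stage(IntEnum):
        REQUEST = 1         # author hands the document to the moderator
        TEAM_SELECTION = 2  # moderator recruits inspectors, distributes docs
        PREPARATION = 3     # each inspector prepares in private
        DEFECT_LOGGING = 4  # the team holds the defect logging meeting
        REWORK = 5          # author fixes all logged defects
        REPORTING = 6       # moderator sends statistics to Quality Assurance

    class Inspection:
        def __init__(self):
            self.stage = Stage.REQUEST

        def advance(self):
            if self.stage is Stage.REPORTING:
                raise RuntimeError("inspection already complete")
            self.stage = Stage(self.stage + 1)   # no stage may be skipped

    insp = Inspection()
    while insp.stage is not Stage.REPORTING:
        insp.advance()
    print("Final stage:", insp.stage.name)       # -> Final stage: REPORTING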

36
What is a Defect?
  • A defect is a violation of a standard, or an
    inconsistency with a high-level document.
  • A standard tells the author how to produce a
    certain kind of document.
  • For example, the authors of source code use a "C
    Coding Standard" and other related standards.
  • A defect occurs when a low-level document fails
    to comply with a standard that is expected to be
    used.
  • A high level document is a document closely
    related to the low level document being
    inspected.
  • For example, a source file might have a design
    specification and a user manual as its high-level
    documents.

37
  • A defect occurs when a low level document does
    not follow correctly from its high levels.
  • For example, when a function (or proc) does not
    comply with its design spec or man page.
  • The moderator is responsible for getting you a
    copy of all relevant standards and high levels.
  • Inspections seek to move closer to what Professor
    Gerald Weinberg called "egoless programming."
  • The inspection methodology enforces good software
    engineering practices.

38
Preparation For a Defect Logging Meeting
  • Find as many defects as possible, especially
    "unique", meaningful defects.
  • The moderator may assign a special "role" to try
    and find the unique defects that can be seen from
    that special perspective.
  • For example, moderator J might assign inspector B
    the role of "End User Customer" and suggest that
    B spend at least 2 hours inspecting the low-level
    document.
  • Defects are recorded in any way the inspector
    wishes.
  • Estimate and record each defect's "severity
    level."

39
  • Record any questions that you would like to ask
    the author.
  • Issues are also recorded for the author.
  • At the Defect Logging Meeting, each inspector
    provides the following information
  • preparation time spent,
  • number of pages inspected,
  • number of defects found at each severity level,
    and
  • number of questions.
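
A minimal Python record of the per-inspector data
listed above might look like the sketch below; the
field names and severity labels are invented for
illustration.

    from dataclasses import dataclass, field

    # Hypothetical record of what each inspector reports at the defect
    # logging meeting; field names and severity labels are invented.
    @dataclass
    class PreparationReport:
        inspector: str
        prep_minutes: int          # preparation time spent
        pages_inspected: int
        defects_by_severity: dict = field(default_factory=dict)
        questions: int = 0

        def prep_rate(self):
            """Pages inspected per hour of preparation."""
            return 60.0 * self.pages_inspected / self.prep_minutes

    report = PreparationReport("B", prep_minutes=120, pages_inspected=18,
                               defects_by_severity={"major": 2, "minor": 7},
                               questions=3)
    print(f"{report.inspector}: {report.prep_rate():.1f} pages/hour, "
          f"{sum(report.defects_by_severity.values())} defects, "
          f"{report.questions} questions")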

40
Defect Logging
  • The moderator asks each inspector to log their
    defects.
  • For each defect, the scribe records the location,
    severity, number of occurrences, and a brief
    description.
  • Each inspector poses their questions to the
    author.
  • Depending upon the answer, a defect may be
    logged, or not.
  • The GOAL of the defect logging meeting is to log
    as many defects as possible PER MINUTE.
  • The meeting rules are next

41
  • Inspectors describe each defect in about 7 words
    or less.
  • Use telegram style, emphasizing key words.
  • Inspectors are NOT allowed to discuss how to fix
    the defect!
  • The author has sole responsibility to decide how
    to fix the defect.
  • This avoids a MAJOR PROBLEM with reviews
  • too much "fix on the fly,
  • too much discussion degenerating into chaos and
    long meetings,
  • too much defensiveness.
  • There is NO discussion about whether or not a
    defect really exists.
  • The author is NOT allowed to explain, describe,
    or defend, except in response to a question.