1
TQS - Teste e Qualidade de Software (Software Testing and Quality)
Test planning and documentation. Problem reporting and follow-up. Test measurements and test management.
João Pascoal Faria
jpf@fe.up.pt
www.fe.up.pt/jpf
2
Test planning and documentation
3
The IEEE Standard 829-1998 for Software Test
Documentation
[Figure: the document types defined by the standard.
This is the structure of version 829-1983; in
version 829-1998 there is a single test plan.]
(adapted from Ilene Burnstein, Practical Software
Testing)
4
The IEEE Standard 829-1998: Overview of document types (1/2)
  • Test planning
  • Test plan
  • prescribes the scope, approach, resources, and
    schedule of the testing activities
  • identifies the items to be tested, the features
    to be tested, the testing tasks to be performed,
    the personnel responsible for each task, and the
    risks associated with the plan
  • Test specification
  • Test design specification
  • refines the test approach and identifies the
    features to be covered by the design and its
    associated tests
  • identifies the test cases and test procedures, if
    any, required to accomplish the testing and
    specifies the feature pass/fail criteria.
  • Test case specification
  • documents the actual values used for input along
    with the anticipated outputs
  • also identifies constraints on the test
    procedures resulting from use of that specific
    test case
  • (Test cases are separated from test designs to
    allow for use in more than one design and to
    allow for reuse in other situations.)

5
The IEEE Standard 829-1998: Overview of document
types (2/2)
  • Test specification (cont.)
  • Test procedure specification
  • identifies all steps required to operate the
    system and exercise the specified test cases in
    order to implement the associated test design
  • (Test procedures are separated from test design
    specifications as they are intended to be
    followed step by step and should not have
    extraneous detail.)
  • Test reporting
  • Test item transmittal report
  • identifies the test items being transmitted for
    testing in the event that separate development
    and test groups are involved or in the event that
    a formal beginning of test execution is desired
  • Test log
  • used by the test team to record what occurred
    during test execution
  • Test incident report
  • describes any event that occurs during the test
    execution which requires further investigation
  • Test summary report
  • summarizes the testing activities associated with
    one or more test design specifications

6
Project plan, quality assurance plan and test plan
[Figure: relationship between the project plan, the
quality assurance plan and the test plan.
Note: for many authors, QA ⊇ V&V.]
(adapted from Ilene Burnstein, Practical
Software Testing)
7
The IEEE Standard 829-1998: Test plan contents
  • (Applicable to master test plan and each of the
    level based test plans (unit, integration, etc.))
  • 1. Test plan identifier
  • Can serve to identify it as a configuration item
  • 2. Introduction (why)
  • Overall description of the project, the software
    system being developed or maintained, and the
    software items and/or features to be tested
  • Overall description of testing goals (objectives)
    and the testing approaches to be used
  • References to related or supporting documents
  • 3. Test items (what)
  • List the items to be tested: procedures, classes,
    modules, libraries, components, subsystems,
    systems, etc.
  • Include references to documents where these items
    and their behaviors are described (requirements
    and design documents, user manuals, etc.)
  • List also items that will not be tested

8
The IEEE Standard 829-1998: Test plan contents
  • 4. Features to be tested (what)
  • Features are distinguishing characteristics
    (functionalities, quality attributes). They are
    closely related to the way we describe software
    in terms of its functional and quality
    requirements
  • Identify all software features and combinations
    of software features to be tested. Identify the
    test design specification associated with each
    feature and each combination of features.
  • 5. Features not to be tested (what)
  • Identify all features and significant
    combinations of features that will not be tested
    and the reasons.
  • 6. Approach (how)
  • Description of test activities, so that major
    testing tasks and task durations can be
    identified
  • For each feature or combination of features, the
    approach that will be taken to ensure that each
    is adequately tested
  • Tools and techniques
  • Expectations for test completeness (such as
    degree of code coverage for white box tests)
  • Testing constraints, such as time and budget
    limitations
  • Stop-test criteria

9
The IEEE Standard 829-1998: Test plan contents
  • 7. Item pass/fail criteria
  • Given a test item and a test case, the tester
    must have a set of criteria to decide whether the
    test has been passed or failed upon execution
  • The test plan should provide a general
    description of these criteria
  • Failures up to a certain severity level may be
    accepted
  • 8. Suspension criteria and resumption
    requirements
  • Specify the criteria used to suspend all or a
    portion of the testing activity on the test items
    associated with this plan
  • Specify the testing activities that must be
    repeated, when testing is resumed
  • Testing is done in cycles: test - fix - (resume)
    test - (suspend) - fix - ...
  • Tests may be suspended when a certain number of
    critical defects has been observed
  • 9. Test deliverables
  • Test documents (possibly a subset of the ones
    described in the IEEE standard)
  • Test harness (drivers, stubs, tools developed
    especially for this project, etc.)
  • 10. Testing tasks
  • Identify all test-related tasks, inter-task
    dependencies and special skills required
  • Work Breakdown Structure (WBS)

10
The IEEE Standard 829-1998: Test plan contents
  • 11. Environmental needs
  • Software and hardware needs for the testing
    effort
  • 12. Responsibilities
  • Roles and responsibilities to be fulfilled
  • Actual staff involved (?)
  • 13. Staffing and training needs
  • Description of staff and skills needed to carry
    out test-related responsibilities
  • 14. Scheduling
  • Task durations and calendar
  • Milestones
  • Schedules for use of staff and other resources
    (tools, laboratories, etc.)

11
The IEEE Standard 829-1998: Test plan contents
  • 15. Risks and contingencies
  • Risks should be (i) identified, (ii) evaluated in
    terms of their probability of occurrence, (iii)
    prioritized, and (iv) contingency plans should be
    developed that can be activated if the risk
    occurs
  • Example of a risk: some test items not delivered
    on time to the testers
  • Example of a contingency plan: flexibility in
    resource allocation so that testers and equipment
    can operate beyond normal working hours (to
    recover from delivery delays)
  • 16. Testing costs (not included in the IEEE
    standard)
  • Kinds of costs
  • costs of planning and designing the tests
  • costs of acquiring the hardware and software
    necessary
  • costs of executing the tests
  • costs of recording and analyzing test results
  • tear-down costs to restore the environment
  • Cost estimation may be based on (see the sketch
    below)
  • Models (such as COCOMO for project costs) and
    heuristics (such as 50% of project costs)
  • Test tasks and WBS
  • Developer/tester ratio (such as 1 tester to 2
    developers)
  • Test impact items (such as number of procedures)
    and test cost drivers (or factors, such as KLOC)
  • Expert judgment
  • 17. Approvals
  • Dates and signatures of those that must approve
    the test plan
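
For illustration, here is a minimal Python sketch of two of the estimation heuristics above. The 50% fraction and the 1-tester-to-2-developers ratio are the slide's example values; everything else (names, figures) is assumed.

```python
# A minimal sketch of two estimation heuristics; the default values
# mirror the slide's examples, the rest is assumed for illustration.
import math

def test_cost_from_heuristic(project_cost: float,
                             test_fraction: float = 0.5) -> float:
    """Heuristic: testing costs a fixed fraction of project cost
    (the slide's example: 50% of project costs)."""
    return project_cost * test_fraction

def testers_needed(num_developers: int,
                   testers_per_developer: float = 0.5) -> int:
    """Developer/tester ratio heuristic (e.g. 1 tester per 2 developers)."""
    return math.ceil(num_developers * testers_per_developer)

print(test_cost_from_heuristic(200_000))  # 100000.0
print(testers_needed(9))                  # 5
```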

12
The IEEE Standard 829-1998: Test design
specifications
  • One or more documents
  • A test design specification describes how a group
    of features and/or test items is tested by a set
    of test cases and test procedures
  • May include a (test case to) features/requirements
    traceability matrix (see the sketch below)
  • Contents
  • Identifier
  • Features to be tested
  • Test items and features covered by this document
  • Approach refinements
  • Test techniques
  • Test case identification
  • Feature pass/fail criteria
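
As an illustration of such a matrix, the sketch below encodes a hypothetical (test case to requirements) mapping in Python. All identifiers are invented; inverting the mapping reveals requirements that no test covers.

```python
# Hypothetical traceability matrix: test case -> requirements covered.
from collections import defaultdict

coverage = {
    "TC-001": ["REQ-01", "REQ-02"],
    "TC-002": ["REQ-02"],
    "TC-003": ["REQ-04"],
}

def tests_per_requirement(matrix):
    """Invert the matrix: which test cases cover each requirement."""
    inverted = defaultdict(list)
    for test_case, reqs in matrix.items():
        for req in reqs:
            inverted[req].append(test_case)
    return dict(inverted)

def uncovered(matrix, all_requirements):
    """Requirements with no covering test case."""
    return [r for r in all_requirements
            if r not in tests_per_requirement(matrix)]

print(uncovered(coverage, ["REQ-01", "REQ-02", "REQ-03", "REQ-04"]))
# -> ['REQ-03']
```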

13
The IEEE Standard 829-1998: Test case
specifications
  • Contents
  • Test case specification identifier
  • Test items
  • List of items and features to be tested by this
    test case
  • Input specifications
  • Output specifications
  • Environmental needs
  • Special procedural requirements
  • Intercase dependencies

14
The IEEE Standard 829-1998: Test procedure
specifications
  • Describe steps required for executing a set of
    test cases or, more generally, the steps used to
    analyze a software item in order to evaluate a
    set of features
  • Contents
  • Test procedure specification identifier
  • Purpose
  • Specific requirements
  • Procedure steps
  • Log, set up, proceed, measure, shut down,
    restart, stop, wrap up, contingencies

15
The IEEE Standard 829-1998: Test item transmittal
report
  • Accompanies a set of test items that are
    delivered for testing
  • Contents
  • Transmittal report identifier
  • Transmitted items
  • version/revision level
  • references to the items' documentation and the
    test plan related to the transmitted items
  • persons responsible for the items
  • Location
  • Status
  • deviations from documentation, from previous
    transmissions or from test plan
  • incident reports that are expected to be resolved
  • pending modifications to documentation
  • Approvals

16
The IEEE Standard 829-1998: Test log
  • Records detailed results of test execution
  • Contents
  • Test log identifier
  • Description
  • Identify the items being tested, including their
    version/revision levels
  • Identify the attributes of the environments in
    which the testing is conducted
  • Activity and event entries
  • Execution description
  • Procedure results
  • Environmental information
  • Anomalous events
  • Incident report identifiers
  • ...

17
The IEEE Standard 829-1998: Test incident report
  • Also called a problem report
  • Contents (see the data-structure sketch below)
  • Test incident report identifier
  • Summary
  • Summarize the incident
  • Identify the test items involved, indicating their
    version/revision level
  • References to the appropriate test procedure
    specification, test case specification, and test
    log
  • Incident description
  • inputs, expected results, actual results,
    anomalies, date and time, procedure step,
    environment, attempts to repeat, testers,
    observers
  • any information useful for reproducing and
    repairing
  • Impact
  • If known, indicate what impact this incident will
    have on test plans, test design specifications,
    test procedure specifications, or test case
    specifications
  • severity rating (?)
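
A sketch of these contents as a Python data structure may help; the field names paraphrase the standard's sections and are not normative.

```python
# A sketch of the IEEE 829 test incident report as a data structure;
# field names paraphrase the standard's sections, not quote them.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestIncidentReport:
    identifier: str              # test incident report identifier
    summary: str                 # one-line summary of the incident
    test_items: List[str]        # items involved + version/revision level
    references: List[str]        # procedure spec, case spec, test log
    inputs: str
    expected_results: str
    actual_results: str
    environment: str
    impact: Optional[str] = None    # effect on plans/specs, if known
    severity: Optional[int] = None  # rating scheme is project-specific
```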

18
The IEEE Standard 829-1998: Test summary report
  • Contents
  • Test summary report identifier
  • Summary
  • Summarize the evaluation of the test items
  • Identify the items tested, indicating the
    environment in which the testing activities took
    place
  • Variances
  • of the test items from their original design
    specifications
  • Comprehensiveness assessment
  • Evaluate the comprehensiveness of the testing
    process against the comprehensiveness criteria
    specified in the test plan, if the plan exists
  • Identify features or feature combinations that
    were not sufficiently tested and explain the
    reasons
  • Summary of results
  • Summarize the results of testing
  • Identify all resolved incidents and summarize
    their resolutions
  • Identify all unresolved incidents.
  • Evaluation
  • Provide an overall evaluation of each test item
    including its limitations
  • This evaluation shall be based upon the test
    results and the item level pass/fail criteria
  • An estimate of failure risk may be included
  • Summary of activities

19
Lessons by Cem Kaner et al.: Test documentation
  • Lesson 142: To apply a solution effectively, you
    need to understand the problem clearly
  • Lesson 143: Don't use test documentation
    templates. A template won't help unless you don't
    need it
  • Lesson 144: Use test documentation templates. They
    foster consistent communication
  • Lesson 145: Use the IEEE Standard 829 for test
    documentation
  • Lesson 146: Don't use the IEEE Standard 829
  • the IEEE standard provides templates (see Lesson
    143)
  • assumes a waterfall-like approach
  • no guidelines for tailoring according to the
    needs of a project
  • too much documentation
  • absence of cost awareness
  • difficult to reconcile with automated testing
  • ...
  • Lesson 147: Analyze your requirements before
    deciding what products to build; this applies as
    much to your documentation as to your software

20
Lessons by Cem Kaner et al.: Test documentation
  • Lesson 148: To analyze your test documentation
    requirements, ask questions like the ones in this
    list
  • What is your group's mission, and what are your
    objectives in testing your product?
  • Is your test documentation a product (to give
    someone else to use) or a tool (to use in-house)?
  • Is software quality driven by legal issues or by
    market forces?
  • How quickly is the design changing?
  • How quickly does the specification change to
    reflect design change?
  • When you test, do you hope to prove conformance
    to specs or nonconformance with customer
    expectations?
  • Does your testing style rely more on
    already-defined tests or on exploration?
  • Should test documentation focus on what to test
    (objectives) or on how to test for it
    (procedures)?
  • (...)
  • Lesson 149: Summarize your core documentation
    requirements in one sentence with no more than
    three components
  • Example: "The test documentation set will
    primarily support our efforts to find bugs in
    this version, to delegate work, and to track
    status"

21
Lessons by Cem Kaner et al.: Test planning
  • Lesson 274: Three basic questions to ask about
    test strategy are "why bother?", "who cares?",
    and "how much?"
  • Lesson 275: There are many possible test
    strategies
  • Lesson 276: The real test plan is the set of ideas
    that guides your test process
  • Lesson 277: Design your test plan to fit your
    context
  • Lesson 278: Use the test plan to express choices
    about strategy, logistics, and work products
  • Lesson 279: Don't let logistics and work products
    blind you to strategy
  • Lesson 280: How to lie with test cases
  • Lesson 281: Your test strategy is more than your
    tests
  • Lesson 282: Your test strategy explains your
    testing
  • Lesson 283: Apply diverse half-measures

22
Lessons by Cem Kaner et al.: Test planning
  • Lesson 284: Cultivate the raw materials of
    powerful test strategies
  • Lesson 285: Your first strategy on a project is
    always wrong
  • Lesson 286: At every phase of the project, ask
    yourself "what can I test now and how can I test
    it?"
  • Lesson 287: Test to the maturity of the product
  • Lesson 288: Use test levels to simplify
    discussions of test complexity
  • Lesson 289: Test the gray box
  • Lesson 290: Beware of ancestor worship when
    reusing test materials
  • Lesson 291: Two testers testing the same thing are
    probably not duplicating efforts
  • Lesson 292: Design your test strategy in response
    to project factors as well as product risks
  • Lesson 293: Treat test cycles as the heartbeat of
    the test process

23
Bug reporting and follow-up
24
Automated bug reporting/tracking
[Screenshot: the Mantis bug tracker]
(source: Ron Patton)
25
Automated bug reporting/tracking
Figure 6-1: Lifecycle of a Bugzilla bug
(Bugzilla, open source)
(source: http://www.bugzilla.org/docs/2.22/pdf/Bugzilla-Guide.pdf)
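
The lifecycle in that figure can be approximated as a state-transition table. The sketch below is a simplified rendering of the Bugzilla 2.x workflow (resolutions such as FIXED or DUPLICATE are omitted), not an exact copy of the figure.

```python
# Simplified Bugzilla 2.x bug lifecycle as a state-transition table.
TRANSITIONS = {
    "UNCONFIRMED": {"NEW", "ASSIGNED", "RESOLVED"},
    "NEW":         {"ASSIGNED", "RESOLVED"},
    "ASSIGNED":    {"NEW", "RESOLVED"},
    "RESOLVED":    {"REOPENED", "VERIFIED", "CLOSED"},
    "VERIFIED":    {"REOPENED", "CLOSED"},
    "CLOSED":      {"REOPENED"},
    "REOPENED":    {"ASSIGNED", "RESOLVED"},
}

def move(status: str, new_status: str) -> str:
    """Apply a lifecycle transition, rejecting moves the workflow forbids."""
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"illegal transition {status} -> {new_status}")
    return new_status

status = "NEW"
for step in ("ASSIGNED", "RESOLVED", "VERIFIED", "CLOSED"):
    status = move(status, step)   # a bug's happy path to closure
```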
26
Manual bug reporting
(source: Cem Kaner, Testing Computer Software)
27
Tracking test suites
(source: Ron Patton)
28
Lessons by Cem Kaner et al.: Bug advocacy
  • Lesson 55: You are what you write
  • Lesson 56: Your advocacy drives the repair of the
    bugs you report
  • Lesson 57: Make your bug report an effective sales
    tool
  • Lesson 58: Your bug report is your representative
  • Lesson 59: Take the time to make your bug reports
    valuable
  • Lesson 60: Any stakeholder should be able to
    report a bug
  • Lesson 61: Be careful about rewording other
    people's bug reports
  • Lesson 62: Report perceived quality gaps as bugs
  • Lesson 63: Some stakeholders cannot report
    bugs; you're their proxy
  • Lesson 64: Draw the affected stakeholder's
    attention to controversial bugs
  • Lesson 65: Never use the bug-tracking system to
    monitor programmers' performance
  • Lesson 66: Never use the bug-tracking system to
    monitor testers' performance
  • Lesson 67: Report defects promptly
  • Lesson 68: Never assume that an obvious bug has
    already been filed

29
Lessons by Cem Kaner et al.: Bug advocacy
  • Lesson 69: Report design errors
  • Lesson 70: Extreme-looking bugs are potential
    security flaws
  • Lesson 71: Uncorner your corner cases
  • Lesson 72: Minor bugs are worth reporting and
    fixing
  • Lesson 73: Keep clear the difference between
    severity and priority
  • Lesson 74: A failure is a symptom of an error, not
    the error itself
  • Lesson 75: Do follow-up testing on seemingly minor
    coding errors
  • Lesson 76: Always report nonreproducible errors;
    they may be time bombs
  • Lesson 77: Nonreproducible bugs are reproducible
  • Lesson 78: Be conscious of the processing cost of
    your bug reports
  • Lesson 79: Give special handling to bugs related
    to the tools or environment
  • Lesson 80: Ask before reporting bugs against
    prototypes or early private versions
  • Lesson 81: Duplicate bug reports are a
    self-correcting problem
  • Lesson 82: Every bug deserves its own report

30
Lessons by Cem Kaner et al.: Bug advocacy
  • Lesson 83: The summary line is the most important
    line in the bug report
  • Lesson 84: Never exaggerate your bugs
  • Lesson 85: Report the problem clearly, but don't
    try to solve it
  • Lesson 86: Be careful of your tone. Every person
    you criticize will see the report
  • Lesson 87: Make your reports readable, even to
    people who are exhausted and cranky
  • Lesson 88: Improve your reporting skills
  • Lesson 89: Use market or support data when
    appropriate
  • Lesson 90: Review each other's bug reports
  • Lesson 91: Meet the programmers who will read your
    reports
  • Lesson 92: The best approach may be to demonstrate
    your bugs to the programmers
  • Lesson 93: When the programmer says it's fixed,
    make sure it isn't still broken
  • Lesson 94: Verify bug fixes promptly
  • Lesson 95: When fixes fail, talk with the
    programmer

31
Lessons by Cem Kaner et al.: Bug advocacy
  • Lesson 96: Bug reports should be closed by testers
  • Lesson 97: Don't insist that every bug be fixed.
    Pick your battles
  • Lesson 98: Don't let deferred bugs disappear
  • Lesson 99: Testing inertia should never be the
    cause of bug deferral
  • Lesson 100: Appeal bug deferrals immediately
  • Lesson 101: When you decide to fight, decide to
    win!

32
Test measurements and test management
33
Test measurements
  • Measurements for monitoring test costs
  • Number of test cases developed
  • Size of the test harness (in KLOC)
  • Time spent in testing
  • Cost of testing
  • Measurements for monitoring test adequacy
  • Percentage of source code coverage
  • Percentage of requirements coverage
  • Measurements for monitoring product quality
  • Number of pre-release defects found (by testing)
  • Number of post-release defects found (by users)
  • Overall number of defects found
  • Measurements for monitoring test effectiveness
    (see the sketch below)
  • Number of defects found by testing / KLOC (of the
    software under test)
  • Number of defects found by testing / Number of
    test cases
  • Number of defects found by testing / Number of
    overall defects found (by testing and by users)
  • Evolution of the previous measurements over time
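
To make the effectiveness measurements concrete, here is a small Python sketch; all input figures are invented for illustration.

```python
# Computing the measurements listed above; figures are invented.
defects_by_testing = 120   # pre-release defects found by testing
defects_by_users   = 30    # post-release defects found by users
kloc               = 40.0  # size of the software under test, in KLOC
test_cases         = 800   # number of test cases executed

density        = defects_by_testing / kloc          # defects/KLOC
yield_per_test = defects_by_testing / test_cases    # defects/test case
# share of all known defects that testing caught before release
effectiveness  = defects_by_testing / (defects_by_testing + defects_by_users)

print(f"{density:.1f} defects/KLOC")         # 3.0
print(f"{yield_per_test:.2f} defects/test")  # 0.15
print(f"{effectiveness:.0%} effectiveness")  # 80%
```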

34
Test team organization
(source: Filipe Carlos)
35
Test team organization
(source: Filipe Carlos)
36
Stop-test criteria (1/2)
  • The project runs out of time and resources
  • Not a good criterion
  • The detection of a specific number of defects has
    been accomplished
  • Not a good criterion (why?)
  • All the planned tests that were developed have
    been executed and passed
  • All specified coverage goals have been met
  • Example: 100% branch coverage in unit testing and
    all the requirements covered by system tests
  • The rates of defect detection for a certain time
    period have fallen below a specified level
  • Example: we stop testing when we find 5 or fewer
    defects per week with impact at or below severity
    level 3 (see the sketch below)
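
A minimal sketch of that rate-based criterion, assuming a severity scale where level 1 is the most severe (so "at or below severity level 3" means a numeric rating of 3 or higher); adapt to your own scale.

```python
# Rate-based stop-test check; assumes severity 1 = most severe, so
# "impact at or below severity level 3" means ratings >= 3.
def may_stop_testing(severities_this_week, minor_level=3, limit=5):
    """True when the week produced at most `limit` defects, all of
    them with impact at or below the `minor_level` threshold."""
    all_minor = all(s >= minor_level for s in severities_this_week)
    return all_minor and len(severities_this_week) <= limit

print(may_stop_testing([3, 4, 4, 5]))  # True: only 4 minor defects
print(may_stop_testing([1, 4, 5]))     # False: a severe defect appeared
```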

37
Stop-test criteria (2/2)
  • Fault seeding ratios are favorable
  • Fault seeding: defect injection based on
    historical defect data
  • Example: at least 90% of the defects injected
    have been detected by the test suite (see the
    estimator sketch below)
  • A specified reliability level has been met
  • Example: the mean time to failure (MTTF) is
    longer than 24h
  • Estimated with statistical testing based on a
    usage/operational profile
  • Requires a high-maturity test organization
  • A specified fault-free confidence level has been
    met
  • The software is fault-free, or has fewer than a
    certain number of defects per KLOC, with a
    certain confidence level (example: 95%)
  • Estimated with fault seeding techniques
  • Requires a high-maturity test organization
  • Use a combination of criteria, depending on each
    project!
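
The arithmetic behind the fault-seeding criterion can be sketched with Mills' classic seeding estimator; the counts below are invented.

```python
# Mills' fault-seeding estimator: assume real defects are detected at
# the same rate as the seeded (injected) ones. Counts are invented.
def seeding_ratio(seeded_total: int, seeded_found: int) -> float:
    """Fraction of injected defects the test suite detected."""
    return seeded_found / seeded_total

def estimated_remaining(seeded_total, seeded_found, real_found):
    """Estimated latent defects: total_real ~= real_found *
    seeded_total / seeded_found, minus those already found."""
    return real_found * seeded_total / seeded_found - real_found

print(seeding_ratio(20, 18))            # 0.9 -> meets the 90% example
print(estimated_remaining(20, 18, 45))  # 5.0 latent defects estimated
```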

38
Balance for optimal stop-test decision
  • Stop testing too late
  • Wasteful of resources
  • Delay time to market
  • Increased costs
  • Delayed schedules
  • Stop testing too early
  • Defects remain and cause loss or damage to life
    and property
  • Customer dissatisfaction
  • High costs to repair
  • Costs of hot line calls

39
Release decision
  • Usually, it is not cost-effective to fix all bugs
    found before releasing the product
  • Some bugs are very difficult to fix
  • Difficult to locate (example: random bugs)
  • Difficult to correct (example: third-party
    components)
  • Reduce overall risk (see the selection sketch
    below)
  • Fix all high-severity bugs
  • Fix (the easiest) 95% of the medium-severity bugs
  • Fix (the easiest) 70% of the low-severity bugs
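
A sketch of that risk-reduction policy in Python; the effort scores and the notion of "easiest" are assumptions about what the bug tracker records.

```python
# Selecting bugs to fix before release: all high-severity bugs, then
# the easiest 95% / 70% of the medium-/low-severity ones. The
# 'effort' field is an assumed attribute from the bug tracker.
def select_fixes(bugs):
    quota = {"high": 1.0, "medium": 0.95, "low": 0.70}
    to_fix = []
    for level, share in quota.items():
        group = sorted((b for b in bugs if b["severity"] == level),
                       key=lambda b: b["effort"])   # easiest first
        to_fix += group[:round(len(group) * share)]
    return to_fix

bugs = [
    {"id": 1, "severity": "high",   "effort": 8},
    {"id": 2, "severity": "medium", "effort": 2},
    {"id": 3, "severity": "low",    "effort": 9},
    {"id": 4, "severity": "low",    "effort": 1},
    {"id": 5, "severity": "low",    "effort": 5},
]
print(sorted(b["id"] for b in select_fixes(bugs)))  # [1, 2, 4, 5]
```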

40
Release decision
Anomaly classification
(source: Universidade do Minho, 7 June 2003)
41
Release decision
The product should not be made available to
customers before this point, because the response
capacity is insufficient
(source: Filipe Carlos, Novabase)
42
Tool support
[Diagram: integrated tool support, linking Test
Management, Bug Tracking, Configuration Management,
Project Management and Requirements Management.
A test case verifies a requirement (or feature) and
detects bugs; a bug is detected in a version,
affects requirements, and is fixed in a version
(or patch) that Configuration Management deploys;
tasks develop and execute test cases, fix bugs,
and implement requirements.]
What is needed is an Integrated Software
Engineering Environment (ISEE)
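
As a sketch of the traceability links the diagram implies, the data classes below mirror the diagram's entities and associations; the names are illustrative, not tied to any particular tool.

```python
# Entities and traceability links from the diagram; illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Requirement:          # requirement (or feature)
    rid: str

@dataclass
class Version:              # version (or patch)
    vid: str

@dataclass
class Bug:
    bid: str
    detected_in: Version
    fixed_in: Optional[Version] = None
    affects: List[Requirement] = field(default_factory=list)

@dataclass
class TestCase:
    tid: str
    verifies: List[Requirement] = field(default_factory=list)
    detects: List[Bug] = field(default_factory=list)

# An ISEE would keep these links navigable across all the tools above.
```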
43
References and further reading
  • Practical Software Testing, Ilene Burnstein,
    Springer-Verlag, 2003
  • From academics
  • Chapter 7 (Test Goals, Policies, Plans, and
    Documentation) describes the IEEE standard
    829-1983 for Software Test Documentation
  • Appendix II (Sample Test Plan) provides a
    complete example of a test plan documented
    according to the standard
  • IEEE Standard For Software Test Documentation -
    IEEE Std 829-1998
  • Includes examples
  • Lessons Learned in Software Testing: A
    Context-Driven Approach, Cem Kaner, James Bach,
    Bret Pettichord, John Wiley & Sons, 2002
  • Practical advice
  • Chapters 4 (Bug Advocacy), 6 (Documenting
    Testing) and 11 (Planning the Testing Strategy)
  • Software Testing, Ron Patton, SAMS, 2001
  • From practitioners, with lots of practical
    information, ready to use
  • Part V (Working with Test Documentation):
    Chapters 16 (Planning Your Test Effort), 17
    (Writing and Tracking Test Cases), 18 (Reporting
    What You Find) and 19 (Measuring Your Success)
  • Testing Computer Software, 2nd Edition, Cem
    Kaner, Jack Falk, Hung Q. Nguyen, John Wiley &
    Sons, 1999
  • Chapters 5 (Reporting and analyzing bugs) and 12
    (Test planning and test documentation)