ITIS 3310 Software Architecture and Design Chapter 13 Software Testing Strategies
1
ITIS 3310 Software Architecture and Design
Chapter 13Software Testing Strategies
2
Software Testing
Testing is the process of exercising a program
with the specific intent of finding errors prior
to delivery to the end user.
3
Software Testing
  • Any non-trivial program has errors
  • Logical errors
  • Typos
  • Misunderstandings in implementation
  • Etc.
  • Testing exposes these errors
  • Testing does not prove the program is correct

4
What Testing Shows
  • errors
  • requirements conformance
  • performance
  • an indication of quality
5
Who Tests the Software?
  • Developer
  • Understands the system
  • but will test "gently"
  • and is driven by "delivery"
  • Independent tester
  • Must learn about the system
  • but will attempt to break it
  • and is driven by quality
6
Testing Strategy
  • We begin by testing-in-the-small and move
    toward testing-in-the-large

7
Testing Strategy
  • Unit test (Code): Do the individual components work?
  • Integration test (Design): Do the components work together?
  • Validation test (Requirements Engineering): Does the software meet the requirements?
  • System test: Does the system as a whole work?
8
Completion Testing
  • How do we know we're done?
  • When we run out of money and/or time
  • Testing is never done
  • Every time the customer runs the application, it's
    being tested
  • Neither are particularly satisfying answers
  • Need better testing metrics and guidelines

9
Strategic Issues
  • State requirements quantifiably from the
    beginning
  • A minimal number of errors is important
  • So are
  • Portability
  • Maintainability
  • Usability
  • State testing objectives explicitly (measurably)
  • Understand the users of the software and develop
    a profile for each user category

10
Strategic Issues
  • Develop a testing plan that emphasizes rapid
    cycle testing
  • Small increments useful to customer
  • Build robust software that is designed to test
    itself
  • Software itself can detect errors
  • Use effective formal technical reviews as a
    filter prior to testing
  • Uncover errors during development
  • Before testing even begins

11
Strategic Issues
  • Conduct formal technical reviews to assess the
    test strategy and test cases themselves
  • Look for
  • Inconsistencies
  • Omissions
  • Errors
  • Develop a continuous improvement approach for the
    testing process
  • Measure the testing process
  • Use a statistical approach to testing and quality
    assurance

12
Unit Testing
The software engineer designs test cases, applies
them to the module to be tested, and examines the
results.
13
Unit Testing
Unit testing exercises the module's
  • interface
  • local data structures
  • boundary conditions
  • independent paths
  • error-handling paths
14
Unit Test Environment
What is the impact of cohesion on testing this
module?
  • When only one function is performed
  • Number of test cases is reduced
  • Errors can be predicted better
  • Errors can be uncovered better
A driver feeds test cases to the module; stubs
stand in for the modules it calls. The module's
interface, local data structures, boundary
conditions, independent paths, and error-handling
paths are exercised and the results collected.
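The driver/stub arrangement on this slide can be sketched in Python. This is a minimal illustration, not a prescribed implementation: `order_total` is a hypothetical module under test, the lambda stubs out a collaborator it would normally call, and the `driver` function supplies test cases and checks the results.

```python
# Driver/stub sketch for the unit-test environment above.
# order_total is a hypothetical module under test; the lambda
# stubs out its collaborator (a tax-rate lookup service).

def order_total(subtotal, tax_lookup):
    """Hypothetical module under test."""
    if subtotal < 0:
        raise ValueError("subtotal must be non-negative")  # error-handling path
    return round(subtotal * (1 + tax_lookup()), 2)

def driver():
    """Driver: supplies test cases and checks the results."""
    stub = lambda: 0.10                       # stub for the real tax service
    assert order_total(100.0, stub) == 110.0  # independent path
    assert order_total(0.0, stub) == 0.0      # boundary condition
    try:
        order_total(-1.0, stub)               # error-handling path
        assert False, "expected ValueError"
    except ValueError:
        pass
    return "all unit tests passed"

print(driver())
```

Note how the stub makes the module testable in isolation: the real tax service never needs to exist for the unit test to run.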
15
Integration Testing Strategies
Options: the "big bang" approach, or an
incremental construction strategy
16
Integration Testing Strategies
  • Big Bang
  • System is tested as a whole
  • Components are combined before testing
  • How to tell where errors came from?
  • If errors are corrected do they
  • Spawn new errors?
  • Allow previously missed errors to show?

17
Integration Testing Strategies
  • Incremental Integration
  • Construct and test in small increments
  • Errors easier to isolate
  • Interfaces can be tested more thoroughly
  • Testing can proceed systematically

18
Top Down Integration
  • The top module is tested with stubs
  • Stubs are replaced one at a time, "depth first"
  • As new modules are integrated, some subset of
    tests is re-run
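Top-down integration can be sketched as follows. The modules here are hypothetical stand-ins: the top-level module takes its collaborators as injectable parameters, starts out wired to stubs, and has each stub replaced by the real module one at a time while the tests are re-run.

```python
# Top-down integration sketch (hypothetical modules A, B, C).
# A is real from the start; the modules below it begin as stubs
# and are swapped in one at a time, "depth first".

def stub_b():
    return "stub-B"      # canned answer standing in for real B

def stub_c():
    return "stub-C"

def real_b():
    return "real-B"

def module_a(b=stub_b, c=stub_c):
    """Top-level module; its collaborators are injectable."""
    return f"A({b()},{c()})"

# Step 1: test A alone; everything below it is stubbed.
assert module_a() == "A(stub-B,stub-C)"

# Step 2: replace one stub and re-run the subset of tests.
assert module_a(b=real_b) == "A(real-B,stub-C)"
```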
19
Bottom-Up Integration
  • Worker modules are grouped into builds (clusters)
    and integrated
  • Drivers exercise each cluster
  • Drivers are replaced one at a time, "depth first",
    as integration moves upward
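A bottom-up sketch, with hypothetical worker modules: a throwaway driver exercises the low-level cluster first, and is later replaced by the real higher-level module.

```python
# Bottom-up integration sketch (hypothetical worker modules).
# Low-level workers are verified first via a throwaway driver;
# the driver is then replaced by the real higher-level module.

def worker_parse(text):
    return [int(x) for x in text.split(",")]

def worker_sum(values):
    return sum(values)

def cluster_driver(text):
    """Throwaway driver exercising the parse+sum cluster."""
    return worker_sum(worker_parse(text))

assert cluster_driver("1,2,3") == 6   # cluster verified in isolation

def real_report(text):
    """Real higher-level module that replaces the driver."""
    return f"total={worker_sum(worker_parse(text))}"

assert real_report("1,2,3") == "total=6"
```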
20
Sandwich Testing
  • Top modules are tested with stubs
  • Worker modules are grouped into builds (clusters)
    and integrated
21
Regression Testing
  • Software's behavior changes as modules are added
    or changed
  • New I/O
  • New control logic
  • Does it still work?
  • To find out, re-execute some subset of previous
    tests to confirm operation, i.e., regression
    testing
  • Sample tests to exercise functionality
  • Additional tests for functions likely to be
    affected by changes
  • Tests that focus on components that have changed
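The selection criteria above can be sketched as a small test-registry lookup. Everything here is hypothetical (the test names, the component tags): each test is tagged with the components it exercises, and after a change we re-run the tests touching the changed components plus an always-run sample.

```python
# Regression-suite selection sketch (hypothetical test registry).
# Each test is tagged with the components it exercises.

TESTS = {
    "test_login":     {"auth"},
    "test_checkout":  {"cart", "payment"},
    "test_search":    {"catalog"},
    "test_smoke_all": {"auth", "cart", "payment", "catalog"},
}

def select_regression_tests(changed, always_run=("test_smoke_all",)):
    """Pick tests affected by changed components, plus a sample."""
    picked = set(always_run)
    for name, components in TESTS.items():
        if components & changed:        # test touches a changed part
            picked.add(name)
    return sorted(picked)

print(select_regression_tests({"payment"}))
# picks test_checkout plus the always-run smoke test
```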

22
Smoke Testing
  • A common approach for creating daily builds for
    product software
  • Smoke testing steps
  • Software components that have been translated
    into code are integrated into a build
  • A build includes all data files, libraries,
    reusable modules, and engineered components that
    are required to implement one or more product
    functions
  • A series of tests is designed to expose errors
    that will keep the build from properly performing
    its function
  • The intent should be to uncover show stopper
    errors that have the highest likelihood of
    throwing the software project behind schedule
  • The build is integrated with other builds and the
    entire product (in its current form) is smoke
    tested daily
  • The integration approach may be top down or
    bottom up
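The smoke-test step can be sketched as a short script that runs a list of show-stopper checks against the build and rejects the build on any failure. The product module name and checks here are hypothetical placeholders.

```python
# Daily-build smoke test sketch (hypothetical checks).
# Run a short list of show-stopper checks against the build;
# any failure rejects the day's build.
import subprocess
import sys

SMOKE_TESTS = [
    [sys.executable, "-c", "print('build boots')"],  # stand-in check
]

def smoke_test(commands):
    """Return True only if every smoke check passes."""
    for cmd in commands:
        result = subprocess.run(cmd, capture_output=True)
        if result.returncode != 0:
            print("SMOKE FAILURE:", " ".join(cmd))
            return False
    return True

print("build accepted" if smoke_test(SMOKE_TESTS) else "build rejected")
```

The point is the intent, not the mechanics: the checks stay few and fast so they can run against every daily build.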

23
Object-Oriented Testing
  • Testing strategy changes
  • The concept of the unit broadens due to
    encapsulation
  • Integration focuses on classes and their
    execution across a thread or in the context of
    a usage scenario
  • Validation uses conventional black box methods
  • Test case design draws on conventional methods,
    but also encompasses special features

24
OOT Strategy
  • Class testing is the equivalent of unit testing
  • Operations within the class are tested
  • The state behavior of the class is examined
  • Integration applies three different strategies
  • Thread-based testing: integrates the set of
    classes required to respond to one input or event
  • Use-based testing: integrates the set of classes
    required to respond to one use case
  • Cluster testing: integrates the set of classes
    required to demonstrate one collaboration
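Class testing, the OO analogue of unit testing above, exercises both the class's operations and its state behavior. A minimal sketch with a hypothetical `Account` class:

```python
# Class-testing sketch (hypothetical Account class): test the
# operations AND the state behavior of the class.

class Account:
    def __init__(self):
        self.state = "open"
        self.balance = 0

    def deposit(self, amount):
        if self.state != "open":
            raise RuntimeError("account is closed")
        self.balance += amount

    def close(self):
        self.state = "closed"

# Operation test
acct = Account()
acct.deposit(50)
assert acct.balance == 50

# State-behavior test: a closed account must reject deposits
acct.close()
assert acct.state == "closed"
try:
    acct.deposit(10)
    assert False, "deposit on closed account should fail"
except RuntimeError:
    pass
```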

25
High Order Testing
  • Validation testing
  • Focus is on software requirements
  • Does this program match the specifications?
  • Does this program do what the customer wants?
  • BIG problem if testing uncovers an error. Why?
  • Alpha/Beta testing
  • Focus is on customer usage
  • Hard to tell how real customers will use a system
  • Alpha: conducted at the developer's site by
    end-users in a controlled environment
  • Beta: conducted at end-user sites as a live
    application

26
High Order Testing
  • System testing
  • Purpose is to fully exercise the system
  • Software is but one element in a larger construct
  • People
  • Equipment
  • Recovery testing
  • Forces the software to fail in a variety of ways
    and verifies that recovery is properly performed

27
High Order Testing
  • Security testing
  • Verifies that protection mechanisms built into a
    system will, in fact, protect it from improper
    penetration
  • Stress testing
  • Executes a system in a manner that demands
    resources in abnormal quantity, frequency, or
    volume
  • Performance Testing
  • Test the run-time performance of software within
    the context of an integrated system

28
Debugging: A Diagnostic Process
29
Consequences of Bugs
30
The Debugging Process
Test cases are executed and the results examined;
debugging proceeds from suspected causes to
identified causes, yielding corrections that are
verified by regression tests and new test cases.
31
Debugging Effort
  • Time required to diagnose the symptom and
    determine the cause
  • Time required to correct the error and conduct
    regression tests
32
Symptoms and Causes
How is coupling related to this?
  • Symptom and cause may be geographically separated
  • Symptom may disappear when another problem is fixed
  • Cause may be due to a combination of non-errors
  • Cause may be due to a system or compiler error
  • Cause may be due to assumptions that everyone
    believes
  • Symptom may be intermittent
38
Debugging Techniques
  • Brute force / testing
  • Backtracking
  • Induction
  • Deduction
39
Debugging Techniques
  • Brute force
  • Load up on
  • Output statements
  • Memory dumps
  • Traces
  • We're looking for a clue
  • Huge amount of time and effort
  • Backtracking
  • Start from error
  • Work backwards
  • Size of program may lead to many possible paths

40
Debugging Techniques
  • Cause elimination
  • Induction
  • Inference of a generalized conclusion from
    particular instances
  • Jones was late yesterday, so he'll probably be
    late today.
  • Deduction
  • Inference in which the conclusion about
    particulars follows necessarily from general or
    universal premises
  • Socrates is a stonemason. Socrates is a
    philosopher. Therefore, at least one stonemason
    is a philosopher.

41
Debugging Techniques
  • Cause elimination
  • Method 1
  • Collect data about the error
  • Advance a hypothesis using this data
  • Use the data to prove/disprove the hypothesis
  • Method 2
  • List all possible causes of error
  • Develop tests to eliminate each possible cause
  • Automated debugging tools
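Method 2 can be sketched directly in code. The failing function and the candidate causes below are hypothetical: list the possible causes, write one small check per cause, and keep only the causes the checks fail to eliminate.

```python
# Cause-elimination sketch ("Method 2"): list possible causes,
# then run one check per cause to try to eliminate it.

def parse_price(text):
    # Hypothetical buggy function under investigation:
    # fails on inputs such as "1,50".
    return float(text)

candidate_causes = {
    "empty input":        lambda: parse_price("") is not None,
    "comma decimal mark": lambda: parse_price("1,50") == 1.5,
    "plain number":       lambda: parse_price("2.5") == 2.5,
}

surviving = []
for cause, check in candidate_causes.items():
    try:
        ok = check()
    except Exception:
        ok = False
    if not ok:
        surviving.append(cause)   # this cause is NOT eliminated

print("causes not eliminated:", surviving)
```

The checks that pass eliminate their causes; whatever survives is where the diagnosis effort should go next.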

42
Debugging Final Thoughts
  • Don't run off half-cocked, think about the
    symptom you're seeing
  • Use tools (e.g., dynamic debugger) to gain more
    insight
  • If at an impasse
  • Step away from problem a while
  • Get help from someone else
  • Be absolutely sure to conduct regression tests
    when you do "fix" the bug

43
Summary
  • Testing is the largest portion of the technical
    effort in the software process
  • Objective is to find deficiencies, not prove the
    software is correct
  • Testing is a systematic, planned activity
  • Debugging is still an art form