Title: Integration Testing: the ghostly grey area between white box testing and black box testing

1. Integration Testing: the ghostly grey area between white box testing and black box testing
- Presented at the IQAA 2006 Conference: Quality Under Pressure
- October 13, 2006
- Indianapolis Quality Assurance Association
- http://www.iqaa.org
2. Your Panelists
- Peter Ackerman, SEI CMMI-Stage I
- Kenneth L. Shafer, CDP/CCP
3. Integration Testing
- What is it?
- How is it different from Unit Testing / White Box Testing and Systems Testing / Black Box Testing?
- When is it appropriate to do Integration Testing?
- What are some perspectives, techniques, and strategies for Integration Testing?
- What are some examples of gotchas that you should look for and address with Integration Testing?
- What are some good references for follow-up?
4. What is Integration Testing?
- Definition by Daniel Mosley in The Handbook of MIS Application Software Testing: Methods, Techniques, and Tools for Assuring Quality Through Testing (1993): "Integration testing is the testing of modules, programs, subsystems, and even entire systems to prove that they can interact properly. The common link among modules, programs, subsystems, etc. is that they can 1) share data globally and 2) they can share data locally defined, through specially designed and constructed interfaces."
- Mosley was heavily influenced by Glenford Myers and The Art of Software Testing (1979).
5. What is Integration Testing?
- Characteristics of Integration Testing:
- A higher-order type of testing than Unit Testing, but falling short of Systems Testing or User Acceptance Testing
- Performed after Unit Testing but before Systems Testing
- Often reveals undetected design flaws
- Can be done non-incrementally (big-bang approach) or incrementally (e.g., top-down, bottom-up, and sandwich approaches)
- Can apply to custom-developed software or COTS integration
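The incremental approaches above rely on stubs (for units below the one under test) and drivers (for units above it). A minimal sketch of top-down incremental integration, with hypothetical names, might look like this: the real payment gateway is not yet integrated, so a stub stands in for it while the interface between the two units is exercised.

```python
class PaymentGatewayStub:
    """Stub for a lower-level unit that is not yet integrated."""
    def charge(self, amount_cents):
        # Canned response: always approve, and record the call so the
        # test can verify what crossed the interface.
        self.last_charge = amount_cents
        return {"status": "approved", "amount": amount_cents}

class OrderProcessor:
    """Higher-level unit under integration test."""
    def __init__(self, gateway):
        self.gateway = gateway

    def place_order(self, amount_cents):
        result = self.gateway.charge(amount_cents)
        return result["status"] == "approved"

stub = PaymentGatewayStub()
processor = OrderProcessor(stub)
assert processor.place_order(1999) is True
assert stub.last_charge == 1999   # the interface carried the right data
```

When the real gateway is later integrated, the stub is replaced and the same test is re-run against the real interface.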
6. How is Integration Testing Different?
- Integration Testing is not Unit Testing:
- It is higher-order than Unit Testing
- It presumes Unit Testing has been performed
- Knowledge of the internal structure of the code unit is not as important
- It emphasizes the interactions, interfaces, and connections between units, not the behavior within a unit
7. How is Integration Testing Different?
- Integration Testing is not Systems Testing or User Acceptance Testing:
- It is lower-order than Systems Testing or UAT
- It presumes Unit Testing and Integration Testing have already been performed
- The functionality of the integration of a limited number of modules may still be inadequate to accomplish a business purpose
- It emphasizes technical interactions among modules
- It is conducted by a programmer/developer or a system architect, rather than a tester/QA person or a user
8. When do you do Integration Testing?
- When the system is very large (500K LOC)
- When the system has a large number of internal or external interfaces (trading partners)
- When the product involves integrating hardware and software
- When the system architecture involves n-tier or distributed computing, as opposed to a monolithic structure
- When you are observing too many defects in System Test or UAT
- When your system is being developed in a mixed-language environment or must interface with legacy systems
9. When do you do Integration Testing?
- When it is difficult to locate the source of the defects identified during System Test or UAT
- When you have a large number of interactions among software components
- When you are integrating COTS with custom-developed software
- When the system being developed is a real-time system
- When you have requirements for higher quality (doing Integration Testing as a transition between Unit Testing and Systems Testing is preferred to just doing more Systems Testing)
10. Perspectives, Techniques, and Strategies
- Always keep in mind Roger Pressman's three perspectives of software engineering:
- Data-oriented perspective: data at rest
- Functional perspective: data flow, or data in motion
- Behavioral perspective: data states changing over time
11. Data-Oriented Techniques and Strategies
- Entity-Relationship Diagram (ERD) data model verification; applies especially to COTS
- Create/Read/Update/Delete (CRUD) verification of the data model at the column (data element) level of granularity
- Special consideration to data types and potential conflicts among modules, especially in mixed-language environments
- Special consideration to control totals and checksums when converting data from legacy systems
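Column-level CRUD verification can be sketched as a small script against the data model. This is a minimal illustration using an in-memory SQLite database and a hypothetical `customer` table; the idea is that every module-facing column should round-trip correctly through Create, Read, Update, and Delete.

```python
import sqlite3

# Hypothetical schema standing in for one table of the data model.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, credit_limit INTEGER)"
)

# Create
conn.execute("INSERT INTO customer (id, name, credit_limit) VALUES (1, 'Acme', 5000)")

# Read: verify each column round-trips with the expected value and type
row = conn.execute("SELECT id, name, credit_limit FROM customer WHERE id = 1").fetchone()
assert row == (1, "Acme", 5000)

# Update: verify the column accepts a new value
conn.execute("UPDATE customer SET credit_limit = 7500 WHERE id = 1")
assert conn.execute("SELECT credit_limit FROM customer WHERE id = 1").fetchone()[0] == 7500

# Delete: verify the row is actually removed
conn.execute("DELETE FROM customer WHERE id = 1")
assert conn.execute("SELECT COUNT(*) FROM customer").fetchone()[0] == 0
```

In practice one such check per column, per module that touches the column, exposes conflicting assumptions among modules early.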
12. Functional-Oriented Techniques and Strategies
- Emphasize inspection of module interfaces (internal interfaces), such as subprogram parameters, object methods, and inter-module messages
- Special attention to module error testing: error detection and recovery
- Also inspect external interfaces at the system boundary (ftp, files, cartridge tape)
- Pay special attention to conflicts in data types or data representations (e.g., single-precision vs. double-precision floating point), especially in mixed-language environments
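The single-vs-double precision conflict can be demonstrated in a few lines. This sketch simulates a module boundary where one side computes in double precision but the interface stores the value as a 32-bit float, silently losing low-order digits:

```python
import struct

# One side computes in double precision...
value = 0.1 + 0.2

# ...but the interface packs it into a 32-bit (single-precision) float,
# as a C float field or a Fortran REAL*4 parameter would.
as_single = struct.unpack("f", struct.pack("f", value))[0]

assert value != as_single               # precision was lost at the interface
assert abs(value - as_single) < 1e-7    # ...but only in the low-order digits
```

The loss is invisible in casual testing because the values agree to several digits; it surfaces later in control totals, checksums, or accumulated rounding error.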
13. Functional-Oriented Techniques and Strategies
- Capture snapshots of data flows between components (messages, transactions, temporary files, permanent files)
- Pay special attention to control flow or screen navigation, especially with Web-based or GUI interfaces
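One way to capture those snapshots without modifying either component is to wrap the channel between them. A minimal sketch, with hypothetical names, where the wrapper records each message just before it crosses the boundary:

```python
captured = []   # snapshots of everything that crossed the boundary

def capturing_send(real_send):
    """Wrap the send side of a channel so each message is snapshotted."""
    def send(message):
        captured.append(dict(message))   # copy, so later mutation can't hide bugs
        return real_send(message)
    return send

# Hypothetical downstream component: prices a line item from a message.
def downstream(message):
    return message["qty"] * message["unit_price"]

send = capturing_send(downstream)
assert send({"qty": 3, "unit_price": 200}) == 600
assert captured == [{"qty": 3, "unit_price": 200}]
```

The captured list then serves as evidence when a defect must be localized to one side of the interface or the other.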
14. Behavioral-Oriented Techniques and Strategies
- Perform life-cycle testing on each data entity (object) to confirm that it can attain each of its permitted states, or statuses
- Understand that the purpose of a transaction is to move an object from one status to another, not just to update the database
- Confirm that the state-to-state transitions of those objects occur under the correct conditions
- Perform round-trip testing, or end-to-end testing, in transaction-based systems, achieving the correct persistence in geographically separated databases
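Life-cycle testing of an entity amounts to exercising its state machine. A minimal sketch with a hypothetical order entity: the test confirms every permitted state is reachable and that an illegal transition is rejected.

```python
# Hypothetical permitted transitions for one data entity.
ALLOWED = {
    "new":       {"submitted"},
    "submitted": {"approved", "rejected"},
    "approved":  {"closed"},
    "rejected":  {"closed"},
    "closed":    set(),
}

class Order:
    def __init__(self):
        self.state = "new"

    def transition(self, target):
        if target not in ALLOWED[self.state]:
            raise ValueError(f"illegal transition {self.state} -> {target}")
        self.state = target

# Confirm a legal path reaches each permitted state in turn.
o = Order()
for step in ("submitted", "approved", "closed"):
    o.transition(step)
assert o.state == "closed"

# Confirm an illegal transition is rejected under the wrong conditions.
o2 = Order()
try:
    o2.transition("closed")   # new -> closed is not permitted
    assert False, "illegal transition was accepted"
except ValueError:
    pass
```

The same pattern scales: enumerate every (state, transition) pair and assert that exactly the permitted ones succeed.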
15. Special Considerations for Real-Time Systems
- Integration testing of real-time systems must pay close attention to how real-time components communicate and interact with each other
- In particular, pay special attention to:
- Shared memory (may require a special independent "peek" utility to watch the data on the fly)
- Messages (may require a special tool or utility to capture the messages on the fly)
- Semaphores / synchronization locks
- Knowledge of low-level hardware/software interaction
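The semaphore and shared-memory interactions above can be exercised in miniature. This is a sketch, not a real-time harness: a producer and consumer share a buffer, a lock guards the shared data, and a counting semaphore signals item availability, which is the pattern an integration test would drive and then verify ordering and completeness.

```python
import threading

buffer = []                       # stands in for the shared memory region
slots = threading.Semaphore(0)    # counts items available to the consumer
lock = threading.Lock()           # guards the shared buffer
results = []

def producer():
    for i in range(5):
        with lock:
            buffer.append(i)
        slots.release()           # signal: one more item available

def consumer():
    for _ in range(5):
        slots.acquire()           # block until the producer signals
        with lock:
            results.append(buffer.pop(0))

t_cons = threading.Thread(target=consumer)
t_prod = threading.Thread(target=producer)
t_cons.start()
t_prod.start()
t_prod.join()
t_cons.join()

# Every item arrived, exactly once, in order.
assert results == [0, 1, 2, 3, 4]
```

In a real system the "peek" would be an independent utility reading the shared segment, so the observation does not perturb the timing being tested.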
16. Examples
- Point-of-Sale system: the C system failed to send the correct PICture of a credit/debit item to the COBOL Standard Financial Accounting System, due to inadequate knowledge of data types
- Point-of-Sale system: produced conflicting reports on transactions due to multiple date columns referenced differently by multiple programmers
- Motor Vehicle System: a message returned from the application server was truncated because COBOL low-values were interpreted as a string terminator by the C middleware
- Real-Time Process Control system: hung in an infinite loop due to mis-interpretation of a special system call to a UART board, not distinguishing between the character queue size and the overflow bit
- Banking system: did not properly append/scratch an external feed or remove duplicates caused by multiple transmissions
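The low-values truncation gotcha is easy to reproduce. COBOL LOW-VALUES are binary zeros used as filler, but C string handling treats the first zero byte as the end of the string. A sketch with a made-up fixed-width record:

```python
# Hypothetical fixed-width COBOL record, padded with LOW-VALUES (binary zeros).
cobol_record = b"SMITH\x00\x00\x00JOHN      "

def c_style_read(buf):
    """Mimics C middleware copying the buffer with strcpy/strlen semantics:
    everything from the first zero byte onward is silently dropped."""
    return buf.split(b"\x00", 1)[0]

received = c_style_read(cobol_record)
assert received == b"SMITH"              # the rest of the record was lost
assert len(received) < len(cobol_record)
```

The fix on the COBOL side is typically to pad with SPACES rather than LOW-VALUES, or on the C side to copy by declared record length rather than by terminator.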
17. References
- Glenford Myers, The Art of Software Testing, 1979
- Robert Glass, Facts and Fallacies of Software Engineering, 2002
- Roger Pressman, Software Engineering, 2004
- Daniel Mosley, The Handbook of MIS Application Software Testing, 1993
- David Schultz, A Case Study in System Integration using the Build Approach, ACM, 1979
- Bill Curtis, A Field Study of the Software Design Process for Large Systems, ACM, 1988
- Brian Marick, many articles on object-oriented testing
18. Panel Discussion
- Peter Ackerman, SEI CMMI-Stage I
- Kenneth L. Shafer, CDP/CCP