ICS 52: Introduction to Software Engineering

1
ICS 52: Introduction to Software Engineering
  • Lecture Notes for Summer Quarter, 2003
  • Michele Rousseau
  • Topic 11
  • Partially based on lecture notes written by
    Sommerville, Frost, Van Der Hoek, Taylor, and
    Tonne. Duplication of course material for any
    commercial purpose without the written permission
    of the lecturers is prohibited.

2
Today's Lecture
  • Quality assurance
  • An introduction to testing

3
ICS 52 Life Cycle
(Diagram: requirements phase → verify; implementation phase → test; testing phase → verify.)
4
Implementation/Testing Interaction
Implementation (previous lecture)
Testing (this lecture)
5
The Seriousness of the Problem
  • Mars Pathfinder: metric or English system?
  • Audi 5000: auto-accelerate feature or fault?
  • Mariner 1: launch veered off course
  • AT&T telephone network: down for 9 hours
  • Ariane 5
  • Pentium FPU error
  • X-ray machine over-radiation
  • LAS (London Ambulance Service)

6
Impact of Failures
  • Not just out there
  • Mars Pathfinder
  • Mariner 1
  • Ariane 5
  • But also at home
  • Your car
  • Your call to your mom
  • Your homework
  • Your hospital visit

Peter Neumann's Risks Forum: http://catless.ncl.ac.uk/Risks
7
Quality Assurance
  • What qualities do we want to assure?
  • Correctness (most important?)
  • How to assure correctness?
  • By running tests
  • How else?
  • Can qualities other than correctness be assured?
  • How is testing done?
  • When is testing done?
  • Who tests?
  • What are the problems?

8
Software Qualities
  • Correctness
  • Reliability
  • Robustness
  • Performance
  • Usability
  • Verifiability
  • Maintainability
  • Repairability
  • Safety
  • Evolvability
  • Reusability
  • Portability
  • Survivability
  • Understandability

We want to show relevant qualities exist
9
Quality Assurance
  • Assure that each of the software qualities is met
  • Goals set in requirements specification
  • Goals realized in implementation
  • Sometimes easy, sometimes difficult
  • Portability versus safety
  • Sometimes immediate, sometimes delayed
  • Understandability versus evolvability
  • Sometimes provable, sometimes doubtful
  • Size versus correctness

10
Verification and Validation
  • Verification
  • Are we building the product right? (Boehm)
  • The software should conform to its specification
  • testing, reviews, walk-throughs, inspections
  • internal consistency; consistency with the
    previous step
  • Validation
  • Are we building the right product?
  • The software should do what the user really
    requires
  • ascertaining that the software meets the
    customer's intent
  • Correctness has no meaning independent of
    specifications

11
Problem 1: Eliciting the Customer's Intent
(Diagram: the customer's real needs, the correct specs, and the actual specs.)
No matter how sophisticated the QA process
is, there is still the problem of creating the
initial specification
12
Problem 2: QA Is Tough
  • Complex data communications (e.g., electronic
    fund transfer)
  • Distributed processing (e.g., a Web search engine)
  • Stringent performance objectives (e.g., an air
    traffic control system)
  • Complex processing (e.g., a medical diagnosis
    system)

Sometimes the software system is extremely
complicated, making it tremendously difficult to
perform QA
13
Problem 3: Management Aspects of QA
  • Who does what part of the testing?
  • QA (Quality Assurance) team?
  • Are developers involved?
  • How independent is the independent testing group?
  • What happens when bugs are found?
  • What is the reward structure?

(Diagram: Project Management, Development Group, and QA Group, with question marks on the relationships between them.)
14
Problem 4: QA vs. Developers
  • Quality assurance lays out the rules
  • "You will check in your code every day"
  • "You will comment your code"
  • "You will ..."
  • Quality assurance also uncovers the faults
  • Raps developers on the knuckles
  • Creates an image of competition
  • Quality assurance is viewed as cumbersome
  • "Just let me code!"
  • What about rewards?

Quality assurance has a negative connotation
15
Problem 5: Can't Test Exhaustively
There are 10^14 possible paths! If we execute one
test per millisecond, it would take 10^14 ms ≈ 10^11
seconds, or about 3,170 years, to test this
program!! ⇒ Out of the question
16
Simple Example: A 32-Bit Multiplier
  • Input: two 32-bit integers
  • Output: the 64-bit product of the inputs
  • Testing hardware checks one billion products
    per second (or roughly one check per 2^-30
    seconds)
  • How long to check all possible products?
  • 2^64 × 2^-30 seconds = 2^34 seconds ≈ 512 years
    (see the sketch below)
  • What if the implementation is based on table
    lookups?
  • How would you know that the spec is correct?

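As a rough sanity check on the estimate above, here is a minimal back-of-the-envelope sketch in Java; the class name and the assumption of 2^30 checks per second are ours. With 365-day years the figure comes out near 545; the slide's ≈512 presumably rounds a year down to 2^25 seconds.

    // Back-of-the-envelope check of the exhaustive-testing estimate above.
    // Assumes the test hardware checks 2^30 (about one billion) products per second.
    public class MultiplierTestTime {
        public static void main(String[] args) {
            double inputPairs = Math.pow(2, 64);       // all pairs of 32-bit inputs
            double checksPerSecond = Math.pow(2, 30);  // about one billion per second
            double seconds = inputPairs / checksPerSecond;    // 2^34 seconds
            double years = seconds / (365.0 * 24 * 60 * 60);  // 365-day years
            System.out.printf("%.3e seconds, about %.0f years%n", seconds, years);
        }
    }
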
17
An Idealized View of QA
Complete formal specs of the problem to be solved
  ↓ correctness-preserving transformation
Design, in formal notation
  ↓ correctness-preserving transformation
Code, in a verifiable language
  ↓ correctness-preserving transformation
Executable machine code
  ↓ correctness-preserving transformation
Execution on verified hardware
18
A Realistic View of QA
Mixture of formal and informal specifications
  ↓ manual transformation
Design, in mixed notation
  ↓ manual transformation
Code, in C, Ada, Java, ...
  ↓ compilation by commercial compiler
Pentium machine code
  ↓ commercial firmware
Execution on commercial hardware
19
The V&V Process
  • Is a whole life-cycle process: V&V must be
    applied at each stage in the software process
  • Has two principal objectives
  • The discovery of defects in a system
  • The assessment of whether or not the system is
    usable in an operational situation

20
Static and dynamic verification
  • Software inspections Concerned with analysis of
    the static system representation to discover
    problems (static verification)
  • May be supplemented by tool-based document and
    code analysis
  • Software testing Concerned with exercising and
    observing product behaviour (dynamic
    verification)
  • The system is executed with test data and its
    operational behaviour is observed

21
Static and dynamic V&V
22
V&V confidence
  • Depends on the system's purpose, user
    expectations, and marketing environment
  • Software function
  • The level of confidence depends on how critical
    the software is to an organisation
  • User expectations
  • Users may have low expectations of certain kinds
    of software
  • Marketing environment
  • Getting a product to market early may be more
    important than finding defects in the program

23
V&V planning
  • Careful planning is essential
  • Start early: remember the V model
  • Perpetual Testing
  • Balance static verification and testing
  • Define standards for the testing process rather
    than describing product tests

24
Static Analysis
  • Software Inspection
  • Examine the source representation with the aim of
    discovering anomalies and defects
  • May be used before implementation
  • May be applied to any representation of the
    system (requirements, design, test data, etc.)
  • Very effective technique for discovering errors

25
Inspection success
  • Many different defects may be discovered in a
    single inspection.
  • In testing, one defect may mask another, so
    several executions are required
  • Inspections reuse domain and programming
    knowledge, so reviewers are likely to have seen
    the types of error that commonly arise

26
Inspections and testing
  • Inspections and testing are complementary and not
    opposing verification techniques
  • Both should be used during the V&V process
  • Inspections can check conformance with a
    specification
  • Can't check conformance with the customer's real
    requirements
  • Cannot validate dynamic behaviour
  • Inspections cannot check non-functional
    characteristics such as performance, usability,
    etc.

28
Testing
  • The only validation technique for non-functional
    requirements
  • Should be used in conjunction with static
    verification to provide full V&V coverage

"Program testing can be used to show the presence
of bugs, but never to show their absence."
(E. W. Dijkstra)
29
What is Testing?
  • Exercising a module, collection of modules, or
    system
  • Use predetermined inputs (test case)
  • Capture actual outputs
  • Compare actual outputs to expected outputs
  • Actual outputs equal to expected outputs ⇒
    test case succeeds
  • Actual outputs unequal to expected outputs ⇒
    test case fails (see the sketch below)

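A minimal sketch of these steps in Java, assuming a hypothetical unit under test named max; the predetermined inputs, the expected output, and the comparison together make up the test case:

    // Minimal test case: predetermined inputs, captured actual output,
    // comparison against the expected output. "max" is a hypothetical unit.
    public class MaxTest {
        static int max(int a, int b) { return (a > b) ? a : b; }  // unit under test
        public static void main(String[] args) {
            int input1 = 3, input2 = 7;        // predetermined inputs
            int expected = 7;                  // expected output, from the spec
            int actual = max(input1, input2);  // captured actual output
            if (actual == expected) {
                System.out.println("test case succeeds");
            } else {
                System.out.println("test case fails: expected " + expected
                    + ", got " + actual);
            }
        }
    }
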
30
Limits of software testing
  • Good testing will find bugs
  • Good testing is based on requirements, i.e.,
    testing tries to find differences between the
    expected and the observed behavior of systems or
    their components
  • V&V should establish confidence that the software
    is fit for purpose
  • BUT remember: testing can only prove the presence
    of bugs, never their absence; it can't prove the
    software is defect-free
  • Rather, it must be good enough for its intended
    use and the type of use will determine the degree
    of confidence that is needed

31
Testing Terminology
  • Failure: incorrect or unexpected output, based
    on the specifications
  • Symptom of a fault
  • Fault: invalid execution state
  • Symptom of an error
  • May or may not produce a failure
  • Error: defect, anomaly, or bug in the source code
  • May or may not produce a fault

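A hypothetical Java illustration of the three terms (the sum example is ours, not from the slides): the error is a defect in the loop bound, it puts the computation into an invalid state (a fault), and that fault surfaces as a failure only for some inputs:

    // Error: the loop bound should be i < data.length; as written it skips the
    // last element. Fault: for any non-empty array the running total reaches an
    // invalid state. Failure: the wrong sum that the caller observes.
    public class SumExample {
        static int sum(int[] data) {
            int total = 0;
            for (int i = 0; i < data.length - 1; i++) {  // error (defect in source)
                total += data[i];                        // fault (invalid state)
            }
            return total;
        }
        public static void main(String[] args) {
            System.out.println(sum(new int[] {5, 9, 2}));  // failure: prints 14, spec says 16
            System.out.println(sum(new int[] {}));         // no visible failure here
        }
    }
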
32
Testing and debugging
  • Defect testing and debugging are different
    processes
  • V&V establishes the existence of defects in a
    program
  • Debugging locates and repairs them

33
The debugging process
34
Testing Goals
  • Reveal failures/faults/errors
  • Locate failures/faults/errors
  • Show system correctness
  • Improve confidence that the system performs as
    specified (verification)
  • Improve confidence that the system performs as
    desired (validation)
  • Desired Qualities
  • Accurate
  • Complete / thorough
  • Repeatable
  • Systematic

35
Test Tasks
  • Devise test cases
  • Target specific areas of the system
  • Create specific inputs
  • Create expected outputs
  • Choose test cases
  • Not all need to be run all the time
  • Regression testing
  • Run test cases
  • Can be labor intensive

All in a systematic, repeatable, and accurate
manner
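
As a small illustration of regression testing, here is a hypothetical suite of stored (input, expected) pairs that can be re-run after every change; the unit under test and the cases are invented for the sketch:

    // Regression testing sketch: stored test cases are re-run after changes.
    public class RegressionSuite {
        static int square(int x) { return x * x; }  // hypothetical unit under test
        public static void main(String[] args) {
            int[][] cases = { {0, 0}, {3, 9}, {-4, 16} };  // {input, expected output}
            int failures = 0;
            for (int[] c : cases) {
                int actual = square(c[0]);
                if (actual != c[1]) {
                    failures++;
                    System.out.println("case " + c[0] + " fails: expected "
                        + c[1] + ", got " + actual);
                }
            }
            System.out.println(failures == 0
                ? "all regression cases succeed"
                : failures + " case(s) fail");
        }
    }
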
36
Levels of Testing
  • Unit/component testing: testing of a code unit
    (subprogram, class, method/function, small
    subsystem)
  • Often requires the use of test drivers
  • Integration testing: testing of the interfaces
    between units
  • Incremental or big-bang approach?
  • Often requires drivers and stubs (see the sketch
    below)
  • System or acceptance testing: testing the
    complete system for satisfaction of the
    requirements
  • Often performed by the user/customer

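A minimal sketch of a test driver and a stub in Java, assuming a hypothetical unit (ReportGenerator) that depends on a database component which is not yet available; all names are invented:

    // Stub: stands in for a component the unit under test depends on.
    // Driver: supplies inputs to the unit and checks its outputs.
    interface Database {
        int recordCount();
    }
    class DatabaseStub implements Database {
        public int recordCount() { return 42; }  // canned answer from the stub
    }
    class ReportGenerator {                      // unit under test
        private final Database db;
        ReportGenerator(Database db) { this.db = db; }
        String summary() { return "records: " + db.recordCount(); }
    }
    public class ReportGeneratorDriver {
        public static void main(String[] args) {
            ReportGenerator unit = new ReportGenerator(new DatabaseStub());
            String actual = unit.summary();
            System.out.println("records: 42".equals(actual)
                ? "test case succeeds" : "test case fails");
        }
    }
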
37
What is the problem we need to address?
  • Want to verify software →
  • Need to test →
  • Need to decide on test cases →
  • But no set of test cases guarantees the absence
    of bugs
  • So, what is a systematic approach to the selection
    of test cases that will lead to the accurate,
    acceptably thorough, and repeatable
    identification of errors, faults, and failures?

38
Two Approaches
  • White box (or Glass Box) testing
  • Structural testing
  • Test cases are designed, selected, and run based
    on the structure of the code
  • Scale: tests the nitty-gritty
  • Drawback: needs access to the source code
  • Black box testing
  • Specification-based testing
  • Test cases are designed, selected, and run based
    on the specifications
  • Scale: tests the overall system behavior
  • Drawback: less systematic (a small example of
    both approaches follows)

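A small hypothetical example of both approaches for a trivial unit (abs): the black-box cases are derived from the specification "return |x|" without looking at the code, while the white-box inputs are chosen so that both branches of the conditional are executed:

    public class AbsTesting {
        static int abs(int x) { return (x < 0) ? -x : x; }  // unit under test
        public static void main(String[] args) {
            // Black-box: (input, expected output) pairs derived from the spec.
            int[][] blackBox = { {5, 5}, {-5, 5}, {0, 0} };
            for (int[] c : blackBox) {
                System.out.println("black-box abs(" + c[0] + "): "
                    + (abs(c[0]) == c[1] ? "succeeds" : "fails"));
            }
            // White-box: -1 and 1 together execute both branches of abs.
            for (int x : new int[] { -1, 1 }) {
                System.out.println("white-box abs(" + x + ") = " + abs(x));
            }
        }
    }
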
39
Test Oracles
  • Provide a mechanism for deciding whether a test
    case execution succeeds or fails
  • Critical to testing
  • Used in white box testing
  • Used in black box testing
  • Difficult to automate
  • Typically relies on humans
  • Typically relies on human intuition
  • Formal specifications may help

40
Example
  • Your test shows cos(0.5) = 0.8775825619
  • You have to decide whether this answer is
    correct
  • You need an oracle
  • Draw a triangle and measure the sides
  • Look up cosine of 0.5 in a book
  • Compute the value using Taylor series expansion
  • Check the answer with your desk calculator

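As a sketch of the Taylor-series oracle mentioned above, the snippet below sums a few terms of cos(x) = 1 - x^2/2! + x^4/4! - ... and compares the result against the value the test produced; the class name and the tolerance are our choices:

    // Taylor-series oracle for cos(0.5), used to judge the tested value.
    public class CosineOracle {
        static double cosTaylor(double x, int terms) {
            double sum = 0.0, term = 1.0;  // term for n = 0 is 1
            for (int n = 0; n < terms; n++) {
                sum += term;
                term *= -x * x / ((2 * n + 1) * (2 * n + 2));  // next series term
            }
            return sum;
        }
        public static void main(String[] args) {
            double observed = 0.8775825619;      // value produced by the test
            double oracle = cosTaylor(0.5, 10);  // independent oracle value
            System.out.println(Math.abs(observed - oracle) < 1e-9
                ? "test output agrees with the oracle"
                : "test output disagrees with the oracle");
        }
    }
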
41
Use the Principles
  • Rigor and formality
  • Separation of concerns
  • Modularity
  • Abstraction
  • Anticipation of change
  • Generality
  • Incrementality