Fundamentals of Software Testing - PowerPoint PPT Presentation

Transcript and Presenter's Notes
1
Fundamentals of Software Testing
  • UNIT 1

2
Testing
  • Testing involves operation of a system or
    application under controlled conditions and
    evaluating the results.
  • The controlled conditions should include both
    normal and abnormal conditions.
  • Testing should intentionally attempt to make
    things go wrong, to determine if things happen
    when they shouldn't, or things don't happen when
    they should.
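
The idea above can be sketched as a tiny test (a hypothetical divide() function is assumed for illustration):

```python
# Sketch: testing under controlled conditions, covering both a normal
# and an abnormal condition. divide() is a hypothetical example.

def divide(a, b):
    if b == 0:
        raise ValueError("division by zero")
    return a / b

# Normal condition: valid inputs produce the expected result.
assert divide(10, 2) == 5

# Abnormal condition: invalid input must fail in the expected,
# controlled way -- not silently return a wrong value.
try:
    divide(1, 0)
    raise AssertionError("expected ValueError")
except ValueError:
    pass
```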

3
Testing Terminology
  • Errors
  • Faults/Defects
  • Failures
  • Test Cases
  • Test
  • Test oracle
  • Test Bed
  • Software Quality

4
What is a Bug ?
  • A flaw in a system or system component that
    causes the system or component to fail to perform
    its required function.
  • Most are simple, subtle failures, with many being
    so small that it's not always clear which ones
    are true failures, and which ones aren't.

5
Terms for software failures
  • 1. Defect
  • 2. Fault
  • 3. Problem
  • 4. Error
  • 5. Incident
  • 6. Anomaly
  • 7. Variance
  • 8. Failure
  • 9. Inconsistency
  • 10. Feature
  • 11. Bug

6
Fault, failure & defect
  • Tend to imply a condition that's really severe,
    maybe even dangerous.
  • It doesn't sound right to call an incorrectly
    colored icon a fault.
  • These words also tend to imply blame: "it's his
    fault that the s/w failed."

7
Anomaly, incident & variance
  • Don't sound quite so negative and infer more
    unintended operation than an all-out failure.
  • "The president stated that it was a software
    anomaly that caused the missile to go off course."

8
Problem, error & bug
  • Probably the most generic terms used.
  • "A bug's a bug's a bug."

9
but in the SDLC these terms differ, like
  • Design deviation -- Error
  • Coding deviation -- Bug
  • Testing -- Issue / Bug
  • Maintenance -- Defect
  • At Client -- Failure

10
Why does software have bugs ?
  • Miscommunication or no communication
    (requirements)
  • Software complexity
  • Programming errors
  • Changing requirements
  • Time pressures
  • Poorly documented code
  • Software development tools
  • Egos

11
Fig: Bugs are caused for numerous reasons, but
the main cause can be traced to the specification.
12
The Cost of Bugs
Fig: The cost to fix bugs increases dramatically
over time.
13
What is Verification ?
  • Verification ensures the product is designed to
    deliver all functionality to the customer.
  • Typically involves reviews and meetings to
    evaluate documents, plans, code, requirements,
    and specifications.

14
Verification
  • This can be done with checklists, issues lists,
    walkthroughs, and inspection meetings.
  • Verification takes place before validation.

15
What is Validation ?
  • Validation typically involves actual testing and
    evaluates the product itself.
  • The process of executing something to see how it
    behaves.
  • The output of validation is a nearly perfect,
    actual product.

16
What kind of testing should be considered ?
  • Black Box Testing
  • White Box Testing
  • Unit Testing
  • Incremental Integration Testing
  • Integration Testing
  • Functional Testing
  • System Testing
  • End-to-End Testing
  • Sanity or Smoke Testing
  • Regression Testing
  • Acceptance Testing
  • Load Testing
  • Stress Testing
  • Performance Testing
  • Usability Testing
  • Install / Uninstall Testing

17
Contd.
  • Recovery Testing
  • Failure Testing
  • Security Testing
  • Compatibility Testing
  • Exploratory Testing
  • Ad-hoc Testing
  • User Acceptance Testing
  • Comparison Testing
  • Alpha Testing
  • Beta Testing
  • Mutation Testing

18
Black Box Testing
  • Not based on any knowledge of internal design or
    code.
  • Tests are based on requirements and
    functionality.
  • It will not test hidden functions; errors
    associated with them will not be found in black
    box testing.
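
A minimal black-box sketch: the tests below are derived purely from the stated requirement (the Gregorian leap-year rules), never from the implementation. The is_leap_year() function is a hypothetical example.

```python
# Black-box testing sketch: test cases come straight from the
# requirements; the implementation is opaque to the tester.

def is_leap_year(year):
    # implementation details are irrelevant to the test design
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

assert is_leap_year(2024)        # requirement: divisible by 4
assert not is_leap_year(1900)    # requirement: century not divisible by 400
assert is_leap_year(2000)        # requirement: century divisible by 400
assert not is_leap_year(2023)    # requirement: not divisible by 4
```

Note that any extra, undocumented behavior of the function would go untested, which is exactly the limitation the slide describes.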

19
White Box Testing
  • Based on knowledge of the internal logic of an
    application's code.
  • Tests are based on coverage of code statements,
    branches, paths, and conditions.
  • It will not detect missing functions.
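
A minimal white-box sketch, using a hypothetical classify() function: each test case is chosen to exercise one branch of the internal logic.

```python
# White-box testing sketch: tests are derived from the code's
# branches, giving full branch coverage of classify().

def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    else:
        return "positive"

# One test per branch. Coverage is complete, yet a *missing*
# requirement (say, special handling of None) is never detected.
assert classify(-5) == "negative"
assert classify(0) == "zero"
assert classify(7) == "positive"
```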

20
Unit Testing
  • Process of testing the individual components of a
    program.
  • Discover discrepancies between the module's
    interface specification and its actual behavior.
  • Verify the control flow and data flow.
  • Requires knowledge of the code and hence is done
    by the developers.
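
A unit-test sketch with Python's standard unittest module, exercising a single hypothetical component (a Stack class) in isolation:

```python
# Unit testing sketch: one component, tested alone, checking both its
# interface contract (LIFO order) and an abnormal condition.
import unittest

class Stack:
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class TestStack(unittest.TestCase):
    def test_push_then_pop_returns_last_item(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)      # interface contract: LIFO

    def test_pop_on_empty_stack_raises(self):
        with self.assertRaises(IndexError):
            Stack().pop()                 # abnormal condition

# run with: python -m unittest <this file>
```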

21
Incremental Integration Testing
  • Done by programmers or by Testers.
  • Continuous testing of an application as new
    functionality is added.
  • Requires that various aspects of an application's
    functionality be independent enough to work
    separately before all parts of the program are
    completed.

22
Integration Testing
  • Testing of combined parts of an application to
    determine if they function together correctly.
  • To discover errors in the interface between the
    components, verify communication between units.
  • Done by developers / QA teams.
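
An integration-test sketch with two hypothetical components: each could pass its unit tests alone, so the integration test focuses on the data passed across their interface.

```python
# Integration testing sketch: verify that two separately-tested
# components communicate correctly through their interface.

def parse_numbers(text):
    """Component A: turn '2, 4, 6' into [2.0, 4.0, 6.0]."""
    return [float(tok) for tok in text.split(",") if tok.strip()]

def average(numbers):
    """Component B: mean of a non-empty list of numbers."""
    return sum(numbers) / len(numbers)

# Integration test: the list produced by A must be exactly what B
# expects -- an interface mismatch would surface here, not in the
# unit tests of either component.
assert average(parse_numbers("2, 4, 6")) == 4.0
```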

23
Functional Testing
  • Black box type testing.
  • Detect discrepancies between a program's
    functional specification and actual behavior.
  • Verifies that the software provides expected
    services.
  • Done by testers.

24
System Testing
  • Black box type testing.
  • Attempting to demonstrate that a program or
    system does not meet its original requirements and
    objectives, as stated in the requirements
    specification.
  • Done by the testing group before the product is
    made available to the customer.

25
Types / Goals of System Testing
  • Usability Testing
  • Performance Testing
  • Load Testing
  • Stress Testing
  • Volume Testing
  • Security Testing
  • Configuration Testing
  • Installability Testing
  • Recovery Testing
  • Serviceability Testing
  • Reliability / Availability Testing

26
Usability Testing
  • Testing for user-friendliness.
  • User interviews, surveys, video recording of user
    sessions, and other techniques can be used.
  • Identify discrepancies between the user interface
    and the usability requirements.

27
Performance Testing
  • Evaluate the compliance of a system or component
    with specified performance requirements.
  • Often used interchangeably with stress and
    load testing.
  • Ideally performance testing is defined in
    requirements documentation.

28
Load Testing
  • Testing an application under heavy loads, such as
    testing of a website under a range of loads to
    determine at what point the system's response
    time degrades or fails.
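
The idea can be sketched as a tiny local harness (the handler is a hypothetical stand-in for a real service call): drive the system at increasing load levels and record how response time changes.

```python
# Load testing sketch: measure response time at a range of loads to
# find the point where performance degrades.
import time

def handle_request(payload):
    # hypothetical stand-in for a real service call
    return sum(range(1000))

def measure(load):
    """Total wall-clock time to serve `load` requests."""
    start = time.perf_counter()
    for _ in range(load):
        handle_request(None)
    return time.perf_counter() - start

for load in (10, 100, 1000):
    elapsed = measure(load)
    print(f"load={load:5d}  total={elapsed:.4f}s  "
          f"per-request={elapsed / load:.6f}s")
```

A real load test would use many concurrent clients against the deployed system; the shape of the experiment (increasing load, recorded response time) is the same.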

29
Stress Testing
  • Used interchangeably with load and performance
    testing.
  • Testing conducted to evaluate a system or
    component at or beyond the limits of its
    specified requirements.
  • System functional testing while under unusually
    heavy loads, heavy repetition of certain actions
    or inputs, etc.

30
Configuration Testing
  • To determine whether the program operates
    properly when the software or hardware is
    configured in a required manner

31
Compatibility Testing
  • To determine whether the compatibility objectives
    of the program have been met.
  • Testing how well application performs in a
    particular hardware / software / operating system
    / network etc environment.

32
Installability Testing
  • Testing of full, partial, or upgrade install /
    uninstall processes.
  • To identify the ways in which the installation
    procedures lead to incorrect results.
  • Installation options
  • New
  • Upgrade
  • Customized / Complete
  • Under normal and abnormal conditions

33
Recovery Testing
  • Testing how well a system recovers from crashes,
    hardware failures, or other catastrophic
    problems.
  • Typically used interchangeably with fail-over
    testing.

34
Regression Testing
  • Re-testing after fixes or modifications of the
    software or its environment.
  • Verify that changes and fixes have not introduced
    new problems.
  • It can be difficult to determine how much
    re-testing is needed, especially near the end of
    the development cycle.
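
A regression-test sketch: once a bug is fixed, a test reproducing the original failure stays in the suite so later changes cannot silently reintroduce it. The slugify() function and its bug are hypothetical examples.

```python
# Regression testing sketch: pin a fixed bug with a permanent test.

def slugify(title):
    # Fixed bug: leading/trailing spaces used to leak into the slug;
    # .strip() was added as the fix.
    return "-".join(title.strip().lower().split())

# Regression test pinning the fix -- this input used to fail.
assert slugify("  Hello World ") == "hello-world"

# Re-test of existing behaviour, verifying the fix broke nothing.
assert slugify("Software Testing") == "software-testing"
```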

35
Acceptance Testing
  • Final testing based on specifications of the
    end-user or customer, or based on use by end-user
    / customer over some limited period of time.
  • Determine whether the software is ready for final
    deployment.
  • Done after the testing group has satisfactorily
    completed usability and function testing.

36
Contd
  • Organizations can arrange for alternative forms
    of acceptance testing
  • ALPHA
  • BETA

37
ALPHA & BETA
  • Both involve running / operating the s/w in
    production mode for a pre-specified period.
  • The ALPHA test is usually performed by end users
    inside the development org.
  • The BETA test is usually performed by a selected
    subset of actual customers outside the company,
    before the s/w is made available to all customers.