User-session based Testing of Web Applications - PowerPoint PPT Presentation

1
User-session based Testing of Web Applications
2
Two Papers
  • A Scalable Approach to User-session based Testing
    of Web Applications through Concept Analysis
  • Uses concept analysis to reduce test suite size
  • An Empirical Comparison of Test Suite Reduction
    Techniques for User-session-based Testing of Web
    Applications
  • Compares concept analysis to other test suite
    reduction techniques

3
Talk Outline
  • Introduction
  • Background
  • User-session Testing
  • Concept Analysis
  • Applying Concept Analysis
  • Incremental Reduced Test Suite Update
  • Empirical Evaluation (Incremental vs. Batch)
  • Empirical Comparison of Concept Analysis to other
    Test Suite Reduction Techniques
  • Conclusions

4
Characteristics of Web-based Applications
  • Short time to market
  • Integration of numerous technologies
  • Dynamic generation of content
  • May contain millions of LOC
  • Extensive use
  • Need for high reliability, continuous
    availability
  • Significant interaction with users
  • Changing user profiles
  • Frequent small maintenance changes

5
User-session Testing
  • User session
  • A collection of user requests in the form of URL
    and name-value pairs
  • User sessions are transformed into test cases
  • Each logged request in a user session is changed
    into an HTTP request that can be sent to a web
    server
  • Previous studies of user-session testing
  • Previous results showed fault detection
    capabilities and cost effectiveness
  • Will not uncover faults associated with rarely
    entered data
  • Effectiveness improves as the number of sessions
    increases (downside: cost increases as well)
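The transformation above (logged URL plus name-value pairs into a replayable HTTP request) can be sketched in Python. This is a minimal illustration, not the papers' tooling; the log layout, URLs, and parameter names are hypothetical.

```python
# Hypothetical sketch: turning a logged user session into replayable
# HTTP requests. A session is a list of (url, name-value pairs).
from urllib.parse import urlencode
from urllib.request import Request

def session_to_requests(session):
    """Convert each logged request into an HTTP request object."""
    requests = []
    for url, params in session:
        if params:  # name-value pairs become a POST form body
            req = Request(url, data=urlencode(params).encode(), method="POST")
        else:
            req = Request(url, method="GET")
        requests.append(req)
    return requests

session = [
    ("http://localhost/bookstore/Default.jsp", {}),
    ("http://localhost/bookstore/Login.jsp", {"user": "alice", "pass": "x"}),
]
for req in session_to_requests(session):
    print(req.get_method(), req.full_url)
```

Sending each request to the web server and logging the response completes the test case.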

6
Contributions
  • View user sessions as use cases
  • Apply concept analysis for test suite reduction
  • Perform incremental test suite update
  • Automate testing framework
  • Evaluate cost effectiveness
  • Test suite size
  • Program coverage
  • Fault detection

7
Concept Analysis
  • Technique for clustering objects that have common
    discrete attributes
  • Input
  • Set of objects O
  • Set of attributes A
  • Binary relation R
  • Relates objects to attributes
  • Implemented as a Boolean-valued table
  • A row for each object in O
  • A column for each attribute in A
  • Table entry (o, a) is true if object o has
    attribute a, and false otherwise

8
Concept Analysis (2)
  • Identifies concepts given (O, A, R)
  • Concept is a tuple (Oi, Aj)
  • Concepts form a partial order
  • Output
  • Concept lattice represented by a DAG
  • Node represents concept
  • Edge denotes the partial ordering
  • Top element ⊤: most general concept
  • Contains attributes that are shared by all
    objects in O
  • Bottom element ⊥: most special concept
  • Contains objects that have all attributes in A

9
Concept Analysis for Web Testing
  • Binary relation table
  • User session s: object
  • URL u: attribute
  • A pair (s, u) is in the relation table if s
    requests u
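The relation table can be sketched as a Boolean-valued dictionary. The session and URL names echo the deck's running example (us3, us4, PL, GS), but the exact data is invented for illustration.

```python
# Sketch of the session/URL binary relation: sessions are objects,
# URLs are attributes. Data is hypothetical example data.
sessions = {
    "us1": {"GD", "GM"},
    "us2": {"GM", "PL"},
    "us3": {"PL", "GS"},
    "us4": {"PL", "GS"},
}
urls = sorted(set().union(*sessions.values()))

# Boolean table: one row per session, one column per URL;
# entry (s, u) is True iff session s requested URL u.
table = {s: {u: (u in reqs) for u in urls} for s, reqs in sessions.items()}
print(table["us3"]["GS"])  # True: us3 requests GS
```

Concept analysis then clusters rows that share identical column patterns (here, us3 and us4).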

10
Concept Lattice Explained
  • Top node ⊤
  • Most general concept
  • Contains URLs that are requested by all user
    sessions
  • Bottom node ⊥
  • Most special concept
  • Contains user sessions that request all URLs
  • Examples
  • Identification of common URLs requested by 2 user
    sessions
  • us3 and us4
  • Identification of user sessions that jointly
    request 2 URLs
  • PL and GS

11
Concept Analysis for Test Suite Reduction
  • Exploit the lattice's hierarchical use-case
    clustering
  • Heuristic
  • Identify smallest set of user sessions that will
    cover all URLs executed by original suite
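The goal of the heuristic, covering all URLs with the fewest sessions, is an instance of set cover. A minimal greedy approximation sketch follows; it illustrates the objective only, not the paper's lattice-based selection (which keeps next-to-bottom nodes). The session data is hypothetical.

```python
def cover_all_urls(sessions):
    """Greedy set-cover approximation: repeatedly pick the session
    covering the most still-uncovered URLs until every URL requested
    by the original suite is covered."""
    uncovered = set().union(*sessions.values())
    reduced = []
    while uncovered:
        best = max(sessions, key=lambda s: len(sessions[s] & uncovered))
        reduced.append(best)
        uncovered -= sessions[best]
    return reduced

suite = {
    "us1": {"GD"},
    "us2": {"GD", "GM", "PL"},
    "us3": {"PL", "GS"},
}
print(cover_all_urls(suite))  # ['us2', 'us3'] — us1 is redundant
```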

12
Incremental Test Suite Update
13
Incremental Test Suite Update (2)
  • Incremental algorithm by Godin et al.
  • Create new nodes/edges
  • Modify existing nodes/edges
  • Next-to-bottom nodes may rise up in the lattice
  • Existing internal nodes never sink to the bottom
  • Test cases are not maintained for internal nodes
  • The set of next-to-bottom nodes (user sessions)
    forms the test suite

14
Web Testing Framework
15
Empirical Evaluation
  • Test suite reduction
  • Test suite size
  • Replay time
  • Oracle time
  • Cost-effectiveness of incremental vs. batch
    concept analysis
  • Program coverage
  • Fault detection capabilities

16
Experimental Setup
  • Bookstore Application
  • 9748 LOC
  • 385 methods
  • 11 classes
  • JSP front-end, MySQL backend
  • 123 user sessions
  • 40 seeded faults

17
Test Suite Reduction
  • Metrics
  • Test suite size
  • Replay time
  • Oracle time

18
Incremental vs. Batch Analysis
  • Metric
  • Space costs
  • Relative sizes of files required by incremental
    and batch techniques
  • Methodology
  • Batch: all 123 user sessions processed at once
  • Incremental: 100 processed first, then 23
    incrementally

19
Program Coverage
  • Metrics
  • Statement coverage
  • Method coverage
  • Methodology
  • Instrumented Java classes using Clover
  • Restored database state before replay
  • Wget for replaying user sessions

20
Fault Detection Capability
  • Metric
  • Number of faults detected
  • Methodology
  • Manually seeded 40 faults into separate copies of
    the application
  • Replayed user sessions through
  • Correct version to generate expected output
  • Faulty version to generate actual output
  • Diff expected and actual outputs
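The diff-based oracle in the last step can be sketched with Python's difflib; the page contents below are hypothetical.

```python
# Sketch of the diff-based oracle: a seeded fault counts as detected
# if replaying the session produces output on the faulty version that
# differs from the correct version's expected output.
import difflib

def fault_detected(expected_pages, actual_pages):
    """Return True if any replayed page differs from its expected output."""
    for exp, act in zip(expected_pages, actual_pages):
        if list(difflib.unified_diff(exp.splitlines(), act.splitlines())):
            return True
    return False

expected = ["<html>Total: $30</html>"]
actual = ["<html>Total: $0</html>"]  # hypothetical fault in price logic
print(fault_detected(expected, actual))  # True
```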

21
Empirical Comparison of Test Suite Reduction
Techniques
22
Empirical Comparison of Test Suite Reduction
Techniques
  • Compared 3 variations of Concept with 3
    requirements-based reduction techniques
  • Random
  • Greedy
  • Harrold, Gupta, and Soffa's reduction (HGS)
  • Each requirements-based reduction technique
    satisfies program or URL coverage
  • Statement, method, conditional, URL

23
Random and Greedy Reduction
  • Random
  • Selection process continues until reduced test
    suite satisfies some coverage criterion
  • Greedy
  • Each subsequent test case selected provides
    maximum coverage of some criterion
  • Example
  • Select us6: maximum URL coverage
  • Then, select us2: greatest marginal improvement
    toward the all-URL coverage criterion

24
HGS Reduction
  • Selects a representative set from the original by
    approximating the optimal reduced set
  • Requirement cardinality: number of test cases
    covering that requirement
  • Select the most frequently occurring test case
    among requirements of lowest cardinality
  • Example
  • Consider the requirement with cardinality 1: GM
  • Select us2
  • Consider requirements with cardinality 2: PL and
    GB
  • Select test case that occurs most frequently in
    the union
  • us6 occurs twice, us3 and us4 once
  • Select us6
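The walkthrough above can be sketched as a simplified HGS selection loop. This follows the slide's example only; the full algorithm has additional tie-breaking and recursion rules, and the requirement data below mirrors the example rather than real coverage data.

```python
# Simplified sketch of Harrold-Gupta-Soffa (HGS) reduction.
from collections import Counter

def hgs_reduce(requirements):
    """`requirements` maps each requirement to the set of test cases
    covering it. Process requirements in order of increasing
    cardinality, picking the most frequently occurring test case."""
    selected = set()
    unmarked = dict(requirements)
    cardinality = 1
    while unmarked:
        group = {r: ts for r, ts in unmarked.items() if len(ts) == cardinality}
        while group:
            # most frequent test case among this cardinality's requirements
            counts = Counter(t for ts in group.values() for t in ts)
            test = counts.most_common(1)[0][0]
            selected.add(test)
            # mark every requirement the chosen test case covers
            unmarked = {r: ts for r, ts in unmarked.items() if test not in ts}
            group = {r: ts for r, ts in group.items() if test not in ts}
        cardinality += 1
    return selected

# Slide example: GM covered only by us2; PL and GB each covered by
# two sessions, with us6 appearing in both covering sets.
reqs = {
    "GM": {"us2"},
    "PL": {"us3", "us6"},
    "GB": {"us4", "us6"},
}
print(sorted(hgs_reduce(reqs)))  # ['us2', 'us6']
```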

25
Empirical Evaluation
  • Test suite size
  • Program coverage
  • Fault detection effectiveness
  • Time cost
  • Space cost

26
Experimental Setup
  • Bookstore application
  • Course Project Manager (CPM)
  • Create grader/group accounts
  • Assign grades, create schedules for demo time
  • Send notification emails about account creation,
    grade postings

27
Test Suite Size
  • Suite Size Hypothesis
  • Larger suites than
  • HGS and Greedy
  • Smaller suites than
  • Random
  • More diverse in terms of use case representation
  • Results
  • Bookstore application
  • HGS-S, HGS-C, GRD-S, GRD-C created larger suites
  • CPM
  • Larger suites than HGS and Greedy
  • Smaller than Random

28
Test Suite Size (2)
29
Program Coverage
  • Coverage Hypothesis
  • Similar coverage to
  • Original suite
  • Less coverage than
  • Suites that satisfy program-based requirements
  • Higher URL coverage than
  • Greedy and HGS with URL criterion
  • Results
  • Program coverage comparable to (within 2% of)
    PRG_REQ techniques
  • Slightly less program coverage than original
    suite and Random
  • More program coverage than URL_REQ techniques,
    Greedy and HGS

30
Program Coverage (2)
31
Fault Detection Effectiveness
  • Fault Detection Hypothesis
  • Greater fault detection effectiveness than
  • Requirements-based techniques with URL criterion
  • Similar fault detection effectiveness to
  • Original suite
  • Requirements-based techniques with program-based
    criteria
  • Results
  • Random with PRG_REQ criteria: best fault
    detection, but a low number of faults detected
    per test case
  • Similar fault detection to the best PRG_REQ
    techniques
  • Detected more faults than HGS-U

32
Fault Detection Effectiveness (2)
33
Time and Space Costs
  • Costs Hypothesis
  • Less space and time than
  • HGS, Greedy, Random
  • Space for Concept Lattice vs. space for
    requirement mappings
  • Results
  • Costs considerably less than PRG_REQ techniques
  • Collecting coverage information for each session
    is the clear bottleneck of requirements-based
    approaches

34
Conclusions
  • Problems with Greedy and Random reduction
  • Non-determinism
  • Generated suites with wide range in size,
    coverage, fault detection effectiveness
  • Test suite reduction based on concept-analysis
    clustering of user sessions
  • Achieves large reduction in test suite size
  • Saves oracle and replay time
  • Preserves program coverage
  • Preserves fault detection effectiveness
  • Chooses test cases based on use case
    representation
  • Incremental test suite reduction/update
  • Scalable approach to user-session-based testing
    of web applications
  • Necessary for web applications that undergo
    constant maintenance, evolution, and usage
    changes

35
References
  • Sreedevi Sampath, Valentin Mihaylov, Amie Souter,
    Lori Pollock, "A Scalable Approach to User-session
    based Testing of Web Applications through Concept
    Analysis," Automated Software Engineering
    Conference (ASE), September 2004.
  • Sara Sprenkle, Sreedevi Sampath, Emily Gibson,
    Amie Souter, Lori Pollock, "An Empirical
    Comparison of Test Suite Reduction Techniques for
    User-session-based Testing of Web Applications,"
    International Conference on Software Maintenance
    (ICSM), September 2005.