Restructuring Test Variabilities in Software Product Lines

Transcript and Presenter's Notes



1
Restructuring Test Variabilities in Software
Product Lines
  • Márcio de Medeiros Ribeiro
  • mmr3@cin.ufpe.br
  • Advisor: Professor Paulo Borba
  • phmb@cin.ufpe.br

2
Introduction
  • Software development
  • Critical need to reduce cost, effort, and time-to-market
  • Complexity and size are increasing
  • Customers request products according to their needs
  • An approach to reduce these difficulties: Software Product Lines (SPLs)

3
Introduction
  • Core assets and variabilities in an SPL
  • Requirements, source code, and tests

4
Mass Customization
5
Variabilities not well structured
(Figure: Test Case 01 with Landscape view and Stop button variabilities)
  • Cloned code: negative effects on the cost to maintain the SPL
  • Variabilities not separated
  • No modularity!

6
We need Modularity!
  • How to restructure test variabilities to achieve
    modularity?

7
Problem
  • Which mechanisms to use for restructuring test variabilities?
  • Selecting the correct ones: a difficult task due to the great variety of mechanisms
  • Selecting the incorrect ones: negative effects on the cost to maintain the SPL
  • Cloned code and no modularization
  • Increasing developers' effort
  • Decreasing productivity when evolving the SPL

8
Proposed Solution
  • Decision Model and Tool to help developers when
    choosing mechanisms to restructure test
    variabilities

9
Decision Model for Restructuring Tests in
Software Product Lines
10
Analyzed Tests
  • Test Automation Framework (TAF), from Motorola
  • Mechanism used to handle variabilities: if-else statements (see the sketch below)
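
To make the if-else mechanism concrete, the sketch below shows a hypothetical, TAF-like test case written in Java, where product-specific steps are tangled with the common ones. All class, method, and feature names are assumptions for illustration, not actual Motorola test code.

  // Hypothetical sketch: a test whose behavior varies per product instance
  // through if-else statements, the mechanism found in the analyzed tests.
  public class SendMessageTest {

      // Assumed flag identifying which product instance is under test.
      private final String product;

      public SendMessageTest(String product) {
          this.product = product;
      }

      public void testSendMessage() {
          openMessageCenter();
          if (product.equals("A")) {
              attachTransflashFile();   // product-specific variability
          } else if (product.equals("B")) {
              pairBluetoothDevice();    // product-specific variability
          }
          typeMessage("hello");
          pressSend();
      }

      // Stubs standing in for the real test framework calls.
      private void openMessageCenter() { }
      private void attachTransflashFile() { }
      private void pairBluetoothDevice() { }
      private void typeMessage(String text) { }
      private void pressSend() { }
  }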

11
Finding Product Line Variabilities
  • Cloned code technique

12
Decision Model
  • Goal: help developers when restructuring test variabilities
  • Inputs
  • Variability location in the source code (if-else statements)
  • Feature type
  • Criteria
  • Modularity (yes/no)
  • Source Code Size
  • Scalability (yes/no)
  • Time (compile-time/runtime)
  • Output
  • Mechanism(s) to restructure the given test variability (a minimal sketch of such a mapping follows)
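
Conceptually, the model maps a variability's location and feature type to candidate mechanisms, which are then weighed against the criteria above. The Java sketch below only illustrates that mapping as a lookup table; it is not the actual model, and the entries are assumptions drawn from examples later in this presentation.

  import java.util.List;
  import java.util.Map;

  // Illustrative only: the decision model viewed as a lookup table from
  // (variability location, feature type) to candidate mechanisms.
  public class DecisionModelSketch {

      record Input(String location, String featureType) { }

      // First entry taken from the Beginning/End of Method Body example;
      // the method-parameter entry is a guess added for illustration.
      static final Map<Input, List<String>> RECOMMENDATIONS = Map.of(
          new Input("beginning/end of method body", "optional"),
              List.of("Inheritance", "Mixins", "Decorator", "AOP"),
          new Input("method parameter", "alternative"),
              List.of("Configuration Files", "Dependency Injection")
      );

      public static void main(String[] args) {
          System.out.println(RECOMMENDATIONS.get(
              new Input("method parameter", "alternative")));
      }
  }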

13
Decision Model
(Figure: decision model overview; recommended mechanisms include Inheritance, Mixins, Configuration Files, and Dependency Injection)
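
Since the mechanisms in the figure include Dependency Injection, here is a minimal Java sketch of that mechanism applied to a test variability. The interface and class names are assumptions for illustration only.

  // The varying behavior is captured behind an interface and injected into
  // the test, instead of being selected with if-else statements inside it.
  interface MediaFeature {
      void prepare();
  }

  class TransflashFeature implements MediaFeature {
      public void prepare() { /* Transflash-specific test steps */ }
  }

  class BluetoothFeature implements MediaFeature {
      public void prepare() { /* Bluetooth-specific test steps */ }
  }

  public class SendMediaTest {

      private final MediaFeature feature;

      // The concrete feature is injected per product instance,
      // for example by a small factory or a DI container.
      public SendMediaTest(MediaFeature feature) {
          this.feature = feature;
      }

      public void testSendMedia() {
          feature.prepare();   // the variability, now behind an interface
          // common test steps shared by every product instance...
      }
  }
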
14
Methodology
  • Test variability found
  • Identify the feature type and its location
  • Restructure it using several mechanisms, such as Inheritance, Configuration Files, Dependency Injection, Aspect-Oriented Programming, and so forth
  • Compare each implementation with respect to the mentioned criteria
  • Adapt the Decision Model to encompass the variability

15
Beginning/End of Method Body
  • Transflash and Bluetooth: four possible instances
  • Mechanisms: Inheritance, Mixins, Decorator, and AOP
  • Feature type: Optional (an inheritance-based sketch follows)
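
As one illustration, the Inheritance mechanism can move the optional code out of the method body and into a subclass. The sketch below is a minimal example under assumed names (MessageTest, BluetoothMessageTest); Mixins, Decorator, and AOP target the same location but score differently on the model's criteria.

  // Common test behavior lives in a base class; products without the
  // optional feature use it unchanged.
  public class MessageTest {

      public void testSendMessage() {
          openMessageCenter();
          typeMessage("hello");
          pressSend();
      }

      protected void openMessageCenter() { }
      protected void typeMessage(String text) { }
      protected void pressSend() { }
  }

  // Products with the Bluetooth feature use this subclass, which adds the
  // optional steps at the beginning of the method body.
  class BluetoothMessageTest extends MessageTest {

      @Override
      public void testSendMessage() {
          pairBluetoothDevice();    // optional feature code, now modularized
          super.testSendMessage();  // common test steps
      }

      private void pairBluetoothDevice() { }
  }
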
16
Beginning/End of Method Body
17
Beginning/End of Method Body
Metrics legend: CDLOC = Concern Diffusion over Lines of Code, CDC = Concern Diffusion over Components, NCC = Number of Concerns per Component, CBC = Coupling Between Components
18
Method Parameter
  • Feature type: Alternative (a configuration-file sketch follows)
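
For an alternative variability that appears as a method parameter, one mechanism named in this presentation is Configuration Files. The Java sketch below is an assumed, minimal illustration: the file product.properties, the property menu.label, and the test names are all hypothetical.

  import java.io.FileInputStream;
  import java.io.IOException;
  import java.util.Properties;

  // Illustrative sketch: the alternative value formerly chosen with if-else
  // statements is read from a per-product configuration file instead.
  public class MenuTest {

      public void testOpenMenu() throws IOException {
          Properties config = new Properties();
          // Each product instance ships its own properties file.
          try (FileInputStream in = new FileInputStream("product.properties")) {
              config.load(in);
          }
          // Hypothetical property holding the alternative menu label.
          String menuLabel = config.getProperty("menu.label", "Messages");
          selectMenuItem(menuLabel);
          assertMenuOpened();
      }

      private void selectMenuItem(String label) { }
      private void assertMenuOpened() { }
  }
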
19
Method Parameter
Metrics legend: CDLOC = Concern Diffusion over Lines of Code, CDC = Concern Diffusion over Components, VS = Vocabulary Size, CBC = Coupling Between Components
20
Summary: Decision Model
21
Supporting the Variability Implementation
Mechanisms Recommendation
22
Tool Support
  • Choosing appropriate mechanisms may be difficult
  • If no tool support is available, the task becomes even harder
  • How to perform the refactoring?
  • Variabilities are scattered
  • Based on FLiPEx, which refactors Java classes into AspectJ aspects (a simplified clone-search sketch follows)
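
Because the tool must locate where a variability is cloned across test cases, the sketch below illustrates the general idea with a deliberately naive exact-text search in Java. It is not the FLiPRec implementation; the directory name and snippet in main are assumptions.

  import java.io.IOException;
  import java.nio.file.Files;
  import java.nio.file.Path;
  import java.util.ArrayList;
  import java.util.List;
  import java.util.stream.Stream;

  // Hypothetical illustration of clone search: report every test file that
  // contains an exact copy of a given variability snippet.
  public class CloneSearchSketch {

      public static List<Path> findClones(Path testsDir, String snippet) throws IOException {
          List<Path> clones = new ArrayList<>();
          try (Stream<Path> paths = Files.walk(testsDir)) {
              List<Path> testFiles = paths
                  .filter(p -> p.toString().endsWith(".java"))
                  .toList();
              for (Path file : testFiles) {
                  if (Files.readString(file).contains(snippet)) {
                      clones.add(file);
                  }
              }
          }
          return clones;
      }

      public static void main(String[] args) throws IOException {
          // Assumed directory and snippet, for illustration only.
          findClones(Path.of("tests"), "pairBluetoothDevice();")
              .forEach(System.out::println);
      }
  }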

23
FLiPRec: How does it work?
  • Cloned code and the Neighbors object

24
FLiPRec - Demonstration
  • Example 1
  • Example 2

25
Evaluation
26
Evaluation
27
Scenario 1
  • Test Case: 150 LOC

28
Scenario 2
  • Developer changes a variability in a given test case of Family A
  • The variability is cloned in test cases of families C, D, and F
  • Developer knows about the cloning
  • Developer does not know: where are the clones? 204 tests must be analyzed!
  • Decision Model: AOP
  • Tool: valid pointcut and clones found faster and more precisely

29
Concluding Remarks
30
Concluding Remarks
  • Decision Model for recommending mechanisms to restructure test variabilities
  • Prototype tool to support developers with faster and more precise recommendations
  • Based on tests and targets if-else statements, but
  • Analogous variabilities found in J2ME Games
  • Product Lines based on Conditional Compilation
  • Restructuring versus Structuring

31
Concluding Remarks
  • Putting both the Decision Model and the Tool together
  • Eliminate cloned code, avoiding future inconsistencies or even bugs
  • Modularized features: developers might work in parallel when evolving the SPL
  • Productivity increase: time-consuming and error-prone tasks may be avoided, reducing the time-to-market

32
Contributions
  • Code-centric and fine-grained Decision Model
  • Exact location in the source code
  • Easier to apply Fowler-like refactorings
  • Not only qualitative studies (like the feature type), but also quantitative ones (SoC, size, and coupling metrics)
  • Unique tool to recommend mechanisms
  • Extensible: only a few interfaces need to be implemented
  • Adds important functionalities to FLiPEx: searching for cloned code, recommending better pointcuts, and generating design rules

33
Related Work
  • Anastasopoulos et al., 2001
  • Comparison model of variability implementation approaches
  • Based on feature types
  • Patzke et al., 2002
  • Similar to Anastasopoulos; based on feature types
  • Analysis at the source code level restricted to the feature types
  • Xie et al., 2007

34
Open Issues and Limitations
  • Main disadvantage of being code-centric: the SPL must already be implemented
  • Applies only to the evolution phase of the SPL life cycle
  • Some mechanisms, feature types, criteria, and binding times not considered: restricted Decision Model and Tool
  • Limited metrics suite; furthermore, the CBC metric seems to be inconsistent

35
Future Work
  • More mechanisms in the Decision Model and Tool
  • More criteria like Performance and Bytecode Size
  • More locations, feature types, and binding times
  • Improve the metrics suite
  • Improve our evaluation

36
Restructuring Test Variabilities in Software
Product Lines
  • Márcio de Medeiros Ribeiro
  • mmr3@cin.ufpe.br
  • Advisor: Professor Paulo Borba
  • phmb@cin.ufpe.br