Title: Evolving an Elective Software Testing Course: Lessons Learned

1
3rd Workshop on Teaching Software Testing
  • Evolving an Elective Software Testing Course
    Lessons Learned
  • Edward L. Jones
  • Florida A&M University
  • Tallahassee, FL USA

2
Agenda
  • Course Overview
  • Student Background
  • Driving Principles
  • Overview of Assignments
  • Course Reflection
  • Improvements
  • Assignment Walkthroughs

3
Course Overview
DESCRIPTION: The purpose of this course is to
build skills necessary to perform software
testing at the function, class and application
level. Students will be taught concepts of
black-box (functional and boundary) and white-box
(coverage-based) testing, and will apply these
concepts to small programs and components
(functions and classes). Students will also be
taught evaluative techniques such as coverage and
mutation testing (error seeding). This course
introduces the software engineering discipline of
software quality engineering and the legal and
societal issues of software quality.
4
Programming Focus
AUDIENCE: Not software testers, but software
developers. What distinguishes the course
approach is that it stresses the programming
aspect of software testing. A goal is to enhance
and expand students' programming skills to
support activities across the testing lifecycle:
C programming and Unix shell script programming
to automate aspects of software testing.
Students just needed a course to take ...
5
Conceptual Objectives
  • The student shall understand
  • The software testing lifecycle
  • The relationship between testing, V&V, and SQA
  • Theoretical/practical limits of software testing
  • The SPRAE testing framework
  • Concepts and techniques for black-/white-box
    testing
  • Test case design from behavioral model
  • Design patterns for test automation
  • Test coverage criteria
  • Issues of software testing management

6
Performance Objectives
  • The student shall be able to
  • Use the Unix development environment
  • Write simple Unix shell scripts
  • Design functional and boundary test cases
  • Develop manual test scripts
  • Conduct tests and document results
  • Write test drivers to automate function, object
    and application testing
  • Evaluate test session results and write problem
    reports

7
Learning/Evaluation Activities
  • 80% practice / 20% concepts
  • Lectures (no text)
  • Laboratory assignments
  • Unix commands and tools
  • Testing tasks
  • Examinations
  • 2 Online tests
  • Final (online)
  • Amnesty period (1 test / 2 labs)

8
Student Background
  • Reality
  • 20 students
  • Not particularly interested in testing
  • Low programming skill/experience
  • Ideal
  • An interest in software testing
  • Strong programming skills
  • Scientific method (observation, hypothesis
    forming)
  • Sophomore or junior standing
  • Desire for internship in software testing

9
My Perspective on Teaching Testing
  • Testing is not just for testers!
  • In ideal world, fewer testers required
  • Developers have testers' skills/mentality
  • Testing overlays development process
  • No silver bullet, just bricks
  • Simple things provide leverage
  • No one-size-fits-all
  • Be driven by a few sound principles

10
Driving Principles
  • Testing for Software Developers
  • Duality of developer and tester
  • Few Basic Concepts
  • Testing lifecycle
  • Philosophy / Attitudes (SPRAE)
  • Learn By Doing
  • Different jobs across the lifecycle

11
A Testing Lifecycle
12
Experience Objectives
  • Student gains experience at each lifecycle stage
  • Student uses/enhances existing skills
  • Student applies different testing competencies
  • Competencies distinguish novices from the
    experienced

13
A Framework for Practicing Software Testing
  • Specification: the basis for testing
  • Premeditation (forethought, techniques)
  • Repeatability of test design, execution, and
    evaluation (equivalence v. replication)
  • Accountability via testing artifacts
  • Economy (efficacy) of human, time and computing
    resources

14
Key Test Practices
  • Practitioner -- performs defined test
  • Builder -- constructs test machinery
  • Designer -- designs test cases
  • Analyst -- sets test goals, strategy
  • Inspector -- verifies process/results
  • Environmentalist -- maintains test tools and
    environment
  • Specialist -- performs test life cycle.

15
Test Products
  • Test Report (informal) of manual testing
  • Test Scripts for manual testing
  • Test Log (semi-formal)
  • Application Test Driver (Unix shell script)
  • Unit/Class Test Driver (C program)
  • Test Data Files
  • Test Results (automated)
  • Bug Fix Log (informal)

16
Specification Products
  • Narrative specification
  • Specification Diagrams
  • Specification Worksheet (pre/post conditions)
  • Decision Tables
  • Control Flow Graphs

17
Assignments Target Skills
  • Observation Skills
  • Systematic exploration of software behavior
  • Specification Skills
  • Describe expected or actual behavior
  • Programming Skills
  • Coding for development of test machinery
  • Test Design Skills
  • Derive test cases from a specification using a
    technique
  • Team Skills
  • Work with other testers

18
Course Reflection
  • Testing is programming intensive
  • Testing requires analytical skills and facility
    with mathematical tools
  • Testing generates a data management problem that
    is amenable to automation
  • Testing gives students an advantage in
    entry-level positions
  • Students take this course too late

19
Failed Course Expectations
  • Students test at all levels
  • No in-the-large application (e.g., web-based)
  • Students develop intuitive testing skills
  • On largest project, concepts did not transfer
  • 1 in 3 students showed a knack for testing
  • Impact of balance of concept and experience
  • Poor performance on exams with problems like
    those in labs
  • Test case design skills low
  • Homework needed vs. labs (programming)
  • Mentoring (timely feedback) did not occur
  • Students left to their own devices too much

20
Why These Outcomes?
  • Formalisms important, but difficult
  • Provide the behavior model (e.g., decision table)
  • Basis for systematic test case design, automation
  • Lack of textbook
  • Students need concepts and lots of examples
  • Poor availability when students were working
  • Students worked at last minute
  • Not always around
  • Automated grading lacked 1-1 feedback
  • Standards-rich/tool-poor environment a
    distraction
  • Assigned work too simple??

21
Proposed Changes
  • Improve lecture notes and example bank
  • Find and refine
  • Resources and workbook
  • Outside-in: testing in-the-large before
    in-the-small
  • Recitation/laboratory for discussion and feedback
  • Increase use of testing tools (no-cost)
  • Increase use of a collection of code/applications
  • Examination testbank for practice, learning

22
Assignment Walkthroughs
  • (see paper)

23
Assignment Walkthroughs
  • Blind Testing
  • Test Documentation
  • Specification
  • Test Automation via Shell Scripts
  • Unit Test Automation (Driver)
  • White-Box Unit Testing
  • Class Testing

24
Blind Testing I
  • Objective: Explore behavior of software without
    the benefit of a specification
  • Given: Executables and a general description
  • Results: Students not systematic in exploration
    or in generalizing observed behavior
  • Hello → output based on length of input
  • Add → 1-digit modulo-10 adder, input exception
  • Pay → pay calculation with an upper bound on pay
    amount

25
Blind Testing II
  • Programming Objective: Student writes a program
    that matches the observed behavior from Blind
    Testing I
  • Test Objective: Observations from Blind Testing I
    used as test cases for the reverse-engineered
    program.
  • Results: Students did not see the connection
  • Did not replicate the recorded behavior
  • Did not recognize (via testing) failure to
    replicate

26
SUPPLEMENTAL SLIDES
  • Student work

27
SCALING UP
The heart of the approach is to use a decision
table as a thinking tool. The most critical task
in this process is to identify all the stimuli
and responses. When there are many logical
combinations of stimuli, the decision table can
become large, indicating that the unit is complex
and hard to test.
28
IDENTIFYING BEHAVIOR Approaches
  • Work backwards
  • Identify each response
  • Identify conditions that provoke response
  • Identify separate stimuli
  • Work forward
  • Identify stimuli
  • Identify how each stimulus influences what the
    unit does
  • Specify the response

29
IDENTIFYING STIMULI
  • Arguments passed upon invocation
  • Interactive user inputs
  • Internal, secondary data
  • global or class variables
  • External data (sources)
  • file or database status variables
  • file or database data
  • Exceptions

30
IT PAYS TO BE A GOOD STIMULUS DETECTIVE
  • Failure to identify stimuli results in an
    incomplete, possibly misleading test case
  • The search for stimuli exposes
  • interface assumptions -- a major source of
    integration problems
  • incomplete design of unit
  • inadequate provision for exception handling

31
IDENTIFYING RESPONSES
  • Arguments/Results passed back on exit
  • Interactive user outputs
  • Internal, secondary data
  • updated global or class variables
  • External data (sinks)
  • output file or database status variables
  • output file or database data
  • Exceptions

32
IT PAYS TO BE A GOOD RESPONSE DETECTIVE
  • Failure to identify responses results in
  • incomplete understanding of the software under
    test
  • shallow test cases
  • incomplete expected results
  • incomplete test "success" verification -- certain
    effects not checked
  • To test, one must know all the effects

33
A SKETCHING TOOL: Black-Box Schematic
[Figure: black-box schematic. Stimulus types (arguments, inputs, globals, database, exceptions) flow into the Software under Test; the same categories flow out as response types.]
34
BEFORE CONTINUING
Much of the discussion so far involves how to
identify what software does. We have introduced
thinking tools for systematically capturing our
findings. These thought processes and tools can
be used anywhere in the lifecycle, e.g., in
software design! One Stone for Two Birds!!
35
Specialist I - Competencies
[Figure: competency scales (levels 1-5) for each test practice (Test Practitioner, Test Builder, Test Designer, Test Analyst, Test Inspector, Test Environmentalist), composing the Test SPECIALIST.]
36
BOUNDARY TESTING DESIGN METHODOLOGY
  • Specification
  • Identify elementary boundary conditions
  • Identify boundary points
  • Generate boundary test cases
  • Update test script (add boundary cases).

37
EXAMPLE: Pay Calculation (1) Specification
  • Compute pay for an employee, given the number of
    hours worked and the hourly pay rate. For hourly
    employees (rate < 30), compute overtime at 1.5
    times the hourly rate for hours in excess of 40.
    Salaried employees (rate > 30) are paid for
    exactly 40 hours. (A minimal C sketch of this
    computation follows below.)

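A minimal C sketch of the specified computation. The function name computePay and its signature are assumptions for illustration, not taken from the slides; note that the specification leaves rate exactly 30 undefined, and this sketch treats it as hourly.

/* Pay calculation per the specification above (sketch, not the course's code). */
double computePay(double hours, double rate)
{
    if (rate > 30.0)                 /* salaried: paid for exactly 40 hours */
        return 40.0 * rate;
    if (hours > 40.0)                /* hourly with overtime past 40 hours */
        return 40.0 * rate + 1.5 * rate * (hours - 40.0);
    return hours * rate;             /* hourly, no overtime */
}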
38
EXAMPLE B (2) Identify Behaviors
  • Case 1: Hourly AND No Overtime
  • (Rate < 30) AND (Hours <= 40)
  • Expect: Pay = Hours * Rate
  • Case 2: Hourly AND Overtime
  • (Rate < 30) AND (Hours > 40)
  • Expect: Pay = 40*Rate + 1.5*Rate*(Hours - 40)
  • Case 3: Salaried (Rate > 30)
  • Expect: Pay = 40 * Rate

39
DECISION TABLE
Columns define Behaviors
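The table itself is an image in the original slides. A plausible reconstruction from the behaviors identified on the previous slide (the rule entries are assumptions consistent with the specification):

  Condition                            Rule 1  Rule 2  Rule 3  Rule 4
  Rate < 30 (hourly)                     Y       Y       N       N
  Hours > 40                             N       Y       N       Y
  Action
  Pay = Hours * Rate                     X
  Pay = 40*Rate + 1.5*Rate*(Hours-40)            X
  Pay = 40 * Rate                                        X       X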
40
EXAMPLE B (3) Create Test Cases
  • One test case per column of the decision table
  • Case 1: Hourly, No Overtime
  • Case 2: Hourly, Overtime
  • Case 3: Salaried, No Extra Hours
  • Case 4: Salaried, Extra Hours
  • Order the test cases by column (worked values are
    sketched below)
41
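The slides do not show concrete input values; the following worked set is one possibility, with expected results computed from the specification's formulas:

  Case 1 (Hourly, No Overtime):    Hours=30, Rate=10  ->  Pay = 30*10 = 300
  Case 2 (Hourly, Overtime):       Hours=50, Rate=10  ->  Pay = 40*10 + 1.5*10*10 = 550
  Case 3 (Salaried, No Extra Hrs): Hours=40, Rate=40  ->  Pay = 40*40 = 1600
  Case 4 (Salaried, Extra Hrs):    Hours=50, Rate=40  ->  Pay = 40*40 = 1600

Per the boundary methodology on slide 36, boundary cases at Rate = 30 and Hours = 40 would then be added to the test script.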
EXAMPLE B (4) Write Test Script
42
Testing Modules -- Drivers
A test driver executes a unit with test case data
and captures the results.
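A minimal sketch of such a driver for the Pay unit, in the course's C style. The test-file format, the name computePay, and the PASS/FAIL tolerance are assumptions for illustration:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

double computePay(double hours, double rate);   /* unit under test, linked in */

/* Reads one test case per line ("id hours rate expected"), runs the unit,
   compares actual to expected, and announces PASS/FAIL for each case. */
int main(int argc, char *argv[])
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s testcases.txt\n", argv[0]);
        return EXIT_FAILURE;
    }
    FILE *in = fopen(argv[1], "r");
    if (in == NULL) {
        perror(argv[1]);
        return EXIT_FAILURE;
    }
    int id, failures = 0;
    double hours, rate, expected;
    while (fscanf(in, "%d %lf %lf %lf", &id, &hours, &rate, &expected) == 4) {
        double actual = computePay(hours, rate);
        int pass = fabs(actual - expected) < 0.005;    /* tolerance for money */
        printf("case %d: expected=%.2f actual=%.2f %s\n",
               id, expected, actual, pass ? "PASS" : "FAIL");
        if (!pass) failures++;
    }
    fclose(in);
    return failures == 0 ? EXIT_SUCCESS : EXIT_FAILURE;
}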
43
Implementing Test Drivers
  • Complexity
  • Arguments/Results only
  • Special set-up required to execute unit
  • External effects capture/inquiry
  • Oracle announcing "PASS"/"FAIL"
  • Major Benefits
  • Automated, repeatable test script
  • Documented evidence of testing
  • Universal design pattern

44
Test Driver for Unit Pay
45
Test Driver Files (Pay)
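The file listing on this slide is an image. A sample test data file consistent with the driver sketched above (one case per line: id, hours, rate, expected pay; values from the worked cases earlier):

1 30 10.00 300.00
2 50 10.00 550.00
3 40 40.00 1600.00
4 50 40.00 1600.00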
46
Testing Classes -- Drivers (Black-Box)
47
Example -- Stack Class
48
Test Driver Files (Stack class)
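Slides 46-48 are images in the original. A minimal sketch of a black-box class test driver for a stack, assuming a simple array-based C implementation (all names here are illustrative assumptions):

#include <stdio.h>

#define MAX 100

/* Array-based stack standing in for the class under test. */
typedef struct { int data[MAX]; int top; } Stack;

void init(Stack *s)          { s->top = 0; }
int  push(Stack *s, int v)   { if (s->top >= MAX) return 0; s->data[s->top++] = v; return 1; }
int  pop(Stack *s, int *v)   { if (s->top <= 0)  return 0; *v = s->data[--s->top]; return 1; }
int  isEmpty(const Stack *s) { return s->top == 0; }

/* Black-box driver: checks empty-stack behavior and LIFO ordering. */
int main(void)
{
    Stack s;
    int v, failures = 0;
    init(&s);
    failures += !isEmpty(&s);                  /* a new stack is empty        */
    failures += !(push(&s, 1) && push(&s, 2)); /* pushes succeed              */
    failures += !(pop(&s, &v) && v == 2);      /* LIFO: last in, first out    */
    failures += !(pop(&s, &v) && v == 1);
    failures += pop(&s, &v) ? 1 : 0;           /* pop on an empty stack fails */
    printf("%d failures -- %s\n", failures, failures == 0 ? "PASS" : "FAIL");
    return failures;
}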