1
Building Correct Distributed Embedded Systems
  • Rachel Cardell-Oliver
  • Department of Computer Science & Software
    Engineering
  • The University of Western Australia
  • July 2002

2
Talk Overview
  • Research Goal: to build correct distributed
    embedded systems
  • 1. Distributed Embedded Systems
  • 2. Correctness: Formal Methods & Testing
  • 3. Practice: Executing Test Cases
  • 4. Future Distributed Embedded Systems

3
1. Distributed Embedded Systems
4
Embedded
5
Distributed
6
Real Time Reactions
7
Characteristics of Current DES
  • React or interact with their environment
  • Must respond to events within fixed time
  • Distributed over two or more processors
  • Fixed network topology
  • Each processor runs a set of tasks
  • Often embedded in other systems
  • Often built with limited HW & SW resources

8
2. Correctness: Formal Methods & Testing
9
Goal: Building Correct DES
(diagram: does the implementation behave like this?)
10
Software Tests
  • are experiments designed to answer the question:
    does this implementation behave as intended?
  • Defect tests
  • are tests which try to force the
    implementation NOT to behave as intended

11
A New Test Method
12
Our Method for Defect Testing
  • Identify types of behaviour which are likely to
    uncover implementation defects (e.g. extreme
    cases)
  • Describe these behaviours using a formal
    specification language
  • Translate the formal test specification into a
    test program to run on a test driver
  • Connect the test driver to the system under test
    and execute the test program
  • Analyse test results (on-the-fly or off-line)

13
Test Generation: some history
  • Chow (IEEE TSE 1978): deterministic Mealy machines
  • Clarke & Lee (1997): timed requirements language
  • Nielsen (PhD thesis and TACAS 2000): event
    recording automata
  • Cardell-Oliver (FAC Journal 2000): Uppaal timed
    automata
  • and many more

14
Test Execution: some history
  • Peters & Parnas (ISSTA 2000): test monitors for
    reliability testing
  • Cardell-Oliver (ISSTA 2002): test automata for
    defect testing

15
Example System to Test
16
Step 1: Identify interesting behaviours
  • Usually extreme behaviours such as
  • Inputs at the maximum allowable rate
  • Maximum response time to events
  • Timely scheduling of tasks

17
Example Property to Test
  • Whenever the light level changes from low to high
  • then the valve starts to open
  • within 60 cs (centiseconds)
  • assuming the light level alternates between high
    and low every 100 cs
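Read as a metric temporal logic formula (a paraphrase only; the talk itself specifies the property as a Uppaal timed automaton, and the predicate names below are illustrative):

  \Box\big(\mathrm{rise}(\mathit{light\_high}) \Rightarrow \Diamond_{\le 60\,\mathrm{cs}}\ \mathit{valve\_starts\_opening}\big),
  \quad \text{assuming } \mathit{light\_high} \text{ toggles every } 100\ \mathrm{cs}.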

18
Step 2: choose a formal specification language
  • which is able to describe
  • concurrent tasks
  • real time constraints
  • persistent data
  • communication
  • use Uppaal Timed Automata (UTA)

19
Example UTA for timely response
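The automaton itself is not reproduced in this transcript. One plausible shape for a test automaton checking the property above, written as an edge list over a clock x (location and action names are illustrative, not taken from the slide):

  \begin{aligned}
  \mathrm{idle} &\xrightarrow{\ \text{drive light high},\ x := 0\ } \mathrm{wait}\\
  \mathrm{wait} &\xrightarrow{\ \text{observe valve start},\ x \le 60\ } \mathrm{pass}\\
  \mathrm{wait} &\xrightarrow{\ x > 60\ } \mathrm{FAIL}
  \end{aligned}

Reaching FAIL means the implementation missed the 60 cs deadline; the full automaton would also loop to drive the 100 cs input alternation.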
20
Writing Tests with UTA
  • Test cases specify all valid test inputs
  • no need to test outside these bounds
  • Test cases specify all expected test outputs
  • if an output doesn't match then it's wrong
  • No need to model the implementation explicitly
  • Test cases may be concurrent programs
  • Test cases are executed multiple times

21
Step 3: Translate Spec to Exec
  • UTA specs are already program-like
  • Identify test inputs and how they will be
    controlled by the driver
  • Identify test outputs and how they will be
    observed by the driver
  • then a straightforward translation into NQC (Not
    Quite C) programs

22
Example NQC for timely response
  task dolightinput()
  {
    while (i < 400)
    {
      Wait(100);
      // raise the light level seen by the SUT and log the stimulus
      setlighthigh(OUT_C); setlighthigh(OUT_A);
      record(FastTimer(0), 2000);
      i++;
      Wait(100);
      // drop the light level again and log the stimulus
      setlightlow(OUT_C); setlightlow(OUT_A);
      record(FastTimer(0), 1000);
      i++;
    } // end while
  } // end task

  task monitormessages()
  {
    while (i < 400)
    {
      monitor (EVENT_MASK(1))
      {
        Wait(1000);
      }
      catch (EVENT_MASK(1))
      {
        // an IR message arrived: log its arrival time and content
        record(FastTimer(0), Message());
        i++;
        ClearMessage();
      }
    } // end while
  } // end task
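The transcript omits the test program's setup. A minimal sketch of the glue, assuming record(...) is a macro that writes (time, tag) pairs to the RCX datalog, i is a counter shared by both tasks, and setlighthigh/setlightlow are further macros toggling the outputs wired to the SUT's light sensor (none of these definitions appear in the talk):

  // Sketch only, not code from the talk.
  #define record(t, v)  { AddToDatalog(t); AddToDatalog(v); }  // assumed logging macro

  int i;                      // test-step counter shared by both tasks

  task main()
  {
    i = 0;
    CreateDatalog(1800);      // room for hundreds of (time, tag) pairs
    ClearTimer(0);            // FastTimer(0) then counts 10 ms ticks
    start monitormessages;    // observe IR messages from the SUT
    start dolightinput;       // drive the light-level input
  }

After a run the datalog is uploaded to a PC, which fits the off-line analysis mentioned in Step 5 and on the later "Tester speed" slide.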

23
Step 4: test driver
24
Step 4: connect tester and execute tests
25
Step 5: Analyse Results
27
Results 1: Observation Issues
  • Things you can't see
  • Probe effect
  • Clock skew
  • Tester speed

28
Things you can't see
  • Motor outputs can't be observed directly because
    of power drain
  • so we used IR messages to signal motor changes
  • But we can observe
  • touch & light sensors via piggybacked wires
  • broadcast IR messages

29
The probe effect
  • We can instrument program code to observe program
    variables (an example probe is sketched below)
  • but the time taken to record results disturbs the
    timing behaviour of the system under test
  • Solutions
  • observe only externally visible outputs
  • design for testability: allow for probe effects
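For concreteness, an in-code probe of the kind being warned about could look like this in NQC (illustrative only; the sensor, the logged variable and the control task are not from the talk):

  // Illustrative probe, not code from the talk.
  task control()
  {
    int level;
    SetSensor(SENSOR_1, SENSOR_LIGHT);
    while (true)
    {
      level = SENSOR_1;      // read the light sensor
      AddToDatalog(level);   // the probe: every write steals CPU time
      // ... the control logic whose timing the probe disturbs ...
    }
  }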

30
Clock Skew
  • Clocks may differ between local results from 2 or
    more processors
  • Solutions
  • use observations timed only by the tester
  • including tester events gives a partial order

31
Tester speed
  • Tester must be sufficiently fast to observe and
    record all interesting events
  • Beware
  • scheduling and monitoring overheads
  • execution time variability
  • Solution: NQC parallel tasks and off-line
    analysis helped here

32
Results 2: Input Control Issues
  • Input value control
  • Input timing control

33
Input Value Control
  • SUT touch & light sensors can be controlled by
    piggybacked wires from test driver to SUT
  • Test driver sends IR messages to SUT
  • Use inputs directly from the environment, such as
    natural light or a button pushed by hand

34
Input Timing Control
  • Can't control input timing precisely
  • e.g. offered just before SUT task is called
  • Solution: run tests multiple times and analyse the
    average and spread of results (see the formulas
    after this list)
  • Can't predict all SUT timings for a fully
    accurate model
  • cf. WCET research, but our problem is harder
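The talk does not say which spread measure was used; over N repeated measurements t_1, ..., t_N of a response time, the obvious summary is the sample mean and sample standard deviation (min and max would serve as well):

  \bar{t} = \frac{1}{N}\sum_{i=1}^{N} t_i, \qquad
  s = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\big(t_i - \bar{t}\big)^2}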

35
Conclusions from Experiments
  • Defect testing requires active test drivers able
    to control extreme inputs and observe relevant
    outputs
  • Test generation methods should take into account
    the constraints of executing test cases
  • Engineers should design for testability

36
4. Future Distributed Real-Time Systems
37
Embedded Everywhere
  • IT is on the verge of another revolution
  • Wireless networked systems of 1000s of tiny
    embedded computers will allow information to be
    collected, shared, and processed in unprecedented
    ways
  • Embedded Everywhere Report 2001

38
Embedded Everywhere: A Research Agenda for
Networked Systems of Embedded Computers
  • A study by the Computer Science and
    Telecommunications Board of the National Research
    Council (USA)
  • For DARPA and the National Institute of Standards
    and Technology (NIST)
  • 236 pages, 2001
  • http://books.nap.edu/html/embedded_everywhere/

39
Future DES Applications
Precision Agriculture: Motorola weather station (UWA CSSE)
40
Future DES Applications
Intelligent Inhabited Environments
http://cswww.essex.ac.uk/Research/intelligent-buildings/
41
Future DES Applications
Smart Dust: Prof. Kris Pister, University of
California at Berkeley
http://www-bsac.eecs.berkeley.edu/pister/SmartDust/
42
Future vs Current
  • Current → Future
  • React or interact with their environment → Reactive
  • Must respond to events within fixed time → Real-time
  • Two or more processors → 10^3 to 10^6 processors
  • Fixed network topology → Dynamic topology
  • Each processor runs a set of tasks → 1-1
  • Often embedded in other systems → More so
  • Often built with limited HW & SW resources → Tiny scale

43
Future Challenges
  • Scale
  • 1000s of independent processes
  • Complexity
  • concurrency, communication, data and time
  • Dynamic topologies
  • of wireless networks
  • very long life networks

44
the end
45
Traditional Embedded System
(block diagram; labels only: Algorithms for Digital Control, Engineering System, Real-Time Clock, Interface, Remote Monitoring System, Data Logging, Database, Data Retrieval and Display, Display Devices, Operator's Console, Operator Interface, Real-Time Computer)
46
Observations & Traces
(diagram labels: specification, implementation)