The Application of XML to Testing: Specification Agile Test Equipment (SATE)


1
The Application of XML to Testing: Specification Agile Test Equipment (SATE)
  • By Jimmy Saunders and Scott White
  • J3S, Inc.
  • 7 October 2002

2
Purpose
  • Brief an innovative method for improving test
    system quality and performance while reducing
    system costs
  • Outline disadvantages of traditional test
    documentation and instrumentation architecture
  • Discuss a new approach to high-end test
    requirements and equipment architecture
  • Discuss a means to develop and promulgate useful
    standards and tools

3
The Challenge
  • Aerospace systems require complex test equipment
    to stimulate and sense systems performance for
    development testing and buy-off demonstrations
  • Test equipment is costly and, to avoid program
    schedule risk, must be developed to some extent
    concurrently with the system to be tested
  • Once designed, the test equipment must react to
    inevitable changes in the testing requirements
  • If the equipment supports production or in-service
    performance assessment, obsolescence and
    maintenance costs grow throughout its service life

4
Traditional Architecture
[Diagram: a Requirements Document feeding a Monolithic Test Execution system]
5
Requirements: Traditional Approach
  • Test Requirements and Systems Specifications are
    passed to the test system developers
  • These developers go to school on the system to
    be tested and explore test equipment design
    options
  • During design efforts a symbiotic relationship
    develops between test system and test
    requirements developers
  • Test requirements and test system design evolve
    simultaneously
  • Ultimately the test system is validated against
    the test requirements

6
Results of Traditional Approach
  • The give-and-take between requirements and test
    system development lets test system point design
    and procedure leak back into the requirements
  • The test system is essentially validated to a
    point design, which impedes future modifications
  • Which requirements are real, and which were leaked
    back?
  • What will constitute a good re-validation after a
    modification?
  • Associated test systems diverge functionally and
    may not implement test requirements completely or
    consistently
  • The effect is either a hard freeze of test
    equipment, software, and requirements or a large
    price tag for evaluating and making changes

7
Costs of Traditional Approach
  • Faced with a change to the system under test:
  • Test equipment and software change significantly,
    adding to the cost of the modification
  • Test equipment downtime due to modification and
    re-validation drives schedule-induced program
    costs
  • Faced with the need to change test equipment:
  • The validity of the test is called into question,
    resulting in the cost of extensive engineering
    analysis
  • Given requirements contaminated with point
    designs, leak back is perpetuated. Design
    upgrades are rejected because of the cost of
    severing requirements from design.

8
Problem with Traditional Approach
  • The resulting test equipment (TE) is hardwired to
    the testing requirements
  • The testing requirements are contaminated by the
    TE point design
  • Life-cycle testing across various TE may be
    inconsistent and/or incomplete
  • The combination makes System Under Test and TE
    modifications expensive
  • Fear of this expense drives logistics and
    maintenance to more costly choices

9
A New Approach
  • Use Test Requirements as executable instructions
  • Build flexible, modular test system
  • Validate the system to its hardware limits (not to
    specific Test Requirements)
  • Changes in Test Requirements are separated from
    Test Equipment

10
XML in SATE
  • Test Requirement Markup Language defined in XML
  • Captures all test specifications: tolerances,
    timing, measurement definitions, test groupings,
    etc.
  • A single XML file is ingested into the system and
    rendered into printable form (a minimal sketch
    follows below)
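As a minimal sketch of the idea, assuming hypothetical TRML element names (the actual SATE markup is not reproduced here), a requirement fragment held in a single XML file can be ingested with a standard parser:

    # Minimal sketch; all element and attribute names are illustrative
    # assumptions, not the actual TRML definition.
    import xml.etree.ElementTree as ET

    TRML_SAMPLE = """
    <testDocument title="Power Supply Acceptance">
      <step number="1.1" title="Output Voltage">
        <measurement type="voltage" units="V" interface="PSU-MAIN">
          <nominal>28.0</nominal>
          <tolerance plus="0.5" minus="0.5"/>
          <timing rate="1" units="Hz"/>
        </measurement>
      </step>
    </testDocument>
    """

    root = ET.fromstring(TRML_SAMPLE)        # ingest the single XML file
    for meas in root.iter("measurement"):    # tolerances, timing, interface
        print(meas.get("interface"), meas.findtext("nominal"),
              meas.find("tolerance").attrib, meas.find("timing").attrib)

The same file that drives execution can be rendered into the printable test document, which is what keeps the executable and printed requirements from drifting apart.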

11
Requirements Architecture: Specification Agile Test Equipment (SATE)
[Diagram: Requirements Data (XML Document) flows through Rules Application Conversion, Conversion and Format, and Load into Database to yield Document Production and Generic Executable Test Instructions]
12
Test Requirements Architecture
  • All requirements data captured explicitly
  • Generic to the test system and the System Under
    Test
  • Amenable to configuration management (edit,
    validate, verify)
  • Accommodate ancillary data, e.g. notes and titles
  • Maintain an open interface standard

13
Test Requirements Architecture
  • The architecture contains several types of data
    (illustrated in the sketch below)
  • Document Architecture Items: number and title for
    each section, job, block, and step
  • Test Architecture Items: prerequisites, priority,
    etc.
  • Test Data: measurement type, nominal value,
    tolerances, and interface
  • Operational Items: database references, operator
    notes, etc.
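A sketch of how the four categories might appear together in one step of such a document (again, the element names are assumptions for illustration):

    # Hypothetical TRML step annotated with the four data categories.
    import xml.etree.ElementTree as ET

    STEP_SAMPLE = """
    <step number="3.2.1" title="Satellites In View">  <!-- document architecture -->
      <testInfo priority="2" prerequisites="3.1"/>    <!-- test architecture -->
      <measurement type="count" interface="RCVR-B">   <!-- test data -->
        <nominal>8</nominal>
        <tolerance plus="2" minus="2"/>
      </measurement>
      <operatorNote>Confirm the antenna is clear.</operatorNote> <!-- operational -->
    </step>
    """

    step = ET.fromstring(STEP_SAMPLE)
    print(step.get("number"), step.get("title"))  # -> 3.2.1 Satellites In View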

14
System Components and Functions
  • Reads in and validates the requirements
  • Parses to a database
  • Verifies document integrity
  • Validates interfaces, units, limits, etc.
  • Schedules required tests
  • Using test requirements document data, organizes
    sequence of events
  • De-conflicts interfaces with the System Under Test
  • Groups together non-conflicting test tasks (see
    the scheduling sketch below)
  • Maps SUT interfaces to system software interfaces
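A toy sketch of the grouping step, assuming (as the bullets above imply) that two tasks conflict exactly when they touch the same System Under Test interface:

    # Greedy grouping sketch: tasks that share a SUT interface conflict and
    # go in separate passes; non-conflicting tasks run together.
    def schedule(tasks):
        """tasks: list of (task_id, sut_interface) -> list of task groups"""
        groups = []
        for task_id, interface in tasks:
            for group in groups:  # first group with no interface conflict
                if all(interface != i for _, i in group):
                    group.append((task_id, interface))
                    break
            else:                 # every existing group conflicts
                groups.append([(task_id, interface)])
        return groups

    print(schedule([("T1", "RCVR-A"), ("T2", "RCVR-B"), ("T3", "RCVR-A")]))
    # -> [[('T1', 'RCVR-A'), ('T2', 'RCVR-B')], [('T3', 'RCVR-A')]]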

15
System Components and Functions
  • Executes each required task (a generic execution
    interface is sketched below)
  • Passes test data for each System Under Test (SUT)
    interface to the appropriate system software
    interface
  • Monitors test status
  • Starts/Stops/Pauses testing
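A sketch of what the generic execution interface might look like; the class and method names are assumptions, not the SATE software design:

    # Hypothetical generic execution interface: every task module accepts a
    # nominal value and tolerances and reports pass/fail.
    from abc import ABC, abstractmethod

    class TaskModule(ABC):
        """One system software interface serving one SUT interface."""
        @abstractmethod
        def execute(self, nominal: float, plus: float, minus: float) -> bool:
            """Measure, evaluate against the tolerances, return pass/fail."""

    def run(tasks):
        """Execute each required task and monitor test status."""
        for module, nominal, plus, minus in tasks:
            ok = module.execute(nominal, plus, minus)
            print(type(module).__name__, "PASS" if ok else "FAIL")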

16
Test Requirements Flow
[Diagram: the test document is authored, then the PARSER reads in, verifies, and validates the requirements; Requirements CM and Test System CM are maintained separately]
17
Test Requirements Flow
[Diagram: the SCHEDULER schedules the requirements and maps interfaces]
18
Test Requirements Flow
[Diagram: EXECUTION runs the requirements against the SUT]
19
Illustration: GPS Monitor Overview
  • Monitor a heterogeneous fleet of GPS navigation
    receivers operating in aerospace vehicles
  • Enforce a standard set of test requirements
  • Maintain a flexible design against
  • changing test requirements
  • interface variations in the original suite of
    in-service GPS receivers
  • unknown receiver interfaces that may appear as
    the fleet upgrades

20
Illustration: GPS Monitor
  • The system engineer generates the test
    requirements (encoded in the sketch below), e.g.
  • Monitor signal strength once per second. Fail
    the measurement for signal strength not equal to
    F dBm, plus or minus f dBm.
  • Monitor the number of satellites in view once per
    second. Fail the measurement for a number of
    satellites in view not equal to N, plus or minus
    n.
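These two requirements might be captured in the requirements document along the following lines; the markup is the same illustrative sketch used earlier, and F, f, N, n stay symbolic exactly as the slide states them:

    # Hypothetical encoding of the two GPS monitor requirements.
    GPS_REQUIREMENTS = """
    <job title="GPS Monitor">
      <step number="2.1" title="Signal Strength">
        <measurement type="power" units="dBm">
          <nominal>F</nominal>
          <tolerance plus="f" minus="f"/>
          <timing rate="1" units="Hz"/>
        </measurement>
      </step>
      <step number="2.2" title="Satellites In View">
        <measurement type="count">
          <nominal>N</nominal>
          <tolerance plus="n" minus="n"/>
          <timing rate="1" units="Hz"/>
        </measurement>
      </step>
    </job>
    """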

21
Illustration: GPS Monitor
  • Additional architecture and operational items are
    added to the test requirements to complete the
    document
  • Each receiver is identified with a unique
    interface
  • All the measurements are specified as independent
    with equal priority (can be made in parallel)

22
Illustration: GPS Monitor
  • System developers create and test a software
    module for each receiver type (a toy module is
    sketched below)
  • Each module isolates the receiver-specific
    interface and the functions necessary to perform
    the required signal strength and
    satellites-in-view measurements
  • Conforms to the execution interface: accepts a
    nominal value and tolerances as input
  • Measures and evaluates the data (Pass/Fail)
  • Archives measured data
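A toy sketch of one such receiver module, with the hardware I/O stubbed out; the names and the interface shape are assumptions carried over from the execution-interface sketch above:

    import random

    class ReceiverA:
        """Hypothetical receiver-specific module: accepts a nominal value
        and tolerances, measures, evaluates pass/fail (archiving omitted)."""

        def _read_signal_strength_dbm(self):
            # stand-in for the receiver-specific hardware interface
            return random.uniform(-135.0, -125.0)

        def execute(self, nominal, plus, minus):
            value = self._read_signal_strength_dbm()            # measure
            return nominal - minus <= value <= nominal + plus   # evaluate

    print("PASS" if ReceiverA().execute(-130.0, 3.0, 3.0) else "FAIL")

Because only this module knows the receiver's interface, a new receiver type means a new module, not new requirements.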

23
Illustration: GPS Monitor
  • Suppose requirements change
  • The author updates the requirements document, e.g.
    to measure signal strength three times per second
    (a one-line edit in the sketched markup below),
    and submits it for parsing, scheduling and
    execution. The system software remains the same.
  • Suppose a new receiver enters service
  • A developer creates a new software module to
    handle the receiver-specific interfaces. The
    requirements remain the same.
  • Suppose a receiver vendor wants to package-test a
    receiver using a package test system
  • The requirements document provides clear,
    consistent, unambiguous test requirements for the
    package test system. The vendor executes the
    same test.
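In the illustrative markup, the first of these changes is a one-line edit to the requirements document:

    # One-line requirements edit (illustrative markup): the timing element
    # goes from once per second to three times per second; no software
    # module changes are needed.
    OLD = '<timing rate="1" units="Hz"/>'
    NEW = '<timing rate="3" units="Hz"/>'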

24
Current Challenges
  • Test document architecture continues to develop
  • New applications motivate additions to the
    document architecture
  • Experience and new open standard developments
    inspire revisions
  • Timing, Data, and Network
  • Coordinated stimulus and measurement with tight
    timing, e.g. event timing, challenges a
    distributed architecture
  • Complex data, e.g. waveforms, are not supported
  • The design assumes a low-latency network

25
Continuing Work
  • Transitioning from a DTD and DSSSL to XML Schema
    to improve document functionality, e.g.
    auto-numbering and test-specific branching (see
    the fragment sketched below)
  • Developing more complete measurement definitions,
    e.g. bursts at irregular intervals
  • Developing tools for editing and validating test
    document content as well as form
  • Seeking a broader community to explore the
    concept and its applications
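As a sketch of why the Schema transition matters: a DTD constrains only document form, while XML Schema can also type-check content, e.g. forcing tolerances to be decimal numbers. The fragment below is an assumption matching the illustrative markup used earlier:

    # Hypothetical XML Schema fragment: validates tolerance *content*
    # (decimal values), which a DTD cannot express.
    XSD_FRAGMENT = """
    <xs:element name="tolerance">
      <xs:complexType>
        <xs:attribute name="plus"  type="xs:decimal" use="required"/>
        <xs:attribute name="minus" type="xs:decimal" use="required"/>
      </xs:complexType>
    </xs:element>
    """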

26
SATE Benefits
  • A true flow-down of requirements results in
  • System specifications generated from requirements,
    rather than point solutions creeping back into the
    requirements
  • A requirements-driven, performance-based
    architecture
  • A consistent, re-usable requirements specification
  • An agile system results in test equipment
  • Capable of adapting to tested system changes
    during development without added cost
  • Capable of conforming to changes in requirements
    post development with modest cost
  • Capable of absorbing requirements changes without
    breaking validation and configuration
    management

27
SATE Enabling Technology
  • SGML/XML standards allow a manageable, simplified
    data-formatting effort, a single root source for
    specification and execution, and changes in
    performance without changes in hardware or
    software configuration
  • Modular software design isolates the damage from
    code modifications, allows parallel S/W
    development, and minimizes modification testing
    effort
  • Software tools (libraries) for hardware
    controllers, and tools like CORBA, MPI, PVM, etc.,
    make truly standardized interfaces possible and
    minimize the time and cost to exchange hardware

28
SATE Innovation Effects
  • Allows parallel development of test equipment and
    system under test without
  • Cost and schedule risk due to redesign of test
    equipment when flight hardware changes
  • Delayed start of TE development while awaiting
    system under test design maturation
  • Allows changes to operational performance without
  • Changing hardware or software
  • Changing H/W or S/W configuration
  • Lost production due to modification and
    revalidation
  • Requires
  • Proper documentation of performance
    specifications
  • Configuration management of specification
    documentation

29
Impact of SATE Approach
  • Drives TE H/W and S/W to be tested in development
    to their capabilities envelope
  • System design should avoid custom hardware and
    hardwired software test implementations
  • Forces a comprehensive performance specification
  • Mandates that requirements be configuration
    managed like hardware/software
  • Requires a data editor and publisher
  • Motivates a standard test document architecture

30
Summary
  • XML-based test requirements
  • Significantly reduce the cost of requirement
    changes
  • Improve consistency and clarity (re-use)
  • Maintain a constructive separation of test
    requirements and system design
  • In general, the SATE concept for complex systems
    test equipment
  • Reduces cost, improves quality
  • Facilitates TE multi-use and/or re-use
  • Recommend the development of a standard
    specification for community use

31
Summary
  • Contact Information
  • Jimmy D. Saunders, saunders@j3s.us, 512-997-1705
  • Scott White, white@j3s.us, 512-997-1750
  • Mark Brucks, brucks@j3s.us, 512-997-1700