An Automated Approach for Regression Testing OneSAF Combat Physical Models

1
An Automated Approach for Regression Testing OneSAF Combat Physical Models
Andrew Barnett
31 March 2009
US Army Materiel Systems Analysis Activity
Aberdeen Proving Ground, Maryland 21005-5071
Approved for public release; distribution is unlimited.
2
Agenda
AMSAA has been actively engaged with program support, testing, co-development activities, and use of OneSAF.
  • Problem Statement
  • Combat Physical Models
  • AMSAA's Role within the OneSAF Program
  • Requirements Flow
  • Physical Model Verification Process (OneSAF v1.0)
  • Model Verification Tool
  • Vignette On-line Approach
  • Automated Verification Process
  • Path Ahead

3
Problem Statement
New Combat Physical Models, including updates to earlier implementations and fixes for reported problems, will be integrated into future baseline releases of OneSAF; hence an improved approach to verification testing is critical. Moreover, new models are increasing in both number and complexity. Regression testing remains essential to successful verification testing, so new processes are required to automate the previous verification testing process.
Are the Combat Physical Models functioning as
designed?
4
AMSAA and the OneSAF Program
RDA Technical Representative
RDA/RDECOM Requirements Analysis
FY07/FY08/FY09 P3I Requirements
Technical Assessments
Engineering Configuration Control Board
Configuration Control Management Participation
Lead for Coordinating Army Combat Physical Models
Focused on Standardization & Reuse
Synchronization w/ existing Entity-Based Force-Level Combat Simulations
Promoted & Influenced AMSAA Data Standardization
Verification & Validation Lead for Combat Physical Models
M&S = Modeling & Simulation; KA/KE = Knowledge Acquisition/Knowledge Engineering; P3I = Pre-Planned Product Improvement; RDA = Research, Development & Acquisition
5
Combat Physical Model
Combat Physical Models provide the mathematical representation of combat systems and their interactions with the environment and other entities. Physical models may be represented at multiple levels of fidelity.
Battlefield Environment
Target Acquisition
Attrition
Communication
Movement
Sustainment
6
OneSAF Requirements Flow
Physical Model V&V
Domain Representation
Combat Physical Model Development
  • Extensive V&V of OneSAF v1.0
  • Periodic user evaluation events
  • OneSAF v3.0 V&V ongoing
  • Requirements
  • Staffing
  • 61 physical models developed and integrated
  • Combat physical models documented

Flow: Requirements (OneSAF ORD or P3I Rqmts) → Artifacts → Implementation → V&V
ORD = Operational Requirements Document; PKAD = Physical Model Knowledge Acquisition Document; V&V = Verification & Validation
7
Combat Physical Model Verification Testing Process (OneSAF v1.0)
Continuous Physical Model Verification Testing (Physical Model Verification Methods):
1. Execute Model Verification Tool (MVT).
2. Import data to MS Excel Verification Spreadsheet.
3. Problems detected?
   - NO: Proceed to the next test case.
   - YES: Execute Vignette On-line Analysis; analyze results with DCST output, logger statements, or a debugging tool; perform a Java code review; create a detailed Problem Trouble Report (PTR) with a recommended fix.
4. Update physical model status to include a crosswalk with test threads.
DCST = Data Collection Specification Tool
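The "Problems detected?" decision above reduces to a simple pass/escalate split. Below is a minimal Python sketch of that triage step; the test-case IDs, metric values, and tolerance are notional placeholders, not OneSAF data.

```python
def triage(results, expected, tolerance=1e-6):
    """Split test cases into the NO branch (pass) and the YES branch (analyze)."""
    needs_analysis = []
    for case_id, actual in results.items():
        if abs(actual - expected[case_id]) <= tolerance:
            continue                      # NO: proceed to the next test case
        needs_analysis.append(case_id)    # YES: vignette analysis, code review, PTR
    return needs_analysis

# Hypothetical MVT results and expected values, keyed by test-case ID.
mvt_results = {"TC-001": 0.62, "TC-002": 0.48}
expected    = {"TC-001": 0.62, "TC-002": 0.55}
print(triage(mvt_results, expected))      # ['TC-002'] would be escalated to a PTR
```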
8
Model Verification Tool (MVT) (1 of 2)
  • Pro
    • Robust testing
    • Quickly test numerous test cases
  • Con
    • Doesn't test all aspects of an engagement
    • Not automated
  • Component of OneSAF
  • Derived from the PKAD specification
  • Tool links directly to the OneSAF code
  • Facilitates rapid testing of multiple entity pairings on specific functional areas
  • Directs the debugging of the model to a specific area
  • Process (see the PTR record sketch below)
    • Exercise tool
    • Import results to MS Excel
    • Compare results
    • Contact software engineer
    • Write Problem Trouble Report

Notional Data
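The MVT process ends with a written Problem Trouble Report. As a sketch only, that report can be captured as a small record; the fields below are inferred from the steps on this slide and are not an official OneSAF PTR format.

```python
from dataclasses import dataclass

# Sketch of a Problem Trouble Report record produced at the end of the MVT
# process. Field names are illustrative, not an official OneSAF PTR schema.

@dataclass
class ProblemTroubleReport:
    test_case: str              # entity pairing / functional area exercised
    actual: float               # value imported from the MVT run
    expected: float             # value from the AMSAA verification data
    recommended_fix: str = ""   # filled in after the Java code review

ptr = ProblemTroubleReport("TC-002", actual=0.48, expected=0.55,
                           recommended_fix="Correct munition lookup key")
print(ptr)
```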
9
Model Verification Tool (MVT) (2 of 2)
Notional Data
  • Excel workbooks were created, each modeled after
    a PKAD.
  • Recommended Test Input represents test case
    input.
  • Actual Output represents data output from the
    OneSAF tool/simulation.
  • Expected Output represents data in the AMSAA
    unclassified database.

10
Vignette On-line Verification Method
  • Management Control Tool (MCT)
  • Data Collection Methods
    • Data Collection Specification Tool: instrumented via the PKAD specification; outputs XML files (see the parsing sketch at the end of this slide)
    • Logger statements
    • Debugging tool (JSwat)
  • Notional Vignette
    • BLUFOR convoy traveling SE on a paved road.
    • Two IEDs will detonate after the first vehicle passes the phase line.

(Diagram: convoy route crossing the phase line, with IED locations marked.)
  • Pro
    • Robust testing
    • Quickly test numerous test cases
  • Con
    • Inconsistent data collection (OneSAF v1.0)
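Because the Data Collection Specification Tool writes XML output, the collected events can be post-processed outside the simulation. A minimal parsing sketch follows; the element and attribute names are hypothetical, since the real schema comes from the PKAD-driven instrumentation.

```python
import xml.etree.ElementTree as ET

# Minimal sketch of reading engagement events out of a DCST-style XML output
# file. Element and attribute names here are hypothetical placeholders.

SAMPLE = """
<collection>
  <event type="detonation" entity="IED_1" time="143.2"/>
  <event type="attrition" entity="BLUFOR_TRUCK_1" time="143.4"/>
</collection>
"""

root = ET.fromstring(SAMPLE)
for event in root.findall("event"):
    print(event.get("type"), event.get("entity"), event.get("time"))
```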

11
Combat Physical Model Verification Testing Process
Physical Model Verification Methods (Streamlined Approach)
Continuous Physical Model Verification Testing:
1. Execute Model Verification Tool (MVT).
2. Import data to MS Excel Verification Spreadsheet.
3. Problems detected?
   - NO: Proceed to the next test case.
   - YES: Execute Vignette On-line Analysis; analyze results with DCST output, logger statements, or a debugging tool; perform a Java code review; create a detailed Problem Trouble Report (PTR) with a recommended fix.
4. Update physical model status to include a crosswalk with test threads.
  • The streamlined approach takes the strengths of both verification methods to form a robust automated verification process.
  • A new data collection tool allows the user to collect all variables required to perform sufficient verification.

12
Automated Verification Process (Notional Test Vignette)
Notional Vignette
  • Pre-canned vignettes will contain all shooter-target pairings from each functional-area test matrix.
  • Events will be isolated so as not to clutter the output data.
  • Engagements will vary by shooter, weapon, mount, munition, target, sensor, attack angle, range, terrain, etc. (see the matrix sketch below).
  • A new data collection tool will be utilized for each vignette.

(Diagram: a single shooter engaging targets X, Y, and Z with munitions Mun_A, Mun_B, and Mun_C, at attack angles of 88°, 267°, and 123° and ranges of 1250 m, 1050 m, and 500 m.)
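Generating the shooter-target pairings for such a vignette amounts to taking a cross product over the engagement factors. A minimal sketch, using the notional munitions, targets, ranges, and angles from the diagram above:

```python
from itertools import product

# Minimal sketch of building a functional-area test matrix as the cross
# product of engagement factors. Factor values are notional; real matrices
# would also vary weapon, mount, sensor, terrain, etc.

shooters   = ["Shooter_A"]
munitions  = ["Mun_A", "Mun_B", "Mun_C"]
targets    = ["Target_X", "Target_Y", "Target_Z"]
ranges_m   = [500, 1050, 1250]
angles_deg = [88, 123, 267]

test_matrix = [
    {"shooter": s, "munition": m, "target": t, "range_m": r, "angle_deg": a}
    for s, m, t, r, a in product(shooters, munitions, targets, ranges_m, angles_deg)
]
print(len(test_matrix), "engagements")    # 1 * 3 * 3 * 3 * 3 = 81 pairings here
```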
13
Automated Verification Process (Output Data)
Notional Data
Notional Output Data
  • Data imports directly into Excel via the OneSAF delimited format (see the import sketch below).
  • Each row represents a different engagement from the test matrix.
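As a sketch of that import step, the delimited output can also be read row by row in a script before it reaches Excel; the comma delimiter and column names below are assumptions, not the actual OneSAF export format.

```python
import csv
import io

# Minimal sketch of reading OneSAF-style delimited output, one row per
# engagement. The delimiter and header names are assumed for illustration.

SAMPLE = """engagement_id,shooter,target,munition,p_hit
ENG-001,Shooter_A,Target_X,Mun_A,0.62
ENG-002,Shooter_A,Target_Y,Mun_B,0.48
"""

rows = list(csv.DictReader(io.StringIO(SAMPLE)))
for row in rows:
    print(row["engagement_id"], row["target"], float(row["p_hit"]))
```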

14
Automated Verification Process (Data Analysis)
Notional Data
Notional Data Analysis
  • Expected Output will be populated independently of OneSAF to coincide with the test matrix.
  • Actual Output data are read directly from the DATA tab via the HLOOKUP command.
  • Separate tabs will be created to analyze each test-matrix case.
  • Differences between actual and expected results will create a flag (see the flag sketch below).
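A scripted equivalent of the HLOOKUP-and-flag step, with notional data:

```python
# Minimal sketch of the lookup-and-flag analysis: key the Actual Output by
# engagement ID, look up each Expected Output, and flag any mismatch.
# The metric (p_hit), tolerance, and values are notional.

actual   = {"ENG-001": 0.62, "ENG-002": 0.48}   # read from the DATA tab
expected = {"ENG-001": 0.62, "ENG-002": 0.55}   # populated independently of OneSAF

flags = {
    eng_id: "FLAG" if abs(actual[eng_id] - exp) > 1e-6 else "OK"
    for eng_id, exp in expected.items()
}
print(flags)    # {'ENG-001': 'OK', 'ENG-002': 'FLAG'}
```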

15
Automated Verification Process (Further Automation)
Notional Data
  • Potential to further automate the process by populating the Expected Output column directly from AMSAA data tables (see the join sketch below).
  • Requires additional maintenance work when transitioning between baselines.
  • Allows greater flexibility for testing additional vignettes.
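A minimal sketch of that join, populating Expected Output from a reference table keyed by engagement factors; the key fields and values are hypothetical placeholders for the AMSAA data tables.

```python
# Minimal sketch of filling Expected Output from an AMSAA-style reference
# table keyed by (shooter, munition, target, range). Keys and values are
# hypothetical placeholders for the real data tables.

reference_table = {
    ("Shooter_A", "Mun_A", "Target_X", 1250): 0.62,
    ("Shooter_A", "Mun_B", "Target_Y", 1050): 0.55,
}

test_matrix = [
    {"shooter": "Shooter_A", "munition": "Mun_A", "target": "Target_X", "range_m": 1250},
    {"shooter": "Shooter_A", "munition": "Mun_B", "target": "Target_Y", "range_m": 1050},
]

for case in test_matrix:
    key = (case["shooter"], case["munition"], case["target"], case["range_m"])
    case["expected"] = reference_table.get(key)   # None flags a missing reference entry

print(test_matrix)
```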

16
Path Ahead
  • AMSAA conducted limited verification testing of the Pre-v3.0 baseline.
  • The new process proved to be efficient.
  • It remains to be seen whether vignettes/spreadsheets will translate between releases of OneSAF.
  • Develop a standard set of vignettes for core combat physical models, including data analysis spreadsheet templates.
  • Employ the streamlined approach for a subset of core combat physical models.