Transcript and Presenter's Notes

Title: EMIS 7307


1
A Few DT&E Areas of Focus
  • Life testing.
  • To make sure that the system will not fail
    prematurely due to fatigue, aging, long-term
    environmental exposure, etc.
  • Time-consuming and expensive.
  • Consider accelerated life testing (see the
    sketch below).
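A minimal sketch, not from the course, of the Arrhenius acceleration-factor
arithmetic often used to size an accelerated life test; the activation energy
and temperatures below are illustrative assumptions (Python):

import math

BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def arrhenius_acceleration_factor(ea_ev, t_use_c, t_stress_c):
    """Ratio of field time to chamber time for a temperature-driven failure mode."""
    t_use_k = t_use_c + 273.15
    t_stress_k = t_stress_c + 273.15
    return math.exp((ea_ev / BOLTZMANN_EV) * (1.0 / t_use_k - 1.0 / t_stress_k))

# Assumed values: Ea = 0.7 eV, 25 C field use, 85 C test chamber
af = arrhenius_acceleration_factor(0.7, 25.0, 85.0)
print(f"Acceleration factor: {af:.1f}")  # each chamber hour ~ af field hours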

2
A Few DT&E Areas of Focus
  • Design evaluation/verification.
  • See if critical characteristics are achievable.
  • Provide data used to make the hardware or
    software more rugged in order to meet specs.
  • Eliminate risks.
  • Assist design evolution.
  • Ensure components and the system meet spec.

3
A Few DT&E Areas of Focus
  • Design limit testing.
  • Based on mission profile.
  • Operated at the limits.
  • May be destructive.
  • Interpolation and extrapolation may be required.

4
A Few DT&E Areas of Focus
  • Reliability development testing.
  • Also called reliability growth testing,
    test-analyze-fix-test (TAFT), or test, analyze,
    and fix (TAAF).
  • Test under mission profile environments.
  • Things fail.
  • Determine and implement a fix.
  • Commence testing again.
  • Data allow updating the reliability model for
    current predictions (see the sketch below).
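One common way such data update a reliability model is a Crow-AMSAA
(power-law) reliability growth fit; the course does not name a specific model,
so this is a hedged sketch with made-up failure times (Python):

import math

def crow_amsaa_fit(failure_times, total_test_time):
    """MLE of the power-law NHPP parameters for a time-terminated test."""
    n = len(failure_times)
    beta = n / sum(math.log(total_test_time / t) for t in failure_times)
    lam = n / total_test_time ** beta
    return lam, beta

def instantaneous_mtbf(lam, beta, t):
    """Current (instantaneous) MTBF prediction at cumulative test time t."""
    return 1.0 / (lam * beta * t ** (beta - 1.0))

failure_times = [35, 110, 250, 530, 910, 1400]  # hours (illustrative data)
T = 2000.0                                      # total test hours (assumed)
lam, beta = crow_amsaa_fit(failure_times, T)
print(f"beta = {beta:.2f} (beta < 1 indicates reliability growth)")
print(f"Current MTBF estimate at {T:.0f} h: {instantaneous_mtbf(lam, beta, T):.0f} h")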

5
Built-In Test
  • Difficult part of design
  • Difficult part of IT
  • Difficult part of T&E
  • Issues include (see the tallying sketch below)
  • Level of fault isolation
  • Level of automatic fault detection
  • Induce failures for the T&E?
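A hedged sketch of how BIT fault-detection and fault-isolation levels might be
tallied from a failure-mode list; the failure modes, rates, and isolation
groupings below are invented for illustration (Python):

# (failure mode, failure rate per 1e6 h, detected by BIT?, isolation group size)
failure_modes = [
    ("power supply over-voltage", 12.0, True, 1),
    ("receiver LNA degradation",   8.0, True, 3),
    ("antenna servo jam",          5.0, False, 0),
    ("processor memory fault",    20.0, True, 1),
]

total_rate = sum(rate for _, rate, _, _ in failure_modes)
detected_rate = sum(rate for _, rate, det, _ in failure_modes if det)
isolated_1_rate = sum(rate for _, rate, det, grp in failure_modes if det and grp == 1)

print(f"Automatic fault detection coverage: {detected_rate / total_rate:.0%}")
print(f"Fault isolation to a single unit:   {isolated_1_rate / total_rate:.0%}")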

6
OT&E
  • Contractor perspective.
  • Usually involvement is limited to being the
    depot - a common approach.
  • If the program/system has contractor logistics
    support (CLS), then the contractor's performance
    is tested here too.

7
OT&E
  • OT&E: government program office perspective.
  • Make or break time.
  • This is the program office's final exam!
  • Feels like you're sending your kid to the army!
  • Testing is done at arm's length from the program
    office.
  • No program office involvement except to pay for
    it!

8
OT&E
  • OT&E: government operational tester perspective.
  • An agent of the highest authority.
  • One phrase in the final report and the system
    does not go to FRP:
  • "Not operationally suitable" or "not
    operationally effective."

9
OT&E Supports Reviews and Milestones (MSs) Too
10
OT&E
  • Critical Operational Issues (COIs)
  • A key operational effectiveness or operational
    suitability issue that must be examined in
    operational test and evaluation to determine the
    system's capability to perform its mission.
  • A critical operational issue is normally phrased
    as a question to be answered in evaluating a
    system's operational effectiveness and/or
    operational suitability.
  • Identifying COIs
  • Ask "How well does the system perform a
    particular aspect of its mission?" or "Can the
    system be supported logistically in the field?"
    Other issues arise from questions asked about
    system performance or how the system will affect
    other systems with which it must operate.
    Critical issues provide focus and direction for
    the operational test. Identifying the issues is
    analogous to the first step in the systems
    engineering process, that is, defining the
    problem. When critical operational issues are
    properly addressed, deficiencies in the system
    can be uncovered.

11
OT&E
  • COIs are the basis for sub-objectives known as
    Measures of Effectiveness (MOEs), which in turn
    yield actual test measurements called Measures
    of Performance (MOPs). A data-structure sketch
    of this decomposition follows.
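A hedged sketch of that decomposition as a simple data structure; the example
COI, MOE, and MOP names are invented for illustration, not drawn from any
program (Python):

from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class MOP:
    """Measure of Performance: an actual test measurement."""
    name: str
    unit: str
    measured: Optional[float] = None

@dataclass
class MOE:
    """Measure of Effectiveness: a sub-objective derived from a COI."""
    statement: str
    mops: List[MOP] = field(default_factory=list)

@dataclass
class COI:
    """Critical Operational Issue: normally phrased as a question."""
    question: str
    moes: List[MOE] = field(default_factory=list)

coi = COI(
    question="Can the system detect and track threats in time to engage them?",
    moes=[MOE("Timely detection of threat aircraft",
              mops=[MOP("Detection range", "km"),
                    MOP("Time from detection to stable track", "s")])],
)
print(coi.question, "->", [m.statement for m in coi.moes])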

12
OT&E
13
OT&E
  • Test Realism
  • Key issue: if it isn't realistic, it's not good
    OT&E.
  • Realistic, production-representative version.
  • Realistic environment.
  • Realistic tactics and threats.
  • Realistic logistics.
  • Realistic training.
  • Realistic operators - not the "golden crew."

14
Live Fire Testing (LFT)
  • A lot like OT&E.
  • Mandated by Congress.
  • Prior to full-rate production.
  • Appropriate threat weapons are used against our
    systems, configured for combat.
  • Or
  • Our munitions are fired against a threat target.

15
Before IOT&E, data are gleaned wherever possible.
16
Evaluation
  • Remember the definitions of test and evaluation!
  • Establishing and maintaining a clear audit trail
    is essential, from
  • System requirements through critical issues,
  • Evaluation criteria,
  • Test objectives, and
  • Measures of effectiveness.
  • Data are used from all sources, not just tests:
  • Design reviews, inspections, simulations, etc.

17
Evaluation
  • Issues
  • Questions that require answers during the
    acquisition process to support:
  • Development of acquisition strategy.
  • Refining performance requirements and designs.
  • MS decisions.
  • Criteria
  • Standards of assessment for:
  • Technical and operational effectiveness/
    suitability characteristics.
  • Resolution of operational issues.

18
Evaluation
  • Key Performance Parameters (KPPs) from the ORD
    yield:
  • Critical issues.
  • Must be answered before the system's overall
    worth can be estimated or evaluated.
  • Of primary importance to the milestone decision
    authority.
  • Evaluation issues.
  • Technical implies DT, if possible.
  • Operational implies analysis, modeling,
    simulation, inspection, demo, or test.

19
Evaluation
  • Test issues.
  • Issues that cannot or should not be addressed by
    any other method.
  • Criteria: the required level of technical
    performance and operational effectiveness,
    suitability, or supportability.
  • Objective: a goal, a preferred level of
    capability. Not addressed by OT&E.
  • Threshold: the minimum acceptable level of
    capability.
  • The system is not worth having below this level.
  • One function of T&E is to verify meeting the
    threshold (see the scoring sketch below).
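A minimal sketch of scoring a measured result against its threshold and
objective values; the function name and sample numbers are illustrative
assumptions, not program requirements (Python):

def assess(measured, threshold, objective, higher_is_better=True):
    """Classify a test result against its threshold and objective values."""
    if not higher_is_better:
        measured, threshold, objective = -measured, -threshold, -objective
    if measured >= objective:
        return "meets objective"
    if measured >= threshold:
        return "meets threshold"
    return "below threshold"  # system is not worth having below this level

print(assess(measured=92.0, threshold=90.0, objective=95.0))   # availability, %
print(assess(measured=4.2, threshold=5.0, objective=3.0,
             higher_is_better=False))                          # repair time, h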

20
Generic Evaluation Process
21
Is the real thing always better?
22
How to decide?
23
Modeling and Simulation Support Testing
  • Help discover sensitive areas of special
    interest.
  • Can go where real testing cannot.
  • Can be updated to improve validity based on
    selected test data (see the sketch below).
  • Can be used for design, test, and operations.
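A hedged sketch of a model being updated from a few real test points: a toy
Monte Carlo miss-distance simulation whose aim-error spread is recalibrated
from observed shots; all values are illustrative assumptions (Python):

import random

def simulate_hit_probability(aim_error_sd, n=100_000, seed=1):
    """Estimate P(miss distance < 5 m) for a simple 2-D Gaussian aim-error model."""
    rng = random.Random(seed)
    hits = sum(
        (rng.gauss(0, aim_error_sd) ** 2 + rng.gauss(0, aim_error_sd) ** 2) ** 0.5 < 5.0
        for _ in range(n)
    )
    return hits / n

# Pre-test assumption for the aim-error spread (metres)
print("predicted P(hit):", simulate_hit_probability(aim_error_sd=3.0))

# Recalibrate the spread from miss distances seen in a few live shots (made-up data)
observed = [2.1, 4.8, 3.3, 1.7, 5.2]
calibrated_sd = (sum(r * r for r in observed) / (2 * len(observed))) ** 0.5
print("calibrated P(hit):", simulate_hit_probability(aim_error_sd=calibrated_sd))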

24
  • The final consists of two sections.
  • An in-class section (25%)
  • 25 questions similar to the midterms.
  • Mostly concerning topics since the 2nd midterm.
  • A four-question essay section, which you prepare
    and bring to the final (75%)
  • Use MS Word and 12- or 14-point Times New Roman
    font. Both substance and grammar will be graded.
    Each person's offering must be unique.
  • The number of words is approximate.

25
  • Essay portion of final exam
  • (1) What is the role of simulation and modeling
    in the IT process? (250 words)
  • (2) What is the role of simulation and modeling
    in the T&E process? (250 words)

26
  • Essay portion of final exam
  • (3) Pretend I'm your boss. Convince me to change
    our company's approach to IT. Tell me what's
    wrong with our IT. Provide pros and cons of how
    you would revise it, and then explain why the
    pros outweigh the cons. Show how this revised
    approach will improve our overall systems
    engineering flow and ultimately provide a better
    product. Include suggested organizational
    changes that will be required to implement this
    change. (250 words, plus a table and/or chart)

27
  • Essay portion of final exam
  • (4) Discuss the pitfalls, dangers, and risks of
    the "we'll make up lost time" and "we'll fix the
    problems during IT" mentality. (250 words)
  • The remaining portion of the final will be
    administered in the same fashion as the
    midterms. After completing the remainder of the
    final, attach your essays to it for one combined
    submittal.