1
(No Transcript)
2
  • Lawrence L. Green,
  • Thomas A. Zang, Steve R. Blattnig, Michael J.
    Hemsch,
  • James M. Luckring, Joseph H. Morrison, Ram K.
    Tripathi,
  • NASA Langley Research Center
  • Lawrence.L.Green@nasa.gov, 757-864-2228
  • April 5, 2006
  • Outline
  • Genesis
  • Scope
  • Relationship to Other Standards
  • Approach & Status
  • Standard Highlights
  • Summary

3
Genesis
4
Response to Columbia Accident
5
CAIB Recommendation R3.8-2 (http://caib.nasa.gov/)
  • "Develop, validate, and maintain physics-based computer models to evaluate Thermal Protection System damage from debris impacts. These tools should provide realistic and timely estimates of any impact damage from possible debris from any source that may ultimately impact the Orbiter. Establish impact damage thresholds that trigger responsive corrective action, such as on-orbit inspection and repair, when indicated."
  • One of many Recommendations, Observations, and
    Findings

6
Diaz Team Conclusion (http://www.nasa.gov/pdf/55691main_Diaz_020204.pdf)
  • Identify those CAIB Report elements with
    Agency-wide applicability
  • "All programs should produce, maintain, and validate models to assess the state of their systems and components. These models should be continually updated and validated against experimental and operational data to determine appropriate courses of action and repair. The value of the models should be assessed with respect to their ability to support decision making in a timely way so as not to lead the decision maker to a conflict between costly action versus effective action in the interest of safety or mission success."

7
Diaz Action 4 Requirements
  • Develop a standard for the development,
    documentation, and operation of models and
    simulations
  • Identify best practices to ensure that knowledge of operations is captured in the user interfaces (e.g., users are not able to enter parameters that are out of bounds; see the sketch after this list)
  • Develop a process for tool verification and validation, certification, reverification, revalidation, and recertification based on operational data and trending
  • Develop a standard for documentation, configuration management, and quality assurance
  • Identify any training or certification
    requirements to ensure proper operational
    capabilities
  • Provide a plan for tool management, maintenance,
    and obsolescence consistent with modeling/
    simulation environments and the aging or changing
    of the modeled platform or system
  • Develop a process for user feedback when results
    appear unrealistic or defy explanation
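
The out-of-bounds requirement above invites a small illustration. The following Python sketch is hypothetical (the parameter names and limits are invented, not taken from any NASA tool) and shows one way a user interface could capture operational knowledge by refusing out-of-bounds inputs:

```python
# Hypothetical sketch: parameter names and limits are invented for
# illustration, not drawn from any NASA system.
from dataclasses import dataclass

@dataclass(frozen=True)
class Bound:
    lo: float
    hi: float

# Operational limits as captured from domain experts (assumed values).
LIMITS = {
    "mach": Bound(0.0, 25.0),
    "altitude_km": Bound(0.0, 120.0),
}

def validate_inputs(params: dict) -> None:
    """Reject any parameter outside its documented operational bounds."""
    for name, value in params.items():
        bound = LIMITS.get(name)
        if bound is None:
            raise KeyError(f"unknown parameter: {name!r}")
        if not (bound.lo <= value <= bound.hi):
            raise ValueError(
                f"{name}={value} is outside operational bounds "
                f"[{bound.lo}, {bound.hi}]"
            )

validate_inputs({"mach": 5.0, "altitude_km": 30.0})    # accepted
# validate_inputs({"mach": 40.0, "altitude_km": 30.0}) # raises ValueError
```

Centralizing the limits in one table keeps the interface and the documented bounds from drifting apart.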

8
Diaz Action Summary
  • The NASA HQ Office of the Chief Engineer has
    responsibility for most of the Diaz Actions
  • The NASA Langley team was commissioned by the
    NASA Office of the Chief Engineer in May 2005 to
    respond to Diaz Action 4
  • All Actions were slated to be closed by April
    2006
  • The Diaz Actions were reviewed by the NASA HQ Program Analysis & Evaluation Office in September 2005; many were stopped at that time
  • Diaz Action 4 was continued through April 2006
  • Detailed oversight transitioned in FY06 from NASA
    HQ to the NASA Engineering and Safety Center (at
    NASA Langley)

9
Scope
10
Out of Scope
  • Control Systems: SW that implements algorithms for controlling systems and subsystems
  • Displays: SW that implements algorithms for user interaction with control systems

In Scope
  • The stimulation environment for Control System and Display SW
  • Focus on what needs to be done, not how to do it (HLA addresses the "how")
11
  • Not yet explicitly considered
  • Latency effects
  • Distributed, Human-in-the-Loop, and Hardware-in-the-Loop Simulations

In Scope: Models and Simulations
  • Modeling: a physical, mathematical, or otherwise logical representation of a system, entity, phenomenon, or process
  • Simulation: a method for implementing a model, including Live, Virtual, and Constructive Simulations
12
Out of Scope
  • Training: produce learning in a user or participant
  • Acquisition: specify and acquire new systems
  • Technology Investment: identify and evaluate candidate advanced technologies for future missions and systems

In Scope
  • Engineering Operations: evaluate status/anomalies/corrective actions in operational systems
  • Test & Evaluation: evaluate/verify hardware & software artifacts
  • Analysis & Design: evaluate and explore solution spaces for current and future systems and subsystems
13
Out of Scope
  • Scientific Data Analysis: processing data from scientific instruments
  • Scientific Understanding: simulation of natural phenomena used for advancement of scientific knowledge
  • Natural Phenomena Prediction: simulations for which other agencies have primary responsibility, for example, Earth Weather

In Scope
  • Natural Phenomena Prediction: simulation of natural phenomena used for operational decisions affecting safety and mission success and for informing the public and policy makers
  • Predictions of the operational environment that have a direct impact on the safety of personnel and NASA, and for which NASA has primary responsibility, for example, Space Weather
14
Relationship to Other Standards
  • "Standards" here means Policies, Processes & Requirements, Standards, Guidelines, Recommended Practices, etc.
  • Standards contain specific "shall" and "should" requirements and recommendations

15
Related NASA Policies, Requirements, Standards & Guidelines
16
Comments on Current NASA Standards
  • Current NASA standards are strongly oriented
    towards control systems and displays
    SW
  • NASA has existing or imminent NPDs, NPRs and
    Standards that cover many of the generic software
    engineering aspects of the Diaz 4 requirements,
    especially
  • Quality Assurance and
  • Configuration Management
  • The unique, critical aspects of Models and Simulations (M&S) are not addressed, especially
  • development of models
  • validation against experimental or flight data
  • uncertainty quantification
  • operations and maintenance of M&S
  • Embedding M&S policy in existing SW Engineering/QA policies would create an undue burden for acceptance of the M&S Standard
  • the sheer volume and diversity of these documents intimidates M&S readers
  • the language is foreign to M&S practitioners

17
Observations on Other Agency Standards
  • Neither Sandia nor the Dept. of Energy in general has an M&S standard
  • The Nuclear Regulatory Commission standards (like NASA's) are strongly oriented towards control systems & displays, and the unique, critical aspects of models and simulations are not addressed
  • Although the Dept. of Defense has numerous directives, instructions, guidebooks, etc., it has no M&S standard in the Diaz Action 4 sense
  • The following documents from non-NASA sources
    address some of the gaps in existing NASA
    standards
  • Concepts for Stockpile Computing from Sandia
  • VV&A Recommended Practices Guide from DoD (DMSO)
  • AIAA Guide for Verification and Validation of
    Computational Fluid Dynamics Simulations
  • ASME Guide for Verification and Validation in
    Computational Solid Mechanics (in final review)
  • Existing guidance is overwhelmingly focused on the development of M&S; very little guidance is available on operations and maintenance of M&S

18
DoD Hierarchy of M&S Guidance

EO 13231, Executive Order on Critical Infrastructure Protection: calls for "the development of effective models, simulations, and other analytic tools."

DoD Directive on Modeling and Simulation (M&S) Management (DoDD 5000.59): Establishes DoD policy, assigns responsibilities, and prescribes procedures for the management of M&S. Establishes the DoD Executive Council for M&S and the Defense M&S Office (DMSO). See also:
  • DoD Directive 5000.1, Defense Acquisition
  • DoD Directive 8320.1, DoD Data Administration
  • DoD Directive 8000.1, Defense Information Management (IM) Program
  • DoD Directive 5134.3, Director of Defense Research and Engineering (DDR&E)
  • DoD 5000.2-R, Under Sec. Def. Memorandum, Mandatory Procedures for Major Defense Acquisition Programs (MDAPs) and Major Automated Information Systems (MAIS) Acquisition Programs

DoD Modeling and Simulation (M&S) Glossary (DoD 5000.59-M)

DoD M&S Master Plan (DoD 5000.59-P)

DoD Instruction on Modeling and Simulation (M&S) Verification, Validation, and Accreditation (VV&A) (DoDI 5000.61): Implements policy, assigns responsibilities, and prescribes procedures under DoDD 5000.59 for the VV&A of DoD M&S and their associated data. Also authorizes publication of DoD 5000.61-G, "DoD Verification, Validation, and Accreditation Guide," consistent with DoD 5025.1-M.

DoD Verification, Validation, and Accreditation Guide (DoD 5000.61-G)

Service examples:
  • Navy M&S Management (OPNAVINST 5200.34)
  • Department of the Navy M&S Management (SECNAVINST 5200.38A)
  • VV&A of M&S (SECNAVINST 5200.40)
  • Army, Navy, Marine, and Air Force Policy & Procedures Manuals, M&S Master Plans, Standards & Recommended Practice Guidebooks, etc. (Example: 67 Army M&S Standards)
19
NASA Hierarchy of Guidance Documents
  • NASA Policy Directive (NPD)
  • Documents NASA policy, responsibilities, and authorities.
  • Describes the "what" required by management to achieve NASA's vision and mission.
  • NASA Procedural Requirements (NPR)
  • Establish requirements and procedures to implement NASA policies.
  • NASA Technical Standard
  • Establishes uniform engineering and technical requirements for processes, procedures, practices, and methods that have been adopted as standard, including requirements for selection, application, and design criteria of an item.
  • NASA Guidelines
  • Provide instructions and data for the application of standards and recommended practices, procedures, and methods.
  • Includes Recommended Practices and preliminary Standards.
20
Potential Hierarchy of NASA M&S Guidance
  • Program/Project Management (NPD 7120.4)
  • Program and Project Management Processes and Requirements (NPR 7120.5C)
  • Systems Engineering Processes and Requirements (recently approved)
  • Models and Simulations Processes and Requirements (NPR; does not yet exist)
  • Models and Simulations Standard (draft, Diaz Action 4)
  • Models and Simulations Guidelines (RPs; to be developed)
21
Diaz Action 4 Deliverables
  • Documented draft M&S Standard meeting Diaz Action 4 requirements, submitted to the NASA Technical Standards Working Group (July 2006)
  • Proposed high-level content for related NPDs, NPRs, and M&S Guidelines
  • Recommendations for deployment
  • training plan
  • process for monitoring use of standard
  • plan for implementation of standard
  • process maintenance plan
  • plan for SW maintenance and for SW usage
  • Note
  • M&S Standard ≠ Verification, Validation & Accreditation (VV&A) / Uncertainty Quantification (UQ) Standard
  • However, VV&A and UQ are important components of an M&S Standard

22
Approach & Status
23
Basic Philosophy of the M&S Standard
  • Objective
  • Establish a minimum set of requirements and recommendations for using Modeling and Simulation (M&S) to support critical decisions
  • Assume that the design, development, verification, validation, operation, maintenance, and retirement of M&S will adhere to established NASA requirements for Software Engineering and Configuration Management (CM)
  • Address those aspects of documentation and CM that are unique to M&S
  • Use the concepts and terminology familiar to the M&S community
  • Require/recommend standard practices for planning, producing, and assessing M&S products during development, operations, and maintenance
  • Emphasize timely, complete, and transparent reporting (products / assessments)
  • Reference existing standards for most software engineering aspects
  • Motivation
  • Effective, credible risk reduction
  • Key Steps
  • Planning, documentation & reporting
  • Defensible confidence building
  • Defensible uncertainty quantification

24
Status
  • A very rough first draft was circulated to
    several parties for comments in October 2005
  • The comments have led us to undertake a drastic overhaul for the forthcoming second draft (in progress)
  • Topic Working Group being formed
  • Have representation from GRC and MSFC
  • Need representation from ARC, JSC, KSC, and LaRC
  • The second draft will be circulated much more widely for comments, expected to occur within the next month
  • Changes to the second draft based on these
    comments will be the final deliverable to the
    NASA Technical Standards Working Group
  • High-level recommendations for the content of the necessary NPD linkage, NPR, and Guidebook will be included

25
Outline of NASA Standard for M&S
  • 1. Scope
  • 1.1 Purpose
  • 1.2 Applicability
  • 2. Applicable Documents
  • 3. Acronyms and Definitions
  • 4. Requirements and Recommendations
  • 4.1 Program and Project Management Roles and Responsibilities for Modeling and Simulation
  • 4.2 Model
  • 4.3 Simulation and Analysis
  • 4.4 Verification and Validation / Uncertainty Quantification (VV/UQ)
  • 4.5 Recommended Practices
  • 4.6 Training and Certification
  • 4.7 Reporting to Decision Makers
  • 5. Guidance (Reference Documents & Keywords)

26
NASA Standard for M&S
  • Definitions
  • Modeling - the process of developing conceptual, mathematical, or computational models
  • Simulation - the process of executing a model
  • Purpose - reduce risk associated with decisions based on M&S
  • Make M&S processes transparent to decision makers
  • Make M&S repeatable by a subject matter expert
  • Make M&S limitations apparent to people making decisions
  • Allow M&S to be evaluated against requirements
  • Enable M&S credibility to be reported to decision makers
  • Scope
  • Applies whenever M&S will be used for critical decisions
  • Fundamental M&S research is not bound by this standard, but researchers should consider the trade-offs between
  • resources required and utility provided (early in development cycle)
  • resources required (late in development cycle, if needed)

27
Program and Project Management Roles
  • Identify roles and responsibilities that are specific to M&S
  • Define the objectives and requirements for M&S products, including
  • Intended use
  • Metrics (technical performance, cost, schedule, safety, etc.)
  • Verification, Validation and Uncertainty Quantification (VV/UQ)
  • Reporting of M&S information for critical decisions
  • Configuration management (artifacts, timeframe, processes)
  • Define the acceptance criteria for M&S products
  • Assess and mitigate the programmatic risks associated with using M&S in critical decisions
  • Identify M&S waiver processes
  • Develop a plan for M&S development, operations, maintenance, and retirement

28
Model
  • Document and/or reference the basic structure and mathematics of the model (i.e., physics included, equations solved, behaviors modeled, etc.)
  • List and provide rationale for simplifying assumptions
  • Document data sets, facilities, support software, etc. used in model development and input preparation
  • Document the results of model validation, including validation metrics
  • Document the limits of applicability of the model
  • Document the uncertainty quantification processes used for all models
  • Document the uncertainty quantification of data used to develop the model and data incorporated in the model
  • Document the requirements for proper use of the model
  • Document updates of the model and assign a unique version description (this includes solution adjustment, change of parameters, calibration, etc.); one way to keep such records machine-readable is sketched after this list
  • Provide a feedback mechanism for users to report unusual results, etc. to model developers
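
As a minimal sketch of the documentation items above, the record below keeps model pedigree machine-readable alongside the model itself; every field name and value here is invented for illustration, not prescribed by the draft standard:

```python
# Minimal sketch: all names and values below are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class ModelRecord:
    name: str
    version: str                   # unique version description
    mathematics: str               # reference to equations solved
    assumptions: tuple             # simplifying assumptions + rationale
    limits_of_applicability: dict  # documented validity ranges
    validation_refs: tuple         # validation studies and metrics

record = ModelRecord(
    name="example_impact_model",
    version="2.1 (recalibrated against new test data)",
    mathematics="see accompanying model document, Eqs. 1-4",
    assumptions=(
        "rigid target assumed: conservative for predicted damage depth",
    ),
    limits_of_applicability={"impact_velocity_mps": (50.0, 900.0)},
    validation_refs=("validation report reference goes here",),
)
```

A frozen record of this kind can sit under configuration management with the model and be updated whenever a new version description is assigned.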

29
Simulation
  • Provide clear statements about
  • limits of operation (range of validity)
  • limits which are monitored in the simulation
  • Where M&S is used for critical NASA decisions and clear statements about limits of operation are not available, the M&S shall be examined for limits of operation and ranges of validity and either
  • retrofitted with execution wrappers to prevent misuse (see the sketch after this list), or
  • have the limits of operation and ranges of validity for the M&S documented and distributed to those responsible for M&S use, analysis, and reporting to decision makers
  • Document the intended use of the simulation, a description of the models used in the simulation, and a high-level summary of the model assumptions, and provide references to the validation data
  • Document the versions of models and simulations
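
One plausible reading of the execution-wrapper idea, sketched in Python under invented names and limits: wrap a legacy simulation entry point so that calls outside the documented range of validity fail loudly, with a message naming the failure mode and the point of failure:

```python
# Hedged sketch: the simulation function and its limits are invented.
import functools

def limits_of_operation(**bounds):
    """Decorator enforcing documented (lo, hi) validity limits per argument."""
    def wrap(sim):
        @functools.wraps(sim)
        def guarded(**kwargs):
            for name, (lo, hi) in bounds.items():
                value = kwargs[name]
                if not (lo <= value <= hi):
                    raise RuntimeError(
                        f"{sim.__name__}: {name}={value} is outside the "
                        f"validated range [{lo}, {hi}]; refusing to run "
                        f"outside the documented limits of operation"
                    )
            return sim(**kwargs)
        return guarded
    return wrap

@limits_of_operation(dynamic_pressure_pa=(0.0, 5.0e4))
def run_loads_case(dynamic_pressure_pa: float) -> float:
    return 1.2 * dynamic_pressure_pa   # stand-in for the real simulation

print(run_loads_case(dynamic_pressure_pa=3.0e4))   # within limits
# run_loads_case(dynamic_pressure_pa=9.9e4)        # raises RuntimeError
```

The same guard also serves the later requirement that the M&S fail in a manner that prevents misuse and report the failure mode.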

30
Simulation (Concluded)
  • Document data (including operational data) used as input to the simulation
  • Pedigree and/or heritage
  • Required and achieved level of accuracy (e.g., uncertainty quantification and/or error bars)
  • References to published data used within the M&S that include sufficient detail about experimental procedure to enable repeatability
  • List and provide rationale for simplifying assumptions
  • Document unique computational requirements (e.g., support software, processor, compilation options)
  • Document the processes for conducting analysis, simulation, and uncertainty quantification for results reported to decision makers
  • Document the relevant characteristics of the real-world system that is modeled
  • Place benchmark M&S test cases within CM
  • Document the computational resources and platform requirements
  • The M&S should fail in a manner that prevents misuse and misleading results
  • The M&S should provide messages which detail the failure mode and point of failure

31
VV / UQ
  • Verification
  • Document verification techniques used
  • Document the verification status of the method pertinent to the intended use
  • Document any parameter calibrations for the intended use
  • Validation
  • Document model validation studies of the method pertinent to the intended use
  • Document any numerical error estimates of the results pertinent to the intended use (see the discretization-error sketch after this list), for example
  • approximations
  • insufficient spatial or temporal resolution
  • insufficient iterative convergence
  • finite precision arithmetic
  • Document the model error estimates of the results pertinent to the intended use, considering
  • uncertainty in input data for the intended application
  • uncertainty in comparison data for the intended application
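
For the numerical error estimates above, one common technique (not prescribed by the draft standard; shown here only as an example) is Richardson extrapolation over systematically refined grids, which yields an observed order of accuracy and a discretization-error estimate. The solution values below are made up for the illustration:

```python
# Illustrative only: grid solution values are fabricated for the example.
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p for a constant grid refinement ratio r."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

def discretization_error(f_medium, f_fine, r, p):
    """Estimated discretization error remaining in the fine-grid solution."""
    return (f_medium - f_fine) / (r**p - 1.0)

f3, f2, f1 = 0.9714, 0.9702, 0.9699   # coarse, medium, fine grid results
p = observed_order(f3, f2, f1, r=2.0)
err = discretization_error(f2, f1, r=2.0, p=p)
print(f"observed order p = {p:.2f}, fine-grid error estimate = {err:.1e}")
```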

32
VV / UQ (Concluded)
  • Uncertainty Quantification
  • Document any processes used to quantify uncertainty of the M&S results (a minimal Monte Carlo sketch follows this list)
  • Document the quantified uncertainties in the M&S results
  • Report uncertainty management practices used on a common scale; candidate scales include
  • NASA Langley proposal
  • DMSO proposal: Validation Process Maturity Model
  • NASA Marshall CFD proposal: Simulation Readiness Levels
  • Dept. of Energy / Sandia National Lab proposal
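
A minimal sketch of one such quantification process, assuming simple Gaussian input uncertainties and a stand-in model (all numbers invented): forward Monte Carlo propagation, which turns documented input uncertainties into a quantified output uncertainty:

```python
# Minimal Monte Carlo sketch; the model and distributions are invented.
import random
import statistics

def model(thickness_mm: float, velocity_mps: float) -> float:
    """Stand-in for a real simulation."""
    return 0.002 * velocity_mps**2 / thickness_mm

random.seed(0)                            # reproducibility for the example
samples = []
for _ in range(10_000):
    thickness = random.gauss(25.0, 1.0)   # assumed input uncertainty
    velocity = random.gauss(240.0, 15.0)  # assumed input uncertainty
    samples.append(model(thickness, velocity))

mean = statistics.fmean(samples)
std = statistics.stdev(samples)
q = statistics.quantiles(samples, n=20)   # cut points at 5% steps
print(f"result = {mean:.2f} +/- {std:.2f}; "
      f"90% interval ~ [{q[0]:.2f}, {q[-1]:.2f}]")
```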

33
Reporting to Decision Makers
  • Reports to decision makers of simulation results shall include an estimate of the uncertainty and documentation of the process used to obtain the estimate (a sketch of this rule as code follows this list). The uncertainty estimate shall be
  • a quantitative estimate of the uncertainty in the
    results
  • a qualitative estimate of the uncertainty in the
    results if a quantitative estimate is not
    available, or
  • a clear statement that no quantitative or
    qualitative estimate is available
  • Reports to decision makers of results generated
    by simulations that were conducted outside the
    intended use of one or more models shall contain
    a prominent statement to this effect
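
The "shall" above defines a strict order of preference. A small sketch of that fallback rule as code (the report fields are invented for illustration):

```python
# Sketch of the reporting fallback; field names are illustrative only.
from typing import Optional

def uncertainty_statement(quantitative: Optional[str],
                          qualitative: Optional[str]) -> str:
    """Return the statement the report must carry, in order of preference."""
    if quantitative is not None:
        return f"Quantitative uncertainty estimate: {quantitative}"
    if qualitative is not None:
        return f"Qualitative uncertainty estimate: {qualitative}"
    return "No quantitative or qualitative uncertainty estimate is available."

print(uncertainty_statement("12 +/- 2 mm (95% interval)", None))
print(uncertainty_statement(None, "expected to be conservative"))
print(uncertainty_statement(None, None))
```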

34
Reporting to Decision Makers (Concluded)
  • Reports to decision makers of results generated by simulations for which any waivers to the training, certification, or VV/UQ requirements were granted shall clearly state the waivers
  • Backup material to reports on simulation results
    should contain a high-level summary of the models
    used, key assumptions, and limits of validity
  • Reports to decision makers of simulation results
    should be documented in the CM system to a
    sufficient degree to permit the results of the
    simulation to be reproduced
  • Reports of results should document deviations
    from established recommended practices
  • Dissenting technical opinions regarding
    recommended actions should be included in the
    reports to decision makers

35
Summary
  • Team chartered to develop a NASA Standard for
    Modeling and Simulation
  • First draft delivered in Oct. 2005
  • Second draft to be circulated for comment within
    the next month
  • Establishes a minimum set of requirements and recommendations for using Modeling and Simulation (M&S) to support critical decisions
  • Enables effective, credible risk reduction
  • Enables defensible confidence building
  • Third draft due to the NASA Technical Standards Working Group in July 2006
  • We welcome comments, suggestions, criticisms,
    etc.
  • Contact Information
  • Lawrence.L.Green@nasa.gov
  • 757-864-2228