DO178C Overview

Transcript and Presenter's Notes
1
DO178C Overview
  • Duncan Brown - Rolls Royce
  • Nick Tudor - QinetiQ

2
Agenda
  • Why we need a new document
  • Structure of the Special Committee
  • Progress to date
  • Specific overview of SG6 Formal Methods

3
Why we need a new document
4
The DO-178B / ED-12B
  • Takes into account the inputs, constraints, and
    requirements from all the stakeholders
  • Consensus between airframe manufacturers,
    equipment suppliers, and certification authorities
  • DO-178B / ED-12B was written as much as possible
    as a requirements-oriented document
  • Tries not to be prescriptive about the means ->
    less sensitive to technology evolution
  • DO-178B / ED-12B is a process-oriented
    document
  • In 1992, we could not imagine getting sufficient
    assurance on the software product just by
    assessing it
  • More than 10 years of use have not revealed major
    safety flaws

5
So, why change?
  • Because the FAA wants to...
  • But also because we need it: DO-178B / ED-12B
    was released in 1992
  • In 1992, software engineering was 24 years old...
  • In 2005, software engineering is 50% older than
    it was in 1992
  • Changing for a better consensus, taking into
    account
  • Legacy from the clarification group
  • Lessons learnt in applying DO-178B / ED-12B
  • Newly available industrial means

6
A number of issues that need to be resolved
  • EUROCAE WG 52/RTCA SC 190 main points
  • Should safety-specific considerations be
    addressed in DO-178?
  • Configuration control requirements too high for
    tools
  • Integration process vs. integration testing
  • Finding common-mode errors not really addressed
  • Not goal-oriented enough
  • DO-178B/ED-12B forces the applicant to address
    the objectives directly, which may not be
    applicable for a given technology
  • Objectives in the Annex tables are not all
    objectives; some are specific means of compliance
    (MC/DC), so alternative means of compliance
    are not feasible
  • COTS issue not addressed in the same way in
    DO-178B and DO-278
  • Recent development shows us that these issues are
    not theoretical

7
Fall 2004 RTCA Ad-hoc SW Meeting
  • Ad-hoc SW meeting convened by RTCA in October
  • US members: FAA, NASA, Boeing, Honeywell,
    Rockwell, PW, United Technologies, Avidyne,
    consultants, ...
  • European guests (D. Hawken, P. Heller, R. Hannan,
    G. Ladier)
  • 161 issues identified on DO-178B
  • 71 issues: clarification or consistency
  • 90 issues: new guidance, helpful or needed for
    most of them
  • Amongst the top 20 issues (with the meeting's
    notes on each):
  • Model based development: need method to validate
    models, need expressive requirements languages to
    be used to verify models. Do we now also need
    guidance on how to build models, validate models,
    etc.?
  • Development tool qualification criteria: the
    criteria for development tool qualification are
    too rigorous.
  • Analyzing computer resource usage and WCET:
    memory and timing margins and WCET are difficult
    to analyze, and no "objective" addresses WCET
    (illustrated in the sketch after this list).
  • Separate document for CNS/ATM: consider adding
    the DO-278 criteria to DO-178.
  • Security guidance: really needs to be addressed
    in the safety assessment and ARP 4754/4761, then
    flow down to the software.
  • Static verification: Section 6 of DO-178B
    requires "testing", but there are now other
    techniques that are more exhaustive. When
    analyzing source code using tools, it is not
    clear how to address structural coverage and dead
    code issues.
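To make the WCET issue concrete: execution time is generally
data-dependent and target-dependent, so a worst-case bound cannot
simply be measured by test. A minimal illustrative C sketch (the
function and names are hypothetical, not from the committee
material):

    #include <stddef.h>
    #include <stdint.h>

    /* Hypothetical routine. Its iteration count depends on input
       data, so its execution time is data-dependent: measuring a
       few runs does not establish the worst case. A WCET analysis
       must prove a bound on `len` and account for target effects
       (cache, pipeline) that the source code does not show. */
    int32_t accumulate(const int32_t *samples, size_t len)
    {
        int32_t sum = 0;
        for (size_t i = 0; i < len; i++) {  /* bound known only from analysis */
            sum += samples[i];
        }
        return sum;
    }

This is why the meeting noted that no existing objective directly
addresses WCET: the review, analysis and test objectives do not force
such a bound to be established.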

8
Terms of reference
  • The ad-hoc SW group has proposed to
  • Maintain the current objective-based approach for
    SW assurance.
  • Modify (minimum changes) DO-178B/ED-12B ->
    DO-178C/ED-12C.
  • Develop rationale for each objective and package
  • Develop guidelines that provide information
    (-> DO-248B/ED-94B)
  • Develop supplements to document
    technology-specific or method-specific guidance.

In American English, "Guidance" is stronger than
"Guidelines". In French, "Guidance" is "Directives"
and "Guidelines" are "Recommandations".
  • Supplements
  • May provide alternate means to satisfy
    DO-178C/ED-12C objectives
  • Possible supplements: tool qualification,
    model-based development, object-oriented
    technology, formal methods, ...
  • On this basis, RTCA and EUROCAE agreed to
    commence new committees

9
Structure of the Special Committee/Working Group
  • Joint between EUROCAE and RTCA

10
WG-71/SC-205 Structure
Joint Chairs: Gerard Ladier (Airbus) and Jim Krodel (PW)
Joint Secretaries: Ross Hannan (Sigma) and Mike DeWalt (CSI)
FAA Rep: Barbara Lingberg (FAA)
Joint committee WG-71/SC-205 Executive Committee, with sub groups:
  SG1 Sub Group Coordination
  SG2 Issue Review & Rationale
  SG3 Tools
  SG4 Model Based Design
  SG5 Object Oriented Technology
  SG6 Formal Methods
  SG7 Safety & CNS/ATM
Membership from airframers, avionics suppliers,
certification authorities, engine manufacturers,
CNS/ATM specialists, the space community, and consultants
11
Progress to date
  • Joint between EUROCAE and RTCA

12
Sub Group 1: Document Integration. Chairs: Tom
Ferrell (Ferrell and Associates Consulting)
and Ron Ashpole (Bewicks Consulting)
  • Focussing on issues within the current document, e.g.:
  • Annex A tables do not accurately reflect the real
    requirements in the main document. This causes
    people who focus only on the Annex tables to have
    issues with the DER
  • Information Paper system for managing work
    products
  • Technology Supplement Template
  • Hooks in DO-178C core document to link in
    Technology Supplements
  • Changes to core document such as Errata

13
Sub Group 2: Issues & Rationale. Chairs: Will
Struck (FAA) and Ross Hannan (Sigma Associates)
  • Believe it or not, they lost the rationale!
  • Discussion on why we use MC/DC (or not); see the
    illustration after this list
  • Will coordinate activities relating to dead and
    deactivated code (different for some
    technologies?)
  • Rationale for objectives should be released to
    the WG in May 07
  • Also new objectives will be set as a result
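As an illustration of what MC/DC demands (a hypothetical example,
not from the SG2 material): each condition in a decision must be
shown to independently affect the decision's outcome, which for N
conditions can be achieved with as few as N+1 test cases.

    #include <assert.h>
    #include <stdbool.h>

    /* Hypothetical decision with three conditions. */
    static bool decision(bool a, bool b, bool c)
    {
        return a && (b || c);
    }

    int main(void)
    {
        /* A minimal MC/DC set: 4 cases for 3 conditions. Each pair
           noted below varies exactly one condition and flips the
           outcome, showing that condition's independent effect. */
        assert(decision(true,  true,  false) == true);  /* vs. next: shows a */
        assert(decision(false, true,  false) == false);
        assert(decision(true,  false, false) == false); /* vs. first: shows b */
        assert(decision(true,  false, true)  == true);  /* vs. previous: shows c */
        return 0;
    }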

14
Sub Group 3: Tool Qualification. Chairs: Leanna
Rierson (Digital Safety Consulting) and Frederick
Pothon (ACG Solutions)
  • Likely new approach for tool qualification.
  • Aim is to develop a qualification approach that
    meets the needs of development tools and keeps the
    current approach for current classes of
    verification tools (as far as possible),
  • enables re-use of tool qualification data,
  • allows emerging tool technologies,
  • identifies users and developers,
  • is objective based,
  • can be used by multiple domains (system/software).
  • Propose a re-write of section 12.2 with the aim
    of assessing the impact of the tool rather than
    determining the level directly.
  • New DO-XXX/ED-XXX document: objective based, by
    level
  • Has some merit in that tool manufacturers don't
    necessarily have to be software certification
    experts.

15
Sub Group 4: Model Based Design (&
Verification). Chairs: Mark Lillis (Hamilton
Sundstrand) and Pierre Lionne (EADS)
  • Split in agendas
  • Some wish to do things because they have a
    technology
  • Others wish to go back to first principles (as
    advised by Exec)
  • An opportunity is being lost, as nothing abstract
    about what needs to be demonstrated by any MBD is
    being discussed
  • Not addressing syntax/semantics
  • Nothing said about relationship to existing
    objectives
  • Diving into low level issues
  • Biggest discussion topics to date have been:
  • What is the difference between High-Level and
    Low-Level requirements?
  • What is source code?

16
Sub Group 5: OO Technologies. Chairs: Jim Chelini
(Verocel Inc) and Peter Heller (Airbus)
  • Unlikely to be much new because of FAA OOTiA work
  • Aim would be to withdraw this following
    completion of DO-178C and supplements
  • However not clear if an OO supplement is required
  • Much initial work was creating FAQs/Discussion
    Papers for DO-248 and minor changes for DO-178C.
    This was challenged, as the intent of the working
    group is to consolidate guidance.
  • Will address some structural coverage issues
    specific to OO and polymorphism (see the sketch
    after this list)
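To see why OO needs its own structural coverage discussion: with
polymorphism, one fully covered call site can dispatch to any of
several method bodies. A minimal sketch, using a C function-pointer
table as a stand-in for virtual dispatch (names hypothetical):

    #include <stdio.h>

    /* Stand-in for a class with one virtual method. */
    typedef struct Shape Shape;
    struct Shape {
        double (*area)(const Shape *self);  /* "virtual" method */
        double w, h;
    };

    static double rect_area(const Shape *s) { return s->w * s->h; }
    static double tri_area(const Shape *s)  { return 0.5 * s->w * s->h; }

    /* Executing this once gives 100% statement coverage of report(),
       yet only one of the two area() implementations has run. */
    static void report(const Shape *s)
    {
        printf("area = %f\n", s->area(s));
    }

    int main(void)
    {
        Shape r = { rect_area, 3.0, 4.0 };
        report(&r);  /* tri_area is never executed */
        return 0;
    }

An OO-aware coverage criterion has to say whether each possible
binding at a call site must be exercised.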

17
Sub Group 6: Formal Methods. Chairs: Kelly
Hayhurst (NASA) and Duncan Brown (Rolls-Royce)
  • Established a definite need for a Formal Methods
    technology supplement (a sketch of the kind of
    approach involved follows this list)
  • Decided to separate the case studies and tutorial
    information into a discussion paper
  • Proposed a rewrite of Section 6 (Verification)
    because it makes the use of technology supplements
    easier
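For flavour, one form a formal-methods alternative to some Section 6
review and analysis activities could take: source code annotated with
machine-checkable contracts that a proof tool discharges. A minimal
sketch in ACSL, the annotation language of the Frama-C toolset; the
example is illustrative, not something the deck prescribes:

    /*@ requires 0 <= n <= 10000;
        assigns \nothing;
        ensures \result == n * (n + 1) / 2;
    */
    int sum_to(int n)
    {
        int sum = 0;
        int i;
        /*@ loop invariant 1 <= i <= n + 1;
            loop invariant sum == (i - 1) * i / 2;
            loop assigns i, sum;
            loop variant n - i;
        */
        for (i = 1; i <= n; i++) {
            sum += i;
        }
        return sum;
    }

A prover that discharges these annotations establishes compliance of
the code with this (low-level) requirement for all in-range inputs,
which is the kind of claim a supplement would have to relate to the
existing Section 6 objectives.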

18
Sub Group 7: CNS/ATM Safety. Chairs: Don Heck
(Boeing) and David Hawken (National Air Traffic
Services Limited)
  • Issues surrounding merge of DO-278 (CNS/ATM) and
    DO-178 (Airborne)
  • Likely to happen
  • Domain-specific sections where applicable
    (annexes?)
  • Links to system safety considerations
  • Decided that there is not to be a Level A

19
Why section 6 must change
20
The Problem
  • DO-178B Section 6 calls explicitly for Review,
    Analysis and Test rather than setting out the
    objectives for verification and leaving the
    applicant to create the appropriate verification
    plan.
  • However, specific methods are deemed the only way
    to meet specific objectives.
  • There are issues with specific technologies such
    as Formal Methods, OO and MBD with respect to
    how the current section 6 objectives can be met.
  • The existing material can benefit from additional
    clarification.

21
The Proposal
  • To re-arrange the DO-178B section 6 material
    according to life cycle data.
  • To explicitly state the objectives for the
    verification of each life cycle data item.
  • To retain the complete testing process under the
    verification of executable object code.
  • To generalise the wording to use "verification"
    instead of specific methods (Review, Analysis &
    Test).
  • To ensure that DO-178C on its own is as forceful
    with regard to testing as DO-178B.

22
The Verification Process
[Diagram: the DO-178B Annex A verification objectives mapped onto
the life cycle data flow. Development objectives (Table A-2) label
the transitions; verification objectives apply to each data item
and to its relationship with its parent data.]

System Requirements
  | (development: A-2 objectives 1, 2)
High-Level Requirements
  against System Requirements: A-3.1 Compliance, A-3.6 Traceability
  of the HLR themselves: A-3.2 Accuracy & Consistency,
    A-3.3 HW Compatibility, A-3.4 Verifiability,
    A-3.5 Conformance, A-3.7 Algorithm Accuracy
  | (development: A-2 objectives 3, 4, 5)
Software Architecture and Low-Level Requirements
  against HLR: A-4.1 Compliance, A-4.6 Traceability,
    A-4.8 Architecture Compatibility
  of the Architecture: A-4.9 Consistency, A-4.10 HW Compatibility,
    A-4.11 Verifiability, A-4.12 Conformance,
    A-4.13 Partition Integrity
  of the LLR: A-4.2 Accuracy & Consistency, A-4.3 HW Compatibility,
    A-4.4 Verifiability, A-4.5 Conformance,
    A-4.7 Algorithm Accuracy
  | (development: A-2 objective 6)
Source Code
  against LLR: A-5.1 Compliance, A-5.5 Traceability
  against Architecture: A-5.2 Compliance
  of the Source Code itself: A-5.3 Verifiability, A-5.4 Conformance,
    A-5.6 Accuracy & Consistency
  | (development: A-2 objective 7)
  (compliance with requirements; conformance with standards)
Executable Object Code
  against HLR: A-6.1 Compliance, A-6.2 Robustness
  against LLR: A-6.3 Compliance, A-6.4 Robustness
  A-6.5 Compatible With Target
  integration outputs: A-5.7 Complete & Correct
23
The Verification Process: Level A
[Same diagram as slide 22, with the objectives applicable at
Level A highlighted; the highlighting is not recoverable from
this transcript.]
24
The Verification Process: Level B
[Same diagram as slide 22, with the objectives applicable at
Level B highlighted; the highlighting is not recoverable from
this transcript.]
25
The Verification Process: Level C
[Same diagram as slide 22, with the objectives applicable at
Level C highlighted; the highlighting is not recoverable from
this transcript.]
26
The Verification Process: Level D
[Same diagram as slide 22, with the objectives applicable at
Level D highlighted; the highlighting is not recoverable from
this transcript.]
27
The Verification Process: Level E
Executable Object Code
28
Comparison of Old -> New
Old (DO-178B):
  • 6.0 SOFTWARE VERIFICATION PROCESS
  • 6.1 Software Verification Process Objectives
  • 6.2 Software Verification Process Activities
  • 6.3 Software Reviews and Analyses
  • 6.3.1 Reviews and Analyses of the High-Level
    Requirements
  • a. Compliance with system requirements
  • b. Accuracy and consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Traceability
  • g. Algorithm aspects
  • 6.3.2 Reviews and Analyses of the Low-Level
    Requirements
  • a. Compliance with high-level requirements
  • b. Accuracy and consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Traceability
New (proposed DO-178C):
  • 6.0 SOFTWARE VERIFICATION PROCESS
  • 6.1 Software Verification Process Objectives
  • 6.2 Software Verification Process Activities
  • 6.3 Detailed Guidance for Verification Activities
  • 6.3.1 Verification Activities for the High-Level
    Requirements
  • a. Compliance with system requirements
  • b. Accuracy and consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Traceability
  • g. Algorithm aspects
  • 6.3.2 Verification Activities for the Low-Level
    Requirements
  • a. Compliance with high-level requirements
  • b. Accuracy and consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Traceability

29
Comparison of Old -> New
New (proposed DO-178C):
  • 6.3.3 Verification Activities for the Software
    Architecture
  • a. Compliance with high-level requirements
  • b. Consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Partitioning integrity
  • 6.3.4 Verification Activities for the Source
    Code
  • a. Compliance with low-level requirements
  • b. Compliance with the software architecture
  • c. Verifiability
  • d. Conformance to standards
  • e. Traceability
  • f. Accuracy and consistency
  • 6.3.5 Verification Activities for the Executable
    Object Code
  • a. Completeness and correctness
  • b. Compliance with the high-level requirements
  • c. Robustness for high and low-level requirements
  • d. Compliance with the low-level requirements
Old (DO-178B):
  • 6.3.3 Reviews and Analyses of the Software
    Architecture
  • a. Compliance with high-level requirements
  • b. Consistency
  • c. Compatibility with the target computer
  • d. Verifiability
  • e. Conformance to standards
  • f. Partitioning integrity
  • 6.3.4 Reviews and Analyses of the Source Code
  • a. Compliance with low-level requirements
  • b. Compliance with the software architecture
  • c. Verifiability
  • d. Conformance to standards
  • e. Traceability
  • f. Accuracy and consistency
  • 6.3.5 Reviews and Analysis of the Outputs of the
    Integration Process

30
Comparison of Old -> New
New (proposed DO-178C):
  • 6.3.6 Verification Activities for the Analyses,
    Test Cases, Procedures and Results
  • a. Analysis and Test cases
  • b. Analysis and Test procedures
  • c. Analysis and Test results
  • 6.3.6.1 Coverage Analysis
  • 6.3.6.1.1 Requirements Coverage Analysis
  • 6.3.6.1.2 Structural Coverage Analysis
  • 6.3.6.1.3 Structural Coverage Analysis Resolution
Old (DO-178B):
  • 6.3.6 Reviews and Analyses of the Test Cases,
    Procedures, and Results
  • 6.4 Software Testing Process
  • 6.4.1 Test Environment
  • 6.4.2 Requirements-Based Test Case Selection
  • 6.4.2.1 Normal Range Test Cases
  • 6.4.2.2 Robustness Test Cases (the normal range /
    robustness distinction is illustrated after this
    outline)
  • 6.4.3 Requirements-Based Testing Methods
  • 6.4.4 Test Coverage Analysis
  • 6.4.4.1 Requirements-Based Test Coverage Analysis
  • 6.4.4.2 Structural Coverage Analysis
  • 6.4.4.3 Structural Coverage Analysis Resolution
The slide's transfer annotations map the old Section 6.4
material into the new structure: to sections 6.3.5.1,
6.3.5.2, 6.3.5 (four items), 6.3.6.1, 6.3.6.1.1,
6.3.6.1.2 and 6.3.6.1.3.
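The normal range vs. robustness split (old 6.4.2.1/6.4.2.2, retained
in the new structure) in miniature; the requirement and values here
are hypothetical, chosen only to show the two kinds of
requirements-based test case:

    #include <assert.h>

    /* Hypothetical requirement: clamp a commanded value to [0, 100]. */
    static int clamp_percent(int value)
    {
        if (value < 0)   return 0;
        if (value > 100) return 100;
        return value;
    }

    int main(void)
    {
        /* Normal range test cases (6.4.2.1): valid inputs and boundaries. */
        assert(clamp_percent(0)   == 0);
        assert(clamp_percent(55)  == 55);
        assert(clamp_percent(100) == 100);

        /* Robustness test cases (6.4.2.2): abnormal inputs, checking for
           a defined, safe response. */
        assert(clamp_percent(-40)  == 0);
        assert(clamp_percent(2500) == 100);
        return 0;
    }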

31
(No Transcript)
32
(No Transcript)
33
(No Transcript)
34
Major Comments Raised
  • The paper lowers the bar for testing
    significantly (to zero!)
  • Review and analysis are the only applicable
    methods for verification of higher-level life
    cycle data.
  • Testing is the only applicable method for meeting
    the verification of the executable object code.

35
Summary
  • Latest Revision of Paper emphasises the reliance
    on testing where there are no other accepted
    means for verification.
  • DO-178C used alone needs to be as forceful in the
    need for testing as DO-178B.
  • It now says that testing is necessary to ensure
    that the executable object code is compatible
    with the target computer
  • Only the use of approved guidance in conjunction
    with DO-178C could alter the amount of testing
    required.
  • Paper to be agreed by SG6 at interim meeting
    mid-year.
  • Plenary consensus to be sought in Vienna in the
    Autumn.
  • There is significant backing for this paper.
  • This provides a way to use Formal Methods as part
    of certification. The technology supplement will
    provide the "how".

36
IP 601 Rev B (Draft)