Transcript and Presenter's Notes

Title: Software Assurance Metrics and Tool Evaluation


1
Software Assurance Metrics and Tool Evaluation
  • Paul E. Black
  • National Institute of Standards and Technology
  • http://www.nist.gov/
  • paul.black@nist.gov

2
Outline
  • Introduction
  • SRD project
  • SAMATE project structure
  • Future: where do we go now?

3
What is Software Assurance?
  • Activities that ensure that software processes
    and products conform to requirements.
  • (after the NASA Software Assurance Guidebook)
  • Two legs of good software
  • Good Development
  • Good Checking
  • Testing (dynamic)
  • Analysis (static)

4
NIST's role
  • What is NIST?
  • A non-regulatory agency in Dept. of Commerce
  • 3,000 employees in Maryland and Colorado
  • Primarily research, not funding
  • Why NIST?
  • Over a century of experience in standards and
    measurement
  • Involved in security: DES, AES, NVLAP, etc.
  • Trusted, neutral 3rd party

5
The Two Projects
SAMATE: Software Assurance Metrics And Tool Evaluation
SRD: Standard Reference Dataset

6
Outline
  • Introduction
  • SRD project
  • SAMATE project structure
  • Future: where do we go now?

7
SRD Project Goals
  • Identify classes of security flaws and
    vulnerabilities
  • Identify classes of software security assessment
    techniques
  • Document state of the art
  • Develop a Standard Reference Dataset (SRD) of
    clean programs and programs with security flaws

8
SRD Characteristics
  • Small test cases for each flaw
  • Separate "can it detect?" from speed issues
  • Flawed programs and their clean counterparts
    (see the sketch after this list)
  • Very large test cases
  • Confirm speed and maximum size
  • Test cases taken from actual code
  • Nobody can say it would never happen
  • Many different subsets
  • Java, C, web app, OS, Windows, Unix, etc.
  • Ongoing development and additions
  • Submissions from NIST, researchers, vendors
  • Readily usable
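
A minimal sketch of what a flawed/clean test-case pair might look like, assuming a classic C buffer overflow; the names and code are illustrative, not actual SRD entries:

    /* Hypothetical SRD-style test case pair (illustrative only). */
    #include <string.h>

    /* Flawed version: unbounded copy into a fixed-size buffer. */
    void copy_id_bad(const char *input) {
        char buf[16];
        strcpy(buf, input);               /* overflows buf if input has 16+ characters */
    }

    /* Clean counterpart: same interface, bounded copy with explicit termination. */
    void copy_id_good(const char *input) {
        char buf[16];
        strncpy(buf, input, sizeof buf - 1);
        buf[sizeof buf - 1] = '\0';       /* strncpy may leave buf unterminated */
    }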

9
SRD Project Plans
  • Small workshop
  • http://samate.nist.gov/softSecToolsSOA
  • 10-11 August at NIST
  • Publish proceedings as a NIST Special Publication
    and put them in the ACM Digital Library
  • Write a journal article on
  • classes of known software security
    vulnerabilities and
  • the state of the art of security SA tools

10
Outline
  • Introduction
  • SRD project
  • SAMATE project structure
  • Future: where do we go now?

11
DHS Software Assurance Plan
  • 1. PEOPLE (Education/Training)
  • Software-developer-focused training and education
  • 2. PROCESS (Lifecycle, Best Practices,
    Standards)
  • Security throughout the Software Development Life
    Cycle
  • 3. TECHNOLOGY (Tools and R&D)
  • SA tools identification, enhancement, and
    development
  • 4. ACQUISITION (SOW / Procurement language)
  • Embed security requirements in procurement stage

12
The SAMATE Project
  • http://samate.nist.gov/
  • Compendiums (ongoing)
  • Tools
  • Researchers and companies
  • Workshops
  • Aids for tool evaluation
  • Software metrics

13
Workshops
  • Taxonomy of SA functions and techniques
  • Approach (code vs. spec, static vs. dynamic)
  • Software type (distributed, real time, secure)
  • Type of fault detected
  • Which are the most important?
  • Highest cost/benefit ratio?
  • Finds highest priority vulnerabilities?
  • Identify gaps in SA functions and write research
    agenda
  • Plan and initiate studies for metrics
  • First workshop in Long Beach in November, with ASE

14
Purposes of SA Tool Evaluations
  • Precisely document what a tool does (and doesn't)
    do
  • in order to
  • Provide feedback to tool developers
  • Simple changes to make
  • Directions for future releases
  • Inform users
  • Match the tool to a particular situation
  • Understand significance of tool results
  • Guide research for next tool generation

15
Developing a Specification
  • After the tool function selection is approved by the
    working group,
  • NIST develops a specification for the function
    with focus group input
  • The spec is posted to the web for public comment
  • NIST develops the tests
  • Detailed plans
  • Scripts
  • Standard Reference Dataset

16
Outline
  • Introduction
  • SRD project
  • SAMATE project structure
  • Future: where do we go now?

17
Toward Software Metrics
  • Qualitative comparison
  • Formally defined quantity
  • Unit and scale
  • Measured value
  • Derived units

                              Physical measurement     Software measurement
  Qualitative comparison      warmer, colder           buggy, secure
  Formally defined quantity   temperature              quality? confidence?
  Unit and scale              degree, kelvin           ?
  Derived units               Heat energy = s·m·ΔT     Software assurance = p·t?
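
For reference, the physical derived quantity on the slide written out in full (standard calorimetry; reading the slide's abbreviated symbols this way is an assumption):

    \[ Q = s \, m \, \Delta T \]

where Q is heat energy, s the specific heat, m the mass, and ΔT the temperature change; the software-side analogue is left as an open question on the slide.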
18
Tool Effectiveness Metrics
  • Do they really find vulnerabilities and catch
    bugs?
  • In other words, how much assurance does running a
    tool provide?
  • Create studies and experiments to measure the
    effectiveness of tools
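
The slides do not prescribe a particular measure, but one common way to score a tool against a reference dataset of known flaws is precision and recall over its reports (TP, FP, FN are assumed labels, not terms from the slides):

    \[
      \mathrm{precision} = \frac{TP}{TP + FP},
      \qquad
      \mathrm{recall} = \frac{TP}{TP + FN}
    \]

where TP counts seeded flaws the tool correctly reports, FP counts warnings raised on clean counterparts, and FN counts seeded flaws the tool misses.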

19
Call for Participation
  • Define classes of flaws and vulnerabilities
  • Contribute to collections of tools, researchers, and companies
  • Help define classes of SA functions
  • Decide order of importance of functions
  • Participate in focus group to specify a function
  • Contribute to standard reference dataset
  • Develop metrics to assess software and tools
  • Help set research agenda

20
Society has 3 options
  • Learn how to make software that works
  • Limit size or authority of software
  • Accept failing software