CSCE 548 Secure Software Development Risk-Based Security Testing - PowerPoint PPT Presentation

Slides: 24
Provided by: far1
Learn more at: http://www.cse.sc.edu
Transcript and Presenter's Notes



1
CSCE 548 Secure Software Development
Risk-Based Security Testing
2
Reading
  • This lecture
  • Risk-Based Security Testing, McGraw Chapter 7
  • Next lecture
  • Security Operations, McGraw Chapter 9

3
Application of Touchpoints
  • Touchpoints, in order of effectiveness:
  • 1. Code Review (Tools)
  • 2. Risk Analysis
  • 3. Penetration Testing
  • 4. Risk-Based Security Tests
  • 5. Abuse Cases
  • 6. Security Requirements
  • 7. Security Operations
  • External Review
  • Applied across the software artifacts:
    requirements and use cases, architecture and
    design, test plans, code, tests and test
    results, feedback from the field
  • Risk analysis (2) applies at both the design
    and test-plan artifacts, so it appears twice in
    the original diagram
4
Software Testing
  • Running a program or system with the intent of
    finding errors
  • Evaluating capability of the system and
    determining that its requirements are met
  • Physical processes vs. Software processes
  • Testing purposes
  • To improve quality
  • For verification and validation (V&V)
  • For reliability estimation

5
Quality Assurance
  • External quality: correctness, reliability,
    usability, integrity
  • Interior (engineering) quality: efficiency,
    testability, documentation, structure
  • Future (adaptability) quality: flexibility,
    reusability, maintainability

6
Correctness Testing
  • Black box
  • Test data are derived from the specified
    functional requirements without regard to the
    final program structure
  • Data-driven, input/output driven, or
    requirements-based
  • Functional testing
  • No implementation details of the code are
    considered
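A minimal sketch of black-box test derivation in Python. The function and its specification are hypothetical, chosen only to illustrate the idea; the test cases come purely from the stated requirement, never from reading the implementation:

```python
# Hypothetical implementation under test.  The black-box cases below are
# derived only from the requirement "years divisible by 4 are leap years,
# except centuries, unless divisible by 400" -- not from this code.
def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

# Requirements-based (black-box) test cases:
assert is_leap_year(2024)          # divisible by 4
assert not is_leap_year(1900)      # century, not divisible by 400
assert is_leap_year(2000)          # century divisible by 400
assert not is_leap_year(2023)      # ordinary year
print("all black-box cases passed")
```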

7
Correctness Testing
  • White box
  • Internal structure of the software under test
    is visible to the tester
  • Testing plans based on the details of the
    software implementation
  • Test cases derived from the program structure
  • Glass-box testing, logic-driven testing, or
    design-based testing
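A glass-box sketch of the same idea, with a hypothetical function: the test cases are chosen to exercise each branch, including a boundary value that only reading the code reveals:

```python
def classify_discount(total: float) -> float:
    # Hypothetical function; its branch structure drives the cases below.
    if total >= 100.0:
        return total * 0.5    # half-price branch
    return total              # no-discount branch

# Glass-box cases: one per branch, plus the boundary value 100.0,
# which is visible only in the implementation.
assert classify_discount(150.0) == 75.0
assert classify_discount(50.0) == 50.0
assert classify_discount(100.0) == 50.0
print("all branches covered")
```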

8
Performance Testing
  • Goal: bottleneck identification, performance
    comparison and evaluation, etc.
  • Explicit or implicit requirements
  • "Performance bugs": design problems
  • Test: usage, throughput, stimulus-response time,
    queue lengths, etc.
  • Resources to be tested: network bandwidth
    requirements, CPU cycles, disk space, disk access
    operations, memory usage, etc.
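A minimal sketch of measuring stimulus-response time and throughput, assuming a hypothetical `handle_request` stands in for the operation under test:

```python
import time

def handle_request(payload: bytes) -> bytes:
    # Hypothetical stand-in for the operation being measured.
    return payload[::-1]

def measure(n_requests: int = 10_000):
    """Return (avg stimulus-response time in s, throughput in requests/s)."""
    start = time.perf_counter()
    for _ in range(n_requests):
        handle_request(b"x" * 64)
    elapsed = time.perf_counter() - start
    return elapsed / n_requests, n_requests / elapsed

latency, throughput = measure()
print(f"avg latency: {latency * 1e6:.1f} us, throughput: {throughput:.0f} req/s")
```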

9
Reliability Testing
  • Probability of failure-free operation of a system
  • Dependable software: does not fail in
    unexpected or catastrophic ways
  • Difficult to test
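One crude way to estimate the probability of failure-free operation is to run the system many times under a random operational profile and count failure-free runs. A sketch, with a hypothetical `run_once` that injects failures at a 2% rate purely for illustration:

```python
import random

def run_once(seed: int) -> bool:
    # Stand-in for one execution of the system under a random operational
    # profile; a failure is injected with 2% probability for illustration.
    # True means the run completed failure-free.
    return random.Random(seed).random() >= 0.02

runs = 10_000
failure_free = sum(run_once(seed) for seed in range(runs))
reliability = failure_free / runs   # estimate of P(failure-free operation)
print(f"estimated reliability: {reliability:.3f}")
```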

10
Security Testing
  • Test for flaws in software that can be
    exploited by attackers
  • Quality, reliability and security are tightly
    coupled
  • Software behavior testing
  • Need a risk-based approach using system
    architecture information and an attacker model

11
Risk-Based Testing
  • Identify risks
  • Create tests to address identified risks
  • Security testing vs. penetration testing
  • Level of approach
  • Timing of testing

12
Penetration Testing
  • Performed after the software is completed
  • Evaluate operational environment
  • Dynamic behavior
  • Outside→in activity: defending perimeters
  • Cursory

13
Security Testing
  • Can be applied before the product is completed
  • Different levels of testing (e.g., component/unit
    level vs. system level)
  • Testing environment
  • Detailed

14
Risk Analysis
  • Design phase analysis
  • Identify and rank risks
  • Discusses inter-component assumptions
  • Component/unit testing
  • Test for
  • Unauthorized misuse of and access to the target
    assets
  • Violations of assumptions
  • Breaking system into a number of discrete parts
  • Risk can be mitigated within the bounds of
    contextual assumptions
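A component-level test for a violated inter-component assumption can be sketched as follows. The component `fetch_record` and its documented precondition are hypothetical; the test deliberately breaks the assumption to check that the component fails safely:

```python
def fetch_record(record_id: int, table: dict):
    # Hypothetical component.  Documented inter-component assumption:
    # callers always pass a positive record_id.
    if record_id <= 0:
        raise ValueError("record_id must be positive")
    return table.get(record_id)

records = {1: "alpha"}
assert fetch_record(1, records) == "alpha"     # assumption holds

try:
    fetch_record(-1, records)                  # assumption violated
    raise AssertionError("violation went undetected")
except ValueError:
    print("assumption violation rejected safely")
```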

15
System-Level Testing
  • Focus on the properties of the integrated
    software system
  • Penetration testing + security testing
  • Using data flow diagrams, models, and
    inter-component documentation, identify
  • Inter-component failures
  • Design level security risks
  • Use misuse cases to enhance test plan

16
Behavior in the Presence of Malicious Attack
  • What happens when the software fails?
  • Safety critical systems
  • Track risk over time
  • Security relative to
  • Information and services protected
  • Skills and resources of adversaries
  • Cost of protection
  • System vulnerabilities

17
Vulnerabilities
  • Design-level
  • Hardest to detect
  • Prevalent and critical
  • Requires great expertise to detect; hard to
    automate
  • Implementation-level
  • Critical
  • Easier to detect; some automation possible

18
Security Testing
  • Functional security testing: testing security
    mechanisms for functional capabilities
  • Adversarial security testing: risk-based
    security testing
  • Understanding and simulating the attacker's
    approach
  • Both approaches must be used
  • Attacks may bypass the security mechanisms
    entirely and exploit software defects instead

19
Who Should Perform the Test?
  • Standard testing organizations
  • Functional testing
  • Software security professionals
  • Risk-based security testing
  • Requires significant expertise and experience

20
How to Test?
  • White box analysis
  • Understanding and analyzing source code and
    design
  • Very effective at finding programming errors
  • Can be supported by automated static analyzers
  • Disadvantage: high rate of false positives
  • Black box analysis
  • Analyze a running program
  • Probe the program with various inputs
    (including malicious input)
  • No need for source code; can be tested remotely
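Black-box probing with malformed input can be sketched as a tiny fuzzer. The length-prefixed parser below is hypothetical and deliberately contains a crash bug (it indexes an empty input) so the probe has something to find:

```python
import random

def parse_length_prefixed(data: bytes) -> bytes:
    # Hypothetical target: first byte is a length prefix for the payload.
    n = data[0]                      # bug: IndexError on empty input
    if n > len(data) - 1:
        raise ValueError("length prefix exceeds payload")
    return data[1:1 + n]

def fuzz(trials: int = 1000) -> int:
    """Probe the parser with random, possibly malformed, inputs."""
    rng = random.Random(0)
    unexpected = 0
    for _ in range(trials):
        blob = bytes(rng.randrange(256) for _ in range(rng.randrange(8)))
        try:
            parse_length_prefixed(blob)
        except ValueError:
            pass                     # clean rejection is part of the contract
        except Exception:
            unexpected += 1          # anything else is a potential flaw
    return unexpected

print("unexpected failures found:", fuzz())
```

No source code knowledge is needed to run this probe; only the input interface and the expected error contract.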

21
Malicious Input
  • Software takes input
  • Trust input?
  • Malformed or malicious input may lead to security
    compromise
  • What is the input?
  • Data vs. control
  • Attacker toolkit
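The data-vs-control distinction can be sketched in Python: validate untrusted input as data before it can reach a control context. The username pattern and lookup routine below are hypothetical:

```python
import re

USERNAME_RE = re.compile(r"[A-Za-z0-9_]{1,32}")

def build_lookup_command(user_field: str) -> list:
    # Treat the field strictly as data: allowlist-validate it before it
    # reaches a control context (here, a subprocess argument vector).
    if not USERNAME_RE.fullmatch(user_field):
        raise ValueError("rejected malformed input")
    # Passed as a discrete argv element, never through a shell, so the
    # input cannot be reinterpreted as control.
    return ["getent", "passwd", user_field]

assert build_lookup_command("alice") == ["getent", "passwd", "alice"]

try:
    build_lookup_command("alice; rm -rf /")   # data trying to become control
except ValueError:
    print("malicious input rejected")
```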

22
What Else?
  • Testing for malicious input is necessary but
    NOT sufficient
  • Risk-based security testing
  • Planning tests (use forest-level view)
  • Need operational aspects
  • System state vs. applications used
  • Multithreaded systems: time-based attacks

23
Next Class
  • Security Operations