CSCE 548 Secure System Standards: Risk Management
Transcript of a lecture presentation by Dr. Farkas (53 slides)

CSCE 548 Secure System Standards Risk Management
  • This lecture
  • McGraw Chapter 2
  • Recommended
  • Rainbow Series Library
  • Common Criteria, http://www.commoncriteriaportal.org
  • Next lecture
  • Software Development Lifecycle, Dr. J. Vidal
  • Handout on SDLC and UML

Homework 1
  • Choose a team member among your classmates.
    (This selection is for this exercise only. Team
    member selection is facilitated in class. If
    you missed the Jan. 20th class, contact Dr.
    Farkas to have a team member assigned to you.)
  • List the steps of the RMF for the "KillerAppCo's
    iWare 1.0 Server" given in your textbook. (3
    points)
  • Carry out a similar RMF on the computing resources
    owned by your team member. For example,
    understanding the "business" context may include
    goals like graduating from USC or making a profit
    from selling software to a company. Document your
    RMF activities and findings. (7 points)
  • BONUS (2 points): Have your partner
    evaluate your risk management report and comment
    on it.

Incident Handling
Computer Security Incident Handling Guide,
Recommendations of the National Institute of
Standards and Technology

How to Respond?
  • Actions to avoid further loss from the intrusion
  • Terminate the intrusion and protect against
    recurrence
  • Law enforcement: prosecute
  • Enhance defensive security
  • Reconstructive methods based on
  • Time period of the intrusion
  • Changes made by legitimate users during the
    affected period
  • Regular backups, audit-trail-based detection of
    affected components, semantics-based recovery,
    minimal roll-back for recovery

Roles and Responsibilities
  • User
  • Vigilant for unusual behavior
  • Report incidents
  • Manager
  • Awareness training
  • Policies and procedures
  • System administrator
  • Install safeguards
  • Monitor system
  • Respond to incidents, including preservation of
    evidence
Computer Incident Response Team
  • Assist in handling security incidents
  • Formal
  • Informal
  • Incident reporting and dissemination of incident
    information
  • Computer Security Officer
  • Coordinate computer security efforts
  • Others: law enforcement coordinator,
    investigative support, media relations, etc.

Incident Response Process 1.
  • Preparation
  • Baseline Protection
  • Planning and guidance
  • Roles and Responsibilities Training
  • Incident response team

Incident Response Process 2.
  • Identification and assessment
  • Symptoms
  • Nature of the incident
  • Identify the perpetrator, origin, and extent of
    the attack
  • Can be done during or after the attack
  • Gather evidence
  • Keystroke monitoring, honeynets, system logs,
    network traffic, etc.
  • Mind the legislation on monitoring!
  • Report on preliminary findings

Incident Response Process 3.
  • Containment
  • Reduce the chance that the incident spreads
  • Determine sensitive data
  • Terminate suspicious connections, personnel,
    applications, etc.
  • Move critical computing services
  • Handle human aspects, e.g., perception
    management, panic, etc.

Incident Response Process 4.
  • Eradication
  • Determine and remove cause of incident if
    economically feasible
  • Improve defenses, software, hardware, middleware,
    physical security, etc.
  • Increase awareness and training
  • Perform vulnerability analysis

Incident Response Process 5.
  • Recovery
  • Determine course of action
  • Reestablish system functionality
  • Reporting and notifications
  • Documentation of incident handling and evidence

Follow Up Procedures
  • Incident evaluation
  • Quality of incident response (preparation, time
    to respond, tools used, evaluation of response,
    etc.)
  • Cost of incident (monetary cost, disruption, lost
    data, hardware damage, etc.)
  • Preparing report
  • Revise policies and procedures

Security Awareness and Training
  • Major weakness: user unawareness
  • Organizational effort
  • Educational effort
  • Customer training
  • Federal Trade Commission program to educate
    customers about web scams

Risk Management
Risk Assessment
[Chart: financial loss, dollar amount losses by
type; total loss (2006): $53,494,290. Source:
CSI/FBI Computer Crime and Security Survey]
[Chart: computer security protection measures]

Real Cost of Cyber Attack
  • Damage to the target may not reflect the real
    amount of damage
  • Services may rely on the attacked service,
    causing cascading and escalating damage
  • Need support for decision makers to
  • Evaluate risk and consequences of cyber attacks
  • Support methods to prevent, deter, and mitigate
    consequences of attacks

System Security Engineering (Traditional View)
  Specify System Architecture →
  Identify Threats, Vulnerabilities, Attacks →
  Prioritize Vulnerabilities →
  Estimate Risk →
  Identify and Install Safeguards
  (repeat until risk is acceptably low)
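The traditional loop above can be sketched as iteration until the estimated risk falls below an acceptable threshold. This is a minimal sketch: the `expected_loss` fields, the `ACCEPTABLE_RISK` threshold, and the safeguard strings are illustrative placeholders, not part of the model on the slide.

```python
ACCEPTABLE_RISK = 10_000  # placeholder threshold (expected dollar loss)

def estimate_risk(vulnerabilities):
    # Placeholder metric: sum of expected losses over open vulnerabilities.
    return sum(v["expected_loss"] for v in vulnerabilities)

def security_engineering_loop(vulnerabilities, safeguards):
    """Install safeguards for the worst vulnerabilities until risk is acceptable."""
    # Prioritize vulnerabilities: worst first.
    vulnerabilities.sort(key=lambda v: v["expected_loss"], reverse=True)
    while vulnerabilities and estimate_risk(vulnerabilities) > ACCEPTABLE_RISK:
        worst = vulnerabilities.pop(0)  # highest expected loss remaining
        safeguards.append(f"safeguard for {worst['name']}")
    return safeguards

vulns = [
    {"name": "unpatched server", "expected_loss": 50_000},
    {"name": "weak passwords", "expected_loss": 30_000},
    {"name": "open test port", "expected_loss": 4_000},
]
print(security_engineering_loop(vulns, []))
```

In this toy run the loop stops after mitigating the two largest vulnerabilities, since the residual risk ($4,000) is below the threshold.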
Risk Management Framework (Business Context)
Understand the Business Context
  • Who cares?
  • Identify business goals, priorities and
    circumstances, e.g.,
  • Increasing revenue
  • Meeting service-level agreements
  • Reducing development cost
  • Generating high return on investment
  • Identify software risks to consider

Identify Business and Technical Risks
  • Why should business care?
  • Business risk
  • Direct threat
  • Indirect threat
  • Consequences
  • Financial loss
  • Loss of reputation
  • Violation of customer or regulatory constraints
  • Liability
  • Tying technical risks to the business context in
    a meaningful way

Synthesize and Rank the Risks
  • What should be done first?
  • Prioritization of identified risks based on
    business goals
  • Allocating resources
  • Risk metrics
  • Risk likelihood
  • Risk impact
  • Risk severity
  • Number of emerging risks
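The prioritization step above can be sketched as a severity ranking. The `Risk` fields and the weighting severity = likelihood × impact are illustrative assumptions: the slide lists the metrics without fixing a formula, and the example risks are invented.

```python
from dataclasses import dataclass

@dataclass
class Risk:
    # Illustrative fields; real RMF metrics may differ.
    name: str
    likelihood: float  # probability estimate in [0, 1]
    impact: float      # business impact, e.g. expected dollar loss

    @property
    def severity(self) -> float:
        # One common heuristic: severity = likelihood x impact.
        return self.likelihood * self.impact

def rank_risks(risks):
    """Order risks so limited mitigation resources go to the worst first."""
    return sorted(risks, key=lambda r: r.severity, reverse=True)

risks = [
    Risk("SQL injection in login form", likelihood=0.6, impact=500_000),
    Risk("Stolen backup tape", likelihood=0.1, impact=2_000_000),
    Risk("Defaced public web page", likelihood=0.4, impact=50_000),
]
for r in rank_risks(risks):
    print(f"{r.name}: severity {r.severity:,.0f}")
```

Note how the ranking ties back to business goals: the low-probability tape theft still outranks the frequent but cheap defacement because its business impact dominates.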

Define the Risk Mitigation Strategy
  • How to mitigate risks?
  • Available technology and resources
  • Constrained by the business context: what can the
    organization afford, integrate, and understand
  • Need validation techniques

Carry Out Fixes and Validate
  • Perform actions defined in the previous stage
  • Measure completeness against the risk
    mitigation strategy
  • Progress against risk
  • Remaining risks
  • Assurance of mechanisms
  • Testing

Measuring and Reporting
  • Continuous and consistent identification and
    storage of risk information over time
  • Maintain risk information at all stages of risk
    management
  • Establish measurements, e.g.,
  • Number of risks, severity of risks, cost of
    mitigation, etc.

Assets-Threat Model (1)
  • Threats compromise assets
  • Threats have a probability of occurrence and
    severity of effect
  • Assets have values
  • Assets are vulnerable to threats

Assets-Threat Model (2)
  • Risk: expected loss from the threat against an
    asset
  • R = V × P × S
  • R: risk
  • V: value of the asset
  • P: probability of occurrence of the threat
  • S: vulnerability of the asset to the threat
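As a worked example of the formula, with illustrative numbers (not from the slides): an asset worth $100,000, a threat with a 10% probability of occurring, and a vulnerability factor of 0.5 (half the asset's value is lost if the threat succeeds) give an expected loss of $5,000.

```python
def expected_loss(value: float, p_threat: float, vulnerability: float) -> float:
    """Risk as expected loss: R = V * P * S.

    value:         V, value of the asset
    p_threat:      P, probability that the threat occurs
    vulnerability: S, fraction of the asset's value lost if it does
    """
    return value * p_threat * vulnerability

# Illustrative numbers: $100,000 asset, 10% threat probability,
# half the asset's value lost on a successful attack.
print(f"${expected_loss(100_000, 0.10, 0.5):,.2f}")  # prints "$5,000.00"
```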

System-Failure Model
  • Estimate probability of highly undesirable events
  • Risk: likelihood of an undesirable outcome

Risk Acceptance
  • Certification
  • How well the system meets the security
    requirements (technical)
  • Accreditation
  • Management's approval of an automated system

Building It Secure
  • 1960s: US Department of Defense (DoD) recognizes
    the risk of unsecured information systems
  • 1970s
  • 1977: DoD Computer Security Initiative
  • US Government and private concerns
  • National Bureau of Standards (NBS, now NIST)
  • Responsible for standards for acquisition and use
    of federal computing systems
  • Federal Information Processing Standards (FIPS)

  • Two initiatives for security
  • Cryptography standards
  • 1973: invitation for technical proposals for an
    encryption standard
  • 1977: Data Encryption Standard
  • 2001: Advanced Encryption Standard (NIST)
  • Development and evaluation processes for secure
    systems
  • Conferences and workshops
  • Involves researchers, constructors, vendors,
    software developers, and users
  • 1979: Mitre Corporation entrusted to produce an
    initial set of criteria to evaluate the security
    of a system handling classified data

National Computer Security Center
  • 1981: National Computer Security Center (NCSC)
    was established within NSA
  • To provide technical support and reference for
    government agencies
  • To define a set of criteria for the evaluation
    and assessment of security
  • To encourage and perform research in the field of
    computer security
  • To develop verification and testing tools
  • To increase security awareness in both the
    federal and private sectors
  • 1985: Trusted Computer System Evaluation Criteria
    (TCSEC), the Orange Book

Orange Book
  • Orange Book objectives
  • Guidance on what security features to build into
    new products
  • Provide a measurement to evaluate the security of
    systems
  • Basis for specifying security requirements
  • Security features and assurances
  • Trusted Computing Base (TCB): the security
    components of the system (hardware, software, and
    firmware); reference monitor

Orange Book
  • Supplies
  • Users: evaluation metrics to assess the
    reliability of the security system for protection
    of classified or sensitive information, whether a
    commercial product or an internally developed
    system
  • Developers/vendors: a design guide showing the
    security features to be included in commercial
    systems
  • Designers: a guide for the specification of
    security requirements

Orange Book
  • Set of criteria and requirements
  • Three main categories
  • Security policy: protection level offered by the
    system
  • Accountability: of the users and user operations
  • Assurance: of the reliability of the system

Security Policy
  • Concerns the definition of the policy regulating
    the access of users to information
  • Discretionary Access Control
  • Mandatory Access Control
  • Labels for objects and subjects
  • Reuse of objects: basic storage elements must be
    cleaned before being released to a new user

Accountability
  • Identification/authentication
  • Audit
  • Trusted path: ensures no users are attempting to
    access the system fraudulently

Assurance
  • Reliable hardware/software/firmware components
    that can be evaluated separately
  • Operation reliability
  • Development reliability

Operation reliability
  • During system operation
  • System architecture: TCB isolated from user
    processes, security kernel isolated from
    non-security-critical portions of the TCB
  • System integrity: correct operation (using
    diagnostic software)
  • Covert channel analysis
  • Trusted facility management: separation of duties
  • Trusted recovery: recover security features after
    TCB failures

Development reliability
  • System reliable during the development process.
    Formal methods.
  • System testing security features tested and
  • Design specification and verification correct
    design and implementation wrt security policy.
    TCB formal specifications proved
  • Configuration management configuration of the
    system components and its documentation
  • Trusted distribution no unauthorized

Documentation
  • Defined set of documents
  • Minimal set
  • Trusted facility manual
  • Security features user's guide
  • Test documentation
  • Design documentation
  • Personnel info: Operators, Users, Developers

Orange Book Levels
  • Highest Security
  • A1 Verified protection
  • B3 Security Domains
  • B2 Structured Protection
  • B1 Labeled Security Protection
  • C2 Controlled Access Protection
  • C1 Discretionary Security Protection
  • D Minimal Protection
  • No Security

NCSC Rainbow Series
  • Orange: Trusted Computer System Evaluation
    Criteria
  • Yellow: Guidance for applying the Orange Book
  • Red: Trusted Network Interpretation
  • Lavender: Trusted Database Interpretation

Evaluation Process
  • Preliminary technical review (PTR)
  • Preliminary technical report: architecture,
    potential for target rating
  • Vendor assistance phase (VAP)
  • Review of the documentation needed for the
    evaluation process, e.g., security features
    user's guide, trusted facility manual, design
    documentation, test plan. For B or higher,
    additional documentation is needed, e.g.,
    covert channel analysis, formal model, etc.
  • Design analysis phase (DAP)
  • Initial product assessment report (IPAR): 100-200
    pages, detailed info about the hardware, software
    architecture, security-relevant features, team
    assessments, etc.
  • Technical Review Board
  • Recommendation to the NCSC

Evaluation Process
  • Formal evaluation phase (FEP)
  • Product Bulletin: formal and public announcement
  • Final Evaluation Report: information from the
    IPAR and testing results, additional tests, code
    review (B2 and up), formal policy model, proof
  • Recommends a rating for the system
  • NCSC decides the final rating
  • Rating maintenance phase (RAMP)
  • Minor changes and revisions
  • Reevaluation
  • Rating maintenance plan

European Criteria
  • German Information Security Agency: German Green
    Book (1988)
  • British Department of Trade and Industry and
    Ministry of Defence: several volumes of criteria
  • Canada, Australia, France: work on evaluation
    criteria
  • 1991: Information Technology Security Evaluation
    Criteria (ITSEC)
  • For the European community
  • Decoupled features from assurance
  • Introduced new functionality requirement classes
  • Accommodated commercial security requirements

Common Criteria
  • January 1996: Common Criteria
  • Joint work with Canada and Europe
  • Separates functionality from assurance
  • Nine classes of functionality: audit,
    communications, user data protection,
    identification and authentication, privacy,
    protection of trusted functions, resource
    utilization, establishing user sessions, and
    trusted path
  • Seven classes of assurance: configuration
    management, delivery and operation, development,
    guidance documents, life cycle support, tests,
    and vulnerability assessment

Common Criteria
  • Evaluation Assurance Levels (EAL)
  • EAL1: functionally tested
  • EAL2: structurally tested
  • EAL3: methodically tested and checked
  • EAL4: methodically designed, tested, and reviewed
  • EAL5: semiformally designed and tested
  • EAL6: semiformally verified design and tested
  • EAL7: formally verified design and tested

National Information Assurance Partnership (NIAP)
  • 1997: National Institute of Standards and
    Technology (NIST), National Security Agency
    (NSA), and industry
  • Aims to improve the efficiency of evaluation
  • Transfer methodologies and techniques to
    private-sector laboratories
  • Functions: developing tests, test methods, and
    tools for evaluating and improving security
    products, developing protection profiles and
    associated tests, establishing a formal and
    international scheme for the CC

Next Class
  • Software Development Lifecycle