Cybersecurity Today and Tomorrow: Pay Now or Pay Later

1
Cybersecurity Today and Tomorrow: Pay Now or Pay
Later
  • Herb Lin
  • National Research Council
  • hlin_at_nas.edu, 202-334-3191

2
The National Academies
  • The National Academy of Sciences, National
    Academy of Engineering, and Institute of Medicine
    are honorific organizations operating under a
    Congressional charter granted in 1863. They
    consist of the nation's top scientists,
    engineers, and medical experts.
  • The National Research Council is the operating
    arm of the National Academies, associating the
    broad community of science and technology with
    the Academies' purpose of advising the
    government, the public, and the scientific and
    engineering communities.
  • The Computer Science and Telecommunications Board
    has responsibility within the NRC for matters of
    public policy involving information technology.

3
The CSTB Board
  • DAVID D. CLARK, Massachusetts Institute of
    Technology, Chair
  • DAVID BORTH, Motorola Labs
  • JAMES CHIDDIX, AOL Time Warner
  • JOHN M. CIOFFI, Stanford University
  • ELAINE COHEN, University of Utah
  • W. BRUCE CROFT, University of Massachusetts at
    Amherst
  • THOMAS E. DARCIE, AT&T Labs Research
  • JOSEPH FARRELL, University of California at
    Berkeley
  • JEFFREY M. JAFFE, Bell Laboratories, Lucent
    Technologies
  • ANNA KARLIN, University of Washington
  • BUTLER W. LAMPSON, Microsoft Corporation
  • EDWARD D. LAZOWSKA, University of Washington
  • DAVID LIDDLE, U.S. Venture Partners
  • TOM M. MITCHELL, WhizBang! Labs, Inc.
  • DONALD NORMAN, Nielsen Norman Group
  • DAVID A. PATTERSON, University of California at
    Berkeley
  • HENRY (HANK) PERRITT, Chicago-Kent College of Law
  • BURTON SMITH, Cray, Inc.
  • TERRY SMITH, University of California at Santa
    Barbara
  • LEE SPROULL, New York University
  • JEANNETTE M. WING, Carnegie Mellon University

4
Why Now?
  • September 11, 2001 was the impetus.
  • Lessons from September 11
  • Some terrorists are patient
  • Some terrorists plan their operations very well
  • Some terrorists are smart and technically
    sophisticated
  • Some terrorists know about the Internet
  • Note: cybersecurity is narrower than
    reliability/availability

5
This report
  • Compiles understanding, knowledge, findings, and
    recommendations from a decade of CSTB reports on
    cybersecurity
  • 1991. Computers at Risk: Safe Computing in the
    Information Age.
  • 1996. Cryptography's Role in Securing the
    Information Society.
  • 1996. Continued Review of the Tax Systems
    Modernization of the Internal Revenue Service.
  • 1997. For the Record: Protecting Electronic
    Health Information.
  • 1999. Trust in Cyberspace.
  • 1999. Realizing the Potential of C4I:
    Fundamental Challenges.
  • 2001. Embedded, Everywhere.

6
The Main Message
  • What's the new news now?
  • Information technology has changed dramatically
    in the last decade: Internet, e-commerce, PCs
    everywhere
  • What was news 10 years ago in cybersecurity is
    STILL relevant today!
  • This speaks to a very sorry state of progress in
    the field.

7
What can go wrong with a computer system or
network?
  • It can become unavailable or very slow (e.g.,
    denial of service)
  • It can become corrupted (e.g., bad data)
  • It can become leaky (e.g., sensitive information
    revealed)
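The three failure modes above (unavailable, corrupted, leaky) can be made concrete. The "corrupted" case, for instance, is detectable with a cryptographic digest recorded at write time; a minimal Python sketch, with a purely illustrative record:

```python
import hashlib

def digest(data):
    """SHA-256 fingerprint recorded when the data is written."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical record, for illustration only.
record = b"account=1234;balance=1000"
stored_digest = digest(record)          # saved alongside the record

# Later, an attacker silently alters the data ("bad data").
tampered = b"account=1234;balance=9000"

print(digest(record) == stored_digest)    # True: record intact
print(digest(tampered) == stored_digest)  # False: corruption detected
```

Availability and confidentiality failures need different defenses (capacity planning and encryption, respectively); a digest only catches the integrity case.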

8
Causes of Deliberate Cyber-problems
  • Cyber-only attack
      • virus
      • denial of service
      • can originate from anywhere
      • damage may be invisible
  • Compromise of a trusted insider
      • recruited, tricked, or planted
  • Physical destruction of an IT facility

9
Possible Harm from Cyber-attack
  • In general, NOT comparable to 9/11 (massive loss
    of life and damage to physical infrastructure
    over a very short amount of time)
  • A cyber-only attack could result in economic and
    associated social harm
      • compromise of electric power distribution,
        transportation and shipping, financial
        transactions
  • Many opportunity costs: interruption of business,
    forgoing of various activities and associated
    benefits
  • Major effect: a large-scale cyberattack
    coordinated with an attack on physical
    infrastructure
      • ATC compromise + airliner hijackings
      • 9-1-1/phone system disruption + bombing

10
What do we know about Cyber-Security?
  • General Observations
  • Information system vulnerabilities are growing
    faster than our ability (and willingness) to
    respond
  • Management
  • Operational Considerations
  • Design and Architectural Considerations

11
General Observations
  • Security costs, not only in dollars but
    especially in interference with daily work.
  • No widely accepted metrics for characterizing
    security, so it is hard to know how much security
    a given investment buys, or how much is enough.
  • System security is a holistic problem, in which
    technological, managerial, organizational,
    regulatory, economic, and social aspects
    interact.
  • The best is the enemy of the good.
  • Security is a game of action and reaction.
  • Cyberattacks can be easily covered up.
  • Reporting is essential for forensics, attack
    characterization, and prevention.

12
Management
  • Cybersecurity today is far worse than what known
    best practices can provide.
  • The payoff from security investments is
    uncertain: society, not the investor, often
    captures the benefit of improved security. Hence
    underinvestment in security; customers buy
    features rather than security.
  • A central aspect of security must be
    accountability.
  • Many security problems exist not because a fix is
    unknown but because some responsible party has
    not implemented a known fix.
  • Resolve the conflict between holding people
    responsible and full reporting of problems;
    reporting tends to be fuller in an environment in
    which individuals are not fearful of reporting
    problems.

13
Operational Considerations
  • Independent red-teaming to understand actual
    vulnerabilities
  • Target must not know about the test
  • Testers must not be constrained
  • Problems must be fixed!
  • Many compromises result from improper
    configuration.
  • Modem installed by a legitimate user without
    authorization
  • Bug fixes missing because system was restored
    from a backup tape
  • Improperly configured firewall
  • Overly broad access privileges
  • Need for automated configuration management tools
    and automatic security update (very challenging
    problem)
  • Fallback action plans needed: do less in order to
    lower vulnerability
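The automated configuration management the slide calls for can be sketched as a drift check against a defined secure baseline. Every setting name and value below is hypothetical, chosen only to mirror the misconfiguration examples above:

```python
# Hypothetical secure baseline; the names and values are
# illustrative, not taken from any real product.
BASELINE = {
    "ssh.PermitRootLogin": "no",
    "firewall.enabled":    "yes",
    "modem.dialin":        "disabled",
    "patch.level":         "2002-01",
}

def config_drift(actual):
    """Return {setting: (expected, found)} for every setting that
    is missing or differs from the secure baseline."""
    drift = {}
    for key, expected in BASELINE.items():
        found = actual.get(key)
        if found != expected:
            drift[key] = (expected, found)
    return drift

# A host restored from an old backup tape: patches missing, root
# login re-enabled, and the modem setting absent altogether.
host = {
    "ssh.PermitRootLogin": "yes",
    "firewall.enabled":    "yes",
    "patch.level":         "2001-10",
}
for key, (expected, found) in sorted(config_drift(host).items()):
    print(f"{key}: expected {expected!r}, found {found!r}")
```

Running such a check on every host, on every boot and after every restore, is one way to catch the "restored from backup" and "improperly configured firewall" failures automatically rather than by audit.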

14
Design and Architectural Considerations
  • Tensions between security and other good things,
    such as features, ease of use, and
    interoperability (similar to insurance)
  • Human error is usually not a useful explanation
    for security problems; the real causes are often
      • operational practice that requires people to
        get too many details right, or too little
        red-team testing
      • management practice that allows too little
        time for security procedures or fails to
        ensure that problems uncovered are fixed.
  • User authentication is essential for access
    control and for auditing.
  • Passwords are 40-year-old security technology
  • (Biometrically-enabled) hardware token
  • Defense in depth vs/and perimeter-only defense
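To make the hardware-token bullet concrete: such tokens commonly generate one-time passwords rather than static secrets. A minimal Python sketch of the later-standardized TOTP scheme (RFC 6238, built on the RFC 4226 HOTP function), offered as a present-day example of the token idea rather than anything the report prescribes, using the RFCs' published test secret:

```python
import hashlib, hmac, struct, time

def hotp(secret, counter, digits=6):
    """RFC 4226 HMAC-based one-time password."""
    mac = hmac.new(secret, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                      # dynamic truncation
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(secret, timestamp=None, step=30, digits=6):
    """RFC 6238 time-based OTP: HOTP over a moving time counter."""
    t = time.time() if timestamp is None else timestamp
    return hotp(secret, int(t // step), digits)

# RFC test secret; at t = 59 s the 30-second time counter is 1.
print(totp(b"12345678901234567890", timestamp=59))  # 287082
```

Because each code is valid for only one 30-second window, a captured code is useless to a replaying attacker, which is the property static passwords lack.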

15
What can be done?
  • Individual organizations
  • Vendors
  • Public policy makers

16
Individual organizations
  • Resource an internal entity that provides direct
    defensive operational support to system
    administrators throughout the organization
  • Ensure that adequate information security tools
    are available and used; hold people accountable
    for their use
  • Conduct frequent, unannounced red-team
    penetration testing and report the results to
    responsible management.
  • Promptly fix problems and vulnerabilities that
    are found.
  • Mandate organization-wide use of available
    network/configuration management tools and of
    strong authentication mechanisms
  • Design systems under the assumption that they
    could be connected to a compromised network or a
    network that is under attack, and practice
    operating these systems under this assumption.

17
Vendors
  • Improve the user interface to security, which is
    totally incomprehensible in nearly all of today's
    systems.
  • Develop tools to monitor systems automatically
    for consistency with defined secure
    configurations and enforce these configurations.
  • Provide schemes for user authentication based on
    hardware tokens.
  • Ship systems with security features turned on and
    default identifications and passwords turned off
  • Conduct more rigorous testing of software and
    systems for security flaws, and do so before
    releasing products rather than use customers as
    implicit beta testers to shake out security
    flaws.

18
Policy Makers
  • Consider legislative responses to the failure of
    existing incentives to cause the market to
    respond adequately to the security challenge.
  • Position the federal government as a leader in
    security.
  • Provide adequate support for research and
    development on information systems security.

19
CSTB/NRC
  • Reports from
  • www.nap.edu
  • The organization that did this report
  • www.cstb.org
  • The National Academies
  • www.national-academies.org
  • Me
  • Herb Lin, Senior Scientist
  • hlin_at_nas.edu, 202-334-3191

20
Thoughts on Liability
  • CSTB/NRC report did NOT endorse liability for
    software vendors.
  • Policy makers should consider legislative
    responses to the failure of existing incentives
    to cause the market to respond adequately to the
    security challenge. Possible options include
    steps that would increase the exposure of
    software and system vendors and system operators
    to liability for system breaches and mandated
    reporting of security breaches that could
    threaten critical societal functions.

21
The broad outline
  • Market forces have failed to provide an
    environment in which vendors and users have
    sufficient incentives to provide for security.
  • Inadequate information about consequences of
    breaches.
  • Society often captures benefits of security
    investments (social needs greater than corporate
    needs)
  • Deregulation forces more competitive pressures

22
Responding to Market Failures
  • Mandate behavioral changes
  • Shift the economic calculus
  • carrots
  • yearly award for cybersecurity from the Office of
    Homeland Security and the President
  • Malcolm Baldrige National Quality Award
  • ISO XXXXX certification?
  • Immunity for early reporting? (like FAA)
  • sticks
  • Liability
  • red team results into public documents
  • accounting standards to include cybersecurity
    assessments

23
Issues raised by liability
  • differentiating between infrastructure and end
    user for liability purposes (e.g., my program
    works on your operating system - how to allocate
    liability?)
  • holding infrastructure providers liable for
    economic damage is hard under current case law
  • correlated damages mean that the insurance model
    won't work

24
Issues (continued)
  • reducing incentives to disseminate information or
    to allow forensics or to provide fixes
  • holding vendors responsible for something they
    don't know how to do (e.g., how to deal with loss
    of functionality from a security patch, which
    doesn't always work properly)

25
Issues (continued)
  • establishing standards of care (best practices?
    certification?)
  • hard for lay people to understand security
    breaches - how to decide on liability
  • lack of a good methodology to establish the
    extent of liability

26
Fire vs. Cyberhack Prevention
  • Cyberhack prevention
      • Losses due to deliberate action, hence no
        actuarial basis (terrorists are not a
        probability distribution)
      • No metrics for security
      • Fundamental science of cybersecurity is not
        known
      • Damage is often invisible
      • Technical standardization resembles a
        monoculture: weak in the face of correlated
        threats
      • Impact of a fix often cannot be localized
  • Fire prevention
      • Losses largely due to accident (harder to
        insure against arson than lightning)
      • Fire resistance can be quantified (sort of)
      • Fundamental science of fireproofing and
        structural engineering is known
      • Damage is visible
      • Standardization is advantageous when failures
        are uncorrelated
      • Impact of fixes can be localized

27
On standards for security
  • Distinguish between technical standards and
    behavioral standards
  • Tech standard: 56-bit encryption, no buffer
    overflows, Windows XP everywhere (or an Orange
    Book C2 system)
  • Behavioral standard: red-team test once a month