1
Intrusion Detection Testing and Benchmarking
Methodologies
  • Nicholas Athanasiades, Randal Abler, John Levine,
    Henry Owen, and George Riley
  • School of Electrical and Computer Engineering
  • Georgia Institute of Technology

2
1. Introduction
  • The beginnings of intrusion detection evaluation
  • DARPA (1998–1999)
  • LARIAT (Lincoln Adaptable Real-time Information
    Assurance Test-bed) (2000–2001)
  • Most common methodologies
  • Traffic generation is one of the most difficult
    tasks
  • Synthetic traffic does not represent the
    realities of an actual network
  • SmartBits
  • Scripting tools

3
2. Existing Tools and Testing Methodologies
  • A. DARPA Environment
  • B. LARIAT Environment
  • C. Nidsbench
  • D. IDSwakeup
  • E. Flame Thrower
  • F. WebAvalanche/WebReflector
  • G. Tcpreplay
  • H. Fragrouter
  • I. Hping2
  • J. Iperf

4
2. Existing Tools and Testing Methodologies
  • A. DARPA Environment
  • Approach
  • An off-line (tune and optimize) and an on-line
    (actual testing) evaluation were executed
  • Tcpreplay
  • Protocol/traffic activity (a background-traffic
    sketch follows this list)
  • HTTP, X window, SQL, SMTP, DNS, FTP, POP3,
    Finger, Telnet, IRC, SNMP, and Time
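As an illustration not in the original slides, a minimal Python sketch of generating benign HTTP background traffic of this kind; the host name, page count, and think times are assumptions:

    # Minimal sketch: benign HTTP background traffic with human-like pauses.
    # The host name and timing parameters below are illustrative assumptions.
    import http.client
    import random
    import time

    HOST = "victim.example.net"   # hypothetical test-bed web server

    def browse(host, pages=5):
        conn = http.client.HTTPConnection(host, timeout=5)
        for _ in range(pages):
            conn.request("GET", "/")
            conn.getresponse().read()          # drain the response body
            time.sleep(random.uniform(1, 10))  # think time between clicks
        conn.close()

    browse(HOST)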

5
2. Existing Tools and Testing Methodologies
  • A. DARPA Environment

Figure 1: Attacks in the 1998 DARPA evaluation

Denial of Service (11 types, 43 instances)
  Solaris: Back, Neptune, Ping of death, Smurf, syslog, Land, apache2, Mailbomb, Process table, UDP storm
  SunOS:   Back, Neptune, Ping of death, Smurf, Land, apache2, Mailbomb, Process table, UDP storm
  Linux:   Back, Neptune, Ping of death, Smurf, teardrop, Land, apache2, Mailbomb, Process table, UDP storm

Remote to Local (14 types, 17 instances)
  Solaris: Dictionary, ftp-write, guest, phf, http tunnel, xlock, xsnoop
  SunOS:   Dictionary, ftp-write, guest, phf, http tunnel, xlock, xsnoop
  Linux:   Dictionary, ftp-write, guest, imap, phf, named, http tunnel, sendmail, xlock, xsnoop

User to Root (7 types, 38 instances)
  Solaris: Eject, ffbconfig, Fdformat, ps
  SunOS:   Loadmodule, ps
  Linux:   Perl, xterm

Surveillance/Probe (6 types, 22 instances)
  Solaris: Eject, nmap, Port sweep, Satan, mscan, saint
  SunOS:   Eject, nmap, Port sweep, Satan, mscan, saint
  Linux:   Eject, nmap, Port sweep, Satan, mscan, saint
6
2. Existing Tools and Testing Methodologies
  • A. DARPA Environment
  • In 1999 the goals shifted to testing complete
    systems
  • Changes and additions
  • Windows NT added as a victim
  • New stealthy attacks added
  • Two new types of analysis performed
  • An analysis of misses and high-scoring false
    alarms
  • Participants were allowed to submit information
    aiding in the identification of many attacks and
    their appropriate responses
  • Detection of novel attacks without prior training

7
2. Existing Tools and Testing Methodologies
  • B. LARIAT Environment
  • LARIAT emulates the network traffic from a small
    organization connected to the Internet
  • Several phases
  • Network discovery phase
  • Then the network is initialized and the hosts are
    configured
  • The test conditions are set up
  • Traffic generation is done through the use of
    defined service models (see the sketch after this
    list)
  • A modified Linux kernel allows their software to
    generate background traffic
  • Part of a government project and not publicly
    available
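LARIAT itself is not public, so purely as a labeled illustration, a "service model" can be pictured as a small Markov chain that picks the next benign service a simulated user exercises; the states and transition probabilities below are assumptions:

    # Illustrative sketch of a service model as a Markov chain over benign
    # services; states and transition probabilities are assumptions.
    import random

    TRANSITIONS = {
        "web":  [("web", 0.6), ("mail", 0.3), ("ftp", 0.1)],
        "mail": [("web", 0.5), ("mail", 0.4), ("ftp", 0.1)],
        "ftp":  [("web", 0.7), ("mail", 0.2), ("ftp", 0.1)],
    }

    def next_service(current):
        r, acc = random.random(), 0.0
        for nxt, p in TRANSITIONS[current]:
            acc += p
            if r < acc:
                return nxt
        return current

    service = "web"
    for _ in range(10):
        print("simulated user exercises:", service)  # would invoke a generator
        service = next_service(service)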

8
2. Existing Tools and Testing Methodologies
  • C. Nidsbench
  • A NIDS Test Suite released in 1999
  • Made up of the components tcpreplay, idtest and
    fragrouter
  • D. IDSwakeup
  • Similar to Nidsbench
  • Generates false attacks; a false-positive test
    utility
  • Consists of IDSwakeup and utilizes hping and iwu
  • E. Flame Thrower
  • Commercial load-stress tool used to identify
    network infrastructure weaknesses
  • Produces transactions in order to test network
    infrastructure and applications (see the sketch
    after this list)
  • Supports HTTP/HTTPS 1.0 and 1.1, and SSL
  • Can emulate over two million IP addresses
  • FirewallStressor measures throughput under attack
    conditions
  • Flame Thrower is intended for testing firewalls
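Flame Thrower is commercial and its internals are not shown in the slides; a minimal Python sketch of the same load-generation idea, with an assumed target host, worker count, and request count:

    # Minimal load-generation sketch in the spirit of Flame Thrower.
    # The target host, worker count, and request count are assumptions.
    import http.client
    import threading

    TARGET = "server.example.net"   # hypothetical system under test
    WORKERS = 20
    REQUESTS_PER_WORKER = 100

    def worker():
        conn = http.client.HTTPConnection(TARGET, timeout=5)
        for _ in range(REQUESTS_PER_WORKER):
            conn.request("GET", "/index.html")
            conn.getresponse().read()
        conn.close()

    threads = [threading.Thread(target=worker) for _ in range(WORKERS)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()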

9
2. Existing Tools and Testing Methodologies
  • F. WebAvalanche/WebReflector
  • Commercial network appliances used in the testing
    of IDS
  • WebAvalanche is a stress-testing appliance
  • WebReflector emulates the behavior of large Web,
    application and data server environments
  • Supports protocols such as HTTP 1.0/1.1, SSL,
    RTSP/RTP, and FTP
  • Measures the percentage of dropped packets,
    latencies, maximum number of users, and new-user
    arrival rates
  • G. Tcpreplay
  • Allows captured traffic to be played back on a
    network at different speeds (see the sketch after
    this list)
  • Reads captures from tcpdump or snoop
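As an illustration not from the slides, the same playback idea can be sketched in Python with Scapy, assuming a capture file name and interface:

    # Sketch of tcpreplay-style playback with Scapy; the capture file name
    # and interface are assumptions. Sending requires root privileges.
    from scapy.all import rdpcap, sendp

    packets = rdpcap("capture.pcap")           # traffic recorded with tcpdump
    sendp(packets, iface="eth0", inter=0.001)  # smaller inter = faster replay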

10
2. Existing Tools and Testing Methodologies
  • H. Fragrouter
  • An attack-generation tool for testing
    anti-evasion techniques and fragmentation queues
  • I. Hping2
  • A command-line packet assembler and analyzer
  • Allows one to create and transmit custom ICMP,
    UDP, and TCP packets (see the sketch after this
    list)
  • Can fingerprint remote operating systems
  • J. Iperf
  • Measures bandwidth, delay jitter, and datagram
    loss
  • Used as a background-traffic source
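A minimal Scapy sketch combining the Hping2 idea (hand-built packets) with the Fragrouter idea (fragmenting traffic to exercise IDS reassembly); the target address, payload, and fragment size are assumptions:

    # Sketch only: hand-crafted packets (Hping2-style) and tiny fragments
    # (Fragrouter-style). Target, payload, and fragsize are assumptions.
    from scapy.all import IP, TCP, Raw, fragment, send

    probe = IP(dst="192.0.2.10") / TCP(dport=80, flags="S")  # custom SYN
    send(probe)

    attack = (IP(dst="192.0.2.10") / TCP(dport=80)
              / Raw(b"GET /etc/passwd HTTP/1.0\r\n\r\n"))
    for frag in fragment(attack, fragsize=8):  # 8-byte IP fragments
        send(frag)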

11
4. Examples of Intrusion Detection Evaluation
Environments
  • DARPA-like Environment
  • Custom Software
  • Advanced Security Audit Trail Analysis on Unix
    (ASAX)
  • Vendor Independent Testing Lab
  • Trade Magazine Evaluation

12
DARPA-like Environment
  • Five components
  • Traffic generation
  • The victim was an anonymous FTP server running on
    a Sun UltraSparc-1 using the Solaris 2.5 OS
  • Attack-injection programs
  • The in-house reference programs counted the
    number of hung connections at the victim server
    as a measure of attack effectiveness. They used a
    metric called virulence, which described the
    intensity of an attack situation.
  • The evaluation method was to use 10, 15, 30, 40,
    and 60 attacking hosts, each attacking at varying
    rates of attacks per second (see the sketch after
    this list).
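The slides do not reproduce the paper's exact virulence formula; purely as an assumption for illustration, the sketch below treats intensity as attacking hosts times per-host attack rate, swept over the host counts named above:

    # Illustration only: the precise virulence definition is not given in
    # the slides, so hosts * per-host rate is an assumed stand-in.
    def virulence(attacking_hosts, attacks_per_second_per_host):
        return attacking_hosts * attacks_per_second_per_host

    for hosts in (10, 15, 30, 40, 60):   # host counts from the evaluation
        for rate in (0.5, 1.0, 5.0):     # assumed example per-host rates
            print(hosts, rate, virulence(hosts, rate))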

13
Custom Software
  • A software platform that simulates intrusions and
    tests IDS effectiveness
  • Criteria used included
  • Broad Detection Range
  • Economy in resource usage
  • Resilience to stress
  • The benchmark platform was based on the Expect
    and Tool Command Language Distributed Programming
    (Tcl-DP) packages (see the sketch below)
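The platform's Expect/Tcl-DP scripts are not shown in the slides; a minimal Python analogue using the pexpect library gives the flavor of a scripted intrusion session. The host, credentials, and prompts are assumptions:

    # Sketch of a scripted session; the original platform used Expect and
    # Tcl-DP, this uses Python's pexpect to show the same pattern.
    # Host, credentials, and prompt patterns are assumptions.
    import pexpect

    child = pexpect.spawn("telnet victim.example.net")
    child.expect("login:")
    child.sendline("guest")
    child.expect("Password:")
    child.sendline("guest")            # simulated weak-password break-in
    child.expect(r"\$")
    child.sendline("cat /etc/passwd")  # simulated privilege probing
    child.expect(r"\$")
    child.close()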

14
Advanced Security audit trail Analysis on uniX (ASAX)
  • The test consisted of the following scenarios
  • Trojan horse
  • Attempted break-ins
  • Masquerading
  • Suspicious connections
  • Black listed addresses
  • Nosing around (numerous moves through
    directories)
  • Privilege abuse

15
Vendor Independent Testing Lab
  • NSS tests a broad range of IDS features
  • Convenience: ease of installation, deployment,
    and management
  • UI: reporting and alert delivery
  • Attack signatures
  • Accuracy
  • Peripheral issues such as licensing,
    documentation, and log management

16
Vendor Independent Testing Lab
  • NSS's test-bed
  • Pentium III 1 GHz with 768 MB RAM running
    Windows 2000 SP2, FreeBSD 4.4, or Red Hat 6.2/7.1
  • Ghost image
  • 100 Mbit/s Ethernet with CAT-5 cabling, Intel
    NetStructure 40T routing switches, and Intel
    auto-sensing 10/100 network cards
  • IDS installed on a dual-homed PC on each subnet
  • No firewall used

17
Vendor Independent Testing Lab
  • NSS's five types of tests
  • Attack recognition
  • SANS Top 20 and/or ICAT Top 10 vulnerability
    lists
  • Performance under load
  • Back Orifice ping
  • 64-byte and 1514-byte packets at 25, 50, 75, and
    100 percent of network load (see the calculation
    after this list)
  • Adtech AX/4000 Broadband Test System and
    SmartBits SMB6000
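As a worked example not in the slides: converting "percent of network load" into packet rates on 100 Mbit/s Ethernet, assuming standard per-frame wire overhead (8-byte preamble plus 12-byte inter-frame gap):

    # Worked example: packet rates for the tested frame sizes and loads on
    # 100 Mbit/s Ethernet. Overhead constants are standard Ethernet framing.
    LINK_BPS = 100_000_000
    OVERHEAD_BYTES = 8 + 12   # preamble + inter-frame gap

    def packets_per_second(frame_bytes, load_percent):
        wire_bits = (frame_bytes + OVERHEAD_BYTES) * 8
        return LINK_BPS * (load_percent / 100) / wire_bits

    for size in (64, 1514):
        for load in (25, 50, 75, 100):
            pps = packets_per_second(size, load)
            print(f"{size}-byte frames at {load}% load: {pps:,.0f} pps")

At 100 percent load this gives roughly 148,800 packets per second for 64-byte frames and about 8,100 for 1514-byte frames.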

18
Vendor Independent Testing Lab
  • NSS's five types of tests (continued)
  • IDS evasion techniques
  • Tools: Fragrouter and whisker
  • Stateful operation test
  • Tools: stick and snot, used to generate false
    alerts (see the sketch after this list)
  • Host performance
  • Network load, CPU and memory utilizations were
    monitored
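Stick and snot send signature-matching packets without real sessions behind them; a minimal Scapy sketch of that idea (the target address and payload are assumptions) shows why a stateful IDS should stay quiet while a stateless one alerts:

    # Sketch of the stick/snot idea: a lone ACK carrying an attack-signature
    # payload with no TCP session behind it. Target/payload are assumptions.
    from scapy.all import IP, TCP, Raw, send

    bogus = (IP(dst="192.0.2.10")
             / TCP(dport=80, flags="A", seq=12345)  # no handshake occurred
             / Raw(b"GET /cgi-bin/phf? HTTP/1.0\r\n\r\n"))
    send(bogus)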

19
Trade Magazine Evaluation
  • Interesting approach
  • IDSs in the production network of an ISP
  • Deployed four machines
  • The metrics were accuracy, ease of use, and uptime

20
Conclusion