Title: Collection of general data mining briefings
1. Data Mining for Malicious Code Detection and Security Applications
Prof. Bhavani Thuraisingham, The University of Texas at Dallas
March 17, 2008, Lecture 19
2. Outline
- Acknowledgement
  - Prof. Latifur Khan, Dr. Awad (former PhD student and postdoc at UTD), Mehedy Masud (PhD student)
- Overview of data mining
- Vision for assured information sharing (our framework)
- Data mining for cyber security applications
  - Intrusion detection
  - Data mining for assembly code
  - Data mining for buffer overflow
  - Data mining for firewall policy checking
  - Data mining for email worm detection
 
3. Vision: Assured Information Sharing
[Diagram: components for Agency A, Agency B, and Agency C each publish their data/policy to a shared data/policy store for the coalition.]
- Friendly partners
- Semi-honest partners
- Untrustworthy partners
4. What is Data Mining?
5. What's Going On in Data Mining?
- What are the technologies for data mining?
  - Database management, data warehousing, machine learning, statistics, pattern recognition, visualization, parallel processing
- What can data mining do for you?
  - Data mining outcomes: classification, clustering, association, anomaly detection, prediction, estimation, ...
- How do you carry out data mining?
  - Data mining techniques: decision trees, neural networks, market-basket analysis, link analysis, genetic algorithms, ...
- What is the current status?
  - Many commercial products mine relational databases
- What are some of the challenges?
  - Mining unstructured data, extracting useful patterns, web mining; data mining, security and privacy
6. Data Mining for Intrusion Detection: Problem
- An intrusion can be defined as any set of actions that attempt to compromise the integrity, confidentiality, or availability of a resource.
- Attacks are either
  - host-based attacks, or
  - network-based attacks
- Intrusion detection systems are split into two groups:
  - anomaly detection systems
  - misuse detection systems
- Use audit logs
  - Capture all activities in the network and hosts
  - But the amount of data is huge!
 
7. Misuse Detection
8. Problem: Anomaly Detection
9. Our Approach: Overview
[Diagram: training data is clustered per class with hierarchical clustering (DGSOT); the resulting clusters feed SVM class training, and the model is then applied to the testing data. DGSOT = Dynamically Growing Self-Organizing Tree.]

10. Our Approach: Hierarchical Clustering
[Figure: flow chart of hierarchical clustering combined with SVM; a sketch of the idea follows.]
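A minimal sketch of the clustering-then-SVM idea, assuming scikit-learn is available. DGSOT itself is the authors' own algorithm, so plain k-means stands in here purely to illustrate shrinking the training set before SVM training; cluster_reduce is an illustrative name.

    # Replace each class by its cluster centroids, then train the SVM on the
    # reduced set (a stand-in for the DGSOT + SVM pipeline).
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.svm import SVC

    def cluster_reduce(X, y, clusters_per_class=50):
        Xr, yr = [], []
        for label in np.unique(y):
            Xc = X[y == label]
            k = min(clusters_per_class, len(Xc))
            km = KMeans(n_clusters=k, n_init=10).fit(Xc)
            Xr.append(km.cluster_centers_)
            yr.extend([label] * k)
        return np.vstack(Xr), np.array(yr)

    # Usage: Xr, yr = cluster_reduce(X_train, y_train)
    #        model = SVC(kernel="rbf").fit(Xr, yr)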
11. Results
[Table: training time, false positive (FP) and false negative (FN) rates of the various methods.]
12. Introduction: Detecting Malicious Executables Using Data Mining
- What are malicious executables?
  - Programs that harm computer systems
  - Viruses, exploits, denial of service (DoS) tools, flooders, sniffers, spoofers, Trojans, etc.
  - They exploit a software vulnerability on a victim
  - They may remotely infect other victims
  - They incur great loss; for example, the Code Red epidemic cost $2.6 billion
- Malicious code detection, the traditional approach:
  - Signature based
  - Requires signatures to be generated by human experts
  - So, not effective against zero-day attacks
 
13. State of the Art in Automated Detection
- Automated detection approaches:
  - Behavioral: analyze behaviors like source and destination addresses, attachment type, statistical anomalies, etc.
  - Content-based: analyze the content of the malicious executable
    - Autograph (H. Ah-Kim, CMU): based on an automated signature generation process
    - N-gram analysis (Maloof, M.A. et al.): based on mining features and using machine learning
14. Our New Ideas (Khan, Masud and Thuraisingham)
- Content-based approaches consider only machine code (byte code).
- Is it possible to consider higher-level source code for malicious code detection?
- Yes: disassemble the binary executable and retrieve the assembly program
- Extract important features from the assembly program
- Combine with machine-code features
 
15. Feature Extraction
- Binary n-gram features
  - Sequence of n consecutive bytes of the binary executable
- Assembly n-gram features
  - Sequence of n consecutive assembly instructions
- System API call features
  - DLL function call information
 
16. The Hybrid Feature Retrieval Model
- Collect training samples of normal and malicious executables
- Extract features
- Train a classifier and build a model
- Test the model against test samples
 
17. Hybrid Feature Retrieval (HFR)
18. Hybrid Feature Retrieval (HFR)
19. Feature Extraction
- Binary n-gram features
  - Features are extracted from the byte code in the form of n-grams, where n = 2, 4, 6, 8, 10 and so on.
- Example
  - Given the 11-byte sequence 0123456789abcdef012345,
  - the 2-grams (2-byte sequences) are 0123, 2345, 4567, 6789, 89ab, abcd, cdef, ef01, 0123, 2345
  - the 4-grams (4-byte sequences) are 01234567, 23456789, 456789ab, ..., ef012345, and so on
- Problem
  - Large dataset; too many features (millions!)
- Solution (see the sketch below)
  - Use secondary memory and efficient data structures
  - Apply feature selection
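A small sketch of the binary n-gram extraction step, reproducing the slide's sliding-window example over raw bytes:

    # Yield every sequence of n consecutive bytes (the window slides one byte).
    def binary_ngrams(data: bytes, n: int):
        for i in range(len(data) - n + 1):
            yield data[i:i + n]

    sample = bytes.fromhex("0123456789abcdef012345")
    print([g.hex() for g in binary_ngrams(sample, 2)])
    # ['0123', '2345', '4567', '6789', '89ab', 'abcd', 'cdef', 'ef01', '0123', '2345']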
 
20. Feature Extraction
- Assembly n-gram features
  - Features are extracted from the assembly programs in the form of n-grams, where n = 2, 4, 6, 8, 10 and so on.
- Example: three instructions
  - push eax; mov eax, dword[0f34]; add ecx, eax
  - The 2-grams are
    (1) push eax; mov eax, dword[0f34]
    (2) mov eax, dword[0f34]; add ecx, eax
- Problem: same problem as with binary n-grams (too many features)
- Solution: same solution (see the sketch below)
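The same sliding window applies at the assembly level; a sketch over a list of instruction strings, matching the three-instruction example above:

    # n consecutive instructions instead of n consecutive bytes.
    def assembly_ngrams(instructions, n):
        return [tuple(instructions[i:i + n]) for i in range(len(instructions) - n + 1)]

    prog = ["push eax", "mov eax, dword[0f34]", "add ecx, eax"]
    print(assembly_ngrams(prog, 2))
    # [('push eax', 'mov eax, dword[0f34]'), ('mov eax, dword[0f34]', 'add ecx, eax')]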
 
21. Feature Selection
- Select the best K features
- Selection criterion: information gain
- The gain of an attribute A on a collection of examples S is given by
  Gain(S, A) = Entropy(S) - Σ_{v ∈ Values(A)} (|S_v| / |S|) · Entropy(S_v)
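A sketch of information-gain scoring under that formula, assuming discrete feature values (e.g. n-gram present/absent) and class labels:

    import math

    def entropy(labels):
        total = len(labels)
        return -sum((labels.count(c) / total) * math.log2(labels.count(c) / total)
                    for c in set(labels))

    # Gain(S, A) = Entropy(S) - sum_v |S_v|/|S| * Entropy(S_v)
    def info_gain(feature_values, labels):
        gain = entropy(labels)
        for v in set(feature_values):
            subset = [l for f, l in zip(feature_values, labels) if f == v]
            gain -= (len(subset) / len(labels)) * entropy(subset)
        return gain

    # Rank all candidate features by info_gain and keep the best K.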
22. Experiments
- Dataset
  - Dataset1: 838 malicious and 597 benign executables
  - Dataset2: 1082 malicious and 1370 benign executables
  - Malicious code collected from VX Heavens (http://vx.netlux.org)
- Disassembly
  - Pedisassem (http://www.geocities.com/sangcho/index.html)
- Training and testing
  - Support Vector Machine (SVM)
  - C-support vector classifier with an RBF kernel (see the sketch below)
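A minimal training/testing sketch matching this setup (a C-support vector classifier with an RBF kernel), shown here with scikit-learn and random stand-in data so it runs on its own; the original work predates scikit-learn and would have used an SVM package of the period.

    import numpy as np
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.random((200, 500))    # stand-in for selected n-gram feature vectors
    y = rng.integers(0, 2, 200)   # 1 = malicious, 0 = benign

    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2)
    clf = SVC(C=1.0, kernel="rbf").fit(X_train, y_train)
    print("accuracy:", clf.score(X_test, y_test))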
 
23. Results
- HFS: Hybrid Feature Set
- BFS: Binary Feature Set
- AFS: Assembly Feature Set

24. Results
- HFS: Hybrid Feature Set
- BFS: Binary Feature Set
- AFS: Assembly Feature Set

25. Results
- HFS: Hybrid Feature Set
- BFS: Binary Feature Set
- AFS: Assembly Feature Set
 
26. Future Plans
- System calls
  - Seem to be very useful
  - Need to consider the frequency of calls
  - Call sequence patterns (following program paths)
  - Actions immediately preceding or following a call
- Detect malicious code by program slicing
  - Requires analysis
 
27. Data Mining for Buffer Overflow: Introduction
- Goal
  - Intrusion detection, e.g. worm attacks and buffer overflow attacks
- Main contributions
  - 'Worm' code detection by data mining coupled with 'reverse engineering'
  - Buffer overflow detection by combining data mining with static analysis of assembly code
28. Background
- What is a 'buffer overflow'?
  - A situation in which a fixed-size buffer is overflown by a larger input.
- How does it happen? Example:

    ........
    char buff[100];
    gets(buff);
    ........

[Diagram: the input string is written into buff in stack memory.]
29. Background (cont.)
[Diagram: with the same code (char buff[100]; gets(buff);), an oversized input overflows buff in stack memory and overwrites the return address; the new return address points to the memory location holding the attacker's code.]
30. Background (cont.)
- So what?
  - The program may crash, or
  - the attacker can execute arbitrary code, which can then
    - execute any system function,
    - communicate with some host, download some 'worm' code and install it,
    - open a backdoor to take full control of the victim.
- How to stop it?
 
31. Background (cont.)
- Stopping buffer overflow: preventive approaches and detection approaches
- Preventive approaches
  - Finding bugs in source code. Problem: only works when source code is available.
  - Compiler extensions. Same problem.
  - OS/HW modification.
- Detection approaches
  - Capture the symptoms of running code. Problem: may require a long running time.
  - Automatically generate signatures of buffer overflow attacks.
32. CodeBlocker (Our Approach)
- A detection approach
- Based on the observation that attack messages usually contain code while normal messages contain data
- Main idea: check whether a message contains code
- Problem to solve: distinguishing code from data
 
33. Some Statistics
- Statistics supporting this observation:
- (a) On Windows platforms, most
  - web servers (port 80) accept data only
  - remote access services (ports 111, 137, 138, 139) accept data only
  - Microsoft SQL Servers (port 1434) accept data only
  - workstation services (ports 139 and 445) accept data only
- (b) On Linux platforms, most
  - Apache web servers (port 80) accept data only
  - BIND servers (port 53) accept data only
  - SNMP services (port 161) accept data only
  - mail transport services (port 25) accept data only
  - database servers (Oracle, MySQL, PostgreSQL) at ports 1521, 3306 and 5432 accept data only
34. Severity of the Problem
- It is not easy to detect the actual instruction sequence in a given string of bits
35. Our Solution
- Apply data mining:
  - Formulate the problem as a classification problem (code vs. data)
  - Collect a set of training examples containing both classes
  - Train on the data with a machine learning algorithm to obtain a model
  - Test this model against new messages
 
36. CodeBlocker Model
37. Feature Extraction
38. Disassembly
- We apply the SigFree tool
  - implemented by Xinran Wang et al. (Penn State)
 
39. Feature Extraction
- Features are extracted using
  - n-gram analysis
  - control flow analysis
- N-gram analysis
  - What is an n-gram? A sequence of n instructions.
  - Traditional approach: the flow of control is ignored.
  - The 2-grams are 02, 24, 46, ..., CE.
[Figure: an assembly program and its corresponding instruction flow graph (IFG).]
40. Feature Extraction (cont.)
- Control-flow based n-gram analysis
  - What is an n-gram? A sequence of n instructions.
  - Proposed control-flow based approach: the flow of control is considered.
  - The 2-grams are 02, 24, 46, ..., CE, E6 (see the sketch below).
[Figure: the same assembly program and its corresponding instruction flow graph (IFG).]
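A sketch of control-flow-based 2-gram extraction: instead of sliding over the linear instruction listing, walk the edges of the instruction flow graph, so grams across jumps (such as the extra E6 above) appear. The toy graph below uses hex labels to mirror the slide's figure and is illustrative only.

    # Enumerate n-grams as length-n walks over the instruction flow graph.
    def cfg_ngrams(successors, n=2):
        paths = [[node] for node in successors]
        for _ in range(n - 1):
            paths = [p + [s] for p in paths for s in successors.get(p[-1], [])]
        return ["".join(p) for p in paths]

    # Straight-line code 0..E plus one jump E -> 6 (illustrative labels).
    succ = {"0": ["2"], "2": ["4"], "4": ["6"], "6": ["8"], "8": ["A"],
            "A": ["C"], "C": ["E"], "E": ["6"]}
    print(cfg_ngrams(succ))
    # ['02', '24', '46', '68', '8A', 'AC', 'CE', 'E6']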
41. Feature Extraction (cont.)
- Control flow analysis generates the features
  - Invalid Memory Reference (IMR)
  - Undefined Register (UR)
  - Invalid Jump Target (IJT)
- Checking IMR
  - Memory is referenced using register addressing but the register value is undefined
  - e.g. mov ax, [dx + 5]
- Checking UR
  - Check whether the register value is set properly
- Checking IJT
  - Check whether the jump target violates an instruction boundary
42. Putting It Together
- Why n-gram analysis?
  - Intuition: in general, disassembled executables should have a different pattern of instruction usage than disassembled data.
- Why control flow analysis?
  - Intuition: there should be no invalid memory references or invalid jump targets.
- Approach (sketched below):
  - Compute all possible n-grams
  - Select the best k of them
  - Compute the (binary) feature vector of each training example
  - Supply these vectors to the training algorithm
 
43. Experiments
- Dataset
  - Real traces of normal messages
  - Real attack messages
  - Polymorphic shellcodes
- Training and testing
  - Support Vector Machine (SVM)
 
44. Results
- CFBn: control-flow based n-gram features
- CFF: control-flow features
 
45. Novelty, Advantages, Limitations, Future
- Novelty
  - We introduce the notion of control-flow based n-grams
  - We combine control flow analysis with data mining to distinguish code from data
  - Significant improvement over other methods (e.g. SigFree)
- Advantages
  - Fast testing
  - Signature-free operation
  - Low overhead
  - Robust against many obfuscations
- Limitations
  - Needs samples of attack and normal messages
  - May not be able to detect a completely new type of attack
- Future
  - Find more features
  - Apply dynamic analysis techniques
  - Semantic analysis
 
46. Analysis of Firewall Policy Rules Using Data Mining Techniques
- The firewall is the de facto core technology of today's network security
- It is the first line of defense against external network attacks and threats
- A firewall controls or governs network access by allowing or denying incoming and outgoing network traffic according to firewall policy rules
- Manual definition of rules often results in anomalies in the policy
- Detecting and resolving these anomalies manually is a tedious and error-prone task
- Solutions:
  - Anomaly detection
    - A theoretical framework for the resolution of anomalies
    - A new algorithm that simultaneously detects and resolves any anomaly present in the policy rules
  - Traffic mining: mine the traffic and detect anomalies
47. Traffic Mining
- To bridge the gap between what is written in the firewall policy rules and what is observed in the network, analyze the traffic and the packet logs: traffic mining
- Network traffic trends may show that some rules are outdated or have not been used recently
48. Firewall Policy Rules
 1: TCP,INPUT,129.110.96.117,ANY,...,80,DENY
 2: TCP,INPUT,...,ANY,...,80,ACCEPT
 3: TCP,INPUT,...,ANY,...,443,DENY
 4: TCP,INPUT,129.110.96.117,ANY,...,22,DENY
 5: TCP,INPUT,...,ANY,...,22,ACCEPT
 6: TCP,OUTPUT,129.110.96.80,ANY,...,22,DENY
 7: UDP,OUTPUT,...,ANY,...,53,ACCEPT
 8: UDP,INPUT,...,53,...,ANY,ACCEPT
 9: UDP,OUTPUT,...,ANY,...,ANY,DENY
10: UDP,INPUT,...,ANY,...,ANY,DENY
11: TCP,INPUT,129.110.96.117,ANY,129.110.96.80,22,DENY
12: TCP,INPUT,129.110.96.117,ANY,129.110.96.80,80,DENY
13: UDP,INPUT,...,ANY,129.110.96.80,ANY,DENY
14: UDP,OUTPUT,129.110.96.80,ANY,129.110.10.,ANY,DENY
15: TCP,INPUT,...,ANY,129.110.96.80,22,ACCEPT
16: TCP,INPUT,...,ANY,129.110.96.80,80,ACCEPT
17: UDP,INPUT,129.110..,53,129.110.96.80,ANY,ACCEPT
18: UDP,OUTPUT,129.110.96.80,ANY,129.110..,53,ACCEPT

Anomaly discovery result:
- Rule 1, Rule 2 → GENERALIZATION
- Rule 1, Rule 16 → CORRELATED
- Rule 2, Rule 12 → SHADOWED
- Rule 4, Rule 5 → GENERALIZATION
- Rule 4, Rule 15 → CORRELATED
- Rule 5, Rule 11 → SHADOWED
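A simplified sketch of the rule-pair analysis, reducing each rule to exact field values with an ANY wildcard; real firewall analysis must also compare address ranges, so this only illustrates shadowing and generalization.

    def covers(a, b):
        return a == "ANY" or a == b

    # True if every packet matched by r2 is also matched by r1.
    def rule_covers(r1, r2):
        return all(covers(f1, f2) for f1, f2 in zip(r1["match"], r2["match"]))

    def classify_pair(earlier, later):
        if earlier["action"] == later["action"]:
            return None
        if rule_covers(earlier, later):
            return "SHADOWED"        # the later rule can never fire
        if rule_covers(later, earlier):
            return "GENERALIZATION"  # the later rule generalizes the earlier one
        return None

    r1 = {"match": ("TCP", "INPUT", "129.110.96.117", "80"), "action": "DENY"}
    r2 = {"match": ("TCP", "INPUT", "ANY", "80"), "action": "ACCEPT"}
    print(classify_pair(r1, r2))     # GENERALIZATION, as with Rule 1 / Rule 2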
49. Worm Detection: Introduction
- What are worms?
  - Self-replicating programs that exploit a software vulnerability on a victim and remotely infect other victims
- Evil worms
  - Severe effects: the Code Red epidemic cost $2.6 billion
- Goals of worm detection
  - Real-time detection
- Issues
  - Substantial volume of identical traffic, random probing
- Methods for worm detection
  - Count the number of sources/destinations; count the number of failed connection attempts
- Worm types
  - Email worms, instant messaging worms, Internet worms, IRC worms, file-sharing network worms
- Automatic signature generation is possible
  - EarlyBird system (S. Singh, UCSD); Autograph (H. Ah-Kim, CMU)
50. Email Worm Detection Using Data Mining
- Task: given training instances of both normal and viral emails, induce a hypothesis to detect viral emails.
- We used Naïve Bayes and SVM.
[Diagram: outgoing emails pass through feature extraction; the training data feeds a machine learning algorithm that builds the model; test data is then classified as clean or infected.]
51. Assumptions
- Features are based on outgoing emails
- Different users have different normal behavior
  - Analysis should be on a per-user basis
- Two groups of features
  - Per email (number of attachments, HTML in body, text/binary attachments)
  - Per window (mean words in body, variable words in subject)
- A total of 24 features identified
- Goal: identify normal and viral emails based on these features
52. Feature Sets
- Per-email features (a sketch of a few of these follows)
  - Binary-valued features
    - Presence of HTML; script tags/attributes; embedded images; hyperlinks
    - Presence of binary and text attachments; MIME types of file attachments
  - Continuous-valued features
    - Number of attachments; number of words/characters in the subject and body
- Per-window features
  - Number of emails sent; number of unique email recipients; number of unique sender addresses
  - Average number of words/characters per subject and body; average word length
  - Variance in number of words/characters per subject and body; variance in word length
  - Ratio of emails with attachments
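A sketch of a few of the per-email features above, assuming messages parsed with Python's standard email module; the chosen fields are illustrative, not the authors' exact 24-feature set.

    import email

    def email_features(msg):
        body = "" if msg.is_multipart() else str(msg.get_payload())
        attachments = [p for p in msg.walk() if p.get_filename()]
        return {
            "num_attachments": len(attachments),
            "has_html": int("<html" in body.lower()),
            "subject_words": len((msg.get("Subject") or "").split()),
            "body_chars": len(body),
        }

    raw = "Subject: hello\n\n<html><body>hi</body></html>"
    print(email_features(email.message_from_string(raw)))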
 
53. Data Mining Approach
[Diagram: each test instance first goes to the SVM; if the SVM flags it, it is reported as infected; instances the SVM labels clean are passed to Naïve Bayes for the final clean/infected decision. A sketch of this cascade follows.]
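A sketch of that cascade, assuming scikit-learn and numeric feature vectors: the SVM screens each test instance, and only instances it labels clean get a second opinion from Naïve Bayes. Random stand-in data is used so the snippet runs on its own.

    import numpy as np
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC

    rng = np.random.default_rng(1)
    X = rng.random((300, 24))     # 24 features per email, as on the earlier slide
    y = rng.integers(0, 2, 300)   # 1 = infected, 0 = clean

    svm = SVC().fit(X, y)
    nb = GaussianNB().fit(X, y)

    def classify(x):
        if svm.predict([x])[0] == 1:          # SVM flags it directly
            return "infected"
        return "infected" if nb.predict([x])[0] == 1 else "clean"

    print(classify(X[0]))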
54. Data Set
- Collected from UC Berkeley
- Contains instances of both normal and viral emails
- Six worm types:
  - bagle.f, bubbleboy, mydoom.m, mydoom.u, netsky.d, sobig.f
- Originally six sets of data:
  - Training instances: normal (400) plus five worms (5 x 200)
  - Testing instances: normal (1200) plus the sixth worm (200)
- Problem: not balanced, and no cross-validation reported
- Solution: rearrange the data and apply cross-validation (sketched below)
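One plausible reading of that rearrangement, assuming scikit-learn: pool all normal and worm instances, then evaluate with stratified k-fold cross-validation instead of the fixed leave-one-worm-out split (random stand-in data again).

    import numpy as np
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.naive_bayes import GaussianNB

    rng = np.random.default_rng(2)
    X = rng.random((1600, 24))    # pooled normal + worm feature vectors
    y = rng.integers(0, 2, 1600)  # 1 = viral, 0 = normal

    scores = cross_val_score(GaussianNB(), X, y, cv=StratifiedKFold(n_splits=5))
    print("mean accuracy:", scores.mean())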
55. Our Implementation and Analysis
- Implementation
  - Naïve Bayes: assume a normal distribution for numeric and real-valued data; smoothing applied
  - SVM parameter settings: a one-class SVM with the radial basis function, gamma = 0.015 and nu = 0.1 (sketched at the end of this slide)
- Analysis
  - NB alone performs better than the other techniques
  - SVM alone also performs better if the parameters are set correctly
  - The mydoom.m and VBS.Bubbleboy data sets are not sufficient (very low detection accuracy in all classifiers)
  - The feature-based approach seems to be useful only when we have
    - identified the relevant features,
    - gathered enough training data, and
    - implemented the classifiers with the best parameter settings
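A sketch of the stated one-class configuration, assuming scikit-learn: an RBF-kernel one-class SVM with gamma = 0.015 and nu = 0.1, trained on clean emails only so that outliers (predicted -1) are treated as infected.

    import numpy as np
    from sklearn.svm import OneClassSVM

    rng = np.random.default_rng(3)
    X_clean = rng.random((400, 24))   # feature vectors of normal emails only

    ocsvm = OneClassSVM(kernel="rbf", gamma=0.015, nu=0.1).fit(X_clean)
    pred = ocsvm.predict(rng.random((10, 24)))   # +1 = clean, -1 = infected
    print(pred)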