1
Data Mining Classification: Basic Concepts, Decision Trees, and Model Evaluation
  • Lecture Notes for Chapter 4
  • Introduction to Data Mining
  • by
  • Tan, Steinbach, Kumar

2
Classification Definition
  • Given a collection of records (training set)
  • Each record is characterized by a tuple (x, y), where x is the attribute set and y is the class label
  • x: attribute, predictor, independent variable, input
  • y: class, response, dependent variable, output
  • Task:
  • Learn a model that maps each attribute set x into one of the predefined class labels y (a minimal code sketch follows)
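As a concrete sketch of this task (assuming scikit-learn is available; the records below are hypothetical):

from sklearn.tree import DecisionTreeClassifier

# Hypothetical records: each row of X is an attribute set x; y holds the class labels.
X = [[1, 0, 125], [0, 1, 100], [0, 0, 70], [1, 1, 120]]
y = ["No", "No", "Yes", "No"]

model = DecisionTreeClassifier()
model.fit(X, y)                     # learn the mapping x -> y
print(model.predict([[0, 0, 95]]))  # apply the model to an unseen record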

3
Examples of Classification Task
Task                        | Attribute set, x                                        | Class label, y
Categorizing email messages | Features extracted from email message header and content | spam or non-spam
Identifying tumor cells     | Features extracted from MRI scans                        | malignant or benign cells
Cataloging galaxies         | Features extracted from telescope images                 | elliptical, spiral, or irregular-shaped galaxies
4
General Approach for Building a Classification Model
5
Classification Techniques
  • Base Classifiers
  • Decision Tree based Methods
  • Rule-based Methods
  • Nearest-neighbor
  • Neural Networks
  • Naïve Bayes and Bayesian Belief Networks
  • Support Vector Machines
  • Ensemble Classifiers
  • Boosting, Bagging, Random Forests

6
Example of a Decision Tree
(Figure: training data table and the decision tree model learned from it. Home Owner and MarSt are categorical splitting attributes, Income is continuous, and the last column is the class.)

Decision tree model:
  Home Owner = Yes -> NO
  Home Owner = No  -> test MarSt:
    MarSt = Married          -> NO
    MarSt = Single, Divorced -> test Income:
      Income < 80K -> NO
      Income > 80K -> YES
7
Another Example of Decision Tree
(Figure: the same training data with an alternative decision tree.)

Alternative decision tree:
  MarSt = Married          -> NO
  MarSt = Single, Divorced -> test Home Owner:
    Home Owner = Yes -> NO
    Home Owner = No  -> test Income:
      Income < 80K -> NO
      Income > 80K -> YES
There could be more than one tree that fits the
same data!
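The first tree can be written directly as nested conditionals (a minimal sketch with hypothetical parameter names):

def classify(home_owner, marital_status, income):
    # The first example tree above, expressed as nested tests.
    if home_owner == "Yes":
        return "NO"
    if marital_status == "Married":
        return "NO"
    # Single or Divorced: decide on income.
    return "NO" if income < 80_000 else "YES"

print(classify("No", "Single", 95_000))  # -> "YES"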
8
Decision Tree Induction
  • Many Algorithms
  • Hunt's Algorithm (one of the earliest)
  • CART
  • ID3, C4.5
  • SLIQ, SPRINT

9
General Structure of Hunt's Algorithm
  • Let Dt be the set of training records that reach
    a node t
  • General Procedure
  • If Dt contains records that all belong to the same class yt, then t is a leaf node labeled as yt
  • If Dt contains records that belong to more than one class, use an attribute test to split the data into smaller subsets. Recursively apply the procedure to each subset (a sketch of this recursion follows).
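A minimal sketch of the recursion (real implementations choose the attribute test by an impurity measure; for brevity this sketch just takes the next available attribute):

from collections import namedtuple

Leaf = namedtuple("Leaf", "label")
Node = namedtuple("Node", "attr children")

def hunt(records, attributes):
    # records: list of (attribute_dict, label) pairs reaching this node (Dt).
    labels = [y for _, y in records]
    if len(set(labels)) == 1:        # pure node: leaf labeled yt
        return Leaf(labels[0])
    if not attributes:               # no tests left: majority-class leaf
        return Leaf(max(set(labels), key=labels.count))
    attr, rest = attributes[0], attributes[1:]
    groups = {}                      # split the data into smaller subsets
    for x, y in records:
        groups.setdefault(x[attr], []).append((x, y))
    return Node(attr, {v: hunt(sub, rest) for v, sub in groups.items()})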

10
Hunt's Algorithm
11
How to determine the Best Split
Before splitting: 10 records of class 0, 10 records of class 1
Which test condition is the best?
12
How to determine the Best Split
  • Greedy approach
  • Nodes with purer class distribution are preferred
  • Need a measure of node impurity

(Figure: example class distributions, ranging from a high degree of impurity to a low degree of impurity.)
13
Measures of Node Impurity
  • Gini Index
  • Entropy
  • Misclassification error

14
Comparison among Impurity Measures
For a 2-class problem
15
Measure of Impurity: GINI
  • Gini Index for a given node t:
    Gini(t) = 1 - Σ_j [p(j|t)]^2
    (NOTE: p(j|t) is the relative frequency of class j at node t)
  • Maximum (1 - 1/nc) when records are equally distributed among all classes, implying least interesting information
  • Minimum (0.0) when all records belong to one class, implying most interesting information
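A direct translation of this definition into code (a minimal sketch):

from collections import Counter

def gini(labels):
    # Gini(t) = 1 - Σ_j p(j|t)^2, with p(j|t) the relative class frequencies.
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

print(gini(["C0"] * 5 + ["C1"] * 5))  # 0.5: maximum for a 2-class node
print(gini(["C0"] * 10))              # 0.0: pure node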

16
Binary Attributes: Computing GINI Index
  • Splits into two partitions
  • Effect of weighting partitions: larger and purer partitions are sought

(Figure: binary test on attribute B splitting the parent node into Node N1 and Node N2.)

Gini(N1) = 1 - (5/6)^2 - (1/6)^2 = 0.278
Gini(N2) = 1 - (2/6)^2 - (4/6)^2 = 0.444
Gini(Children) = (6/12) × 0.278 + (6/12) × 0.444 = 0.361
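The same computation in code, reusing gini() from the sketch above (the counts match the example: N1 holds 5 C0 and 1 C1, N2 holds 2 C0 and 4 C1):

def gini_split(partitions):
    # Weighted Gini of the children: Σ_k (n_k / n) * Gini(N_k).
    n = sum(len(p) for p in partitions)
    return sum(len(p) / n * gini(p) for p in partitions)

n1 = ["C0"] * 5 + ["C1"] * 1           # Gini(N1) = 0.278
n2 = ["C0"] * 2 + ["C1"] * 4           # Gini(N2) = 0.444
print(round(gini_split([n1, n2]), 3))  # 0.361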
17
Categorical Attributes: Computing Gini Index
  • For each distinct value, gather counts for each class in the dataset
  • Use the count matrix to make decisions

(Figure: count matrices for a multi-way split and for a two-way split that finds the best partition of values.)
18
Decision Tree Based Classification
  • Advantages
  • Inexpensive to construct
  • Extremely fast at classifying unknown records
  • Easy to interpret for small-sized trees
  • Accuracy is comparable to other classification
    techniques for many simple data sets

19
Rule-Based Classifier
  • Classify records by using a collection of if-then rules

R1: (Give Birth = no) ∧ (Can Fly = yes) → Birds
R2: (Give Birth = no) ∧ (Live in Water = yes) → Fishes
R3: (Give Birth = yes) ∧ (Blood Type = warm) → Mammals
R4: (Give Birth = no) ∧ (Can Fly = no) → Reptiles
R5: (Live in Water = sometimes) → Amphibians
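One simple way to represent such a rule set (a sketch; the attribute names are hypothetical, and the first matching rule fires, which is one common conflict-resolution strategy):

# Each rule is a (condition, label) pair, mirroring R1-R5 above.
RULES = [
    (lambda r: r.get("give_birth") == "no" and r.get("can_fly") == "yes", "Birds"),
    (lambda r: r.get("give_birth") == "no" and r.get("live_in_water") == "yes", "Fishes"),
    (lambda r: r.get("give_birth") == "yes" and r.get("blood_type") == "warm", "Mammals"),
    (lambda r: r.get("give_birth") == "no" and r.get("can_fly") == "no", "Reptiles"),
    (lambda r: r.get("live_in_water") == "sometimes", "Amphibians"),
]

def classify(record):
    for condition, label in RULES:
        if condition(record):
            return label
    return None  # no rule covers the record

print(classify({"give_birth": "yes", "blood_type": "warm"}))  # Mammals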
20
Nearest Neighbor Classifiers
  • Basic idea
  • If it walks like a duck and quacks like a duck, then it's probably a duck
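In code, the "duck test" becomes a majority vote among the k most similar training records (a minimal k-nearest-neighbor sketch for numeric attributes):

import math
from collections import Counter

def knn_predict(train, x, k=3):
    # train: list of (numeric_attribute_vector, label) pairs.
    nearest = sorted(train, key=lambda rec: math.dist(rec[0], x))[:k]
    # Majority label among the k closest records wins.
    return Counter(label for _, label in nearest).most_common(1)[0][0]

train = [([1.0, 2.0], "duck"), ([1.2, 1.9], "duck"), ([8.0, 9.0], "goose")]
print(knn_predict(train, [1.1, 2.1], k=1))  # duck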

21
Bayes Classifier
  • A probabilistic framework for solving classification problems
  • Key idea: certain attribute values are more likely (probable) for some classes than for others
  • Example: the probability that an individual is male or female, given that the individual is wearing a dress
  • Conditional probability
  • Bayes theorem: P(Y|X) = P(X|Y) P(Y) / P(X)
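The dress example, worked through with Bayes' theorem (the probabilities below are hypothetical, chosen only to illustrate the computation):

p_female = 0.5               # prior P(female)
p_dress_given_female = 0.40  # hypothetical likelihood P(dress | female)
p_dress_given_male = 0.01    # hypothetical likelihood P(dress | male)

# P(dress) by the law of total probability.
p_dress = p_dress_given_female * p_female + p_dress_given_male * (1 - p_female)
# Bayes' theorem: P(female | dress) = P(dress | female) P(female) / P(dress).
print(round(p_dress_given_female * p_female / p_dress, 3))  # 0.976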

22
Evaluating Classifiers
  • Confusion Matrix

                     PREDICTED CLASS
                     Class=Yes   Class=No
ACTUAL   Class=Yes       a           b
CLASS    Class=No        c           d

a: TP (true positive)    b: FN (false negative)
c: FP (false positive)   d: TN (true negative)
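Tallying the four cells from a list of predictions (a minimal sketch):

def confusion_cells(actual, predicted, positive="Yes"):
    # Returns the four cells (a, b, c, d) = (TP, FN, FP, TN).
    tp = fn = fp = tn = 0
    for act, pred in zip(actual, predicted):
        if act == positive and pred == positive:
            tp += 1
        elif act == positive:
            fn += 1
        elif pred == positive:
            fp += 1
        else:
            tn += 1
    return tp, fn, fp, tn

print(confusion_cells(["Yes", "No", "Yes", "No"], ["Yes", "Yes", "No", "No"]))
# (1, 1, 1, 1)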
23
Accuracy
  • Most widely-used metric:
    Accuracy = (a + d) / (a + b + c + d) = (TP + TN) / (TP + TN + FP + FN)

                     PREDICTED CLASS
                     Class=Yes   Class=No
ACTUAL   Class=Yes     a (TP)      b (FN)
CLASS    Class=No      c (FP)      d (TN)
24
Methods for Classifier Evaluation
  • Holdout
  • Reserve k% for training and (100-k)% for testing
  • Random subsampling
  • Repeated holdout
  • Cross validation
  • Partition data into k disjoint subsets
  • k-fold: train on k-1 partitions, test on the remaining one
  • Leave-one-out: k = n
  • Bootstrap
  • Sampling with replacement
  • .632 bootstrap
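A minimal k-fold cross-validation loop (a sketch assuming scikit-learn; the data here is a random placeholder):

import numpy as np
from sklearn.model_selection import KFold
from sklearn.tree import DecisionTreeClassifier

# Placeholder data: 100 records, 4 attributes, binary label.
X = np.random.rand(100, 4)
y = np.random.randint(0, 2, size=100)

scores = []
for train_idx, test_idx in KFold(n_splits=5, shuffle=True).split(X):
    model = DecisionTreeClassifier().fit(X[train_idx], y[train_idx])
    scores.append(model.score(X[test_idx], y[test_idx]))  # accuracy on held-out fold
print(np.mean(scores))  # cross-validated accuracy estimate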

25
Problem with Accuracy
  • Consider a 2-class problem:
  • Number of Class 0 examples = 9990
  • Number of Class 1 examples = 10
  • If a model predicts everything to be class 0, accuracy is 9990/10000 = 99.9%
  • This is misleading because the model does not detect any class 1 example
  • Detecting the rare class is usually more interesting (e.g., frauds, intrusions, defects, etc.)

26
Example of classification accuracy measures
                     PREDICTED CLASS
                     Class=Yes   Class=No
ACTUAL   Class=Yes    35 (TP)      5 (FN)
CLASS    Class=No      5 (FP)      5 (TN)

  • Accuracy = 0.8
  • For the Yes class: precision = 0.875, recall = 0.875, F-measure = 0.875
  • For the No class: precision = 0.5, recall = 0.5, F-measure = 0.5
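Checking these numbers directly from the cell counts (a small sketch):

tp, fn, fp, tn = 35, 5, 5, 5
accuracy = (tp + tn) / (tp + fn + fp + tn)                             # 40/50 = 0.8
precision_yes = tp / (tp + fp)                                         # 35/40 = 0.875
recall_yes = tp / (tp + fn)                                            # 35/40 = 0.875
f_yes = 2 * precision_yes * recall_yes / (precision_yes + recall_yes)  # 0.875
precision_no = tn / (tn + fn)                                          # 5/10 = 0.5
recall_no = tn / (tn + fp)                                             # 5/10 = 0.5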

27
Example of classification accuracy measures
                     PREDICTED CLASS
                     Class=Yes   Class=No
ACTUAL   Class=Yes    99 (TP)      1 (FN)
CLASS    Class=No     10 (FP)     90 (TN)

  • Accuracy = (99 + 90) / 200 = 0.945
  • Sensitivity = 99/100 = 0.99
  • Specificity = 90/100 = 0.90

28
Measures of Classification Performance
 
                PREDICTED CLASS
                  Yes      No
ACTUAL   Yes      TP       FN
CLASS    No       FP       TN

α is the probability that we reject the null hypothesis when it is true. This is a Type I error, or a false positive (FP). β is the probability that we accept the null hypothesis when it is false. This is a Type II error, or a false negative (FN).
29
ROC (Receiver Operating Characteristic)
  • A graphical approach for displaying the trade-off between detection rate and false alarm rate
  • Developed in the 1950s in signal detection theory to analyze noisy signals
  • An ROC curve plots True Positive Rate (TPR) against False Positive Rate (FPR)
  • The performance of a model is represented as a point on an ROC curve
  • Changing the threshold parameter of the classifier changes the location of the point
  • http://commonsenseatheism.com/wp-content/uploads/2011/01/Swets-Better-Decisions-Through-Science.pdf

30
ROC Curve
  • (TPR, FPR):
  • (0, 0): declare everything to be negative class
  • (1, 1): declare everything to be positive class
  • (1, 0): ideal
  • Diagonal line: random guessing
  • Below diagonal line: prediction is opposite of the true class

31
Using ROC for Model Comparison
  • Neither model consistently outperforms the other:
  • M1 is better for small FPR
  • M2 is better for large FPR
  • Area Under the ROC Curve (AUC):
  • Ideal: area = 1
  • Random guess: area = 0.5
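Given continuous scores for the test records, AUC provides the single-number comparison (a sketch assuming scikit-learn; the labels and the two score arrays are placeholders):

from sklearn.metrics import roc_auc_score

y_true = [1, 1, 0, 0, 1, 0, 1, 0, 0, 1]                      # placeholder labels
scores_m1 = [0.9, 0.8, 0.4, 0.3, 0.7, 0.2, 0.6, 0.5, 0.1, 0.8]
scores_m2 = [0.7, 0.9, 0.3, 0.5, 0.6, 0.1, 0.8, 0.4, 0.2, 0.9]

print(roc_auc_score(y_true, scores_m1))  # area under M1's ROC curve
print(roc_auc_score(y_true, scores_m2))  # area under M2's ROC curve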

32
ROC (Receiver Operating Characteristic)
  • To draw an ROC curve, the classifier must produce a continuous-valued output
  • Outputs are used to rank test records, from the most likely positive-class record to the least likely positive-class record
  • Many classifiers produce only discrete outputs (i.e., the predicted class)
  • Approaches exist to obtain an ROC curve for such classifiers (e.g., decision trees)
  • WEKA gives you ROC curves

33
ROC Curve Example
  • 1-dimensional data set containing 2 classes (positive and negative)
  • Any point located at x > t is classified as positive

34
How to Construct an ROC curve
  • Use a classifier that produces a continuous-valued output score(A) for each test instance
  • Sort the instances according to score(A) in decreasing order
  • Apply a threshold at each unique value of score(A)
  • Count the number of TP, FP, TN, FN at each threshold (a code sketch follows the table below)
  • TPR = TP / (TP + FN)
  • FPR = FP / (FP + TN)

Instance   score(A)   True Class
    1        0.95         +
    2        0.93         +
    3        0.87         -
    4        0.85         -
    5        0.85         -
    6        0.85         +
    7        0.76         -
    8        0.53         +
    9        0.43         -
   10        0.25         +
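Applying this procedure to the table above (a sketch; a record is predicted positive when score(A) ≥ the threshold):

scores = [0.95, 0.93, 0.87, 0.85, 0.85, 0.85, 0.76, 0.53, 0.43, 0.25]
labels = ["+", "+", "-", "-", "-", "+", "-", "+", "-", "+"]

P, N = labels.count("+"), labels.count("-")
for t in sorted(set(scores), reverse=True):
    # Count TP and FP at this threshold, then convert to rates.
    tp = sum(1 for s, y in zip(scores, labels) if s >= t and y == "+")
    fp = sum(1 for s, y in zip(scores, labels) if s >= t and y == "-")
    print(f"t = {t:.2f}  TPR = {tp / P:.2f}  FPR = {fp / N:.2f}")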
35
How to construct an ROC curve
(Figure: TP, FP, TN, FN counts and the resulting TPR/FPR at each threshold "score ≥ t", together with the plotted ROC curve.)