Lecture 11 Model Evaluation


MODEL EVALUATION AND SELECTION

 Evaluation metrics: How can we measure accuracy? What other metrics should we consider?

 Use a validation/test set of class-labeled tuples, instead of the training set, when assessing accuracy.

 Some of the measures are:
  Accuracy – suitable when class tuples are evenly distributed
  Precision – suitable when class tuples are not evenly distributed
  Recall – also called sensitivity
CLASSIFIER EVALUATION METRICS: CONFUSION MATRIX

Confusion Matrix:

Actual class\Predicted class   Yes                    No
Yes                            True Positives (TP)    False Negatives (FN)
No                             False Positives (FP)   True Negatives (TN)

Actual class\Predicted class   C1                     ¬C1
C1                             True Positives (TP)    False Negatives (FN)
¬C1                            False Positives (FP)   True Negatives (TN)

 Given m classes, an entry CM(i, j) in a confusion matrix indicates the number of tuples in class i that were labeled by the classifier as class j.
 May have extra rows/columns to provide totals.
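To make the CM(i, j) definition concrete, the short Python sketch below tallies a confusion matrix by direct counting; the label lists are made-up illustrations, not data from the lecture.

# Minimal sketch: build a confusion matrix by counting how many tuples
# of actual class i were labeled as class j. Labels are made up.
actual    = ["yes", "yes", "no", "no", "no", "yes", "no", "no"]
predicted = ["yes", "no",  "no", "yes", "no", "yes", "no", "no"]

classes = ["yes", "no"]  # row = actual class i, column = predicted class j
cm = {i: {j: 0 for j in classes} for i in classes}
for a, p in zip(actual, predicted):
    cm[a][p] += 1        # CM(i, j): tuples of class i labeled as class j

tp = cm["yes"]["yes"]    # True Positives  = 2
fn = cm["yes"]["no"]     # False Negatives = 1
fp = cm["no"]["yes"]     # False Positives = 1
tn = cm["no"]["no"]      # True Negatives  = 4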
CLASSIFIER EVALUATION METRICS: CONFUSION MATRIX

 True Positives:
  Positive tuples correctly classified as positive.

 True Negatives:
  Negative tuples correctly classified as negative.

 False Positives:
  Negative tuples incorrectly classified as positive.

 False Negatives:
  Positive tuples incorrectly classified as negative.
CLASSIFIER EVALUATION METRICS: CONFUSION MATRIX

Example of Confusion Matrix:

Actual class\Predicted class   buy_computer = yes   buy_computer = no   Total
buy_computer = yes             1                    0                   1
buy_computer = no              1                    998                 999
Total                          2                    998                 1000

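A quick sanity check of the buy_computer table above (a sketch using only its four cell counts) shows why the first slide recommends precision over accuracy when classes are skewed: accuracy looks excellent even though half of the "yes" predictions are wrong.

# Cell counts from the buy_computer table above.
tp, fn, fp, tn = 1, 0, 1, 998

accuracy  = (tp + tn) / (tp + fn + fp + tn)  # 999/1000 = 0.999
precision = tp / (tp + fp)                   # 1/2      = 0.5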
ACCURACY, ERROR RATE, SENSITIVITY AND SPECIFICITY

A\P   C    ¬C
C     TP   FN   P
¬C    FP   TN   N
      P'   N'   All

 Classifier accuracy, or recognition rate: the percentage of test set tuples that are correctly classified.
   Accuracy = (TP + TN) / All

 Error rate: 1 – accuracy, or
   Error rate = (FP + FN) / All

 Sensitivity: the true positive (recognition) rate.
   Sensitivity = TP / P

 Specificity: the true negative (recognition) rate.
   Specificity = TN / N
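A minimal sketch of these four formulas, with made-up counts (not lecture data):

# Accuracy, error rate, sensitivity and specificity from the four
# confusion-matrix counts. The counts here are made-up illustrations.
tp, fn, fp, tn = 40, 10, 20, 930

p    = tp + fn        # P: actual positives (50)
n    = fp + tn        # N: actual negatives (950)
all_ = p + n          # All (1000)

accuracy    = (tp + tn) / all_   # 970/1000 = 0.97
error_rate  = (fp + fn) / all_   # 30/1000  = 0.03 (= 1 - accuracy)
sensitivity = tp / p             # 40/50    = 0.80
specificity = tn / n             # 930/950  ≈ 0.979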
PRECISION AND RECALL, AND F-MEASURES

 Precision (exactness): what percentage of tuples that the classifier labeled as positive are actually positive?
   Precision = TP / (TP + FP)

 Recall (completeness): what percentage of positive tuples did the classifier label as positive?
   Recall = TP / (TP + FN)

 A perfect score is 1.0.
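If scikit-learn is available (an assumed library choice; the lecture does not prescribe one), both metrics come built in; pos_label names the positive class.

# Sketch using scikit-learn's built-in scorers (assumed library choice).
from sklearn.metrics import precision_score, recall_score

y_true = ["yes", "yes", "no", "no", "no", "yes", "no", "no"]
y_pred = ["yes", "no",  "no", "yes", "no", "yes", "no", "no"]

print(precision_score(y_true, y_pred, pos_label="yes"))  # 2/3: TP=2, FP=1
print(recall_score(y_true, y_pred, pos_label="yes"))     # 2/3: TP=2, FN=1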
CONFUSION MATRIX FOR 3-CLASS CLASSIFICATION
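A sketch of a 3-class confusion matrix, using scikit-learn (an assumed library choice) and made-up labels; rows are actual classes and columns are predicted classes, so the CM(i, j) reading from the earlier slides is unchanged.

# Sketch: 3-class confusion matrix with scikit-learn (assumed library).
# Rows = actual classes, columns = predicted classes, in the order
# given by the labels= argument.
from sklearn.metrics import confusion_matrix

y_true = ["cat", "dog", "bird", "cat", "dog", "bird", "cat"]
y_pred = ["cat", "dog", "cat",  "cat", "bird", "bird", "dog"]

cm = confusion_matrix(y_true, y_pred, labels=["cat", "dog", "bird"])
print(cm)
# [[2 1 0]   cat:  2 correct, 1 labeled dog
#  [0 1 1]   dog:  1 correct, 1 labeled bird
#  [1 0 1]]  bird: 1 correct, 1 labeled cat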
CLASSIFIER EVALUATION METRICS: PRECISION AND RECALL, AND F-MEASURES

 The F-score combines the precision and recall of the model; it is defined as the harmonic mean of the model's precision and recall.

 There is an inverse relationship between precision and recall.

 F measure (F1 or F-score): the harmonic mean of precision and recall.
   F1 = (2 × Precision × Recall) / (Precision + Recall)

 Fβ: a weighted measure of precision and recall.
   Fβ = ((1 + β²) × Precision × Recall) / (β² × Precision + Recall)
  It assigns β times as much weight to recall as to precision.
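A minimal sketch of both formulas (the precision/recall values are made up):

# F1 and F-beta as functions of precision and recall.
def f_beta(precision, recall, beta=1.0):
    # beta = 1 gives F1; beta > 1 weights recall more, beta < 1 precision.
    b2 = beta ** 2
    return (1 + b2) * precision * recall / (b2 * precision + recall)

precision, recall = 0.75, 0.6
print(f_beta(precision, recall))          # F1 = 0.9/1.35 ≈ 0.667
print(f_beta(precision, recall, beta=2))  # F2 = 2.25/3.6 ≈ 0.625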
CLASSIFIER EVALUATION METRICS: EXAMPLE

Actual class\Predicted class   cancer = yes   cancer = no   Total   Recognition (%)
cancer = yes                   90             210           300     30.00 (sensitivity)
cancer = no                    140            9560          9700    98.56 (specificity)
Total                          230            9770          10000   96.50 (accuracy)

 Precision = 90/230 = 39.13%
 Recall = 90/300 = 30.00%
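The table's figures can be re-derived from its four cell counts; a quick check:

# Re-deriving the cancer example above from its four cell counts.
tp, fn, fp, tn = 90, 210, 140, 9560
total = tp + fn + fp + tn        # 10000

sensitivity = tp / (tp + fn)     # 90/300     = 0.3000
specificity = tn / (fp + tn)     # 9560/9700  ≈ 0.9856
accuracy    = (tp + tn) / total  # 9650/10000 = 0.9650
precision   = tp / (tp + fp)     # 90/230     ≈ 0.3913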
