
CHAPTER 3

Modelling and Evaluation


TOPICS COVERED IN THE CHAPTER
• Selecting a Model (Predictive/Descriptive) (learned in Chapter 2)
  (https://careerfoundry.com/en/blog/data-analytics/different-types-of-data-analysis/)
• Training a Model for supervised learning (learned in Chapter 2)
• Evaluating performance of a model
• Model Representation and Interpretability
EVALUATING PERFORMANCE OF A MODEL
Diagnostics (evaluation measures) of classifiers:
• Confusion Matrix: A confusion matrix lays out the comparison between the predicted classification and the actual classification to provide various performance measures.
Confusion Matrix for Class 1 and Class 2:

                                      Actual Class
                                 Class 1           Class 2
  Predicted Class   Class 1      True Positive     False Positive
                    Class 2      False Negative    True Negative
• True Positive (TP) = 6
  You predicted positive and it's true: you predicted that the animal is a cat and it actually is.
• True Negative (TN) = 11
  You predicted negative and it's true: you predicted that the animal is not a cat and it actually is not (it's a dog).
• False Positive (Type 1 Error) (FP) = 2
  You predicted positive and it's false: you predicted that the animal is a cat but it actually is not (it's a dog).
• False Negative (Type 2 Error) (FN) = 1
  You predicted negative and it's false: you predicted that the animal is not a cat but it actually is.
• Based on the values above (TP = 6, TN = 11, FP = 2, FN = 1), you can calculate various performance parameters for your classification model.
a) Accuracy: Accuracy simply measures how often the classifier makes the correct prediction. It is the ratio between the number of correct predictions and the total number of predictions.

Error Rate = 1 – Accuracy
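Using the standard formula with the example counts above (TP = 6, TN = 11, FP = 2, FN = 1):
Accuracy = (TP + TN) / (TP + TN + FP + FN) = (6 + 11) / 20 = 0.85
Error Rate = 1 – 0.85 = 0.15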


b) Recall / True Positive Rate / Sensitivity: The rate at which the model has correctly classified the positive outcomes.
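With the same example counts: Recall = TP / (TP + FN) = 6 / (6 + 1) ≈ 0.86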
c) False Positive Rate: The rate at which the model has incorrectly classified actual negative outcomes as positive.
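With the same example counts: FPR = FP / (FP + TN) = 2 / (2 + 11) ≈ 0.15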

d) True Negative Rate / Specificity: The rate at which the model has correctly classified the negative outcomes.
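With the same example counts: Specificity = TN / (TN + FP) = 11 / (11 + 2) ≈ 0.85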

e) False Negative Rate: The rate at which the model has incorrectly classified actual positive outcomes as negative.
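With the same example counts: FNR = FN / (FN + TP) = 1 / (1 + 6) ≈ 0.14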
f) Precision: The percentage of samples marked positive that were really positive.
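With the same example counts: Precision = TP / (TP + FP) = 6 / (6 + 2) = 0.75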

g) F1 Score: The F1 score is the harmonic mean of the precision and recall.
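Using the standard definitions with the same example counts:
F1 = 2 · Precision · Recall / (Precision + Recall) ≈ 2 · 0.75 · 0.86 / (0.75 + 0.86) ≈ 0.80
or, equivalently, F1 = 2TP / (2TP + FP + FN) = 12 / 15 = 0.80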

h) Matthews Correlation Coefficient (MCC): A correlation coefficient between the observed and predicted binary classifications.
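Using the standard formula with the same example counts:
MCC = (TP · TN – FP · FN) / √((TP + FP)(TP + FN)(TN + FP)(TN + FN)) = (6 · 11 – 2 · 1) / √(8 · 7 · 13 · 13) ≈ 0.66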
QUESTION:

Confusion Matrix for Banana and Apple (Not Banana)

                                            Actual Class
                                       Banana    Apple (Not Banana)
  Predicted Class   Banana                5              2
                    Apple (Not Banana)    3              3

Find Accuracy, TPR, FPR, Precision.


QUESTION:

TP = 100    FP = 10
FN = 5      TN = 50

Find F1 score and MCC.
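To check your answers, here is a minimal Python sketch (illustrative only; the function and variable names are my own, not from the chapter) that computes all of the measures above from the four confusion-matrix counts:

from math import sqrt

def classification_measures(tp, fp, fn, tn):
    # All measures are computed directly from the four confusion-matrix counts.
    total = tp + fp + fn + tn
    accuracy = (tp + tn) / total
    recall = tp / (tp + fn)               # TPR / Sensitivity
    fpr = fp / (fp + tn)                  # False Positive Rate
    specificity = tn / (tn + fp)          # TNR
    fnr = fn / (fn + tp)                  # False Negative Rate
    precision = tp / (tp + fp)
    f1 = 2 * precision * recall / (precision + recall)
    mcc = (tp * tn - fp * fn) / sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return {
        "accuracy": accuracy, "error_rate": 1 - accuracy,
        "recall_tpr": recall, "fpr": fpr,
        "specificity_tnr": specificity, "fnr": fnr,
        "precision": precision, "f1": f1, "mcc": mcc,
    }

# Example usage with the cat/dog counts from the chapter (TP=6, FP=2, FN=1, TN=11);
# substitute the counts from each question to verify your own calculations.
for name, value in classification_measures(tp=6, fp=2, fn=1, tn=11).items():
    print(f"{name}: {value:.3f}")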
