Confusion Matrix

The confusion matrix measures the performance of a classification model on test data by displaying the number of true positives, true negatives, false positives, and false negatives. For binary classification, the matrix is 2x2, and for multi-class classification, the matrix shape is nxn where n is the number of classes. The confusion matrix allows calculating performance metrics like accuracy, precision, recall, and F1-score.

Uploaded by Divyanshi Dubey
© All Rights Reserved

CONFUSION MATRIX:

1. The confusion matrix is used to measure the performance of a classification model.
2. It displays the number of true positives (TP), true negatives (TN), false positives (FP),
and false negatives (FN) produced by the model on the test data.
3. For binary classification the matrix is a 2x2 table; for multi-class classification the
matrix shape equals the number of classes, i.e. for n classes it is nxn.
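Point 3 can be sketched in pure Python. This is a minimal illustration, not code from the original note; the function name and class labels are made up, and rows index the predicted class while columns index the actual class, matching the table in the example below:

```python
def confusion_matrix(y_true, y_pred, classes):
    """Return an n x n matrix M (nested lists) where M[i][j] counts
    instances predicted as classes[i] whose actual class is classes[j]."""
    index = {c: i for i, c in enumerate(classes)}
    n = len(classes)
    m = [[0] * n for _ in range(n)]
    for actual, predicted in zip(y_true, y_pred):
        m[index[predicted]][index[actual]] += 1
    return m

# Three classes -> a 3x3 matrix, as described above for n classes.
actual    = ["cat", "dog", "bird", "dog", "cat", "bird"]
predicted = ["cat", "dog", "dog",  "dog", "bird", "bird"]
m = confusion_matrix(actual, predicted, ["cat", "dog", "bird"])
```

Each diagonal entry m[i][i] counts correct predictions for class i; every off-diagonal entry is a misclassification.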

EXAMPLE:

A 2x2 confusion matrix is shown below for an image-recognition task that classifies
images as Dog or Not Dog.

                              Actual
                     Dog               Not Dog
Predicted  Dog       True Positive     False Positive
                     (TP)              (FP)
           Not Dog   False Negative    True Negative
                     (FN)              (TN)

 True Positive (TP): the number of instances where both the predicted and the actual
class are Dog.
 True Negative (TN): the number of instances where both the predicted and the actual
class are Not Dog.
 False Positive (FP): the number of instances where the predicted class is Dog but the
actual class is Not Dog.
 False Negative (FN): the number of instances where the predicted class is Not Dog but
the actual class is Dog.
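The four counts above can be checked with a short pure-Python sketch; the label lists here are made-up illustration data, not from the original note:

```python
actual    = ["Dog", "Dog", "Not Dog", "Dog", "Not Dog"]
predicted = ["Dog", "Not Dog", "Not Dog", "Dog", "Dog"]

pairs = list(zip(actual, predicted))
tp = sum(1 for a, p in pairs if a == "Dog" and p == "Dog")          # both Dog
tn = sum(1 for a, p in pairs if a == "Not Dog" and p == "Not Dog")  # both Not Dog
fp = sum(1 for a, p in pairs if a == "Not Dog" and p == "Dog")      # predicted Dog, actually Not Dog
fn = sum(1 for a, p in pairs if a == "Dog" and p == "Not Dog")      # predicted Not Dog, actually Dog
# Here tp=2, tn=1, fp=1, fn=1.
```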

WHAT INFORMATION CAN WE GET FROM THE CONFUSION MATRIX?

Accuracy: Accuracy measures the overall performance of the model. It is the ratio of correctly
classified instances to the total number of instances.

Accuracy = (TP+TN)/(TP+TN+FP+FN)

Precision: Precision measures how accurate the model's positive predictions are. It is the ratio of
true positive predictions to the total number of positive predictions made by the model.

Precision = TP/(TP+FP)

Recall: Recall measures how effectively the model identifies all relevant (positive) instances in
a dataset. It is the ratio of the number of true positive (TP) instances to the sum of true positive
and false negative (FN) instances.

Recall = TP/(TP+FN)

F1-Score: The F1-score is used to evaluate the overall performance of a classification model. It is
the harmonic mean of precision and recall.

F1-Score = 2 × Precision × Recall / (Precision + Recall)
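The four formulas above can be put together in a few lines of Python; the counts tp=2, tn=1, fp=1, fn=1 are hypothetical values chosen for illustration:

```python
tp, tn, fp, fn = 2, 1, 1, 1  # hypothetical counts from a test set

accuracy  = (tp + tn) / (tp + tn + fp + fn)         # correct / total
precision = tp / (tp + fp)                          # how accurate positive predictions are
recall    = tp / (tp + fn)                          # how many actual positives were found
f1 = 2 * precision * recall / (precision + recall)  # harmonic mean of precision and recall
```

Note that with these counts precision and recall are equal (2/3), so the F1-score equals them as well; in general F1 lies between the two and is pulled toward the smaller one.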
