Confusion Matrix
All content following this page was uploaded by Zohreh Karimi on 06 October 2021.
The confusion matrix is a square matrix where the columns represent the
actual values and the rows the model's predicted values, or vice versa,
depending on the convention used.
Written by Zohreh Karimi (junior machine learning engineer)
• TP: True Positive: the actual value was positive and the model
predicted a positive value
• TN: True Negative: the actual value was negative and the model
predicted a negative value
• FP: False Positive: the actual value was negative but the model
predicted a positive value
• FN: False Negative: the actual value was positive but the model
predicted a negative value
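The four counts above can be computed directly from paired lists of actual and predicted labels. A minimal sketch (the `confusion_counts` helper and the sample labels below are illustrative, not from the original):

```python
def confusion_counts(actual, predicted, positive=1):
    """Count TP, TN, FP, FN for binary labels."""
    tp = sum(a == positive and p == positive for a, p in zip(actual, predicted))
    tn = sum(a != positive and p != positive for a, p in zip(actual, predicted))
    fp = sum(a != positive and p == positive for a, p in zip(actual, predicted))
    fn = sum(a == positive and p != positive for a, p in zip(actual, predicted))
    return tp, tn, fp, fn

actual    = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]
print(confusion_counts(actual, predicted))  # (3, 3, 1, 1)
```

The four counts are all that the metrics below need; every formula in this note is a ratio of sums of these values.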
1- Accuracy:
Accuracy is the fraction of correct predictions made for the complete
test dataset. It is a good basic metric to measure the performance of
the model, but in unbalanced datasets it becomes a poor one, since a
model can score high accuracy by always predicting the majority class.
Accuracy = (TP + TN) / (TP + TN + FP + FN)
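To see why accuracy misleads on unbalanced data, consider hypothetical counts for a classifier that always predicts the majority (negative) class; the numbers are made up for illustration:

```python
# 95 actual negatives, 5 actual positives; the model predicts "negative" every time
tp, tn, fp, fn = 0, 95, 0, 5
accuracy = (tp + tn) / (tp + tn + fp + fn)
print(accuracy)  # 0.95, despite the model never finding a single positive case
```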
2- Misclassification:
Misclassification = (FP + FN) / (TP + TN + FP + FN)
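Misclassification is simply the complement of accuracy. Using the same hypothetical majority-class counts as above:

```python
tp, tn, fp, fn = 0, 95, 0, 5  # illustrative counts, as before
misclassification = (fp + fn) / (tp + tn + fp + fn)
print(misclassification)  # 0.05, i.e. 1 - accuracy
```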
3- Precision
Precision tells us how many of the cases predicted as positive actually
turned out to be positive. This indicates how reliable the model's
positive predictions are.
Precision = TP / (TP + FP)
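Precision is computed as TP / (TP + FP), the share of positive predictions that were correct. A quick sketch with hypothetical counts:

```python
tp, fp = 3, 1  # hypothetical counts: 4 cases predicted positive, 3 truly positive
precision = tp / (tp + fp)
print(precision)  # 0.75: three out of four positive predictions were correct
```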
4- Recall (sensitivity)
Recall tells us how many of the actual positive cases we were able to
predict correctly with our model.
Recall is a useful metric in cases where a False Negative is costlier
than a False Positive.
Recall = TP / (TP + FN)
A higher recall means that most of the positive cases (TP + FN) will be
labeled as positive (TP). This typically leads to a higher number of FP
predictions and a lower overall accuracy.
A low recall means that you have a high number of FN (cases that should
have been positive but were labeled as negative). It also means you have
more certainty that, if the model flags a positive case, it is likely
to be a true positive.
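With the same hypothetical counts used for precision above, recall works out as:

```python
tp, fn = 3, 1  # hypothetical counts: 4 actual positives, 3 of them caught
recall = tp / (tp + fn)
print(recall)  # 0.75: the model recovered three of the four actual positives
```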
5- F1-Score
When we try to increase the precision of a model, the recall goes down,
and vice versa.
The F1-Score is the harmonic mean of Precision and Recall, so it gives
a combined idea of these two metrics. It is maximal when precision is
equal to recall.
F1-Score = 2 / (1/Precision + 1/Recall)
         = 2 · Precision · Recall / (Precision + Recall)
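A quick sketch of the harmonic mean, showing that the F1-Score is pulled toward the smaller of the two metrics (the input values are illustrative):

```python
def f1_score(precision, recall):
    """Harmonic mean of precision and recall."""
    return 2 / (1 / precision + 1 / recall)

print(f1_score(0.75, 0.75))  # 0.75: equals the inputs when precision == recall
print(f1_score(0.50, 1.00))  # ~0.667, below the arithmetic mean of 0.75
```

Unlike the arithmetic mean, the harmonic mean penalizes a large gap between precision and recall, which is why the F1-Score is maximal (relative to a fixed arithmetic mean) when the two are equal.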