
Confusion Matrix Analysis

The document explains how to analyze a confusion matrix in binary classification, detailing its structure and performance metrics such as accuracy, precision, recall, F1 score, and specificity. It also discusses the interpretation of the matrix, indicating that high true positives and true negatives are indicative of a good model, while high false negatives and false positives can signify issues. Additionally, it provides an example using COVID test results to illustrate the classification of predictions.


Understanding and Analyzing Confusion Matrix

HOW TO ANALYZE CONFUSION MATRIX

1. Understand the Structure (Binary Classification)

|                 | Predicted Positive  | Predicted Negative  |
|-----------------|---------------------|---------------------|
| Actual Positive | True Positive (TP)  | False Negative (FN) |
| Actual Negative | False Positive (FP) | True Negative (TN)  |

2. Calculate Performance Metrics

Accuracy = (TP + TN) / (TP + FP + FN + TN)

Precision = TP / (TP + FP)

Recall = TP / (TP + FN)

F1 = 2 * (Precision * Recall) / (Precision + Recall)

Specificity = TN / (TN + FP)
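The five formulas above can be sketched as a small helper function (a minimal illustration; the function name and return format are my own, not from the source):

```python
def confusion_metrics(tp, fp, fn, tn):
    """Compute standard binary-classification metrics from confusion-matrix counts."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)          # a.k.a. sensitivity, true positive rate
    f1 = 2 * precision * recall / (precision + recall)
    specificity = tn / (tn + fp)     # true negative rate
    return {"accuracy": accuracy, "precision": precision,
            "recall": recall, "f1": f1, "specificity": specificity}

# With TP=2, FP=1, FN=1, TN=1, accuracy is 3/5 = 0.6
print(confusion_metrics(2, 1, 1, 1))
```

Note that precision, recall, F1, and specificity divide by count sums that can be zero for degenerate inputs; production code should guard against that.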

3. Interpret the Matrix

- High TP & TN, low FP & FN = Good model

- High FN = Model is missing positives (bad for disease detection)

- High FP = False alarms (bad for spam filters)

4. For Multi-class Classification

- Diagonal = Correct predictions

- Off-diagonal = Misclassifications

HOW TO IDENTIFY TP, TN, FP, FN

Given Actual = 1, Predicted = 0

=> This is a False Negative (FN)

Summary Table:

| Actual | Predicted | Category            |
|--------|-----------|---------------------|
| 1      | 1         | True Positive (TP)  |
| 0      | 0         | True Negative (TN)  |
| 0      | 1         | False Positive (FP) |
| 1      | 0         | False Negative (FN) |
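The summary table maps directly to a lookup function (a minimal sketch; the function name is my own):

```python
def classify_prediction(actual, predicted):
    """Map an (actual, predicted) label pair to its confusion-matrix category."""
    if actual == 1 and predicted == 1:
        return "TP"
    if actual == 0 and predicted == 0:
        return "TN"
    if actual == 0 and predicted == 1:
        return "FP"
    return "FN"  # actual == 1, predicted == 0

print(classify_prediction(1, 0))  # FN, matching the worked case above
```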

Example: COVID Test Results

| Person | Actual | Predicted | Result Type         |
|--------|--------|-----------|---------------------|
| A      | 1      | 1         | True Positive (TP)  |
| B      | 0      | 0         | True Negative (TN)  |
| C      | 1      | 0         | False Negative (FN) |
| D      | 0      | 1         | False Positive (FP) |
| E      | 1      | 1         | True Positive (TP)  |

Count:

TP = 2 (A, E)

TN = 1 (B)

FP = 1 (D)

FN = 1 (C)
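These counts can be reproduced by tallying the five label pairs from the table:

```python
from collections import Counter

# Actual and predicted labels for persons A-E, in order
actual    = [1, 0, 1, 0, 1]
predicted = [1, 0, 0, 1, 1]

counts = Counter()
for a, p in zip(actual, predicted):
    if a == 1 and p == 1:
        counts["TP"] += 1
    elif a == 0 and p == 0:
        counts["TN"] += 1
    elif a == 0 and p == 1:
        counts["FP"] += 1
    else:
        counts["FN"] += 1

print(dict(counts))  # TP=2, TN=1, FN=1, FP=1
```

In practice a library routine such as scikit-learn's `confusion_matrix` computes these counts for you, but the hand tally above shows exactly where each cell comes from.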
