SWE622 Lecture 3-1 Classification Accuracy

The document defines key terms used to evaluate classification accuracy, such as true positives, false positives, true negatives, and false negatives. It explains how to calculate accuracy, precision, recall, the F-measure, and the break-even point using a confusion matrix. It also discusses several factors that determine whether a given accuracy level is considered good or bad, such as the problem at hand and the base rate.


Multimedia Systems

Lecture 3 Annex 1
Classification Accuracy
Dr. Mohammad Abdullah-Al-Wadud
[email protected]
A Few Keywords
• TP (true positives) = number of test samples correctly classified as ‘positive’.
• FP (false positives) = number of test samples incorrectly classified as ‘positive’.
• TN (true negatives) = number of test samples correctly classified as ‘negative’.
• FN (false negatives) = number of test samples incorrectly classified as ‘negative’.
Confusion Matrix

                  | Predicted Positive | Predicted Negative
Actual Positive   | TP                 | FN
Actual Negative   | FP                 | TN

Accuracy = (TP + TN) / (TP + TN + FP + FN)
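The accuracy formula above can be sketched in a few lines of Python; the confusion-matrix counts below are hypothetical, chosen only for illustration:

```python
def accuracy(tp, tn, fp, fn):
    """Fraction of all test samples that were classified correctly."""
    return (tp + tn) / (tp + tn + fp + fn)

# Hypothetical confusion-matrix counts for illustration
tp, fn = 40, 10   # actual positives: 40 found, 10 missed
fp, tn = 5, 45    # actual negatives: 5 misclassified, 45 correct

print(accuracy(tp, tn, fp, fn))  # → 0.85
```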

Is 99% accuracy good?
– It can be excellent, good, mediocre, poor, or terrible
– It depends on the problem

Is 10% accuracy bad?
– Not necessarily; in information retrieval, for example, 10% can be a strong result

BaseRate = accuracy obtained by always predicting the predominant class
– On most problems, obtaining BaseRate accuracy is easy, so a useful classifier should beat it
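The base-rate point can be made concrete with a small sketch (the test set below is hypothetical): on a 99:1 class split, a trivial classifier that always predicts the predominant class already reaches 99% accuracy without learning anything.

```python
# Hypothetical imbalanced test set: 990 negatives, 10 positives
labels = ['neg'] * 990 + ['pos'] * 10

# The "base rate classifier" always predicts the predominant class
predominant = max(set(labels), key=labels.count)
predictions = [predominant] * len(labels)

correct = sum(p == y for p, y in zip(predictions, labels))
base_rate = correct / len(labels)
print(base_rate)  # → 0.99
```

This is why accuracy alone is misleading on imbalanced problems, and why precision and recall (below) focus specifically on the ‘positive’ class.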
Precision & Recall
• Precision = fraction of test samples correctly classified as ‘positive’ out of all test samples classified as ‘positive’.
• Recall = fraction of test samples correctly classified as ‘positive’ out of all ‘positive’ test samples.

Precision = TP / (TP + FP)
Recall = TP / (TP + FN)
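A minimal sketch of the two formulas, again with hypothetical counts:

```python
def precision(tp, fp):
    """Of everything predicted 'positive', the fraction that really is positive."""
    return tp / (tp + fp)

def recall(tp, fn):
    """Of all actual 'positive' samples, the fraction that was found."""
    return tp / (tp + fn)

tp, fp, fn = 40, 5, 10  # hypothetical counts for illustration
print(precision(tp, fp))  # ≈ 0.889
print(recall(tp, fn))     # → 0.8
```

Note the two measures pull in opposite directions: predicting ‘positive’ more liberally raises recall but usually lowers precision.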
The F-measure
• Harmonic mean of precision and recall

F = (2 × precision × recall) / (precision + recall)
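Because it is a harmonic mean, the F-measure stays close to the smaller of the two values, so a classifier cannot score well by maximizing only one of them. A sketch (the input values are hypothetical):

```python
def f_measure(precision, recall):
    """Harmonic mean of precision and recall (the F1 score)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# When precision and recall are equal, F equals both:
print(f_measure(0.8, 0.8))  # ≈ 0.8
# With a large gap, F is dragged toward the smaller value:
print(f_measure(1.0, 0.1))  # ≈ 0.18, not the arithmetic mean 0.55
```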
Break-Even Point
• The break-even point is the operating point (found by varying the decision threshold) at which precision = recall.
Some other measures
• Sensitivity
• Specificity
• Receiver Operating Characteristic (ROC) area
• etc.
