Simple Guide To Confusion Matrix Terminology
Let's start with an example confusion matrix for a binary classifier (n = 165; the cell values below are consistent with the totals quoted in the definitions that follow):

             Predicted: NO   Predicted: YES
Actual: NO        50              10
Actual: YES        5             100
Precision: When it predicts yes, how often is it correct? TP/predicted yes = 100/110 = 0.91

Prevalence: How often does the yes condition actually occur in our sample? actual yes/total = 105/165 = 0.64
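The two definitions above can be checked with a few lines of Python. Only TP, the predicted-yes total, the actual-yes total, and n appear in the text; the remaining cells (FP, FN, TN) are derived here by subtraction, an assumption consistent with those totals:

```python
# Recompute precision and prevalence from the example numbers quoted above.
TP = 100             # true positives
predicted_yes = 110  # column total for "predicted: yes"
actual_yes = 105     # row total for "actual: yes"
total = 165          # all observations

# Derived cells (assumption: standard 2x2 confusion-matrix accounting).
FP = predicted_yes - TP        # false positives: 10
FN = actual_yes - TP           # false negatives: 5
TN = total - TP - FP - FN      # true negatives: 50

precision = TP / predicted_yes   # when it predicts yes, how often is it right?
prevalence = actual_yes / total  # how often does "yes" actually occur?

print(f"precision  = {precision:.2f}")   # 0.91
print(f"prevalence = {prevalence:.2f}")  # 0.64
```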
the null error rate. (More details about Cohen's Kappa.)
F Score: This is a weighted average of the true positive rate (recall) and precision. (More details about the F Score.)
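As a sketch, here is the common F1 variant of the F Score (the harmonic mean of precision and recall) computed for the example numbers quoted above; recall = 100/105 is derived here rather than stated in the text:

```python
# F1 score for the example matrix: harmonic mean of precision and recall.
TP, predicted_yes, actual_yes = 100, 110, 105  # example numbers from the text

precision = TP / predicted_yes  # ~0.91
recall = TP / actual_yes        # ~0.95 (the true positive rate)

# The harmonic mean penalizes an imbalance between precision and recall
# more than a plain average would.
f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1:.3f}")  # ~0.930
```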
ROC Curve: This is a commonly used graph that summarizes the performance of a classifier over all possible thresholds. It is generated by plotting the True Positive Rate (y-axis) against the False Positive Rate (x-axis) as you vary the threshold for assigning observations to a given class. (More details about ROC Curves.)
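A minimal sketch of the threshold-sweeping procedure just described, using made-up labels and scores (not the article's example): sorting by score descending and lowering the threshold one observation at a time traces out the (FPR, TPR) points of the curve.

```python
def roc_points(labels, scores):
    """Return (FPR, TPR) pairs as the threshold sweeps from high to low.

    Naive sketch: assumes binary labels (1 = actual yes) and ignores
    score ties, which a production implementation would need to handle.
    """
    pos = sum(labels)
    neg = len(labels) - pos
    # Sort by score descending; each step admits one more "predicted yes".
    order = sorted(zip(scores, labels), reverse=True)
    points, tp, fp = [(0.0, 0.0)], 0, 0
    for _, label in order:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

labels = [1, 1, 0, 1, 0, 0, 1, 0]                    # toy data: 1 = actual yes
scores = [0.9, 0.8, 0.7, 0.6, 0.55, 0.5, 0.4, 0.2]   # toy classifier P(yes)
for fpr, tpr in roc_points(labels, scores):
    print(f"FPR={fpr:.2f}  TPR={tpr:.2f}")
```

In practice `sklearn.metrics.roc_curve` performs this sweep (and handles score ties) for you.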