Conditional Independence and Naïve Bayes
Required reading:
• Mitchell draft chapter, sections 1 and 2 (available on the class website)
Tom M. Mitchell
Machine Learning Department
Carnegie Mellon University
Conditional independence: X is conditionally independent of Y given Z if the probability distribution governing X is independent of the value of Y, given the value of Z. Equivalently:
P(X | Y, Z) = P(X | Z)
Naïve Bayes
Naïve Bayes assumes
P(X1, ..., Xn | Y) = ∏i P(Xi | Y)
E.g., P(X1, X2 | Y) = P(X1 | Y) P(X2 | Y)
Naïve Bayes uses the assumption that the Xi are conditionally independent, given Y. For two features this gives
P(X1, X2 | Y) = P(X1 | X2, Y) P(X2 | Y) = P(X1 | Y) P(X2 | Y)
and in general:
P(X1, ..., Xn | Y) = ∏i P(Xi | Y)
• Classify (Xnew):
Ynew ← argmax over yk of P(Y = yk) ∏i P(Xi_new | Y = yk)
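A minimal sketch of this train/classify step for boolean Xi, with parameters estimated by MLE (simple counting); the toy data and function names below are illustrative rather than taken from the reading:

```python
import numpy as np

def train_nb(X, y, n_classes):
    """MLE estimates for boolean-feature Naive Bayes.
    X: (n_examples, n_features) array of 0/1 values; y: class labels 0..n_classes-1."""
    n, d = X.shape
    prior = np.zeros(n_classes)          # P(Y = k)
    theta = np.zeros((n_classes, d))     # P(Xi = 1 | Y = k)
    for k in range(n_classes):
        Xk = X[y == k]
        prior[k] = len(Xk) / n
        theta[k] = Xk.mean(axis=0)       # fraction of class-k examples with Xi = 1
    return prior, theta

def classify_nb(x_new, prior, theta, eps=1e-9):
    """Return argmax_k P(Y=k) * prod_i P(Xi = xi | Y=k), computed in log space.
    The small eps guards against log(0) when an MLE count is zero."""
    log_post = np.log(prior + eps)
    log_post += x_new @ np.log(theta + eps).T            # terms where xi = 1
    log_post += (1 - x_new) @ np.log(1 - theta + eps).T  # terms where xi = 0
    return np.argmax(log_post)

# toy example: 2 boolean features, 2 classes
X = np.array([[1, 0], [1, 1], [0, 1], [0, 0]])
y = np.array([0, 0, 1, 1])
prior, theta = train_nb(X, y, n_classes=2)
print(classify_nb(np.array([1, 0]), prior, theta))   # predicts class 0
```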
[Figure: fMRI activation for "ant" (color scale: high / average / below average), and "bottle" minus mean activation]
Scaling up: 60 exemplars
Categories and exemplars:
• BODY PARTS: leg, arm, eye, foot, hand
• FURNITURE: chair, table, bed, desk, dresser
• VEHICLES: car, airplane, train, truck, bicycle
• ANIMALS: horse, dog, bear, cow, cat
• KITCHEN UTENSILS: glass, knife, bottle, cup, spoon
• TOOLS: chisel, hammer, screwdriver, pliers, saw
• BUILDINGS: apartment, barn, house, church, igloo
• PART OF A BUILDING: window, door, chimney, closet, arch
• CLOTHING: coat, dress, shirt, skirt, pants
• INSECTS: fly, ant, bee, butterfly, beetle
• VEGETABLES: lettuce, tomato, carrot, corn, celery
• MAN-MADE OBJECTS: refrigerator, key, telephone, watch, bell
Rank Accuracy: Distinguishing among 60 words
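Rank accuracy here is assumed to be the usual normalized-rank measure: rank all 60 candidate words by posterior probability and report how high the correct word appears (1.0 if it is ranked first, about 0.5 at chance). A minimal sketch with simulated posteriors:

```python
import numpy as np

def rank_accuracy(posteriors, true_label):
    """Normalized rank of the correct class among all candidates.
    posteriors: length-N array of P(Y = k | x) for the N candidate words.
    Returns 1.0 when the correct word is ranked first, ~0.5 at chance."""
    n = len(posteriors)
    # rank 1 = highest posterior; count how many candidates score at least as high
    rank = np.sum(posteriors >= posteriors[true_label])
    return 1.0 - (rank - 1) / (n - 1)

posteriors = np.random.dirichlet(np.ones(60))   # fake posteriors over 60 words
print(rank_accuracy(posteriors, true_label=7))
```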
Where in the brain is activity that distinguishes tools vs. buildings?
Voxel clusters: searchlights
Accuracy of a radius-one "searchlight" classifier centered at each voxel (i.e., the accuracy obtained at each voxel using a classifier trained on that voxel and its immediate neighbors).
Accuracies of cubical 27-voxel classifiers centered at each significant voxel (accuracy range 0.7-0.8).
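A rough sketch of the 27-voxel searchlight idea described by these captions: for each voxel, train a Gaussian Naive Bayes classifier on the 3x3x3 cube of voxels around it and record its cross-validated accuracy. The data shapes, the random toy volumes, and the use of scikit-learn's GaussianNB are assumptions made for illustration:

```python
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import cross_val_score

def searchlight_accuracy(volumes, labels, center, radius=1):
    """Cross-validated accuracy of a Gaussian NB classifier trained on the
    (2*radius+1)^3 = 27-voxel cube centered at `center`.
    volumes: (n_trials, X, Y, Z) array of fMRI images; labels: (n_trials,)."""
    x, y, z = center
    cube = volumes[:,
                   x - radius:x + radius + 1,
                   y - radius:y + radius + 1,
                   z - radius:z + radius + 1]
    features = cube.reshape(len(volumes), -1)          # 27 features per trial
    return cross_val_score(GaussianNB(), features, labels, cv=5).mean()

# toy data: 40 trials of a 10x10x10 volume, 2 conditions (e.g. tools vs. buildings)
volumes = np.random.randn(40, 10, 10, 10)
labels = np.repeat([0, 1], 20)
print(searchlight_accuracy(volumes, labels, center=(5, 5, 5)))
```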
What you should know:
• Training and using classifiers based on Bayes rule
• Conditional independence
– What it is
– Why it’s important
• Naïve Bayes
– What it is
– Why we use it so much
– Training using MLE, MAP estimates
– Discrete variables (Bernoulli) and continuous (Gaussian); see the estimation sketch after this list
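As a compact reminder of these estimators, one way they might look in code is sketched below; the Beta(2,2) prior behind the MAP estimate (one imaginary observation of each outcome) is just an illustrative choice:

```python
import numpy as np

def bernoulli_mle(x):
    """MLE for theta = P(X=1): the fraction of 1s observed."""
    return x.mean()

def bernoulli_map(x, alpha=2.0, beta=2.0):
    """MAP for theta under a Beta(alpha, beta) prior:
    (#1s + alpha - 1) / (n + alpha + beta - 2)."""
    n1 = x.sum()
    return (n1 + alpha - 1) / (len(x) + alpha + beta - 2)

def gaussian_mle(x):
    """MLE for a Gaussian P(X | Y=k): sample mean and (biased) sample variance."""
    return x.mean(), x.var()

x_discrete = np.array([1, 1, 1, 0])            # one boolean feature, within class k
x_continuous = np.array([2.1, 1.9, 2.4, 2.0])  # one real-valued feature, within class k
print(bernoulli_mle(x_discrete), bernoulli_map(x_discrete))
print(gaussian_mle(x_continuous))
```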
Questions:
• Can you use Naïve Bayes for a combination of discrete and real-valued Xi? (One way to see the answer is sketched below.)
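One way to see that the answer is yes: under the conditional independence assumption the per-feature likelihood terms simply multiply, so each Xi can be modeled with whatever distribution fits it, Bernoulli for boolean features and Gaussian for real-valued ones. A minimal sketch with invented parameters:

```python
import numpy as np
from scipy.stats import norm

def log_posterior(x_disc, x_cont, prior, theta, mu, sigma):
    """Unnormalized log P(Y=k | x) for a Naive Bayes model that mixes
    Bernoulli likelihoods (boolean features) and Gaussian likelihoods
    (real-valued features). All parameter arrays are indexed by class k."""
    logp = np.log(prior)
    for k in range(len(prior)):
        # Bernoulli terms: P(Xi = xi | Y=k)
        logp[k] += np.sum(x_disc * np.log(theta[k]) + (1 - x_disc) * np.log(1 - theta[k]))
        # Gaussian terms: N(xi ; mu_ik, sigma_ik)
        logp[k] += np.sum(norm.logpdf(x_cont, mu[k], sigma[k]))
    return logp

# two classes, one boolean feature and one real-valued feature (illustrative parameters)
prior = np.array([0.5, 0.5])
theta = np.array([[0.8], [0.2]])     # P(X1 = 1 | Y=k)
mu    = np.array([[0.0], [3.0]])     # mean of X2 given Y=k
sigma = np.array([[1.0], [1.0]])     # std of X2 given Y=k
print(np.argmax(log_posterior(np.array([1]), np.array([2.5]), prior, theta, mu, sigma)))
```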