
Lecture2 Both Part Merged

The document discusses Bayes Decision Theory in the context of classification tasks, specifically using fruit classification as an example. It explains how to compute posterior probabilities for different classes based on a given feature vector, demonstrating this with a fruit classification example and a buying decision scenario. The document also highlights the use of logarithms to handle small probabilities in calculations.


Classifiers: Bayes Decision Theory
Farhana Shahid, Brac University
Problem
● A classification task of M classes: C1, C2, …, CM
● Fruit classification task of M = 3 fruits:
  ● C1: Apple, C2: Orange, C3: Berries
● An unknown pattern is given with feature vector x
  ● x: {color: yellow-green, diameter: 10 cm, weight: 180 g, taste: sweet}
● Form M conditional probabilities: P(Ci | x), i = 1, 2, …, M
● M = 3 posterior probabilities:
  ● P(Apple | x)
  ● P(Orange | x)
  ● P(Berries | x)
● Read P(Ci | x) as “probability of class Ci given feature vector x”

Farhana Shahid, Brac University 2


Most probable class
● Given, feature vector:
  ● x: {color: yellow-green, diameter: 10 cm, weight: 180 g, taste: sweet}
● Posterior probability for each class:
  ● P(Apple | x) = 0.52
  ● P(Orange | x) = 0.78 ← Max: 0.78
  ● P(Berries | x) = 0.04
● Given the features x, the fruit is most probably an Orange

Farhana Shahid, Brac University 3


How to compute the posterior probability P(Ci | x)?
Bayes Decision Theory

Farhana Shahid, Brac University 4


Prior probability
● P(Ci): the probability of class Ci before observing any features
● Estimated from training data as the fraction of samples that belong to class Ci:
  ● P(Ci) ≈ ni / N, where ni is the number of training samples of class Ci and N is the total number of training samples

Farhana Shahid, Brac University 5


Likelihood function
● Given, feature vector:
  ● x = {feature_1: value_1, feature_2: value_2, …, feature_l: value_l}
● Probability density function, P(x | Ci)
  ● Likelihood function of class Ci with respect to x
  ● How likely is the set of features x in a sample of class Ci?
  ● e.g., what is the probability that a person will have cough, runny nose, and fever if s/he is infected with COVID-19?
● Can be calculated from training data

Farhana Shahid, Brac University 6


Bayes rule
● P(Ci | x) = P(x | Ci) P(Ci) / P(x)
  ● Posterior = (Likelihood × Prior) / Evidence
● The evidence P(x) = Σj P(x | Cj) P(Cj), summed over all M classes, is the same for every class (see the sketch below)

Farhana Shahid, Brac University 7
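A minimal sketch of this computation in Python (the priors and likelihoods here are hypothetical, chosen only to show the mechanics):

def posteriors(priors, likelihoods):
    # Bayes rule: P(Ci | x) = P(x | Ci) P(Ci) / P(x),
    # where the evidence P(x) = sum_j P(x | Cj) P(Cj) is shared by all classes.
    evidence = sum(l * p for l, p in zip(likelihoods, priors))
    return [l * p / evidence for l, p in zip(likelihoods, priors)]

# Hypothetical priors and likelihoods for Apple, Orange, Berries:
print(posteriors([0.3, 0.5, 0.2], [0.12, 0.40, 0.05]))  # ≈ [0.146, 0.813, 0.041]

Because the evidence normalizes the products, the returned posteriors always sum to 1.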


Most probable class
● Assign x to class Ci if P(Ci | x) > P(Cj | x) for all j ≠ i
● Same denominator: P(x) is identical for every class, so it can be dropped from the comparison
● Equivalently: choose the class Ci that maximizes P(x | Ci) P(Ci)

Farhana Shahid, Brac University 8


Calculate likelihood function
● Naïve Bayes assumption: the features in x are conditionally independent given the class
● P(x | Ci) = P(x_1 | Ci) × P(x_2 | Ci) × … × P(x_l | Ci), as illustrated below
● Each factor P(x_k | Ci) is estimated from the training samples of class Ci

Farhana Shahid, Brac University 9
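A toy illustration of the factorization (all numbers hypothetical):

# Naïve Bayes: P(x | Ci) = P(x_1 | Ci) × P(x_2 | Ci) × … × P(x_l | Ci)
# Hypothetical per-feature conditionals for the Orange class:
p_color = 0.6     # P(color: yellow-green | Orange), assumed for illustration
p_diameter = 0.5  # P(diameter: 10 cm | Orange), assumed
p_taste = 0.7     # P(taste: sweet | Orange), assumed
likelihood = p_color * p_diameter * p_taste
print(likelihood)  # 0.21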


Example Math
Naïve Bayes

Farhana Shahid, Brac University 1


Feature vector: Age, Income, Student, Credit rating
Class label: Buys computer (supervised learning)

X = {Age: 21, Income: Medium, Student: Yes, Credit rating: Fair}

SL  Age  Income  Student  Credit rating  Buys computer
1   35   Medium  Yes      Fair           Yes
2   30   High    No       Average        No
3   40   Low     Yes      Good           No
4   35   Medium  No       Fair           Yes
5   45   Low     No       Fair           Yes
6   35   High    No       Excellent      Yes
7   35   Medium  No       Good           No
8   25   Low     No       Good           Yes
9   28   High    No       Average        No
10  35   Medium  Yes      Average        Yes

Farhana Shahid, Brac University 2


X = {Age: 21, Income: Medium, Student: Yes, Credit rating: Fair}
● Estimate the priors from the table:
  ● P(Buys computer: yes) = 6/10
  ● P(Buys computer: no) = 4/10
● Estimate the conditional probabilities for the “yes” class (6 samples):
  ● Age is numeric, so it is modeled with a per-class Gaussian: mean = 35, sample variance = 40, giving P(Age: 21 | yes) ≈ 0.0054
  ● P(Income: Medium | yes) = 3/6
  ● P(Student: Yes | yes) = 2/6
  ● P(Credit rating: Fair | yes) = 3/6
● Estimate the conditional probabilities for the “no” class (4 samples):
  ● Age: mean = 33.25, sample variance ≈ 28.92, giving P(Age: 21 | no) ≈ 0.0055
  ● P(Income: Medium | no) = 1/4
  ● P(Student: Yes | no) = 1/4
  ● P(Credit rating: Fair | no) = 0/4 = 0 (no “no” sample has a Fair credit rating)
● Multiply the factors for each class:
  ● P(X | yes) P(yes) ≈ 0.0054 × (3/6) × (2/6) × (3/6) × (6/10) ≈ 2.72 × 10⁻⁴
  ● P(X | no) P(no) ≈ 0.0055 × (1/4) × (1/4) × 0 × (4/10) = 0

Farhana Shahid, Brac University 3–12


Most probable class
● Feature vector, X = {Age: 21, Income: Medium, Student: Yes, Credit rating: Fair}
● P(Buys computer: yes | X) ≈ 2.719 × 10⁻⁴ ← Maximum
● P(Buys computer: no | X) ≈ 0
● Conclusion: Given the features, the person is most likely to buy a computer (see the script below)

Farhana Shahid, Brac University 13
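The whole example can be reproduced with a short script. This is a minimal sketch, assuming Age is modeled with a per-class Gaussian using the sample (n-1) variance; under that assumption it reproduces the slide's numbers:

import math

# Training data from the table: (Age, Income, Student, Credit rating, Buys computer)
data = [
    (35, "Medium", "Yes", "Fair",      "Yes"),
    (30, "High",   "No",  "Average",   "No"),
    (40, "Low",    "Yes", "Good",      "No"),
    (35, "Medium", "No",  "Fair",      "Yes"),
    (45, "Low",    "No",  "Fair",      "Yes"),
    (35, "High",   "No",  "Excellent", "Yes"),
    (35, "Medium", "No",  "Good",      "No"),
    (25, "Low",    "No",  "Good",      "Yes"),
    (28, "High",   "No",  "Average",   "No"),
    (35, "Medium", "Yes", "Average",   "Yes"),
]

def gaussian(x, values):
    # Gaussian density using the class's sample mean and (n-1) variance (assumed)
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)
    return math.exp(-((x - mean) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

def score(age, income, student, credit, label):
    rows = [r for r in data if r[4] == label]
    prior = len(rows) / len(data)                               # P(Ci)
    p_age = gaussian(age, [r[0] for r in rows])                 # P(Age | Ci)
    p_income = sum(r[1] == income for r in rows) / len(rows)    # P(Income | Ci)
    p_student = sum(r[2] == student for r in rows) / len(rows)  # P(Student | Ci)
    p_credit = sum(r[3] == credit for r in rows) / len(rows)    # P(Credit | Ci)
    return prior * p_age * p_income * p_student * p_credit

print(score(21, "Medium", "Yes", "Fair", "Yes"))  # ≈ 2.72e-4
print(score(21, "Medium", "Yes", "Fair", "No"))   # 0.0 (Fair never co-occurs with "no")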


Too small probabilities
● 0 ≤ P ≤ 1
● 0.7 × 0.3 × 0.2 × 0.5 = 0.021
● 0.7 × 0.3 × 0.2 × 0.5 × 0.1 × 0.33 × 0.47 = 0.00032571
● Solution: Consider logarithms instead of actual probabilities!

Farhana Shahid, Brac University 14


Natural logarithm of probabilities
● ln(M × N) = ln M + ln N
● Products of probabilities become sums of log-probabilities (checked below)

Farhana Shahid, Brac University 15
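A quick check in Python: summing logs gives the same answer as multiplying, without the shrinking product:

import math

probs = [0.7, 0.3, 0.2, 0.5, 0.1, 0.33, 0.47]
log_score = sum(math.log(p) for p in probs)  # a sum instead of a product
print(log_score)            # ≈ -8.0295
print(math.exp(log_score))  # ≈ 0.00032571, the product from the "Too small probabilities" slide

Comparing classes by their summed log-probabilities gives the same ranking as comparing the raw products, since ln is monotonically increasing.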


Most probable class
● ln P(Buys computer: yes | X) ≈ -8.21 ← Maximum
● ln P(Buys computer: no | X) ≈ -13.37

Farhana Shahid, Brac University 16
