AS Lecture 12 (Naive Bayes Classifier)


NAÏVE BAYES CLASSIFIER

Lecture # 12

Dr. Imran Khalil


[email protected]
Contents
• Naïve Bayes Classifier
• Probability Basics
• Conditional Probability
• Applied Examples
• Assignment

Naïve Bayes Algorithm
• Naive Bayes is a family of probabilistic classification algorithms based on
Bayes' theorem, known for its simplicity, efficiency, and effectiveness.
• The "naive" part of the name comes from the assumption that the features
(attributes) used for classification are conditionally independent given the
class label. In other words, each feature is assumed to contribute
independently to the probability of a given class.
• The basic idea behind Naive Bayes is to compute, for each class, the
probability that a data point belongs to it, based on the probabilities of the
individual features, and to pick the most probable class.
• It is widely used in applications such as spam email filtering and document
categorization.
Basics of Probability

Flip a fair coin, what is the probability of getting a head?

P(head) = 1/2
Basics of Probability

Pick a random card, what is the probability of getting a queen?


P(queen) = 4/52 = 1/13
Basics of Probability

Pick a random card, you know it is a diamond, now what is the


probability of that card being a queen?
Total diamonds = 13, queens among them = 1

P(queen | diamond) = 1/13
Conditional Probability
P(queen | diamond) = 1/13

P(A | B) = probability of event A, given that event B has already occurred.
Conditional Probability
Bayes' theorem (Thomas Bayes):

P(A | B) = P(B | A) · P(A) / P(B)

P(queen | diamond) = P(diamond | queen) · P(queen) / P(diamond)
Conditional Probability
P(queen | diamond) = P(diamond | queen) · P(queen) / P(diamond)

P(diamond | queen) = 1/4
P(queen) = 4/52
P(diamond) = 13/52

P(queen | diamond) = ((1/4) · (4/52)) / (13/52) = 1/13
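The card calculation above can be checked numerically. A minimal sketch using exact fractions (variable names are illustrative):

```python
from fractions import Fraction

# Known probabilities for a standard 52-card deck.
p_queen = Fraction(4, 52)               # 4 queens in the deck
p_diamond = Fraction(13, 52)            # 13 diamonds in the deck
p_diamond_given_queen = Fraction(1, 4)  # 1 of the 4 queens is a diamond

# Bayes' theorem: P(queen | diamond) = P(diamond | queen) * P(queen) / P(diamond)
p_queen_given_diamond = p_diamond_given_queen * p_queen / p_diamond

print(p_queen_given_diamond)  # 1/13
```

Using `Fraction` keeps the arithmetic exact, so the result matches the hand derivation with no rounding.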
Applied Example-I
No. Color Type Origin Stolen
1 Red Sports Domestic Yes
2 Red Sports Domestic No
3 Red Sports Domestic Yes
4 Yellow Sports Domestic No
5 Yellow Sports Imported Yes
6 Yellow SUV Imported No
7 Yellow SUV Imported Yes
8 Yellow SUV Domestic No
9 Red SUV Imported No
10 Red Sports Imported Yes

Is the car x stolen or not, given the following features:
x = (Red, SUV, Domestic)?
Applied Example-I
P(A | B) = P(B | A) · P(A) / P(B)

P(Yes | x) = ?
P(No | x) = ?
Applied Example-I
Feature probabilities for the class Yes:

P(Red | Yes) = P(Yes | Red) · P(Red) / P(Yes) = ((3/5) · (5/10)) / (5/10) = 3/5

P(SUV | Yes) = P(Yes | SUV) · P(SUV) / P(Yes) = ((1/4) · (4/10)) / (5/10) = 1/5

P(Domestic | Yes) = P(Yes | Domestic) · P(Domestic) / P(Yes) = ((2/5) · (5/10)) / (5/10) = 2/5
Applied Example-I
Feature probabilities for the class No, counted directly from the five "No"
rows of the table (note that, in general, P(feature | No) ≠ 1 − P(feature | Yes)):

P(Red | No) = 2/5        (red cars among the 5 not stolen: rows 2, 9)

P(SUV | No) = 3/5        (SUVs among the 5 not stolen: rows 6, 8, 9)

P(Domestic | No) = 3/5   (domestic cars among the 5 not stolen: rows 2, 4, 8)
Applied Example-I
Scoring the class Yes (the score is proportional to the posterior; the shared
denominator P(x) is dropped):

P(Red | Yes) = 3/5
P(SUV | Yes) = 1/5
P(Domestic | Yes) = 2/5

P(Yes | x) ∝ P(Yes) · P(Red | Yes) · P(SUV | Yes) · P(Domestic | Yes)

P(Yes | x) ∝ (1/2) · (3/5) · (1/5) · (2/5) = 6/250 = 0.024
Applied Example-I
Scoring the class No:

P(Red | No) = 2/5
P(SUV | No) = 3/5
P(Domestic | No) = 3/5

P(No | x) ∝ P(No) · P(Red | No) · P(SUV | No) · P(Domestic | No)

P(No | x) ∝ (1/2) · (2/5) · (3/5) · (3/5) = 18/250 = 0.072
Applied Example-I

Is the car x stolen or not, given the following features:
x = (Red, SUV, Domestic)?

P(No | x) > P(Yes | x), since 0.072 > 0.024.

Therefore, the car x is classified as not stolen.
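The whole Example-I computation can be reproduced from the raw table. A minimal sketch with exact fractions and no smoothing (function and variable names are illustrative):

```python
from fractions import Fraction

# The 10-row training set from the slide: (Color, Type, Origin, Stolen)
data = [
    ("Red",    "Sports", "Domestic", "Yes"),
    ("Red",    "Sports", "Domestic", "No"),
    ("Red",    "Sports", "Domestic", "Yes"),
    ("Yellow", "Sports", "Domestic", "No"),
    ("Yellow", "Sports", "Imported", "Yes"),
    ("Yellow", "SUV",    "Imported", "No"),
    ("Yellow", "SUV",    "Imported", "Yes"),
    ("Yellow", "SUV",    "Domestic", "No"),
    ("Red",    "SUV",    "Imported", "No"),
    ("Red",    "Sports", "Imported", "Yes"),
]

def nb_score(x, label):
    """Unnormalized posterior: P(label) * prod_i P(x_i | label), from raw counts."""
    rows = [r for r in data if r[-1] == label]
    score = Fraction(len(rows), len(data))        # class prior
    for i, value in enumerate(x):
        matches = sum(1 for r in rows if r[i] == value)
        score *= Fraction(matches, len(rows))     # conditional feature probability
    return score

x = ("Red", "SUV", "Domestic")
print(float(nb_score(x, "Yes")))   # 0.024
print(float(nb_score(x, "No")))    # 0.072
```

Counting directly from the class-restricted rows gives the same conditionals as the hand derivation, and the larger "No" score reproduces the slide's conclusion.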
Applied Example-II

Fruit    Yellow   Sweet   Long   Total Available
Mango     350      450      0       650
Banana    400      300     350      400
Others     50      100      50      150
Total     800      850     400     1200

Find the most probable fruit x with the following features:
x = (Yellow, Sweet, Long)?
Applied Example-II
P(Mango | x) ∝ P(Mango) · P(Yellow | Mango) · P(Sweet | Mango) · P(Long | Mango)

P(Banana | x) ∝ P(Banana) · P(Yellow | Banana) · P(Sweet | Banana) · P(Long | Banana)

P(Others | x) ∝ P(Others) · P(Yellow | Others) · P(Sweet | Others) · P(Long | Others)
Applied Example-II
P(Mango) = 650/1200 = 0.542
P(Banana) = 400/1200 = 0.333
P(Others) = 150/1200 = 0.125

P(Yellow) = 800/1200 = 0.667
P(Sweet) = 850/1200 = 0.708
P(Long) = 400/1200 = 0.333
Applied Example-II
P(Yellow | Mango) = P(Mango | Yellow) · P(Yellow) / P(Mango) = ((350/800) · 0.667) / 0.542 = 0.538

P(Sweet | Mango) = P(Mango | Sweet) · P(Sweet) / P(Mango) = ((450/850) · 0.708) / 0.542 = 0.692

P(Long | Mango) = P(Mango | Long) · P(Long) / P(Mango) = ((0/400) · 0.333) / 0.542 = 0

(Equivalently, counting directly from the table: P(Yellow | Mango) = 350/650 = 0.538.)
Applied Example-II
P(Yellow | Mango) = 0.538
P(Sweet | Mango) = 0.692
P(Long | Mango) = 0

P(Mango | x) ∝ P(Mango) · P(Yellow | Mango) · P(Sweet | Mango) · P(Long | Mango)

P(Mango | x) ∝ 0.542 · 0.538 · 0.692 · 0 = 0
Applied Example-II
P(Yellow | Banana) = P(Banana | Yellow) · P(Yellow) / P(Banana) = ((400/800) · 0.667) / 0.333 = 1

P(Sweet | Banana) = P(Banana | Sweet) · P(Sweet) / P(Banana) = ((300/850) · 0.708) / 0.333 = 0.75

P(Long | Banana) = P(Banana | Long) · P(Long) / P(Banana) = ((350/400) · 0.333) / 0.333 = 0.875
Applied Example-II
P(Yellow | Banana) = 1
P(Sweet | Banana) = 0.75
P(Long | Banana) = 0.875

P(Banana | x) ∝ P(Banana) · P(Yellow | Banana) · P(Sweet | Banana) · P(Long | Banana)

P(Banana | x) ∝ 0.333 · 1 · 0.75 · 0.875 = 0.219
Applied Example-II
P(Yellow | Others) = P(Others | Yellow) · P(Yellow) / P(Others) = ((50/800) · 0.667) / 0.125 = 0.333

P(Sweet | Others) = P(Others | Sweet) · P(Sweet) / P(Others) = ((100/850) · 0.708) / 0.125 = 0.667

P(Long | Others) = P(Others | Long) · P(Long) / P(Others) = ((50/400) · 0.333) / 0.125 = 0.333
Applied Example-II
P(Yellow | Others) = 0.333
P(Sweet | Others) = 0.667
P(Long | Others) = 0.333

P(Others | x) ∝ P(Others) · P(Yellow | Others) · P(Sweet | Others) · P(Long | Others)

P(Others | x) ∝ 0.125 · 0.333 · 0.667 · 0.333 = 0.0093
Applied Example-II

Find the most probable fruit x with the following features:
x = (Yellow, Sweet, Long)?

P(Banana | x) > P(Others | x) > P(Mango | x)

0.219 > 0.0093 > 0

Therefore, x is most probably a Banana. (Note that the single zero count for
long mangoes forces P(Mango | x) to zero; in practice this is avoided with
Laplace smoothing.)
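Example-II works from aggregated counts rather than raw rows. A minimal sketch of the same scoring (names are illustrative):

```python
# Aggregated counts from the slide's fruit table:
# fruit -> (yellow, sweet, long, total available)
counts = {
    "Mango":  (350, 450,   0, 650),
    "Banana": (400, 300, 350, 400),
    "Others": ( 50, 100,  50, 150),
}
GRAND_TOTAL = 1200  # total fruits

def nb_score(fruit):
    """Unnormalized posterior: P(fruit) * P(Yellow|fruit) * P(Sweet|fruit) * P(Long|fruit)."""
    yellow, sweet, long_, total = counts[fruit]
    prior = total / GRAND_TOTAL
    return prior * (yellow / total) * (sweet / total) * (long_ / total)

scores = {fruit: nb_score(fruit) for fruit in counts}
best = max(scores, key=scores.get)
print(best)  # Banana
```

Computing the conditionals directly as count/total avoids the rounding drift that accumulates when intermediate probabilities are truncated to two decimals.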
Applied Example-III
Day Outlook Temperature Humidity Wind Play Golf
1 Sunny Hot High Weak No
2 Sunny Hot High Strong No
3 Overcast Hot High Weak Yes
4 Rain Mild High Weak Yes
5 Rain Cool Normal Weak Yes
6 Rain Cool Normal Strong No
7 Overcast Cool Normal Strong Yes
8 Sunny Mild High Weak No
9 Sunny Cool Normal Weak Yes
10 Rain Mild Normal Weak Yes
11 Sunny Mild Normal Strong Yes
12 Overcast Mild High Strong Yes
13 Overcast Hot Normal Weak Yes
14 Rain Mild High Strong No
Applied Example-III

Given the following features for a new instance, determine whether the player
will play golf or not.

Outlook = Sunny
Temperature = Cool
Humidity = High
Wind = Strong
Applied Example-III

P(Play = Yes) = 9/14 = 0.64

P(Play = No) = 5/14 = 0.36
Applied Example-III

Outlook     Yes            No
Sunny       2/9 = 0.222    3/5 = 0.6
Overcast    4/9 = 0.444    0/5 = 0
Rain        3/9 = 0.333    2/5 = 0.4
Applied Example-III

Humidity    Yes            No
High        3/9 = 0.333    4/5 = 0.8
Normal      6/9 = 0.667    1/5 = 0.2
Applied Example-III

Wind      Yes            No
Strong    3/9 = 0.333    3/5 = 0.6
Weak      6/9 = 0.667    2/5 = 0.4
Applied Example-III

Temperature    Yes            No
Hot            2/9 = 0.222    2/5 = 0.4
Mild           4/9 = 0.444    2/5 = 0.4
Cool           3/9 = 0.333    1/5 = 0.2
Applied Example-III

v_NB = argmax_{v_j ∈ {Yes, No}} P(v_j) · ∏_i P(a_i | v_j)

v_NB = argmax_{v_j ∈ {Yes, No}} P(v_j) · P(Outlook = Sunny | v_j)
                                       · P(Temperature = Cool | v_j)
                                       · P(Humidity = High | v_j)
                                       · P(Wind = Strong | v_j)
Applied Example-III

v_NB(Yes) = P(Yes) · P(Sunny | Yes) · P(Cool | Yes) · P(High | Yes) · P(Strong | Yes)
v_NB(Yes) = 0.64 · 0.222 · 0.333 · 0.333 · 0.333 = 0.0052

v_NB(No) = P(No) · P(Sunny | No) · P(Cool | No) · P(High | No) · P(Strong | No)
v_NB(No) = 0.36 · 0.6 · 0.2 · 0.8 · 0.6 = 0.0207
Applied Example-III
Probability Normalization

v_NB(Yes) = 0.0052
v_NB(No) = 0.0207

P(Yes | x) = v_NB(Yes) / (v_NB(Yes) + v_NB(No)) = 0.0052 / (0.0052 + 0.0207) = 0.2008
P(No | x) = v_NB(No) / (v_NB(Yes) + v_NB(No)) = 0.0207 / (0.0052 + 0.0207) = 0.7992

Check: 0.2008 + 0.7992 = 1


Applied Example-III
Given the following features for a new instance, determine whether the player
will play golf or not.

Outlook = Sunny
Temperature = Cool
Humidity = High
Wind = Strong

P(Yes | x) < P(No | x)

0.2008 < 0.7992

Therefore, the player will not play golf.
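The full play-golf computation, including normalization, can be reproduced from the 14-row table. A minimal sketch (names are illustrative):

```python
# The 14-day play-golf dataset: (Outlook, Temperature, Humidity, Wind, Play)
data = [
    ("Sunny",    "Hot",  "High",   "Weak",   "No"),
    ("Sunny",    "Hot",  "High",   "Strong", "No"),
    ("Overcast", "Hot",  "High",   "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Cool", "Normal", "Strong", "No"),
    ("Overcast", "Cool", "Normal", "Strong", "Yes"),
    ("Sunny",    "Mild", "High",   "Weak",   "No"),
    ("Sunny",    "Cool", "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "Normal", "Weak",   "Yes"),
    ("Sunny",    "Mild", "Normal", "Strong", "Yes"),
    ("Overcast", "Mild", "High",   "Strong", "Yes"),
    ("Overcast", "Hot",  "Normal", "Weak",   "Yes"),
    ("Rain",     "Mild", "High",   "Strong", "No"),
]

def nb_score(x, label):
    """Unnormalized posterior: P(label) * prod_i P(x_i | label)."""
    rows = [r for r in data if r[-1] == label]
    score = len(rows) / len(data)  # class prior
    for i, value in enumerate(x):
        score *= sum(1 for r in rows if r[i] == value) / len(rows)
    return score

x = ("Sunny", "Cool", "High", "Strong")
raw = {c: nb_score(x, c) for c in ("Yes", "No")}
total = sum(raw.values())                       # normalizing constant
posterior = {c: raw[c] / total for c in raw}    # probabilities summing to 1
print(posterior)
```

The exact posteriors come out near 0.205 / 0.795; the slide's 0.2008 / 0.7992 differ slightly only because the intermediate probabilities were rounded before multiplying.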
Assignment

Consider a football game between two rival teams, say team A and team B.
Suppose team A wins 65% of the time and team B wins the remaining matches.
Among the games won by team A, only 35% come from playing at team B's
football field. On the other hand, 75% of the victories for team B are
obtained while playing at home.
1. If team B is to host the next match between the two teams, what is the
probability that it will emerge as the winner?
2. If team B is to host the next match between the two teams, who will
emerge as the winner?
Assignment - Solution

Let
Y = winning the football match
X = hosting the football match

Probability that team A wins: P(Y_A) = 0.65
Probability that team B wins: P(Y_B) = 1 − 0.65 = 0.35
Probability that team B hosted a match it won: P(X_B | Y_B) = 0.75
Probability that team B hosted a match won by team A: P(X_B | Y_A) = 0.35
Assignment - Solution

1. If team B is to host the next match between the two teams, what is the
probability that it will emerge as the winner?

P(Y_B | X_B) = P(X_B | Y_B) · P(Y_B) / P(X_B)
P(Y_B | X_B) = P(X_B | Y_B) · P(Y_B) / (P(X_B | Y_A) · P(Y_A) + P(X_B | Y_B) · P(Y_B))
P(Y_B | X_B) = (0.75 · 0.35) / ((0.35 · 0.65) + (0.75 · 0.35))
P(Y_B | X_B) = 0.2625 / 0.49 = 0.5357
Assignment - Solution

2. It team B is to host the next match between the two teams, who
will emerge as the winner?

𝑃(𝑋𝐵 |𝑌𝐴 ) ∙ 𝑃(𝑌𝐴 )


𝑃 𝑌𝐴 𝑋𝐵 =
𝑃(𝑋𝐵 )
𝑃(𝑋𝐵 |𝑌𝐴 )∙𝑃(𝑌𝐴 )
𝑃 𝑌𝐴 𝑋𝐵 =
𝑃 𝑋𝐵 𝑌𝐴 𝑃 𝑌𝐴 +𝑃 𝑋𝐵 𝑌𝐵 𝑃 𝑌𝐵
0.35 ∙ 0.65
𝑃 𝑌𝐴 𝑋𝐵 =
(0.35 ∙ 0.65)+(0.75 ∙ 0.35)
𝑃 𝑌𝐴 𝑋𝐵 = 0.4542

41
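The assignment's Bayes computation can be checked numerically. A minimal sketch (variable names are illustrative):

```python
# Winning/hosting probabilities from the assignment statement.
p_A = 0.65             # P(team A wins)
p_B = 0.35             # P(team B wins)
p_host_given_B = 0.75  # P(B hosts | B wins)
p_host_given_A = 0.35  # P(B hosts | A wins)

# Law of total probability: P(B hosts the match)
p_host = p_host_given_A * p_A + p_host_given_B * p_B

# Bayes' theorem for each team winning, given that B hosts.
p_B_given_host = p_host_given_B * p_B / p_host
p_A_given_host = p_host_given_A * p_A / p_host

print(round(p_B_given_host, 4))  # 0.5357
print(round(p_A_given_host, 4))  # 0.4643
```

The two posteriors sum to 1 by construction, which is a quick sanity check on the denominator.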
Acknowledgment
• [Peter Andrew Bruce] Practical Statistics for Data Scientists
• [David Forsyth] Probability and Statistics for Computer Science
• [Michael Baron] Probability and Statistics for Computer Scientists