Essentials of Machine Learning - Lesson 02
Lesson 02 - Probability
• Quantification of uncertainty
• Frequentist interpretation: long run frequencies of events
e.g.: The probability of a particular coin landing heads up is 0.5
• Bayesian interpretation: quantify our degrees of belief about something
e.g.: the probability of it raining tomorrow is 0.3
• Not possible to repeat "tomorrow" many times
• Basic rules of probability are the same, no matter which interpretation is
adopted
• Discrete random variables take on values from a finite or countably infinite set
• Examples:
• Colour of a car: blue, green, red
• Number of children in a family: 0, 1, 2, 3, 4, 5, 6, > 6
• Toss two coins, and let X = (number of heads)². X can take on the values 0, 1 and 4.
• Example: p(Colour = red) = 0.3
• ∑ₓ P(x) = 1
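The normalization rule can be checked with a small sketch of the two-coin example above, where X = (number of heads)²:

```python
from itertools import product

# Enumerate the four equally likely outcomes of tossing two fair coins
# and build the pmf of X = (number of heads)^2.
pmf = {}
for toss in product(["H", "T"], repeat=2):
    x = toss.count("H") ** 2          # X takes the values 0, 1, 4
    pmf[x] = pmf.get(x, 0) + 0.25     # each outcome has probability 1/4

print({x: pmf[x] for x in sorted(pmf)})  # {0: 0.25, 1: 0.5, 4: 0.25}
print(sum(pmf.values()))                 # 1.0 — probabilities sum to one
```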
• Continuous RVs take on values that vary continuously within one or more real
intervals
• Probability density function (pdf) p(x) for a continuous random variable X
P(a ≤ X ≤ b) = ∫ₐᵇ p(x) dx
therefore
P(x ≤ X ≤ x + δx) ≈ p(x) δx
• ∫ 𝑝 𝑥 𝑑𝑥 = 1 (but values of p(x) can be greater than 1)
• Examples (coming soon): Gaussian, Gamma, Exponential, Beta
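A short sketch of both points above — a pdf value can exceed 1, while the probability of an interval (the integral of the pdf) never does. It uses a Gaussian with an assumed σ = 0.1 and a simple Riemann sum in place of the exact integral:

```python
import math

def gauss_pdf(x, mu=0.0, sigma=0.1):
    """Gaussian density; with sigma = 0.1 the peak value exceeds 1."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# p(x) itself is not a probability and can be greater than 1:
print(gauss_pdf(0.0))  # ≈ 3.989

# P(a <= X <= b) = ∫ p(x) dx, approximated by a midpoint Riemann sum
a, b, n = -0.1, 0.1, 10_000
dx = (b - a) / n
prob = sum(gauss_pdf(a + (i + 0.5) * dx) * dx for i in range(n))
print(prob)  # ≈ 0.683, i.e. within one standard deviation of the mean
```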
• Let X and Y be two disjoint groups of variables, such that p(Y = y) > 0. Then the conditional
probability distribution (CPD) of X given Y = y is given by:
p(X = x | Y = y) = p(x | y) = p(x, y) / p(y)
• Product rule
p(X, Y) = p(X) p(Y | X) = p(Y) p(X | Y)
• Example: In the grades example, what is p(Intelligence = high|Grade = A)?
• ∑ₓ p(X = x | Y = y) = 1 for all y
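The CPD definition and its normalization can be sketched with a tiny joint table. The probabilities below are invented for illustration — they are not the numbers from the grades example in the lecture:

```python
# Illustrative joint distribution p(Intelligence, Grade); entries sum to 1.
joint = {
    ("high", "A"): 0.18, ("high", "B"): 0.12,
    ("low",  "A"): 0.10, ("low",  "B"): 0.60,
}

# Marginal p(Grade = A): sum the joint over Intelligence
p_grade_A = sum(p for (i, g), p in joint.items() if g == "A")

# CPD: p(Intelligence = high | Grade = A) = p(high, A) / p(A)
p_high_given_A = joint[("high", "A")] / p_grade_A
print(p_high_given_A)  # 0.18 / 0.28 ≈ 0.643

# The conditional distribution sums to 1 over Intelligence for this grade
total = sum(joint[(i, "A")] / p_grade_A for i in ("high", "low"))
print(total)  # ≈ 1.0
```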
• Bayes' rule:
P(X | Y) = p(Y | X) p(X) / p(Y) = p(Y | X) p(X) / ∑ₓ p(Y | X = x) p(X = x)
• Person gets a positive test result. What is p(TB = yes |Test = yes)?
P(TB = yes | Test = yes)
= P(Test = yes | TB = yes) P(TB = yes) / [P(Test = yes | TB = yes) P(TB = yes) + P(Test = yes | TB = no) P(TB = no)]
= (0.95 × 0.001) / (0.95 × 0.001 + 0.05 × 0.999) ≈ 0.0188
Thushari Silva, PhD Essentials of Machine Learning
Independence