Chapter 6: Probability
Lecture 6
Contents
1 Introduction
2 Dependence and independence of events
3 Conditional Probability
4 Bayes’s Theorem
5 Random Variables
6 Continuous Distributions
7 Probability Density Function
8 Cumulative Distribution Function
9 The Normal Distribution
10 The Central Limit Theorem
Introduction
Dependence and independence of events
Conditional Probability
Conditional Probability (Contd.)
When E and F are independent, you can check that the definition P(E | F) = P(E, F)/P(F) gives:
P(E | F) = P(E)
which is the mathematical way of expressing that knowing F
occurred gives us no additional information about whether E
occurred.
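A quick simulation makes this concrete. The sketch below (assumed setup, not from the slides) defines E and F as independent coin-flip events and checks empirically that P(E | F) ≈ P(E):

```python
import random

random.seed(0)

def flip():
    """One fair coin flip: True = heads."""
    return random.random() < 0.5

# E: the second flip is heads; F: the first flip is heads.
# These events are independent, so P(E | F) should match P(E).
trials = 100_000
e_count = 0    # occurrences of E
f_count = 0    # occurrences of F
ef_count = 0   # occurrences of both E and F

for _ in range(trials):
    first, second = flip(), flip()
    if second:
        e_count += 1
    if first:
        f_count += 1
        if second:
            ef_count += 1

p_e = e_count / trials
p_e_given_f = ef_count / f_count   # P(E | F) = P(E, F) / P(F)

print(f"P(E)     ~ {p_e:.3f}")
print(f"P(E | F) ~ {p_e_given_f:.3f}")
```

Both estimates come out near 0.5, as independence predicts.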
Bayes’s Theorem
Bayes’s Theorem
The event F can be split into the two mutually exclusive events "F
and E" and "F and not E." If we write −E for "not E" (i.e., "E doesn't
happen"), then:
P(F) = P(F, E) + P(F, −E)
so that:
P(E | F) = P(F | E) P(E) / [P(F | E) P(E) + P(F | −E) P(−E)]
which is how Bayes’s theorem is often stated.
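The formula translates directly into code. The sketch below applies it to an assumed example (not from these slides): a disease E affecting 1 in 10,000 people and a test F that is positive 99% of the time for the sick and 1% of the time for the healthy:

```python
def bayes(p_f_given_e, p_e, p_f_given_not_e):
    """P(E | F) via Bayes's theorem, expanding P(F) over E and not-E:
    P(E | F) = P(F | E) P(E) / [P(F | E) P(E) + P(F | -E) P(-E)]
    """
    p_not_e = 1 - p_e
    numerator = p_f_given_e * p_e
    return numerator / (numerator + p_f_given_not_e * p_not_e)

# Assumed numbers for illustration: rare disease, fairly accurate test.
p_sick_given_positive = bayes(p_f_given_e=0.99,
                              p_e=0.0001,
                              p_f_given_not_e=0.01)
print(p_sick_given_positive)  # roughly 0.0098
```

Even with a 99%-accurate test, a positive result implies less than a 1% chance of disease, because the disease itself is so rare.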
Random Variables
Continuous Distributions
Probability Density Function
Cumulative Distribution Function
The Normal Distribution
Normal Distribution (Contd.)
Normal Distribution (Contd.)
The Central Limit Theorem
Central Limit Theorem (Contd.)
Central Limit Theorem (Contd.)
The mean of a Bernoulli(p) variable is p, and its standard deviation
is √(p(1 − p)).
The central limit theorem says that as n gets large, a Binomial(n, p)
variable is approximately a normal random variable with mean
µ = np and standard deviation σ = √(np(1 − p)).
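A minimal simulation sketch (assumed parameters n = 100, p = 0.5) checks this: it draws many Binomial(n, p) samples by summing Bernoulli trials and compares their empirical mean and standard deviation with np and √(np(1 − p)):

```python
import math
import random

random.seed(0)

def bernoulli_trial(p):
    """Return 1 with probability p, else 0."""
    return 1 if random.random() < p else 0

def binomial(n, p):
    """A Binomial(n, p) draw: the sum of n Bernoulli(p) trials."""
    return sum(bernoulli_trial(p) for _ in range(n))

n, p = 100, 0.5
mu = n * p                           # np
sigma = math.sqrt(n * p * (1 - p))   # sqrt(np(1 - p))

# Empirical mean and standard deviation over many draws.
samples = [binomial(n, p) for _ in range(10_000)]
sample_mean = sum(samples) / len(samples)
sample_var = sum((x - sample_mean) ** 2 for x in samples) / len(samples)
sample_std = math.sqrt(sample_var)

print(f"theoretical: mu = {mu}, sigma = {sigma:.2f}")
print(f"empirical:   mean ~ {sample_mean:.2f}, std ~ {sample_std:.2f}")
```

The empirical values land close to µ = 50 and σ = 5, matching the normal approximation.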
References
[1] Data Science from Scratch: First Principles with Python by Joel Grus
Thank You
Any Questions?