Week 3 & 4
F_X(x) = P(X \le x)
p_X(x) = P(X = x)
PMF vs. CDF
If you toss a fair coin 8 times, let X denote the number of heads you get. The
probability mass function and cumulative distribution function of X are
p_X(k) = \binom{8}{k} (1/2)^8,  F_X(k) = \sum_{j=0}^{k} \binom{8}{j} (1/2)^8,  k = 0, 1, ..., 8
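A minimal numerical sketch of these two functions, assuming scipy is available (scipy.stats.binom is the standard binomial object; the variable names are my own):

```python
# PMF and CDF of X ~ Binomial(8, 0.5): the number of heads in 8 fair tosses.
from scipy.stats import binom

n, p = 8, 0.5
for k in range(n + 1):
    pmf = binom.pmf(k, n, p)  # p_X(k) = P(X = k)
    cdf = binom.cdf(k, n, p)  # F_X(k) = P(X <= k), the running sum of the PMF
    print(f"k={k}: pmf={pmf:.4f}, cdf={cdf:.4f}")
```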
Binomial random variable
Consider n = 6 and k = 3: P(X = 3) = \binom{6}{3} p^3 (1 - p)^3.
Urn example
You have an urn with 10 blue and 20 red balls inside. You pick 9 balls with
replacement. Let X be the number of blue balls. What is P(X = 3)?
• "With replacement" is critical here: I can draw the same ball twice.
• We have P(blue) = \frac{10}{30} = \frac{1}{3}, and P(red) = \frac{2}{3}.
• Therefore the probability is P(X = 3) = \binom{9}{3} \left(\frac{1}{3}\right)^3 \left(\frac{2}{3}\right)^6.
• 1 - P(X = 0) = 1 - (1 - 0.3)^{10}
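Since the draws are with replacement, X is just Binomial(9, 1/3); a quick sketch to check the numbers (scipy assumed available):

```python
# The urn example reduces to a binomial because draws are with replacement.
from math import comb
from scipy.stats import binom

p_blue = 10 / 30
print(binom.pmf(3, 9, p_blue))           # P(X = 3) ~= 0.2731
print(comb(9, 3) * (1/3)**3 * (2/3)**6)  # same value by the formula above
print(1 - binom.pmf(0, 10, 0.3))         # the last bullet: 1 - (1 - 0.3)^10
```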
Geometric random variable
You repeat a Bernoulli experiment, and you won't stop until you see the
desired outcome.
Geometric random variable
• Alice keeps buying lottery tickets until she wins a hundred million dollars.
She is interested in the random variable "number of lottery tickets
bought until winning 100M$".
• Alice tries to catch a taxi. How many occupied taxis will drive past
before she finds a vacant one?
Geometric random variable
• X = k means that you get a head on the k-th toss, and you get tails
on all k - 1 tosses before.
PMF of geometric random variable
P(X = k) = (1 - p)^{k-1} p,  k = 1, 2, 3, ...
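A short sketch of this PMF, assuming scipy's geom (which, like the convention here, counts trials up to and including the first success):

```python
# Geometric PMF: P(X = k) = (1 - p)^(k-1) * p for k = 1, 2, ...
from scipy.stats import geom

p = 0.5
for k in range(1, 6):
    print(k, geom.pmf(k, p), (1 - p)**(k - 1) * p)  # the two columns agree
```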
Geometric random variable
• You failed in the first k - 1 trials. The k-th might be a success, but
who knows.
Geometric random variable
• P(X \ge k) = \sum_{j=k}^{\infty} p(1 - p)^{j-1} = (1 - p)^{k-1}. You get the sum by
summing the geometric series.
• X > k is equivalent to X \ge k + 1, so
P(X > k) = P(X \ge k + 1) = (1 - p)^k
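A quick check of the tail formula with scipy's survival function (a sketch; p and k are chosen arbitrarily):

```python
# P(X > k) = (1 - p)^k for a geometric random variable.
from scipy.stats import geom

p, k = 0.3, 4
print(geom.sf(k, p))  # survival function: P(X > k)
print((1 - p)**k)     # closed form; the two match
```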
Geometric: mean and variance
E[X] = \frac{1}{p},  Var[X] = \frac{1 - p}{p^2}
See page 156 of Ross for a proof.
Intuitively, the smaller p, the more times you have to try to get the first
success.
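A numerical sanity check of both formulas (a sketch; scipy's geom uses the same trial-counting convention as here):

```python
# E[X] = 1/p and Var[X] = (1 - p)/p^2 for X ~ Geom(p).
from scipy.stats import geom

for p in (0.1, 0.5, 0.9):
    print(p, geom.mean(p), 1 / p)          # means agree
    print(p, geom.var(p), (1 - p) / p**2)  # variances agree
```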
Memorylessness of the geometric random variable
• The geometric random variable is memoryless: P(X > a + b | X > a) = P(X > b).
• Basically, you keep doing your poor experiments, and the b further failures
needed are unaffected by the a failures you already have.
Example of memoryless
Suppose X ~ Geom(0.1), so E[X] = 10. What is E[X | X > 9]?
E[X | X > 9] = \sum_{x=10}^{\infty} x P(X = x | X > 9)
             = \sum_{x=10}^{\infty} x P(X = x - 9)   (by memorylessness)
             = 9 + E[X] = 19
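A Monte Carlo sketch of this identity, assuming (my reading of the numbers above) X ~ Geom(0.1) so that E[X] = 10:

```python
# Simulate E[X | X > 9] = 9 + E[X] = 19 for X ~ Geom(0.1).
import numpy as np

rng = np.random.default_rng(0)
x = rng.geometric(p=0.1, size=1_000_000)  # number of trials until first success
print(x[x > 9].mean())                    # ~= 19, by memorylessness
```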
Negative binomial random variable
• The geometric PMF describes how many times you have to try until
the first success.
• We go one step further from geometric: how many times do you have
to try until the r-th success?
Negative binomial random variable
• Denote it as X ~ NegBin(p, r).
• PMF:
P(X = n) = \binom{n-1}{r-1} p^r (1 - p)^{n-r},  n = r, r + 1, r + 2, ...
Negative binomial random variable
• So the last trial (the n-th) must be the r-th success, right? That's when
we stop. The other r - 1 successes can land anywhere among the first
n - 1 trials, which gives the \binom{n-1}{r-1} factor.
Mean and variance
Suppose X ~ NegBin(p, r).
E[X] = \frac{r}{p},  Var[X] = \frac{r(1 - p)}{p^2}
See page 159 of Ross for a proof.
Example
• X ~ NegBin(0.7, 3)
• P(X = 5) = \binom{4}{2} 0.7^3 (1 - 0.7)^2 = 0.185
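A sketch verifying this number. One caveat: scipy's nbinom counts failures before the r-th success, so the total trial count n maps to n - r:

```python
# P(X = 5) for X ~ NegBin(0.7, 3), by hand and via scipy.
from math import comb
from scipy.stats import nbinom

p, r, n = 0.7, 3, 5
print(comb(n - 1, r - 1) * p**r * (1 - p)**(n - r))  # 0.18522 by the PMF
print(nbinom.pmf(n - r, r, p))                       # same value from scipy
```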
Hypergeometric random variable
Suppose an urn has N balls, of which m are white and N - m are black.
You randomly choose n balls from the urn without replacement. Let X
denote the number of white balls selected. It is a hypergeometric
random variable, X ~ HyperGeo(N, m, n).
P(X = i) = \frac{\binom{m}{i}\binom{N-m}{n-i}}{\binom{N}{n}},  i = 0, 1, ..., m
• To get i white balls means that you draw i out of the m white balls,
and n - i out of the N - m black balls.
• E[X] = \frac{mn}{N},  Var[X] = \frac{mn}{N}\left[\frac{(n-1)(m-1)}{N-1} + 1 - \frac{mn}{N}\right]
Example
• X ~ HyperGeo(10, 2, 3).
• P(X = 0) = \frac{\binom{2}{0}\binom{8}{3}}{\binom{10}{3}} = 0.47
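A sketch checking the example and the mean/variance formulas above. Note scipy's hypergeom takes its arguments in the order (population size, number of white balls, number of draws):

```python
# X ~ HyperGeo(N=10, m=2, n=3).
from scipy.stats import hypergeom

N, m, n = 10, 2, 3
print(hypergeom.pmf(0, N, m, n))  # P(X = 0) ~= 0.4667
print(hypergeom.mean(N, m, n))    # mn/N = 0.6
mn_over_N = m * n / N             # checks the variance formula above
print(hypergeom.var(N, m, n),
      mn_over_N * ((n - 1) * (m - 1) / (N - 1) + 1 - mn_over_N))
```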
Large n, small p
• I write a book with 10,000 words. The probability that a word has a typo
is \frac{1}{1000}. I am interested in how many typos there can be in the book.
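A sketch of the typo count, comparing the exact Binomial(10000, 1/1000) with its Poisson(10) approximation (scipy assumed available):

```python
# Number of typos: Binomial(10000, 0.001) vs Poisson(np = 10).
from scipy.stats import binom, poisson

n, p = 10_000, 1 / 1000
lam = n * p  # lambda = np = 10 expected typos
for k in (0, 5, 10, 20):
    print(k, binom.pmf(k, n, p), poisson.pmf(k, lam))  # nearly identical
```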
Large n, small p
There are many settings in real life with huge n but small p.
Poisson random variable
X ~ Poisson(\lambda) takes values 0, 1, 2, ... with PMF
P(X = k) = e^{-\lambda} \frac{\lambda^k}{k!},  k = 0, 1, 2, ...
Poisson as an approximation to binomial
• In the plot (omitted here) we fix np = 3 but vary n and p. You can see
the approximation is better when n is large and p is tiny.
Poisson: mean and variance
E[X] = \lambda,  Var[X] = \lambda
• Derivation is omitted here; see page 145 of Ross for the proof.
• One interesting property of the Poisson random variable is that its mean and
variance are the same.
• How do you remember it?
• Hint: Binomial(n, p) can be approximated by Poisson(\lambda) where
\lambda = np. So... np \approx ?
Example
Assume that on a given day 1000 cars are out in the city. On average,
3 out of 1000 cars run into a traffic accident per day. Suppose the
accidents are independent of each other.
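The slide's exact question isn't shown here; as an illustration, a few probabilities under the resulting Poisson model (a sketch, with lambda = np = 3 accidents per day):

```python
# The daily accident count is approximately Poisson(3).
from scipy.stats import poisson

lam = 3
print(poisson.pmf(0, lam))  # P(no accident) = e^{-3} ~= 0.0498
print(poisson.pmf(3, lam))  # P(exactly 3)          ~= 0.2240
print(poisson.sf(5, lam))   # P(more than 5)        ~= 0.0839
```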
Example
Suppose Arsenal's goal count follows Poisson(2.2) and Chelsea's follows
Poisson(2.7), independently. What is the probability of a 1-1 draw?
P(Arsenal gets 1) = e^{-2.2} \frac{2.2^1}{1!} = e^{-2.2} \times 2.2
P(Chelsea gets 1) = e^{-2.7} \frac{2.7^1}{1!} = e^{-2.7} \times 2.7
P(1-1 draw) = P(Arsenal gets 1) \times P(Chelsea gets 1)
            = e^{-2.2} \times 2.2 \times e^{-2.7} \times 2.7
            = 0.044
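A one-line check of this computation (a sketch, assuming the independent Poisson model above):

```python
# P(1-1 draw) with independent Poisson(2.2) and Poisson(2.7) scores.
from scipy.stats import poisson

print(poisson.pmf(1, 2.2) * poisson.pmf(1, 2.7))  # ~= 0.0443
```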