08 The Poisson Probability Distribution
Alvin Lin
Probability and Statistics: January 2017 - May 2017
If n → ∞ and p → 0 in such a way that np → µ for some µ > 0, then
b(x; n, p) → p(x; µ)
Poisson Distribution
\[
p(x; \mu) =
\begin{cases}
\dfrac{e^{-\mu}\mu^{x}}{x!}, & x = 0, 1, 2, \ldots \\
0, & \text{otherwise}
\end{cases}
\]
The Poisson model is a reasonably good approximation of the binomial model when
n ≥ 20 with p ≤ 0.05 or n ≥ 100 with p ≤ 0.10.
binomial              Poisson
E(X) = np             E(X) = µ      (np ∼ µ)
V(X) = np(1 − p)      V(X) = µ      (np(1 − p) ∼ np ∼ µ)
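As a numerical illustration of this rule of thumb (a sketch that is not part of the notes; the helper names binom_pmf and poisson_pmf are my own), the following Python snippet compares the two pmfs for n = 100 and p = 0.05:

```python
# Comparing the binomial pmf b(x; n, p) with its Poisson approximation
# p(x; mu), where mu = n * p. Illustrative sketch only.
from math import comb, exp, factorial

def binom_pmf(x, n, p):
    """b(x; n, p) = C(n, x) p^x (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p) ** (n - x)

def poisson_pmf(x, mu):
    """p(x; mu) = e^(-mu) mu^x / x!."""
    return exp(-mu) * mu**x / factorial(x)

n, p = 100, 0.05            # within the rule of thumb: n >= 100, p <= 0.10
mu = n * p                  # mu = 5
for x in range(8):
    print(x, round(binom_pmf(x, n, p), 4), round(poisson_pmf(x, mu), 4))
```

For these parameters the two columns agree to within roughly 0.01 for every x, which is typical inside the stated range.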
The Poisson pmf sums to 1, using the Maclaurin series for e^µ:
\[
\begin{aligned}
\sum_{x=0}^{\infty} p(x; \mu) &= \sum_{x=0}^{\infty} \frac{e^{-\mu}\mu^{x}}{x!} \\
&= e^{-\mu} \sum_{x=0}^{\infty} \frac{\mu^{x}}{x!} \\
&= e^{-\mu} e^{\mu} \\
&= e^{-\mu + \mu} \\
&= e^{0} \\
&= 1
\end{aligned}
\]
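A quick truncated check of this identity in Python (illustrative only, not from the notes):

```python
# Truncate the infinite series at x = 100; the tail is negligible for mu = 2.
from math import exp, factorial

mu = 2.0
total = sum(exp(-mu) * mu**x / factorial(x) for x in range(101))
print(total)  # ~= 1.0
```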
Example
Let X denote the number of traps (defects of a certain kind) in a particular type of
semiconductor transistor, and suppose X has a Poisson distribution with µ = 2. The
probability that there are exactly 3 traps is:
\[
\begin{aligned}
P(X = 3) &= p(3; \mu) = p(3; 2) \\
&= \frac{e^{-2}(2)^{3}}{3!} \\
&\approx 0.18
\end{aligned}
\]
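This value is easy to verify directly (a sketch; if SciPy is installed, scipy.stats.poisson.pmf(3, 2) returns the same number):

```python
# Direct evaluation of p(3; 2) = e^(-2) 2^3 / 3!.
from math import exp, factorial

mu = 2
p_three = exp(-mu) * mu**3 / factorial(3)
print(round(p_three, 4))  # ~0.1804
```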
Poisson Process
1. There exists a parameter α > 0 such that for any short time interval of length
∆t, the probability that exactly one event occurs is:
α∆t + o(∆t)
2. The probability of more than one event occurring during ∆t is o(∆t) [which,
along with assumption 1, implies that the probability of no events during ∆t
is 1 − α∆t − o(∆t)].
3. The number of events occurring during the time interval ∆t is independent of
the number that occur prior to this time interval.
4. The probability that the “event” occurs k times in a time interval of length t is:
\[
P_k(t) = \frac{e^{-\alpha t}(\alpha t)^{k}}{k!}
\]
where α is the average rate of occurrence of the “event”.
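A small Python helper capturing Pₖ(t) (the function name poisson_process_prob and the sample values are my own, not from the notes):

```python
from math import exp, factorial

def poisson_process_prob(k, t, alpha):
    """P_k(t) = e^(-alpha t) (alpha t)^k / k!, the probability of exactly k
    events in a time interval of length t, given average event rate alpha."""
    rate = alpha * t
    return exp(-rate) * rate**k / factorial(k)

# e.g. alpha = 6 events per minute, a half-minute window, k = 0 events:
print(round(poisson_process_prob(0, 0.5, 6), 4))  # e^(-3) ~= 0.0498
```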
Little o notation
x² is o(|x|) for x → 0:
\[
\frac{x^{2}}{|x|} \to 0 \text{ as } x \to 0
\]
x³ + x⁵ is o(|x|²) for x → 0:
\[
\frac{x^{3} + x^{5}}{|x|^{2}} \to 0 \text{ as } x \to 0
\]
√|x| is not o(|x|) for x → 0:
\[
\frac{\sqrt{|x|}}{|x|} = \frac{1}{\sqrt{|x|}} \not\to 0 \text{ as } x \to 0
\]
1/√x is not o(1/x) for x → ∞:
\[
\frac{1/\sqrt{x}}{1/x} = \sqrt{x} \not\to 0 \text{ as } x \to \infty
\]
1/x² is o(1/x) for x → ∞:
\[
\frac{1/x^{2}}{1/x} = \frac{1}{x} \to 0 \text{ as } x \to \infty
\]
Let Rₙ(x) = f(x) − Tₙ(x), where Tₙ(x) is the nth-order Maclaurin polynomial for f.
Then Rₙ(x) = o(|x|ⁿ) for x → 0.
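A numerical illustration of these limits (a sketch, not from the notes), taking f(x) = exp(x) so that T₂(x) = 1 + x + x²/2; both ratios shrink toward 0 as x → 0:

```python
from math import exp

for x in (0.1, 0.01, 0.001):
    ratio1 = x**2 / abs(x)              # x^2 is o(|x|)
    r2 = exp(x) - (1 + x + x**2 / 2)    # Maclaurin remainder R_2(x) for exp(x)
    ratio2 = r2 / abs(x) ** 2           # R_2(x) is o(|x|^2)
    print(x, ratio1, ratio2)
```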
Example
Suppose pulses arrive at a counter at an average rate of six per minute so that α = 6.
Suppose this is a Poisson process. Find the probability that at least one pulse arrives
during a time interval of length 0.5 minutes. Let X be a random variable denoting
the number of pulses that arrive during this interval.
\[
\begin{aligned}
P(X \ge 1) &= 1 - P(X = 0) \\
P(X = 0) &= \frac{e^{-\alpha t}(\alpha t)^{0}}{0!} \\
&= \frac{e^{-(6 \cdot 0.5)}(6 \cdot 0.5)^{0}}{0!} \\
&= e^{-3} \\
P(X \ge 1) &= 1 - e^{-3} \\
&\approx 0.950
\end{aligned}
\]
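The same number, checked in Python (sketch only):

```python
from math import exp

alpha, t = 6, 0.5
p_zero = exp(-alpha * t)     # (alpha t)^0 / 0! = 1, so only the exponential remains
print(round(1 - p_zero, 3))  # 1 - e^(-3) ~= 0.950
```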
Tell whether each of the following statements is true:
•
\[
\sum_{k=0}^{\infty} \frac{e^{-\alpha t}(\alpha t)^{k}}{k!} = 1
\]
True, since P (X = 0) + P (X = 1) + P (X = 2) + . . . is exhaustive.
• The probability that 6 pulses arrive during a time interval of length 1.0 minute
is equal to:
\[
\frac{e^{-(6 \cdot 0.5)}(6 \cdot 0.5)^{3}}{3!} + \frac{e^{-(6 \cdot 0.5)}(6 \cdot 0.5)^{3}}{3!}
\]
False; the probabilities are not additive in this way, since the 6 pulses could split between the two half-minute intervals in any of the ways 0 + 6, 1 + 5, . . . , 6 + 0.
• If 4 pulses arrive during 3:00:00-3:00:30, the probability that 3 pulses arrive
during 3:00:30-3:01:00 is equal to:
\[
\frac{e^{-(6 \cdot 0.5)}(6 \cdot 0.5)^{4}}{4!} \cdot \frac{e^{-(6 \cdot 0.5)}(6 \cdot 0.5)^{3}}{3!}
\]
False; counts in disjoint intervals are independent, so the conditional probability is just the single-interval probability Pₖ(t) with k = 3 and t = 0.5, not the product.
• The probability that 6 pulses are received in a time interval of length 1.0 minute
is equal to:
\[
\frac{e^{-(6 \cdot 1)}(6 \cdot 1)^{6}}{6!}
\]
True
• The probability that 6 pulses are received in a time interval of length 1.0 minute
is equal to:
\[
\sum_{k=0}^{6} \frac{e^{-6 \cdot 0.5}(6 \cdot 0.5)^{k}}{k!} \cdot \frac{e^{-6 \cdot 0.5}(6 \cdot 0.5)^{6-k}}{(6-k)!}
\]
True; the sum runs over all the ways the 6 pulses can split between the two independent half-minute intervals (checked numerically in the sketch below).
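A numerical check of this last identity (a sketch assuming, as the Poisson-process conditions imply, independent Poisson(3) counts in each half-minute):

```python
from math import exp, factorial

def poisson_pmf(x, mu):
    return exp(-mu) * mu**x / factorial(x)

# convolution of the two half-minute Poisson(3) counts, evaluated at 6 ...
lhs = sum(poisson_pmf(k, 3) * poisson_pmf(6 - k, 3) for k in range(7))
rhs = poisson_pmf(6, 6)              # ... equals the full-minute Poisson(6) pmf at 6
print(round(lhs, 4), round(rhs, 4))  # both ~0.1606
```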
Example
In proof testing of circuit boards, the probability that any particular diode will fail
is 0.01. Suppose a circuit board contains 200 diodes.
• How many diodes would you expect to fail, and what is the standard deviation
of the number expected to fail?
• What is the (approximate) probability that at least four diodes will fail on a
randomly selected board?
\[
\begin{aligned}
P(X \ge 4) &= 1 - P(X \le 3) \\
&= 1 - \sum_{x=0}^{3} b(x; 200, 0.01) \\
&\approx 1 - \sum_{x=0}^{3} p(x; \mu), \qquad \mu = np = 200 \times 0.01 = 2 \\
&\approx 0.143
\end{aligned}
\]
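Both bullet points can be answered with a short script (a sketch that uses only the setup above; the variable names are my own): E(X) = np = 2, the standard deviation is √(np(1 − p)) ≈ 1.407, and the exact binomial probability is compared against the Poisson approximation.

```python
from math import comb, exp, factorial, sqrt

n, p = 200, 0.01
mu = n * p                                   # expected number of failures: 2
sd = sqrt(n * p * (1 - p))                   # standard deviation: ~1.407
print(mu, round(sd, 3))

exact = 1 - sum(comb(n, x) * p**x * (1 - p) ** (n - x) for x in range(4))
approx = 1 - sum(exp(-mu) * mu**x / factorial(x) for x in range(4))
print(round(exact, 4), round(approx, 4))     # ~0.1420 (binomial) vs ~0.1429 (Poisson)
```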
For a random variable X whose pmf is p(x; µ), it turns out that E(X) = µ
and V(X) = µ.
Example
Let X be a continuous random variable. The probability distribution or probability
density function (pdf) of X is a function f(x) such that for any two numbers a and
b with a ≤ b,
\[
P(a \le X \le b) = \int_{a}^{b} f(x)\,dx
\]
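A minimal numerical sketch of this definition (my own example, not from the notes), using the exponential density f(x) = exp(−x) on x ≥ 0 and comparing midpoint-rule integration against the closed form exp(−a) − exp(−b):

```python
from math import exp

def f(x):
    return exp(-x)               # exponential pdf with rate 1

a, b = 0.5, 2.0

# midpoint-rule integration of f over [a, b] versus the closed-form antiderivative
n = 10_000
h = (b - a) / n
numeric = sum(f(a + (i + 0.5) * h) for i in range(n)) * h
closed = exp(-a) - exp(-b)
print(round(numeric, 4), round(closed, 4))   # both ~0.4712
```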