
Lec11 Jan31

The document covers key concepts in probability and statistics, focusing on the hypergeometric and Poisson distributions, as well as the geometric distribution. It includes examples and solutions related to these distributions, illustrating their applications in real-world scenarios. Additionally, it discusses the relationship between binomial and Poisson random variables and introduces the Poisson process.


MATH F113: Probability & Statistics

Second Semester 2024-2025

Divyum Sharma
Assistant Professor
Department of Mathematics
BITS Pilani
Lecture 10 Summary

1. Hypergeometric distribution
2. Binomial approximation to hypergeometric distribution

Probability & Statistics Divyum Sharma 1



Recall: Geometric Distribution

Suppose a random experiment consists of a series of independent trials, where each trial results in one of two outcomes, success (S) or failure (F), which have constant probabilities p and 1 − p = q, respectively, in each trial. Let X denote the number of trials needed to get the first success. Then the pmf of X is given by
g(x; p) = q^{x−1} p   for x = 1, 2, 3, . . .
        = 0           otherwise.

Note. Σ_{x=1}^∞ g(x; p) = 1.
Recall
If X is a geometric random variable, then

• E(X) = 1/p,  V(X) = q/p^2
• m_X(t) = p e^t / (1 − q e^t),  t < −ln q

Proof. We compute E(X) below. The rest is left as an exercise.
E(X) = Σ_{x=1}^∞ x g(x; p) = Σ_{x=1}^∞ x q^{x−1} p
     = p(1 + 2q + 3q^2 + 4q^3 + · · · )
     = p · 1/(1 − q)^2
     = 1/p.
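The series manipulation above can be sanity-checked numerically; the sketch below truncates the sum Σ x q^{x−1} p and compares it with 1/p (p = 0.3 is an arbitrary illustrative choice).

```python
# Truncated-series check of E(X) = 1/p for the geometric distribution
# (p = 0.3 is an arbitrary illustrative choice; the tail is negligible).
p = 0.3
q = 1 - p

mean_approx = sum(x * q ** (x - 1) * p for x in range(1, 2001))
print(mean_approx, 1 / p)  # both close to 3.3333
```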
Question
Suppose that you are looking for a student at the college who lives within a hundred kilometers of you. You know that 5% of the 5,000 students live within a hundred kilometers of you. You randomly contact students from the college until one says they live within that distance. Determine the probability that you need to contact five students.



Solution. Let X be the number of students contacted until one says they live within a hundred kilometers of you. Then X is a geometric rv with p = 0.05. So
P(X = 5) = 0.95^4 × 0.05 ≈ 0.0407.

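The computation above can be verified directly (a one-line check; p = 0.05 comes from the problem statement):

```python
# Direct check of the computation above: four non-qualifying answers
# followed by one qualifying answer, with p = 0.05.
p = 0.05
prob = (1 - p) ** 4 * p
print(round(prob, 4))  # 0.0407
```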




Question
People are selected randomly and independently for
a drug test. Each person passes the test 98% of the
time. What is the expected number of people who
take the test until the first person fails?

Solution. Let X be the number of people tested until the first person fails the test. Then X is a geometric rv with p = 0.02. So E(X) = 1/0.02 = 50, i.e. we expect 50 people to be tested until the first person fails the test.
(Here, V(X) = 0.98/0.02^2 = 2450.)

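A quick Monte Carlo sketch of this example: simulate repeated drug tests and average the number of people tested until the first failure (the seed and trial count are arbitrary choices).

```python
import random

# Monte Carlo sketch of the example above: average number of people
# tested until the first failure, with p = 0.02 (seed/trials arbitrary).
random.seed(0)
p = 0.02

def people_until_first_failure():
    n = 0
    while True:
        n += 1
        if random.random() < p:  # this person fails the test
            return n

trials = 100_000
avg = sum(people_until_first_failure() for _ in range(trials)) / trials
print(avg)  # close to the theoretical E(X) = 50
```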


Siméon Denis Poisson
(1781–1840)



Poisson Distribution

Definition
A discrete random variable X is said to have a
Poisson distribution with parameter µ (µ > 0) if its
pmf is given by
p(x; µ) = e^{−µ} µ^x / x!,  x = 0, 1, 2, . . .



Validity of pmf.

1. p(x; µ) ≥ 0 for all x.
2. Σ_{x=0}^∞ p(x; µ) = 1, since
Σ_{x=0}^∞ p(x; µ) = Σ_{x=0}^∞ e^{−µ} µ^x / x! = e^{−µ} Σ_{x=0}^∞ µ^x / x! = e^{−µ} e^µ = 1.

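A small numerical check of both conditions (µ = 2.5 is an arbitrary choice; the tail beyond x = 100 is negligible):

```python
from math import exp, factorial

# Check the two validity conditions above for mu = 2.5 (arbitrary choice):
# nonnegativity, and a truncated sum that should be very close to 1.
mu = 2.5
pmf = [exp(-mu) * mu**x / factorial(x) for x in range(100)]

assert all(v >= 0 for v in pmf)  # condition 1
total = sum(pmf)
print(total)  # condition 2: very close to 1
```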



Proposition (Mean & Variance)
If X has a Poisson distribution with parameter µ,
then E (X ) = V (X ) = µ.

Proof.
E(X) = Σ_{x=0}^∞ x p(x; µ) = Σ_{x=0}^∞ x e^{−µ} µ^x / x!
     = e^{−µ} Σ_{x=1}^∞ x µ^x / x! = e^{−µ} Σ_{x=1}^∞ µ^x / (x − 1)!
     = e^{−µ} µ Σ_{x=1}^∞ µ^{x−1} / (x − 1)! = e^{−µ} µ Σ_{y=0}^∞ µ^y / y!
     = e^{−µ} µ e^µ = µ.



Proof (contd.)

E(X^2) = Σ_{x=0}^∞ x^2 e^{−µ} µ^x / x! = e^{−µ} Σ_{x=0}^∞ (x^2 − x) µ^x / x! + E(X)
       = e^{−µ} Σ_{x=2}^∞ x(x − 1) µ^x / x! + E(X) = e^{−µ} Σ_{x=2}^∞ µ^x / (x − 2)! + µ
       = e^{−µ} µ^2 Σ_{x=2}^∞ µ^{x−2} / (x − 2)! + µ = e^{−µ} µ^2 Σ_{y=0}^∞ µ^y / y! + µ
       = e^{−µ} µ^2 e^µ + µ = µ^2 + µ.

Hence V(X) = E(X^2) − (E(X))^2 = µ^2 + µ − µ^2 = µ.

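The conclusion E(X) = V(X) = µ can be checked numerically from the pmf (µ = 4 is an arbitrary choice, with the sums truncated where the terms are negligible):

```python
from math import exp, factorial

# Numerical check that E(X) = V(X) = mu for the Poisson pmf,
# mirroring the moment computations above (mu = 4 is arbitrary).
mu = 4.0
pmf = [exp(-mu) * mu**x / factorial(x) for x in range(150)]

mean = sum(x * p for x, p in enumerate(pmf))
second_moment = sum(x * x * p for x, p in enumerate(pmf))
var = second_moment - mean**2

print(mean, var)  # both close to 4
```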




MGF

Theorem
For a Poisson random variable X with parameter
µ > 0,
m_X(t) = e^{µ(e^t − 1)}.

Proof. Exercise.
Exercise
Using the mgf of a Poisson distribution, derive the
formulas for its mean and variance.

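One way to approach the exercise numerically: differentiate m_X(t) = e^{µ(e^t − 1)} at t = 0 by central finite differences, since m′(0) = E(X) and m″(0) = E(X^2) (µ = 3 and the step size h are arbitrary choices).

```python
from math import exp

# Finite-difference sketch: E(X) = m'(0), E(X^2) = m''(0) for the Poisson
# mgf m_X(t) = exp(mu * (exp(t) - 1)); mu = 3 and h are arbitrary choices.
mu = 3.0

def mgf(t):
    return exp(mu * (exp(t) - 1))

h = 1e-5
m1 = (mgf(h) - mgf(-h)) / (2 * h)             # ~ E(X)   = mu
m2 = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2   # ~ E(X^2) = mu^2 + mu
var = m2 - m1**2                              # ~ V(X)   = mu

print(m1, var)  # both close to 3
```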


The relationship between binomial and
Poisson rv

Let X be a binomial rv with pmf b(x; n, p). If n → ∞, p → 0, and np → µ, then b(x; n, p) → p(x; µ).



(contd.)

b(x; n, µ/n) = C(n, x) (µ/n)^x (1 − µ/n)^{n−x}
= [n(n − 1) · · · (n − x + 1) / x!] (µ/n)^x (1 − µ/n)^{n−x}
= [n(n − 1) · · · (n − x + 1) / n^x] (µ^x / x!) (1 − µ/n)^{n−x}
= (n/n)(1 − 1/n) · · · (1 − (x − 1)/n) (1 − µ/n)^{−x} (µ^x / x!) (1 − µ/n)^n
→ 1 · (µ^x / x!) e^{−µ} = p(x; µ)  as n → ∞.

So in any binomial experiment in which n is large and p is small, such that np ≈ µ, we have b(x; n, p) ≈ p(x; µ). As a rule of thumb, this approximation can safely be applied if n > 50 and np < 5. This is also useful because binomial cdf tables are not available for large n.
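The rule of thumb can be illustrated with the sketch below, which compares b(x; n, p) with p(x; µ) for n = 600 and p = 0.005 (so n > 50 and np = 3 < 5):

```python
from math import comb, exp, factorial

# Compare the binomial pmf b(x; n, p) with its Poisson approximation
# p(x; mu), mu = np, for n = 600, p = 0.005 (n > 50 and np = 3 < 5).
n, p = 600, 0.005
mu = n * p

def binom_pmf(x):
    return comb(n, x) * p**x * (1 - p) ** (n - x)

def pois_pmf(x):
    return exp(-mu) * mu**x / factorial(x)

# Largest pointwise gap over the values carrying essentially all the mass
max_gap = max(abs(binom_pmf(x) - pois_pmf(x)) for x in range(21))
print(max_gap)  # small: the approximation is good here
```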
Cdf of Poisson distribution

Cdf values are tabulated in Table A.2 (pp. A-4 to A-5) of the textbook.





Question
In a certain industrial facility, accidents occur
infrequently. It is known that the probability of an
accident on any given day is 0.005 and accidents
are independent of each other.

(a) What is the probability that in any given period of 600 days there will be an accident on one day?
(b) What is the probability that there are at most three days with an accident?



Solution. Let X = the number of accidents in the
period of 600 days. Then X is a binomial rv with
parameters n = 600 and p = 0.005. As n > 50 and
np = 3, we can approximate X by a Poisson rv Y
with parameter µ = np = 3.

(a) Required probability ≈ P(Y = 1) = F(1) − F(0) = 0.199 − 0.050 = 0.149. (Or find this probability using the pmf of Y.)
(b) Required probability ≈ P(Y ≤ 3) = F(3) = 0.647.

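The table look-ups above can be reproduced from the Poisson pmf directly:

```python
from math import exp, factorial

# Reproduce the solution above from the Poisson pmf with mu = 3.
mu = 3.0

def pois_pmf(x):
    return exp(-mu) * mu**x / factorial(x)

p_one_day = pois_pmf(1)                               # part (a)
p_at_most_three = sum(pois_pmf(x) for x in range(4))  # part (b): F(3)

print(round(p_one_day, 3), round(p_at_most_three, 3))  # 0.149 0.647
```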


Poisson Process

A very important application of the Poisson


distribution arises in connection with the occurrence
of events of some type over time.





Poisson Process. We make the following 3 assumptions about the way in which the events of interest occur.
Time homogeneity and small interval probabilities:
1. There exists a parameter α > 0 such that for any short time interval of length Δt, the probability that exactly one event occurs is α · Δt + o(Δt). (Note that o(Δt)/Δt → 0 as Δt → 0.)
2. The probability of more than one event occurring during any short time interval of length Δt is o(Δt).
3. The number of events occurring during a short time interval of length Δt is independent of the number that occurred prior to this interval.


Let Pk (t) denote the probability that k events will be observed
during any particular time interval of length t (for
k = 0, 1, 2, . . .).

• P_k(t) = e^{−αt} (αt)^k / k!, so that the number of events occurring during a time interval of length t is a Poisson rv with parameter µ = αt.

• The expected number of events during a time interval of length t is αt. So the expected number during a unit interval of time is α.

The occurrence of events over time in this manner is called a Poisson process, and the parameter α specifies the rate for the process.

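A short sketch of the formula P_k(t) = e^{−αt} (αt)^k / k!, using a hypothetical rate of α = 2 events per hour and an interval of t = 3 hours:

```python
from math import exp, factorial

# Sketch of the result above: P_k(t) = e^(-alpha*t) (alpha*t)^k / k!.
# Hypothetical example: a Poisson process with rate alpha = 2 events/hour
# observed over an interval of t = 3 hours.
alpha, t = 2.0, 3.0
mu = alpha * t  # expected number of events in the interval: alpha*t = 6

def P(k):
    return exp(-mu) * mu**k / factorial(k)

print(P(0))                            # probability of no events: e^(-6)
print(sum(P(k) for k in range(100)))   # probabilities sum to 1 (truncated)
```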
