
Probability and Statistics (MATH F113)

Pradeep Boggarapu

Department of Mathematics
BITS PILANI K K Birla Goa Campus, Goa

January 25, 2024

Standard Examples for Discrete Random Variables

Outline

1 Bernoulli and Binomial random variables.
2 Geometric random variable.
3 Poisson random variable.
4 Hypergeometric random variable.

Bernoulli random variable
Definition 0.1 (Bernoulli trial).
A random experiment or trial whose outcome can be
classified as either a success or a failure is called a
Bernoulli trial.
In a Bernoulli trial, define a random variable X by
X = 1 when the outcome is a success and X = 0
when it is a failure; then X is called a Bernoulli random
variable.
If p is the probability that the trial is a success, then the
probability mass function is given by
f(x) = p^x (1 - p)^{1-x} for x = 0, 1.
Binomial random variable
Consider n Bernoulli trials which are independent and
identical, in the sense that the outcome of one trial has
no effect on the outcome of any other and the
probability of success, p, 0 ≤ p ≤ 1, remains
the same from trial to trial.
If X denotes the number of successes that occur in the
n trials, X is said to be a binomial random variable with
parameters (n, p).
The pmf of a binomial random variable with
parameters (n, p) is given by

f(x) = \binom{n}{x} p^x (1 - p)^{n-x},  for x = 0, 1, 2, ..., n.
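As a small illustration (an addition to the slides), this pmf can be evaluated directly with Python's standard library; the helper name binomial_pmf is a placeholder of my own.

from math import comb

def binomial_pmf(x, n, p):
    """P(X = x) for a binomial random variable with parameters (n, p)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Sanity check: the probabilities over x = 0, 1, ..., n sum to 1.
assert abs(sum(binomial_pmf(x, 10, 0.3) for x in range(11)) - 1) < 1e-12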
Examples

Example 1. Five fair coins are flipped. If the outcomes are
assumed independent, find the probability mass function
of the number of heads obtained. Also find the
probability that at least two heads are obtained.
Example 2. It is known that disks produced by a certain
company will be defective with probability 0.01
independently of each other. The company sells the disks
in packages of 10 and offers a money-back guarantee that
at most 1 of the 10 disks is defective. (i) What proportion
of packages is returned? (ii) If someone buys three
packages, what is the probability that exactly one of them
will be returned?
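A short numerical sketch of both examples (my own work-through, assuming independence across packages in Example 2):

from math import comb

def binom_pmf(x, n, p):
    return comb(n, x) * p**x * (1 - p)**(n - x)

# Example 1: X = number of heads in five fair flips, X ~ Binomial(5, 0.5).
print(1 - binom_pmf(0, 5, 0.5) - binom_pmf(1, 5, 0.5))    # P(X >= 2) = 13/16 = 0.8125

# Example 2(i): a package is returned when more than 1 of its 10 disks is defective.
p_return = 1 - binom_pmf(0, 10, 0.01) - binom_pmf(1, 10, 0.01)
print(p_return)                                           # about 0.0043

# Example 2(ii): number of returned packages among 3 bought is Binomial(3, p_return).
print(binom_pmf(1, 3, p_return))                          # about 0.0127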
Mean, Variance and Mgf of Binomial RV

Theorem 0.2.
If X is a binomial random variable with parameters (n, p),
then
1 E(X) = np
2 Var(X) = np(1 - p)
3 The mgf of X is given by m_X(t) = (p e^t + 1 - p)^n.

Proof. Note that the pmf or pdf of X is given by

f_X(x) = \binom{n}{x} p^x (1 - p)^{n-x},  for x = 0, 1, 2, ..., n.

Mean, Variance and Mgf of Binomial RV
E(X) = \sum_{x=0}^{n} x \binom{n}{x} p^x (1 - p)^{n-x}
     = \sum_{x=0}^{n} x \frac{n!}{x!(n-x)!} p^x (1 - p)^{n-x}
     = np \sum_{x=1}^{n} \frac{(n-1)!}{(x-1)!(n-x)!} p^{x-1} (1 - p)^{n-x}
     = np \sum_{j=0}^{n-1} \frac{(n-1)!}{j!(n-1-j)!} p^j (1 - p)^{n-1-j}
     = np (p + 1 - p)^{n-1} = np

Mean, Variance and Mgf of Binomial RV
E(X^2) = \sum_{x=0}^{n} x^2 \binom{n}{x} p^x (1 - p)^{n-x}
       = \sum_{x=0}^{n} x^2 \frac{n!}{x!(n-x)!} p^x (1 - p)^{n-x}
       = np \sum_{x=1}^{n} (x - 1 + 1) \frac{(n-1)!}{(x-1)!(n-x)!} p^{x-1} (1 - p)^{n-x}
       = np \sum_{x=1}^{n} (x - 1) \frac{(n-1)!}{(x-1)!(n-x)!} p^{x-1} (1 - p)^{n-x}
         + np \sum_{x=1}^{n} \frac{(n-1)!}{(x-1)!(n-x)!} p^{x-1} (1 - p)^{n-x}
       = n(n-1)p^2 + np = np(1 - p) + n^2 p^2,

since the first sum equals (n-1)p (it reduces to (n-1)p times the expansion of (p + 1 - p)^{n-2}) and the second sum equals (p + 1 - p)^{n-1} = 1.

Therefore, Var(X) = E[X^2] - (E[X])^2 = np(1 - p), since E[X] = np.

Mean, Variance and Mgf of Binomial RV

The moment generating function is given by

m_X(t) = E[e^{tX}] = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1 - p)^{n-x}
       = \sum_{x=0}^{n} \binom{n}{x} (p e^t)^x (1 - p)^{n-x} = (p e^t + 1 - p)^n.

Cumulative Distribution Function of Binomial RV

Remark 0.3.
The cdf of the binomial random variable X with parameters
(n, p) is given by

F(x) = 0,                                                if x < 0,
F(x) = \sum_{j=0}^{[x]} \binom{n}{j} p^j (1 - p)^{n-j},  if 0 ≤ x < n,
F(x) = 1,                                                if x ≥ n,

where [x] denotes the greatest integer less than or equal to x.
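A direct translation of this cdf into code (an illustrative sketch; the helper name binomial_cdf is my own):

from math import comb, floor

def binomial_cdf(x, n, p):
    """F(x) = P(X <= x) for X ~ Binomial(n, p), for any real x."""
    if x < 0:
        return 0.0
    if x >= n:
        return 1.0
    return sum(comb(n, j) * p**j * (1 - p)**(n - j) for j in range(floor(x) + 1))

print(binomial_cdf(1, 10, 0.01))   # P(at most 1 defective disk) in Example 2, about 0.9957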

Geometric random variable

Suppose that independent Bernoulli trials, each having
probability p, 0 < p < 1, of being a success, are
performed until a success occurs.
If we let X equal the number of trials required to
obtain the first success, then

P(X = x) = (1 - p)^{x-1} p,  for x = 1, 2, 3, ...;

here X is called a geometric random variable with
parameter p.

Geometric random variable

Definition 0.4.
A random variable X is said to have a geometric
distribution with parameter p, 0 < p < 1 if its density
function f is given by

f(x) = (1 - p)^{x-1} p = q^{x-1} p  for x = 1, 2, 3, ...,

where q = 1 − p.
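For illustration (not part of the slides), the geometric density as a small Python helper with a sanity check that the probabilities sum to 1:

def geometric_pmf(x, p):
    """P(X = x) = (1 - p)**(x - 1) * p for x = 1, 2, 3, ..."""
    return (1 - p)**(x - 1) * p

# Truncating the infinite sum at x = 200 is already accurate to machine precision here.
assert abs(sum(geometric_pmf(x, 0.3) for x in range(1, 201)) - 1) < 1e-12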

Mean, Variance and Mgf of Geometric RV

Theorem 0.5.
If X is a geometric random variable with parameter
p, 0 < p < 1, then
1 E(X) = 1/p
2 Var(X) = q/p^2, where q = 1 - p
3 The mgf of X is given by

m_X(t) = \frac{p e^t}{1 - q e^t},  t < -\ln q.

Mean, Variance and Mgf of Geometric RV
Proof. Note that the pmf or pdf of X is given by
f(x) = q^{x-1} p, for x = 1, 2, 3, ....

E[X] = \sum_{x=1}^{\infty} x f(x) = \sum_{x=1}^{\infty} x q^{x-1} p
     = p \frac{d}{dq} \left( \sum_{x=0}^{\infty} q^x \right)
     = p \frac{d}{dq} \left( \frac{1}{1 - q} \right)
     = p \cdot \frac{1}{(1 - q)^2} = 1/p.
Mean, Variance and Mgf of Geometric RV


E[X^2] = \sum_{x=1}^{\infty} x^2 f(x) = \sum_{x=1}^{\infty} x^2 q^{x-1} p
       = \sum_{x=1}^{\infty} (x^2 - x) q^{x-1} p + \sum_{x=1}^{\infty} x q^{x-1} p
       = pq \sum_{x=2}^{\infty} x(x - 1) q^{x-2} + \frac{1}{p}
       = pq \frac{d^2}{dq^2} \left( \sum_{x=0}^{\infty} q^x \right) + \frac{1}{p},

Mean, Variance and Mgf of Geometric RV

which implies

E[X^2] = pq \frac{d^2}{dq^2} \left( \frac{1}{1 - q} \right) + \frac{1}{p}
       = \frac{2pq}{(1 - q)^3} + \frac{1}{p} = \frac{2q + p}{p^2} = \frac{q + 1}{p^2}.

Therefore,

Var[X] = E[X^2] - E[X]^2 = \frac{q + 1}{p^2} - \frac{1}{p^2} = \frac{q}{p^2}.

Mean, Variance and Mgf of Geometric RV

The moment generating function is given by

m_X(t) = E[e^{tX}] = \sum_{x=1}^{\infty} e^{tx} q^{x-1} p
       = p e^t \sum_{x=1}^{\infty} (q e^t)^{x-1}
       = \frac{p e^t}{1 - q e^t},  for q e^t < 1, i.e. t < -\ln q.

Cumulative Distribution Function of Geometric RV

Remark 0.6.
The cdf of the geometric random variable X with parameter
p, 0 < p < 1 is given by

F(x) = 0            if x < 1,
F(x) = 1 - q^{[x]}  if x ≥ 1.

Examples

Example 3. An urn contains N white and M black balls.
Balls are randomly selected, one at a time, until a black
one is obtained. If we assume that each ball selected is
replaced before the next one is drawn, what is the
probability that
1 exactly n draws are needed?
2 at least k draws are needed?
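A numerical sketch (my own, with illustrative values N = 7 white and M = 3 black balls): each draw is black with probability p = M/(M + N), so the number of draws is geometric, P(exactly n draws) = (1 - p)^{n-1} p, and P(at least k draws) = (1 - p)^{k-1}.

N, M = 7, 3                        # illustrative values, not from the slides
p = M / (M + N)                    # probability a single draw is black

n = k = 4
print((1 - p)**(n - 1) * p)        # exactly n draws needed: 0.7**3 * 0.3 ≈ 0.103
print((1 - p)**(k - 1))            # at least k draws needed: 0.7**3 = 0.343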

Poisson random variable

Definition 0.7 (Poisson random variable).


A random variable X is said to have a Poisson distribution
with parameter k if its density f is given by
f(x) = \frac{e^{-k} k^x}{x!},  for x = 0, 1, 2, ..., and k > 0.

It can be verified that f(x) is non-negative and

\sum_{x=0}^{\infty} \frac{e^{-k} k^x}{x!} = e^{-k} e^{k} = 1.
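The Poisson density as a small Python helper (an illustrative addition; poisson_pmf is a name of my own):

from math import exp, factorial

def poisson_pmf(x, k):
    """P(X = x) = e**(-k) * k**x / x! for x = 0, 1, 2, ..."""
    return exp(-k) * k**x / factorial(x)

# Truncating the infinite sum at x = 100 is accurate to machine precision for k = 3.
assert abs(sum(poisson_pmf(x, 3) for x in range(101)) - 1) < 1e-12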

Example 4. Suppose that the number of typographical
errors on a single page of this book has a Poisson
distribution with parameter k = 1/2. Calculate the
probability that there is at least one error on this page.
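Worked numerically (my own sketch): P(at least one error) = 1 - P(X = 0) = 1 - e^{-1/2} ≈ 0.393.

from math import exp

k = 0.5
print(1 - exp(-k))    # P(X >= 1) = 1 - P(X = 0), about 0.3935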

Mean, Variance and Mgf of Poisson RV

Theorem 0.8.
If X is a Poisson random variable with parameter k > 0,
then
1 E(X) = k
2 Var(X) = k
3 The mgf of X is given by

m_X(t) = e^{k(e^t - 1)}.

Proof. The pmf of X is given by

f(x) = \frac{e^{-k} k^x}{x!},  for x = 0, 1, 2, ..., and k > 0.

E[X] = \sum_{x=0}^{\infty} x \frac{e^{-k} k^x}{x!} = e^{-k} \sum_{x=1}^{\infty} x \frac{k^x}{x!}
     = e^{-k} k \sum_{x=1}^{\infty} \frac{k^{x-1}}{(x-1)!} = k e^{-k} \sum_{j=0}^{\infty} \frac{k^j}{j!}
     = k e^{-k} e^{k} = k.

Before we find the variance, we will find the moment
generating function.
The mgf of X is given by

m_X(t) = E[e^{tX}] = \sum_{x=0}^{\infty} e^{tx} \frac{e^{-k} k^x}{x!} = e^{-k} \sum_{x=0}^{\infty} \frac{(k e^t)^x}{x!}
       = e^{-k} e^{k e^t} = e^{k(e^t - 1)}.

It can be seen easily that

\frac{d m_X(t)}{dt} = k e^t e^{k(e^t - 1)}  and  \frac{d^2 m_X(t)}{dt^2} = e^{k(e^t - 1)} \left( k e^t + (k e^t)^2 \right),

and hence E[X^2] = \frac{d^2}{dt^2} m_X(t) \Big|_{t=0} = k + k^2.
Therefore Var[X] = E[X^2] - (E[X])^2 = k + k^2 - k^2 = k.

Poisson Approximation to the Binomial Distribution
The Poisson random variable can be used as an approximation
to a binomial random variable with parameters (n, p)
when n is large and p is small enough so that np is of
moderate size.
To see this, suppose that X is a binomial random variable
with parameters (n, p), and let k = np. Then

P(X = x) = \frac{n!}{(n - x)! \, x!} p^x (1 - p)^{n-x}
         = \frac{n!}{(n - x)! \, x!} \left( \frac{k}{n} \right)^x \left( 1 - \frac{k}{n} \right)^{n-x}
         = \frac{n(n-1) \cdots (n-x+1)}{n^x} \frac{k^x}{x!} \frac{(1 - k/n)^n}{(1 - k/n)^x}.
Now, for n large and k moderate,

\frac{n(n-1) \cdots (n-x+1)}{n^x} ≈ 1,  (1 - k/n)^n ≈ e^{-k},  (1 - k/n)^x ≈ 1.

Hence, for n large and k moderate,

P(X = x) ≈ e^{-k} \frac{k^x}{x!}.
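A quick numerical comparison of the exact binomial probabilities with the Poisson approximation, using the illustrative values n = 100 and p = 0.01 (so k = np = 1):

from math import comb, exp, factorial

n, p = 100, 0.01
k = n * p

for x in range(4):
    exact = comb(n, x) * p**x * (1 - p)**(n - x)
    approx = exp(-k) * k**x / factorial(x)
    print(x, round(exact, 4), round(approx, 4))
# x = 0: 0.3660 vs 0.3679; x = 1: 0.3697 vs 0.3679; the two agree closely.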

Problems

Example 5. Suppose that the probability that an item
produced by a certain machine will be defective is 0.1.
Find the probability that a sample of 10 items will contain
at most 1 defective item.
Example 6. A loom experiences one yarn breakage
approximately every 10 hours. A particular style of cloth is
being produced that will take 25 hours on this loom. If 3
or more breaks are required to render the product
unsatisfactory, find the probability that the style of cloth
is finished with acceptable quality.
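Numerical sketches (my own work-through): Example 5 uses the Poisson approximation with k = np = 1, with the exact binomial value shown for comparison; Example 6 treats the number of breaks in 25 hours as Poisson with mean 25/10 = 2.5, the cloth being acceptable when at most 2 breaks occur.

from math import comb, exp, factorial

def poisson_pmf(x, k):
    return exp(-k) * k**x / factorial(x)

# Example 5: 10 items, defect probability 0.1, so k = np = 1.
approx = poisson_pmf(0, 1) + poisson_pmf(1, 1)                      # about 0.7358
exact = sum(comb(10, x) * 0.1**x * 0.9**(10 - x) for x in (0, 1))   # about 0.7361
print(approx, exact)

# Example 6: breaks in 25 hours ~ Poisson(2.5); acceptable means at most 2 breaks.
print(sum(poisson_pmf(x, 2.5) for x in range(3)))                   # about 0.544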

Problems

Example 7. If the number of claims handled daily by an
insurance company follows a Poisson distribution with mean
5, what is the probability that there will be 4 claims each
in exactly 3 of the next 5 days? Assume that the numbers
of claims on different days are independent.
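A numerical sketch (my own work-through): with mean 5 claims per day, p = P(exactly 4 claims on a given day) = e^{-5} 5^4 / 4!, and the number of such days among the next 5 is Binomial(5, p).

from math import comb, exp, factorial

p4 = exp(-5) * 5**4 / factorial(4)           # about 0.1755
print(comb(5, 3) * p4**3 * (1 - p4)**2)      # exactly 3 of the next 5 days, about 0.037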

Hypergeometric random variable

Suppose that a sample of size n is to be chosen randomly
(without replacement) from an urn containing N balls, of
which r are white and N - r are black.
If we let X denote the number of white balls selected, then

P(X = x) = \frac{\binom{r}{x} \binom{N-r}{n-x}}{\binom{N}{n}},  for max(0, n - (N - r)) ≤ x ≤ min(n, r).

A random variable with the above function as its density
function is called a hypergeometric random variable with
parameters (N, n, r).
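The hypergeometric density as a short helper (illustrative code; hypergeometric_pmf is a placeholder name of my own):

from math import comb

def hypergeometric_pmf(x, N, n, r):
    """P(X = x) white balls in a sample of n drawn without replacement
    from N balls of which r are white."""
    return comb(r, x) * comb(N - r, n - x) / comb(N, n)

# Sanity check over the support max(0, n - (N - r)) <= x <= min(n, r).
N, n, r = 20, 5, 3
support = range(max(0, n - (N - r)), min(n, r) + 1)
assert abs(sum(hypergeometric_pmf(x, N, n, r) for x in support) - 1) < 1e-12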
Hypergeometric random variable

Theorem 0.9.
Let X be a hypergeometric random variable with
parameters (N, n, r). Then
1 E[X] = n \frac{r}{N}
2 Var(X) = n \left( \frac{r}{N} \right) \left( \frac{N - r}{N} \right) \left( \frac{N - n}{N - 1} \right)

Problems

Problem 8. Twenty microprocessor chips are in stock.
Three have etching errors that cannot be detected by the
naked eye. Five chips are selected and installed in field
equipment. (a) Find the density for X, the number of chips
selected that have etching errors. (b) Find E(X) and
Var(X). (c) Find the probability that no chips with
etching errors will be selected. (d) Find the probability
that at least one chip with an etching error will be chosen.
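A computational sketch of Problem 8 (N = 20 chips, r = 3 with etching errors, sample size n = 5):

from math import comb

N, n, r = 20, 5, 3
pmf = {x: comb(r, x) * comb(N - r, n - x) / comb(N, n) for x in range(0, 4)}   # (a)

mean = n * r / N                                             # (b) E(X) = 0.75
var = n * (r / N) * ((N - r) / N) * ((N - n) / (N - 1))      # (b) Var(X) ≈ 0.503
p_none = pmf[0]                                              # (c) about 0.399
p_at_least_one = 1 - p_none                                  # (d) about 0.601
print(pmf, mean, var, p_none, p_at_least_one)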

Poisson Process

Suppose we are concerned with discrete events taking
place over continuous intervals (not in the usual
mathematical sense) of time, length or space, such as the
arrival of telephone calls at a switchboard, or the number of
red blood cells in a drop of blood (here the continuous
interval involved is a drop of blood).
These can be thought of as examples of a Poisson process.

Poisson Process

A Poisson process with rate (or intensity) λ > 0 is a
counting process {X_t : t ≥ 0} such that
1 X_0 = 0;
2 X_t, the number of events occurring in the interval
(0, t], is a Poisson random variable with parameter λt;
3 the increments are independent: if (s_1, t_1] ∩ (s_2, t_2] = ∅,
then X_{t_1} - X_{s_1} and X_{t_2} - X_{s_2} are independent.

Example

Example 9. The arrival of trucks at a receiving dock is a
Poisson process with a mean arrival rate of 2 per hour.
1 Find the probability that exactly 5 trucks arrive in a
two-hour period.
2 Find the probability that 3 or more trucks arrive in a
two-hour period.
3 Find the probability that exactly 2 trucks arrive in a
one-hour period and exactly 3 trucks arrive in the next
one-hour period.
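A numerical sketch (my own work-through, with rate λ = 2 per hour, so the count in t hours is Poisson(2t); part 3 uses the independence of increments):

from math import exp, factorial

def poisson_pmf(x, k):
    return exp(-k) * k**x / factorial(x)

lam = 2.0
print(poisson_pmf(5, 2 * lam))                              # 1. exactly 5 in two hours, about 0.156
print(1 - sum(poisson_pmf(x, 2 * lam) for x in range(3)))   # 2. three or more in two hours, about 0.762
print(poisson_pmf(2, lam) * poisson_pmf(3, lam))            # 3. independent increments, about 0.049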

Memoryless property

Remark 0.10.
A discrete random variable X taking values in the positive
integers satisfies the memoryless property, i.e.,

P(X > m + n | X > m) = P(X > n)

for all positive integers m, n, if and only if X is a geometric
random variable.
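A small numerical check of the memoryless property for a geometric random variable (illustrative, with p = 0.3): since P(X > x) = q^x, both sides of the identity can be compared directly.

p = 0.3
q = 1 - p

def tail(x):
    """P(X > x) = q**x for a geometric random variable with parameter p."""
    return q**x

for m in range(1, 4):
    for n in range(1, 4):
        assert abs(tail(m + n) / tail(m) - tail(n)) < 1e-12   # P(X > m+n | X > m) = P(X > n)
print("memoryless property verified numerically")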

Problems

Problem 10. Let X and Y be independent binomial
random variables having respective parameters (n, p) and
(m, p). Then prove the following:
1 The random variable X + Y is also binomial with
parameters (m + n, p).
2 The conditional probability mass function of X given
that X + Y = k is hypergeometric.

Thank you for your attention

