Certain types of random variables occur over and over again in applications. In this chapter,
we will study a variety of them.
Suppose that a trial whose outcome can be classified as either a “success” or a “failure” is performed. If we let X = 1 when the outcome is a success and X = 0 when it is a failure, then X is said to be a Bernoulli random variable with parameter p, where p, 0 ≤ p ≤ 1, is the probability of a success. Its probability mass function is given by

P{X = 0} = 1 − p,     P{X = 1} = p     (5.1.1)

and its expected value is

E[X] = 1 · P{X = 1} + 0 · P{X = 0} = p
That is, the expectation of a Bernoulli random variable is the probability that the random
variable equals 1.
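To make the last identity concrete, here is a minimal sketch (not from the text) that evaluates E[X] = Σ_x x · P{X = x} for an arbitrarily chosen parameter value p = 0.3:

```python
# Sketch: expectation of a Bernoulli random variable computed from its pmf.
# The parameter value p = 0.3 is an arbitrary illustrative choice.
p = 0.3
pmf = {0: 1 - p, 1: p}                      # P{X = 0} = 1 - p, P{X = 1} = p
expectation = sum(x * prob for x, prob in pmf.items())
print(expectation)                          # 0.3, which equals p
```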
Suppose now that n independent trials, each of which results in a “success” with probability p and in a “failure” with probability 1 − p, are to be performed. If X represents the number of successes that occur in the n trials, then X is said to be a binomial random variable with parameters (n, p).
The probability mass function of a binomial random variable with parameters n and p
is given by
P{X = i} = \binom{n}{i} p^i (1 − p)^{n−i},     i = 0, 1, . . . , n     (5.1.2)

where \binom{n}{i} = n!/[i!(n − i)!] is the number of different groups of i objects that can be
chosen from a set of n objects. The validity of Equation 5.1.2 may be verified by first
noting that the probability of any particular sequence of the n outcomes containing i
successes and n − i failures is, by the assumed independence of trials, p^i (1 − p)^{n−i}.
Equation 5.1.2 then follows since there are \binom{n}{i} different sequences of the n outcomes
leading to i successes and n − i failures, which can perhaps most easily be seen by
noting that there are \binom{n}{i} different selections of the i trials that result in successes. For
instance, if n = 5, i = 2, then there are \binom{5}{2} choices of the two trials that are to result in
successes, namely any of the outcomes
(s, s, f, f, f)   (s, f, s, f, f)   (s, f, f, s, f)   (s, f, f, f, s)   (f, s, s, f, f)
(f, s, f, s, f)   (f, s, f, f, s)   (f, f, s, s, f)   (f, f, s, f, s)   (f, f, f, s, s)
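The counting step can be checked by brute force. The following sketch (illustrative, not part of the text) enumerates every outcome sequence of n = 5 trials and confirms that exactly \binom{5}{2} = 10 of them contain i = 2 successes:

```python
from itertools import product
from math import comb

# Enumerate every possible outcome sequence of n = 5 trials ("s" = success,
# "f" = failure) and count those containing exactly i = 2 successes.
n, i = 5, 2
sequences = [seq for seq in product("sf", repeat=n) if seq.count("s") == i]
print(len(sequences))          # 10
print(comb(n, i))              # 10, the binomial coefficient "5 choose 2"
```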
The probability mass functions of three binomial random variables with respective parameters (10, .5), (10, .3), and (10, .6) are presented in Figure 5.1. The first of these is symmetric about the value 5, whereas the second is somewhat weighted, or skewed, to lower values and the third to higher values.
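The mass functions plotted in Figure 5.1 can be reproduced by evaluating Equation 5.1.2 directly; the following sketch (Python, standard library only) uses the three parameter pairs named above:

```python
from math import comb

def binomial_pmf(i, n, p):
    """P{X = i} for a binomial random variable with parameters (n, p); Equation 5.1.2."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

# The three parameter pairs shown in Figure 5.1.
for n, p in [(10, 0.5), (10, 0.3), (10, 0.6)]:
    probs = [binomial_pmf(i, n, p) for i in range(n + 1)]
    print(n, p, [round(q, 3) for q in probs])
```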
EXAMPLE 5.1a It is known that disks produced by a certain company will be defective
with probability .01 independently of each other. The company sells the disks in packages
of 10 and offers a money-back guarantee that at most 1 of the 10 disks is defective.
What proportion of packages is returned? If someone buys three packages, what is the
probability that exactly one of them will be returned?
SOLUTION If X is the number of defective disks in a package, then assuming that customers
always take advantage of the guarantee, it follows that X is a binomial random variable
with parameters (10, .01). Hence the probability that a package will have to be replaced is

P{X > 1} = 1 − P{X = 0} − P{X = 1} = 1 − (.99)^{10} − \binom{10}{1}(.01)(.99)^9
Because each package will, independently, have to be replaced with probability .005, it
follows from the law of large numbers that in the long run .5 percent of the packages will
have to be replaced.
It follows from the foregoing that the number of packages that the person will have to return is a binomial random variable with parameters n = 3 and p = .005. Therefore, the probability that exactly one of the three packages will be returned is

\binom{3}{1}(.005)(.995)^2 ≈ .015 ■
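As a quick numerical check of the last step (a sketch using the replacement probability .005 quoted in the example):

```python
from math import comb

# P{exactly one of three packages is returned}, each package being returned
# independently with probability .005, as stated in Example 5.1a.
p = 0.005
prob = comb(3, 1) * p * (1 - p)**2
print(round(prob, 3))          # 0.015
```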
EXAMPLE 5.1b The color of one’s eyes is determined by a single pair of genes, with the gene
for brown eyes being dominant over the one for blue eyes. This means that an individual
having two blue-eyed genes will have blue eyes, while one having either two brown-eyed
genes or one brown-eyed and one blue-eyed gene will have brown eyes. When two people
mate, the resulting offspring receives one randomly chosen gene from each of its parents’
gene pair. If the eldest child of a pair of brown-eyed parents has blue eyes, what is the
probability that exactly two of the four other children (none of whom is a twin) of this
couple also have blue eyes?
SOLUTION To begin, note that since the eldest child has blue eyes, it follows that both
parents must have one blue-eyed and one brown-eyed gene. (For if either had two brown-
eyed genes, then each child would receive at least one brown-eyed gene and would thus
have brown eyes.) The probability that an offspring of this couple will have blue eyes is
equal to the probability that it receives the blue-eyed gene from both parents, which is
\frac{1}{2} · \frac{1}{2} = \frac{1}{4}. Hence, because each of the other four children will have blue eyes with
probability \frac{1}{4}, it follows that the probability that exactly two of them have this eye color is

\binom{4}{2}(1/4)^2(3/4)^2 = 27/128 ■
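The arithmetic of Example 5.1b can be verified exactly (a sketch, not part of the text), since the number of blue-eyed children among the four is binomial with parameters (4, 1/4):

```python
from math import comb
from fractions import Fraction

# P{exactly 2 of the 4 children have blue eyes}, each independently with probability 1/4.
p = Fraction(1, 4)
prob = comb(4, 2) * p**2 * (1 - p)**2
print(prob)                    # 27/128
print(float(prob))             # approximately 0.211
```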
Any random variable X whose probability mass function is given by Equation 5.3.1, namely

P{X = i} = \frac{\binom{N}{i}\binom{M}{n−i}}{\binom{N+M}{n}},     i = 0, 1, . . . , n

is said to be a hypergeometric random variable with parameters N, M, n.
EXAMPLE 5.3a The components of a 6-component system are to be randomly chosen from
a bin of 20 used components. The resulting system will be functional if at least 4 of its
6 components are in working condition. If 15 of the 20 components in the bin are in
working condition, what is the probability that the resulting system will be functional?
SOLUTION If X is the number of working components chosen, then X is hypergeometric
with parameters 15, 5, 6. The probability that the system will be functional is
P{X ≥ 4} = \sum_{i=4}^{6} P{X = i}
= \frac{\binom{15}{4}\binom{5}{2} + \binom{15}{5}\binom{5}{1} + \binom{15}{6}\binom{5}{0}}{\binom{20}{6}}
≈ .8687 ■
* We are following the convention that \binom{m}{r} = 0 if r > m or if r < 0.
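A numerical check of Example 5.3a (an illustrative sketch; standard library only):

```python
from math import comb

# P{X >= 4} when 6 components are drawn from a bin of 20, of which 15 work;
# X counts how many of the chosen 6 are in working condition.
N, M, n = 15, 5, 6             # working, defective, sample size
prob = sum(comb(N, i) * comb(M, n - i) for i in range(4, n + 1)) / comb(N + M, n)
print(round(prob, 4))          # 0.8687
```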
To compute the mean and variance of a hypergeometric random variable whose probability mass function is given by Equation 5.3.1, imagine that the batteries are drawn sequentially and let

X_i = \begin{cases} 1 & \text{if the ith selection is acceptable} \\ 0 & \text{otherwise} \end{cases}
Now, since the ith selection is equally likely to be any of the N + M batteries, of which
N are acceptable, it follows that
P{X_i = 1} = \frac{N}{N+M}     (5.3.2)
Also, for i ≠ j,

P{X_i = 1, X_j = 1} = P{X_i = 1}P{X_j = 1 | X_i = 1} = \frac{N}{N+M} · \frac{N−1}{N+M−1} = \frac{N(N−1)}{(N+M)(N+M−1)}     (5.3.3)

which follows since, given that the ith selection is acceptable, the jth selection is equally likely to be any of the other N + M − 1 batteries, of which N − 1 are acceptable.
To compute the mean and variance of X, the number of acceptable batteries in the
sample of size n, use the representation
X = \sum_{i=1}^{n} X_i
This gives
E[X] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^{n} P{X_i = 1} = \frac{nN}{N+M}     (5.3.4)
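Equation 5.3.4 can also be illustrated by simulating the sequential draws; the following Monte Carlo sketch (not from the text, using the parameter values of Example 5.3a) estimates E[X] and compares it with nN/(N + M):

```python
import random

# Monte Carlo check of E[X] = nN/(N + M): repeatedly draw n batteries without
# replacement from N acceptable ("A") and M unacceptable ("U") ones,
# counting the acceptable batteries in each sample.
N, M, n = 15, 5, 6
bin_contents = ["A"] * N + ["U"] * M
trials = 100_000
total = sum(random.sample(bin_contents, n).count("A") for _ in range(trials))
print(total / trials)          # close to 4.5
print(n * N / (N + M))         # 4.5, the value given by Equation 5.3.4
```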
Also, Corollary 4.7.3 for the variance of a sum of random variables gives
Var(X) = \sum_{i=1}^{n} Var(X_i) + 2 \sum_{1≤i<j≤n} Cov(X_i, X_j)     (5.3.5)
Since each X_i is a Bernoulli random variable, Equation 5.3.2 gives

Var(X_i) = P{X_i = 1}(1 − P{X_i = 1}) = \frac{N}{N+M} · \frac{M}{N+M}     (5.3.6)
Also, for i ≠ j,

E[X_i X_j] = P{X_i X_j = 1} = P{X_i = 1, X_j = 1} = \frac{N(N−1)}{(N+M)(N+M−1)}     from Equation 5.3.3     (5.3.7)

Hence,

Cov(X_i, X_j) = E[X_i X_j] − E[X_i]E[X_j] = \frac{N(N−1)}{(N+M)(N+M−1)} − \left(\frac{N}{N+M}\right)^2 = \frac{−NM}{(N+M)^2(N+M−1)}

and so, combining Equations 5.3.5, 5.3.6, and the covariance above,

Var(X) = \frac{nNM}{(N+M)^2} − \frac{n(n−1)NM}{(N+M)^2(N+M−1)}     (5.3.8)

If we let p = N/(N + M) denote the proportion of batteries in the bin that are acceptable, we can rewrite Equations 5.3.4 and 5.3.8 as follows.
E[X] = np

Var(X) = np(1 − p)\left[1 − \frac{n−1}{N+M−1}\right]
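Both formulas can be checked exactly against the hypergeometric mass function; the sketch below (illustrative, using the parameter values of Example 5.3a) computes the mean and variance directly from Equation 5.3.1 and compares them with np and np(1 − p)[1 − (n − 1)/(N + M − 1)]:

```python
from math import comb

# Exact check of E[X] = np and Var(X) = np(1-p)[1 - (n-1)/(N+M-1)]
# against the hypergeometric probability mass function (Equation 5.3.1).
N, M, n = 15, 5, 6
p = N / (N + M)
pmf = [comb(N, i) * comb(M, n - i) / comb(N + M, n) for i in range(n + 1)]
mean = sum(i * q for i, q in enumerate(pmf))
var = sum(i**2 * q for i, q in enumerate(pmf)) - mean**2
print(mean, n * p)                                          # both 4.5
print(var, n * p * (1 - p) * (1 - (n - 1) / (N + M - 1)))   # both approximately 0.829
```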