
Chapter 5

SPECIAL RANDOM VARIABLES

Certain types of random variables occur over and over again in applications. In this chapter,
we will study a variety of them.

5.1 THE BERNOULLI AND BINOMIAL RANDOM VARIABLES

Suppose that a trial, or an experiment, whose outcome can be classified as either a “success”
or as a “failure” is performed. If we let X = 1 when the outcome is a success and X = 0
when it is a failure, then the probability mass function of X is given by

P{X = 0} = 1 − p (5.1.1)
P{X = 1} = p

where p, 0 ≤ p ≤ 1, is the probability that the trial is a “success.”


A random variable X is said to be a Bernoulli random variable (after the Swiss mathe-
matician James Bernoulli) if its probability mass function is given by Equations 5.1.1 for
some p ∈ (0, 1). Its expected value is

E [X ] = 1 · P{X = 1} + 0 · P{X = 0} = p

That is, the expectation of a Bernoulli random variable is the probability that the random
variable equals 1.
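
As a concrete illustration (a sketch of ours, not from the text), the following Python code samples from the Bernoulli pmf and checks that the long-run fraction of successes approaches E[X] = p; the helper name and the value p = 0.3 are arbitrary choices.

```python
# A minimal sketch of sampling from the Bernoulli pmf of Equations 5.1.1;
# the helper name and p = 0.3 are illustrative, not from the text.
import random

def bernoulli_sample(p: float) -> int:
    """Return 1 ("success") with probability p and 0 ("failure") otherwise."""
    return 1 if random.random() < p else 0

# By the law of large numbers, the sample mean estimates E[X] = P{X = 1} = p.
samples = [bernoulli_sample(0.3) for _ in range(100_000)]
print(sum(samples) / len(samples))   # close to 0.3
```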
Suppose now that n independent trials, each of which results in a “success” with prob-
ability p and in a “failure” with probability 1 − p, are to be performed. If X represents
the number of successes that occur in the n trials, then X is said to be a binomial random
variable with parameters (n, p).


The probability mass function of a binomial random variable with parameters n and p
is given by

$$P\{X = i\} = \binom{n}{i} p^i (1-p)^{n-i}, \qquad i = 0, 1, \ldots, n \tag{5.1.2}$$

where $\binom{n}{i} = \frac{n!}{i!\,(n-i)!}$ is the number of different groups of $i$ objects that can be chosen from a set of $n$ objects. The validity of Equation 5.1.2 may be verified by first noting that the probability of any particular sequence of the $n$ outcomes containing $i$ successes and $n - i$ failures is, by the assumed independence of trials, $p^i (1-p)^{n-i}$. Equation 5.1.2 then follows since there are $\binom{n}{i}$ different sequences of the $n$ outcomes leading to $i$ successes and $n - i$ failures — which can perhaps most easily be seen by noting that there are $\binom{n}{i}$ different selections of the $i$ trials that result in successes. For instance, if $n = 5$, $i = 2$, then there are $\binom{5}{2}$ choices of the two trials that are to result in successes — namely, any of the outcomes

(s, s, f, f, f)   (f, s, s, f, f)   (f, f, s, s, f)
(s, f, s, f, f)   (f, s, f, s, f)   (f, f, s, f, s)
(s, f, f, s, f)   (f, s, f, f, s)   (f, f, f, s, s)
(s, f, f, f, s)

where the outcome (f, s, f, s, f) means, for instance, that the two successes appeared on trials 2 and 4. Since each of the $\binom{5}{2}$ outcomes has probability $p^2 (1-p)^3$, we see that the probability of a total of 2 successes in 5 independent trials is $\binom{5}{2} p^2 (1-p)^3$. Note that, by the binomial theorem, the probabilities sum to 1, that is,

$$\sum_{i=0}^{n} p(i) = \sum_{i=0}^{n} \binom{n}{i} p^i (1-p)^{n-i} = [\,p + (1-p)\,]^n = 1$$
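
Equation 5.1.2 translates directly into code. The sketch below (ours, with illustrative parameter values) evaluates the binomial pmf using Python's math.comb and confirms numerically that the probabilities sum to 1:

```python
# A sketch of the binomial pmf of Equation 5.1.2; parameter values are illustrative.
from math import comb

def binomial_pmf(i: int, n: int, p: float) -> float:
    """P{X = i} = C(n, i) * p**i * (1 - p)**(n - i)."""
    return comb(n, i) * p**i * (1 - p)**(n - i)

print(binomial_pmf(2, 5, 0.4))   # C(5,2)(.4)^2(.6)^3 = 0.3456

# By the binomial theorem, the pmf sums to 1 over i = 0, 1, ..., n.
assert abs(sum(binomial_pmf(i, 10, 0.3) for i in range(11)) - 1.0) < 1e-12
```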

The probability mass functions of three binomial random variables with respective parameters (10, .5), (10, .3), and (10, .6) are presented in Figure 5.1. The first of these is symmetric about the value 5, whereas the second is somewhat weighted, or skewed, to lower values and the third to higher values.

[Figure 5.1 appears here: three bar plots of the probability mass functions for the parameters (10, 0.5), (10, 0.3), and (10, 0.6), each over the values 0, 1, . . . , 10.]

FIGURE 5.1 Binomial probability mass functions.

EXAMPLE 5.1a It is known that disks produced by a certain company will be defective
with probability .01 independently of each other. The company sells the disks in packages
of 10 and offers a money-back guarantee that at most 1 of the 10 disks is defective.
What proportion of packages is returned? If someone buys three packages, what is the
probability that exactly one of them will be returned?
SOLUTION If X is the number of defective disks in a package, then assuming that customers
always take advantage of the guarantee, it follows that X is a binomial random variable
with parameters (10, .01). Hence the probability that a package will have to be replaced is

$$P\{X > 1\} = 1 - P\{X = 0\} - P\{X = 1\} = 1 - \binom{10}{0}(.01)^0(.99)^{10} - \binom{10}{1}(.01)^1(.99)^9 \approx .005$$

Because each package will, independently, have to be replaced with probability .005, it
follows from the law of large numbers that in the long run .5 percent of the packages will
have to be replaced.
It follows from the foregoing that the number of packages that the person will have to return is a binomial random variable with parameters n = 3 and p = .005. Therefore, the probability that exactly one of the three packages will be returned is $\binom{3}{1}(.005)(.995)^2 \approx .015$. ■
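
These numbers can be reproduced with the binomial_pmf sketch from above; note that the exact replacement probability is about .0043, which the text rounds to .005 before computing the final answer.

```python
# Numerical check of Example 5.1a, reusing binomial_pmf from the earlier sketch.
p_return = 1 - binomial_pmf(0, 10, 0.01) - binomial_pmf(1, 10, 0.01)
print(p_return)                       # ~ 0.00427; the text rounds this to .005

# The number of returned packages among three is binomial with parameters (3, p_return).
print(binomial_pmf(1, 3, p_return))   # ~ 0.0127; with p rounded to .005, ~ .015
```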

EXAMPLE 5.1b The color of one’s eyes is determined by a single pair of genes, with the gene
for brown eyes being dominant over the one for blue eyes. This means that an individual
having two blue-eyed genes will have blue eyes, while one having either two brown-eyed
genes or one brown-eyed and one blue-eyed gene will have brown eyes. When two people
mate, the resulting offspring receives one randomly chosen gene from each of its parents’
gene pair. If the eldest child of a pair of brown-eyed parents has blue eyes, what is the
probability that exactly two of the four other children (none of whom is a twin) of this
couple also have blue eyes?
SOLUTION To begin, note that since the eldest child has blue eyes, it follows that both
parents must have one blue-eyed and one brown-eyed gene. (For if either had two brown-
eyed genes, then each child would receive at least one brown-eyed gene and would thus
have brown eyes.) The probability that an offspring of this couple will have blue eyes is
equal to the probability that it receives the blue-eyed gene from both parents, which is $\left(\tfrac{1}{2}\right)\left(\tfrac{1}{2}\right) = \tfrac{1}{4}$. Hence, because each of the other four children will have blue eyes with probability $\tfrac{1}{4}$, it follows that the probability that exactly two of them have this eye color is

$$\binom{4}{2}(1/4)^2 (3/4)^2 = 27/128 \qquad \blacksquare$$

EXAMPLE 5.1c A communications system consists of n components, each of which will, independently, function with probability p. The total system will be able to operate effectively if at least one-half of its components function.
(a) For what values of p is a 5-component system more likely to operate effectively
than a 3-component system?
(b) In general, when is a 2k + 1 component system better than a 2k − 1 component
system?
SOLUTION
(a) Because the number of functioning components is a binomial random variable
with parameters (n, p), it follows that the probability that a 5-component system
will be effective is

$$\binom{5}{3} p^3 (1-p)^2 + \binom{5}{4} p^4 (1-p) + p^5$$
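
Part (a) can also be explored numerically. The sketch below (our own helper, reusing binomial_pmf from the earlier example) compares the two systems at a few values of p; the results are consistent with the 5-component system being preferable exactly when p > 1/2.

```python
# A sketch comparing the systems of Example 5.1c, reusing binomial_pmf from above.
def p_effective(n: int, p: float) -> float:
    """Probability that at least half (i.e., at least ceil(n/2)) of n components work."""
    return sum(binomial_pmf(i, n, p) for i in range((n + 1) // 2, n + 1))

for p in (0.3, 0.5, 0.7):
    print(p, p_effective(5, p), p_effective(3, p))
# p = 0.3: the 3-component system is better; p = 0.5: they tie (both 0.5);
# p = 0.7: the 5-component system is better.
```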

5.3 THE HYPERGEOMETRIC RANDOM VARIABLE


A bin contains N + M batteries, of which N are of acceptable quality and the other M are
defective. A sample of size n is to be randomly chosen (without replacement) in the sense that the set of sampled batteries is equally likely to be any of the $\binom{N+M}{n}$ subsets of size n. If we let X denote the number of acceptable batteries in the sample, then

$$P\{X = i\} = \frac{\binom{N}{i}\binom{M}{n-i}}{\binom{N+M}{n}}, \qquad i = 0, 1, \ldots, \min(N, n)^* \tag{5.3.1}$$

Any random variable X whose probability mass function is given by Equation 5.3.1 is said
to be a hypergeometric random variable with parameters N , M , n.

EXAMPLE 5.3a The components of a 6-component system are to be randomly chosen from
a bin of 20 used components. The resulting system will be functional if at least 4 of its
6 components are in working condition. If 15 of the 20 components in the bin are in
working condition, what is the probability that the resulting system will be functional?
SOLUTION If X is the number of working components chosen, then X is hypergeometric
with parameters 15, 5, 6. The probability that the system will be functional is


$$P\{X \ge 4\} = \sum_{i=4}^{6} P\{X = i\} = \frac{\binom{15}{4}\binom{5}{2} + \binom{15}{5}\binom{5}{1} + \binom{15}{6}\binom{5}{0}}{\binom{20}{6}} \approx .8687 \qquad \blacksquare$$

 
* We are following the convention that $\binom{m}{r} = 0$ if r > m or if r < 0.
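
As a check on Example 5.3a, the sketch below evaluates Equation 5.3.1 directly; the helper name and its guard for out-of-range terms (matching the convention in the footnote) are our own.

```python
# A numerical check of Example 5.3a using the hypergeometric pmf of Equation 5.3.1.
from math import comb

def hypergeometric_pmf(i: int, N: int, M: int, n: int) -> float:
    """P{X = i} = C(N, i) C(M, n - i) / C(N + M, n), with out-of-range terms = 0."""
    if i < 0 or i > N or n - i < 0 or n - i > M:
        return 0.0
    return comb(N, i) * comb(M, n - i) / comb(N + M, n)

# 15 working and 5 defective components, a sample of 6; functional when X >= 4.
print(sum(hypergeometric_pmf(i, 15, 5, 6) for i in range(4, 7)))   # ~ 0.8687
```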

To compute the mean and variance of a hypergeometric random variable whose prob-
ability mass function is given by Equation 5.3.1, imagine that the batteries are drawn
sequentially and let

$$X_i = \begin{cases} 1 & \text{if the } i\text{th selection is acceptable} \\ 0 & \text{otherwise} \end{cases}$$

Now, since the ith selection is equally likely to be any of the N + M batteries, of which
N are acceptable, it follows that

$$P\{X_i = 1\} = \frac{N}{N+M} \tag{5.3.2}$$

Also, for i ≠ j,

$$P\{X_i = 1, X_j = 1\} = P\{X_i = 1\}\,P\{X_j = 1 \mid X_i = 1\} = \frac{N}{N+M}\cdot\frac{N-1}{N+M-1} \tag{5.3.3}$$

which follows since, given that the ith selection is acceptable, the jth selection is equally
likely to be any of the other N + M − 1 batteries of which N − 1 are acceptable.
To compute the mean and variance of X, the number of acceptable batteries in the
sample of size n, use the representation


$$X = \sum_{i=1}^{n} X_i$$

This gives


$$E[X] = \sum_{i=1}^{n} E[X_i] = \sum_{i=1}^{n} P\{X_i = 1\} = \frac{nN}{N+M} \tag{5.3.4}$$

Also, Corollary 4.7.3 for the variance of a sum of random variables gives


$$\text{Var}(X) = \sum_{i=1}^{n} \text{Var}(X_i) + 2 \sum_{1 \le i < j \le n} \text{Cov}(X_i, X_j) \tag{5.3.5}$$

Now, Xi is a Bernoulli random variable and so

$$\text{Var}(X_i) = P\{X_i = 1\}(1 - P\{X_i = 1\}) = \frac{N}{N+M}\cdot\frac{M}{N+M} \tag{5.3.6}$$

Also, for i < j,


$$\text{Cov}(X_i, X_j) = E[X_i X_j] - E[X_i]\,E[X_j]$$
Now, because both Xi and Xj are Bernoulli (that is, 0 − 1) random variables, it follows
that Xi Xj is a Bernoulli random variable, and so

$$E[X_i X_j] = P\{X_i X_j = 1\} = P\{X_i = 1, X_j = 1\} = \frac{N(N-1)}{(N+M)(N+M-1)} \quad \text{(from Equation 5.3.3)} \tag{5.3.7}$$

So from Equation 5.3.2 and the foregoing we see that for i ≠ j,


$$\text{Cov}(X_i, X_j) = \frac{N(N-1)}{(N+M)(N+M-1)} - \left(\frac{N}{N+M}\right)^2 = \frac{-NM}{(N+M)^2(N+M-1)}$$
 
Hence, since there are $\binom{n}{2}$ terms in the second sum on the right side of Equation 5.3.5, we obtain from Equation 5.3.6

$$\text{Var}(X) = \frac{nNM}{(N+M)^2} - \frac{n(n-1)NM}{(N+M)^2(N+M-1)} = \frac{nNM}{(N+M)^2}\left(1 - \frac{n-1}{N+M-1}\right) \tag{5.3.8}$$

If we let p = N /(N + M ) denote the proportion of batteries in the bin that are acceptable,
we can rewrite Equations 5.3.4 and 5.3.8 as follows.

$$E[X] = np$$

$$\text{Var}(X) = np(1-p)\left(1 - \frac{n-1}{N+M-1}\right)$$

It should be noted that, for fixed p, as N + M increases to ∞, Var(X) converges to np(1 − p), which is the variance of a binomial random variable with parameters (n, p). (Why was this to be expected?)
