Companion For Chapter 09
OF
FUNDAMENTALS
OF PROBABILITY
WITH STOCHASTIC PROCESSES
FOURTH EDITION
SAEED GHAHRAMANI
Western New England University
Springfield, Massachusetts, USA
9 Multivariate Distributions
9A Additional Examples
9B Transformations of n > 2 Random Variables
Chapter 9
Multivariate Distributions
9A ADDITIONAL EXAMPLES
Solution: Note that for i ≥ 1, Xi is a gamma random variable with parameters 2 and λ. Hence Xi is the sum of two independent exponential random variables. Consider a Poisson process {N1(t) : t ≥ 0} with parameter λ. The times between consecutive events of this process are exponential with parameter λ. Thus N(t) = n if and only if N1(t) = 2n or N1(t) = 2n + 1. This observation yields

$$P\bigl(N(t) = n\bigr) = \frac{e^{-\lambda t}(\lambda t)^{2n}}{(2n)!} + \frac{e^{-\lambda t}(\lambda t)^{2n+1}}{(2n+1)!}, \qquad n = 0, 1, 2, \ldots.$$
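As a sanity check on this pmf, here is a small editorial Monte Carlo sketch (not part of the text; the values λ = 1.3 and t = 4 are arbitrary) that counts gamma(2, λ) interarrival times falling in (0, t] and compares the empirical frequencies with the formula:

```python
import math
import random

lam, t, trials = 1.3, 4.0, 200_000

def sample_N(lam, t):
    """Count how many gamma(2, lam) interarrival times fit in (0, t]."""
    n, clock = 0, 0.0
    while True:
        # each X_i is the sum of two independent exponentials with rate lam
        clock += random.expovariate(lam) + random.expovariate(lam)
        if clock > t:
            return n
        n += 1

counts = {}
for _ in range(trials):
    k = sample_N(lam, t)
    counts[k] = counts.get(k, 0) + 1

for n in range(6):
    exact = (math.exp(-lam * t) * (lam * t) ** (2 * n) / math.factorial(2 * n)
             + math.exp(-lam * t) * (lam * t) ** (2 * n + 1) / math.factorial(2 * n + 1))
    print(n, round(counts.get(n, 0) / trials, 4), round(exact, 4))
```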
Solution: Given that the minimum of the Xi's is greater than t, each Xi is greater than t. By the memoryless property of exponential random variables, after t units of time, Xi1 − t, Xi2 − t, . . . , Xin − t have the same distributions as Xi1, Xi2, . . . , Xin, respectively. So

$$P\Bigl(X_{i_1} < X_{i_2} < \cdots < X_{i_n} \Bigm| \min_{1 \le i \le n} X_i > t\Bigr) = P(X_{i_1} - t < X_{i_2} - t < \cdots < X_{i_n} - t) = P(X_{i_1} < X_{i_2} < \cdots < X_{i_n}).$$
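The conclusion can be illustrated numerically. In this editorial sketch the rates of X1, X2, X3 and the threshold t are arbitrary choices; the conditional and unconditional ordering probabilities should agree up to simulation noise:

```python
import random

rates = [1.0, 2.0, 0.5]   # arbitrary rates for X1, X2, X3
t, trials = 0.5, 400_000

ordered = conditioned = ordered_given_min = 0
for _ in range(trials):
    x = [random.expovariate(r) for r in rates]
    if x[0] < x[1] < x[2]:
        ordered += 1
    if min(x) > t:            # condition on the minimum exceeding t
        conditioned += 1
        if x[0] < x[1] < x[2]:
            ordered_given_min += 1

print("P(X1 < X2 < X3)           =", ordered / trials)
print("P(X1 < X2 < X3 | min > t) =", ordered_given_min / conditioned)
```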
Solution: Let X1, X2, and X3 be the amounts of time the student spends at steps 1, 2, and 3, respectively. We are given that X1, X2, and X3 are independent exponential random variables with parameters λ1, λ2, and λ3, respectively. Let f1, f2, and f3 be the probability density functions of X1, X2, and X3, respectively. We will now find the probability density function of the desired quantity, X1 + X2 + X3, in cases (a) and (b).
(a) Using the convolution theorem, we first find g, the probability density function of X1 + X2. We have

$$\begin{aligned}
g(t) &= \int_0^{\infty} f_1(x) f_2(t - x)\, dx = \int_0^t \lambda_1 e^{-\lambda_1 x} \lambda_2 e^{-\lambda_2 (t - x)}\, dx \\
&= \lambda_1 \lambda_2 e^{-\lambda_2 t} \int_0^t e^{-(\lambda_1 - \lambda_2) x}\, dx = \lambda_1 \lambda_2 e^{-\lambda_2 t} \cdot \frac{1}{\lambda_1 - \lambda_2} \Bigl(1 - e^{-(\lambda_1 - \lambda_2) t}\Bigr) \\
&= \frac{\lambda_1}{\lambda_1 - \lambda_2}\, \lambda_2 e^{-\lambda_2 t} + \frac{\lambda_2}{\lambda_2 - \lambda_1}\, \lambda_1 e^{-\lambda_1 t}.
\end{aligned}$$

Applying the convolution theorem once more, this time to g and f3, similar calculations yield h, the probability density function of X1 + X2 + X3:

$$h(t) = \frac{\lambda_2 \lambda_3}{(\lambda_2 - \lambda_1)(\lambda_3 - \lambda_1)}\, \lambda_1 e^{-\lambda_1 t} + \frac{\lambda_1 \lambda_3}{(\lambda_1 - \lambda_2)(\lambda_3 - \lambda_2)}\, \lambda_2 e^{-\lambda_2 t} + \frac{\lambda_1 \lambda_2}{(\lambda_1 - \lambda_3)(\lambda_2 - \lambda_3)}\, \lambda_3 e^{-\lambda_3 t}.$$
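As a check on the closed form for h, the following editorial sketch (with the arbitrary rates λ1 = 1, λ2 = 2, λ3 = 3) compares histogram estimates of the density of X1 + X2 + X3 against the formula:

```python
import math
import random

l1, l2, l3 = 1.0, 2.0, 3.0     # arbitrary distinct rates for the three steps
trials = 200_000

def h(t):
    """Closed-form density of X1 + X2 + X3 derived above."""
    return (l2 * l3 / ((l2 - l1) * (l3 - l1)) * l1 * math.exp(-l1 * t)
            + l1 * l3 / ((l1 - l2) * (l3 - l2)) * l2 * math.exp(-l2 * t)
            + l1 * l2 / ((l1 - l3) * (l2 - l3)) * l3 * math.exp(-l3 * t))

samples = [random.expovariate(l1) + random.expovariate(l2) + random.expovariate(l3)
           for _ in range(trials)]

width = 0.25
for mid in (0.5, 1.0, 2.0, 4.0):
    lo, hi = mid - width / 2, mid + width / 2
    frac = sum(lo <= s < hi for s in samples) / trials
    print(f"t={mid}: simulated {frac / width:.4f}  formula {h(mid):.4f}")
```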
(b) In this case also, by the convolution theorem, g, the probability density function of X1 + X2, is given by

$$g(t) = \int_{-\infty}^{\infty} f_1(x) f_2(t - x)\, dx = \int_0^t \lambda e^{-\lambda x} \cdot \lambda e^{-\lambda (t - x)}\, dx = \lambda^2 t e^{-\lambda t}.$$
This is the density function of a gamma random variable with parameters 2 and λ. Hence X1 + X2 is gamma with parameters 2 and λ. Let h be the probability density function of X1 + X2 + X3; applying the convolution theorem again, straightforward calculations give
$$h(t) = \int_0^{\infty} g(x) f_3(t - x)\, dx = \frac{1}{2}\lambda^3 t^2 e^{-\lambda t}.$$
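Both convolutions can also be verified symbolically. A minimal editorial sketch, assuming the SymPy library is available:

```python
import sympy as sp

t, x, lam = sp.symbols("t x lambda", positive=True)

# convolve two exponential(lam) densities to get g
g = sp.integrate(lam * sp.exp(-lam * x) * lam * sp.exp(-lam * (t - x)), (x, 0, t))
print(sp.simplify(g))      # lambda**2 * t * exp(-lambda*t)

# convolve g with a third exponential density to get h
gx = g.subs(t, x)
h = sp.integrate(gx * lam * sp.exp(-lam * (t - x)), (x, 0, t))
print(sp.simplify(h))      # lambda**3 * t**2 * exp(-lambda*t) / 2
```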
Example 9d Michelle and Fred are shopping at a mall. Michelle decides to buy a leather
coat and Fred, knowing that it will take her a while, goes to buy an ice cream. At the
ice cream parlor, there is only one employee behind the counter. When Fred arrives, there
is one customer currently being waited on and 6 more are in line. Suppose that the time that
it takes to buy an ice cream is exponentially distributed with mean 3 minutes. Furthermore,
the time it takes to choose and buy a leather coat is also exponentially distributed, but with
mean 28 minutes. Also suppose that Michelle is the only customer in the coat shop, and,
lastly, it takes 2 minutes to travel from the coat shop to the ice cream parlor. What is the
probability that Fred will return before Michelle has finished buying her coat?
Solution: When Fred enters the ice cream parlor, there are 7 customers ahead of him. Let
us call the customer who is currently being waited on Customer 1, and the 6 customers in
line Customers 2 to 7, respectively. Fred is Customer 8. Let X1 be the period starting when Fred enters the ice cream parlor and ending when Customer 1 leaves. By the memoryless property of exponential random variables, X1 is exponential with mean 3 minutes. For 2 ≤ i ≤ 8, let Xi be the time that it takes for Customer i to buy an ice cream; Xi is exponential with mean 3 minutes. Finally, let Y be the time until Michelle buys her favorite leather coat. Then it is reasonable to assume that {Y, X1, X2, . . . , X8} is a set of independent random variables. Since each trip between the coat shop and the ice cream parlor takes 2 minutes, the desired probability is P(X1 + X2 + · · · + X8 < Y − 4). To find it, we begin by calculating P(X1 < Y − 4). By the memoryless property, given that Y > 4, the remaining time Y − 4 is again exponential with mean 28 minutes and is independent of X1. Hence

$$P(X_1 < Y - 4) = P(Y > 4)\, P(X_1 < Y - 4 \mid Y > 4) = e^{-4/28} \cdot \frac{1/3}{(1/3) + (1/28)} = \frac{84}{93}\, e^{-4/28} \approx 0.783.$$
Thus P(X1 < Y − 4) = P(2 + X1 < Y − 2), the probability that Customer 1 leaves at least 2 minutes before Michelle has bought her leather coat, is 0.783. Next, we calculate
P (X1 + X2 < Y − 4), the probability that Customer 2 also leaves at least 2 minutes
before Michelle is done. To do so, note that by the memoryless property of the exponential,
the period starting when Customer 1 leaves, and ending when Michelle is done, is still
exponentially distributed with mean 28 minutes. Therefore repeating the same argument
yields
P (X1 + X2 < Y − 4 | X1 < Y − 4) = 0.783.
Thus

$$P(X_1 + X_2 < Y - 4) = P(X_1 + X_2 < Y - 4 \mid X_1 < Y - 4)\, P(X_1 < Y - 4) = (0.783)^2.$$
Similarly,
P (X1 + X2 + X3 < Y − 4 | X1 + X2 < Y − 4) = 0.783.
Hence,

$$P(X_1 + X_2 + X_3 < Y - 4) = P(X_1 + X_2 + X_3 < Y - 4 \mid X_1 + X_2 < Y - 4)\, P(X_1 + X_2 < Y - 4) = (0.783)(0.783)^2 = (0.783)^3.$$
Continuing this argument, we find that the probability that Fred (Customer 8) leaves the ice cream parlor at least 2 minutes before Michelle has bought her leather coat is

$$(0.783)^8 \approx 0.141.$$
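A direct simulation of the whole scenario confirms the answer; this is an editorial sketch, not part of the text:

```python
import random

trials = 200_000
returns_first = 0
for _ in range(trials):
    y = random.expovariate(1 / 28)                           # Michelle's time (mean 28)
    wait = sum(random.expovariate(1 / 3) for _ in range(8))  # 8 exponential(mean 3) periods
    if 2 + wait + 2 < y:                                     # 2-minute walk each way
        returns_first += 1

print(round(returns_first / trials, 4), round(0.783 ** 8, 4))  # both ≈ 0.14
```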
Example 9e Suppose that shocks occur to a system according to a Poisson process with parameter λ. Furthermore, suppose that k types of shocks are identified to occur and, independently of other shocks, the probability that a shock is of type i, 1 ≤ i ≤ k, is pi. For 1 ≤ i ≤ k, let Xi be the number of shocks of type i occurring between 0 and t. Find the joint probability mass function of X1, X2, . . . , Xk.
Solution: To find qi = P(Xi = ji), let N(t) be the number of shocks occurring to the system between 0 and t. We have

$$\begin{aligned}
q_i = P(X_i = j_i) &= \sum_{n=0}^{\infty} P\bigl(X_i = j_i \mid N(t) = n\bigr)\, P\bigl(N(t) = n\bigr) \\
&= \sum_{n=j_i}^{\infty} \binom{n}{j_i} p_i^{j_i} (1 - p_i)^{n - j_i} \cdot \frac{e^{-\lambda t} (\lambda t)^n}{n!} \\
&= p_i^{j_i} e^{-\lambda t} \sum_{n=j_i}^{\infty} \frac{n!}{j_i!\,(n - j_i)!} (1 - p_i)^{n - j_i} \cdot \frac{(\lambda t)^{n - j_i} (\lambda t)^{j_i}}{n!} \\
&= \frac{(p_i \lambda t)^{j_i} e^{-\lambda t}}{j_i!} \sum_{n=j_i}^{\infty} \frac{\bigl[\lambda t (1 - p_i)\bigr]^{n - j_i}}{(n - j_i)!} \\
&= \frac{(p_i \lambda t)^{j_i} e^{-\lambda t}}{j_i!}\, e^{\lambda t (1 - p_i)} = \frac{(p_i \lambda t)^{j_i} e^{-p_i \lambda t}}{j_i!}.
\end{aligned}$$

Hence Xi is a Poisson random variable with parameter piλt. Now note that X1 = j1, X2 = j2, . . . , Xk = jk if and only if N(t) = j1 + j2 + · · · + jk and, of those shocks, exactly ji are of type i for each i, 1 ≤ i ≤ k. Therefore,

$$P(X_1 = j_1, X_2 = j_2, \ldots, X_k = j_k) = \frac{(j_1 + j_2 + \cdots + j_k)!}{j_1!\, j_2! \cdots j_k!}\, p_1^{j_1} p_2^{j_2} \cdots p_k^{j_k} \cdot \frac{e^{-\lambda t} (\lambda t)^{j_1 + j_2 + \cdots + j_k}}{(j_1 + j_2 + \cdots + j_k)!} = \frac{(p_1 \lambda t)^{j_1} e^{-p_1 \lambda t}}{j_1!} \cdot \frac{(p_2 \lambda t)^{j_2} e^{-p_2 \lambda t}}{j_2!} \cdots \frac{(p_k \lambda t)^{j_k} e^{-p_k \lambda t}}{j_k!},$$

which shows that X1, X2, . . . , Xk are independent Poisson random variables with parameters p1λt, p2λt, . . . , pkλt, respectively.
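To illustrate the result, the following editorial sketch simulates the shock process with the arbitrary choices λ = 2, t = 3, and (p1, p2, p3) = (0.5, 0.3, 0.2), and checks that each Xi has mean piλt, as a Poisson random variable with that parameter must:

```python
import random

lam, t, trials = 2.0, 3.0, 100_000
p = [0.5, 0.3, 0.2]              # assumed probabilities for k = 3 shock types

totals = [0, 0, 0]
for _ in range(trials):
    # sample N(t), the Poisson(lam*t) number of shocks, via interarrival times
    n, clock = 0, random.expovariate(lam)
    while clock <= t:
        n += 1
        clock += random.expovariate(lam)
    # classify each of the n shocks independently as type 1, 2, or 3
    labels = random.choices([0, 1, 2], weights=p, k=n) if n else []
    for i in labels:
        totals[i] += 1

for i in range(3):
    print(f"type {i+1}: sample mean {totals[i]/trials:.3f}  vs  p_i*lam*t = {p[i]*lam*t:.3f}")
```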
Example 9f (Genetics) As we know, in humans, for blood type, there are three alleles A, B, and O. The alleles A and B are codominant to each other and dominant to O. A
man of genotype AB marries a woman of genotype BO. If they have six children, what is
the probability that three will have type B blood, two will have type A blood, and one will
have type AB blood?
Solution: The probability is 1/4 that the blood type of a child of this man and this woman
is AB. The probability is 1/4 that it is A, and the probability is 1/2 that it is B. The desired
probability is equal to
$$\frac{6!}{3!\,2!\,1!} \left(\frac{1}{2}\right)^3 \left(\frac{1}{4}\right)^2 \left(\frac{1}{4}\right)^1 = \frac{15}{128} \approx 0.117.$$
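The same number can be obtained from a small helper function; multinomial_pmf below is an editorial illustration, not a library routine:

```python
from math import factorial

def multinomial_pmf(counts, probs):
    """Probability of observing the given counts in sum(counts) independent trials."""
    n = sum(counts)
    coef = factorial(n)
    for k in counts:
        coef //= factorial(k)
    prob = float(coef)
    for k, p in zip(counts, probs):
        prob *= p ** k
    return prob

# three children of type B, two of type A, one of type AB
print(multinomial_pmf([3, 2, 1], [1/2, 1/4, 1/4]))   # 15/128 = 0.1171875
```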
Solution: The probability of two AA's, two Aa's, and two aa's is

$$g(p) = \frac{6!}{2!\,2!\,2!}\, (p^2)^2 \bigl(2p(1 - p)\bigr)^2 \bigl((1 - p)^2\bigr)^2 = 360\, p^6 (1 - p)^6.$$

To find the maximum of this function, set g′(p) = 2160 p^5 (1 − p)^5 (1 − 2p) = 0 to obtain p = 1/2.
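A coarse grid search (editorial sketch) confirms that the maximizer is p = 1/2:

```python
# g(p) = 360 p^6 (1 - p)^6; search a fine grid of p values for the maximum
def g(p):
    return 360 * p**6 * (1 - p) ** 6

value, argmax = max((g(i / 10_000), i / 10_000) for i in range(1, 10_000))
print(argmax, value)   # 0.5 and g(0.5) = 360 / 4096 ≈ 0.0879
```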
9B TRANSFORMATIONS OF n > 2 RANDOM VARIABLES
Theorem 8.8 of the book can be generalized in the following obvious way for functions of
more than two random variables.
(ii) the functions g1, g2, . . . , gn have continuous partial derivatives and the Jacobian of the transformation

$$\begin{cases}
y_1 = g_1(x_1, x_2, \ldots, x_n) \\
y_2 = g_2(x_1, x_2, \ldots, x_n) \\
\quad\vdots \\
y_n = g_n(x_1, x_2, \ldots, x_n)
\end{cases}$$

is nonzero at all points (x1, x2, . . . , xn).
In the example under consideration, the system

$$\begin{cases}
y_1 = x_1 + x_2 + x_3 \\
y_2 = x_1 - x_2 + x_3 \\
y_3 = x_1 - x_2 - x_3
\end{cases}$$

has the unique solution x1 = (y1 + y3)/2, x2 = (y1 − y2)/2, x3 = (y2 − y3)/2. Also

$$\frac{\partial(g_1, g_2, g_3)}{\partial(x_1, x_2, x_3)} = \begin{vmatrix} 1 & 1 & 1 \\ 1 & -1 & 1 \\ 1 & -1 & -1 \end{vmatrix} = 4 \neq 0.$$
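These computations are easy to reproduce symbolically; a minimal editorial sketch, assuming SymPy is available:

```python
import sympy as sp

x1, x2, x3 = sp.symbols("x1 x2 x3")
g = sp.Matrix([x1 + x2 + x3, x1 - x2 + x3, x1 - x2 - x3])

# Jacobian determinant of the transformation
J = g.jacobian([x1, x2, x3])
print(J.det())     # 4

# invert the system to recover x1, x2, x3 in terms of y1, y2, y3
y1, y2, y3 = sp.symbols("y1 y2 y3")
sol = sp.solve([sp.Eq(y1, g[0]), sp.Eq(y2, g[1]), sp.Eq(y3, g[2])], [x1, x2, x3])
print(sol)         # {x1: y1/2 + y3/2, x2: y1/2 - y2/2, x3: y2/2 - y3/2}
```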