
COMPANION FOR CHAPTER 9

OF

FUNDAMENTALS
OF PROBABILITY
WITH STOCHASTIC PROCESSES

FOURTH EDITION

SAEED GHAHRAMANI
Western New England University
Springfield, Massachusetts, USA

A CHAPMAN & HALL BOOK


Contents

9 Multivariate Distributions
9A Additional Examples
9B Transformations of n > 2 Random Variables

Chapter 9

Multivariate Distributions

9A ADDITIONAL EXAMPLES

Example 9a  Suppose that mine explosions in a certain country occur at times X1, X1 + X2, X1 + X2 + X3, ..., where the Xi's are independent, identically distributed random variables with probability density function

f(x) = \lambda^2 x e^{-\lambda x}, \quad x \ge 0.

Let N(t) be the total number of mine explosions in that country by time t. Find the probability mass function of N(t).

Solution: Note that for i ≥ 1, Xi is a gamma random variable with parameters 2 and λ. Hence Xi is the sum of two independent exponential random variables with parameter λ. Consider a Poisson process {N1(t) : t ≥ 0} with parameter λ. The times between consecutive events of this process are exponential with parameter λ, so each explosion corresponds to two events of this process. Thus N(t) = n if and only if N1(t) = 2n or 2n + 1. This observation yields

P\big(N(t) = n\big) = \frac{e^{-\lambda t} (\lambda t)^{2n}}{(2n)!} + \frac{e^{-\lambda t} (\lambda t)^{2n+1}}{(2n+1)!}, \quad n = 0, 1, 2, \ldots  ∎
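As a quick numerical check, the following is a minimal Monte Carlo sketch (assuming NumPy is available; the values λ = 1 and t = 5 are arbitrary choices) that simulates the explosion times and compares the empirical probability mass function of N(t) with the formula above.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(0)
lam, t, trials = 1.0, 5.0, 100_000

counts = np.empty(trials, dtype=int)
for k in range(trials):
    total, n = 0.0, 0
    while True:
        total += rng.gamma(shape=2, scale=1 / lam)  # X_i ~ gamma(2, lam)
        if total > t:
            break
        n += 1                                       # one more explosion by time t
    counts[k] = n

for n in range(5):
    theory = exp(-lam * t) * ((lam * t) ** (2 * n) / factorial(2 * n)
                              + (lam * t) ** (2 * n + 1) / factorial(2 * n + 1))
    print(n, f"empirical {np.mean(counts == n):.4f}", f"theory {theory:.4f}")
```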

Example 9b  Let X1, X2, ..., Xn be independent exponential random variables with parameters λ1, λ2, ..., λn, respectively. There are n! possible rank orderings of X1, X2, ..., Xn, one of them being X1 < X2 < X3 < ··· < Xn. Show that the rank ordering of {X1, X2, ..., Xn} is independent of min(X1, X2, ..., Xn). That is, if (i1, i2, ..., in) is a permutation of (1, 2, ..., n), then

P\big(X_{i_1} < X_{i_2} < \cdots < X_{i_n} \mid \min_{1 \le i \le n} X_i > t\big) = P\big(X_{i_1} < X_{i_2} < \cdots < X_{i_n}\big).

Solution: Given that min_{1≤i≤n} Xi > t, each Xi is greater than t. By the memoryless property of exponential random variables, after t units of time, X_{i1} − t, X_{i2} − t, ..., X_{in} − t have the same distributions as X_{i1}, X_{i2}, ..., X_{in}, respectively. So

P\big(X_{i_1} < \cdots < X_{i_n} \mid \min_{1 \le i \le n} X_i > t\big) = P\big(X_{i_1} - t < X_{i_2} - t < \cdots < X_{i_n} - t\big)
                                                                       = P\big(X_{i_1} < X_{i_2} < \cdots < X_{i_n}\big).  ∎
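This independence is easy to see in simulation. The sketch below (assuming NumPy; the rates 1, 2, 3 and the cutoff t = 0.5 are arbitrary choices) estimates the probability of the ordering X1 < X2 < X3 both unconditionally and conditioned on min Xi > t; the two estimates should agree up to sampling noise.

```python
import numpy as np

rng = np.random.default_rng(1)
rates = np.array([1.0, 2.0, 3.0])        # lambda_1, lambda_2, lambda_3
t, trials = 0.5, 200_000

x = rng.exponential(scale=1 / rates, size=(trials, 3))
ordered = (x[:, 0] < x[:, 1]) & (x[:, 1] < x[:, 2])   # event X1 < X2 < X3
cond = x.min(axis=1) > t                              # event min X_i > t

print("P(X1<X2<X3)           ~", round(ordered.mean(), 4))
print("P(X1<X2<X3 | min > t) ~", round(ordered[cond].mean(), 4))
```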



Example 9c  At an institution, the registration process consists of three steps. First, students discuss their options for courses with their faculty advisors. Second, students check in at the Administration Services to receive clearance on their financial and health status. Third, they meet with a registrar who inputs their courses into the system. Suppose that the amounts of time spent at each step are independent exponential random variables with parameters λ1, λ2, and λ3, respectively. Find the probability density function of the total time a student spends on the entire registration process if

(a) λi ≠ λj for i ≠ j;
(b) λ1 = λ2 = λ3 = λ.

Solution: Let X1, X2, and X3 be the amounts of time the student spends at steps 1, 2, and 3, respectively. We are given that X1, X2, and X3 are independent exponential random variables with parameters λ1, λ2, and λ3, respectively. Let f1, f2, and f3 be the probability density functions of X1, X2, and X3, respectively. We will now find the probability density function of the desired quantity, X1 + X2 + X3, in cases (a) and (b).

(a) Using the convolution theorem, we first find g, the probability density function of X1 + X2. We have

g(t) = \int_0^\infty f_1(x) f_2(t - x)\, dx = \int_0^t \lambda_1 e^{-\lambda_1 x} \lambda_2 e^{-\lambda_2 (t - x)}\, dx
     = \lambda_1 \lambda_2 e^{-\lambda_2 t} \int_0^t e^{-(\lambda_1 - \lambda_2) x}\, dx
     = \lambda_1 \lambda_2 e^{-\lambda_2 t} \cdot \frac{1}{\lambda_1 - \lambda_2} \big(1 - e^{-(\lambda_1 - \lambda_2) t}\big)
     = \frac{\lambda_1}{\lambda_1 - \lambda_2}\, \lambda_2 e^{-\lambda_2 t} + \frac{\lambda_2}{\lambda_2 - \lambda_1}\, \lambda_1 e^{-\lambda_1 t}.

Let h be the probability density function of X1 + X2 + X3; applying the convolution theorem again, straightforward calculations give

h(t) = \int_0^\infty g(x) f_3(t - x)\, dx
     = \frac{\lambda_2 \lambda_3}{(\lambda_2 - \lambda_1)(\lambda_3 - \lambda_1)}\, \lambda_1 e^{-\lambda_1 t}
     + \frac{\lambda_1 \lambda_3}{(\lambda_1 - \lambda_2)(\lambda_3 - \lambda_2)}\, \lambda_2 e^{-\lambda_2 t}
     + \frac{\lambda_1 \lambda_2}{(\lambda_1 - \lambda_3)(\lambda_2 - \lambda_3)}\, \lambda_3 e^{-\lambda_3 t}.
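The formula for h can be checked numerically. Below is a small sketch (assuming NumPy; the rates 1, 2, 3 are arbitrary distinct values) that compares h(t) against a crude density estimate from simulated sums of three independent exponentials.

```python
import numpy as np

rng = np.random.default_rng(2)
l1, l2, l3 = 1.0, 2.0, 3.0
n = 500_000
s = (rng.exponential(1 / l1, n) + rng.exponential(1 / l2, n)
     + rng.exponential(1 / l3, n))                      # X1 + X2 + X3

def h(t):
    return (l2 * l3 / ((l2 - l1) * (l3 - l1)) * l1 * np.exp(-l1 * t)
            + l1 * l3 / ((l1 - l2) * (l3 - l2)) * l2 * np.exp(-l2 * t)
            + l1 * l2 / ((l1 - l3) * (l2 - l3)) * l3 * np.exp(-l3 * t))

for t in (0.5, 1.0, 2.0):
    emp = np.mean(np.abs(s - t) < 0.01) / 0.02          # density estimate near t
    print(t, f"empirical {emp:.3f}", f"h(t) {float(h(t)):.3f}")
```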
(b) In this case also, by the convolution theorem, g, the probability density function of X1 + X2, is given by

g(t) = \int_{-\infty}^{\infty} f_1(x) f_2(t - x)\, dx = \int_0^t \lambda e^{-\lambda x} \cdot \lambda e^{-\lambda (t - x)}\, dx = \lambda^2 t e^{-\lambda t}.

This is the density function of a gamma random variable with parameters 2 and λ. Hence X1 + X2 is gamma with parameters 2 and λ. Let h be the probability density function of X1 + X2 + X3; applying the convolution theorem again, straightforward calculations give

h(t) = \int_0^\infty g(x) f_3(t - x)\, dx = \frac{1}{2} \lambda^3 t^2 e^{-\lambda t}.

Hence if λ1 = λ2 = λ3 = λ, then X1 + X2 + X3 is gamma with parameters 3 and λ.

Remark  Let X1, X2, ..., Xn be n independent exponential random variables with parameters λ1, λ2, ..., λn, respectively. Using induction and the result of this example, we can show that

(a) if λi ≠ λj for i ≠ j, then f, the probability density function of X1 + X2 + ··· + Xn, is given by

f(t) = \sum_{i=1}^{n} \Big( \prod_{j \ne i} \frac{\lambda_j}{\lambda_j - \lambda_i} \Big) \lambda_i e^{-\lambda_i t};

(b) if λi = λ for 1 ≤ i ≤ n, then X1 + X2 + ··· + Xn is gamma with parameters n and λ.  ∎
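The general formula in part (a) of the Remark can also be spot-checked by simulation. A minimal sketch (assuming NumPy; the four distinct rates below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(3)
rates = np.array([0.5, 1.0, 2.0, 4.0])   # distinct lambda_i, here n = 4
s = rng.exponential(1 / rates, size=(500_000, 4)).sum(axis=1)

def f(t):
    total = 0.0
    for i, li in enumerate(rates):
        # coefficient: product over j != i of lambda_j / (lambda_j - lambda_i)
        coef = np.prod([lj / (lj - li) for j, lj in enumerate(rates) if j != i])
        total += coef * li * np.exp(-li * t)
    return total

for t in (0.5, 1.5, 3.0):
    emp = np.mean(np.abs(s - t) < 0.02) / 0.04          # density estimate near t
    print(t, f"empirical {emp:.3f}", f"f(t) {f(t):.3f}")
```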

Example 9d Michelle and Fred are shopping at a mall. Michelle decides to buy a leather
coat and Fred, knowing that it will take her a while, goes to buy an ice cream. At the
ice cream parlor, there is only one employee behind the counter. When Fred arrives, there
is one customer currently being waited on and 6 more are in line. Suppose that the time that
it takes to buy an ice cream is exponentially distributed with mean 3 minutes. Furthermore,
the time it takes to choose and buy a leather coat is also exponentially distributed, but with
mean 28 minutes. Also suppose that Michelle is the only customer in the coat shop, and,
lastly, it takes 2 minutes to travel from the coat shop to the ice cream parlor. What is the
probability that Fred will return before Michelle has finished buying her coat?

Solution: When Fred enters the ice cream parlor, there are 7 customers ahead of him. Call the customer who is currently being waited on Customer 1, and the 6 customers in line Customers 2 to 7, respectively; Fred is Customer 8. Let X1 be the length of the period starting when Fred enters the ice cream parlor and ending when Customer 1 leaves. By the memoryless property of exponential random variables, X1 is exponential with mean 3 minutes. For 2 ≤ i ≤ 8, let Xi be the time that it takes for Customer i to buy an ice cream; Xi is exponential with mean 3 minutes. Finally, let Y be the time until Michelle buys her leather coat. It is reasonable to assume that {Y, X1, X2, ..., X8} is a set of independent random variables. Since Fred spends 2 minutes walking in each direction, he returns 4 + X1 + X2 + ··· + X8 minutes after leaving Michelle. To find the desired probability,

P(4 + X1 + X2 + ··· + X8 < Y) = P(X1 + X2 + ··· + X8 < Y − 4),

we will successively calculate

P(X1 < Y − 4),
P(X1 + X2 < Y − 4),
P(X1 + X2 + X3 < Y − 4),
⋮
P(X1 + X2 + ··· + X8 < Y − 4).
To do so, note that since the random variables X1 and Y are independent, f(x, y), the joint probability density function of these two random variables, is given by

f(x, y) = \frac{1}{3} e^{-x/3} \cdot \frac{1}{28} e^{-y/28}, \quad x \ge 0, \ y \ge 0.

Therefore,

P(X_1 < Y - 4) = \int_0^\infty \int_{x+4}^\infty \frac{1}{3} e^{-x/3} \cdot \frac{1}{28} e^{-y/28}\, dy\, dx
               = \int_0^\infty \frac{1}{3} e^{-x/3} \Big( \int_{x+4}^\infty \frac{1}{28} e^{-y/28}\, dy \Big) dx
               = \int_0^\infty \frac{1}{3} e^{-x/3} \cdot e^{-(x+4)/28}\, dx = \frac{1}{3} e^{-4/28} \int_0^\infty e^{-31x/84}\, dx
               = \frac{84}{93} e^{-4/28} \approx 0.783.
Thus P(X1 < Y − 4) = P(2 + X1 < Y − 2); that is, the probability that Customer 1 leaves at least 2 minutes before Michelle has bought her leather coat is 0.783. Next, we calculate P(X1 + X2 < Y − 4), the probability that Customer 2 also leaves at least 2 minutes before Michelle is done. To do so, note that by the memoryless property of the exponential, the period starting when Customer 1 leaves and ending when Michelle is done is still exponentially distributed with mean 28 minutes. Therefore, repeating the same argument yields

P(X1 + X2 < Y − 4 | X1 < Y − 4) = 0.783.
Thus

P(X1 + X2 < Y − 4) = P(X1 + X2 < Y − 4 | X1 < Y − 4) P(X1 < Y − 4) = (0.783)².

Similarly,

P(X1 + X2 + X3 < Y − 4 | X1 + X2 < Y − 4) = 0.783.

Hence

P(X1 + X2 + X3 < Y − 4) = P(X1 + X2 + X3 < Y − 4 | X1 + X2 < Y − 4) P(X1 + X2 < Y − 4) = (0.783)(0.783)² = (0.783)³.

Continuing this argument, we find that the probability that Fred (Customer 8) leaves the ice cream parlor at least 2 minutes before Michelle has bought her leather coat is

P(X1 + X2 + ··· + X8 < Y − 4) = (0.783)⁸ ≈ 0.141.  ∎
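A direct simulation confirms this answer. The sketch below (assuming NumPy) draws the eight service times and Michelle's shopping time and estimates the probability that Fred is back in time.

```python
import numpy as np

rng = np.random.default_rng(4)
trials = 1_000_000

service = rng.exponential(scale=3.0, size=(trials, 8)).sum(axis=1)  # X1+...+X8
coat = rng.exponential(scale=28.0, size=trials)                     # Y

print("simulated:", np.mean(4.0 + service < coat))   # theory: 0.783**8 ~ 0.141
```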

Example 9e  Suppose that shocks occur to a system according to a Poisson process with parameter λ. Furthermore, suppose that there are k types of shocks and that, independently of other shocks, the probability that a shock is of type i, 1 ≤ i ≤ k, is pi. For 1 ≤ i ≤ k, let Xi be the number of shocks of type i occurring between 0 and t. Find the joint probability mass function of X1, X2, ..., Xk.

Solution: Let j1, j2, ..., jk be non-negative integers. For 1 ≤ i ≤ k, let qi = P(Xi = ji). We will show that

P(X1 = j1, X2 = j2, ..., Xk = jk) = q1 q2 ··· qk;

that is, X1, X2, ..., Xk are independent random variables.

To find qi = P(Xi = ji), let N(t) be the number of shocks occurring to the system between 0 and t. We have

q_i = P(X_i = j_i) = \sum_{n=0}^{\infty} P\big(X_i = j_i \mid N(t) = n\big) P\big(N(t) = n\big)
    = \sum_{n=j_i}^{\infty} \binom{n}{j_i} p_i^{j_i} (1 - p_i)^{n - j_i} \cdot \frac{e^{-\lambda t} (\lambda t)^n}{n!}
    = \sum_{n=j_i}^{\infty} \frac{p_i^{j_i} e^{-\lambda t}}{j_i!} \cdot \frac{n!}{(n - j_i)!} (1 - p_i)^{n - j_i} \cdot \frac{(\lambda t)^{n - j_i} (\lambda t)^{j_i}}{n!}
    = \frac{(p_i \lambda t)^{j_i} e^{-\lambda t}}{j_i!} \sum_{n=j_i}^{\infty} \frac{\big[\lambda t (1 - p_i)\big]^{n - j_i}}{(n - j_i)!}
    = \frac{(p_i \lambda t)^{j_i} e^{-\lambda t}}{j_i!} \cdot e^{\lambda t (1 - p_i)} = \frac{(p_i \lambda t)^{j_i} e^{-p_i \lambda t}}{j_i!}.
To find the joint probability mass function of X1, X2, ..., Xk, let n = j1 + j2 + ··· + jk and condition on N(t). Given N(t) = n, the vector of type counts is multinomial with parameters n and (p1, p2, ..., pk), so

P(X_1 = j_1, X_2 = j_2, \ldots, X_k = j_k) = \frac{n!}{j_1! \, j_2! \cdots j_k!} \, p_1^{j_1} p_2^{j_2} \cdots p_k^{j_k} \cdot \frac{e^{-\lambda t} (\lambda t)^n}{n!}
                                           = \prod_{i=1}^{k} \frac{(p_i \lambda t)^{j_i} e^{-p_i \lambda t}}{j_i!} = q_1 q_2 \cdots q_k,

where the second equality uses (λt)^n = \prod_{i=1}^{k} (λt)^{j_i} and, since p1 + p2 + ··· + pk = 1, e^{-λt} = \prod_{i=1}^{k} e^{-p_i λ t}. Hence X1, X2, ..., Xk are independent Poisson random variables with parameters p1λt, p2λt, ..., pkλt.  ∎
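This thinning result is easy to test by simulation. Below is a minimal sketch (assuming NumPy; λ = 2, t = 1.5, and the type probabilities are arbitrary choices) that compares the empirical joint pmf at one point with the product formula.

```python
import numpy as np
from math import exp, factorial

rng = np.random.default_rng(5)
lam, t = 2.0, 1.5
p = np.array([0.5, 0.3, 0.2])                 # type probabilities, k = 3
trials = 200_000

n = rng.poisson(lam * t, size=trials)         # total shocks N(t)
counts = np.array([rng.multinomial(m, p) for m in n])   # per-type counts

target = (1, 2, 0)                            # the point (j1, j2, j3)
emp = np.mean(np.all(counts == target, axis=1))
theory = np.prod([(pi * lam * t) ** j * exp(-pi * lam * t) / factorial(j)
                  for pi, j in zip(p, target)])
print(f"empirical {emp:.5f}", f"theory {theory:.5f}")
```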


Section 9A Additional Examples 8

Example 9f  (Genetics)  In humans, there are three alleles for blood type: A, B, and O. The alleles A and B are codominant to each other and dominant to O. A man of genotype AB marries a woman of genotype BO. If they have six children, what is the probability that three will have type B blood, two will have type A blood, and one will have type AB blood?

Solution: A child of this couple inherits A or B from the father and B or O from the mother, each with probability 1/2, so the child's genotype is AB, AO, BB, or BO, each with probability 1/4. Hence the probability is 1/4 that a child's blood type is AB, 1/4 that it is A, and 1/2 that it is B. By the multinomial formula, the desired probability is equal to

\frac{6!}{3! \, 2! \, 1!} \Big(\frac{1}{2}\Big)^3 \Big(\frac{1}{4}\Big)^2 \Big(\frac{1}{4}\Big)^1 = \frac{15}{128} \approx 0.117.  ∎
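The arithmetic is quick to verify (a minimal check, assuming only the Python standard library):

```python
from math import factorial

coef = factorial(6) // (factorial(3) * factorial(2) * factorial(1))   # 60
prob = coef * (1 / 2) ** 3 * (1 / 4) ** 2 * (1 / 4) ** 1
print(prob, 15 / 128)   # both print 0.1171875
```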

Example 9g  (Genetics)  Let p and q be positive numbers with p + q = 1. For a gene with dominant allele A and recessive allele a, let p², 2pq, and q² be the probabilities that a randomly selected person from a population has genotype AA, Aa, and aa, respectively. A group of six members of the population is selected randomly. Determine the value of p that maximizes the probability of the event of obtaining two AA's, two Aa's, and two aa's.

Solution: The probability of two AA's, two Aa's, and two aa's is

g(p) = \frac{6!}{2! \, 2! \, 2!} (p^2)^2 \big(2p(1-p)\big)^2 \big((1-p)^2\big)^2 = 360 \, p^6 (1-p)^6.

Since g'(p) = 2160 \, p^5 (1-p)^5 (1 - 2p), setting g'(p) = 0 for 0 < p < 1 gives p = 1/2, which maximizes g.  ∎
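A brief numerical check of the maximizer (assuming NumPy):

```python
import numpy as np

p = np.linspace(0.001, 0.999, 9_999)
g = 360 * p**6 * (1 - p) ** 6
print("argmax of g ~", p[np.argmax(g)])   # expected: 0.5
```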

9B TRANSFORMATIONS OF n>2 RANDOM VARIABLES

Theorem 8.8 of the book can be generalized in the following obvious way for functions of
more than two random variables.

Theorem 9a  Let X1, X2, ..., Xn be n random variables with the joint probability density function f(x1, x2, ..., xn). Let g1, g2, ..., gn be n real-valued functions of n variables, and let Y1 = g1(X1, X2, ..., Xn), Y2 = g2(X1, X2, ..., Xn), ..., Yn = gn(X1, X2, ..., Xn). If

(i) the system of n equations in n unknowns

g1(x1, x2, ..., xn) = y1
g2(x1, x2, ..., xn) = y2
⋮
gn(x1, x2, ..., xn) = yn    (1)

has a unique solution for x1, x2, ..., xn in terms of y1, y2, ..., yn, and

(ii) the functions g1, g2, ..., gn have continuous partial derivatives and the Jacobian of the transformation yi = gi(x1, x2, ..., xn), 1 ≤ i ≤ n, is nonzero at all points (x1, x2, ..., xn); that is, the following n × n determinant is nonzero:

\frac{\partial(g_1, g_2, \ldots, g_n)}{\partial(x_1, x_2, \ldots, x_n)} =
\begin{vmatrix}
\partial g_1 / \partial x_1 & \partial g_1 / \partial x_2 & \cdots & \partial g_1 / \partial x_n \\
\partial g_2 / \partial x_1 & \partial g_2 / \partial x_2 & \cdots & \partial g_2 / \partial x_n \\
\vdots & \vdots & & \vdots \\
\partial g_n / \partial x_1 & \partial g_n / \partial x_2 & \cdots & \partial g_n / \partial x_n
\end{vmatrix} \ne 0,

then the random variables Y1, Y2, ..., Yn are jointly continuous with the joint probability density function h(y1, y2, ..., yn) given by

h(y_1, y_2, \ldots, y_n) = f(x_1, x_2, \ldots, x_n) \left| \frac{\partial(g_1, g_2, \ldots, g_n)}{\partial(x_1, x_2, \ldots, x_n)} \right|^{-1},

where, in this formula, for given (y1, y2, ..., yn), (x1, x2, ..., xn) is the unique solution of the system (1).

Example 9h  Let f(x1, x2, x3) be the joint probability density function of X1, X2, and X3. Find the joint probability density function of Y1 = X1 + X2 + X3, Y2 = X1 − X2 + X3, and Y3 = X1 − X2 − X3.

Solution: Let g1(x1, x2, x3) = x1 + x2 + x3, g2(x1, x2, x3) = x1 − x2 + x3, and g3(x1, x2, x3) = x1 − x2 − x3. Then the system of equations

x1 + x2 + x3 = y1
x1 − x2 + x3 = y2
x1 − x2 − x3 = y3

has the unique solution x1 = (y1 + y3)/2, x2 = (y1 − y2)/2, x3 = (y2 − y3)/2. Also

\frac{\partial(g_1, g_2, g_3)}{\partial(x_1, x_2, x_3)} =
\begin{vmatrix} 1 & 1 & 1 \\ 1 & -1 & 1 \\ 1 & -1 & -1 \end{vmatrix} = 4 \ne 0.

Hence, by Theorem 9a, h(y1, y2, y3), the joint probability density function of Y1, Y2, Y3, is given by

h(y_1, y_2, y_3) = \frac{1}{4} f(x_1, x_2, x_3) = \frac{1}{4} f\Big( \frac{y_1 + y_3}{2}, \frac{y_1 - y_2}{2}, \frac{y_2 - y_3}{2} \Big).  ∎
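The algebra here can be double-checked symbolically. A minimal sketch (assuming SymPy is available):

```python
import sympy as sp

x1, x2, x3, y1, y2, y3 = sp.symbols('x1 x2 x3 y1 y2 y3')
g = [x1 + x2 + x3, x1 - x2 + x3, x1 - x2 - x3]

# Jacobian determinant of the transformation
print(sp.Matrix(g).jacobian([x1, x2, x3]).det())   # 4, matching the text

# unique solution of the system g_i(x1, x2, x3) = y_i
sol = sp.solve([gi - yi for gi, yi in zip(g, (y1, y2, y3))], [x1, x2, x3])
print(sol)   # {x1: y1/2 + y3/2, x2: y1/2 - y2/2, x3: y2/2 - y3/2}
```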
