The document discusses joint probability mass functions and density functions for multiple random variables. It provides examples of: 1) the joint probability mass function for the values obtained from rolling two dice, with the variables being the maximum value and the sum; 2) the joint probability mass function for selecting balls from an urn without replacement, with examples for two and three balls; 3) verifying that a given function is a valid joint probability density function for two continuous random variables X and Y. It also provides examples of calculating marginal densities and probabilities from the joint density.

Chapter #6 – Jointly Distributed Random Variables

Question #1: Two fair dice are rolled. Find the joint probability mass function of X and Y
when (a) X is the largest value obtained on any die and Y is the sum of the values; (b) X is
the value on the first die and Y is the larger of the two values; (c) X is the smallest and Y is
the largest value obtained on the dice. (The answers are based on the following table. Also
note that only the solution to part (a) is presented as parts (b) and (c) are solved similarly.)

Sum of two dice

                  First die
                  1    2    3    4    5    6
Second die   1    2    3    4    5    6    7
             2    3    4    5    6    7    8
             3    4    5    6    7    8    9
             4    5    6    7    8    9   10
             5    6    7    8    9   10   11
             6    7    8    9   10   11   12

a) If X is the maximum value shown on either die and Y is the sum of the dice, then the joint
probability mass function p_XY(x, y) = P(X = x, Y = y) is given in the following table.

          X = 1   X = 2   X = 3   X = 4   X = 5   X = 6   p_Y(y)
Y = 1     0       0       0       0       0       0       0
Y = 2     1/36    0       0       0       0       0       1/36
Y = 3     0       2/36    0       0       0       0       2/36
Y = 4     0       1/36    2/36    0       0       0       3/36
Y = 5     0       0       2/36    2/36    0       0       4/36
Y = 6     0       0       1/36    2/36    2/36    0       5/36
Y = 7     0       0       0       2/36    2/36    2/36    6/36
Y = 8     0       0       0       1/36    2/36    2/36    5/36
Y = 9     0       0       0       0       2/36    2/36    4/36
Y = 10    0       0       0       0       1/36    2/36    3/36
Y = 11    0       0       0       0       0       2/36    2/36
Y = 12    0       0       0       0       0       1/36    1/36
p_X(x)    1/36    3/36    5/36    7/36    9/36    11/36

Note that the marginal probability mass function for X is given by summing the
columns while the marginal mass function for Y is given by summing the rows.
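As a quick sanity check, the joint PMF in part (a) can be reproduced by enumerating all 36 equally likely outcomes (a minimal Python sketch; the dictionary-based representation is just one convenient choice):

```python
from fractions import Fraction
from collections import defaultdict

# Enumerate all 36 equally likely rolls and tabulate (X, Y) = (max, sum)
pmf = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        pmf[(max(d1, d2), d1 + d2)] += Fraction(1, 36)

# Marginals: sum over the other coordinate
p_X = {x: sum(v for (a, _), v in pmf.items() if a == x) for x in range(1, 7)}
p_Y = {y: sum(v for (_, b), v in pmf.items() if b == y) for y in range(2, 13)}
```

For example, pmf[(3, 4)] evaluates to 2/36 and p_X[6] to 11/36, matching the table above.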
Question #2: Suppose that 3 balls are chosen without replacement from an urn consisting
of 5 white and 8 red balls (13 total). Let X_i equal 1 if the ith ball selected is white, and let it
equal 0 otherwise. Give the joint probability mass function of (a) X_1, X_2; (b) X_1, X_2, X_3.
a) Both X_1 and X_2 take values in {0, 1}. We then calculate
p(0, 0) = (8·7)/(13·12) = 14/39, p(1, 0) = (5·8)/(13·12) = 10/39,
p(0, 1) = (8·5)/(13·12) = 10/39, p(1, 1) = (5·4)/(13·12) = 5/39.

b) Each of X_1, X_2, X_3 takes values in {0, 1}. We can therefore calculate
p(0, 0, 0) = (8·7·6)/(13·12·11), p(1, 0, 0) = (5·8·7)/(13·12·11),
p(0, 1, 0) = (8·5·7)/(13·12·11), p(0, 0, 1) = (8·7·5)/(13·12·11),
p(1, 1, 0) = (5·4·8)/(13·12·11), p(1, 0, 1) = (5·8·4)/(13·12·11),
p(0, 1, 1) = (8·5·4)/(13·12·11), p(1, 1, 1) = (5·4·3)/(13·12·11).
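These products can be checked by enumerating ordered draws directly (a small Python sketch using exact fractions; itertools.permutations over the 13 labeled balls is one way to enumerate draws without replacement):

```python
from fractions import Fraction
from itertools import permutations
from collections import Counter

balls = [1] * 5 + [0] * 8             # 1 = white, 0 = red
draws = list(permutations(balls, 3))  # all 13*12*11 ordered draws without replacement
pmf = Counter(draws)
total = len(draws)

# e.g. P(X1 = 0, X2 = 0): sum over the third coordinate
p00 = Fraction(sum(v for k, v in pmf.items() if k[:2] == (0, 0)), total)
```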

Question #9: The joint probability density function of X and Y is given by
f(x, y) = (6/7)(x² + xy/2) where x ∈ (0, 1) and y ∈ (0, 2). (a) Verify that this is indeed a
joint density function. (b) Compute the marginal density functions of X and Y. (c) Compute
P(X > Y). (d) Find the probability P(Y > 1/2 | X < 1/2). (e) Find E(X). (f) Find E(Y).

a) We must verify that the density integrates to one:
1 = ∫₀¹ ∫₀² (6/7)(x² + xy/2) dy dx = (6/7) ∫₀¹ [x²y + xy²/4]₀² dx = (6/7) ∫₀¹ (2x² + x) dx
  = (6/7) [2x³/3 + x²/2]₀¹ = (6/7)(2/3 + 1/2) = (6/7)(7/6) = 1.

b) For the marginal density of X we integrate out y:
f_X(x) = ∫₀² (6/7)(x² + xy/2) dy = (6/7) [x²y + xy²/4]₀² = (6/7)(2x² + x), x ∈ (0, 1).
Similarly,
f_Y(y) = ∫₀¹ (6/7)(x² + xy/2) dx = (6/7) [x³/3 + x²y/4]₀¹ = (6/7)(1/3 + y/4), y ∈ (0, 2).
c) We calculate
P(X > Y) = ∫₀¹ ∫₀ˣ (6/7)(x² + xy/2) dy dx = (6/7) ∫₀¹ [x²y + xy²/4]₀ˣ dx
  = (6/7) ∫₀¹ (x³ + x³/4) dx = (6/7)(5/4) ∫₀¹ x³ dx = (6/7)(5/4)(1/4) = 15/56.

d) We have
P(Y > 1/2 | X < 1/2) = P(Y > 1/2, X < 1/2) / P(X < 1/2)
  = [∫_{1/2}² ∫₀^{1/2} (6/7)(x² + xy/2) dx dy] / [∫₀^{1/2} (6/7)(2x² + x) dx]
  = (69/448) / (5/28) = 69/80.

e) Since f_X(x) = (6/7)(2x² + x) for x ∈ (0, 1), we have
E(X) = ∫₀¹ x f_X(x) dx = (6/7) ∫₀¹ (2x³ + x²) dx = (6/7) [x⁴/2 + x³/3]₀¹
  = (6/7)(1/2 + 1/3) = (6/7)(5/6) = 5/7.

f) Since f_Y(y) = (6/7)(1/3 + y/4) for y ∈ (0, 2), we have
E(Y) = ∫₀² y f_Y(y) dy = (6/7) ∫₀² (y/3 + y²/4) dy = (6/7) [y²/6 + y³/12]₀²
  = (6/7)(2/3 + 2/3) = (6/7)(4/3) = 8/7.
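All six answers can be checked numerically with a midpoint-rule approximation of the double integrals (a sketch; the grid size n is an arbitrary choice):

```python
import numpy as np

# Midpoint-rule evaluation of the Question #9 integrals on a grid
n = 2000
x = (np.arange(n) + 0.5) / n           # midpoints covering (0, 1)
y = 2 * (np.arange(n) + 0.5) / n       # midpoints covering (0, 2)
X, Y = np.meshgrid(x, y, indexing="ij")
f = (6 / 7) * (X**2 + X * Y / 2)
dA = (1 / n) * (2 / n)                 # area of one grid cell

total = f.sum() * dA                                        # part (a): ~1
p_x_gt_y = f[X > Y].sum() * dA                              # part (c): ~15/56
cond = f[(Y > 0.5) & (X < 0.5)].sum() / f[X < 0.5].sum()    # part (d): ~69/80
e_x = (X * f).sum() * dA                                    # part (e): ~5/7
e_y = (Y * f).sum() * dA                                    # part (f): ~8/7
```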
Question #13: A man and a woman agree to meet at a certain location at about 12:30 PM.
(a) If the man arrives at a time uniformly distributed between 12:15 and 12:45, and if the
woman independently arrives at a time uniformly distributed between 12:00 and 1 PM,
find the probability that the first to arrive waits no longer than 5 minutes. (b) What is the
probability that the man arrives first?

a) If we let X denote the arrival time of the man (in minutes after 12:00), then
X ~ UNIF(15, 45) with density f_X(x) = 1/30 for x ∈ (15, 45) and zero otherwise. Similarly,
if we let Y denote the arrival time of the woman, then Y ~ UNIF(0, 60) with density
f_Y(y) = 1/60 for y ∈ (0, 60) and zero otherwise. Since X and Y are independent, the joint
density is f_XY(x, y) = f_X(x) f_Y(y) = (1/30)(1/60) = 1/1800 whenever
(x, y) ∈ [15, 45] × [0, 60] and zero otherwise. Therefore, the probability that the first to
arrive waits no longer than 5 minutes is
P(|X − Y| ≤ 5) = P(X − 5 ≤ Y ≤ X + 5) = ∫₁₅⁴⁵ ∫_{x−5}^{x+5} (1/1800) dy dx = (30·10)/1800 = 1/6,
where we use the fact that for every x ∈ (15, 45) the interval (x − 5, x + 5) lies entirely
inside (0, 60).
b) The probability that the man arrives first is
P(X < Y) = ∫₁₅⁴⁵ ∫ₓ⁶⁰ (1/1800) dy dx = (1/1800) ∫₁₅⁴⁵ (60 − x) dx
  = (1/1800) [60x − x²/2]₁₅⁴⁵ = 1/2.
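Both answers are easy to confirm by simulation (a Monte Carlo sketch; the sample size and seed are arbitrary):

```python
import numpy as np

# Simulate the two independent arrival times, in minutes after 12:00
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.uniform(15, 45, n)   # man: uniform on 12:15-12:45
y = rng.uniform(0, 60, n)    # woman: uniform on 12:00-1:00

p_wait = np.mean(np.abs(x - y) <= 5)   # part (a): ~1/6
p_man_first = np.mean(x < y)           # part (b): ~1/2
```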

Question #17: Three points X 1 , X 2 and X 3 are selected at random on a line L. What is the
probability that X 2 lies between X 1 and X 3 ?
 The probability is 1/3 since each of the three points is equally likely to be the middle one.
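The symmetry argument can be spot-checked by simulation (a sketch; taking the points uniform on (0, 1) loses no generality because only their relative order matters):

```python
import numpy as np

# Sample three independent uniform points and count how often X2 is in the middle
rng = np.random.default_rng(0)
n = 1_000_000
pts = rng.random((n, 3))
lo = np.minimum(pts[:, 0], pts[:, 2])
hi = np.maximum(pts[:, 0], pts[:, 2])
p_middle = np.mean((pts[:, 1] > lo) & (pts[:, 1] < hi))   # ~1/3
```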

Question #21: Let the joint density function be f(x, y) = 24xy where (x, y) ∈ [0, 1] × [0, 1]
with 0 ≤ x + y ≤ 1, and f(x, y) = 0 otherwise. (a) Show that f(x, y) is a joint probability
density function. (b) Find E(X). (c) Find E(Y).
a) We must verify that the density integrates to one:
1 = ∫₀¹ ∫₀^{1−x} 24xy dy dx = ∫₀¹ [12xy²]₀^{1−x} dx = ∫₀¹ 12x(1 − x)² dx
  = ∫₀¹ (12x − 24x² + 12x³) dx = [6x² − 8x³ + 3x⁴]₀¹ = 6 − 8 + 3 = 1.

b) We first find the marginal density of X:
f_X(x) = ∫₀^{1−x} 24xy dy = [12xy²]₀^{1−x} = 12x(1 − x)² = 12x − 24x² + 12x³, x ∈ (0, 1). Then
E(X) = ∫₀¹ x f_X(x) dx = ∫₀¹ (12x² − 24x³ + 12x⁴) dx = [4x³ − 6x⁴ + (12/5)x⁵]₀¹
  = 4 − 6 + 12/5 = 2/5.

c) By the same reasoning (the density is symmetric in x and y), E(Y) = 2/5.
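A midpoint-rule check over the triangular support (a sketch; the grid resolution is arbitrary):

```python
import numpy as np

# Evaluate f(x, y) = 24xy on a grid, zeroing it outside the triangle x + y <= 1
n = 2000
x = (np.arange(n) + 0.5) / n
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.where(X + Y <= 1, 24 * X * Y, 0.0)
dA = (1 / n) ** 2

total = f.sum() * dA       # part (a): ~1
e_x = (X * f).sum() * dA   # part (b): ~2/5
e_y = (Y * f).sum() * dA   # part (c): ~2/5
```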
Question #29: The gross weekly sales at a certain restaurant is a normal random variable
with mean $2200 and standard deviation $230. What is the probability that (a) the total
gross sales over the next 2 weeks exceeds $5000; (b) weekly sales exceed $2000 in at least
2 of the next 3 weeks? What independence assumptions have you made?

a) If we let W_i be the random variable equal to the restaurant's sales in week i, then
W_i ~ N(2200, 230²). Assuming the weekly sales are independent, we can define
X = W_1 + W_2, and by Proposition 3.2 the means and variances add:
X ~ N(2200 + 2200, 230² + 230²) = N(4400, 105800), so the standard deviation of X is
230√2 ≈ 325.3. We therefore have
P(X > 5000) = P((X − 4400)/325.3 > (5000 − 4400)/325.3) = P(Z > 1.845)
  = 1 − Φ(1.845) ≈ 0.0326, where Z ~ N(0, 1). (Note that the variances add, not the
standard deviations.)
b) In any given week, we calculate
P(W_i > 2000) = P((W_i − 2200)/230 > (2000 − 2200)/230) = P(Z > −0.87) = Φ(0.87) ≈ 0.808.
The probability that weekly sales exceed this amount in at least 2 of the next 3 weeks can
be calculated with the binomial distribution, where Y is the number of weeks (out of 3) in
which sales exceed $2,000:
P(Y ≥ 2) = P(Y = 2) + P(Y = 3) = C(3,2)(0.808)²(0.192) + C(3,3)(0.808)³(0.192)⁰ ≈ 0.9036.
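The normal and binomial arithmetic can be reproduced with the standard library alone (a sketch; `norm_cdf` is a hypothetical helper built from `math.erf`):

```python
from math import erf, sqrt

def norm_cdf(z):
    # Standard normal CDF expressed through the error function
    return 0.5 * (1 + erf(z / sqrt(2)))

# (a) X = W1 + W2 is N(4400, 230^2 + 230^2); variances add, not standard deviations
sd_sum = 230 * sqrt(2)
p_a = 1 - norm_cdf((5000 - 4400) / sd_sum)     # ~0.033

# (b) per-week success probability, then the at-least-2-of-3 binomial tail
p = 1 - norm_cdf((2000 - 2200) / 230)          # ~0.808
p_b = 3 * p**2 * (1 - p) + p**3                # ~0.904
```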

Question #34: Jay has two jobs to do, one after the other. Each attempt at job i takes one
hour and is successful with probability p_i. If p_1 = 0.3 and p_2 = 0.4, what is the probability
that it will take Jay more than 12 hours to be successful on both jobs?

 Assume that Jay's strategy is to attempt job 1 until he succeeds, and then move on to
job 2. The probability that Jay never completes job 1 in 12 attempts is 0.7¹². The
probability that Jay fails job 1 on his first i attempts, succeeds on attempt i + 1, and then
never completes job 2 in the remaining 11 − i attempts is (0.7)ⁱ(0.3)(0.6)^{11−i}. Thus, the
total probability is
0.7¹² + Σ_{i=0}^{11} (0.7)ⁱ(0.3)(0.6)^{11−i}.
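This sum is easy to evaluate, and it can be cross-checked against a direct convolution of the two geometric attempt counts (a pure-Python sketch):

```python
# Question #34: P(total time > 12 hours), directly from the formula above
p_formula = 0.7**12 + sum(0.7**i * 0.3 * 0.6**(11 - i) for i in range(12))

# Cross-check: T1 ~ Geometric(0.3), T2 ~ Geometric(0.4); finishing within 12 hours
# means T1 = i and T2 = j with i + j <= 12.
p_done = sum(0.7**(i - 1) * 0.3 * 0.6**(j - 1) * 0.4
             for i in range(1, 12) for j in range(1, 12) if i + j <= 12)
p_convolution = 1 - p_done
```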

Question #38: Choose a number X at random from the set of numbers {1, 2, 3, 4, 5}. Now
choose a number at random from the subset no larger than X , that is, from {1, . . . ,X}. Call
this second number Y. (a) Find the joint mass function of X and Y . (b) Find the conditional
mass function of X given that Y =i . Do it for i = 1, 2, 3, 4, 5. (c) Are X and Y independent?
Why?

a) We can represent the joint probability mass function of X and Y either with a table
or with an analytic formula. For the formula, we may write the joint PMF as
p_XY(X = j, Y = i) = (1/5)(1/j) for 1 ≤ i ≤ j ≤ 5, and 0 otherwise.
The table below presents this joint PMF, where the row sums give p_Y(y), the probability
mass function of Y, while the column sums give p_X(x), the probability mass function of X.
          X = 1   X = 2   X = 3   X = 4   X = 5   p_Y(y)
Y = 1     1/5     1/10    1/15    1/20    1/25    ≈ 0.46
Y = 2     0       1/10    1/15    1/20    1/25    ≈ 0.26
Y = 3     0       0       1/15    1/20    1/25    ≈ 0.16
Y = 4     0       0       0       1/20    1/25    0.09
Y = 5     0       0       0       0       1/25    0.04
p_X(x)    1/5     1/5     1/5     1/5     1/5

b) For i ≥ j we have
p_{X|Y}(X = i | Y = j) = p_XY(X = i, Y = j) / p_Y(Y = j) = (1/(5i)) / Σ_{k=j}^{5} 1/(5k)
  = i⁻¹ / Σ_{k=j}^{5} k⁻¹.
The construction of this PMF follows from the definition of conditional distributions.

c) No, because the distribution of Y depends on the value of X .
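The table, marginals, and conditionals can all be generated with exact rational arithmetic (a Python sketch; the helper name `cond_X_given_Y` is just illustrative):

```python
from fractions import Fraction

# Joint PMF: p(X=j, Y=i) = 1/(5j) for 1 <= i <= j <= 5
p = {(j, i): Fraction(1, 5 * j) for j in range(1, 6) for i in range(1, j + 1)}

p_X = {j: sum(v for (a, _), v in p.items() if a == j) for j in range(1, 6)}
p_Y = {i: sum(v for (_, b), v in p.items() if b == i) for i in range(1, 6)}

def cond_X_given_Y(i, j):
    # p(X=i | Y=j) by the definition of conditional distributions
    return p.get((i, j), Fraction(0)) / p_Y[j]
```

Here p_X[j] is 1/5 for every j, while p_Y[1] = 137/300 ≈ 0.46, matching the row and column sums in the table.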

Question #41: The joint density function of X and Y is given by f_XY(x, y) = x e^{−x(y+1)}
whenever (x, y) ∈ (0, ∞) × (0, ∞) and f_XY(x, y) = 0 otherwise. (a) Find the conditional
density of X, given Y = y, and that of Y, given X = x. (b) Find the density function of Z = XY.

a) We have
f_{X|Y}(x | y) = f_XY(x, y) / f_Y(y) = x e^{−x(y+1)} / ∫₀^∞ x e^{−x(y+1)} dx = (y + 1)² x e^{−x(y+1)}
for x > 0, using ∫₀^∞ x e^{−cx} dx = 1/c². Similarly,
f_{Y|X}(y | x) = f_XY(x, y) / f_X(x) = x e^{−x(y+1)} / ∫₀^∞ x e^{−x(y+1)} dy = x e^{−x(y+1)} / e^{−x}
  = x e^{−xy} for y > 0.

b) We calculate the CDF of Z as
F_Z(a) = P(XY ≤ a) = ∫₀^∞ ∫₀^{a/x} x e^{−x(y+1)} dy dx = ∫₀^∞ e^{−x}(1 − e^{−a}) dx = 1 − e^{−a},
and then differentiate to obtain the density f_Z(a) = (d/da)(1 − e^{−a}) = e^{−a} for a > 0, so
Z is exponential with rate 1.
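Since f_X(x) = e^{−x} and, from part (a), Y given X = x is exponential with rate x, we can simulate (X, Y) in two stages and check that Z = XY is indeed Exp(1) (a Monte Carlo sketch):

```python
import numpy as np

# Two-stage sampling: X ~ Exp(1), then Y | X = x ~ Exp(rate x)
rng = np.random.default_rng(0)
n = 500_000
x = rng.exponential(1.0, n)
y = rng.exponential(1.0 / x)       # numpy parameterizes by the mean, i.e. 1/rate
z = x * y

p_z_le_1 = np.mean(z <= 1)         # should be ~1 - e^{-1}
mean_z = z.mean()                  # Exp(1) has mean 1
```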

Question #52: Let X and Y denote the coordinates of a point uniformly chosen in the circle
of radius 1 centered at the origin. That is, their joint density is f_XY(x, y) = 1/π whenever
x² + y² ≤ 1 and f_XY(x, y) = 0 otherwise. Find the joint density function of the polar
coordinates R = (X² + Y²)^{1/2} and Θ = tan⁻¹(Y/X).

 We have f_XY(x, y) and wish to find f_RΘ(r, θ), where the random variables X and Y
have been transformed into R and Θ by the functions g₁(x, y) = √(x² + y²) and
g₂(x, y) = tan⁻¹(y/x). From the change of variables formula, we know that x = r cos(θ)
and y = r sin(θ), so we are able to uniquely solve for x and y in terms of r and θ. The next
step is to compute the Jacobian for this transformation:
J(x, y) = det [ ∂g₁/∂x  ∂g₁/∂y ; ∂g₂/∂x  ∂g₂/∂y ]
  = det [ x/√(x² + y²)  y/√(x² + y²) ; −y/(x² + y²)  x/(x² + y²) ]
  = (x² + y²)/(x² + y²)^{3/2} = 1/√(x² + y²) = 1/r.
We then use the formula f_RΘ(r, θ) = f_XY(x, y) · (1/|J(x, y)|) to find that
f_RΘ(r, θ) = (1/π)/(1/r) = r/π whenever 0 ≤ r ≤ 1 and 0 < θ < 2π. Note that we could also
compute the Jacobian of the inverse map, J(r, θ) = det [ ∂x/∂r  ∂x/∂θ ; ∂y/∂r  ∂y/∂θ ] = r,
and multiply by |J(r, θ)| instead.
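Integrating f_RΘ(r, θ) = r/π over θ gives the marginal f_R(r) = 2r, i.e. P(R ≤ a) = a², which a rejection-sampling simulation confirms (a sketch):

```python
import numpy as np

# Rejection sampling: uniform points in the square, kept if inside the unit disk
rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, (1_000_000, 2))
pts = pts[(pts**2).sum(axis=1) <= 1]

r = np.hypot(pts[:, 0], pts[:, 1])
theta = np.arctan2(pts[:, 1], pts[:, 0])

p_r_half = np.mean(r <= 0.5)        # P(R <= 1/2) = (1/2)^2 = 1/4
p_theta_neg = np.mean(theta < 0)    # Theta is uniform, so ~1/2
```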

Question #55: X and Y have joint density function f(x, y) = 1/(x²y²) when x ≥ 1 and y ≥ 1
(i.e. whenever (x, y) ∈ [1, ∞) × [1, ∞)) and f(x, y) = 0 otherwise. (a) Compute the joint
density function of U = XY and V = X/Y. (b) What are the marginal densities?

a) We have f_XY(x, y) and wish to find f_UV(u, v), where the random variables X and Y
have been transformed into U and V by the functions g₁(x, y) = xy and g₂(x, y) = x/y. We
can solve for x and y uniquely in terms of u and v: x = √(uv) and y = √(u/v). The next step
is to compute the Jacobian, which is given by
J(x, y) = det [ ∂g₁/∂x  ∂g₁/∂y ; ∂g₂/∂x  ∂g₂/∂y ] = det [ y  x ; 1/y  −x/y² ]
  = −x/y − x/y = −2x/y = −2v.
We can then find that the joint density is
f_UV(u, v) = f_XY(√(uv), √(u/v)) · (1/|−2v|) = 1/((uv)(u/v)) · 1/(2v) = 1/(2u²v)
whenever √(uv) ≥ 1 and √(u/v) ≥ 1, which reduces to u ≥ 1 and 1/u ≤ v ≤ u.

b) We have
f_U(u) = ∫_{1/u}^{u} 1/(2u²v) dv = (1/(2u²)) [ln v]_{1/u}^{u} = ln(u)/u² whenever u ≥ 1.
For v ≥ 1,
f_V(v) = ∫_{v}^{∞} 1/(2u²v) du = 1/(2v²),
and similarly for 0 < v < 1 we have
f_V(v) = ∫_{1/v}^{∞} 1/(2u²v) du = 1/2.
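With inverse-CDF sampling (F(x) = 1 − 1/x on [1, ∞), so X = 1/U′ for U′ uniform on (0, 1)), the marginals can be spot-checked by simulation (a sketch):

```python
import numpy as np

# Inverse-CDF sampling for the density 1/x^2 on [1, infinity)
rng = np.random.default_rng(0)
n = 1_000_000
x = 1.0 / rng.random(n)
y = 1.0 / rng.random(n)
u, v = x * y, x / y

# From f_U(u) = ln(u)/u^2: P(U <= 2) = 1/2 - ln(2)/2
p_u_le_2 = np.mean(u <= 2)
# From f_V(v) = 1/2 on (0, 1): P(V < 1) = 1/2
p_v_lt_1 = np.mean(v < 1)
```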
2

Question #56: If X and Y are independent and identically distributed uniform random
variables on (0, 1), compute the joint density functions of each of the following random
variables: (a) U = X + Y and V = X/Y; (b) U = X and V = X/Y; (c) U = X + Y and V = X/(X + Y).

a) We have f_X(x) = 1 and f_Y(y) = 1 for x, y ∈ (0, 1) and zero otherwise. The independence
of X and Y implies that their joint probability density function is
f_XY(x, y) = f_X(x) f_Y(y) = 1 for 0 < x < 1 and 0 < y < 1. The transformations are
u = x + y and v = x/y, so y = u/(1 + v) and x = vy = uv/(1 + v). This allows us to find
J = det [ ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v ] = det [ v/(1+v)  u/(1+v)² ; 1/(1+v)  −u/(1+v)² ]
  = −uv/(1+v)³ − u/(1+v)³ = −u(1+v)/(1+v)³ = −u/(1+v)²,
so we can calculate f_UV(u, v) = f_XY(uv/(1+v), u/(1+v)) |J| = u/(1+v)². To find the
bounds, we have 0 < x < 1, so 0 < uv/(1+v) < 1, i.e. 0 < u < (1+v)/v, and 0 < y < 1, so
0 < u/(1+v) < 1, i.e. 0 < u < 1 + v.

b) We have u = x, so x = u, and v = x/y, so y = x/v = u/v, which allows us to calculate the
Jacobian as
J = det [ ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v ] = det [ 1  0 ; 1/v  −u/v² ] = −u/v².
We then have f_UV(u, v) = f_XY(u, u/v) |J| = u/v², where 0 < x < 1 gives 0 < u < 1 and
0 < y < 1 gives 0 < u/v < 1, i.e. v > u. Combining the bounds gives f_UV(u, v) = u/v²
whenever 0 < u < 1 and v > u.

c) We have u = x + y and v = x/(x + y), so x = v(x + y) = uv and y = u − x = u − uv = u(1 − v),
which implies that the Jacobian is
J = det [ ∂x/∂u  ∂x/∂v ; ∂y/∂u  ∂y/∂v ] = det [ v  u ; 1 − v  −u ] = −uv − u(1 − v) = −u. Thus,
f_UV(u, v) = f_XY(uv, u(1 − v)) |J| = u whenever we have that the bounds are
0 < x < 1 → 0 < uv < 1 → 0 < v < 1/u and 0 < y < 1 → 0 < u(1 − v) < 1 → 0 < u < 1/(1 − v).
