
MATH-221: Probability and Statistics

Solved Problems (Joint Distributions, Conditional Distributions, Conditional Expectation, Covariance, Correlation, Characteristic Function)

1. Let X and Y be random variables with the joint pmf P{X = i, Y = j} = 1/N², for i, j = 1, 2, ..., N. Find (a) P{X ≥ Y} (b) P{X = Y}.

Solution: (a)
\[
\begin{aligned}
P\{X \ge Y\} &= \sum_{(x,y):\, x \ge y} f(x,y) = \sum_{y=1}^{N} \sum_{x \ge y} f(x,y) = \sum_{y=1}^{N} \sum_{x \ge y} \frac{1}{N^2} = \frac{1}{N^2} \sum_{y=1}^{N} \sum_{x \ge y} 1 \\
&= \frac{1}{N^2}\left[\sum_{x=1}^{N} 1 + \sum_{x=2}^{N} 1 + \sum_{x=3}^{N} 1 + \cdots + \sum_{x=N}^{N} 1\right] \\
&= \frac{1}{N^2}\left[N + (N-1) + (N-2) + \cdots + 1\right] = \frac{1}{N^2} \times \frac{N(N+1)}{2} = \frac{1}{2} + \frac{1}{2N}
\end{aligned}
\]

(b)
\[
P\{X = Y\} = \sum_{(x,y):\, x = y} f(x,y) = \sum_{y=1}^{N} f(y,y) = \sum_{y=1}^{N} \frac{1}{N^2} = \frac{1}{N}
\]
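The closed forms above can be checked by brute-force enumeration over the N² equally likely pairs. A minimal Python sketch (the value N = 7 is arbitrary, chosen only for illustration):

    N = 7  # any positive integer

    p_ge = sum(1 for i in range(1, N + 1) for j in range(1, N + 1) if i >= j) / N**2
    p_eq = sum(1 for i in range(1, N + 1) for j in range(1, N + 1) if i == j) / N**2

    assert abs(p_ge - (0.5 + 1 / (2 * N))) < 1e-12   # matches 1/2 + 1/(2N)
    assert abs(p_eq - 1 / N) < 1e-12                 # matches 1/N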

2. Let X and Y be random variables with joint pdf given by
\[
f(x,y) = \begin{cases} \lambda^2 e^{-\lambda y}, & \text{if } 0 \le x \le y \\ 0, & \text{otherwise,} \end{cases} \qquad \text{where } \lambda > 0.
\]
Find the marginal pdfs of X and Y . Also find the joint distribution function of X, Y .
Solution: The marginal pdf of X is
\[
f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy
= \begin{cases} \displaystyle\int_{x}^{\infty} \lambda^2 e^{-\lambda y}\,dy & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}
= \begin{cases} \lambda e^{-\lambda x} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}
\]
The marginal pdf of Y is
\[
f_Y(y) = \int_{-\infty}^{\infty} f(x,y)\,dx
= \begin{cases} \displaystyle\int_{0}^{y} \lambda^2 e^{-\lambda y}\,dx & \text{if } y \ge 0 \\ 0 & \text{if } y < 0 \end{cases}
= \begin{cases} \lambda^2 y e^{-\lambda y} & \text{if } y \ge 0 \\ 0 & \text{if } y < 0 \end{cases}
\]
The joint distribution function of X and Y is
\[
\begin{aligned}
F(x,y) &= \int_{-\infty}^{x} \int_{-\infty}^{y} f(t,s)\,ds\,dt \\
&= \int_{0}^{x} \left( \int_{t}^{y} \lambda^2 e^{-\lambda s}\,ds \right) dt \qquad \text{if } 0 \le x \le y \\
&= \int_{0}^{x} \lambda \left[ e^{-\lambda t} - e^{-\lambda y} \right] dt
= \lambda \left[ \frac{e^{-\lambda t}}{-\lambda} - t e^{-\lambda y} \right]_{0}^{x}
= 1 - e^{-\lambda x} - \lambda x e^{-\lambda y} \qquad \text{if } 0 \le x \le y
\end{aligned}
\]
If 0 ≤ y < x, then (since X ≤ Y with probability one, the event {Y ≤ y} already forces X ≤ y < x)
\[
F(x,y) = P(Y \le y) = 1 - e^{-\lambda y}(1 + \lambda y)
\]
In the second, third and fourth quadrants, F(x,y) = 0.
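As a quick numerical check (not part of the original solution), note that the joint density factors as f_Y(y) · (1/y) on 0 ≤ x ≤ y, so one can sample Y ~ Gamma(2, 1/λ) and then X | Y = y ~ Uniform(0, y). A minimal NumPy sketch, with λ = 2 chosen arbitrarily:

    import numpy as np

    rng = np.random.default_rng(0)
    lam, n = 2.0, 200_000
    y = rng.gamma(shape=2, scale=1 / lam, size=n)   # marginal of Y: lambda^2 y e^{-lambda y}
    x = rng.uniform(0, y)                           # conditional of X given Y = y

    # The derived marginal of X is exp(lambda), so its sample mean should be near 1/lambda.
    print(x.mean(), 1 / lam)   # both approximately 0.5
    print(y.mean(), 2 / lam)   # Gamma(2, 1/lambda) mean: approximately 1.0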
3. Let X, Y be continuous random variables with joint density function
\[
f_{X,Y}(x,y) = \begin{cases} e^{-y}(1 - e^{-x}) & \text{if } 0 < x < y < \infty \\ e^{-x}(1 - e^{-y}) & \text{if } 0 < y \le x < \infty \end{cases}
\]

Show that X and Y are identically distributed. Also compute E[X] and E[Y ].
Solution:
\[
f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dy
= \begin{cases} \displaystyle\int_{0}^{\infty} f_{X,Y}(x,y)\,dy & \text{if } 0 < x < \infty \\ 0 & \text{otherwise} \end{cases}
\]

Now for x > 0,


\[
\begin{aligned}
\int_{0}^{\infty} f_{X,Y}(x,y)\,dy &= \int_{x}^{\infty} e^{-y}(1 - e^{-x})\,dy + \int_{0}^{x} e^{-x}(1 - e^{-y})\,dy \\
&= (1 - e^{-x})\left[-e^{-y}\right]_{x}^{\infty} + e^{-x}\left[y + e^{-y}\right]_{0}^{x} \\
&= e^{-x}(1 - e^{-x}) + e^{-x}\left[x + e^{-x} - 1\right] = x e^{-x}
\end{aligned}
\]

Therefore,

\[
f_X(x) = \begin{cases} x e^{-x} & \text{if } 0 < x < \infty \\ 0 & \text{otherwise} \end{cases}
\]

Similarly,
\[
f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(x,y)\,dx
= \begin{cases} \displaystyle\int_{0}^{\infty} f_{X,Y}(x,y)\,dx & \text{if } 0 < y < \infty \\ 0 & \text{otherwise} \end{cases}
\]

Now for y > 0,


\[
\begin{aligned}
\int_{0}^{\infty} f_{X,Y}(x,y)\,dx &= \int_{0}^{y} e^{-y}(1 - e^{-x})\,dx + \int_{y}^{\infty} e^{-x}(1 - e^{-y})\,dx \\
&= e^{-y}\left[x + e^{-x}\right]_{0}^{y} + (1 - e^{-y})\left[-e^{-x}\right]_{y}^{\infty} \\
&= e^{-y}\left[y + e^{-y} - 1\right] + e^{-y}(1 - e^{-y}) = y e^{-y}
\end{aligned}
\]

Therefore,

\[
f_Y(y) = \begin{cases} y e^{-y} & \text{if } 0 < y < \infty \\ 0 & \text{otherwise} \end{cases}
\]

Thus X and Y have the same marginal density, i.e., they are identically distributed (both marginals are Gamma(2, 1) densities). Further, E[X] = E[Y] = ∫₀^∞ x · x e^{−x} dx = 2.
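A short numerical check of the last claim (illustrative only; it assumes SciPy is available):

    import numpy as np
    from scipy.integrate import quad

    # Marginal derived above: f_X(x) = x e^{-x} for x > 0.
    mass, _ = quad(lambda x: x * np.exp(-x), 0, np.inf)
    mean, _ = quad(lambda x: x * x * np.exp(-x), 0, np.inf)
    print(mass, mean)   # approximately 1.0 (valid density) and 2.0 (E[X] = 2)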


4. Let X and Y be continuous random variables with joint pdf f . Find the pdf of (a)
XY (b) X − Y (c) |Y − X|.
Solution: (a) We first determine the distribution function of XY. For z ∈ R, set
\[
A_z = \{(x,y) \in \mathbb{R}^2 \,|\, xy \le z\}
\]
Now
\[
\begin{aligned}
F_{XY}(z) &= P\{XY \le z\} = P\{(X,Y) \in A_z\} = \iint_{A_z} f(x,y)\,dx\,dy \\
&= \int_{-\infty}^{0} \left( \int_{z/x}^{\infty} f(x,y)\,dy \right) dx + \int_{0}^{\infty} \left( \int_{-\infty}^{z/x} f(x,y)\,dy \right) dx
\end{aligned}
\]

Now, for the first term,
\[
\begin{aligned}
\int_{-\infty}^{0} \left( \int_{z/x}^{\infty} f(x,y)\,dy \right) dx &= \int_{0}^{\infty} \left( \int_{-z/x}^{\infty} f(-x,y)\,dy \right) dx \\
&= \int_{0}^{\infty} \left( \int_{-\infty}^{z} f\!\left(-x, -\frac{u}{x}\right) \frac{du}{x} \right) dx \qquad (\text{put } y = -u/x) \\
&= \int_{-\infty}^{z} \left( \int_{0}^{\infty} \frac{1}{x}\, f\!\left(-x, -\frac{u}{x}\right) dx \right) du
\end{aligned}
\]
and, for the second term,
\[
\begin{aligned}
\int_{0}^{\infty} \left( \int_{-\infty}^{z/x} f(x,y)\,dy \right) dx &= \int_{0}^{\infty} \left( \int_{-\infty}^{z} f\!\left(x, \frac{u}{x}\right) \frac{du}{x} \right) dx \qquad (\text{put } y = u/x) \\
&= \int_{-\infty}^{z} \left( \int_{0}^{\infty} \frac{1}{x}\, f\!\left(x, \frac{u}{x}\right) dx \right) du
\end{aligned}
\]
Therefore
\[
F_{XY}(z) = \int_{-\infty}^{z} \left( \int_{0}^{\infty} \frac{1}{x} \left[ f\!\left(-x, -\frac{u}{x}\right) + f\!\left(x, \frac{u}{x}\right) \right] dx \right) du
= \int_{-\infty}^{z} \left( \int_{-\infty}^{\infty} \frac{1}{|x|}\, f\!\left(x, \frac{u}{x}\right) dx \right) du
\]
Therefore the pdf of XY is
\[
f_{XY}(z) = \int_{-\infty}^{\infty} \frac{1}{|x|}\, f\!\left(x, \frac{z}{x}\right) dx
\]
(b) We first determine the distribution function of Z := X − Y. For z ∈ R, set
\[
A_z = \{(x,y) \in \mathbb{R}^2 \,|\, x - y \le z\}
\]
Now
\[
\begin{aligned}
F_Z(z) &= P\{X - Y \le z\} = P\{(X,Y) \in A_z\} = \iint_{A_z} f(x,y)\,dx\,dy \\
&= \int_{-\infty}^{\infty} \left( \int_{x-z}^{\infty} f(x,y)\,dy \right) dx \\
&= \int_{-\infty}^{\infty} \left( \int_{z}^{-\infty} -f(x, x-s)\,ds \right) dx \qquad (\text{put } y = x - s) \\
&= \int_{-\infty}^{\infty} \left( \int_{-\infty}^{z} f(x, x-s)\,ds \right) dx
= \int_{-\infty}^{z} \left( \int_{-\infty}^{\infty} f(x, x-s)\,dx \right) ds
\end{aligned}
\]
Therefore the pdf of X − Y is
\[
f_{X-Y}(z) = \int_{-\infty}^{\infty} f(x, x-z)\,dx
\]

(c) We first determine the distribution function of Z := |Y − X|. For z ∈ R, set
\[
A_z = \{(x,y) \in \mathbb{R}^2 \,|\, |y - x| \le z\}
\]
If z < 0 then {(X, Y) ∈ A_z} = ∅, so F_Z(z) = 0 for z < 0. Now for z ≥ 0,
\[
\begin{aligned}
F_Z(z) &= P\{|Y - X| \le z\} = P\{(X,Y) \in A_z\} \\
&= \iint_{A_z} f(x,y)\,dx\,dy = \iint_{\{y-x \ge 0\} \cap A_z} f(x,y)\,dx\,dy + \iint_{\{y-x < 0\} \cap A_z} f(x,y)\,dx\,dy \\
&= \int_{-\infty}^{\infty} \left( \int_{x}^{x+z} f(x,y)\,dy \right) dx + \int_{-\infty}^{\infty} \left( \int_{x-z}^{x} f(x,y)\,dy \right) dx \\
&= \int_{-\infty}^{\infty} \left( \int_{0}^{z} f(x, x+s)\,ds \right) dx + \int_{-\infty}^{\infty} \left( \int_{z}^{0} -f(x, x-s)\,ds \right) dx \qquad (\text{put } y = x + s \text{ and } y = x - s) \\
&= \int_{-\infty}^{\infty} \left( \int_{0}^{z} [f(x, x+s) + f(x, x-s)]\,ds \right) dx
= \int_{0}^{z} \left( \int_{-\infty}^{\infty} [f(x, x+s) + f(x, x-s)]\,dx \right) ds
\end{aligned}
\]
Therefore
\[
f_{|Y-X|}(z) = \begin{cases} \displaystyle\int_{-\infty}^{\infty} [f(x, x+z) + f(x, x-z)]\,dx & \text{if } z \ge 0 \\ 0 & \text{otherwise} \end{cases}
\]
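The formula from part (b) can be spot-checked numerically. The sketch below (illustrative only; it assumes SciPy is available) uses two independent standard normals, for which X − Y is known to be N(0, 2):

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm

    def f(x, y):
        return norm.pdf(x) * norm.pdf(y)      # joint pdf of two independent N(0,1)

    z = 0.7                                   # arbitrary test point
    formula, _ = quad(lambda x: f(x, x - z), -np.inf, np.inf)
    print(formula, norm.pdf(z, scale=np.sqrt(2)))   # the two numbers should agree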

5. Suppose X and Y are random variables that assume the four values 1, 2, 3, 4. Their joint probabilities are given by the following table.

          Y = 1    Y = 2    Y = 3    Y = 4
  X = 1    1/20     1/20     1/20      0
  X = 2    1/20     2/20     3/20     1/20
  X = 3    1/20     2/20     3/20     1/20
  X = 4     0       1/20     1/20     1/20

Find the pmf of X + Y .

Solution: First of all, the values taken by the random variable Z := X + Y are {2, 3, 4, 5, 6, 7, 8}. Hence the pmf of Z is
\[
\begin{aligned}
f_Z(2) &= P(X + Y = 2) = P(X = 1, Y = 1) = \tfrac{1}{20} \\
f_Z(3) &= P(X + Y = 3) = P(X = 1, Y = 2) + P(X = 2, Y = 1) = \tfrac{2}{20} \\
f_Z(4) &= f(2,2) + f(1,3) + f(3,1) = \tfrac{4}{20} \\
f_Z(5) &= f(1,4) + f(2,3) + f(3,2) + f(4,1) = 0 + \tfrac{3}{20} + \tfrac{2}{20} + 0 = \tfrac{5}{20} \\
f_Z(6) &= f(2,4) + f(3,3) + f(4,2) = \tfrac{1}{20} + \tfrac{3}{20} + \tfrac{1}{20} = \tfrac{5}{20} \\
f_Z(7) &= f(3,4) + f(4,3) = \tfrac{2}{20} \\
f_Z(8) &= f(4,4) = \tfrac{1}{20}
\end{aligned}
\]
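The full pmf can be reproduced by enumerating the table (an illustrative check, not part of the original notes):

    from fractions import Fraction
    from collections import defaultdict

    # joint[(i, j)] = numerator of P(X = i, Y = j), all over 20, copied from the table above
    joint = {
        (1, 1): 1, (1, 2): 1, (1, 3): 1, (1, 4): 0,
        (2, 1): 1, (2, 2): 2, (2, 3): 3, (2, 4): 1,
        (3, 1): 1, (3, 2): 2, (3, 3): 3, (3, 4): 1,
        (4, 1): 0, (4, 2): 1, (4, 3): 1, (4, 4): 1,
    }
    pmf_z = defaultdict(Fraction)
    for (i, j), w in joint.items():
        pmf_z[i + j] += Fraction(w, 20)

    print(dict(sorted(pmf_z.items())))
    # {2: 1/20, 3: 1/10, 4: 1/5, 5: 1/4, 6: 1/4, 7: 1/10, 8: 1/20}  (fractions are reduced)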

6.

7. Let X ∼ uniform[0, 1] and Y ∼ Bernoulli(0.5), and assume that X, Y are independent. Determine the joint cumulative distribution function of X and Y.

Solution: Since X and Y are independent, the joint CDF is given by
\[
F(x,y) = F_X(x)\, F_Y(y)
\]
where
\[
F_X(x) = \begin{cases} 0 & \text{if } x \le 0 \\ x & \text{if } 0 \le x \le 1 \\ 1 & \text{if } x \ge 1 \end{cases}, \qquad
F_Y(y) = \begin{cases} 0 & \text{if } y < 0 \\ \frac{1}{2} & \text{if } 0 \le y < 1 \\ 1 & \text{if } y \ge 1. \end{cases}
\]
Hence
\[
F(x,y) = \begin{cases}
0 & \text{if } x \le 0 \text{ or } y < 0 \\
\frac{x}{2} & \text{if } 0 \le x \le 1 \text{ and } 0 \le y < 1 \\
\frac{1}{2} & \text{if } x \ge 1 \text{ and } 0 \le y < 1 \\
x & \text{if } 0 \le x \le 1 \text{ and } y \ge 1 \\
1 & \text{if } x \ge 1 \text{ and } y \ge 1.
\end{cases}
\]
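A quick empirical check of one case of this CDF (illustrative; the test point (0.4, 0.5) and the sample size are arbitrary):

    import numpy as np

    rng = np.random.default_rng(1)
    n = 500_000
    x = rng.uniform(size=n)                 # X ~ uniform[0, 1]
    y = rng.integers(0, 2, size=n)          # Y ~ Bernoulli(0.5)

    x0, y0 = 0.4, 0.5                       # here 0 <= x0 <= 1 and 0 <= y0 < 1, so F(x0, y0) = x0/2
    print(np.mean((x <= x0) & (y <= y0)), x0 / 2)   # both approximately 0.2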

8. Let X and Y be independent Poisson random variables with parameters λ1, λ2 respectively. Find P(Y = m | X + Y = n) for m = 0, 1, 2, ..., n.
Solution: Set Z := X + Y; then Z takes values 0, 1, 2, .... For m = 0, 1, 2, ..., n,
\[
\begin{aligned}
P(Y = m \mid Z = n) &= \frac{P(Y = m,\, X + Y = n)}{P(X + Y = n)} = \frac{P(Y = m,\, X = n - m)}{\sum_{k=0}^{n} P(X = k,\, Y = n - k)} \\
&= \frac{P(Y = m)\,P(X = n - m)}{\sum_{k=0}^{n} P(X = k)\,P(Y = n - k)}
= \frac{e^{-\lambda_2}\,\frac{\lambda_2^{m}}{m!} \times e^{-\lambda_1}\,\frac{\lambda_1^{n-m}}{(n-m)!}}{\sum_{k=0}^{n} e^{-\lambda_1}\,\frac{\lambda_1^{k}}{k!} \times e^{-\lambda_2}\,\frac{\lambda_2^{n-k}}{(n-k)!}} \\
&= \frac{\frac{\lambda_1^{n-m}}{(n-m)!}\,\frac{\lambda_2^{m}}{m!}}{\sum_{k=0}^{n} \frac{\lambda_1^{k}}{k!}\,\frac{\lambda_2^{n-k}}{(n-k)!}}
= \frac{n!\,\frac{\lambda_1^{n-m}}{(n-m)!}\,\frac{\lambda_2^{m}}{m!}}{\sum_{k=0}^{n} n!\,\frac{\lambda_1^{k}}{k!}\,\frac{\lambda_2^{n-k}}{(n-k)!}}
= \frac{n!\,\frac{\lambda_1^{n-m}}{(n-m)!}\,\frac{\lambda_2^{m}}{m!}}{(\lambda_1 + \lambda_2)^n}
= \frac{\binom{n}{m}\lambda_1^{n-m}\lambda_2^{m}}{(\lambda_1 + \lambda_2)^n}
\end{aligned}
\]
That is, conditionally on X + Y = n, Y has the Binomial(n, λ2/(λ1 + λ2)) distribution.
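A Monte Carlo spot-check of this conditional pmf (illustrative parameter choices; assumes NumPy and SciPy):

    import numpy as np
    from scipy.stats import binom

    rng = np.random.default_rng(2)
    lam1, lam2, n, m = 3.0, 1.5, 6, 2
    x = rng.poisson(lam1, size=2_000_000)
    y = rng.poisson(lam2, size=2_000_000)
    keep = (x + y == n)
    print(np.mean(y[keep] == m))                   # empirical P(Y = m | X + Y = n)
    print(binom.pmf(m, n, lam2 / (lam1 + lam2)))   # Binomial(n, lam2/(lam1+lam2)) pmf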

9. Let X and Y be independent and identically distributed random variables such that P(X = 0) = P(X = 1) = 1/2. Show that X and |X − Y| are independent.
Solution: Set W = |X − Y|. Then W takes values in {0, 1} with pmf P(W = 0) = P(W = 1) = 1/2, since W = 0 exactly when X = Y, an event of probability 1/4 + 1/4 = 1/2. Note that X and W are both Bernoulli(0.5). Using the independence of X and Y,
\[
\begin{aligned}
P(X = 0, W = 0) &= P(X = 0, |X - Y| = 0) = P(X = 0, Y = 0) = \tfrac{1}{4} = P(X = 0)\,P(W = 0) \\
P(X = 0, W = 1) &= P(X = 0, |X - Y| = 1) = P(X = 0, Y = 1) = \tfrac{1}{4} = P(X = 0)\,P(W = 1)
\end{aligned}
\]
and the same computation with X = 1 gives P(X = 1, W = w) = 1/4 = P(X = 1)P(W = w) for w = 0, 1. Since the joint pmf factors in every case, X and W are independent.
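An exhaustive check over the four equally likely outcomes (illustrative only):

    from itertools import product

    outcomes = list(product([0, 1], repeat=2))          # (x, y), each with probability 1/4
    for x0 in (0, 1):
        for w0 in (0, 1):
            p_joint = sum(1 for x, y in outcomes if x == x0 and abs(x - y) == w0) / 4
            p_x = sum(1 for x, y in outcomes if x == x0) / 4
            p_w = sum(1 for x, y in outcomes if abs(x - y) == w0) / 4
            assert p_joint == p_x * p_w                 # independence holds in every cell
    print("X and |X - Y| are independent")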
10. Let X and Y be two random variables. Suppose that var(X) = 4, and var(Y ) = 9.
If we know that the two random variables Z = 2X − Y and W = X + Y are
independent, find ρ(X, Y ).
Solution: Since independent random variables are uncorrelated, we have
\[
\begin{aligned}
0 = \operatorname{cov}(Z, W) &= \operatorname{cov}(2X - Y,\, X + Y) = \operatorname{cov}(2X - Y, X) + \operatorname{cov}(2X - Y, Y) \\
&= \operatorname{cov}(2X, X) + \operatorname{cov}(-Y, X) + \operatorname{cov}(2X, Y) + \operatorname{cov}(-Y, Y) \\
&= 2\operatorname{cov}(X, X) - \operatorname{cov}(Y, X) + 2\operatorname{cov}(X, Y) - \operatorname{cov}(Y, Y) = 2\operatorname{var}(X) + \operatorname{cov}(X, Y) - \operatorname{var}(Y) \\
&= 8 + \operatorname{cov}(X, Y) - 9 \implies \operatorname{cov}(X, Y) = 1
\end{aligned}
\]
Therefore
\[
\rho(X, Y) = \frac{\operatorname{cov}(X, Y)}{\sqrt{\operatorname{var}(X)\operatorname{var}(Y)}} = \frac{1}{\sqrt{4 \times 9}} = \frac{1}{6}
\]
11. Suppose X and Y are two independent random variables such that E[X⁴] = 2, E[Y²] = 1, E[X²] = 1, and E[Y] = 0. Compute var(X²Y).
Solution:
\[
\begin{aligned}
\operatorname{var}(X^2 Y) &= E\left[(X^2 Y)^2\right] - \left(E\left[X^2 Y\right]\right)^2 = E\left[X^4 Y^2\right] - \left(E\left[X^2 Y\right]\right)^2 \\
&= E[X^4]\,E[Y^2] - \left(E[X^2]\,E[Y]\right)^2 = 2 \times 1 - (1 \times 0)^2 = 2
\end{aligned}
\]
12. The speed of a typical vehicle that drives past a police radar is modeled as an
exponentially distributed random variable X with mean 50 miles per hour. The
police radar’s measurement Y of the vehicle’s speed has an error which is modeled
as a normal random variable with zero mean and standard deviation equal to one
tenth of the vehicle’s speed. What is the joint PDF of X and Y ?
 
Solution: Recall that if X ∼ exp(λ) then E[X] = 1/λ. So we have X ∼ exp(1/50). Therefore the pdf of X is
\[
f_X(x) = \begin{cases} \frac{1}{50}\, e^{-x/50} & \text{if } x \ge 0 \\ 0 & \text{if } x < 0 \end{cases}
\]
Also, conditioned on X = x, the measurement Y has normal density with mean 0 and variance (x/10)². Therefore,
\[
f_{Y|X}(y|x) = \frac{1}{\frac{x}{10}\sqrt{2\pi}}\, e^{-\frac{y^2}{2(x/10)^2}} = \frac{10}{x\sqrt{2\pi}}\, e^{-\frac{50 y^2}{x^2}} \qquad \text{for all } x > 0,\ y \in \mathbb{R}
\]
Hence the joint pdf of X and Y is
\[
f(x,y) = f_{Y|X}(y|x)\, f_X(x) = \begin{cases} \dfrac{10}{x\sqrt{2\pi}} \times \dfrac{1}{50}\, e^{-\frac{x}{50}}\, e^{-\frac{50 y^2}{x^2}} & \text{if } x > 0 \\[2mm] 0 & \text{if } x \le 0 \end{cases}
\]

13. Let X and Y have the joint density f given by
\[
f(x,y) = \frac{\sqrt{3}}{4\pi}\, e^{-\frac{x^2 - xy + y^2}{2}}, \qquad x, y \in \mathbb{R}.
\]
Find the conditional density of Y given X.

Solution: First compute f_X(x).
\[
\begin{aligned}
f_X(x) &= \int_{-\infty}^{\infty} f(x,y)\,dy = \frac{\sqrt{3}}{4\pi} \int_{-\infty}^{\infty} e^{-\frac{x^2 - xy + y^2}{2}}\,dy = \frac{\sqrt{3}}{4\pi}\, e^{-\frac{x^2}{2}} \int_{-\infty}^{\infty} e^{-\frac{y^2 - xy}{2}}\,dy \\
&= \frac{\sqrt{3}}{4\pi}\, e^{-\frac{x^2}{2}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}\left(y^2 - 2\cdot\frac{x}{2}\,y + \frac{x^2}{4} - \frac{x^2}{4}\right)}\,dy = \frac{\sqrt{3}}{4\pi}\, e^{-\frac{x^2}{2} + \frac{x^2}{8}} \int_{-\infty}^{\infty} e^{-\frac{1}{2}\left(y - \frac{x}{2}\right)^2}\,dy \\
&= \frac{\sqrt{3}}{4\pi}\, e^{-\frac{3x^2}{8}} \int_{-\infty}^{\infty} e^{-\frac{u^2}{2}}\,du \qquad (\text{substituting } u = y - \tfrac{x}{2}) \\
&= \frac{\sqrt{3}}{4\pi}\, e^{-\frac{3x^2}{8}}\, \sqrt{2\pi} = \frac{1}{\sqrt{\tfrac{4}{3}}\,\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(\frac{x}{2/\sqrt{3}}\right)^2},
\end{aligned}
\]
that is, X ∼ N(0, 4/3). Now
\[
\begin{aligned}
f_{Y|X}(y|x) = \frac{f(x,y)}{f_X(x)} = \frac{\frac{\sqrt{3}}{4\pi}\, e^{-\frac{x^2 - xy + y^2}{2}}}{\frac{\sqrt{3}}{4\pi}\,\sqrt{2\pi}\, e^{-\frac{3x^2}{8}}}
&= \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{x^2 - xy + y^2}{2} + \frac{3x^2}{8}\right) \\
&= \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{1}{2}\left(\frac{x^2}{4} + y^2 - xy\right)\right) = \frac{1}{\sqrt{2\pi}}\, e^{-\frac{1}{2}\left(y - \frac{x}{2}\right)^2}
\end{aligned}
\]
In other words, the conditional density of Y given X = x is the normal density with mean x/2 and variance 1.
14. Let X and Y be continuous random variables having a joint density f . Suppose that
Y and φ(X)Y have finite expectation. Show that
\[
E[\phi(X)Y] = \int_{-\infty}^{\infty} \phi(x)\, E[Y|X = x]\, f_X(x)\,dx
\]

Solution:
\[
\begin{aligned}
E[\phi(X)Y] &= \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \phi(x)\, y\, f(x,y)\,dx\,dy = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \phi(x)\, y\, f_{Y|X}(y|x)\, f_X(x)\,dx\,dy \\
&= \int_{-\infty}^{\infty} \phi(x)\, f_X(x) \left(\int_{-\infty}^{\infty} y\, f_{Y|X}(y|x)\,dy\right) dx = \int_{-\infty}^{\infty} \phi(x)\, f_X(x)\, E[Y|X = x]\,dx
\end{aligned}
\]
15. Let X and Y be two independent Poisson distributed random variables having parameters λ1 and λ2 respectively. Compute E[Y | X + Y = z] where z is a nonnegative integer.

Solution: Recall from the solution of Problem 8 that, for m = 0, 1, 2, ..., z,
\[
\begin{aligned}
f_{Y|X+Y}(m|z) = P(Y = m \mid X + Y = z) &= \frac{P(Y = m,\, X + Y = z)}{P(X + Y = z)} = \frac{P(Y = m,\, X = z - m)}{P(X + Y = z)} \\
&= \frac{P(Y = m)\,P(X = z - m)}{P(X + Y = z)} = \frac{e^{-\lambda_2}\,\frac{\lambda_2^{m}}{m!} \times e^{-\lambda_1}\,\frac{\lambda_1^{z-m}}{(z-m)!}}{e^{-(\lambda_1+\lambda_2)}\,\frac{(\lambda_1+\lambda_2)^{z}}{z!}}
= \frac{\binom{z}{m}\lambda_1^{z-m}\lambda_2^{m}}{(\lambda_1+\lambda_2)^{z}}
\end{aligned}
\]

Therefore
\[
\begin{aligned}
E[Y \mid X + Y = z] &= \sum_{m=0}^{z} m\, f_{Y|X+Y}(m|z) = \sum_{m=0}^{z} m\, \frac{\binom{z}{m}\lambda_1^{z-m}\lambda_2^{m}}{(\lambda_1+\lambda_2)^{z}} \\
&= \frac{1}{(\lambda_1+\lambda_2)^{z}} \sum_{m=1}^{z} m\, \frac{z!}{m!(z-m)!}\, \lambda_1^{z-m}\lambda_2^{m}
= \frac{1}{(\lambda_1+\lambda_2)^{z}} \sum_{m=1}^{z} \frac{z!}{(m-1)!(z-m)!}\, \lambda_1^{z-m}\lambda_2^{m} \\
&= \frac{z}{(\lambda_1+\lambda_2)^{z}} \sum_{m=1}^{z} \frac{(z-1)!}{(m-1)!(z-m)!}\, \lambda_1^{z-m}\lambda_2^{m}
= \frac{z}{(\lambda_1+\lambda_2)^{z}} \sum_{m=1}^{z} \binom{z-1}{m-1}\, \lambda_1^{z-m}\lambda_2^{m} \\
&= \frac{z}{(\lambda_1+\lambda_2)^{z}} \sum_{k=0}^{z-1} \binom{z-1}{k}\, \lambda_1^{z-k-1}\lambda_2^{k+1} \qquad (\text{put } m = k+1) \\
&= \frac{z\lambda_2}{(\lambda_1+\lambda_2)^{z}} \sum_{k=0}^{z-1} \binom{z-1}{k}\, \lambda_2^{k}\lambda_1^{(z-1)-k}
= \frac{z\lambda_2\,(\lambda_2 + \lambda_1)^{z-1}}{(\lambda_1+\lambda_2)^{z}} = \frac{z\lambda_2}{\lambda_1+\lambda_2}
\end{aligned}
\]
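A Monte Carlo spot-check of this conditional expectation (illustrative parameters; assumes NumPy):

    import numpy as np

    rng = np.random.default_rng(3)
    lam1, lam2, z = 2.0, 5.0, 8
    x = rng.poisson(lam1, size=2_000_000)
    y = rng.poisson(lam2, size=2_000_000)
    keep = (x + y == z)
    print(y[keep].mean(), z * lam2 / (lam1 + lam2))   # both approximately 5.71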

16. Let X and Y have a joint density f that is uniform over the interior of the triangle
with vertices at (0, 0), (2, 0), and (1, 2). Find the conditional expectation of Y given
X.
[Figure: the triangle with vertices (0, 0), (2, 0), (1, 2); its left edge lies on y = 2x and its right edge on y + 2x = 4.]
Solution: The area of the triangle with vertices (0, 0), (2, 0), and (1, 2) is (1/2) × 2 × 2 = 2. Therefore the joint density equals 1/2 over the triangle and zero elsewhere. First we need to compute E[Y | X = x], so we need the conditional density of Y given X. If x ≤ 0 or x ≥ 2 then f(x, y) = 0 and therefore f_X(x) = 0. For x ∈ (0, 2),
\[
f_X(x) = \int_{-\infty}^{\infty} f(x,y)\,dy
= \begin{cases} \displaystyle\int_{0}^{2x} \tfrac{1}{2}\,dy = x & 0 < x \le 1 \\[2mm] \displaystyle\int_{0}^{-2x+4} \tfrac{1}{2}\,dy = 2 - x & 1 < x \le 2 \end{cases}
\]
Therefore
\[
f_{Y|X}(y|x) = \frac{f(x,y)}{f_X(x)}
= \begin{cases} \dfrac{1}{2x} & 0 < x \le 1,\ 0 < y < 2x \\[2mm] \dfrac{1}{2(2-x)} & 1 < x \le 2,\ 0 < y < 4 - 2x \end{cases}
\]
For x ∈ (0, 2), we have
\[
E[Y|X = x] = \int_{-\infty}^{\infty} y\, f_{Y|X}(y|x)\,dy
= \begin{cases} \displaystyle\int_{0}^{2x} \frac{y}{2x}\,dy = x & 0 < x \le 1 \\[2mm] \displaystyle\int_{0}^{4-2x} \frac{y}{2(2-x)}\,dy = 2 - x & 1 < x \le 2 \end{cases}
\]
For other values of x, E[Y | X = x] = 0.
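A Monte Carlo spot-check via rejection sampling from the triangle (illustrative; x0 = 0.6 and the bandwidth are arbitrary):

    import numpy as np

    rng = np.random.default_rng(4)
    n = 2_000_000
    x = rng.uniform(0, 2, size=n)
    y = rng.uniform(0, 2, size=n)
    inside = (y < 2 * x) & (y < 4 - 2 * x)      # keep only points inside the triangle
    x, y = x[inside], y[inside]

    x0 = 0.6
    band = np.abs(x - x0) < 0.01                # condition on X being near x0
    print(y[band].mean(), x0)                   # E[Y | X = 0.6] should be about 0.6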

17. Let X and Y be iid exponential random variables with parameter λ, and set Z =
X + Y . Find the conditional expectation of X given Z.
Solution: We need to compute the conditional pdf of X given Z, which is by definition
\[
f_{X|Z}(x|z) = \begin{cases} \dfrac{f(x,z)}{f_Z(z)} & \text{if } f_Z(z) > 0 \\[2mm] 0 & \text{if } f_Z(z) = 0 \end{cases}
\]
where f is the joint density of X and Z. In order to find the joint density of X and Z we use the formula
\[
f(x,z) = f_X(x)\, f_{Z|X}(z|x)
\]
Basically we first determine the conditional distribution function of Z given X, i.e., P(Z ≤ z | X = x). Then we have the relation
\[
P(Z \le z \mid X = x) = \int_{-\infty}^{z} f_{Z|X}(t|x)\,dt
\]
Now
\[
\begin{aligned}
P(Z \le z \mid X = x) &= P(X + Y \le z \mid X = x) = P(x + Y \le z \mid X = x) \\
&= P(x + Y \le z) \qquad (\because X, Y \text{ are independent}) \\
&= P(Y \le z - x) = \int_{-\infty}^{z-x} f_Y(y)\,dy = \int_{-\infty}^{z} f_Y(t - x)\,dt \qquad (\text{put } y = t - x)
\end{aligned}
\]
Hence f_{Z|X}(z|x) = f_Y(z − x).

Also, the density of a sum of independent random variables is the convolution of the marginal densities, hence
\[
f_Z(z) = (f_X * f_Y)(z) = \begin{cases} 0 & \text{if } z < 0 \\ \lambda^2 z e^{-\lambda z} & \text{if } z \ge 0 \end{cases}
\]
Hence
\[
f_{X|Z}(x|z) = \begin{cases} \dfrac{f_X(x)\, f_Y(z - x)}{f_Z(z)} & \text{if } f_Z(z) > 0 \\[2mm] 0 & \text{if } f_Z(z) = 0 \end{cases}
= \begin{cases} \dfrac{\lambda e^{-\lambda x}\, \lambda e^{-\lambda(z-x)}}{\lambda^2 z e^{-\lambda z}} & \text{if } z > 0,\ 0 \le x \le z \\[2mm] 0 & \text{otherwise} \end{cases}
= \begin{cases} \dfrac{1}{z} & \text{if } 0 \le x \le z \\[2mm] 0 & \text{otherwise} \end{cases}
\]
Now
\[
E[X \mid Z = z] = \int_{-\infty}^{\infty} x\, f_{X|Z}(x|z)\,dx = \int_{0}^{z} \frac{x}{z}\,dx = \frac{z}{2}
\]
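A Monte Carlo spot-check (illustrative; λ = 1.5 and z = 2.0 are arbitrary, and it assumes NumPy):

    import numpy as np

    rng = np.random.default_rng(5)
    lam, z, n = 1.5, 2.0, 2_000_000
    x = rng.exponential(1 / lam, size=n)
    y = rng.exponential(1 / lam, size=n)
    s = x + y
    band = np.abs(s - z) < 0.01              # condition on X + Y being near z
    print(x[band].mean(), z / 2)             # E[X | Z = z] should be about 1.0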

18. Suppose that X and Y are random variables with the same variance. Show that
X − Y and X + Y are uncorrelated.

Solution:
\[
\begin{aligned}
E[(X + Y)(X - Y)] &= E\left[X^2 - Y^2\right] = E[X^2] - E[Y^2] \\
&= \operatorname{var}(X) + (E[X])^2 - \operatorname{var}(Y) - (E[Y])^2 = (E[X] + E[Y])(E[X] - E[Y]) \\
&= E[X + Y]\, E[X - Y]
\end{aligned}
\]
Hence cov(X + Y, X − Y) = E[(X + Y)(X − Y)] − E[X + Y]E[X − Y] = 0, i.e., X − Y and X + Y are uncorrelated.

19. Find the characteristic function of the following distributions:


(a) exp(λ) (b) geometric(p).

Solution: (a) exp(λ):
\[
\phi_X(t) = E\left[e^{itX}\right] = \int_{0}^{\infty} e^{itx}\,\lambda e^{-\lambda x}\,dx = \lambda \int_{0}^{\infty} e^{(it-\lambda)x}\,dx = \lambda \left[\frac{e^{(it-\lambda)x}}{it-\lambda}\right]_{0}^{\infty} = \frac{-\lambda}{it-\lambda} = \frac{\lambda}{\lambda - it}
\]

(b) geometric(p):
\[
\begin{aligned}
\phi_X(t) = E\left[e^{itX}\right] &= \sum_{k=1}^{\infty} e^{itk}\, P(X = k) = \sum_{k=1}^{\infty} e^{itk} (1-p)^{k-1} p = p e^{it} \sum_{k=1}^{\infty} e^{it(k-1)} (1-p)^{k-1} \\
&= p e^{it} \sum_{k=1}^{\infty} \left[e^{it}(1-p)\right]^{k-1} = \frac{p e^{it}}{1 - e^{it}(1-p)}
\end{aligned}
\]
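A numerical spot-check of the exp(λ) characteristic function (illustrative values of λ and t; assumes SciPy):

    import numpy as np
    from scipy.integrate import quad

    lam, t = 2.0, 1.3
    real, _ = quad(lambda x: np.cos(t * x) * lam * np.exp(-lam * x), 0, np.inf)
    imag, _ = quad(lambda x: np.sin(t * x) * lam * np.exp(-lam * x), 0, np.inf)
    print(complex(real, imag), lam / (lam - 1j * t))   # the two values should agree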

20. Let X be a random variable such that X and −X have the same distribution. Show that φX(t) ∈ R for all t ∈ R.

Solution: If X and −X have the same distribution, then by definition
\[
\phi_X(t) = \phi_{-X}(t) = E\left[e^{it(-X)}\right] = E\left[e^{i(-t)X}\right] = \phi_X(-t) = \overline{\phi_X(t)} \qquad \forall t \in \mathbb{R},
\]
where the last equality holds because φX(−t) = E[cos(tX)] − iE[sin(tX)] is the complex conjugate of φX(t). A complex number equal to its own conjugate is real, hence φX(t) ∈ R for all t ∈ R.


21. Find the characteristic function of the random variable with pmf given by
\[
f(x) = \frac{1}{2^x}, \qquad x = 1, 2, \ldots
\]

Solution:
\[
\phi_X(t) = E\left[e^{itX}\right] = \sum_{x=1}^{\infty} e^{itx}\, P(X = x) = \sum_{x=1}^{\infty} e^{itx}\, \frac{1}{2^x} = \sum_{x=1}^{\infty} \left(\frac{e^{it}}{2}\right)^{x} = \frac{\frac{e^{it}}{2}}{1 - \frac{e^{it}}{2}} = \frac{e^{it}}{2 - e^{it}}
\]

22. Find the characteristic function of the random variable X with pdf given by

\[
f(x) = \begin{cases} 1 - |x|, & \text{if } |x| \le 1 \\ 0, & \text{otherwise.} \end{cases}
\]

Solution:
\[
\begin{aligned}
\phi_X(t) = E\left[e^{itX}\right] &= \int_{-1}^{1} e^{itx}(1 - |x|)\,dx = \int_{-1}^{1} (1 - |x|)\cos tx\,dx + i\int_{-1}^{1} (1 - |x|)\sin tx\,dx \\
&= 2\int_{0}^{1} (1 - |x|)\cos tx\,dx + 0 = 2\int_{0}^{1} (1 - x)\cos tx\,dx \\
&= 2\left(\left[\frac{\sin tx}{t}\right]_{0}^{1} - \left[x\,\frac{\sin tx}{t}\right]_{0}^{1} + \int_{0}^{1} \frac{\sin tx}{t}\,dx\right)
= 2\left(\frac{\sin t}{t} - \frac{\sin t}{t} + \left[\frac{-\cos tx}{t^2}\right]_{0}^{1}\right) \\
&= \frac{2}{t^2}\,(1 - \cos t)
\end{aligned}
\]
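A numerical spot-check (illustrative; t = 2.0 is arbitrary, and it assumes SciPy):

    import numpy as np
    from scipy.integrate import quad

    t = 2.0
    real, _ = quad(lambda x: np.cos(t * x) * (1 - abs(x)), -1, 1)
    imag, _ = quad(lambda x: np.sin(t * x) * (1 - abs(x)), -1, 1)
    print(complex(real, imag), 2 * (1 - np.cos(t)) / t**2)   # imaginary part ~ 0; real parts agree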

23. Let X be a random variable such that P (X ∈ Z) = 1. Show that

φX (t) = φX (t + 2π), ∀t ∈ R

Solution:
\[
\phi_X(t + 2\pi) = E\left[e^{i(t+2\pi)X}\right] = E\left[e^{itX}\, e^{i2\pi X}\right] = E\left[e^{itX}\{\cos(2\pi X) + i\sin(2\pi X)\}\right]
\]
Since P(X ∈ Z) = 1, X takes integer values with probability one, so cos(2πX) = 1 and sin(2πX) = 0 almost surely. Hence φX(t + 2π) = E[e^{itX}] = φX(t).

24. Let X and Y be independent, identically distributed random variables. Show that
φX−Y (t) = |φX (t)|2 .
Solution:
\[
\begin{aligned}
\phi_{X-Y}(t) &= E\left[e^{it(X-Y)}\right] = E\left[e^{itX} e^{-itY}\right] = E\left[e^{itX}\right] E\left[e^{-itY}\right] = E\left[e^{itX}\right] E\left[e^{i(-t)Y}\right] \\
&= E\left[e^{itX}\right] E\left[e^{i(-t)X}\right] = \phi_X(t)\,\phi_X(-t) = \phi_X(t)\,\overline{\phi_X(t)} = |\phi_X(t)|^2
\end{aligned}
\]


   

25. Let X and Y be independent and identically distributed continuous random variables
with density f . Find P {X 2 > Y }.

Solution: Set B = {(x, y) ∈ R² : x² > y}.
Now
\[
\begin{aligned}
P\{X^2 > Y\} &= P\{(X,Y) \in B\} = \iint_{B} f_X(x) f_Y(y)\,dx\,dy = \iint_{B} f(x) f(y)\,dx\,dy \\
&= \int_{-\infty}^{\infty} f(x)\left(\int_{-\infty}^{x^2} f(y)\,dy\right)dx
\end{aligned}
\]

26. Let X and Y be random variables having joint density f. Find the density of Y/X.
Solution: Set Z = Y/X and A_z = {(x, y) ∈ R² : y/x ≤ z}. Note that, if x < 0, then y/x ≤ z if and only if y ≥ xz. Thus
\[
A_z = \{(x,y) \,|\, x < 0,\ y \ge xz\} \cup \{(x,y) \,|\, x > 0,\ y \le xz\}
\]
Consequently,
\[
F_Z(z) = \iint_{A_z} f(x,y)\,dx\,dy
= \int_{-\infty}^{0} \left(\int_{xz}^{\infty} f(x,y)\,dy\right)dx + \int_{0}^{\infty} \left(\int_{-\infty}^{zx} f(x,y)\,dy\right)dx
\]
In the inner integrals, we make the change of variable y = xv (with dy = x dv) to obtain
\[
\begin{aligned}
F_Z(z) &= \int_{-\infty}^{0} \left(\int_{z}^{-\infty} x f(x,xv)\,dv\right)dx + \int_{0}^{\infty} \left(\int_{-\infty}^{z} x f(x,xv)\,dv\right)dx \\
&= \int_{-\infty}^{0} \left(\int_{-\infty}^{z} (-x) f(x,xv)\,dv\right)dx + \int_{0}^{\infty} \left(\int_{-\infty}^{z} x f(x,xv)\,dv\right)dx \\
&= \int_{-\infty}^{0} \left(\int_{-\infty}^{z} |x|\, f(x,xv)\,dv\right)dx + \int_{0}^{\infty} \left(\int_{-\infty}^{z} |x|\, f(x,xv)\,dv\right)dx \\
&= \int_{-\infty}^{\infty} \left(\int_{-\infty}^{z} |x|\, f(x,xv)\,dv\right)dx = \int_{-\infty}^{z} \left(\int_{-\infty}^{\infty} |x|\, f(x,xv)\,dx\right)dv
\end{aligned}
\]
Hence Z has the density
\[
f_Z(z) = \int_{-\infty}^{\infty} |x|\, f(x,xz)\,dx
\]
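This formula can be spot-checked numerically. The sketch below (illustrative only; assumes SciPy) uses two independent standard normals, for which Y/X is known to have the standard Cauchy density 1/(π(1 + z²)):

    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import norm, cauchy

    def f(x, y):
        return norm.pdf(x) * norm.pdf(y)       # joint pdf of two independent N(0,1)

    z = 0.8                                    # arbitrary test point
    formula, _ = quad(lambda x: abs(x) * f(x, x * z), -np.inf, np.inf)
    print(formula, cauchy.pdf(z))              # the two numbers should agree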
