One Function of Two Random Variables
Given two random variables X and Y and a function g(x, y), we form a new random variable
$$Z = g(X, Y). \qquad (8-1)$$
Typical examples of g(X, Y) are X + Y, X/Y, $\sqrt{X^2+Y^2}$, $\tan^{-1}(X/Y)$, etc. To determine the p.d.f of Z, we start with its distribution function:
$$F_Z(z) = P\{g(X,Y) \le z\} = \iint_{D_z} f_{XY}(x,y)\,dx\,dy, \qquad (8-3)$$
where D_z represents the region of the xy-plane where g(x, y) ≤ z (Fig. 8.1).
PILLAI
Example 8.1: Z = X + Y. Find f_Z(z).
Solution:
$$F_Z(z) = P\{X + Y \le z\} = \int_{y=-\infty}^{\infty}\int_{x=-\infty}^{z-y} f_{XY}(x,y)\,dx\,dy, \qquad (8-4)$$
since the region of the xy-plane where x + y ≤ z is the half-plane to the left of the line x = z − y (Fig. 8.2).
We can find f_Z(z) by differentiating F_Z(z) directly. In this context, it is useful to recall the differentiation rule in (7-15) - (7-16) due to Leibnitz. Suppose
$$H(z) = \int_{a(z)}^{b(z)} h(x,z)\,dx. \qquad (8-5)$$
Then
$$\frac{dH(z)}{dz} = \frac{db(z)}{dz}\,h\big(b(z),\,z\big) - \frac{da(z)}{dz}\,h\big(a(z),\,z\big) + \int_{a(z)}^{b(z)} \frac{\partial h(x,z)}{\partial z}\,dx. \qquad (8-6)$$
Using (8-6) in (8-4) we get
$$f_Z(z) = \int_{-\infty}^{\infty}\frac{\partial}{\partial z}\left(\int_{-\infty}^{z-y} f_{XY}(x,y)\,dx\right)dy = \int_{-\infty}^{\infty}\big(f_{XY}(z-y,\,y)\cdot 1 + 0\big)\,dy = \int_{-\infty}^{\infty} f_{XY}(z-y,\,y)\,dy. \qquad (8-7)$$
Suppose now that X and Y are both nonnegative random variables. Then the region of integration in (8-4) reduces to the triangle bounded by the coordinate axes and the line x = z − y, which runs from (z, 0) to (0, z) (Fig. 8.4). In that case
$$F_Z(z) = \int_{y=0}^{z}\int_{x=0}^{z-y} f_{XY}(x,y)\,dx\,dy$$
or
$$f_Z(z) = \int_{y=0}^{z}\frac{\partial}{\partial z}\left(\int_{x=0}^{z-y} f_{XY}(x,y)\,dx\right)dy = \begin{cases}\displaystyle\int_{0}^{z} f_{XY}(z-y,\,y)\,dy, & z > 0,\\[4pt] 0, & z \le 0.\end{cases} \qquad (8-12)$$
On the other hand, by considering vertical strips first in
Fig. 8.4, we get
$$F_Z(z) = \int_{x=0}^{z}\int_{y=0}^{z-x} f_{XY}(x,y)\,dy\,dx$$
or
$$f_Z(z) = \begin{cases}\displaystyle\int_{0}^{z} f_{XY}(x,\,z-x)\,dx = \int_{0}^{z} f_X(x)\,f_Y(z-x)\,dx, & z > 0,\\[4pt] 0, & z \le 0,\end{cases} \qquad (8-13)$$
if X and Y are independent random variables.
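As a quick numerical illustration of the convolution form of (8-13), the sketch below evaluates the integral by a midpoint Riemann sum for two independent exponential densities with two different, arbitrarily chosen rates (lam1, lam2 are not from the text), and compares against the standard two-rate closed form.

```python
import math

# Midpoint-rule evaluation of the convolution (8-13) for two independent
# exponential densities; the rates lam1, lam2 are arbitrary illustrative
# choices, and f_Z_closed is the standard two-rate closed form
# lam1*lam2*(exp(-lam1*z) - exp(-lam2*z)) / (lam2 - lam1).
lam1, lam2 = 1.0, 2.0

def f_X(x):
    return lam1 * math.exp(-lam1 * x) if x >= 0 else 0.0

def f_Y(y):
    return lam2 * math.exp(-lam2 * y) if y >= 0 else 0.0

def f_Z(z, n=20000):
    # (8-13): f_Z(z) = integral_0^z f_X(x) f_Y(z - x) dx
    dx = z / n
    return sum(f_X((i + 0.5) * dx) * f_Y(z - (i + 0.5) * dx)
               for i in range(n)) * dx

def f_Z_closed(z):
    return lam1 * lam2 / (lam2 - lam1) * (math.exp(-lam1 * z) - math.exp(-lam2 * z))
```

For any z > 0 the two evaluations agree to within the quadrature error, confirming that the density of the sum is the convolution of the marginals.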
Example 8.2: Suppose X and Y are independent exponential r.vs with common parameter λ, and let Z = X + Y. Determine f_Z(z).
Solution: We have
$$f_X(x) = \lambda e^{-\lambda x}\,U(x), \qquad f_Y(y) = \lambda e^{-\lambda y}\,U(y), \qquad (8-14)$$
and we can make use of (8-13) to obtain the p.d.f of Z = X + Y:
$$f_Z(z) = \int_{0}^{z} \lambda^2 e^{-\lambda x}\,e^{-\lambda(z-x)}\,dx = \lambda^2 e^{-\lambda z}\int_{0}^{z} dx = z\,\lambda^2 e^{-\lambda z}\,U(z). \qquad (8-15)$$
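The density in (8-15) integrates to the CDF F_Z(z) = 1 − e^{−λz}(1 + λz), which a seeded Monte Carlo experiment can confirm. In the sketch below the rate λ and the sample size are arbitrary choices, not values from the text.

```python
import math
import random

# Monte Carlo sanity check of (8-15): integrating the density
# lam**2 * z * exp(-lam*z) gives the CDF 1 - exp(-lam*z)*(1 + lam*z).
# The rate lam, the seed and the sample size are arbitrary choices.
random.seed(12345)
lam, n = 0.5, 200_000
samples = [random.expovariate(lam) + random.expovariate(lam) for _ in range(n)]

def cdf_empirical(z):
    # fraction of simulated sums X + Y that fall at or below z
    return sum(s <= z for s in samples) / n

def cdf_exact(z):
    return 1.0 - math.exp(-lam * z) * (1.0 + lam * z)
```

At any fixed z the empirical and exact CDF values agree to within Monte Carlo error (about 1/√n here).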
Next, let X and Y be independent uniform r.vs in the interval (0, 1), so that Z = X + Y takes values in (0, 2). Fig. 8.5 shows the region x + y ≤ z intersected with the unit square for (a) 0 ≤ z < 1 and (b) 1 ≤ z < 2.
For 0 ≤ z < 1,
$$F_Z(z) = \int_{y=0}^{z}\int_{x=0}^{z-y} 1\,dx\,dy = \int_{y=0}^{z}(z-y)\,dy = \frac{z^2}{2}, \qquad 0 \le z < 1. \qquad (8-16)$$
For 1 ≤ z < 2, notice that it is easy to deal with the unshaded region. In that case
$$F_Z(z) = 1 - P\{Z > z\} = 1 - \int_{y=z-1}^{1}\int_{x=z-y}^{1} 1\,dx\,dy = 1 - \int_{y=z-1}^{1}(1 - z + y)\,dy = 1 - \frac{(2-z)^2}{2}, \qquad 1 \le z < 2. \qquad (8-17)$$
Using (8-16) - (8-17), we obtain
$$f_Z(z) = \frac{dF_Z(z)}{dz} = \begin{cases} z, & 0 \le z < 1,\\ 2 - z, & 1 \le z < 2.\end{cases} \qquad (8-18)$$
By direct convolution of f_X(x) and f_Y(y), we obtain the same result as above. In fact, for 0 ≤ z < 1 (Fig. 8.6(a)),
$$f_Z(z) = \int f_X(z-x)\,f_Y(x)\,dx = \int_{0}^{z} 1\,dx = z, \qquad (8-19)$$
and for 1 ≤ z < 2 (Fig. 8.6(b)) the overlap of the two rectangles gives
$$f_Z(z) = \int_{z-1}^{1} 1\,dx = 2 - z.$$
Fig. 8.6(c) shows f_Z(z), which agrees with the convolution of two rectangular waveforms as well.
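The rectangular-waveform convolution can be reproduced numerically. The sketch below convolves two unit rectangles on (0, 1) by a midpoint Riemann sum and recovers the triangular values of (8-18); the grid size is an arbitrary choice.

```python
# Numerical convolution of two unit-height rectangles on (0, 1),
# reproducing the triangular density (8-18): f_Z(z) = z on [0, 1) and
# 2 - z on [1, 2). The grid size n is an arbitrary choice.
def f_rect(x):
    return 1.0 if 0.0 <= x <= 1.0 else 0.0

def f_Z(z, n=20000):
    # f_Z(z) = integral f_X(z - x) f_Y(x) dx, midpoint rule over (0, 1)
    dx = 1.0 / n
    return sum(f_rect(z - (i + 0.5) * dx) * f_rect((i + 0.5) * dx)
               for i in range(n)) * dx
```

The computed values peak at 1 for z = 1 and fall off linearly on both sides, exactly as in Fig. 8.6(c).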
Fig. 8.6: (a) f_Y(x), the shifted f_X(z − x), and their product f_X(z − x)f_Y(x) for 0 ≤ z < 1; (b) the same quantities for 1 ≤ z < 2; (c) the resulting triangular p.d.f f_Z(z) on (0, 2).
Example 8.3: Let Z = X − Y. Determine its p.d.f f_Z(z).
Solution: From (8-3) and Fig. 8.7
$$F_Z(z) = P\{X - Y \le z\} = \int_{y=-\infty}^{\infty}\int_{x=-\infty}^{y+z} f_{XY}(x,y)\,dx\,dy,$$
and hence
$$f_Z(z) = \frac{dF_Z(z)}{dz} = \int_{-\infty}^{\infty}\frac{\partial}{\partial z}\left(\int_{-\infty}^{y+z} f_{XY}(x,y)\,dx\right)dy = \int_{-\infty}^{\infty} f_{XY}(y+z,\,y)\,dy. \qquad (8-21)$$
Fig. 8.7: the region x − y ≤ z is the half-plane to the left of the line x = y + z.
As a special case, suppose
$$f_X(x) = 0,\ x < 0, \qquad\text{and}\qquad f_Y(y) = 0,\ y < 0.$$
In this case, Z can be negative as well as positive, and this gives rise to two situations that should be analyzed separately, since the regions of integration for z ≥ 0 and z < 0 are quite different. For z ≥ 0, from Fig. 8.8(a),
$$F_Z(z) = \int_{y=0}^{\infty}\int_{x=0}^{y+z} f_{XY}(x,y)\,dx\,dy,$$
and for z < 0, from Fig. 8.8(b),
$$F_Z(z) = \int_{y=-z}^{\infty}\int_{x=0}^{y+z} f_{XY}(x,y)\,dx\,dy.$$
Example 8.4: Given Z = X/Y, determine its p.d.f f_Z(z).
Solution: The event {X/Y ≤ z} corresponds to the region x ≤ yz with y > 0 together with the region x ≥ yz with y < 0; the shaded areas in Fig. 8.9(a)-(b) show these two regions.
Integrating over these two regions, we get
$$F_Z(z) = \int_{y=0}^{\infty}\int_{x=-\infty}^{yz} f_{XY}(x,y)\,dx\,dy + \int_{y=-\infty}^{0}\int_{x=yz}^{\infty} f_{XY}(x,y)\,dx\,dy. \qquad (8-26)$$
If X and Y are nonnegative random variables, then the region of integration reduces to the wedge 0 ≤ x ≤ yz, y ≥ 0 shown in Fig. 8.10. This gives
$$F_Z(z) = \int_{y=0}^{\infty}\int_{x=0}^{yz} f_{XY}(x,y)\,dx\,dy$$
or
$$f_Z(z) = \begin{cases}\displaystyle\int_{0}^{\infty} y\,f_{XY}(yz,\,y)\,dy, & z > 0,\\[4pt] 0, & \text{otherwise.}\end{cases} \qquad (8-28)$$
Example 8.5: X and Y are jointly normal random variables
with zero mean so that
$$f_{XY}(x,y) = \frac{1}{2\pi\sigma_1\sigma_2\sqrt{1-r^2}}\,\exp\!\left\{-\frac{1}{2(1-r^2)}\left(\frac{x^2}{\sigma_1^2} - \frac{2rxy}{\sigma_1\sigma_2} + \frac{y^2}{\sigma_2^2}\right)\right\}. \qquad (8-29)$$
Thus, for the ratio Z = X/Y,
$$f_Z(z) = \frac{\sigma_1\sigma_2\sqrt{1-r^2}\,/\,\pi}{\sigma_2^2\,(z - r\sigma_1/\sigma_2)^2 + \sigma_1^2(1-r^2)}, \qquad -\infty < z < \infty, \qquad (8-30)$$
which represents a Cauchy-type r.v. centered at rσ1/σ2.
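For the special case r = 0 and σ1 = σ2, (8-30) reduces to the standard Cauchy density (1/π)/(1 + z²), whose CDF is 1/2 + arctan(z)/π. The seeded Monte Carlo sketch below (seed and sample size are arbitrary choices) checks the resulting values P{Z ≤ 0} = 1/2 and P{Z ≤ 1} = 3/4.

```python
import random

# Monte Carlo sketch for the special case r = 0, sigma1 = sigma2 of (8-30):
# Z = X/Y with X, Y independent standard normals is standard Cauchy, whose
# CDF is 1/2 + arctan(z)/pi, so P(Z <= 0) = 1/2 and P(Z <= 1) = 3/4.
# Seed and sample size are arbitrary choices.
random.seed(7)
n = 200_000
ratios = []
for _ in range(n):
    x = random.gauss(0.0, 1.0)
    y = random.gauss(0.0, 1.0)
    ratios.append(x / y)

p_le_0 = sum(r <= 0.0 for r in ratios) / n
p_le_1 = sum(r <= 1.0 for r in ratios) / n
```

Note that no law of large numbers applies to the *mean* of these ratios (the Cauchy distribution has no mean), but the empirical CDF still converges.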
Example 8.6: Given Z = X² + Y², determine its p.d.f f_Z(z).
Solution: Here the region {X² + Y² ≤ z} is the disc of radius √z centered at the origin (Fig. 8.11), so that
$$F_Z(z) = P\{X^2 + Y^2 \le z\} = \iint_{x^2+y^2\le z} f_{XY}(x,y)\,dx\,dy,$$
and differentiating with respect to z as before gives
$$f_Z(z) = \int_{-\sqrt z}^{\sqrt z} \frac{1}{2\sqrt{z-y^2}}\Big[f_{XY}\big(\sqrt{z-y^2},\,y\big) + f_{XY}\big(-\sqrt{z-y^2},\,y\big)\Big]\,dy. \qquad (8-34)$$
Example 8.7: X and Y are independent normal r.vs with zero mean and common variance σ². Determine f_Z(z) for Z = X² + Y².
Solution: Direct substitution of (8-29) with r = 0, σ1 = σ2 = σ into (8-34) gives
$$f_Z(z) = \int_{-\sqrt z}^{\sqrt z}\frac{1}{2\sqrt{z-y^2}}\cdot 2\cdot\frac{1}{2\pi\sigma^2}\,e^{-\left((z-y^2)+y^2\right)/2\sigma^2}\,dy = \frac{e^{-z/2\sigma^2}}{2\pi\sigma^2}\int_{-\sqrt z}^{\sqrt z}\frac{dy}{\sqrt{z-y^2}},$$
and, with the substitution y = √z sin θ, dy = √z cos θ dθ,
$$f_Z(z) = \frac{e^{-z/2\sigma^2}}{2\pi\sigma^2}\cdot 2\int_{0}^{\pi/2}\frac{\sqrt z\cos\theta}{\sqrt z\cos\theta}\,d\theta = \frac{1}{2\sigma^2}\,e^{-z/2\sigma^2}\,U(z), \qquad (8-35)$$
which represents an exponential r.v.
If instead Z = √(X² + Y²), the same procedure applied to the disc of radius z gives
$$f_Z(z) = \int_{-z}^{z} \frac{z}{\sqrt{z^2-y^2}}\Big[f_{XY}\big(\sqrt{z^2-y^2},\,y\big) + f_{XY}\big(-\sqrt{z^2-y^2},\,y\big)\Big]\,dy, \qquad (8-36)$$
and substituting (8-29) with r = 0, σ1 = σ2 = σ together with y = z sin θ yields
$$f_Z(z) = \frac{2z}{2\pi\sigma^2}\,e^{-z^2/2\sigma^2}\cdot 2\int_{0}^{\pi/2}\frac{z\cos\theta}{z\cos\theta}\,d\theta = \frac{z}{\sigma^2}\,e^{-z^2/2\sigma^2}\,U(z), \qquad (8-37)$$
which represents a Rayleigh distribution.
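The Rayleigh density in (8-37) integrates to the CDF F_Z(z) = 1 − e^{−z²/2σ²}, which the seeded Monte Carlo sketch below confirms; σ, the seed and the sample size are arbitrary choices for the experiment.

```python
import math
import random

# Monte Carlo check of the Rayleigh result (8-37): Z = sqrt(X**2 + Y**2)
# for independent zero-mean normals with common variance sigma**2 has CDF
# 1 - exp(-z**2 / (2*sigma**2)). sigma, seed and sample size are arbitrary.
random.seed(42)
sigma, n = 2.0, 200_000
samples = [math.hypot(random.gauss(0.0, sigma), random.gauss(0.0, sigma))
           for _ in range(n)]

def cdf_empirical(z):
    # fraction of simulated radii sqrt(X**2 + Y**2) at or below z
    return sum(s <= z for s in samples) / n

def cdf_rayleigh(z):
    return 1.0 - math.exp(-z * z / (2.0 * sigma * sigma))
```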
As a result, if X and Y are independent zero-mean normal r.vs with a common variance, then U = X/Y is Cauchy with f_U(u) = (1/π)/(1 + u²) by (8-30), and the phase Θ = tan⁻¹(U) = tan⁻¹(X/Y) has the p.d.f
$$f_\Theta(\theta) = \frac{f_U(\tan\theta)}{|d\theta/du|} = \frac{1}{\pi}\cdot\frac{1}{1+\tan^2\theta}\cdot\sec^2\theta = \begin{cases}1/\pi, & -\pi/2 < \theta < \pi/2,\\[2pt] 0, & \text{otherwise},\end{cases} \qquad (8-40)$$
i.e., Θ is uniformly distributed in (−π/2, π/2).
If instead X has a nonzero mean μ, i.e., X ~ N(μ, σ²) and Y ~ N(0, σ²) are independent, a similar computation for Z = √(X² + Y²) gives
$$f_Z(z) = \frac{z\,e^{-(z^2+\mu^2)/2\sigma^2}}{2\pi\sigma^2}\left[\int_{-\pi/2}^{\pi/2} e^{z\mu\cos\theta/\sigma^2}\,d\theta + \int_{-\pi/2}^{\pi/2} e^{-z\mu\cos\theta/\sigma^2}\,d\theta\right] = \frac{z}{\sigma^2}\,e^{-(z^2+\mu^2)/2\sigma^2}\,I_0\!\left(\frac{z\mu}{\sigma^2}\right)U(z), \qquad (8-41)$$
where
$$I_0(\eta) = \frac{1}{2\pi}\int_{0}^{2\pi} e^{\eta\cos\theta}\,d\theta = \frac{1}{\pi}\int_{0}^{\pi} e^{\eta\cos\theta}\,d\theta \qquad (8-42)$$
is the modified Bessel function of the first kind and zeroth order; Z is said to be a Rician r.v.
Example 8.11: Let Z = max(X, Y), so that Z = X when X > Y and Z = Y when X ≤ Y. The shaded areas in Fig. 8.12(a)-(b) represent the events (a) P(X ≤ z, X > Y) and (b) P(Y ≤ z, X ≤ Y), and Fig. 8.12(c) represents the total region, and from there
$$F_Z(z) = P\{X \le z,\, Y \le z\} = F_{XY}(z, z). \qquad (8-46)$$
If X and Y are independent, then F_Z(z) = F_X(z)F_Y(z) and hence
$$f_Z(z) = F_X(z)\,f_Y(z) + f_X(z)\,F_Y(z). \qquad (8-47)$$
Similarly,
$$W = \min(X, Y) = \begin{cases} Y, & X > Y,\\ X, & X \le Y.\end{cases} \qquad (8-48)$$
Thus
$$F_W(w) = P\{\min(X,Y) \le w\} = P\{(Y \le w,\, X > Y)\cup(X \le w,\, X \le Y)\}.$$
Once again, the shaded areas in Fig. 8.13(a)-(b) show the regions satisfying the above inequalities, and Fig. 8.13(c) shows the overall region: its complement is the quadrant {x > w, y > w} with corner at (w, w). From Fig. 8.13(c),
$$F_W(w) = P\{X \le w\} + P\{Y \le w\} - P\{X \le w,\, Y \le w\} = F_X(w) + F_Y(w) - F_{XY}(w, w),$$
and hence, for independent X and Y,
$$f_W(w) = f_X(w) + f_Y(w) - \big[f_X(w)\,F_Y(w) + F_X(w)\,f_Y(w)\big].$$
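Both the max formula (8-47) and the min formula above can be exercised on independent Exp(1) r.vs, where everything is available in closed form; in particular the minimum of two i.i.d. Exp(1) r.vs should come out Exp(2). The sketch below is illustrative only.

```python
import math

# Exercising (8-47) and the min(X, Y) density formula on independent
# Exp(1) r.vs, where f(x) = exp(-x) and F(x) = 1 - exp(-x) in closed form.
# The minimum of two i.i.d. Exp(1) r.vs should be Exp(2), i.e. 2*exp(-2w).
def f(x):
    return math.exp(-x)

def F(x):
    return 1.0 - math.exp(-x)

def f_max(z):
    # (8-47): F_X(z) f_Y(z) + f_X(z) F_Y(z)
    return F(z) * f(z) + f(z) * F(z)

def f_min(w):
    # f_X + f_Y - [f_X F_Y + F_X f_Y]
    return f(w) + f(w) - (f(w) * F(w) + F(w) * f(w))
```

Algebraically, f_min(w) = 2e^{−w} − 2e^{−w}(1 − e^{−w}) = 2e^{−2w}, and the code reproduces this pointwise.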
Example 8.12: Suppose X and Y are independent exponential r.vs with common parameter λ, and let Z = min(X, Y)/max(X, Y). Reasoning as for the ratio X/Y, we get
$$f_Z(z) = \int_0^{\infty} y\,f_{XY}(yz,\,y)\,dy + \int_0^{\infty} x\,f_{XY}(x,\,xz)\,dx = \int_0^{\infty} y\big[f_{XY}(yz,\,y) + f_{XY}(y,\,yz)\big]\,dy$$
$$= 2\lambda^2\int_0^{\infty} y\,e^{-\lambda(1+z)y}\,dy = \frac{2}{(1+z)^2}\int_0^{\infty} u\,e^{-u}\,du = \begin{cases}\dfrac{2}{(1+z)^2}, & 0 < z < 1,\\[4pt] 0, & \text{otherwise.}\end{cases} \qquad (8-54)$$
Fig. 8.15 shows f_Z(z), which decreases from 2 at z = 0 to 1/2 at z = 1.
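The density (8-54) integrates to F_Z(z) = 2z/(1 + z) on (0, 1), and since Z is a ratio it is scale-invariant, so the value of λ should not matter. The seeded Monte Carlo sketch below (seed and sample size are arbitrary) checks this CDF at a few points.

```python
import random

# Monte Carlo check of (8-54): Z = min(X, Y)/max(X, Y) for i.i.d.
# exponentials has CDF 2z/(1 + z) on (0, 1). Z is scale-invariant, so
# the rate (1.0 here) is immaterial; seed and sample size are arbitrary.
random.seed(99)
n = 200_000
zs = []
for _ in range(n):
    x = random.expovariate(1.0)
    y = random.expovariate(1.0)
    zs.append(min(x, y) / max(x, y))

def cdf_empirical(z):
    return sum(s <= z for s in zs) / n

def cdf_exact(z):
    return 2.0 * z / (1.0 + z)
```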
Example 8.13 (Discrete Case): Let X and Y be independent Poisson random variables with parameters λ1 and λ2 respectively. Let Z = X + Y. Determine the p.m.f of Z.
Solution: Since X and Y both take integer values 0, 1, 2, …, the same is true for Z. For any n = 0, 1, 2, …, the event X + Y = n gives only a finite number of options for X and Y. In fact, if X = 0, then Y must be n; if X = 1, then Y must be n − 1, etc. Thus the event {X + Y = n} is the union of (n + 1) mutually exclusive events A_k given by
$$A_k = \{X = k,\; Y = n-k\}, \qquad k = 0, 1, 2, \ldots, n. \qquad (8-55)$$
As a result
$$P(Z = n) = P(X + Y = n) = P\left(\bigcup_{k=0}^{n}\{X = k,\, Y = n-k\}\right) = \sum_{k=0}^{n} P(X = k,\, Y = n-k). \qquad (8-56)$$
Since X and Y are independent, this becomes
$$P(Z = n) = \sum_{k=0}^{n} P(X=k)\,P(Y=n-k) = \sum_{k=0}^{n} e^{-\lambda_1}\frac{\lambda_1^k}{k!}\,e^{-\lambda_2}\frac{\lambda_2^{\,n-k}}{(n-k)!} = \frac{e^{-(\lambda_1+\lambda_2)}}{n!}\sum_{k=0}^{n}\frac{n!}{k!\,(n-k)!}\,\lambda_1^k\,\lambda_2^{\,n-k}$$
$$= e^{-(\lambda_1+\lambda_2)}\,\frac{(\lambda_1+\lambda_2)^n}{n!}, \qquad n = 0, 1, 2, \ldots. \qquad (8-57)$$
Thus Z represents a Poisson random variable with parameter λ1 + λ2, indicating that the sum of independent Poisson random variables is also a Poisson random variable whose parameter is the sum of the parameters of the original random variables.
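The convolution sum (8-56) can be checked exactly in a few lines. With arbitrary illustrative parameters (lam1, lam2 below are not from the text), the convolved p.m.f matches the Poisson(λ1 + λ2) p.m.f of (8-57) term by term.

```python
import math

# Exact check of (8-56)-(8-57): convolving Poisson(lam1) and Poisson(lam2)
# p.m.f.s reproduces the Poisson(lam1 + lam2) p.m.f term by term.
# lam1, lam2 are arbitrary illustrative parameters.
lam1, lam2 = 2.0, 3.5

def poisson_pmf(lam, n):
    return math.exp(-lam) * lam**n / math.factorial(n)

def pmf_Z(n):
    # (8-56): P(Z = n) = sum_{k=0}^{n} P(X = k) P(Y = n - k)
    return sum(poisson_pmf(lam1, k) * poisson_pmf(lam2, n - k)
               for k in range(n + 1))
```

Unlike the continuous examples, no quadrature or sampling is needed here: the discrete convolution is a finite sum and the agreement is exact up to floating-point rounding.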
As the last example illustrates, the above procedure for
determining the p.m.f of functions of discrete random
variables is somewhat tedious. As we shall see in Lecture 10,
the joint characteristic function can be used in this context
to solve problems of this type in an easier fashion.