Jointly Distributed Random Variables: Jeff Chak Fu WONG
Department of Mathematics
The Chinese University of Hong Kong
MATH3280
Introductory Probability
Joint Distribution Functions
Thus far, we have concerned ourselves only with probability distributions for single random variables. However, we are often interested in probability statements concerning two or more random variables.
In order to deal with such probabilities, we define,
Definition 1
For any two random variables X and Y, the joint cumulative probability distribution function (joint CDF) of X and Y is defined by
\[
F(a, b) = P\{X \le a,\ Y \le b\}, \qquad -\infty < a, b < \infty.
\]
The distribution of X alone can be recovered from the joint CDF:
\[
F_X(a) = P\{X \le a\} = P\{X \le a,\ Y < \infty\}
= P\Bigl(\lim_{b \to \infty}\{X \le a,\ Y \le b\}\Bigr)
= \lim_{b \to \infty} P\{X \le a,\ Y \le b\}
= \lim_{b \to \infty} F(a, b) \equiv F(a, \infty).
\]
Similarly, the CDF of Y is
\[
F_Y(b) = P\{Y \le b\} = \lim_{a \to \infty} F(a, b) \equiv F(\infty, b).
\]
When X and Y are both discrete random variables, it is convenient to define the joint probability mass function of X and Y by
\[
p(x, y) = P\{X = x,\ Y = y\}.
\]
The probability mass function of X can then be obtained from p(x, y) by
\[
p_X(x) = \sum_{y:\, p(x, y) > 0} p(x, y).
\]
Similarly,
\[
p_Y(y) = \sum_{x:\, p(x, y) > 0} p(x, y).
\]
In particular,
\[
F(a, b) = \sum_{(x, y):\, x \le a,\ y \le b} p(x, y).
\]
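These sums are easy to carry out mechanically. The following Python sketch (the dictionary p below is a made-up placeholder PMF, chosen only for illustration) computes the marginal mass functions and the joint CDF directly from a joint PMF stored as a dictionary:

# Sketch: marginals and joint CDF computed from a joint PMF.
# The particular PMF below is a placeholder chosen only for illustration.
p = {(0, 0): 0.2, (0, 1): 0.3, (1, 0): 0.1, (1, 1): 0.4}

def p_X(x):
    # p_X(x) = sum of p(x, y) over all y with p(x, y) > 0
    return sum(prob for (xx, _), prob in p.items() if xx == x)

def p_Y(y):
    # p_Y(y) = sum of p(x, y) over all x with p(x, y) > 0
    return sum(prob for (_, yy), prob in p.items() if yy == y)

def F(a, b):
    # F(a, b) = sum of p(x, y) over all (x, y) with x <= a and y <= b
    return sum(prob for (x, y), prob in p.items() if x <= a and y <= b)

print(p_X(0), p_Y(1), F(0, 1))   # 0.5 0.7 0.5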
[Figure: a sketch of what the joint PMF of two discrete random variables could look like.]

Example 1
Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white, and 5 blue balls. Let X and Y denote, respectively, the number of red and white balls chosen. Then the joint probability mass function p(i, j) = P{X = i, Y = j} is
\[
\begin{aligned}
p(0,0) &= \binom{5}{3}\Big/\binom{12}{3} = \tfrac{10}{220}, &
p(0,1) &= \binom{4}{1}\binom{5}{2}\Big/\binom{12}{3} = \tfrac{40}{220}, \\
p(0,2) &= \binom{4}{2}\binom{5}{1}\Big/\binom{12}{3} = \tfrac{30}{220}, &
p(0,3) &= \binom{4}{3}\Big/\binom{12}{3} = \tfrac{4}{220}, \\
p(1,0) &= \binom{3}{1}\binom{5}{2}\Big/\binom{12}{3} = \tfrac{30}{220}, &
p(1,1) &= \binom{3}{1}\binom{4}{1}\binom{5}{1}\Big/\binom{12}{3} = \tfrac{60}{220}, \\
p(1,2) &= \binom{3}{1}\binom{4}{2}\Big/\binom{12}{3} = \tfrac{18}{220}, &
p(2,0) &= \binom{3}{2}\binom{5}{1}\Big/\binom{12}{3} = \tfrac{15}{220}, \\
p(2,1) &= \binom{3}{2}\binom{4}{1}\Big/\binom{12}{3} = \tfrac{12}{220}, &
p(3,0) &= \binom{3}{3}\Big/\binom{12}{3} = \tfrac{1}{220}.
\end{aligned}
\]
Note that the probability mass function of X is obtained by computing the row sums, whereas the probability mass function of Y is obtained by computing the column sums.
Table 1: P{X = i, Y = j}

   i \ j        0         1         2         3     Row sum = P{X = i}
     0        10/220    40/220    30/220    4/220         84/220
     1        30/220    60/220    18/220      0          108/220
     2        15/220    12/220      0         0           27/220
     3         1/220      0         0         0            1/220
 Column sum
 = P{Y = j}   56/220   112/220    48/220    4/220
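As a check on Table 1, the following Python sketch (standard library only; the helper name p is illustrative) enumerates the same counts with math.comb and reproduces the row and column sums:

from fractions import Fraction
from math import comb

total = comb(12, 3)   # 220 ways to choose 3 balls from the 12 in the urn

def p(i, j):
    # p(i, j) = P{X = i, Y = j}: choose i red (of 3), j white (of 4), the rest blue (of 5)
    k = 3 - i - j
    if k < 0:
        return Fraction(0)
    return Fraction(comb(3, i) * comb(4, j) * comb(5, k), total)

row_sums = [sum(p(i, j) for j in range(4)) for i in range(4)]   # P{X = i}
col_sums = [sum(p(i, j) for i in range(4)) for j in range(4)]   # P{Y = j}
print(row_sums)   # 84/220, 108/220, 27/220, 1/220 (printed in lowest terms)
print(col_sums)   # 56/220, 112/220, 48/220, 4/220 (printed in lowest terms)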
If X and Y are jointly continuous with joint density function f, then differentiating the joint distribution function gives
\[
f(a, b) = \frac{\partial^2}{\partial a\, \partial b} F(a, b)
\]
wherever the partial derivatives are defined. Another interpretation
of the joint density function, obtained from Equation (4), is
When da and db are small and f is continuous at (a, b),
\[
P\{a < X < a + da,\ b < Y < b + db\} = \int_b^{b+db}\!\int_a^{a+da} f(x, y)\, dx\, dy \approx f(a, b)\, da\, db,
\]
so f(a, b) measures how likely it is that the random vector (X, Y) lies near (a, b). In turn, taking B = \mathbb{R},
\[
P\{X \in A\} = \int_A f_X(x)\, dx,
\]
where f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy is the marginal density function of X (see Proposition 1 below).
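The "small box" interpretation above can also be seen numerically. The sketch below assumes, purely for illustration, the joint density f(x, y) = e^{-(x+y)} on (0, ∞) × (0, ∞), under which X and Y behave like independent Exponential(1) variables, and compares a Monte Carlo estimate of P{a < X < a + da, b < Y < b + db} with f(a, b) da db:

import random
from math import exp

a, b, da, db = 0.5, 1.0, 0.05, 0.05
n = 10**6
hits = 0
for _ in range(n):
    # simulate (X, Y) with joint density e^{-(x+y)}: two independent Exponential(1) draws
    x, y = random.expovariate(1.0), random.expovariate(1.0)
    if a < x < a + da and b < y < b + db:
        hits += 1

print(hits / n)                     # Monte Carlo estimate of the small-box probability
print(exp(-(a + b)) * da * db)      # f(a, b) * da * db, approximately 0.00056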
Example 2
The joint density of X and Y is given by
\[
f(x, y) = \begin{cases} 2e^{-x} e^{-2y} & 0 < x < \infty,\ 0 < y < \infty \\ 0 & \text{otherwise.} \end{cases}
\]
Compute (a) P{X > 1, Y < 1}, (b) P{X < Y}, and (c) P{X < a} for a > 0.

Solution. (a)
\[
\begin{aligned}
P\{X > 1,\ Y < 1\} &= \int_0^1 \int_1^\infty 2e^{-x} e^{-2y}\, dx\, dy \\
&= \int_0^1 2e^{-2y} \left( -e^{-x} \Big|_1^\infty \right) dy \\
&= e^{-1} \int_0^1 2e^{-2y}\, dy \\
&= e^{-1} (1 - e^{-2}).
\end{aligned}
\]
(b)
\[
\begin{aligned}
P\{X < Y\} &= \iint_{(x,y):\, x < y} 2e^{-x} e^{-2y}\, dx\, dy \\
&= \int_0^\infty \int_0^y 2e^{-x} e^{-2y}\, dx\, dy \\
&= \int_0^\infty 2e^{-2y} (1 - e^{-y})\, dy \\
&= \int_0^\infty 2e^{-2y}\, dy - \int_0^\infty 2e^{-3y}\, dy \\
&= 1 - \frac{2}{3} = \frac{1}{3}.
\end{aligned}
\]
(c) For a > 0,
\[
\begin{aligned}
P\{X < a\} &= \int_0^a \int_0^\infty 2e^{-2y} e^{-x}\, dy\, dx \\
&= \int_0^a \left( -e^{-2y} \Big|_0^\infty \right) e^{-x}\, dx \\
&= \int_0^a e^{-x}\, dx \\
&= 1 - e^{-a}.
\end{aligned}
\]
■
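Because the density in Example 2 factors as 2e^{-x}e^{-2y} = e^{-x} · 2e^{-2y}, the pair (X, Y) can be simulated as independent Exponential(1) and Exponential(2) variables. The Monte Carlo sketch below (Python standard library; the choice a = 0.5 in part (c) is arbitrary) agrees with the three answers obtained above:

import random
from math import exp

n = 10**6
a = 0.5                                   # the constant in part (c); any a > 0 works
cnt_a = cnt_b = cnt_c = 0
for _ in range(n):
    x = random.expovariate(1.0)           # marginal density e^{-x}, x > 0
    y = random.expovariate(2.0)           # marginal density 2e^{-2y}, y > 0
    cnt_a += (x > 1 and y < 1)
    cnt_b += (x < y)
    cnt_c += (x < a)

print(cnt_a / n, exp(-1) * (1 - exp(-2)))  # part (a): both approximately 0.318
print(cnt_b / n, 1 / 3)                    # part (b): both approximately 0.333
print(cnt_c / n, 1 - exp(-a))              # part (c): both approximately 0.393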
Example 3
The joint density of X and Y is given by
\[
f(x, y) = \begin{cases} e^{-(x+y)} & 0 < x < \infty,\ 0 < y < \infty \\ 0 & \text{otherwise.} \end{cases}
\]
Find the density function of the random variable X/Y.
Solution. Since f(x, y) = 0 if (x, y) ∉ (0, ∞) × (0, ∞), we may assume that X and Y always take positive values, and hence so does X/Y.
For a > 0,
\[
\begin{aligned}
F_{X/Y}(a) &= P\left\{ \frac{X}{Y} \le a \right\} = P\{X \le aY\} = \iint_{x \le ay} f(x, y)\, dx\, dy \\
&= \iint_{x \le ay} e^{-(x+y)}\, dx\, dy \\
&= \iint_{(x, y) \in (0,\infty) \times (0,\infty):\, x \le ay} e^{-(x+y)}\, dx\, dy \\
&= \int_0^\infty \int_0^{ay} e^{-(x+y)}\, dx\, dy \\
&= \int_0^\infty (1 - e^{-ay})\, e^{-y}\, dy \\
&= \left[ -e^{-y} + \frac{e^{-(a+1)y}}{a+1} \right]_0^\infty = 1 - \frac{1}{a+1}.
\end{aligned}
\]
Differentiation shows that the density function of X/Y is given by
\[
f_{X/Y}(a) = \begin{cases} \dfrac{1}{(a+1)^2} & 0 < a < \infty \\ 0 & \text{otherwise.} \end{cases}
\]
■
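Example 3 can be checked by simulation as well. Since e^{-(x+y)} = e^{-x} · e^{-y}, the sketch below draws X and Y as independent Exponential(1) variables and compares the empirical distribution function of X/Y with F_{X/Y}(a) = 1 − 1/(a + 1) at a few points:

import random

n = 10**6
# X and Y simulated as independent Exponential(1) variables (joint density e^{-(x+y)})
ratios = [random.expovariate(1.0) / random.expovariate(1.0) for _ in range(n)]

for a in (0.5, 1.0, 3.0):
    empirical = sum(r <= a for r in ratios) / n
    exact = 1 - 1 / (a + 1)
    print(a, empirical, exact)   # e.g. a = 1.0 gives values close to 0.5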
Proposition 1
Suppose X and Y have a joint density function f . Then the
marginal densities of X and Y are given by
\[
f_X(a) = \int_{-\infty}^{\infty} f(a, y)\, dy, \qquad a \in \mathbb{R},
\]
and
\[
f_Y(b) = \int_{-\infty}^{\infty} f(x, b)\, dx, \qquad b \in \mathbb{R}.
\]
Proof. For any a ∈ R,
\[
F_X(a) = P\{X \le a\} = \int_{-\infty}^{a} \left( \int_{-\infty}^{\infty} f(x, y)\, dy \right) dx.
\]
Let
\[
g(x) = \int_{-\infty}^{\infty} f(x, y)\, dy.
\]
Then
\[
F_X(a) = \int_{-\infty}^{a} g(x)\, dx.
\]
Taking the derivative gives
\[
f_X(a) = \frac{d}{da} F_X(a) = \frac{d}{da} \int_{-\infty}^{a} g(x)\, dx = g(a) = \int_{-\infty}^{\infty} f(a, y)\, dy.
\]
Similarly, for any b ∈ R,
\[
F_Y(b) = P\{Y \le b\} = \int_{-\infty}^{b} \left( \int_{-\infty}^{\infty} f(x, y)\, dx \right) dy.
\]
Let
\[
g(y) = \int_{-\infty}^{\infty} f(x, y)\, dx.
\]
Then
\[
F_Y(b) = \int_{-\infty}^{b} g(y)\, dy.
\]
Taking the derivative gives
\[
f_Y(b) = \frac{d}{db} F_Y(b) = \frac{d}{db} \int_{-\infty}^{b} g(y)\, dy = g(b) = \int_{-\infty}^{\infty} f(x, b)\, dx.
\]
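Proposition 1 is easy to test numerically for a concrete density. The sketch below assumes the joint density f(x, y) = e^{-(x+y)} of Example 3, whose marginal is f_X(a) = e^{-a}, and approximates the integral over y by a midpoint Riemann sum (the cutoff 50 and the step count are arbitrary choices):

from math import exp

def f(x, y):
    # joint density e^{-(x+y)} on (0, infinity) x (0, infinity), 0 elsewhere
    return exp(-(x + y)) if x > 0 and y > 0 else 0.0

def f_X(a, upper=50.0, steps=200_000):
    # midpoint Riemann sum approximating the integral of f(a, y) dy over (0, upper)
    dy = upper / steps
    return sum(f(a, (k + 0.5) * dy) for k in range(steps)) * dy

for a in (0.5, 1.0, 2.0):
    print(a, f_X(a), exp(-a))    # the last two columns should agree closely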
We can also define joint distributions for n random variables, in exactly the same manner as for n = 2. The joint cumulative distribution function of the random variables X_1, X_2, \ldots, X_n is defined by
\[
F(a_1, a_2, \ldots, a_n) = P\{X_1 \le a_1,\ X_2 \le a_2,\ \ldots,\ X_n \le a_n\}.
\]
Further, the n random variables are said to be jointly continuous if there exists a function f(x_1, x_2, \ldots, x_n), called the joint probability density function, such that, for any set C in n-space, i.e., C ⊆ R^n (writing dx = dx_1 dx_2 \cdots dx_n),
\[
P\{(X_1, X_2, \ldots, X_n) \in C\} = \int \cdots \int_{(x_1, \ldots, x_n) \in C} f(x_1, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n.
\]
In particular, for any n sets of real numbers A_1, A_2, \ldots, A_n, taking C = A_1 \times A_2 \times \cdots \times A_n gives
\[
\begin{aligned}
P\{X_1 \in A_1,\ X_2 \in A_2,\ \ldots,\ X_n \in A_n\}
&= P\{(X_1, \ldots, X_n) \in C\} \\
&= \int_{A_n} \int_{A_{n-1}} \cdots \int_{A_1} f(x_1, \ldots, x_n)\, dx_1\, dx_2 \cdots dx_n.
\end{aligned}
\]
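As a concrete n = 3 illustration (an assumed example, not one from the notes): if X1, X2, X3 are independent Uniform(0, 1) variables, the joint density is 1 on the unit cube, and for C = {(x1, x2, x3) : x1 + x2 + x3 ≤ 1} the triple integral of the density over C is the volume of the simplex, namely 1/6. A Monte Carlo sketch:

import random

# Estimate P{(X1, X2, X3) in C} for independent Uniform(0,1) variables and
# C = {x1 + x2 + x3 <= 1}; the exact value (volume of the simplex) is 1/6.
n = 10**6
hits = sum(random.random() + random.random() + random.random() <= 1 for _ in range(n))
print(hits / n, 1 / 6)   # both approximately 0.1667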
The random variables X and Y are said to be independent if, for any two sets of real numbers A and B,
\[
P\{X \in A,\ Y \in B\} = P\{X \in A\}\, P\{Y \in B\}. \tag{6}
\]
In other words, X and Y are independent if, for all A and B, the events E_A = {X ∈ A} and F_B = {Y ∈ B} are independent.
It can be shown by using the three axioms of probability that Equa-
tion (6) will follow if and only if, for all a, b,
P {X ≤ a, Y ≤ b} = P {X ≤ a}P {Y ≤ b}
Hence, in terms of the joint distribution function F of X and Y, X and Y are independent if
\[
F(a, b) = F_X(a)\, F_Y(b) \qquad \text{for all } a, b.
\]
When X and Y are discrete random variables, independence is equivalent to
\[
p(x, y) = p_X(x)\, p_Y(y) \qquad \text{for all } x, y,
\]
since, when this factorization holds,
\[
P\{X \in A,\ Y \in B\} = \sum_{y \in B} \sum_{x \in A} p_X(x)\, p_Y(y) = \sum_{y \in B} p_Y(y) \sum_{x \in A} p_X(x) = P\{Y \in B\}\, P\{X \in A\},
\]
and, in the jointly continuous case, independence is equivalent to
\[
f(x, y) = f_X(x)\, f_Y(y) \qquad \text{for all } x, y.
\]
For example, if the joint density of X and Y is
\[
f(x, y) = \begin{cases} 24xy & x > 0,\ y > 0,\ x + y < 1 \\ 0 & \text{otherwise,} \end{cases}
\]
then the marginal density of Y is, for 0 < b < 1,
\[
\begin{aligned}
f_Y(b) &= \int_{-\infty}^{\infty} f(x, b)\, dx \\
&= \int_0^{1-b} 24xb\, dx \\
&= 24b \left[ \frac{x^2}{2} \right]_0^{1-b} \\
&= 12b(1 - b)^2.
\end{aligned}
\]
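A numerical sketch of this last computation (assuming, as read off from the integral above, the joint density f(x, y) = 24xy on the triangle x > 0, y > 0, x + y < 1): it reproduces f_Y(b) = 12b(1 − b)^2 and also shows that X and Y are not independent, since the product of the marginals is positive at points where f vanishes.

def f(x, y):
    # joint density 24xy on the triangle x > 0, y > 0, x + y < 1, and 0 elsewhere
    return 24 * x * y if x > 0 and y > 0 and x + y < 1 else 0.0

def f_Y(b, steps=100_000):
    # midpoint Riemann sum approximating the integral of f(x, b) dx over (0, 1)
    dx = 1.0 / steps
    return sum(f((k + 0.5) * dx, b) for k in range(steps)) * dx

b = 0.3
print(f_Y(b), 12 * b * (1 - b) ** 2)        # both approximately 1.764

# By symmetry f_X(a) = 12a(1 - a)^2.  At (a, b) = (0.7, 0.7) the product of the
# marginals is positive although f(0.7, 0.7) = 0, so f(x, y) != f_X(x) f_Y(y):
# X and Y are dependent.
a = b = 0.7
print(f(a, b), 12 * a * (1 - a) ** 2 * 12 * b * (1 - b) ** 2)   # 0.0 versus about 0.571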