Topic 5 Multivariate Distributions
Multivariate distributions
In Chapters 2 and 3, we focused on univariate random variables, which are single random
variables defined on a given sample space. In many random experiments, there is more
than one random variable of interest defined on the same sample space. In this chapter, therefore,
we extend the definition of distribution functions and density functions to two or more random
variables that occur jointly. By the end of the chapter, you should be able to:

• determine marginal densities and marginal cumulative distributions from their joint distributions.
Definition 1
If X and Y are discrete random variables, the function given by f(x, y) = P(X = x, Y = y) for
each pair (x, y) within the range of X and Y is called the joint probability function of X and Y.
The joint probability function of X and Y has the following properties.

(i) f(x, y) ≥ 0.

(ii) $\sum_{\text{all } x}\sum_{\text{all } y} f(x, y) = 1.$

(iii) If (x, y) is not one of the possible pairs of values of (X, Y), then f(x, y) = 0.

(iv) For any A ⊂ ℝ², $P[(X, Y) \in A] = \sum_{(x,y)\in A} f(x, y).$
Example 5.1
Let X be a random variable with values 1, 2, 3 and Y be another random variable with values
1, 2, 3, 4, and let their joint probability function be specified in the following table.

              Y
  X        1      2      3      4
  1       0.1     0     0.1     0
  2       0.3     0     0.1    0.2
  3        0     0.2     0      0

Then,

(i) P(X = 2, Y = 3) = 0.1.

(ii) P(X ≥ 2, Y ≤ 2) = P(X = 2, Y = 1) + P(X = 2, Y = 2) + P(X = 3, Y = 1) + P(X = 3, Y = 2) = 0.5.

(iii) $P(Y = 3) = \sum_{x=1}^{3} f(x, 3)$ = P(X = 1, Y = 3) + P(X = 2, Y = 3) + P(X = 3, Y = 3) = 0.2.
Example 5.2
Determine the value of k for which the function given by f(x, y) = kxy; x = 1, 2, 3; y = 1, 2, 3
can serve as a joint probability function for two random variables X and Y.

Solution

$$\sum_{x=1}^{3}\sum_{y=1}^{3} f(x, y) = k\sum_{x=1}^{3}\sum_{y=1}^{3} xy = k(1 + 2 + 3)(1 + 2 + 3) = 36k = 1,$$

so k = 1/36.
Example 5.3
Suppose X and Y have a discrete distribution with joint probability function given by

$$f(x, y) = \begin{cases} c|x + y| & x = -2, -1, 0, 1, 2;\; y = -2, -1, 0, 1, 2 \\ 0 & \text{elsewhere.} \end{cases}$$

Determine

(c) P(Y < 1/2).

(d) P(X = 1).
Solution
(a) The table below shows the joint probability function of X and Y.

              Y
  X       −2     −1      0      1      2
 −2       4c     3c     2c      c      0
 −1       3c     2c      c      0      c
  0       2c      c      0      c     2c
  1        c      0      c     2c     3c
  2        0      c     2c     3c     4c

Since $\sum_{x=-2}^{2}\sum_{y=-2}^{2} f(x, y) = 1$, we get 10c + 7c + 6c + 7c + 10c = 40c = 1, so c = 1/40.
(c)

$$P\left(Y < \tfrac{1}{2}\right) = P(Y = 0) + P(Y = -1) + P(Y = -2) = \sum_{x=-2}^{2} f(x, 0) + \sum_{x=-2}^{2} f(x, -1) + \sum_{x=-2}^{2} f(x, -2) = 6c + 7c + 10c = \frac{23}{40}.$$

(d) $P(X = 1) = \sum_{y=-2}^{2} f(1, y) = \frac{7}{40}.$
(e)

$$P(|X - Y| \le 1) = f(-2, -2) + f(-2, -1) + f(-1, -2) + f(-1, -1) + f(-1, 0) + f(0, -1) + f(0, 0) + f(0, 1) + f(1, 0) + f(1, 1) + f(1, 2) + f(2, 1) + f(2, 2) = \frac{28}{40}.$$
Two random variables X and Y are said to have a continuous distribution with a joint probability
density function f(x, y) if

(i) for every region A of the xy-plane, $P[(X, Y) \in A] = \iint_A f(x, y)\,dx\,dy$;

(ii) the probability that (X, Y) will lie on any specified straight line in the plane is 0.
Example 5.4
Given the joint probability density function

$$f(x, y) = \begin{cases} \frac{3}{5}x(y + x) & 0 < x < 1,\; 0 < y < 2 \\ 0 & \text{elsewhere,} \end{cases}$$

find P(0 < X < 1/2, 1 < Y < 2):

$$P\left(0 < X < \tfrac{1}{2},\; 1 < Y < 2\right) = \int_0^{1/2}\int_1^2 \frac{3}{5}x(y + x)\,dy\,dx = \int_0^{1/2} \frac{3}{5}\left[x^2 y + \frac{xy^2}{2}\right]_{y=1}^{2} dx = \int_0^{1/2} \frac{3}{5}\left(x^2 + \frac{3}{2}x\right) dx = \frac{11}{80}.$$
Example 5.5
Suppose that a point (X, Y) is chosen at random from the region S in the xy-plane containing
all points (x, y) such that x ≥ 0, y ≥ 0 and 4y + x ≤ 4.

(a) Determine the joint probability density function of X and Y.

(b) Suppose that S₀ is a subset of the region S having area α; determine P((X, Y) ∈ S₀).

Solution

The region over which the probability density function is positive is shown in Figure 5.1.

(a) To say that a point (X, Y) is chosen at random from the region S is equivalent to saying
that the density function is uniformly distributed over the shaded region in Figure 5.1, i.e.

$$f(x, y) = \begin{cases} k & x \ge 0,\; y \ge 0,\; 4y + x \le 4 \\ 0 & \text{otherwise.} \end{cases}$$
Since the total probability must be 1, k × Area(S) = 1, and the triangle S has area ½ · 4 · 1 = 2, so k = 1/2. Alternatively, by direct integration:
$$\int_{y=0}^{1}\int_{x=0}^{4(1-y)} k\,dx\,dy = 1 \;\Rightarrow\; k\int_{0}^{1} (4 - 4y)\,dy = 1 \;\Rightarrow\; k\left[4y - 2y^2\right]_0^1 = 1 \;\Rightarrow\; 2k = 1 \;\Rightarrow\; k = \frac{1}{2}.$$
(b)

$$P[(X, Y) \in S_0] = \iint_{S_0} f(x, y)\,dy\,dx = \iint_{S_0} \frac{1}{2}\,dy\,dx = \frac{1}{2}\iint_{S_0} dy\,dx = \frac{1}{2}\cdot\text{Area of } S_0 = \frac{\alpha}{2}.$$
Example 5.6
Consider two continuous random variables X and Y with joint probability density function

$$f(x, y) = \begin{cases} cy^2 & 0 \le x \le 2,\; 0 \le y \le 1 \\ 0 & \text{elsewhere.} \end{cases}$$

(Normalizing the density gives c = 3/2.)

(iii) To compute P(X + Y > 2), identify the region over which the density is positive and
X + Y > 2. This is the triangular region over which an element parallel to the x-axis is
erected; see Figure 5.2. Note that x runs from the line x + y = 2 to x = 2, hence the lower
limit is x = 2 − y, and y runs from y = 0 to y = 1.
In other words,

$$P(X + Y > 2) = \int_{y=0}^{1}\int_{x=2-y}^{2} \frac{3}{2}y^2\,dx\,dy = \int_{y=0}^{1} \frac{3}{2}y^2\,[x]_{x=2-y}^{2}\,dy = \int_{y=0}^{1} \frac{3}{2}y^3\,dy = \frac{3}{8}.$$
Figure 5.2: The triangular region 2 − y ≤ x ≤ 2, 0 ≤ y ≤ 1.
5.3 Joint Cumulative Distributions

Definition 2

The joint cumulative distribution function F(x, y) for two random variables X and Y is defined
as follows:

$$F(x, y) = P(X \le x, Y \le y) \quad \text{for } -\infty < x, y < \infty.$$
(a) For the discrete case, if f(x, y) is the joint probability function of two random variables X and Y, then

$$F(x, y) = P(X \le x, Y \le y) = \sum_{s \le x}\sum_{t \le y} f(s, t) \quad \text{for } -\infty < x, y < \infty.$$

(b) For the continuous case, if f(x, y) is the joint probability density function of X and Y, then

$$F(x, y) = P(X \le x, Y \le y) = \int_{-\infty}^{x}\int_{-\infty}^{y} f(u, v)\,dv\,du \quad \text{for } -\infty < x, y < \infty.$$
Theorem 1

If X and Y are jointly distributed random variables with a joint cumulative distribution function F, and if a, b, c, d ∈ ℝ with a ≤ b and c ≤ d, then

$$P(a < X \le b,\; c < Y \le d) = F(b, d) - F(a, d) - F(b, c) + F(a, c).$$

Proof

The event {a < X ≤ b, c < Y ≤ d} is obtained from {X ≤ b, Y ≤ d} by removing {X ≤ a, Y ≤ d} and {X ≤ b, Y ≤ c}; since the common piece {X ≤ a, Y ≤ c} is removed twice, its probability F(a, c) is added back once, giving the stated identity.
Example 5.7
Given

$$F(x, y) = \begin{cases} \frac{1}{16}xy(x + y) & 0 \le x, y \le 2 \\ 0 & \text{otherwise,} \end{cases}$$

determine P(1/2 < X < 1, 1/4 < Y < 1/2).

Solution

Using Theorem 1, we get

$$P\left(\tfrac{1}{2} < X < 1,\; \tfrac{1}{4} < Y < \tfrac{1}{2}\right) = F\left(1, \tfrac{1}{2}\right) + F\left(\tfrac{1}{2}, \tfrac{1}{4}\right) - F\left(1, \tfrac{1}{4}\right) - F\left(\tfrac{1}{2}, \tfrac{1}{2}\right)$$
$$= \frac{1}{32}\cdot\frac{3}{2} + \frac{1}{16}\cdot\frac{1}{8}\cdot\frac{3}{4} - \frac{1}{16}\cdot\frac{1}{4}\cdot\frac{5}{4} - \frac{1}{16}\cdot\frac{1}{4} = \frac{9}{512}.$$
Remark 1
(i) $\lim_{x\to-\infty} F(x, y) = \lim_{y\to-\infty} F(x, y) = 0.$

(ii) $\lim_{x\to\infty,\, y\to\infty} F(x, y) = 1.$

(iii) Given two random variables X and Y jointly distributed with a joint cumulative distribution function F, the joint density function can be derived from F as follows:

$$f(x, y) = \frac{\partial^2}{\partial x\,\partial y} F(x, y).$$
Example 5.8
Given

$$F(x, y) = \begin{cases} \frac{1}{16}xy(x + y) & 0 \le x, y \le 2 \\ 0 & \text{otherwise,} \end{cases}$$

determine the joint probability density function of X and Y.

Solution

For 0 < x < 2 and 0 < y < 2,

$$\frac{\partial^2 F(x, y)}{\partial x\,\partial y} = \frac{1}{8}(x + y),$$

while ∂²F(x, y)/∂x∂y = 0 for x < 0 or y < 0, and for x > 2 or y > 2. Thus,

$$f(x, y) = \begin{cases} \frac{1}{8}(x + y) & 0 < x, y < 2 \\ 0 & \text{otherwise.} \end{cases}$$
Example 5.9
If the joint probability density function of X and Y is given by

$$f(x, y) = \begin{cases} x + y & 0 < x, y < 1 \\ 0 & \text{elsewhere,} \end{cases}$$

determine the joint cumulative distribution function of X and Y.

• If 0 < x < 1 and 0 < y < 1, then $F(x, y) = \int_0^x\int_0^y (u + v)\,dv\,du = \frac{1}{2}xy(x + y)$.

• If x ≥ 1 and 0 < y < 1, F(x, y) = F(1, y) = ½y(y + 1); similarly, if 0 < x < 1 and y ≥ 1, F(x, y) = ½x(x + 1).

• If x ≥ 1 and y ≥ 1, F(x, y) = 1.

Therefore,

$$F(x, y) = \begin{cases} 0 & x < 0 \text{ or } y < 0, \\ \frac{1}{2}xy(x + y) & 0 < x < 1,\; 0 < y < 1, \\ \frac{1}{2}y(y + 1) & x \ge 1,\; 0 < y < 1, \\ \frac{1}{2}x(x + 1) & 0 < x < 1,\; y \ge 1, \\ 1 & x \ge 1,\; y \ge 1. \end{cases}$$
Example 5.10
Given F(x, y) = (1/126)xy(x² + y) for 0 ≤ x ≤ 3, 1 ≤ y ≤ 4, determine

(a) P(1 ≤ X ≤ 2, 0 ≤ Y ≤ 2).

(b) the marginal cumulative distribution function of Y.

(c) the joint probability density function of X and Y.

(d) P(Y > X).

Solution

(a) Since Y cannot be smaller than 1,

$$P(1 \le X \le 2,\; 0 \le Y \le 2) = P(1 \le X \le 2,\; 1 \le Y \le 2) = F(2, 2) + F(1, 1) - F(1, 2) - F(2, 1) = \frac{10}{126}.$$

(b) $F_Y(y) = \lim_{x\to 3} F(x, y) = \frac{3y(9 + y)}{126}$ for 1 ≤ y ≤ 4.

(c) $f(x, y) = \frac{\partial^2}{\partial x\,\partial y}F(x, y) = \frac{3x^2 + 2y}{126}$ for 0 ≤ x ≤ 3, 1 ≤ y ≤ 4.

(d) The part of the support with y ≤ x is shown shaded in Figure 5.3.
Figure 5.3: Region 1 ≤ x ≤ 3, 1 ≤ y ≤ 4.
$$P(Y > X) = 1 - P(Y \le X) = 1 - \int_{x=1}^{3}\int_{y=1}^{x} \frac{3x^2 + 2y}{126}\,dy\,dx = 1 - \frac{1}{126}\int_{x=1}^{3}\left[3x^2 y + y^2\right]_{y=1}^{x} dx$$
$$= 1 - \frac{1}{126}\int_{x=1}^{3} (3x^3 - 2x^2 - 1)\,dx = 1 - \frac{1}{126}\left[\frac{3}{4}x^4 - \frac{2}{3}x^3 - x\right]_{x=1}^{3} \approx 0.6772.$$
5.4 Marginal Probability Distributions

Definition 3

Let X and Y be jointly distributed discrete random variables with a joint probability function
f(x, y). We define the marginal probability function of X as

$$f_X(x) = P(X = x) = \sum_{y} P(X = x, Y = y) = \sum_{y} f(x, y).$$

In other words, for any given value x of X, the value of f_X(x) is found by summing f(x, y) over
all possible values of y. Similarly, $f_Y(y) = \sum_{x} f(x, y)$.
Definition 4

If X and Y are continuous jointly distributed random variables with a joint probability density
function f(x, y), then the marginal probability density function of X is defined as

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy, \quad -\infty < x < \infty.$$

Likewise,

$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx, \quad -\infty < y < \infty.$$

You can verify that the marginal functions f_X and f_Y are indeed probability density functions.
Example 5.11
The values of the joint probability distribution of X and Y are shown in the table below.

              Y
  X        0       1       2
  0       1/16    1/16    1/8
  1       1/16    1/16    1/16
  2       1/16    1/8     3/8

The marginal probability function f_X of X can be determined by summing the values in
each row of this table. Likewise, the marginal probability function f_Y of Y can be determined
by summing the values in each column of this table. That is,

(i) $f_X(0) = \sum_{y=0}^{2} f(0, y) = \frac{1}{16} + \frac{1}{16} + \frac{1}{8} = \frac{4}{16}$, $f_X(1) = \frac{3}{16}$, $f_X(2) = \frac{9}{16}$.
Example 5.12
X and Y are continuous random variables jointly distributed with joint probability density
function

$$f(x, y) = \begin{cases} x + y & 0 < x, y < 1 \\ 0 & \text{otherwise.} \end{cases}$$

Then

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy = \int_{y=0}^{1} (x + y)\,dy = x + \tfrac{1}{2} \quad \text{for } 0 < x < 1.$$

Similarly,

$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_{x=0}^{1} (x + y)\,dx = \tfrac{1}{2} + y \quad \text{for } 0 < y < 1.$$
Example 5.13
The joint probability density function of X and Y is

$$f(x, y) = \begin{cases} cx^2 y & x^2 \le y \le 1 \\ 0 & \text{elsewhere,} \end{cases}$$

where c = 21/4.

• Since X cannot take on any value outside the interval [−1, 1], it follows that f_X(x) = 0 for x > 1 or x < −1.

• For −1 ≤ x ≤ 1,

$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy = \int_{x^2}^{1} \frac{21}{4}x^2 y\,dy = \frac{21}{8}x^2(1 - x^4).$$

• It can be shown that f_Y(y) = 0 for y < 0 or y > 1, and for 0 ≤ y ≤ 1,

$$f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx = \int_{-\sqrt{y}}^{\sqrt{y}} \frac{21}{4}x^2 y\,dx = \frac{7}{2}y^{5/2}.$$

Note: As an exercise, you are required to sketch the marginal probability density function
of X and the marginal probability density function of Y.
Theorem 2
Let F(x, y) be the joint distribution function of X and Y. Then

$$F_X(x) = \lim_{y\to\infty} F(x, y)$$

and

$$F_Y(y) = \lim_{x\to\infty} F(x, y).$$
Example 5.14
Given

$$F(x, y) = \begin{cases} \frac{1}{16}xy(x + y) & 0 \le x, y \le 2 \\ 0 & \text{otherwise,} \end{cases}$$

determine F_X(x) and F_Y(y).

Solution

$F_X(x) = \lim_{y\to 2} F(x, y) = \frac{1}{8}x(x + 2)$ for 0 ≤ x ≤ 2. Similarly, $F_Y(y) = \lim_{x\to 2} F(x, y) = \frac{1}{8}y(y + 2)$ for 0 ≤ y ≤ 2.
5.5 Conditional Distributions

Definition 5

Suppose that X and Y are two random variables having a discrete joint distribution for which
the joint probability function is f(x, y). Let f_X and f_Y be the marginal probability functions
of X and Y respectively. The function given by

$$f_{X|Y}(x|y) = \frac{f(x, y)}{f_Y(y)}, \quad f_Y(y) > 0,$$

for each value x in the range of X is called the conditional probability function of X given Y = y.
For each fixed value y in the range of Y, the function f_{X|Y}(x|y) is indeed a probability
function over all possible values of x, since

$$\sum_{x} f_{X|Y}(x|y) = \frac{\sum_{x} f(x, y)}{f_Y(y)} = \frac{f_Y(y)}{f_Y(y)} = 1.$$

Definition 6

The conditional cumulative distribution function of X given Y = y is defined as follows:

$$F_{X|Y}(x|y) = P(X \le x \mid Y = y) = \frac{P(X \le x, Y = y)}{P(Y = y)}, \quad \text{provided that } P(Y = y) > 0.$$

Similarly,

$$F_{Y|X}(y|x) = P(Y \le y \mid X = x) = \frac{P(X = x, Y \le y)}{P(X = x)}, \quad \text{provided that } P(X = x) > 0.$$
Example 5.15
The values of the joint probability function of X and Y are shown in the table below.

              Y
  X        1      2      3      4
  1       0.1     0     0.1     0
  2       0.3     0     0.1    0.2
  3        0     0.2     0      0

(i) $f_{Y|X}(y|2) = \frac{f(2, y)}{f_X(2)} = \frac{f(2, y)}{6/10} = \frac{10}{6}f(2, y).$

(ii) $f_{Y|X}(3|2) = \frac{f(2, 3)}{f_X(2)} = \frac{0.1}{0.6} = \frac{1}{6}.$

(iii) $f_{X|Y}(2|4) = \frac{f(2, 4)}{f_Y(4)} = \frac{0.2}{0.2} = 1.$

(iv) $F_{Y|X}(2|3) = \frac{P(Y \le 2, X = 3)}{P(X = 3)} = \frac{0.2}{0.2} = 1.$
Example 5.16
The joint frequency function of X and Y is given in the table below.

              Y
  X        1       2       3       4
  1       0.10    0.05    0.02    0.02
  2       0.05    0.20    0.05    0.02
  3       0.02    0.05    0.20    0.04
  4       0.02    0.02    0.04    0.10

Find (a) the marginal probability function of X, (b) the conditional probability function of X given Y = 1, and (c) P(X − Y is odd).

Solution

(a) $f_X(x) = \sum_{\text{all } y} f(x, y)$; that is, sum over rows. So
f_X(1) = 0.19, f_X(2) = 0.32, f_X(3) = 0.31 and f_X(4) = 0.18.

(b)

$$f_{X|Y}(x|y = 1) = \frac{f(x, 1)}{f_Y(1)} = \frac{f(x, 1)}{\sum_{\text{all } x} f(x, 1)} = \frac{f(x, 1)}{0.19}.$$

The table below shows the conditional probability function of X given Y = 1.

  x                      1       2       3       4
  f_{X|Y}(x|y = 1)     10/19    5/19    2/19    2/19

(c) Let Z = X − Y, so that $P(Z = z) = \sum_{(x,y):\,x-y=z} f(x, y)$. The probability function of Z is shown in the table below.

  z            −3      −2      −1      0       1       2       3
  P(Z = z)    0.02    0.04    0.14    0.60    0.14    0.04    0.02

Therefore,

P(X − Y is odd) = P(Z = −3) + P(Z = −1) + P(Z = 1) + P(Z = 3) = 0.32.
Definition 7
Suppose that X and Y are two random variables having a joint continuous distribution for which
the joint probability density function is f(x, y). Let f_X and f_Y be the marginal probability
density functions of X and Y respectively. The function given by

$$f_{X|Y}(x|y) = \frac{f(x, y)}{f_Y(y)}, \quad f_Y(y) > 0,$$

for −∞ < x < ∞ is called the conditional density of X given Y = y. Similarly,

$$f_{Y|X}(y|x) = \frac{f(x, y)}{f_X(x)}, \quad f_X(x) > 0,$$

for −∞ < y < ∞ is the conditional density of Y given X = x. Indeed, you can verify that for each fixed value x, f_{Y|X}(y|x) is a probability density function over all possible values of Y. Similarly, f_{X|Y}(x|y) is a probability density function over all possible values of X.
Definition 8
The conditional cumulative distribution function for the continuous random variable X given
Y = y can be derived from the conditional densities as follows:

$$F_{X|Y}(x|y) = \int_{-\infty}^{x} f_{X|Y}(u|y)\,du, \quad f_Y(y) > 0.$$

Similarly,

$$F_{Y|X}(y|x) = \int_{-\infty}^{y} f_{Y|X}(v|x)\,dv, \quad f_X(x) > 0,$$

is the conditional cumulative distribution function for the continuous random variable Y given
X = x.
Example 5.17
Suppose that a student’s score X on a mathematics exam is a number between 0 and 1, and
that his score Y on a music exam is also a number between 0 and 1. Suppose further that in a
population of all university students at Makerere University, the scores X and Y are distributed
according to the following joint probability density function.
$$f(x, y) = \begin{cases} \frac{2}{5}(2x + 3y) & 0 \le x \le 1 \text{ and } 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$
(a) What proportion of university students obtain a score greater than 0.8 on the mathematics
exam?
(b) If a student’s score on the music test is 0.3, what is the probability that his score on the
mathematics test will be greater than 0.8?
Solution
(a)

$$P(X > 0.8) = \int_{0.8}^{1}\int_{0}^{1} \frac{2}{5}(2x + 3y)\,dy\,dx = \int_{0.8}^{1} \frac{2}{5}\left[2xy + \frac{3}{2}y^2\right]_0^1 dx = \int_{0.8}^{1} \frac{2}{5}\left(2x + \frac{3}{2}\right) dx = 0.264.$$

(b)

$$f_Y(y) = \int_{0}^{1} \frac{2}{5}(2x + 3y)\,dx = \frac{2}{5}(1 + 3y) \quad \text{for } 0 \le y \le 1.$$

$$f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{2x + 3y}{1 + 3y} \quad \text{for } 0 \le x \le 1.$$

$$P(X > 0.8 \mid Y = 0.3) = \int_{0.8}^{1} f_{X|Y}(x|y = 0.3)\,dx = \int_{0.8}^{1} \frac{2x + 0.9}{1.9}\,dx = \frac{1}{1.9}\left[x^2 + 0.9x\right]_{0.8}^{1} \approx 0.284.$$
Example 5.18
Suppose that the joint probability density function of two random variables X and Y is as
follows:

$$f(x, y) = \begin{cases} \frac{3}{16}(4 - 2x - y) & x > 0,\; y > 0 \text{ and } 2x + y < 4, \\ 0 & \text{otherwise.} \end{cases}$$

Determine

(i) the conditional probability density function of Y for any given value of X, and

(ii) P(Y ≥ 2 | X = 0.5).

Solution

(i) For 0 < x < 2,

$$f_X(x) = \int_{0}^{4-2x} \frac{3}{16}(4 - 2x - y)\,dy = \frac{3}{8}(2 - x)^2,$$

so

$$f_{Y|X}(y|x) = \frac{f(x, y)}{f_X(x)} = \frac{4 - 2x - y}{2(2 - x)^2} \quad \text{for } 0 < y < 4 - 2x.$$

(ii) If X = 0.5, then

$$f_{Y|X}(y|x = 0.5) = \frac{2}{9}(3 - y) \quad \text{for } 0 < y < 3.$$

Therefore,

$$P(Y \ge 2 \mid X = 0.5) = \int_{y=2}^{3} \frac{2}{9}(3 - y)\,dy = \frac{1}{9}.$$
Example 5.19
Suppose that the joint probability density function of X and Y is specified as follows:

$$f(x, y) = \begin{cases} cx^2 y & x^2 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

Determine

(i) f_{Y|X}(y|x).

(ii) P(Y ≥ 1/4 | X = 1/2).

(iii) P(Y ≥ 3/4 | X = 1/2).

Solution

(i) We have $f_{Y|X}(y|x) = \frac{f(x, y)}{f_X(x)}$, where, with c = 21/4 (as in Example 5.13),

$$f_X(x) = \int_{x^2}^{1} cx^2 y\,dy = \frac{21}{8}x^2(1 - x^4) \quad \text{for } -1 < x < 1.$$

Therefore,

$$f_{Y|X}(y|x) = \frac{\frac{21}{4}x^2 y}{\frac{21}{8}x^2(1 - x^4)} = \frac{2y}{1 - x^4} \quad \text{for } x^2 \le y \le 1.$$

(ii) Given X = 1/2, the conditional density is 2y/(1 − 1/16) = 32y/15 on 1/4 ≤ y ≤ 1, so

$$P\left(Y \ge \tfrac{1}{4} \,\middle|\, X = \tfrac{1}{2}\right) = \int_{1/4}^{1} \frac{32y}{15}\,dy = 1,$$

since the whole conditional support lies in [1/4, 1].
(iii)

$$P\left(Y \ge \tfrac{3}{4} \,\middle|\, X = \tfrac{1}{2}\right) = \int_{3/4}^{1} \frac{32y}{15}\,dy = \frac{7}{15}.$$
5.6 Conditional Expectations

Definition 9

Let X and Y be two random variables with a joint continuous density function f(x, y), and
let f_X and f_Y be the marginal probability density functions of X and Y respectively. For any
x with f_X(x) > 0, the conditional expectation of Y given that X = x is defined as

$$E(Y|x) = \int_{-\infty}^{\infty} y f_{Y|X}(y|x)\,dy.$$

Similarly, for any y with f_Y(y) > 0, the conditional expectation of X given Y = y is defined as

$$E(X|y) = \int_{-\infty}^{\infty} x f_{X|Y}(x|y)\,dx,$$

where f_{Y|X}(y|x) is the conditional probability density function of Y given that X = x and
f_{X|Y}(x|y) is the conditional probability density function of X given that Y = y.
If X and Y are jointly distributed discrete random variables, then

$$E(Y|x) = \sum_{y} y f_{Y|X}(y|x) \quad \text{and} \quad E(X|y) = \sum_{x} x f_{X|Y}(x|y).$$

In general, if r(X, Y) is any function of X and Y, then the conditional expectation of r(X, Y) given X = x is

$$E(r(X, Y)|x) = \int_{-\infty}^{\infty} r(x, y) f_{Y|X}(y|x)\,dy.$$
Example 5.20
X and Y are two random variables, jointly distributed with probability density function given
by

$$f(x, y) = x + y, \quad 0 < x, y < 1.$$

From Example 5.12, f_X(x) = x + 1/2 for 0 < x < 1, so

$$f_{Y|X}(y|x) = \frac{x + y}{x + \frac{1}{2}} \quad \text{for } 0 < y < 1,$$

and

$$E(Y \mid X = x) = \int_{0}^{1} y f_{Y|X}(y|x)\,dy = \int_{0}^{1} \frac{y(x + y)}{x + \frac{1}{2}}\,dy = \frac{3x + 2}{6x + 3}.$$

Note that the conditional expectation of Y given X, denoted by E(Y|X), is itself a random
variable with its own probability distribution.
Example 5.21
The joint frequency function of X and Y is given in the table below.

              Y
  X        1       2       3       4
  1       0.10    0.05    0.02    0.02
  2       0.05    0.20    0.05    0.02
  3       0.02    0.05    0.20    0.04
  4       0.02    0.02    0.04    0.10

Find the conditional probability function of X given Y = 1 and use it to determine the
conditional mean of X given Y = 1.

Solution

The conditional probability function of X given Y = 1 was given in Example 5.16.

$$E[X \mid Y = 1] = \sum_{x=1}^{4} x f_{X|Y}(x|y = 1) = 1\cdot\frac{10}{19} + 2\cdot\frac{5}{19} + 3\cdot\frac{2}{19} + 4\cdot\frac{2}{19} = \frac{34}{19}.$$
Theorem 3
For any two random variables X and Y, E[E(Y |X)] = E(Y ).
Proof
Without loss of generality, let us assume that both X and Y are continuous jointly distributed
random variables. In addition, we note that E(Y|X) is a function of X only. So if we let
h(X) = E(Y|X), then

$$E[E(Y|X)] = E[h(X)] = \int_{-\infty}^{\infty} h(x) f_X(x)\,dx.$$

But

$$h(x) = E(Y|x) = \int_{-\infty}^{\infty} y f_{Y|X}(y|x)\,dy = \int_{-\infty}^{\infty} y\,\frac{f(x, y)}{f_X(x)}\,dy.$$

Therefore,

$$E[E(Y|X)] = \int_{-\infty}^{\infty}\left(\int_{-\infty}^{\infty} y\,\frac{f(x, y)}{f_X(x)}\,dy\right) f_X(x)\,dx = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y f(x, y)\,dy\,dx = E(Y).$$
Example 5.22
Suppose that E(Y|X) = aX + b where a, b ∈ ℝ. Determine the value of E(XY) in terms of
E(X) and E(X²).

Solution

We know that E[E(XY|X)] = E(XY). Given X, the factor X inside the conditional expectation
is no longer random, so it can be taken outside:

$$E(XY|X) = X\,E(Y|X) = X(aX + b) = aX^2 + bX,$$

and

$$E(XY) = E[E(XY|X)] = E(aX^2 + bX) = aE(X^2) + bE(X).$$
Let X and Y be continuous random variables jointly distributed with a joint probability density
function f(x, y). For any given X = x, we define the variance of the conditional distribution of
Y given X = x as follows:

$$Var(Y|x) = E\{[Y - E(Y|x)]^2 \mid x\} = E(Y^2|x) - [E(Y|x)]^2.$$
Theorem 4
For any two random variables X and Y, V ar(Y ) = E[V ar(Y |X)] + V ar[E(Y |X)].
Proof
$$E[Var(Y|X)] = E\left[E(Y^2|X) - \{E(Y|X)\}^2\right] = E(Y^2) - E\{[E(Y|X)]^2\}. \tag{5.1}$$

$$Var[E(Y|X)] = E\{[E(Y|X)]^2\} - \{E[E(Y|X)]\}^2 = E\{[E(Y|X)]^2\} - [E(Y)]^2. \tag{5.2}$$

Adding (5.1) and (5.2) gives

$$E[Var(Y|X)] + Var[E(Y|X)] = E(Y^2) - [E(Y)]^2 = Var(Y).$$
Example 5.23
Suppose that X and Y are continuous random variables for which the joint probability density
function is

$$f(x, y) = \begin{cases} \frac{1}{3}(x + y) & 0 \le x \le 1 \text{ and } 0 \le y \le 2 \\ 0 & \text{otherwise.} \end{cases}$$

Then

$$f_X(x) = \int_{y=0}^{2} \frac{1}{3}(x + y)\,dy = \frac{2}{3}(x + 1) \quad \text{for } 0 \le x \le 1,$$

$$E[Y \mid X = x] = \int_{y=0}^{2} y\,\frac{f(x, y)}{f_X(x)}\,dy = \int_{y=0}^{2} y\,\frac{x + y}{2(x + 1)}\,dy = \frac{6x + 8}{6(x + 1)}.$$

Therefore,

$$E[Y|X] = \frac{6X + 8}{6(X + 1)} \quad \text{for } 0 < X < 1.$$

Similarly,

$$E[Y^2 \mid X = x] = \int_{y=0}^{2} y^2\,\frac{f(x, y)}{f_X(x)}\,dy = \int_{y=0}^{2} y^2\,\frac{x + y}{2(x + 1)}\,dy = \frac{8x + 12}{6(x + 1)},$$

so

$$Var[Y \mid X = x] = \frac{8x + 12}{6(x + 1)} - \left(\frac{6x + 8}{6(x + 1)}\right)^2 = \frac{3x^2 + 6x + 2}{9(x + 1)^2}.$$

Therefore,

$$Var[Y|X] = \frac{3X^2 + 6X + 2}{9(X + 1)^2} \quad \text{for } 0 < X < 1.$$
5.7 Independent Random Variables

Definition 10

Two random variables X and Y are said to be independent if for any two sets A and B of
real numbers we have

$$P(X \in A, Y \in B) = P(X \in A)\,P(Y \in B).$$
Example 5.24
Suppose that X and Y have a continuous joint distribution for which the joint probability
density function is defined as follows:

$$f(x, y) = \begin{cases} \frac{3}{2}y^2 & 0 \le x \le 2 \text{ and } 0 \le y \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

Show that (i) the events {X < 1} and {Y > 1/2} are independent, and (ii) X and Y are independent.

Solution

(i)

$$P\left(X < 1,\; Y > \tfrac{1}{2}\right) = \int_{0}^{1}\int_{1/2}^{1} \frac{3}{2}y^2\,dy\,dx = \frac{7}{16}.$$

$$P(X < 1) = \int_{x=0}^{1}\int_{y=0}^{1} \frac{3}{2}y^2\,dy\,dx = \frac{1}{2}.$$

$$P\left(Y > \tfrac{1}{2}\right) = \int_{x=0}^{2}\int_{y=1/2}^{1} \frac{3}{2}y^2\,dy\,dx = \frac{7}{8}.$$

Since

$$P\left(X < 1,\; Y > \tfrac{1}{2}\right) = P(X < 1)\,P\left(Y > \tfrac{1}{2}\right),$$

the events {X < 1} and {Y > 1/2} are independent.

(ii) Let A = {ω | a < X(ω) < b} and B = {ω | c < Y(ω) < d}, with 0 ≤ a < b ≤ 2 and 0 ≤ c < d ≤ 1. Then

$$P(X \in A, Y \in B) = \int_{x=a}^{b}\int_{y=c}^{d} \frac{3}{2}y^2\,dy\,dx = \frac{1}{2}(d^3 - c^3)(b - a),$$

$$P(X \in A) = \int_{x=a}^{b}\int_{y=0}^{1} \frac{3}{2}y^2\,dy\,dx = \frac{b - a}{2},$$

$$P(Y \in B) = \int_{x=0}^{2}\int_{y=c}^{d} \frac{3}{2}y^2\,dy\,dx = d^3 - c^3.$$

Hence

$$P(X \in A)\,P(Y \in B) = \frac{(b - a)}{2}(d^3 - c^3) = P(X \in A, Y \in B).$$

Since the sets A and B were chosen arbitrarily, it follows that X and Y are independent.
Theorem 5
Let F(x, y) be the joint cumulative distribution function of X and Y, and let F_X and F_Y be
the marginal cumulative distribution functions of X and Y respectively. Then X and Y are
independent if and only if, for any real numbers x and y, F(x, y) = F_X(x)F_Y(y).

Proof

Suppose X and Y are independent. Let A = (−∞, x] and B = (−∞, y]. Thus,

$$F(x, y) = P(X \le x, Y \le y) = P(X \in A, Y \in B) = P(X \in A)\,P(Y \in B) = P(X \le x)\,P(Y \le y) = F_X(x)F_Y(y).$$
Example 5.25
Given a joint cumulative distribution function

$$F(x, y) = (1 - e^{-x})(1 - e^{-4y}), \quad x \ge 0,\; y \ge 0,$$

of X and Y, determine (a) whether X and Y are independent, (b) P(1 < X < 10, Y > 1/2), and (c) the joint probability density function of X and Y.

Solution

(a) Here F_X(x) = lim_{y→∞} F(x, y) = 1 − e^{−x} and F_Y(y) = 1 − e^{−4y}. Since F_{X,Y}(x, y) = F_X(x)F_Y(y), it follows that X and Y are independent.

(b)

$$P\left(1 < X < 10,\; Y > \tfrac{1}{2}\right) = \lim_{y\to\infty} F(10, y) + F\left(1, \tfrac{1}{2}\right) - \lim_{y\to\infty} F(1, y) - F\left(10, \tfrac{1}{2}\right)$$
$$= 1 - e^{-10} + (1 - e^{-1})(1 - e^{-2}) - (1 - e^{-1}) - (1 - e^{-10})(1 - e^{-2}) = e^{-3}(1 - e^{-9}) \approx 0.0498.$$

(c)

$$f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\,\partial y} = \frac{dF_X(x)}{dx}\cdot\frac{dF_Y(y)}{dy} = 4e^{-(x+4y)}, \quad x > 0,\; y > 0.$$
Corollary 1

Let f(x, y) be the joint probability density function of X and Y. If the marginal densities of
X and Y are f_X and f_Y respectively, then X and Y are independent if and only if f(x, y) =
f_X(x)f_Y(y) for all real numbers x and y.
Example 5.26
Suppose that X and Y have a continuous joint distribution for which the joint probability
density function is defined as follows:

$$f(x, y) = \begin{cases} \frac{3}{2}y^2 & 0 \le x \le 2 \text{ and } 0 \le y \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

Here f_X(x) = 1/2 for 0 ≤ x ≤ 2 and f_Y(y) = 3y² for 0 ≤ y ≤ 1, so f(x, y) = f_X(x)f_Y(y) and, by Corollary 1, X and Y are independent.
Example 5.27
Suppose that the joint probability density function of X and Y is as follows:

$$f(x, y) = \begin{cases} 2xe^{-y} & 0 \le x \le 1,\; 0 < y < \infty \\ 0 & \text{otherwise.} \end{cases}$$

Since

$$f(x, y) = 2xe^{-y} = (2x)(e^{-y}) = f_X(x)f_Y(y),$$

X and Y are independent.
Definition 11
Let X and Y be two discrete random variables with values 1, 2, …, r for X and 1, 2, …, s for
Y, and let f(x, y) be their joint probability function. In addition, let f_X and f_Y be the
marginal probability functions of X and Y respectively. Then X and Y are said to be independent if and
only if f(x, y) = f_X(x)f_Y(y) for all pairs (x, y).
Example 5.28
Suppose that X and Y have a discrete joint distribution for which the joint probability function
is defined as follows:

$$f(x, y) = \begin{cases} \frac{1}{30}(x + y) & x = 0, 1, 2 \text{ and } y = 0, 1, 2, 3, \\ 0 & \text{otherwise.} \end{cases}$$

(a) Determine the marginal probability functions of X and Y. (b) Are X and Y independent?

Solution

(a)

$$f_X(x) = \sum_{y=0}^{3} f(x, y) = \frac{1}{30}[(x + 0) + (x + 1) + (x + 2) + (x + 3)] = \frac{2}{30}(2x + 3) \quad \text{for } x = 0, 1, 2.$$

$$f_Y(y) = \sum_{x=0}^{2} f(x, y) = \frac{1}{30}[(0 + y) + (1 + y) + (2 + y)] = \frac{1}{10}(1 + y) \quad \text{for } y = 0, 1, 2, 3.$$
(b) To show that X and Y are independent, we must show that f(x, y) = f_X(x)f_Y(y) for all
pairs of x and y in the range of X and Y. If we can find a single pair where the relation
fails to hold, then we conclude immediately that X and Y are not independent. Note that

$$f_X(0)f_Y(0) = \frac{6}{30}\cdot\frac{1}{10} \ne f(0, 0) = 0.$$

Therefore, X and Y are not independent. Actually, you can also observe that

$$f_X(x)f_Y(y) = \frac{2}{300}(2x + 3)(1 + y) = \frac{4}{300}(x + y) + \frac{2y}{300}(2x + 1) + \frac{1}{50} = \frac{2}{5}f(x, y) + \frac{2y}{300}(2x + 1) + \frac{1}{50},$$

implying that X and Y are not independent.
5.8 Expectations of Functions of Random Variables

Definition 12

Let X and Y be continuous random variables jointly distributed with joint probability density function
f(x, y), and let g(X, Y) be a function of X and Y. Then

$$E[g(X, Y)] = \int\!\!\int g(x, y) f(x, y)\,dy\,dx. \tag{5.3}$$

In general, if X₁, …, Xₙ are n random variables jointly distributed with joint density f(x₁, …, xₙ) such that the expectation exists, then

$$E[g(X_1, \ldots, X_n)] = \int \cdots \int g(x_1, \ldots, x_n) f(x_1, \ldots, x_n)\,dx_1 \cdots dx_n.$$
Example 5.29
Suppose that an automobile dealer pays an amount X (in thousands of dollars) for a used car
and then sells it for an amount Y. Suppose that the random variables X and Y have the following
joint probability density function:

$$f(x, y) = \begin{cases} \frac{1}{36}x & 0 < x < y < 6 \\ 0 & \text{otherwise.} \end{cases}$$

Determine the dealer's expected gain, E(Y − X).
5.8.1 Covariance

The covariance of two jointly distributed random variables X and Y, with means µ_X = E(X) and µ_Y = E(Y), is defined as Cov(X, Y) = E[(X − µ_X)(Y − µ_Y)].

Theorem 5

For any two random variables X and Y which are jointly distributed we have

$$Cov(X, Y) = E(XY) - E(X)E(Y).$$

Proof

$$Cov(X, Y) = E[XY - \mu_X Y - \mu_Y X + \mu_X \mu_Y] = E(XY) - \mu_X E(Y) - \mu_Y E(X) + \mu_X \mu_Y = E(XY) - E(X)E(Y).$$
5.8.2 Correlation

The correlation coefficient of X and Y, with standard deviations σ_X and σ_Y, is defined as

$$\rho_{X,Y} = \frac{Cov(X, Y)}{\sigma_X \sigma_Y}.$$

It can be shown that −1 ≤ ρ_{X,Y} ≤ 1. The random variables X and Y are said to be positively
correlated if ρ_{X,Y} > 0 and negatively correlated if ρ_{X,Y} < 0. If ρ_{X,Y} = 0, we
say that X and Y are uncorrelated.
Example 5.30
Let X and Y have the joint density

$$f(x, y) = \frac{6}{7}(x + y)^2, \quad 0 \le x \le 1,\; 0 \le y \le 1.$$

Find the covariance and correlation of X and Y.

Solution

$$E(X) = \int_0^1\!\!\int_0^1 \frac{6}{7}x(x + y)^2\,dy\,dx = \frac{6}{7}\int_0^1\!\!\int_0^1 x^3\,dy\,dx + \frac{12}{7}\int_0^1\!\!\int_0^1 x^2 y\,dy\,dx + \frac{6}{7}\int_0^1\!\!\int_0^1 xy^2\,dy\,dx = \frac{9}{14},$$

and by symmetry E(Y) = 9/14.

$$E(XY) = \int_0^1\!\!\int_0^1 \frac{6}{7}xy(x + y)^2\,dy\,dx = \frac{6}{7}\int_0^1\!\!\int_0^1 (x^3 y + 2x^2 y^2 + xy^3)\,dy\,dx = \frac{119}{294}.$$

$$E(X^2) = \int_0^1\!\!\int_0^1 \frac{6}{7}x^2(x + y)^2\,dy\,dx = \frac{6}{7}\int_0^1\!\!\int_0^1 (x^4 + 2x^3 y + x^2 y^2)\,dy\,dx = \frac{4949}{10290},$$

and likewise E(Y²) = 4949/10290. Hence

$$Cov(X, Y) = E(XY) - E(X)E(Y) = \frac{119}{294} - \frac{9}{14}\cdot\frac{9}{14} \approx -0.0085,$$

$$Var(X) = E(X^2) - [E(X)]^2 = \frac{4949}{10290} - \left(\frac{9}{14}\right)^2 \approx 0.068, \quad \sigma_X \approx 0.26,$$

and Var(Y) = Var(X), σ_Y ≈ 0.26. Therefore,

$$\rho_{X,Y} = \frac{Cov(X, Y)}{\sigma_X \sigma_Y} \approx -0.13.$$
Example 5.31
Suppose that X and Y have a continuous joint distribution for which the joint probability
density function is as follows:

$$f(x, y) = \begin{cases} \frac{1}{3}(x + y) & 0 \le x \le 1,\; 0 \le y \le 2, \\ 0 & \text{otherwise.} \end{cases}$$

Determine the product moment E(X^r Y^s) and hence the value of Var(2X − 3Y + 8).

Solution

$$E(X^r Y^s) = \frac{1}{3}\int_0^2\!\!\int_0^1 x^r y^s (x + y)\,dx\,dy = \frac{1}{3}\int_0^2\!\!\int_0^1 (x^{r+1}y^s + x^r y^{s+1})\,dx\,dy$$
$$= \frac{1}{3}\int_0^2 \left[\frac{x^{r+2}}{r+2}y^s + \frac{x^{r+1}}{r+1}y^{s+1}\right]_0^1 dy = \frac{1}{3}\int_0^2 \left(\frac{y^s}{r+2} + \frac{y^{s+1}}{r+1}\right) dy$$
$$= \frac{1}{3}\left[\frac{y^{s+1}}{(r+2)(s+1)} + \frac{y^{s+2}}{(r+1)(s+2)}\right]_0^2 = \frac{1}{3}\left[\frac{2^{s+1}}{(r+2)(s+1)} + \frac{2^{s+2}}{(r+1)(s+2)}\right].$$

Substituting the appropriate values of r and s gives E(X) = 5/9, E(X²) = 7/18, E(Y) = 11/9, E(Y²) = 16/9 and E(XY) = 2/3, so that Var(X) = 13/162, Var(Y) = 23/81 and Cov(X, Y) = 2/3 − (5/9)(11/9) = −1/81. Therefore,

$$Var(2X - 3Y + 8) = 4\,Var(X) + 9\,Var(Y) - 12\,Cov(X, Y) = \frac{26}{81} + \frac{207}{81} + \frac{12}{81} = \frac{245}{81} \approx 3.02.$$
Example 5.32
Let X and Y have the joint density

$$f(x, y) = e^{-y}, \quad 0 \le x \le y < \infty.$$
Theorem 6

Let X₁ and X₂ be independent random variables such that E(X_i), i = 1, 2, is the expectation of
X_i. Then E[X₁X₂] = E(X₁)E(X₂).

Proof

$$E(X_1 X_2) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2 f(x_1, x_2)\,dx_1\,dx_2 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2 f_{X_1}(x_1) f_{X_2}(x_2)\,dx_1\,dx_2 \quad \text{(by independence)}$$
$$= \int_{-\infty}^{\infty} x_1 f_{X_1}(x_1)\,dx_1 \int_{-\infty}^{\infty} x_2 f_{X_2}(x_2)\,dx_2 = E(X_1)E(X_2).$$

More generally, if X₁, …, Xₙ are independent, then $E\left(\prod_{i=1}^{n} X_i\right) = \prod_{i=1}^{n} E(X_i)$, where

$$\prod_{i=1}^{n} X_i = X_1 \times X_2 \times \cdots \times X_n.$$
Theorem 7

If X and Y are independent random variables with 0 < σ²_X < ∞ and 0 < σ²_Y < ∞, then
Cov(X, Y) = ρ_{X,Y} = 0.

Proof

$$Cov(X, Y) = E(XY) - E(X)E(Y) = E(X)E(Y) - E(X)E(Y) = 0.$$
Example 5.33
The joint probability density function of X and Y is given by f(x, y) = e^{−(x+y)}, x > 0, y > 0.
Then f_X(x) = e^{−x} for x > 0 and f_Y(y) = e^{−y} for y > 0. It is easy to check that X and Y are
independent, and

$$E(XY) = \int_{y=0}^{\infty}\int_{x=0}^{\infty} xy\,e^{-(x+y)}\,dx\,dy = \int_{x=0}^{\infty} xe^{-x}\,dx \int_{y=0}^{\infty} ye^{-y}\,dy = E(X)E(Y).$$
Example 5.34
A random variable X is uniformly distributed over the set {−1, 0, 1}. Let Y = X². The table
below shows the joint distribution of X and Y, together with the marginal probabilities.

               Y
  X         0      1     f_X(x)
 −1         0     1/3     1/3
  0        1/3     0      1/3
  1         0     1/3     1/3
 f_Y(y)    1/3    2/3      1

Here E(X) = 0 and E(XY) = E(X³) = 0, so Cov(X, Y) = 0 and X and Y are uncorrelated; yet Y is a function of X, so X and Y are certainly not independent. Thus the converse of Theorem 7 does not hold.
Theorem 8

Let X and Y be random variables such that Var(X) and Var(Y) exist, and let a, b and c be
constants. Then

$$Var(aX + bY + c) = a^2\,Var(X) + b^2\,Var(Y) + 2ab\,Cov(X, Y).$$

Proof

Suppose that E(X) = µ₁ and E(Y) = µ₂. You can easily verify that E(aX + bY + c) = aµ₁ + bµ₂ + c. Then

$$Var(aX + bY + c) = E\{[a(X - \mu_1) + b(Y - \mu_2)]^2\} = a^2\,Var(X) + b^2\,Var(Y) + 2ab\,Cov(X, Y).$$

In particular, if a = b = 1, c = 0, then Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y). Similarly, if a = 1, b = −1, c = 0, then Var(X − Y) = Var(X) + Var(Y) − 2Cov(X, Y).
Theorem 9

Let X₁, X₂, …, Xₙ be n independent random variables and let b, a₁, a₂, …, aₙ be arbitrary
constants. Suppose that Var(X_i) exists for each i; then

$$Var\left(b + \sum_{i=1}^{n} a_i X_i\right) = \sum_{i=1}^{n} a_i^2\,Var(X_i).$$
Theorem 10

Let a and b be constants; then Cov(aX, bY) = ab Cov(X, Y).

Proof

$$Cov(aX, bY) = E(abXY) - E(aX)E(bY) = ab[E(XY) - E(X)E(Y)] = ab\,Cov(X, Y).$$
Example 5.35
Suppose that X and Y are independent random variables for which
Var(X) = Var(Y) = 3. Find the value of Var(X − Y) and Var(2X − 3Y + 1).

Solution

Since X and Y are independent, Cov(X, Y) = 0, so Var(X − Y) = Var(X) + Var(Y) = 6 and
Var(2X − 3Y + 1) = 4Var(X) + 9Var(Y) = 12 + 27 = 39.
5.9 The Multinomial Distribution

Definition 13

The random variables X₁, X₂, …, X_k are said to have a multinomial distribution if and only if
their joint probability distribution can be written as

$$f(x_1, x_2, \ldots, x_k;\; p_1, p_2, \ldots, p_k, n) = \binom{n}{x_1, x_2, \ldots, x_k} p_1^{x_1} p_2^{x_2} \cdots p_k^{x_k}$$

where, for each i, x_i ∈ {0, 1, …, n}, $\sum_{i=1}^{k} x_i = n$, $\sum_{i=1}^{k} p_i = 1$ and

$$\binom{n}{x_1, x_2, \ldots, x_k} = \frac{n!}{x_1!\,x_2!\cdots x_k!}.$$

Many times we write X = (X₁, X₂, …, X_k), p = (p₁, p₂, …, p_k) and say that the vector X has
a multinomial distribution with parameters n and p.
Example 5.36
Find the probability of obtaining 2 ones, 1 two, 1 three, 2 fours, 3 fives and 1 six in 10 rolls of
a balanced die.

Solution

Let X₁ denote the number of ones, X₂ the number of twos, X₃ the number of threes, X₄ the
number of fours, X₅ the number of fives and X₆ the number of sixes. Then

$$P(X_1 = 2, X_2 = 1, X_3 = 1, X_4 = 2, X_5 = 3, X_6 = 1) = \binom{10}{2, 1, 1, 2, 3, 1}\left(\frac{1}{6}\right)^2\left(\frac{1}{6}\right)^1\left(\frac{1}{6}\right)^1\left(\frac{1}{6}\right)^2\left(\frac{1}{6}\right)^3\left(\frac{1}{6}\right)^1 = 0.0025.$$
Remark 2
If k = 2, then x₁ + x₂ = n and p₁ + p₂ = 1, which implies that x₂ = n − x₁ and p₂ = 1 − p₁.
Therefore,

$$f(x_1, x_2;\; p_1, p_2) = \binom{n}{x_1, x_2} p_1^{x_1} p_2^{x_2} = \binom{n}{x_1, n - x_1} p_1^{x_1}(1 - p_1)^{n - x_1} = \binom{n}{x_1} p_1^{x_1}(1 - p_1)^{n - x_1} = B(n, x_1, p_1),$$

the binomial probability function.

Suppose that the random vector X has a multinomial distribution with parameters n and p.
By Remark 2, the marginal distribution of each component X_i is a binomial distribution with
parameters n and p_i. Therefore, E(X_i) = np_i and Var(X_i) = np_i(1 − p_i).
5.10 Further Examples

Example 5.37

Suppose that the joint probability density function of two random variables X and Y is

$$f(x, y) = \begin{cases} c(x^2 + y^2) & 0 \le y \le 1 - x^2 \\ 0 & \text{elsewhere.} \end{cases}$$

Determine

(a) the value of c.

(b) P(0 ≤ X ≤ 1/2).

(c) P(Y ≥ X + 1).

(d) P(Y = X²).

Solution

The region over which the probability density function is positive is shown in Figure 5.4.

(a)

$$c\int_{-1}^{1}\int_{0}^{1-x^2} (x^2 + y^2)\,dy\,dx = 1 \;\Rightarrow\; c\int_{-1}^{1}\left[x^2 y + \frac{y^3}{3}\right]_0^{1-x^2} dx = 1 \;\Rightarrow\; c\int_{-1}^{1}\left[x^2(1 - x^2) + \frac{(1 - x^2)^3}{3}\right] dx = 1 \;\Rightarrow\; c = \frac{7}{4}.$$

(b) Using the simplification $x^2(1 - x^2) + \frac{(1 - x^2)^3}{3} = \frac{1 - x^6}{3}$,

$$P\left(0 \le X \le \tfrac{1}{2}\right) = \frac{7}{4}\int_{0}^{1/2}\int_{0}^{1-x^2} (x^2 + y^2)\,dy\,dx = \frac{7}{4}\int_{0}^{1/2} \frac{1 - x^6}{3}\,dx \approx 0.29.$$
(c)

$$P(Y \ge X + 1) = \frac{7}{4}\int_{-1}^{0}\int_{x+1}^{1-x^2} (x^2 + y^2)\,dy\,dx = \frac{7}{4}\int_{-1}^{0}\left[x^2 y + \frac{y^3}{3}\right]_{x+1}^{1-x^2} dx = -\frac{7}{12}\int_{-1}^{0} (x^6 + 4x^3 + 6x^2 + 3x)\,dx = \frac{5}{24}.$$

(d) P(Y = X²) = 0, since the set {(x, y) : y = x²} is a curve of zero area in the plane.
Example 5.38
Let X and Y be two jointly distributed random variables with probability density function

$$f(x, y) = \begin{cases} e^{-(x+y)} & x, y > 0, \\ 0 & \text{elsewhere.} \end{cases}$$

Find

Solution

(a) $P(X > 1) = \int_{x=1}^{\infty}\int_{y=0}^{\infty} e^{-(x+y)}\,dy\,dx = e^{-1} \approx 0.37.$
(b)
(c)

$$P(X < Y \mid X < 2Y) = \frac{P(X < Y,\; X < 2Y)}{P(X < 2Y)} = \frac{P(X < Y)}{P(X < 2Y)} = \frac{\int_{y=0}^{\infty}\int_{x=0}^{y} e^{-(x+y)}\,dx\,dy}{\int_{y=0}^{\infty}\int_{x=0}^{2y} e^{-(x+y)}\,dx\,dy} = \frac{\int_{0}^{\infty} (e^{-y} - e^{-2y})\,dy}{\int_{0}^{\infty} (e^{-y} - e^{-3y})\,dy} = \frac{1/2}{2/3} = \frac{3}{4}.$$
(d)

$$P(0 < X < 1 \mid Y = 2) = \frac{e^{-2}\int_0^1 e^{-x}\,dx}{e^{-2}} = 1 - e^{-1} \approx 0.63.$$
(e)

$$P(X + Y < m) = \int_{y=0}^{m}\left(\int_{x=0}^{m-y} e^{-x}\,dx\right) e^{-y}\,dy = \int_{y=0}^{m}\left[-e^{-x}\right]_0^{m-y} e^{-y}\,dy = \int_{y=0}^{m}\left(e^{-y} - e^{-m}\right) dy = 1 - e^{-m} - me^{-m}.$$

So P(X + Y < m) = 1/2 implies that (m + 1)e^{−m} − 1/2 = 0. A single linear interpolation with
initial guesses 1 and 2 gives m ≈ 1.715; iterating the root search refines this to m ≈ 1.678.
Example 5.39
Suppose that the joint probability density function of X and Y is as follows:

$$f(x, y) = \begin{cases} \frac{15}{4}x^2 & 0 \le y \le 1 - x^2 \\ 0 & \text{otherwise.} \end{cases}$$

(a) Determine the marginal probability density functions of X and Y. (b) Are X and Y independent?

Solution

The region over which the probability density function is positive is shown in Figure 5.5.

(a)

$$f_X(x) = \frac{15}{4}\int_{0}^{1-x^2} x^2\,dy = \frac{15}{4}x^2(1 - x^2), \quad -1 \le x \le 1.$$

$$f_Y(y) = \frac{15}{4}\int_{-\sqrt{1-y}}^{\sqrt{1-y}} x^2\,dx = \frac{5}{2}(1 - y)^{3/2}, \quad 0 \le y \le 1.$$

(b) Since f_{X,Y}(x, y) ≠ f_X(x)f_Y(y), it follows that X and Y are not independent.
Example 5.40
Suppose that X and Y are random variables such that Var(X) = 9, Var(Y) = 4, and ρ_{X,Y} = −1/6. Determine

(i) Var(X + Y).

(ii) Var(X − 3Y + 4).

Solution

Here Cov(X, Y) = ρ_{X,Y} σ_X σ_Y = (−1/6)(3)(2) = −1.

(i) Var(X + Y) = Var(X) + Var(Y) + 2Cov(X, Y) = 9 + 4 − 2 = 11.

(ii) Var(X − 3Y + 4) = Var(X) + 9Var(Y) − 6Cov(X, Y) = 9 + 36 + 6 = 51.
Example 5.41
Let X and Y have the joint density function

$$f(x, y) = \begin{cases} k(x - y) & 0 \le y \le x \le 1 \\ 0 & \text{elsewhere.} \end{cases}$$

(a) Sketch the region over which the density is positive and use it in determining limits of
integration to answer the following questions.

(b) Find k.

Solution

Figure 5.6: Region 0 ≤ y ≤ x ≤ 1.
(b) Since $\int_{x=0}^{1}\int_{y=0}^{x} k(x - y)\,dy\,dx = 1$, it implies that

$$k\int_{x=0}^{1}\left[-\frac{1}{2}(x - y)^2\right]_{y=0}^{x} dx = k\int_0^1 \frac{1}{2}x^2\,dx = \frac{k}{6} = 1 \;\Rightarrow\; k = 6.$$
(c)

$$f_X(x) = \int_{y=0}^{x} 6(x - y)\,dy = \left[-3(x - y)^2\right]_{y=0}^{x} = 3x^2 \quad \text{for } 0 \le x \le 1.$$

(d)

$$f(y|x) = \frac{f(x, y)}{f_X(x)} = \frac{6(x - y)}{3x^2} = \frac{2(x - y)}{x^2} \quad \text{for } 0 \le y \le x \le 1.$$

$$f\left(y \,\middle|\, x = \tfrac{1}{2}\right) = \frac{2\left(\tfrac{1}{2} - y\right)}{\tfrac{1}{4}} = 4(1 - 2y) \quad \text{for } 0 \le y \le \tfrac{1}{2}.$$
(e)

$$E(Y \mid X = x) = \int_0^x y\,\frac{6(x - y)}{3x^2}\,dy = \frac{x}{3}. \tag{5.9}$$

(f)

$$E\left(Y \,\middle|\, X = \tfrac{1}{2}\right) = \int_{y=0}^{1/2} 4y(1 - 2y)\,dy = 4\int_{0}^{1/2} (y - 2y^2)\,dy = 4\left[\frac{y^2}{2} - \frac{2}{3}y^3\right]_0^{1/2} = \frac{1}{6}.$$
(h)
Exercises

1. Two random variables are jointly distributed with probability function shown in the table
below.

              Y
  X        0       1       2
  0       1/16    1/16    1/8
  1       1/16    1/16    1/16
  2       1/16    1/8     3/8
Find

(b) P(X = 0).

(c) P(Y ≥ 1, X = 2).
2. X and Y are jointly distributed with probability function

$$f(x, y) = \frac{1}{30}(x + y) \quad \text{for } x = 0, 1, 2, 3;\; y = 0, 1, 2.$$

Construct a table showing the values of the distribution function of the two random variables
for all possible pairs (x, y).
3. The values of the joint probability distribution of X and Y are shown in the table below.
              Y
  X        0       1       2
  0       1/12    1/6     1/24
  1       1/4     1/4     1/40
  2       1/8     1/20     0
  3       1/120    0       0
Determine

(a) P(X = 1, Y = 1).

(b) P(X = 0, 1 ≤ Y ≤ 3).

(c) P(X + Y ≤ 1).

(d) P(X > Y).
4. X and Y are discrete random variables jointly distributed with probability function

$$f(x, y) = c|x + y|, \quad x = 1, 2, 3;\; y = -1, 0, 1, 2.$$

Find
5. Suppose that a point (X, Y) is picked at random from the circle S defined as S = {(x, y) :
(x − 1)² + (y + 2)² ≤ 9}. Determine

(a) the conditional probability density function of Y for any given value of X.

(b) P(Y > 0 | X = 2).
6. Consider tossing two tetrahedral dice with sides numbered 1 to 4. Let X be the smaller of the
two downturned numbers and Y the larger. Find

(c) f_{Y|X}(y|x).
7. Suppose that the joint probability density function of two random variables X and Y is
as follows:

$$f(x, y) = \begin{cases} c(x + y^2) & 0 \le x \le 1 \text{ and } 0 \le y \le 1, \\ 0 & \text{otherwise.} \end{cases}$$

Determine

(a) the conditional probability density function of X for any given value of Y.

(b) P(X ≤ 1/2 | Y = 1/2).
8. The joint frequency function of X and Y is given in the table below.

              Y
  X        1       2       3       4
  1       0.10    0.05    0.02    0.02
  2       0.05    0.20    0.05    0.02
  3       0.02    0.05    0.20    0.04
  4       0.02    0.02    0.04    0.10

(b) Find the probability mass function of the random variable E(Y|X).
9. Given that X and Y are jointly distributed with a probability density function

$$f(x, y) = \begin{cases} k(2x + y) & 0 < x < 1,\; 0 < y < 2 \\ 0 & \text{otherwise.} \end{cases}$$
10. Suppose that X and Y have a continuous joint distribution for which the joint probability
density function is as follows:

$$f(x, y) = \begin{cases} x + y & 0 \le x \le 1 \text{ and } 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$
11. Suppose that an automobile dealer pays an amount X (in thousands of dollars) for a used
car and then sells it for an amount Y. Suppose that the random variables X and Y have
the following joint probability density function:

$$f(x, y) = \begin{cases} \frac{1}{36}x & 0 < x < y < 6 \\ 0 & \text{otherwise.} \end{cases}$$

If it is known that he sold the automobile for 3 thousand dollars, what is its expected cost price?
12. Suppose that X and Y have a continuous joint distribution for which the joint p.d.f. is as
follows:

$$f(x, y) = \begin{cases} 6(x - y) & 0 \le y \le x \le 1 \\ 0 & \text{elsewhere.} \end{cases}$$
13. Suppose that in a certain drug the concentration of a particular chemical is a random
variable with a continuous distribution for which the probability density function g is as
follows:

$$g(x) = \begin{cases} \frac{3}{8}x^2 & 0 \le x \le 2 \\ 0 & \text{otherwise.} \end{cases}$$

Suppose that the concentrations X and Y of the chemical in two separate batches of the
drug are independent random variables for each of which the probability density function
is g. Determine
14. Suppose that a point (X, Y) is chosen at random from a rectangle defined as follows:
15. The joint frequency function of X and Y is given in the table below.
              Y
  X        1       2       3       4
  1       0.10    0.05    0.02    0.02
  2       0.05    0.20    0.05    0.02
  3       0.02    0.05    0.20    0.04
  4       0.02    0.02    0.04    0.10
16. Given that X and Y are jointly distributed with a probability density function

$$f(x, y) = \begin{cases} k(2x + y) & 0 < x < 1,\; 0 < y < 2 \\ 0 & \text{otherwise,} \end{cases}$$

find k and the correlation between X and Y.
17. Suppose that X and Y have a continuous joint distribution for which the joint probability
density function is as follows:

$$f(x, y) = \begin{cases} x + y & 0 \le x \le 1 \text{ and } 0 \le y \le 1 \\ 0 & \text{otherwise.} \end{cases}$$

Find the correlation of X and Y.
18. Suppose that X, Y and Z are three random variables such that Var(X) = 1, Var(Y) = 4,
Var(Z) = 8, Cov(X, Y) = 1, Cov(X, Z) = −1, and Cov(Y, Z) = 2. Determine Var(X + Y + Z)
and Var(3X − Y − 2Z + 1).
19. Given the joint distribution $f(x, y, z) = \frac{xyz}{108}$, x = 1, 2, 3; y = 1, 2, 3; z = 1, 2, find the
conditional expectation of the random variable U = Z² given that X = 1 and Y = 2.
20. Let X, Y and Z be three random variables such that Cov(X, Z) and Cov(Y, Z) exist, and
let a, b, and c be any constants. Show that Cov(aX+bY +c, Z) = aCov(X, Z)+bCov(Y, Z).
21. Suppose that X and Y are two random variables which may be dependent and that
Var(X) = Var(Y). Assuming that 0 < Var(X + Y) < ∞ and 0 < Var(X − Y) < ∞,
show that the random variables X + Y and X − Y are uncorrelated.
22. Suppose that the random variables X and Y are jointly distributed with a joint cumulative
distribution function F_{X,Y} defined as

$$F_{X,Y}(x, y) = \begin{cases} x^2(1 - e^{-y}) & 0 \le x \le 1,\; 0 \le y < \infty, \\ 0 & \text{otherwise.} \end{cases}$$
23. Suppose that X₁, …, X_m and Y₁, …, Yₙ are random variables such that Cov(X_i, Y_j) exists
for i = 1, …, m and j = 1, …, n, and suppose that a₁, …, a_m and b₁, …, bₙ are constants.
Show that

$$Cov\left(\sum_{i=1}^{m} a_i X_i,\; \sum_{j=1}^{n} b_j Y_j\right) = \sum_{i=1}^{m}\sum_{j=1}^{n} a_i b_j\,Cov(X_i, Y_j).$$
24. Suppose that X and Y are negatively correlated. Is Var(X + Y) larger or smaller than
Var(X − Y)?
25. Suppose that f(x, y) = xe^{−x(y+1)}, 0 ≤ x < ∞, 0 ≤ y < ∞. Find the marginal densities
of X and Y. Are X and Y independent?