L15 Conditional Distributions and Independent Random Variables
1. Conditional Distributions
Definition 1. Let Z = (X, Y ) be a random vector of discrete type with support EZ , joint
c.d.f. FZ and joint p.m.f. fZ . Then X and Y are discrete type random variables. Let y ∈ R be
such that P (Y = y) > 0, and let fY denote the marginal p.m.f. of Y . The function fX|Y (·|y) : R −→ R defined as

fX|Y (x|y) = P (X = x | Y = y) = fZ (x, y)/fY (y), ∀ x ∈ R,

is called the conditional probability mass function of X, given Y = y, and

FX|Y (x|y) = P (X ≤ x | Y = y) = Σ_{t ≤ x} fX|Y (t|y), ∀ x ∈ R,

is called the conditional cumulative distribution function of X, given Y = y.
In a similar manner, we can define the conditional probability mass function and
conditional cumulative distribution function of Y , given X = x, provided P (X = x) > 0.
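As a quick numerical illustration of the discrete case, the sketch below tabulates a small joint p.m.f. and forms the conditional p.m.f. of X, given Y = y. The joint p.m.f. fZ (x, y) = xy/18 on {1, 2} × {1, 2, 3} is a made-up example, not one taken from these notes.

```python
# Conditional p.m.f. of X given Y = y for a small discrete joint p.m.f.
# The joint table f_Z(x, y) = x*y/18 is an illustrative (made-up) choice.
from fractions import Fraction

f_Z = {(x, y): Fraction(x * y, 18) for x in (1, 2) for y in (1, 2, 3)}
assert sum(f_Z.values()) == 1                 # f_Z is a genuine p.m.f.

def f_Y(y):
    # Marginal p.m.f. of Y: sum the joint p.m.f. over x.
    return sum(p for (_, yy), p in f_Z.items() if yy == y)

def f_X_given_Y(x, y):
    # f_{X|Y}(x | y) = f_Z(x, y) / f_Y(y), defined only when P(Y = y) > 0.
    return f_Z.get((x, y), Fraction(0)) / f_Y(y)

# For each fixed y with P(Y = y) > 0, the conditional p.m.f. sums to 1 over x.
for y in (1, 2, 3):
    assert sum(f_X_given_Y(x, y) for x in (1, 2)) == 1
    print(y, [f_X_given_Y(x, y) for x in (1, 2)])
```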
Definition 2. Let Z = (X, Y ) be a random vector of continuous type with joint c.d.f. FZ
and joint p.d.f. fZ . Then X and Y are continuous type random variables. Let y ∈ R be
such that fY (y) > 0, where fY is the marginal p.d.f. of Y .
The function fX|Y (.|y) : R −→ R defined as
fX|Y (x|y) = fZ (x, y)/fY (y), ∀ x ∈ R,
is called the conditional probability density function of X, given Y = y.
Also, the conditional cumulative distribution function of X, given Y = y, is defined as
FX|Y (x|y) = ∫_{−∞}^{x} fX|Y (t|y) dt = ∫_{−∞}^{x} fZ (t, y)/fY (y) dt, ∀ x ∈ R.
In a similar manner, we can define the conditional probability density function and
conditional cumulative distribution function of Y , given X = x, provided fX (x) > 0,
where fX is the marginal p.d.f. of X.
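As a sanity check on the continuous definitions, the following sketch computes a marginal p.d.f., the conditional p.d.f. and the conditional c.d.f. symbolically. The joint p.d.f. fZ (x, y) = x + y on (0, 1) × (0, 1) is an illustrative choice, not an example from these notes.

```python
# Conditional p.d.f. and c.d.f. of X given Y = y via symbolic integration.
# The joint p.d.f. f_Z(x, y) = x + y on (0,1)^2 is a made-up example.
import sympy as sp

x, y, t = sp.symbols('x y t', positive=True)
f_Z = x + y                                  # joint p.d.f. on 0 < x < 1, 0 < y < 1

f_Y = sp.integrate(f_Z, (x, 0, 1))           # marginal p.d.f. of Y: y + 1/2
f_X_given_Y = f_Z / f_Y                      # conditional p.d.f. f_{X|Y}(x | y)

# Conditional c.d.f.: F_{X|Y}(x | y) = integral of f_{X|Y}(t | y) dt over (0, x)
F_X_given_Y = sp.integrate(f_X_given_Y.subs(x, t), (t, 0, x))

print(sp.simplify(f_Y))                      # y + 1/2
print(sp.simplify(F_X_given_Y.subs(x, 1)))   # 1, i.e. total conditional probability
```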
2. Independent Random Variables
Definition. The random variables X1 , X2 , . . . , Xn are said to be independent if, for every k ∈ {2, 3, . . . , n} and every choice of indices 1 ≤ i1 < i2 < · · · < ik ≤ n,

FXi1 ,...,Xik (xi1 , xi2 , . . . , xik ) = ∏_{j=1}^{k} FXij (xij ), ∀ (xi1 , xi2 , . . . , xik ) ∈ Rk ,

where FXi1 ,...,Xik is the joint c.d.f. of (Xi1 , Xi2 , . . . , Xik ) and FXij is the marginal c.d.f. of
Xij , for 1 ≤ j ≤ k.
Theorem 6. Let X = (X1 , X2 , . . . , Xn ) : S −→ Rn be an n-dimensional (n ≥ 2) random
vector with joint c.d.f. FX . Let FXi be the marginal c.d.f. of Xi , for 1 ≤ i ≤ n. Then the
random variables X1 , X2 , . . . , Xn are independent if and only if
FX (x1 , x2 , · · · , xn ) = ∏_{i=1}^{n} FXi (xi ), ∀ (x1 , x2 , · · · , xn ) ∈ Rn .
In particular, X1 and X2 are independent if and only if, for every x2 in the support D of X2 ,
the conditional distribution of X1 , given X2 = x2 , is the same as the unconditional distribution of X1 .
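The factorization criterion in Theorem 6 is easy to check numerically for a given joint distribution. The sketch below does this for n = 2 with made-up marginal p.m.f.s, building the joint p.m.f. as their product so that the factorization of the c.d.f. must hold at every point.

```python
# Numerical check of the c.d.f. factorization criterion (Theorem 6, n = 2).
# The two marginal p.m.f.s below are made-up examples.
import itertools

p1 = {0: 0.5, 1: 0.5}                 # p.m.f. of X1
p2 = {1: 1/3, 2: 1/3, 3: 1/3}         # p.m.f. of X2
joint = {(a, b): p1[a] * p2[b] for a in p1 for b in p2}   # independent pair

def F_joint(x1, x2):
    return sum(p for (a, b), p in joint.items() if a <= x1 and b <= x2)

def F1(x1):
    return sum(p for a, p in p1.items() if a <= x1)

def F2(x2):
    return sum(p for b, p in p2.items() if b <= x2)

# F_X(x1, x2) = F_X1(x1) * F_X2(x2) at every tested point (up to rounding).
for x1, x2 in itertools.product([-1.0, 0.0, 0.5, 1.0, 2.0, 3.5], repeat=2):
    assert abs(F_joint(x1, x2) - F1(x1) * F2(x2)) < 1e-12
print("factorization holds at all tested points")
```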
Example 10. Let W = (X, Y, Z) be a random vector with joint p.m.f.

f (x, y, z) = xyz/72, if (x, y, z) ∈ {1, 2} × {1, 2, 3} × {1, 3}, and f (x, y, z) = 0, otherwise.

(1) Are X, Y and Z independent? (2) Are X and Z independent?
Solution: (1) The support of W is EW = {1, 2} × {1, 2, 3} × {1, 3}. The marginal p.m.f.s of X, Y and Z are
fX (x) = x/3 for x ∈ {1, 2}, fY (y) = y/6 for y ∈ {1, 2, 3} and fZ (z) = z/4 for z ∈ {1, 3}
(each 0 otherwise), respectively. Clearly f (x, y, z) = fX (x)fY (y)fZ (z), for all (x, y, z) ∈ R3 . Thus
X, Y and Z are independent.
(2) Let U = (X, Z). The support of U is EU = {(x, z) ∈ R2 | (x, y, z) ∈ EW for some y ∈
R} = {1, 2} × {1, 3}. For (x, z) ∈ EU , let R(x,z) = {y ∈ R | (x, y, z) ∈ EW } = {1, 2, 3}.
So the marginal p.m.f. of U is

fU (x, z) = Σ_{y∈R(x,z)} f (x, y, z) = xz/12, if (x, z) ∈ EU = {1, 2} × {1, 3}, and fU (x, z) = 0, otherwise.
Since fU (x, z) = fX (x)fZ (z) for all (x, z) ∈ R2 , X and Z are independent.
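The computations in Example 10 can be verified with exact arithmetic. The sketch below reproduces the marginal p.m.f.s and both independence checks; the helper names are ad hoc.

```python
# Verify the computations in Example 10 with exact (Fraction) arithmetic.
from fractions import Fraction

support = [(x, y, z) for x in (1, 2) for y in (1, 2, 3) for z in (1, 3)]
f = {(x, y, z): Fraction(x * y * z, 72) for (x, y, z) in support}
assert sum(f.values()) == 1

def marginal(axis):
    # Marginal p.m.f. of one coordinate (0 -> X, 1 -> Y, 2 -> Z).
    out = {}
    for pt, p in f.items():
        out[pt[axis]] = out.get(pt[axis], Fraction(0)) + p
    return out

fX, fY, fZ = marginal(0), marginal(1), marginal(2)
print(fX, fY, fZ)                        # fX(x) = x/3, fY(y) = y/6, fZ(z) = z/4

# (1) X, Y, Z independent: f(x, y, z) = fX(x) fY(y) fZ(z) on the support.
assert all(f[(x, y, z)] == fX[x] * fY[y] * fZ[z] for (x, y, z) in support)

# (2) X, Z independent: f_U(x, z) = sum over y = xz/12 = fX(x) fZ(z).
fU = {(x, z): sum(f[(x, y, z)] for y in (1, 2, 3)) for x in (1, 2) for z in (1, 3)}
assert all(fU[(x, z)] == fX[x] * fZ[z] == Fraction(x * z, 12)
           for x in (1, 2) for z in (1, 3))
print("Example 10 computations verified")
```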
Example 11. Let Z = (X, Y ) be a random vector with joint p.d.f.
fZ (x, y) = 1/x, if 0 < y < x < 1, and fZ (x, y) = 0, otherwise.
Are X and Y independent?
Alternative solution: The support of Z is EZ = {(x, y) ∈ R2 | 0 < y < x < 1}, while the
supports of X and Y are both (0, 1). If X and Y were independent, the support of Z would have to
equal EX × EY = (0, 1) × (0, 1). Since EZ ≠ EX × EY , X and Y are not independent.
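The conclusion of Example 11 can also be confirmed symbolically: computing the marginal p.d.f.s and evaluating them at a point of EX × EY that lies outside EZ shows that fZ cannot equal fX fY . The point (1/4, 1/2) below is chosen only for illustration.

```python
# Example 11: marginals of f_Z(x, y) = 1/x on 0 < y < x < 1, checked with sympy.
import sympy as sp

x, y = sp.symbols('x y', positive=True)

f_X = sp.integrate(1 / x, (y, 0, x))   # marginal of X: 1 on (0, 1)
f_Y = sp.integrate(1 / x, (x, y, 1))   # marginal of Y: -log(y) on (0, 1)
print(f_X, f_Y)

# At (x, y) = (1/4, 1/2) we have y > x, so f_Z(1/4, 1/2) = 0, but the product
# of the marginals is log(2) > 0.  Hence f_Z != f_X * f_Y and X, Y are dependent.
pt = {x: sp.Rational(1, 4), y: sp.Rational(1, 2)}
print((f_X * f_Y).subs(pt))            # log(2)
```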