Continuation of Random Variable
Joint Probability Distribution
A joint probability distribution gives the probability distribution of two or more random variables taken together.
Example: A coin is tossed three times. If X denotes the number of heads and Y denotes the
number of tails in the last two tosses, then find the joint probability distribution of X and Y.
Solution: The outcomes of the experiment and the associated probabilities are shown below:
Outcome X Y P(X,Y)
HHH 3 0 1/8
HHT 2 1 1/8
HTH 2 1 1/8
HTT 1 2 1/8
THH 2 0 1/8
THT 1 1 1/8
TTH 1 1 1/8
TTT 0 2 1/8
It is easy to see that X assumes the values 0, 1, 2, and 3, while Y assumes the values 0, 1, and 2. The joint
probability distribution can be written as
                         Values of X
Values of Y        0       1       2       3     Row Sum
     0             0       0      1/8     1/8      2/8
     1             0      2/8     2/8      0       4/8
     2            1/8     1/8      0       0       2/8
Column Sum        1/8     3/8     3/8     1/8       1
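The joint table can be reproduced by listing the eight equally likely outcomes and tallying the (X, Y) pairs; a minimal Python sketch (for illustration only, using the standard fractions and itertools modules):

    from fractions import Fraction
    from itertools import product
    from collections import defaultdict

    joint = defaultdict(Fraction)            # (x, y) -> P(X = x, Y = y)
    for outcome in product("HT", repeat=3):  # the 8 equally likely outcomes
        x = outcome.count("H")               # X = number of heads in the three tosses
        y = outcome[1:].count("T")           # Y = number of tails in the last two tosses
        joint[(x, y)] += Fraction(1, 8)

    for (x, y), p in sorted(joint.items()):
        print(f"f({x},{y}) = {p}")           # e.g. f(2,1) = 1/4, i.e. 2/8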
Joint Distribution for continuous variables:
Let X and Y be two continuous random variables. Then the function f(x,y) is called the joint
probability density function of X and Y if
1. f(x, y) ≥ 0, for all (x, y)
2. ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1
Example: Verify that
    f(x, y) = x² + xy/3,  for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2  (and 0 elsewhere)
is a joint probability density function.
Solution: Clearly, f(x, y) ≥ 0 for all values of x and y in the given range. And
    ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_0^2 ∫_0^1 (x² + xy/3) dx dy = ∫_0^2 (1/3 + y/6) dy = [y/3 + y²/12]_0^2 = 1.
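As a quick symbolic cross-check of this integral, one can use sympy (a sketch, not part of the hand calculation above):

    import sympy as sp

    x, y = sp.symbols("x y", nonnegative=True)
    f = x**2 + x*y/3                      # joint density on 0 <= x <= 1, 0 <= y <= 2

    # the innermost integration is listed first: integrate over x, then over y
    total = sp.integrate(f, (x, 0, 1), (y, 0, 2))
    print(total)                          # 1, so f is a valid joint density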
Example: A joint density function is given by
    f(x, y) = { x(1 + 3y²)/4,  0 ≤ x ≤ 2, 0 ≤ y ≤ 1
              { 0,             elsewhere.
Find P(0 < X < 1, 1/4 < Y < 1/2).
Solution:
    P(0 < X < 1, 1/4 < Y < 1/2) = ∫_{1/4}^{1/2} ∫_0^1 [x(1 + 3y²)/4] dx dy
                                = ∫_{1/4}^{1/2} (1/8 + 3y²/8) dy
                                = [y/8 + y³/8]_{1/4}^{1/2} = 23/512.
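The same probability can be evaluated symbolically (illustrative sympy sketch; the inner integral is taken over x first):

    import sympy as sp

    x, y = sp.symbols("x y")
    f = x*(1 + 3*y**2)/4                  # density on 0 <= x <= 2, 0 <= y <= 1

    p = sp.integrate(f, (x, 0, 1), (y, sp.Rational(1, 4), sp.Rational(1, 2)))
    print(p)                              # 23/512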
*Marginal distribution:
When the distribution of a single random variable (say, X or Y) is derived from a joint probability
distribution (say, f(x, y)), the resulting distribution is known as the marginal distribution (of X or Y).
When the random variables X and Y are discrete, the marginal distributions of X and Y are
    g(x) = Σ_y f(x, y)   and   h(y) = Σ_x f(x, y).
When X and Y are continuous, the marginal densities are
    g(x) = ∫_{−∞}^{∞} f(x, y) dy,  for −∞ < x < ∞,
    h(y) = ∫_{−∞}^{∞} f(x, y) dx,  for −∞ < y < ∞.
* Note that
    ∫_{−∞}^{∞} g(x) dx = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1,
and
    P(a < X < b) = P(a < X < b, −∞ < Y < ∞) = ∫_a^b ∫_{−∞}^{∞} f(x, y) dy dx = ∫_a^b g(x) dx.
Example: For the joint distribution f(x, y) of two discrete random variables X and Y (for example, X = the number of heads in three tosses of a coin and Y = the number of tails on the last toss), the marginal distribution of X is
    g(0) = P(X = 0) = Σ_{y=0}^{1} f(0, y) = f(0, 0) + f(0, 1) = 0 + 1/8 = 1/8
    g(1) = P(X = 1) = Σ_{y=0}^{1} f(1, y) = f(1, 0) + f(1, 1) = 1/8 + 2/8 = 3/8
    g(2) = P(X = 2) = Σ_{y=0}^{1} f(2, y) = f(2, 0) + f(2, 1) = 2/8 + 1/8 = 3/8
    g(3) = P(X = 3) = Σ_{y=0}^{1} f(3, y) = f(3, 0) + f(3, 1) = 1/8 + 0 = 1/8
Similarly, for Y,
    h(0) = Σ_{x=0}^{3} f(x, 0) = 0 + 1/8 + 2/8 + 1/8 = 4/8 = 1/2
    h(1) = Σ_{x=0}^{3} f(x, 1) = 1/8 + 2/8 + 1/8 + 0 = 4/8 = 1/2
Marginal distribution of X:
    x        0      1      2      3     Sum
    g(x)    1/8    3/8    3/8    1/8     1
Marginal distribution of Y:
    y        0      1     Sum
    h(y)    4/8    4/8     1
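The two marginal tables follow from the joint probabilities by summing over the other variable; a short Python sketch (the joint values are typed in by hand here, for illustration):

    from fractions import Fraction
    from collections import defaultdict

    F = Fraction
    # joint probabilities f(x, y) (X = heads in three tosses, Y = tails on the last toss)
    joint = {(0, 0): F(0),    (0, 1): F(1, 8),
             (1, 0): F(1, 8), (1, 1): F(2, 8),
             (2, 0): F(2, 8), (2, 1): F(1, 8),
             (3, 0): F(1, 8), (3, 1): F(0)}

    g = defaultdict(Fraction)                # marginal distribution of X
    h = defaultdict(Fraction)                # marginal distribution of Y
    for (xv, yv), p in joint.items():
        g[xv] += p
        h[yv] += p

    for xv in sorted(g):
        print(f"g({xv}) = {g[xv]}")          # 1/8, 3/8, 3/8, 1/8
    for yv in sorted(h):
        print(f"h({yv}) = {h[yv]}")          # 1/2, 1/2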
* Consider the joint density
    f(x, y) = (6 − x − y)/8,  for 0 < x < 2, 2 < y < 4  (and 0 elsewhere).
The marginal densities are
    g(x) = ∫_2^4 [(6 − x − y)/8] dy = (3 − x)/4,  for 0 < x < 2,
and
    h(y) = ∫_0^2 [(6 − x − y)/8] dx = (5 − y)/4,  for 2 < y < 4.
Both are non-negative, ∫_0^2 g(x) dx = (1/4)[3x − x²/2]_0^2 = 1, and
    ∫_2^4 h(y) dy = ∫_2^4 [(5 − y)/4] dy = (1/4)[5y − y²/2]_2^4 = 1.
Here g(x) and h(y) satisfy all the conditions of a density function.
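Both marginal densities, and the fact that each integrates to 1, can be verified with a short sympy sketch (illustrative only):

    import sympy as sp

    x, y = sp.symbols("x y")
    f = (6 - x - y) / 8                   # joint density on 0 < x < 2, 2 < y < 4

    g = sp.integrate(f, (y, 2, 4))        # marginal density of X
    h = sp.integrate(f, (x, 0, 2))        # marginal density of Y
    print(sp.simplify(g))                 # 3/4 - x/4, i.e. (3 - x)/4
    print(sp.simplify(h))                 # 5/4 - y/4, i.e. (5 - y)/4
    print(sp.integrate(g, (x, 0, 2)), sp.integrate(h, (y, 2, 4)))   # 1 1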
Now
    P(X + Y < 3) = (1/8) ∫_0^1 ∫_2^{3−x} (6 − x − y) dy dx
                 = (1/8) ∫_0^1 [6y − xy − y²/2]_{y=2}^{y=3−x} dx
                 = (1/8) ∫_0^1 (x²/2 − 4x + 7/2) dx
                 = (1/8) [x³/6 − 2x² + 7x/2]_0^1 = 5/24.
(The x-integration runs only from 0 to 1, because y > 2 forces x < 1 whenever x + y < 3.)
Similarly,
    P(X < 3/2, Y < 5/2) = (1/8) ∫_0^{3/2} ∫_2^{5/2} (6 − x − y) dy dx = 9/32.
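Both probabilities can be cross-checked by symbolic integration (sympy sketch; the variable names are illustrative):

    import sympy as sp

    x, y = sp.symbols("x y")
    f = (6 - x - y) / 8

    # P(X + Y < 3): y runs from 2 to 3 - x, which is non-empty only for 0 < x < 1
    p1 = sp.integrate(f, (y, 2, 3 - x), (x, 0, 1))
    # P(X < 3/2, Y < 5/2)
    p2 = sp.integrate(f, (y, 2, sp.Rational(5, 2)), (x, 0, sp.Rational(3, 2)))
    print(p1, p2)                         # 5/24 9/32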
Conditional Distribution:
For two random variables X and Y, the conditional probability distribution of Y, given that X = x, is
defined as
    f(y|x) = P(Y = y | X = x) = P(X = x, Y = y)/P(X = x) = f(x, y)/g(x).
Similarly, the conditional distribution of X, given that Y = y, is f(x|y) = f(x, y)/h(y), and
    P(a < X < b | Y = y) = { Σ_{a<x<b} f(x|y),   if X is discrete
                           { ∫_a^b f(x|y) dx,    if X is continuous.
* The joint probability distribution of X and Y is given in the following table:
                      Values of X
    Values of Y      0        1        2
         0          3/28     9/28     3/28
         1          6/28     6/28      0
         2          1/28      0        0
Find f(x|1), f(y|1), and P(X = 0 | Y = 1).
Solution: f(x|1) = f(x, 1)/h(1) and f(y|1) = f(1, y)/g(1).
Now, h(1) = Σ_{x=0}^{2} f(x, 1) = f(0, 1) + f(1, 1) + f(2, 1) = 6/28 + 6/28 + 0 = 3/7.
Therefore,
    f(0|1) = (7/3) f(0, 1) = (7/3) × (6/28) = 1/2
    f(1|1) = (7/3) f(1, 1) = (7/3) × (6/28) = 1/2
    f(2|1) = (7/3) f(2, 1) = (7/3) × 0 = 0
Again,
    g(1) = Σ_{y=0}^{2} f(1, y) = f(1, 0) + f(1, 1) + f(1, 2) = 9/28 + 6/28 + 0 = 15/28.
Hence f(y|1) = f(1, y)/g(1) = (28/15) f(1, y), for y = 0, 1, 2.
Therefore,
    f(0|1) = (28/15) f(1, 0) = (28/15) × (9/28) = 3/5
    f(1|1) = (28/15) f(1, 1) = (28/15) × (6/28) = 2/5
    f(2|1) = (28/15) f(1, 2) = (28/15) × 0 = 0
Hence, the conditional distribution of Y given X=1 is
y 0 1 2
f(y|1) 3/5 2/5 0
And P(X = 0 | Y = 1) = f(0|1) = f(0, 1)/h(1) = (6/28)/(3/7) = 1/2.
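A short Python sketch (illustrative; the joint probabilities are copied from the table) reproduces f(x|1), f(y|1), and P(X = 0 | Y = 1):

    from fractions import Fraction

    F = Fraction
    # joint table: keys are (x, y)
    joint = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
             (0, 1): F(6, 28), (1, 1): F(6, 28), (2, 1): F(0),
             (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}

    h1 = sum(joint[(xv, 1)] for xv in range(3))                  # h(1) = 3/7
    g1 = sum(joint[(1, yv)] for yv in range(3))                  # g(1) = 15/28

    f_x_given_1 = {xv: joint[(xv, 1)] / h1 for xv in range(3)}   # 1/2, 1/2, 0
    f_y_given_1 = {yv: joint[(1, yv)] / g1 for yv in range(3)}   # 3/5, 2/5, 0

    print(f_x_given_1[0])                                        # P(X = 0 | Y = 1) = 1/2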
* f(x, y) = { 1/2,  for 0 ≤ x ≤ y ≤ 2
            { 0,    otherwise
a. Find the marginal density of Y and hence the conditional density of X.
b. Find 𝑃(𝑋 ≤ 0.5 |𝑌 = 1.5).
Solution: a. The marginal density of Y is
    h(y) = ∫_{−∞}^{∞} f(x, y) dx = ∫_0^y (1/2) dx = y/2.
Thus h(y) = y/2, for 0 ≤ y ≤ 2.
The conditional density of X, given Y = y, is therefore
    f(x|y) = f(x, y)/h(y) = (1/2)/(y/2) = 1/y,  for 0 ≤ x ≤ y.
b. P(X ≤ 0.5 | Y = 1.5) = ∫_0^{0.5} f(x|1.5) dx = ∫_0^{0.5} (1/1.5) dx = 1/3.
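The same calculation as a sympy sketch (for illustration):

    import sympy as sp

    x, y = sp.symbols("x y", positive=True)
    f = sp.Rational(1, 2)                 # joint density on 0 <= x <= y <= 2

    h = sp.integrate(f, (x, 0, y))        # marginal density of Y: y/2
    f_x_given_y = f / h                   # conditional density of X given Y = y: 1/y

    p = sp.integrate(f_x_given_y.subs(y, sp.Rational(3, 2)), (x, 0, sp.Rational(1, 2)))
    print(h, f_x_given_y, p)              # y/2 1/y 1/3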
*Independence of Random Variables:
Two random variables X and Y with marginal densities g(x) and h(y), respectively, are said to be
independent if and only if
    f(x|y) = g(x)   or, equivalently,   f(y|x) = h(y),   for all (x, y),
where f(x|y) is the conditional density of X for given Y.
If X and Y are independent, then
    f(x, y) = g(x) · h(y)
for all values of x and y.
                        Values of X
    Values of Y      x₁           x₂           x₃          Row Sum
        y₁        f(x₁, y₁)    f(x₂, y₁)    f(x₃, y₁)       h(y₁)
        y₂        f(x₁, y₂)    f(x₂, y₂)    f(x₃, y₂)       h(y₂)
    Column Sum      g(x₁)        g(x₂)        g(x₃)           1
For independence,
𝑓(𝑥1 , 𝑦1 ) = 𝑔(𝑥1 ). ℎ(𝑦1 )
𝑓(𝑥2 , 𝑦1 ) = 𝑔(𝑥2 ). ℎ(𝑦1 )
.
.
.
𝑓(𝑥3 , 𝑦2 ) = 𝑔(𝑥3 ). ℎ(𝑦2 )
* Suppose X and Y have the following joint probability function:
Values of X
Values of Y 2 4 Row Sum
1 0.10 0.15 0.25
3 0.20 0.30 0.50
5 0.10 0.15 0.25
Column Sum 0.40 0.60 1
Check if X and Y are independent.
Solution:
i) f(2,1)=0.10 and g(2)=0.40, h(1)=0.25, hence g(2).h(1)=0.10=f(2,1).
ii) f(4,1)=0.15 and g(4)=0.60, h(1)=0.25, hence g(4).h(1)=0.15=f(4,1).
.
.
.
vi) f(4,5)=0.15 and g(4)=0.60, h(5)=0.25, hence g(4).h(5)=0.15=f(4,5).
For all values (x,y) of the random variables X and Y, f(x,y)=g(x).h(y). Hence, the
variables are independent.
Alternatively, we can use the concept of conditional probability in this case.
    P(Y = 1 | X = 2) = f(1|2) = f(2,1)/g(2) = 0.10/0.40 = 0.25 = h(1)
    P(Y = 3 | X = 2) = f(3|2) = f(2,3)/g(2) = 0.20/0.40 = 0.50 = h(3)
    .
    .
    .
    P(Y = 5 | X = 4) = f(5|4) = f(4,5)/g(4) = 0.15/0.60 = 0.25 = h(5)
Here the conditional distributions of Y for all X’s are equal to the marginal distribution of Y. So,
X and Y are independent.
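The cell-by-cell independence check can also be automated; a minimal Python sketch over the table above (for illustration):

    from fractions import Fraction

    F = Fraction
    joint = {(2, 1): F(10, 100), (4, 1): F(15, 100),
             (2, 3): F(20, 100), (4, 3): F(30, 100),
             (2, 5): F(10, 100), (4, 5): F(15, 100)}

    xs = {2, 4}
    ys = {1, 3, 5}
    g = {xv: sum(joint[(xv, yv)] for yv in ys) for xv in xs}   # column sums g(x)
    h = {yv: sum(joint[(xv, yv)] for xv in xs) for yv in ys}   # row sums h(y)

    independent = all(joint[(xv, yv)] == g[xv] * h[yv] for xv in xs for yv in ys)
    print(independent)                     # True: X and Y are independent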
* Let f(x, y) = (x + y)/8, for 0 < x < 2, 0 < y < 2, and 0 elsewhere. Here
    h(y) = (1/8) ∫_0^2 (x + y) dx = (y + 1)/4,
and, by symmetry, g(x) = (x + 1)/4.
Thus g(x) · h(y) = (x + 1)(y + 1)/16 ≠ f(x, y).
Also,
    f(x|y) = f(x, y)/h(y) = [(x + y)/8] / [(y + 1)/4] = (x + y)/(2(y + 1)) ≠ g(x).
Since the conditional distributions are not equal to the marginal distributions, the variables are
not independent.
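The same conclusion can be reached symbolically (illustrative sympy sketch):

    import sympy as sp

    x, y = sp.symbols("x y")
    f = (x + y) / 8                       # joint density on 0 < x < 2, 0 < y < 2

    g = sp.integrate(f, (y, 0, 2))        # marginal of X: (x + 1)/4
    h = sp.integrate(f, (x, 0, 2))        # marginal of Y: (y + 1)/4
    print(sp.simplify(g * h - f) == 0)    # False: g(x)*h(y) != f(x, y), not independent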
− ○ −