
Two-dimensional Random Variables

Let 𝑆 be the sample space associated with a random experiment 𝐸. Let 𝑋 = 𝑋(𝑠) and 𝑌 = 𝑌(𝑠) be two functions, each assigning a real number to every 𝑠 ∈ 𝑆. Then (𝑋, 𝑌) is called a two-dimensional random variable.

BASIC TERMINOLOGY

Joint Probability mass function

Let (𝑋, 𝑌) be a two-dimensional discrete random variable. With each possible outcome (𝑥𝑖, 𝑦𝑗) we associate a number 𝑝(𝑥𝑖, 𝑦𝑗) representing 𝑃(𝑋 = 𝑥𝑖, 𝑌 = 𝑦𝑗) and satisfying the following conditions:

(i) 𝑝(𝑥𝑖, 𝑦𝑗) ≥ 0 for all 𝑖, 𝑗

(ii) ∑_{𝑖=1}^{∞} ∑_{𝑗=1}^{∞} 𝑝(𝑥𝑖, 𝑦𝑗) = 1

Joint Probability density function


Let (𝑋, 𝑌) be a two-dimensional continuous random variable assuming all values in some region 𝑅 of the Euclidean plane. The joint probability density function 𝑓 is a function satisfying the following conditions:
(i) 𝑓(𝑥, 𝑦) ≥ 0 for all (𝑥, 𝑦) ∈ 𝑅,
(ii) ∬𝑅 𝑓(𝑥, 𝑦)𝑑𝑥𝑑𝑦 = 1

Joint Cumulative distribution function


For two-dimensional random variable (𝑋, 𝑌), the cdf 𝐹(𝑥, 𝑦) is defined as
𝐹(𝑥, 𝑦) = 𝑃(𝑋 ≤ 𝑥, 𝑌 ≤ 𝑦)
(i) For a discrete random variable:

𝐹(𝑥, 𝑦) = ∑_{𝑥𝑖 ≤ 𝑥} ∑_{𝑦𝑗 ≤ 𝑦} 𝑝(𝑥𝑖, 𝑦𝑗)

(ii) For a continuous random variable:

𝐹(𝑥, 𝑦) = ∫_{−∞}^{𝑥} ∫_{−∞}^{𝑦} 𝑓(𝑢, 𝑣) 𝑑𝑣 𝑑𝑢
Note:
If 𝐹(𝑥, 𝑦) is the cdf of a two-dimensional random variable with joint pdf 𝑓(𝑥, 𝑦), then

∂²𝐹(𝑥, 𝑦)/∂𝑥∂𝑦 = 𝑓(𝑥, 𝑦)
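This relationship is easy to check symbolically. The sketch below uses sympy and, purely as an illustrative assumption, the pdf 𝑓(𝑥, 𝑦) = 𝑥² + 𝑥𝑦/3 on 0 ≤ 𝑥 ≤ 1, 0 ≤ 𝑦 ≤ 2 that appears in the exercises later in these notes:

```python
import sympy as sp

x, y, u, v = sp.symbols('x y u v', positive=True)

# Illustrative joint pdf (from the exercises below): f = x^2 + x*y/3
# on 0 <= x <= 1, 0 <= y <= 2. The pdf vanishes for negative arguments,
# so the lower limits of the cdf integrals become 0.
f = u**2 + u*v/3
F = sp.integrate(f, (u, 0, x), (v, 0, y))  # cdf F(x, y)

# Differentiating the cdf twice recovers the pdf.
assert sp.simplify(sp.diff(F, x, y) - (x**2 + x*y/3)) == 0
```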

Marginal Probability mass function

Let (𝑋, 𝑌) be a two-dimensional discrete random variable with joint pmf 𝑝(𝑥𝑖, 𝑦𝑗). Since 𝑋 = 𝑥𝑖 must occur together with 𝑌 = 𝑦𝑗 for some 𝑗, and can occur together with 𝑌 = 𝑦𝑗 for only one 𝑗, we have

𝑝(𝑥𝑖) = 𝑃(𝑋 = 𝑥𝑖, 𝑌 = 𝑦1 or 𝑋 = 𝑥𝑖, 𝑌 = 𝑦2 or ⋯) = ∑_{𝑗=1}^{∞} 𝑝(𝑥𝑖, 𝑦𝑗).

The function 𝑝 defined above is called the marginal pmf of 𝑋.


Similarly, the marginal pmf of 𝑌 is

𝑞(𝑦𝑗) = 𝑃(𝑌 = 𝑦𝑗) = ∑_{𝑖=1}^{∞} 𝑝(𝑥𝑖, 𝑦𝑗)

Marginal Probability density function


Let 𝑓(𝑥, 𝑦) be the joint pdf of the continuous two-dimensional random variable (𝑋, 𝑌). We
define 𝑔(𝑥) and ℎ(𝑦), the marginal probability density functions of 𝑋 and 𝑌, respectively, as
follows:

𝑔(𝑥) = ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑦

ℎ(𝑦) = ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑥

These marginal pdf’s correspond to the basic pdf’s of the one-dimensional random variables
𝑋 and 𝑌, respectively.

Example:
Two production lines manufacture a certain type of item. Suppose that the capacity (on any
given day) is 5 items for line I and 3 items for line II. Assume that the number of items actually
produced by either production line is a random variable. Let (𝑋, 𝑌) represent the two-
dimensional random variable yielding the number of items produced by line I and line II,
respectively. The following table gives the joint probability distribution of (𝑋, 𝑌). Each entry
represents 𝑝(𝑥𝑖 , 𝑦𝑗 ) = 𝑃(𝑋 = 𝑥𝑖 , 𝑌 = 𝑦𝑗 ).

Y \ X      0      1      2      3      4      5     Sum
0        0.00   0.01   0.03   0.05   0.07   0.09   0.25
1        0.01   0.02   0.04   0.05   0.06   0.08   0.26
2        0.01   0.03   0.05   0.05   0.05   0.06   0.25
3        0.01   0.02   0.04   0.06   0.06   0.05   0.24
Sum      0.03   0.08   0.16   0.21   0.24   0.28   1.00
Thus 𝑝(2, 3) = 𝑃(𝑋 = 2, 𝑌 = 3) = 0.04, etc. Hence if 𝐵 is the event
𝐵 = {more items are produced by line I than by line II},
we find, summing 𝑝(𝑥𝑖, 𝑦𝑗) over all entries with 𝑥𝑖 > 𝑦𝑗,
𝑃(𝐵) = 0.01 + 0.03 + 0.05 + 0.07 + 0.09 + 0.04 + 0.05 + 0.06 + 0.08 + 0.05 + 0.05 + 0.06 + 0.06 + 0.05 = 0.75
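To make the bookkeeping concrete, here is a minimal Python sketch (the dictionary layout is my own, not part of the original example) that stores the table above, recovers both marginal pmf's, and recomputes 𝑃(𝐵):

```python
# Joint pmf table keyed by (x, y); values copied from the table above.
p = {
    (0, 0): 0.00, (1, 0): 0.01, (2, 0): 0.03, (3, 0): 0.05, (4, 0): 0.07, (5, 0): 0.09,
    (0, 1): 0.01, (1, 1): 0.02, (2, 1): 0.04, (3, 1): 0.05, (4, 1): 0.06, (5, 1): 0.08,
    (0, 2): 0.01, (1, 2): 0.03, (2, 2): 0.05, (3, 2): 0.05, (4, 2): 0.05, (5, 2): 0.06,
    (0, 3): 0.01, (1, 3): 0.02, (2, 3): 0.04, (3, 3): 0.06, (4, 3): 0.06, (5, 3): 0.05,
}

# Marginal pmf's: sum the joint pmf over the other variable.
p_x = {x: sum(v for (xi, _), v in p.items() if xi == x) for x in range(6)}
q_y = {y: sum(v for (_, yj), v in p.items() if yj == y) for y in range(4)}

# B = {more items from line I than from line II} = {X > Y}.
prob_B = sum(v for (x, y), v in p.items() if x > y)
print(round(prob_B, 2))  # 0.75
```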

Conditional probability mass function


Let (𝑋, 𝑌) be a discrete two-dimensional random variable with joint pmf 𝑝(𝑥𝑖 , 𝑦𝑗 ). Let 𝑝(𝑥𝑖 )
and 𝑞(𝑦𝑗 ) be the marginal pmf’s of 𝑋 and 𝑌, respectively.
The conditional pmf of 𝑋 given 𝑌 = 𝑦𝑗 is defined as

𝑝(𝑥𝑖 | 𝑦𝑗) = 𝑃(𝑋 = 𝑥𝑖 | 𝑌 = 𝑦𝑗) = 𝑝(𝑥𝑖, 𝑦𝑗)/𝑞(𝑦𝑗),  if 𝑞(𝑦𝑗) > 0

The conditional pmf of 𝑌 given 𝑋 = 𝑥𝑖 is defined as

𝑝(𝑦𝑗 | 𝑥𝑖) = 𝑃(𝑌 = 𝑦𝑗 | 𝑋 = 𝑥𝑖) = 𝑝(𝑥𝑖, 𝑦𝑗)/𝑝(𝑥𝑖),  if 𝑝(𝑥𝑖) > 0

Conditional probability density function


Let (𝑋, 𝑌) be a continuous two-dimensional random variable with joint pdf 𝑓(𝑥, 𝑦). Let 𝑔(𝑥)
and ℎ(𝑦) be the marginal pdf’s of 𝑋 and 𝑌, respectively.
The conditional pdf of 𝑋 for given 𝑌 = 𝑦 is defined by

𝑔(𝑥 | 𝑦) = 𝑓(𝑥, 𝑦)/ℎ(𝑦),  ℎ(𝑦) > 0

The conditional pdf of 𝑌 for given 𝑋 = 𝑥 is defined by

ℎ(𝑦 | 𝑥) = 𝑓(𝑥, 𝑦)/𝑔(𝑥),  𝑔(𝑥) > 0
Independent Random Variables
Just as we defined the concept of independence between two events 𝐴 and 𝐵, we shall now
define independent random variables. Intuitively, we intend to say that 𝑋 and 𝑌 are
independent random variables if the outcome of 𝑋, say, in no way influences the outcome of
𝑌. This is an extremely important notion and there are many situations in which such an
assumption is justified.
Let (𝑋, 𝑌) be a two-dimensional random variable. We say that 𝑋 and 𝑌 are independent random variables if and only if
(i) For a discrete random variable:
𝑝(𝑥𝑖, 𝑦𝑗) = 𝑝(𝑥𝑖) 𝑞(𝑦𝑗) ∀ 𝑖, 𝑗
i.e., 𝑃(𝑋 = 𝑥𝑖, 𝑌 = 𝑦𝑗) = 𝑃(𝑋 = 𝑥𝑖) 𝑃(𝑌 = 𝑦𝑗) ∀ 𝑖, 𝑗

(ii) For a continuous random variable:


𝑓(𝑥, 𝑦) = 𝑔(𝑥)ℎ(𝑦) ∀ (𝑥, 𝑦)
where 𝑓(𝑥, 𝑦) is the joint pdf, and 𝑔(𝑥) and ℎ(𝑦) are the marginal pdf’s of 𝑋 and
𝑌, respectively.

Exercise
1. A fair coin is tossed 3 times. Let 𝑋 = 0 or 1 according as the first toss shows 𝐻 or 𝑇, and let 𝑌 be the number of heads. Find the joint probability distribution of 𝑋 and 𝑌.

2. Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white, and 5 black balls. If 𝑋 and 𝑌 denote the number of red balls and the number of white balls chosen, find the joint probability distribution of 𝑋 and 𝑌.

3. Evaluate the conditional probability 𝑃(𝑋 = 2|𝑌 = 2) from the following table

Y \ X      0      1      2      3      4      5     Sum
0        0.00   0.01   0.03   0.05   0.07   0.09   0.25
1        0.01   0.02   0.04   0.05   0.06   0.08   0.26
2        0.01   0.03   0.05   0.05   0.05   0.06   0.25
3        0.01   0.02   0.04   0.06   0.06   0.05   0.24
Sum      0.03   0.08   0.16   0.21   0.24   0.28   1.00

Solution:
𝑃(𝑋 = 2 | 𝑌 = 2) = 𝑃(𝑋 = 2, 𝑌 = 2)/𝑃(𝑌 = 2) = 0.05/0.25 = 0.20.
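The same arithmetic in code, as a self-contained sketch: only the 𝑌 = 2 row of the table is needed.

```python
# Row Y = 2 of the joint pmf table above, keyed by x.
row_y2 = {0: 0.01, 1: 0.03, 2: 0.05, 3: 0.05, 4: 0.05, 5: 0.06}

# P(X = 2 | Y = 2) = p(2, 2) / q(2), where q(2) is the row sum.
print(round(row_y2[2] / sum(row_y2.values()), 2))  # 0.2
```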

4. A two-dimensional continuous random variable (𝑋, 𝑌) has joint pdf given by


𝑓(𝑥, 𝑦) = { 𝑥² + 𝑥𝑦/3,  0 ≤ 𝑥 ≤ 1, 0 ≤ 𝑦 ≤ 2
         { 0,           elsewhere

Find 𝑔(𝑥 | 𝑦) and ℎ(𝑦 | 𝑥).


Solution:
𝑔(𝑥) = ∫_0^2 (𝑥² + 𝑥𝑦/3) 𝑑𝑦 = 2𝑥² + (2/3)𝑥,  0 ≤ 𝑥 ≤ 1

ℎ(𝑦) = ∫_0^1 (𝑥² + 𝑥𝑦/3) 𝑑𝑥 = 𝑦/6 + 1/3,  0 ≤ 𝑦 ≤ 2

Hence 𝑔(𝑥 | 𝑦) = (𝑥² + 𝑥𝑦/3)/(1/3 + 𝑦/6) = (6𝑥² + 2𝑥𝑦)/(2 + 𝑦),  0 ≤ 𝑥 ≤ 1, 0 ≤ 𝑦 ≤ 2;

ℎ(𝑦 | 𝑥) = (𝑥² + 𝑥𝑦/3)/(2𝑥² + (2/3)𝑥) = (3𝑥² + 𝑥𝑦)/(6𝑥² + 2𝑥) = (3𝑥 + 𝑦)/(6𝑥 + 2),  0 ≤ 𝑦 ≤ 2, 0 ≤ 𝑥 ≤ 1.
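A sympy sketch of the same computation; the final asserts confirm that each conditional pdf integrates to 1 over its own range:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = x**2 + x*y/3  # joint pdf on 0 <= x <= 1, 0 <= y <= 2

g = sp.integrate(f, (y, 0, 2))  # marginal pdf of X: 2*x**2 + 2*x/3
h = sp.integrate(f, (x, 0, 1))  # marginal pdf of Y: y/6 + 1/3

g_cond = sp.simplify(f / h)     # g(x | y)
h_cond = sp.simplify(f / g)     # h(y | x)

# Each conditional pdf must integrate to 1 over its own variable.
assert sp.simplify(sp.integrate(g_cond, (x, 0, 1)) - 1) == 0
assert sp.simplify(sp.integrate(h_cond, (y, 0, 2)) - 1) == 0
```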

5. Suppose that a machine is used for a particular task in the morning and for a different
task in the afternoon. Let 𝑋 and 𝑌 represent the number of times the machine breaks
down in the morning and in the afternoon, respectively. The following table gives the joint probability distribution of (𝑋, 𝑌). Check whether 𝑋 and 𝑌 are independent random variables.

Y \ X      0      1      2     𝑞(𝑦𝑗)
0         0.10   0.20   0.20   0.5
1         0.04   0.08   0.08   0.2
2         0.06   0.12   0.12   0.3
𝑝(𝑥𝑖)     0.20   0.40   0.40   1.00
Solution:
𝑃(𝑋 = 0, 𝑌 = 0) = 0.1; 𝑝(𝑋 = 0) 𝑞(𝑌 = 0) = 0.2 × 0.5 = 0.1
⟹ 𝑃(𝑋 = 0, 𝑌 = 0) = 𝑝(𝑋 = 0) 𝑞(𝑌 = 0)

Checking the remaining entries in the same way gives 𝑝(𝑥𝑖, 𝑦𝑗) = 𝑝(𝑥𝑖) 𝑞(𝑦𝑗) ∀ 𝑖, 𝑗.

Thus 𝑋 and 𝑌 are independent random variables.
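A quick way to carry out this check exhaustively is to compare every cell of the table with the product of its marginals; a minimal sketch:

```python
import itertools

# Joint pmf table keyed by (x, y), copied from the table above.
joint = {
    (0, 0): 0.10, (1, 0): 0.20, (2, 0): 0.20,
    (0, 1): 0.04, (1, 1): 0.08, (2, 1): 0.08,
    (0, 2): 0.06, (1, 2): 0.12, (2, 2): 0.12,
}
px = {x: sum(joint[(x, y)] for y in range(3)) for x in range(3)}
qy = {y: sum(joint[(x, y)] for x in range(3)) for y in range(3)}

# Independence holds iff every cell equals the product of its marginals.
independent = all(
    abs(joint[(x, y)] - px[x] * qy[y]) < 1e-12
    for x, y in itertools.product(range(3), range(3))
)
print(independent)  # True
```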

6. Let 𝑋 and 𝑌 be the life lengths of two electronic devices. Suppose that their joint pdf
is given by
𝑓(𝑥, 𝑦) = 𝑒^{−(𝑥+𝑦)},  𝑥 ≥ 0, 𝑦 ≥ 0
Check whether 𝑋 and 𝑌 are independent random variables.
Solution:
𝑔(𝑥) = ∫_0^∞ 𝑓(𝑥, 𝑦) 𝑑𝑦 = 𝑒^{−𝑥} ∫_0^∞ 𝑒^{−𝑦} 𝑑𝑦 = 𝑒^{−𝑥},  𝑥 ≥ 0

ℎ(𝑦) = ∫_0^∞ 𝑓(𝑥, 𝑦) 𝑑𝑥 = 𝑒^{−𝑦} ∫_0^∞ 𝑒^{−𝑥} 𝑑𝑥 = 𝑒^{−𝑦},  𝑦 ≥ 0

Hence 𝑓(𝑥, 𝑦) = 𝑔(𝑥) ℎ(𝑦), i.e., 𝑋 and 𝑌 are independent random variables.

7. Suppose that the following table represents the joint probability distribution of the
discrete random variable (𝑋, 𝑌). Evaluate all the marginal and conditional
distributions.

Y \ X     1      2      3
1        1/12   1/6     0
2         0     1/9    1/5
3        1/18   1/4    2/15

8. Suppose that the two-dimensional random variable (𝑋, 𝑌) has joint pdf
𝑓(𝑥, 𝑦) = { 𝑘𝑥(𝑥 − 𝑦),  0 < 𝑥 < 2, −𝑥 < 𝑦 < 𝑥
         { 0,           elsewhere

(i) Evaluate the constant 𝑘. (Ans: 𝑘 = 1/8)
(ii) Find the marginal pdf of 𝑋.
(Ans: 𝑔(𝑥) = 𝑥³/4 for 0 < 𝑥 < 2; 0 otherwise)
(iii) Find the marginal pdf of 𝑌.
(Ans: ℎ(𝑦) = 1/3 − 𝑦/4 + 5𝑦³/48 for −2 ≤ 𝑦 ≤ 0; ℎ(𝑦) = 1/3 − 𝑦/4 + 𝑦³/48 for 0 ≤ 𝑦 ≤ 2; 0 otherwise)
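A sympy sketch verifying (i)–(iii):

```python
import sympy as sp

x, y, k = sp.symbols('x y k', real=True)
f = k * x * (x - y)  # joint pdf on 0 < x < 2, -x < y < x

# (i) total probability 1 fixes k
total = sp.integrate(f, (y, -x, x), (x, 0, 2))
print(sp.solve(sp.Eq(total, 1), k))              # [1/8]

f = f.subs(k, sp.Rational(1, 8))
print(sp.integrate(f, (y, -x, x)))               # (ii) x**3/4
print(sp.expand(sp.integrate(f, (x, y, 2))))     # (iii) for 0 <= y <= 2:
                                                 #       y**3/48 - y/4 + 1/3
```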

9. Suppose that the joint pdf of the two-dimensional random variable (𝑋, 𝑌) is given by

𝑓(𝑥, 𝑦) = { 𝑥² + 𝑥𝑦/3,  0 < 𝑥 < 1, 0 < 𝑦 < 2
         { 0,           elsewhere
(i) Check if 𝑓(𝑥, 𝑦) is a valid pdf
(ii) Evaluate 𝑃(𝐵) where 𝐵 = {𝑋 + 𝑌 ≥ 1}
(iii) Find the marginal pdf of 𝑋 and 𝑌
(iv) Find 𝑔(𝑥|𝑦)
(v) Find ℎ(𝑦|𝑥)
(vi) Evaluate 𝑃(𝑋 > 1/2)
(vii) Evaluate 𝑃(𝑌 < 𝑋)
(viii) Evaluate 𝑃(𝑌 < 1/2 | 𝑋 < 1/2)
Solution:
(i) To check that ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦 = 1:

∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦 = ∫_0^2 ∫_0^1 (𝑥² + 𝑥𝑦/3) 𝑑𝑥 𝑑𝑦
= ∫_0^2 [𝑥³/3 + 𝑥²𝑦/6]_{𝑥=0}^{𝑥=1} 𝑑𝑦
= ∫_0^2 (1/3 + 𝑦/6) 𝑑𝑦
= [𝑦/3 + 𝑦²/12]_0^2
= 2/3 + 4/12 = 1.

(ii) Let 𝐵 = {𝑋 + 𝑌 ≥ 1}. Then 𝑃(𝐵) = 1 − 𝑃(𝐵̅), where 𝐵̅ = {𝑋 + 𝑌 < 1}. Hence

𝑃(𝐵) = 1 − ∫_0^1 ∫_0^{1−𝑥} (𝑥² + 𝑥𝑦/3) 𝑑𝑦 𝑑𝑥
= 1 − ∫_0^1 [𝑥²(1 − 𝑥) + 𝑥(1 − 𝑥)²/6] 𝑑𝑥
= 1 − 7/72 = 65/72.

(iii) To find the marginal pdf of 𝑋:

𝑔(𝑥) = ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑦 = ∫_0^2 (𝑥² + 𝑥𝑦/3) 𝑑𝑦 = 2𝑥² + (2/3)𝑥,  0 < 𝑥 < 1

To find the marginal pdf of 𝑌:

ℎ(𝑦) = ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑥 = ∫_0^1 (𝑥² + 𝑥𝑦/3) 𝑑𝑥 = 1/3 + 𝑦/6,  0 < 𝑦 < 2

(iv) 𝑔(𝑥 | 𝑦) = 𝑓(𝑥, 𝑦)/ℎ(𝑦) = (6𝑥² + 2𝑥𝑦)/(2 + 𝑦),  0 < 𝑥 < 1, 0 < 𝑦 < 2

(v) ℎ(𝑦 | 𝑥) = 𝑓(𝑥, 𝑦)/𝑔(𝑥) = (3𝑥 + 𝑦)/(6𝑥 + 2),  0 < 𝑥 < 1, 0 < 𝑦 < 2

(vi) 𝑃(𝑋 > 1/2) = ∫_0^2 ∫_{1/2}^1 (𝑥² + 𝑥𝑦/3) 𝑑𝑥 𝑑𝑦 = ∫_0^2 (7/24 + 𝑦/8) 𝑑𝑦 = 5/6

(vii) 𝑃(𝑌 < 𝑋) = ∫_0^1 ∫_0^𝑥 (𝑥² + 𝑥𝑦/3) 𝑑𝑦 𝑑𝑥 = ∫_0^1 (7𝑥³/6) 𝑑𝑥 = 7/24

(viii) Ans: 5/32
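The probabilities above can be cross-checked with sympy; a minimal sketch:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = x**2 + x*y/3  # joint pdf on 0 < x < 1, 0 < y < 2
half = sp.Rational(1, 2)

print(sp.integrate(f, (x, 0, 1), (y, 0, 2)))          # 1     (part i)
print(1 - sp.integrate(f, (y, 0, 1 - x), (x, 0, 1)))  # 65/72 (part ii)
print(sp.integrate(f, (x, half, 1), (y, 0, 2)))       # 5/6   (part vi)
print(sp.integrate(f, (y, 0, x), (x, 0, 1)))          # 7/24  (part vii)

# part (viii): P(Y < 1/2 | X < 1/2) as a ratio of two probabilities
num = sp.integrate(f, (y, 0, half), (x, 0, half))
den = sp.integrate(f, (y, 0, 2), (x, 0, half))
print(sp.simplify(num / den))                         # 5/32
```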

10. For what value of 𝑘 is 𝑓(𝑥, 𝑦) = 𝑘𝑒^{−(𝑥+𝑦)} a joint pdf of (𝑋, 𝑌) over the region 0 < 𝑥 < 1, 0 < 𝑦 < 1? (Ans: 𝑘 = (1 − 𝑒^{−1})^{−2} ≈ 2.5027)
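The value follows from ∫_0^1 𝑒^{−𝑥} 𝑑𝑥 = 1 − 𝑒^{−1}, so 𝑘(1 − 𝑒^{−1})² = 1. A one-line numeric check:

```python
import math

# k * (1 - e**-1)**2 = 1, since the double integral factorizes.
k = 1 / (1 - math.exp(-1)) ** 2
print(round(k, 4))  # 2.5027
```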

11. Suppose that the continuous two-dimensional random variable (𝑋, 𝑌) is uniformly distributed over the square whose vertices are (1, 0), (0, 1), (−1, 0), and (0, −1). Find the marginal pdf's of 𝑋 and 𝑌.
(Ans: 𝑔(𝑥) = 1 − |𝑥|, −1 < 𝑥 < 1; ℎ(𝑦) = 1 − |𝑦|, −1 < 𝑦 < 1)

12. Suppose that the joint pdf of (𝑋, 𝑌) is given by

𝑓(𝑥, 𝑦) = { 𝑒^{−𝑦},  𝑥 > 0, 𝑦 > 𝑥
         { 0,       otherwise

(i) Find the marginal pdf of 𝑋. (Ans: 𝑔(𝑥) = 𝑒^{−𝑥}, 𝑥 > 0)
(ii) Find the marginal pdf of 𝑌. (Ans: ℎ(𝑦) = 𝑦𝑒^{−𝑦}, 𝑦 > 0)
(iii) Evaluate 𝑃(𝑋 > 2 | 𝑌 < 4). (Ans: 0.0885)

13. Find 𝑐 for which 𝑓(𝑥, 𝑦) = 𝑐𝑥 + 𝑐𝑦², 𝑥 = 1, 2; 𝑦 = 1, 2, 3 is a joint pmf. (Ans: 𝑐 = 1/37)

14. If 𝑓(𝑥, 𝑦) = { 2/𝑎²,  0 ≤ 𝑥 ≤ 𝑦 ≤ 𝑎
           { 0,     elsewhere
find 𝑓(𝑦|𝑥) and 𝑓(𝑥|𝑦).
(Ans: 𝑓(𝑦|𝑥) = 1/(𝑎 − 𝑥), 𝑥 ≤ 𝑦 ≤ 𝑎; 𝑓(𝑥|𝑦) = 1/𝑦, 0 ≤ 𝑥 ≤ 𝑦)

15. If 𝑓(𝑥, 𝑦) = { 8𝑥𝑦,  0 < 𝑥 < 𝑦 < 1
           { 0,    elsewhere
find the marginal pdf's of 𝑋 and 𝑌 and check whether they are independent.
(Ans: 𝑔(𝑥) = 4𝑥(1 − 𝑥²), 0 < 𝑥 < 1; ℎ(𝑦) = 4𝑦³, 0 < 𝑦 < 1; not independent)

16. Suppose that a manufacturer of light bulbs is concerned about the number of bulbs ordered from him during the months of January and February. Let 𝑋 and 𝑌 denote the number of bulbs ordered during these two months, respectively. Assume that (𝑋, 𝑌) is a two-dimensional continuous random variable with the following joint pdf. Find 𝑃(𝑋 ≥ 𝑌).

𝑓(𝑥, 𝑦) = { 𝑐,  if 5000 ≤ 𝑥 ≤ 10,000 and 4000 ≤ 𝑦 ≤ 9000
         { 0,  elsewhere
Solution:
To determine 𝑐 we use the fact that ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦 = 1:

∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦 = ∫_{4000}^{9000} ∫_{5000}^{10,000} 𝑐 𝑑𝑥 𝑑𝑦 = 𝑐(5000)²

Thus 𝑐 = 1/(5000)².

If 𝐵 = {𝑋 ≥ 𝑌}, we compute 𝑃(𝐵) by evaluating 1 − 𝑃(𝐵̅), where 𝐵̅ = {𝑋 < 𝑌}. Within the rectangle, 𝑋 < 𝑌 requires 𝑦 > 5000 and 5000 ≤ 𝑥 < 𝑦. Hence

𝑃(𝐵) = 1 − (1/(5000)²) ∫_{5000}^{9000} ∫_{5000}^{𝑦} 𝑑𝑥 𝑑𝑦
= 1 − (1/(5000)²) ∫_{5000}^{9000} (𝑦 − 5000) 𝑑𝑦
= 17/25

EXPECTATION OF A TWO-DIMENSIONAL RANDOM VARIABLE


For DRV:
𝐸(𝑋) = ∑_{𝑖=1}^{∞} ∑_{𝑗=1}^{∞} 𝑥𝑖 𝑝(𝑥𝑖, 𝑦𝑗)
= ∑_{𝑖=1}^{∞} 𝑥𝑖 {∑_{𝑗=1}^{∞} 𝑝(𝑥𝑖, 𝑦𝑗)}
= ∑_{𝑖=1}^{∞} 𝑥𝑖 𝑝(𝑥𝑖), where 𝑝(𝑥𝑖) is the marginal pmf of 𝑋.

Therefore,
𝐸(𝑋) = ∑_{𝑖=1}^{∞} 𝑥𝑖 𝑝(𝑥𝑖) and 𝐸(𝑌) = ∑_{𝑗=1}^{∞} 𝑦𝑗 𝑞(𝑦𝑗)

For CRV:
𝐸(𝑋) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑥 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦 = ∫_{−∞}^{∞} 𝑥 𝑔(𝑥) 𝑑𝑥

𝐸(𝑌) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑦 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦 = ∫_{−∞}^{∞} 𝑦 ℎ(𝑦) 𝑑𝑦

𝐸(𝑋²) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑥² 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦

𝐸(𝑋𝑌) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑥𝑦 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦

Properties:
1. 𝐸(𝑐) = 𝑐
2. 𝐸(𝑐𝑋) = 𝑐𝐸(𝑋)
3. 𝐸(𝑋 + 𝑌) = 𝐸(𝑋) + 𝐸(𝑌)
Theorem:
Let (𝑋, 𝑌) be a two-dimensional random variable and suppose that 𝑋 and 𝑌 are independent.
Then 𝐸(𝑋𝑌) = 𝐸(𝑋)𝐸(𝑌).
Proof (continuous case):
𝐸(𝑋𝑌) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑥𝑦 𝑓(𝑥, 𝑦) 𝑑𝑥 𝑑𝑦
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} 𝑥𝑦 𝑔(𝑥) ℎ(𝑦) 𝑑𝑥 𝑑𝑦  (by independence)
= ∫_{−∞}^{∞} 𝑥 𝑔(𝑥) 𝑑𝑥 ∫_{−∞}^{∞} 𝑦 ℎ(𝑦) 𝑑𝑦
= 𝐸(𝑋)𝐸(𝑌)
Theorem:
Let (𝑋, 𝑌) be a two-dimensional random variable, and if 𝑋 and 𝑌 are independent, then
𝑉(𝑋 + 𝑌) = 𝑉(𝑋) + 𝑉(𝑌).
Proof:
𝑉(𝑋 + 𝑌) = 𝐸[(𝑋 + 𝑌)²] − [𝐸(𝑋 + 𝑌)]²
= 𝐸[(𝑋 + 𝑌)²] − [𝐸(𝑋) + 𝐸(𝑌)]²
= 𝐸(𝑋² + 2𝑋𝑌 + 𝑌²) − [𝐸(𝑋)]² − 2𝐸(𝑋)𝐸(𝑌) − [𝐸(𝑌)]²
= 𝐸(𝑋²) + 2𝐸(𝑋𝑌) + 𝐸(𝑌²) − [𝐸(𝑋)]² − 2𝐸(𝑋)𝐸(𝑌) − [𝐸(𝑌)]²
= 𝐸(𝑋²) − [𝐸(𝑋)]² + 𝐸(𝑌²) − [𝐸(𝑌)]²  (since 𝐸(𝑋𝑌) = 𝐸(𝑋)𝐸(𝑌) by independence)
= 𝑉(𝑋) + 𝑉(𝑌)

Exercise
1. If 𝑓(𝑥, 𝑦) = { 𝑥² + 𝑥𝑦/3,  0 ≤ 𝑥 ≤ 1, 0 ≤ 𝑦 ≤ 2
           { 0,           elsewhere
find 𝐸(𝑋), 𝐸(𝑌) and 𝑉(𝑌).

Solution:
𝐸(𝑋) = ∫_{−∞}^{∞} 𝑥 𝑔(𝑥) 𝑑𝑥 = ∫_0^1 𝑥 (2𝑥² + (2/3)𝑥) 𝑑𝑥 = 13/18

𝐸(𝑌) = ∫_{−∞}^{∞} 𝑦 ℎ(𝑦) 𝑑𝑦 = ∫_0^2 𝑦 (𝑦/6 + 1/3) 𝑑𝑦 = 10/9

𝐸(𝑌²) = ∫_{−∞}^{∞} 𝑦² ℎ(𝑦) 𝑑𝑦 = 14/9

𝑉(𝑌) = 𝐸(𝑌²) − [𝐸(𝑌)]² = 14/9 − (10/9)² = 26/81 ≈ 0.3209
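A sympy sketch confirming these values directly from the joint pdf:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = x**2 + x*y/3  # joint pdf on 0 <= x <= 1, 0 <= y <= 2

E_X = sp.integrate(x * f, (x, 0, 1), (y, 0, 2))
E_Y = sp.integrate(y * f, (x, 0, 1), (y, 0, 2))
E_Y2 = sp.integrate(y**2 * f, (x, 0, 1), (y, 0, 2))
print(E_X, E_Y, E_Y2 - E_Y**2)  # 13/18 10/9 26/81
```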

CONDITIONAL EXPECTATION
Definitions:
(a) If (𝑋, 𝑌) is a two-dimensional discrete random variable, we define the conditional expectation of 𝑋 for given 𝑌 = 𝑦𝑗 as

𝐸(𝑋 | 𝑦𝑗) = ∑_{𝑖=1}^{∞} 𝑥𝑖 𝑝(𝑥𝑖 | 𝑦𝑗)

(b) If (𝑋, 𝑌) is a two-dimensional continuous random variable, we define the conditional expectation of 𝑋 for given 𝑌 = 𝑦 as

𝐸(𝑋 | 𝑦) = ∫_{−∞}^{∞} 𝑥 𝑔(𝑥 | 𝑦) 𝑑𝑥
Exercise

1. Suppose that (𝑋, 𝑌) is uniformly distributed over the semicircle {(𝑥, 𝑦) : 𝑥² + 𝑦² ≤ 1, 𝑦 ≥ 0}. Find 𝐸(𝑌|𝑥) and 𝐸(𝑋|𝑦).

Solution:
The region considered is the upper half of the unit disc, which has area 𝜋/2. The joint pdf of (𝑋, 𝑌) is therefore

𝑓(𝑥, 𝑦) = { 2/𝜋,  (𝑥, 𝑦) in the semicircle
         { 0,    elsewhere

𝑔(𝑥) = ∫_0^{√(1−𝑥²)} (2/𝜋) 𝑑𝑦 = (2/𝜋)√(1 − 𝑥²),  −1 ≤ 𝑥 ≤ 1

ℎ(𝑦) = ∫_{−√(1−𝑦²)}^{√(1−𝑦²)} (2/𝜋) 𝑑𝑥 = (4/𝜋)√(1 − 𝑦²),  0 ≤ 𝑦 ≤ 1

Hence,
𝑔(𝑥 | 𝑦) = 1/(2√(1 − 𝑦²)),  −√(1 − 𝑦²) ≤ 𝑥 ≤ √(1 − 𝑦²)
ℎ(𝑦 | 𝑥) = 1/√(1 − 𝑥²),  0 ≤ 𝑦 ≤ √(1 − 𝑥²)

Therefore,
𝐸(𝑌 | 𝑥) = ∫_0^{√(1−𝑥²)} 𝑦 ℎ(𝑦 | 𝑥) 𝑑𝑦 = (1/2)√(1 − 𝑥²)

Similarly,
𝐸(𝑋 | 𝑦) = ∫_{−√(1−𝑦²)}^{√(1−𝑦²)} 𝑥 𝑔(𝑥 | 𝑦) 𝑑𝑥 = 0
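A sympy sketch checking both conditional expectations (assuming the conditional pdf's derived above):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
s = sp.sqrt(1 - x**2)  # upper boundary of y for a given x

h_cond = 1 / s                        # h(y | x), 0 <= y <= sqrt(1 - x^2)
g_cond = 1 / (2 * sp.sqrt(1 - y**2))  # g(x | y), |x| <= sqrt(1 - y^2)

E_Y_given_x = sp.integrate(y * h_cond, (y, 0, s))
E_X_given_y = sp.integrate(x * g_cond, (x, -sp.sqrt(1 - y**2), sp.sqrt(1 - y**2)))
print(sp.simplify(E_Y_given_x))  # sqrt(1 - x**2)/2
print(sp.simplify(E_X_given_y))  # 0 (odd integrand over a symmetric interval)
```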
CORRELATION COEFFICIENT
The correlation coefficient is a parameter which measures the degree of association between two random variables 𝑋 and 𝑌.
Let (𝑋, 𝑌) be a two-dimensional random variable. The correlation coefficient 𝜌𝑥𝑦 between 𝑋 and 𝑌 is defined as

𝜌𝑥𝑦 = 𝐸[(𝑋 − 𝐸(𝑋))(𝑌 − 𝐸(𝑌))] / √(𝑉(𝑋)𝑉(𝑌))

where 𝐸[(𝑋 − 𝐸(𝑋))(𝑌 − 𝐸(𝑌))] = 𝜎𝑥𝑦 = 𝑐𝑜𝑣(𝑋, 𝑌) is called the covariance between 𝑋 and 𝑌. Expanding,

𝐸[(𝑋 − 𝐸(𝑋))(𝑌 − 𝐸(𝑌))] = 𝐸[𝑋𝑌 − 𝑋𝐸(𝑌) − 𝑌𝐸(𝑋) + 𝐸(𝑋)𝐸(𝑌)]
= 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) − 𝐸(𝑌)𝐸(𝑋) + 𝐸(𝑋)𝐸(𝑌)
= 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌)
NOTE:
1. If 𝑋 and 𝑌 are independent, then 𝐸(𝑋𝑌) = 𝐸(𝑋)𝐸(𝑌) ⟹ 𝜌𝑥𝑦 = 0, i.e., 𝑋 and 𝑌 are uncorrelated.
2. The converse of the above is not true: if 𝜌𝑥𝑦 = 0, then 𝑋 and 𝑌 need not be independent. Hence being uncorrelated and being independent are not, in general, equivalent.

For example, consider 𝑌 = 𝑋², where the pdf of 𝑋 is 𝑓(𝑥) = 1/2, −1 ≤ 𝑥 ≤ 1. Then

𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) = 𝐸(𝑋³) − 𝐸(𝑋)𝐸(𝑋²)

𝐸(𝑋) = ∫_{−1}^{1} 𝑥 · (1/2) 𝑑𝑥 = 0;  𝐸(𝑋³) = ∫_{−1}^{1} 𝑥³ · (1/2) 𝑑𝑥 = 0

∴ 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌) = 0 and hence 𝜌𝑥𝑦 = 0, but clearly 𝑋 and 𝑌 are not independent.
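The counterexample is easy to verify symbolically:

```python
import sympy as sp

x = sp.symbols('x', real=True)
pdf = sp.Rational(1, 2)  # pdf of X on [-1, 1]; take Y = X**2

E_X = sp.integrate(x * pdf, (x, -1, 1))      # 0
E_X3 = sp.integrate(x**3 * pdf, (x, -1, 1))  # 0, so cov(X, Y) = E(X^3) - E(X)E(X^2) = 0
print(E_X, E_X3)  # X and Y = X**2 are uncorrelated, yet Y is a function of X
```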

Theorem
If (𝑋, 𝑌) is a two-dimensional random variable, then −1 ≤ 𝜌𝑥𝑦 ≤ 1
Proof:
Note first that for any random variable 𝑊, 𝐸(𝑊²) ≥ 0. Consider

𝐸{ (𝑋 − 𝐸(𝑋))/√𝑉(𝑋) ± (𝑌 − 𝐸(𝑌))/√𝑉(𝑌) }² ≥ 0

Expanding the square,

𝐸[(𝑋 − 𝐸(𝑋))²]/𝑉(𝑋) + 𝐸[(𝑌 − 𝐸(𝑌))²]/𝑉(𝑌) ± 2 𝐸[(𝑋 − 𝐸(𝑋))(𝑌 − 𝐸(𝑌))]/(√𝑉(𝑋) √𝑉(𝑌)) ≥ 0

Since 𝐸[(𝑋 − 𝐸(𝑋))²] = 𝑉(𝑋) and 𝐸[(𝑌 − 𝐸(𝑌))²] = 𝑉(𝑌), this becomes

𝑉(𝑋)/𝑉(𝑋) + 𝑉(𝑌)/𝑉(𝑌) ± 2𝜌𝑥𝑦 ≥ 0
2 ± 2𝜌𝑥𝑦 ≥ 0, that is, 1 ± 𝜌𝑥𝑦 ≥ 0
1 + 𝜌𝑥𝑦 ≥ 0 and 1 − 𝜌𝑥𝑦 ≥ 0
𝜌𝑥𝑦 ≥ −1 and 𝜌𝑥𝑦 ≤ 1

Hence, −1 ≤ 𝜌𝑥𝑦 ≤ 1.

Theorem
If 𝑋 and 𝑌 are linearly related, then 𝜌𝑥𝑦 = ±1
Proof:
Let 𝑋 and 𝑌 be linearly related, say 𝑌 = 𝑎𝑋 + 𝑏 with 𝑎 ≠ 0.
𝐶𝑜𝑣(𝑋, 𝑌) = 𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌)
= 𝐸[𝑋(𝑎𝑋 + 𝑏)] − 𝐸(𝑋)𝐸(𝑎𝑋 + 𝑏)
= 𝐸(𝑎𝑋² + 𝑏𝑋) − 𝐸(𝑋)[𝑎𝐸(𝑋) + 𝑏]
= 𝑎𝐸(𝑋²) + 𝑏𝐸(𝑋) − 𝑎[𝐸(𝑋)]² − 𝑏𝐸(𝑋)
= 𝑎[𝐸(𝑋²) − (𝐸(𝑋))²] = 𝑎𝑉(𝑋)
Also, 𝑉(𝑌) = 𝑉(𝑎𝑋 + 𝑏) = 𝑎²𝑉(𝑋). Hence

𝜌𝑥𝑦 = 𝐶𝑜𝑣(𝑋, 𝑌)/√(𝑉(𝑋)𝑉(𝑌)) = 𝑎𝑉(𝑋)/√(𝑎²𝑉(𝑋)²) = 𝑎𝑉(𝑋)/(|𝑎|𝑉(𝑋)) = ±1

(+1 if 𝑎 > 0, −1 if 𝑎 < 0).

Exercise
1. If 𝑈 = 𝑎 + 𝑏𝑋 and 𝑉 = 𝑐 + 𝑑𝑌, then show that 𝜌𝑢𝑣 = ±𝜌𝑥𝑦

2. The random variable (𝑋, 𝑌) has joint pdf given by
𝑓(𝑥, 𝑦) = 𝑥 + 𝑦, 0 ≤ 𝑥 ≤ 1, 0 ≤ 𝑦 ≤ 1.
Compute the correlation between 𝑋 and 𝑌.
Solution:
𝜌 = [𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌)] / √(𝑉(𝑋)𝑉(𝑌))

𝐸(𝑋) = ∫_0^1 ∫_0^1 𝑥(𝑥 + 𝑦) 𝑑𝑥 𝑑𝑦 = 7/12

𝐸(𝑌) = ∫_0^1 ∫_0^1 𝑦(𝑥 + 𝑦) 𝑑𝑥 𝑑𝑦 = 7/12

𝐸(𝑋𝑌) = ∫_0^1 ∫_0^1 𝑥𝑦(𝑥 + 𝑦) 𝑑𝑥 𝑑𝑦 = 1/3

𝑉(𝑋) = 11/144,  𝑉(𝑌) = 11/144

𝜌𝑥𝑦 = (1/3 − (7/12)(7/12)) / √((11/144)(11/144)) = −1/11
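The same computation in sympy, wrapping expectation against the joint pdf in a small helper:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = x + y  # joint pdf on the unit square

def E(expr):
    # Expectation of expr against f over [0, 1] x [0, 1].
    return sp.integrate(expr * f, (x, 0, 1), (y, 0, 1))

cov = E(x * y) - E(x) * E(y)
rho = cov / sp.sqrt((E(x**2) - E(x)**2) * (E(y**2) - E(y)**2))
print(sp.simplify(rho))  # -1/11
```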
3. If (𝑋, 𝑌) has the joint density function 𝑓(𝑥, 𝑦) = 2 − 𝑥 − 𝑦, 0 < 𝑥 < 1, 0 < 𝑦 < 1, compute 𝜌𝑥𝑦. (Ans: −1/11)

4. Show that 𝑉(𝑎𝑋 + 𝑏𝑌) = 𝑎2 𝑉(𝑋) + 𝑏 2 𝑉(𝑌) + 2𝑎𝑏 𝑐𝑜𝑣(𝑋, 𝑌). Also prove that when
𝑋 and 𝑌 are independent 𝑉(𝑎𝑋 + 𝑏𝑌) = 𝑉(𝑎𝑋 − 𝑏𝑌) = 𝑎2 𝑉(𝑋) + 𝑏 2 𝑉(𝑌)

5. Two independent random variables 𝑋1 and 𝑋2 have means 5 and 10 and variances 4 and 9, respectively. Find the covariance between 𝑈 = 3𝑋1 + 4𝑋2 and 𝑉 = 3𝑋1 − 𝑋2. (Ans: 0)

6. Let 𝑋1, 𝑋2, 𝑋3 be uncorrelated random variables having the same standard deviation. Find the correlation coefficient between 𝑋1 + 𝑋2 and 𝑋2 + 𝑋3. (Ans: 1/2)

7. Given 𝐸(𝑋𝑌) = 43, 𝑃(𝑋 = 𝑥𝑖) = 1/5 and 𝑃(𝑌 = 𝑦𝑗) = 1/5 for the values below, find 𝜌𝑥𝑦.

𝑋:  1   3   4   6   8
𝑌:  1   2   24  12  5

(Ans: 𝐸(𝑋) = 22/5, 𝐸(𝑌) = 44/5, 𝐸(𝑋²) = 126/5, 𝐸(𝑌²) = 750/5, 𝜌𝑥𝑦 = 0.2079)

8. If 𝑋, 𝑌 and 𝑍 are uncorrelated random variables with standard deviations 5, 12, 9 respectively, evaluate 𝜌𝑢𝑣 where 𝑈 = 𝑋 + 𝑌 and 𝑉 = 𝑌 + 𝑍.
(Ans: 𝑐𝑜𝑣(𝑋, 𝑌) = 𝑐𝑜𝑣(𝑌, 𝑍) = 𝑐𝑜𝑣(𝑋, 𝑍) = 0, so 𝑐𝑜𝑣(𝑈, 𝑉) = 𝑉(𝑌) = 144; 𝑉(𝑈) = 169, 𝑉(𝑉) = 225, 𝜌𝑢𝑣 = 144/√(169 × 225) = 0.7385)
9. Suppose that a two-dimensional random variable is uniformly distributed over the
triangular region 𝑅 = {(𝑥, 𝑦)|0 < 𝑥 < 𝑦 < 1}
(i) Find the pdf of (𝑋, 𝑌)
(ii) Find the marginal pdf of 𝑋 and 𝑌
(iii) Find 𝜌𝑥𝑦
Solution:
(i) The region 𝑅 has area 1/2, so

𝑓(𝑥, 𝑦) = { 2,  (𝑥, 𝑦) ∈ 𝑅
         { 0,  elsewhere

(ii) Marginal pdf of 𝑋: 𝑔(𝑥) = ∫_𝑥^1 2 𝑑𝑦 = 2(1 − 𝑥),  0 ≤ 𝑥 ≤ 1
Marginal pdf of 𝑌: ℎ(𝑦) = ∫_0^𝑦 2 𝑑𝑥 = 2𝑦,  0 ≤ 𝑦 ≤ 1

(iii) 𝐸(𝑋) = ∫_0^1 𝑥 𝑔(𝑥) 𝑑𝑥 = 1/3;  𝐸(𝑌) = ∫_0^1 𝑦 ℎ(𝑦) 𝑑𝑦 = 2/3
𝐸(𝑋²) = ∫_0^1 𝑥² 𝑔(𝑥) 𝑑𝑥 = 1/6;  𝐸(𝑌²) = ∫_0^1 𝑦² ℎ(𝑦) 𝑑𝑦 = 1/2;  𝑉(𝑋) = 1/18;  𝑉(𝑌) = 1/18
𝐸(𝑋𝑌) = ∫_0^1 ∫_0^𝑦 2𝑥𝑦 𝑑𝑥 𝑑𝑦 = 1/4

𝜌𝑥𝑦 = [𝐸(𝑋𝑌) − 𝐸(𝑋)𝐸(𝑌)] / √(𝑉(𝑋)𝑉(𝑌)) = (1/4 − 2/9)/(1/18) = 1/2
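And a sympy check of the final answer, this time integrating over the triangle 0 < 𝑥 < 𝑦 < 1 directly:

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = 2  # uniform pdf over the triangle 0 < x < y < 1

def E(expr):
    # Expectation over the triangle: x from 0 to y, then y from 0 to 1.
    return sp.integrate(expr * f, (x, 0, y), (y, 0, 1))

cov = E(x * y) - E(x) * E(y)
rho = cov / sp.sqrt((E(x**2) - E(x)**2) * (E(y**2) - E(y)**2))
print(sp.simplify(rho))  # 1/2
```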
