Random Variables - 2D
Let S be the sample space associated with a random experiment E. Let X = X(s) and Y = Y(s) be two functions, each assigning a real number to each s ∈ S. Then (X, Y) is called a two-dimensional random variable.
BASIC TERMINOLOGIES
Let (X, Y) be a two-dimensional discrete random variable. With each possible outcome (x_i, y_j) we associate a number p(x_i, y_j) representing P(X = x_i, Y = y_j) and satisfying the following conditions:
(i) p(x_i, y_j) ≥ 0 for all (x_i, y_j);
(ii) ∑_{i=1}^{∞} ∑_{j=1}^{∞} p(x_i, y_j) = 1.
The function p is the joint probability mass function (pmf) of (X, Y), and the joint cumulative distribution function is
F(x, y) = ∑_{x_i ≤ x} ∑_{y_j ≤ y} p(x_i, y_j)
Let (X, Y) be a two-dimensional discrete random variable with joint pmf p(x_i, y_j). Since the event X = x_i must occur together with Y = y_j for some j, and can occur together with Y = y_j for only one j, the marginal pmf's of X and Y are
p(x_i) = P(X = x_i) = ∑_{j=1}^{∞} p(x_i, y_j)  and  q(y_j) = P(Y = y_j) = ∑_{i=1}^{∞} p(x_i, y_j).
These marginal distributions are simply the distributions of the one-dimensional random variables X and Y, respectively.
Example:
Two production lines manufacture a certain type of item. Suppose that the capacity (on any
given day) is 5 items for line I and 3 items for line II. Assume that the number of items actually
produced by either production line is a random variable. Let (𝑋, 𝑌) represent the two-
dimensional random variable yielding the number of items produced by line I and line II,
respectively. The following table gives the joint probability distribution of (𝑋, 𝑌). Each entry
represents 𝑝(𝑥𝑖 , 𝑦𝑗 ) = 𝑃(𝑋 = 𝑥𝑖 , 𝑌 = 𝑦𝑗 ).
Y \ X     0      1      2      3      4      5      Sum
0         0      0.01   0.03   0.05   0.07   0.09   0.25
1         0.01   0.02   0.04   0.05   0.06   0.08   0.26
2         0.01   0.03   0.05   0.05   0.05   0.06   0.25
3         0.01   0.02   0.04   0.06   0.06   0.05   0.24
Sum       0.03   0.08   0.16   0.21   0.24   0.28   1.00
Thus
p(2, 3) = P(X = 2, Y = 3) = 0.04, etc. Hence, if B is the event
B = {more items are produced by line I than by line II},
we add the entries with x_i > y_j and find that
P(B) = 0.01 + 0.03 + 0.05 + 0.07 + 0.09 + 0.04 + 0.05 + 0.06 + 0.08 + 0.05 + 0.05 + 0.06 + 0.06 + 0.05 = 0.75.
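The marginals and P(B) above can be cross-checked numerically. The following is a minimal sketch (not part of the original notes, assuming NumPy is available) that stores the joint pmf as an array joint[y, x] and recomputes the row/column sums and P(B):

```python
import numpy as np

# Joint pmf p(x, y), stored as joint[y, x]: rows are y = 0..3, columns are x = 0..5.
joint = np.array([
    [0.00, 0.01, 0.03, 0.05, 0.07, 0.09],
    [0.01, 0.02, 0.04, 0.05, 0.06, 0.08],
    [0.01, 0.03, 0.05, 0.05, 0.05, 0.06],
    [0.01, 0.02, 0.04, 0.06, 0.06, 0.05],
])

print(joint.sum())        # ~1.0 -> valid joint pmf
print(joint.sum(axis=0))  # marginal pmf of X: 0.03 0.08 0.16 0.21 0.24 0.28
print(joint.sum(axis=1))  # marginal pmf of Y: 0.25 0.26 0.25 0.24

# P(B) = P(X > Y): add the entries whose column index x exceeds the row index y.
x = np.arange(6)                  # x values 0..5
y = np.arange(4).reshape(-1, 1)   # y values 0..3 as a column
print(joint[x > y].sum())         # ~0.75
```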
Exercise
1. A fair coin is tossed 3 times. Let X be 0 or 1 according as a head or a tail occurs on the first toss, and let Y be the total number of heads. Find the joint probability distribution of (X, Y).
2. Suppose that 3 balls are randomly selected from an urn containing 3 red, 4 white and 5 black balls. If X and Y denote the number of red balls and the number of white balls chosen, respectively, find the joint probability distribution of (X, Y).
3. Evaluate the conditional probability 𝑃(𝑋 = 2|𝑌 = 2) from the following table
Y \ X     0      1      2      3      4      5      Sum
0         0      0.01   0.03   0.05   0.07   0.09   0.25
1         0.01   0.02   0.04   0.05   0.06   0.08   0.26
2         0.01   0.03   0.05   0.05   0.05   0.06   0.25
3         0.01   0.02   0.04   0.06   0.06   0.05   0.24
Sum       0.03   0.08   0.16   0.21   0.24   0.28   1.00
Solution:
P(X = 2 | Y = 2) = P(X = 2, Y = 2)/P(Y = 2) = 0.05/0.25 = 0.20.
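As a quick numerical check (a sketch using plain Python; the row values are copied from the table above), the conditional probability is the (Y = 2, X = 2) entry divided by the Y = 2 row total:

```python
# Row of the joint pmf for Y = 2, indexed by x = 0..5 (values taken from the table above).
row_y2 = [0.01, 0.03, 0.05, 0.05, 0.05, 0.06]

p_x2_given_y2 = row_y2[2] / sum(row_y2)   # P(X = 2, Y = 2) / P(Y = 2)
print(p_x2_given_y2)                      # ~0.20
```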
4. For the joint pdf f(x, y) = x² + xy/3, 0 ≤ x ≤ 1, 0 ≤ y ≤ 2 (see Exercise 9 below), find the conditional pdf h(y | x).
Solution:
h(y | x) = f(x, y)/g(x) = (x² + xy/3)/(2x² + 2x/3) = (3x² + xy)/(6x² + 2x) = (3x + y)/(6x + 2), 0 ≤ y ≤ 2, 0 ≤ x ≤ 1.
5. Suppose that a machine is used for a particular task in the morning and for a different
task in the afternoon. Let 𝑋 and 𝑌 represent the number of times the machine breaks
down in the morning and in the afternoon, respectively. The following table gives the joint
probability distribution of (𝑋, 𝑌). Check whether 𝑋 and 𝑌 are independent random
variables.
Y \ X     0      1      2      q(y_j)
0         0.1    0.2    0.2    0.5
1         0.04   0.08   0.08   0.2
2         0.06   0.12   0.12   0.3
p(x_i)    0.2    0.4    0.4    1.00
Solution:
P(X = 0, Y = 0) = 0.1 and p(X = 0) q(Y = 0) = 0.2 × 0.5 = 0.1, so P(X = 0, Y = 0) = p(X = 0) q(Y = 0).
Checking the remaining eight entries in the same way shows that p(x_i, y_j) = p(x_i) q(y_j) for every pair (x_i, y_j); hence X and Y are independent.
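The same check can be done for every cell at once: X and Y are independent exactly when the joint table equals the outer product of its marginals. A short sketch, assuming NumPy is available:

```python
import numpy as np

# Joint pmf from the table, stored as joint[y, x] with x = 0, 1, 2 and y = 0, 1, 2.
joint = np.array([
    [0.10, 0.20, 0.20],
    [0.04, 0.08, 0.08],
    [0.06, 0.12, 0.12],
])

p_x = joint.sum(axis=0)   # marginal pmf of X: 0.2, 0.4, 0.4
q_y = joint.sum(axis=1)   # marginal pmf of Y: 0.5, 0.2, 0.3

# Independence <=> p(x_i, y_j) = p(x_i) * q(y_j) for every cell.
print(np.allclose(joint, np.outer(q_y, p_x)))   # True -> X and Y are independent
```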
6. Let 𝑋 and 𝑌 be the life lengths of two electronic devices. Suppose that their joint pdf
is given by
𝑓(𝑥, 𝑦) = 𝑒 −(𝑥+𝑦) , 𝑥 ≥ 0, 𝑦 ≥ 0
Check whether 𝑋 and 𝑌 are independent random variables.
Solution:
g(x) = ∫_0^∞ f(x, y) dy = e^(−x) ∫_0^∞ e^(−y) dy = e^(−x), x ≥ 0
h(y) = ∫_0^∞ f(x, y) dx = e^(−y) ∫_0^∞ e^(−x) dx = e^(−y), y ≥ 0
Hence f(x, y) = g(x) × h(y), i.e., X and Y are independent random variables.
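A small symbolic check of these marginals (a sketch, assuming SymPy is available):

```python
import sympy as sp

x, y = sp.symbols('x y', nonnegative=True)
f = sp.exp(-(x + y))                      # joint pdf on x >= 0, y >= 0

g = sp.integrate(f, (y, 0, sp.oo))        # marginal pdf of X: exp(-x)
h = sp.integrate(f, (x, 0, sp.oo))        # marginal pdf of Y: exp(-y)

print(g, h)                               # exp(-x) exp(-y)
print(sp.simplify(g * h - f) == 0)        # True -> f(x, y) = g(x) h(y)
```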
7. Suppose that the following table represents the joint probability distribution of the
discrete random variable (𝑋, 𝑌). Evaluate all the marginal and conditional
distributions.
Y \ X     1       2      3
1         1/12    1/6    0
2         0       1/9    1/5
3         1/18    1/4    2/15
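One possible way to organise the computation (a sketch, not part of the original notes) is to work with exact fractions; the code below finds both marginal pmf's and, as a sample, the conditional pmf of X given Y = 3:

```python
from fractions import Fraction as F

# Joint pmf p(x, y) from the table, stored as joint[y][x] with x, y in {1, 2, 3}.
joint = {
    1: {1: F(1, 12), 2: F(1, 6), 3: F(0)},
    2: {1: F(0),     2: F(1, 9), 3: F(1, 5)},
    3: {1: F(1, 18), 2: F(1, 4), 3: F(2, 15)},
}

p_x = {x: sum(joint[y][x] for y in joint) for x in (1, 2, 3)}   # marginal pmf of X
q_y = {y: sum(joint[y].values()) for y in joint}                # marginal pmf of Y
print(p_x, q_y)

# Conditional pmf of X given Y = 3: p(x | y = 3) = p(x, 3) / q(3)
cond_x_given_y3 = {x: joint[3][x] / q_y[3] for x in (1, 2, 3)}
print(cond_x_given_y3)
```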
8. Suppose that the two-dimensional random variable (𝑋, 𝑌) has joint pdf
f(x, y) = kx(x − y), 0 < x < 2, −x < y < x
        = 0,         elsewhere
(i) Evaluate the constant k. (Ans: k = 1/8)
(ii) Find the marginal pdf of X.
Ans: g(x) = x³/4 for 0 < x < 2, and 0 otherwise.
(iii) Find the marginal pdf of Y.
Ans: h(y) = 1/3 − y/4 + 5y³/48 for −2 ≤ y ≤ 0,
     h(y) = 1/3 − y/4 + y³/48 for 0 ≤ y ≤ 2,
     and 0 otherwise.
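These answers can be verified symbolically; a sketch assuming SymPy is available:

```python
import sympy as sp

x, y, k = sp.symbols('x y k', real=True)
f = k * x * (x - y)                       # joint pdf on 0 < x < 2, -x < y < x

# Normalisation: integrate over -x < y < x, then 0 < x < 2, and solve for k.
total = sp.integrate(sp.integrate(f, (y, -x, x)), (x, 0, 2))
k_val = sp.solve(sp.Eq(total, 1), k)[0]
print(k_val)                              # 1/8

f = f.subs(k, k_val)
g = sp.integrate(f, (y, -x, x))           # marginal pdf of X: x**3/4 on (0, 2)
h_pos = sp.integrate(f, (x, y, 2))        # marginal pdf of Y for 0 <= y <= 2
h_neg = sp.integrate(f, (x, -y, 2))       # marginal pdf of Y for -2 <= y <= 0
print(sp.expand(g), sp.expand(h_pos), sp.expand(h_neg))
# x**3/4,  y**3/48 - y/4 + 1/3,  5*y**3/48 - y/4 + 1/3
```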
9. Suppose that the joint pdf of the two-dimensional random variable (𝑋, 𝑌) is given by
f(x, y) = x² + xy/3, 0 < x < 1, 0 < y < 2
        = 0,         elsewhere
(i) Check if f(x, y) is a valid pdf.
(ii) Evaluate P(B) where B = {X + Y ≥ 1}.
(iii) Find the marginal pdf's of X and Y.
(iv) Find g(x | y).
(v) Find h(y | x).
(vi) Evaluate P(X > 1/2).
(vii) Evaluate P(Y < X).
(viii) Evaluate P(Y < 1/2 | X < 1/2).
Solution:
(i) To check that ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_0^2 ∫_0^1 (x² + xy/3) dx dy
= ∫_0^2 [x³/3 + x²y/6]_{x=0}^{x=1} dy
= ∫_0^2 (1/3 + y/6) dy
= [y/3 + y²/12]_0^2
= 2/3 + 4/12 = 1.
(ii) Let B = {X + Y ≥ 1}. Then P(B) = 1 − P(B̄), where B̄ = {X + Y < 1}. Hence
P(B) = 1 − ∫_0^1 ∫_0^{1−x} (x² + xy/3) dy dx
= 1 − ∫_0^1 [x²(1 − x) + x(1 − x)²/6] dx
= 1 − 7/72 = 65/72.
(vi) P(X > 1/2) = ∫_0^2 ∫_{1/2}^1 (x² + xy/3) dx dy = ∫_0^2 (7/24 + 3y/24) dy = 5/6.
(vii) P(Y < X) = ∫_0^1 ∫_0^x (x² + xy/3) dy dx = ∫_0^1 (7x³/6) dx = 7/24.
(viii) Ans: P(Y < 1/2 | X < 1/2) = 5/32.
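Parts (i), (ii), (vi) and (vii) can be cross-checked numerically; a sketch assuming SciPy is available (note that scipy.integrate.dblquad integrates func(y, x) with y as the inner variable):

```python
from scipy.integrate import dblquad

f = lambda y, x: x**2 + x * y / 3          # joint pdf on 0 < x < 1, 0 < y < 2

# (i) total probability: x over (0, 1), y over (0, 2)
total, _ = dblquad(f, 0, 1, 0, 2)
print(total)                                # ~1.0

# (ii) P(X + Y >= 1) = 1 - P(X + Y < 1): for each x, y runs from 0 to 1 - x
p_lt, _ = dblquad(f, 0, 1, 0, lambda x: 1 - x)
print(1 - p_lt)                             # ~0.9028 = 65/72

# (vi) P(X > 1/2): x over (1/2, 1), y over (0, 2)
p_vi, _ = dblquad(f, 0.5, 1, 0, 2)
print(p_vi)                                 # ~0.8333 = 5/6

# (vii) P(Y < X): for each x in (0, 1), y runs from 0 to x
p_vii, _ = dblquad(f, 0, 1, 0, lambda x: x)
print(p_vii)                                # ~0.2917 = 7/24
```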
10. For what value of k is f(x, y) = k e^(−(x+y)) a joint pdf of (X, Y) over the region 0 < x < 1, 0 < y < 1? (Ans: k = 1/(1 − e^(−1))² ≈ 2.5027)
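Since ∫_0^1 ∫_0^1 e^(−(x+y)) dx dy = (1 − e^(−1))², a two-line check of the stated answer (a sketch):

```python
import math

# Normalisation over the unit square: the integral of e^{-(x+y)} is (1 - e^{-1})^2.
k = 1 / (1 - math.exp(-1)) ** 2
print(k)   # ~2.5027
```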
13. Find c for which f(x, y) = cx + cy², x = 1, 2, y = 1, 2, 3, is a joint pmf. (Ans: c = 1/37)
14. If f(x, y) = 2/a² for 0 ≤ x ≤ y ≤ a, and 0 elsewhere, find f(y | x) and f(x | y).
(Ans: f(y | x) = 1/(a − x), x ≤ y ≤ a;  f(x | y) = 1/y, 0 ≤ x ≤ y)
16.Suppose that a manufacturer of light bulbs is concerned about the number of bulbs
ordered from him during the months of January and February. Let 𝑋 and 𝑌 denote the
number of bulbs ordered during these two months, respectively. We shall assume that
(𝑋, 𝑌) is a two-dimensional continuous random variable with the following joint pdf.
Find P(X ≥ Y).
f(x, y) = c, if 5000 ≤ x ≤ 10,000 and 4000 ≤ y ≤ 9000
        = 0, elsewhere
Solution:
To determine c we use the fact that ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = 1:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_{4000}^{9000} ∫_{5000}^{10000} c dx dy = c (5000)²
Thus c = 1/(5000)².
Since (X, Y) is uniform over this square, P(X ≥ Y) is the proportion of its area on which x ≥ y:
P(X ≥ Y) = 1 − P(X < Y) = 1 − (1/(5000)²) ∫_{5000}^{9000} (9000 − x) dx = 1 − 8/25 = 17/25.
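A quick simulation check of this value (a sketch, assuming NumPy; the exact answer worked out above is 17/25 = 0.68):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# (X, Y) uniform over the rectangle [5000, 10000] x [4000, 9000]
x = rng.uniform(5000, 10000, n)
y = rng.uniform(4000, 9000, n)

print(np.mean(x >= y))   # ~0.68 = 17/25
```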
EXPECTATION
If (X, Y) is a two-dimensional discrete random variable with joint pmf p(x_i, y_j), then
E(X) = ∑_{i=1}^{∞} ∑_{j=1}^{∞} x_i p(x_i, y_j)
= ∑_{i=1}^{∞} x_i {∑_{j=1}^{∞} p(x_i, y_j)}
= ∑_{i=1}^{∞} x_i p(x_i), where p(x_i) is the marginal pmf of X.
Therefore,
E(X) = ∑_{i=1}^{∞} x_i p(x_i) and E(Y) = ∑_{j=1}^{∞} y_j q(y_j)
For a two-dimensional continuous random variable (X, Y) with joint pdf f(x, y) and marginal pdf's g(x) and h(y):
E(X) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x f(x, y) dx dy = ∫_{−∞}^{∞} x g(x) dx
E(Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} y f(x, y) dx dy = ∫_{−∞}^{∞} y h(y) dy
E(X²) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x² f(x, y) dx dy
E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy
Properties:
1. 𝐸(𝑐) = 𝑐
2. E(cX) = c E(X)
3. 𝐸(𝑋 + 𝑌) = 𝐸(𝑋) + 𝐸(𝑌)
Theorem:
Let (𝑋, 𝑌) be a two-dimensional random variable and suppose that 𝑋 and 𝑌 are independent.
Then 𝐸(𝑋𝑌) = 𝐸(𝑋)𝐸(𝑌).
Proof:
E(XY) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy f(x, y) dx dy
= ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy g(x) h(y) dx dy   (since X and Y are independent)
= ∫_{−∞}^{∞} x g(x) dx · ∫_{−∞}^{∞} y h(y) dy
= E(X) E(Y)
Theorem:
Let (𝑋, 𝑌) be a two-dimensional random variable, and if 𝑋 and 𝑌 are independent, then
𝑉(𝑋 + 𝑌) = 𝑉(𝑋) + 𝑉(𝑌).
Proof: V(X + Y) = E(X + Y)² − [E(X + Y)]²
= E(X²) + 2E(XY) + E(Y²) − [E(X)]² − 2E(X)E(Y) − [E(Y)]²
= V(X) + V(Y) + 2[E(XY) − E(X)E(Y)]
= V(X) + V(Y), since E(XY) = E(X)E(Y) when X and Y are independent.
Exercise
1. If f(x, y) = x² + xy/3 for 0 ≤ x ≤ 1, 0 ≤ y ≤ 2, and 0 elsewhere, find E(X), E(Y) and V(Y).
Solution:
The marginal pdf's are g(x) = 2x² + 2x/3, 0 ≤ x ≤ 1, and h(y) = 1/3 + y/6, 0 ≤ y ≤ 2.
E(X) = ∫_{−∞}^{∞} x g(x) dx = ∫_0^1 x (2x² + 2x/3) dx = 13/18
E(Y) = ∫_{−∞}^{∞} y h(y) dy = ∫_0^2 y (1/3 + y/6) dy = 10/9
E(Y²) = ∫_{−∞}^{∞} y² h(y) dy = ∫_0^2 y² (1/3 + y/6) dy = 14/9
V(Y) = E(Y²) − [E(Y)]² = 14/9 − 100/81 = 26/81
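The same values can be recovered symbolically; a sketch assuming SymPy is available:

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)
f = x**2 + x * y / 3                      # joint pdf on 0 <= x <= 1, 0 <= y <= 2

g = sp.integrate(f, (y, 0, 2))            # marginal pdf of X: 2*x**2 + 2*x/3
h = sp.integrate(f, (x, 0, 1))            # marginal pdf of Y: 1/3 + y/6

E_X = sp.integrate(x * g, (x, 0, 1))      # 13/18
E_Y = sp.integrate(y * h, (y, 0, 2))      # 10/9
E_Y2 = sp.integrate(y**2 * h, (y, 0, 2))  # 14/9
print(E_X, E_Y, E_Y2 - E_Y**2)            # 13/18 10/9 26/81
```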
CONDITIONAL EXPECTATION
Definitions:
(a) If (X, Y) is a two-dimensional discrete random variable, we define the conditional expectation of X for given Y = y_j as
E(X | y_j) = ∑_{i=1}^{∞} x_i p(x_i | y_j).
(b) If (X, Y) is a two-dimensional continuous random variable, we define the conditional expectation of X for given Y = y as
E(X | y) = ∫_{−∞}^{∞} x g(x | y) dx.
1. Suppose that (X, Y) is uniformly distributed over the semicircular region described below. Find E(Y | x) and E(X | y).
Solution:
The region considered is the upper half of the unit disc, {(x, y) : x² + y² ≤ 1, y ≥ 0}, which has area π/2.
The joint pdf of (𝑋, 𝑌) is given by
f(x, y) = 2/π, (x, y) in the semicircular region
        = 0,   elsewhere
g(x) = ∫_0^{√(1−x²)} (2/π) dy = (2/π) √(1 − x²), −1 ≤ x ≤ 1
h(y) = ∫_{−√(1−y²)}^{√(1−y²)} (2/π) dx = (4/π) √(1 − y²), 0 ≤ y ≤ 1
Hence,
g(x | y) = f(x, y)/h(y) = 1/(2√(1 − y²)),  −√(1 − y²) ≤ x ≤ √(1 − y²)
h(y | x) = f(x, y)/g(x) = 1/√(1 − x²),  0 ≤ y ≤ √(1 − x²)
Therefore,
E(Y | x) = ∫_0^{√(1−x²)} y h(y | x) dy = (1/2) √(1 − x²)
Similarly,
E(X | y) = ∫_{−√(1−y²)}^{√(1−y²)} x g(x | y) dx = 0.
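A symbolic check of E(Y | x) (a sketch assuming SymPy; E(X | y) = 0 also follows directly from the symmetry of g(x | y) in x):

```python
import sympy as sp

x, y = sp.symbols('x y', real=True)

# Conditional pdf h(y | x) on 0 <= y <= sqrt(1 - x^2), valid for |x| < 1.
h_cond = 1 / sp.sqrt(1 - x**2)
E_Y_given_x = sp.integrate(y * h_cond, (y, 0, sp.sqrt(1 - x**2)))
print(sp.simplify(E_Y_given_x))   # equals sqrt(1 - x**2)/2
```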
CORRELATION COEFFICIENT
The correlation coefficient is a parameter which measures the degree of linear association between two random variables X and Y.
Let (X, Y) be a two-dimensional random variable. The correlation coefficient ρ_xy between X and Y is defined as
ρ_xy = E[(X − E(X))(Y − E(Y))] / √(V(X) V(Y)),
where E[(X − E(X))(Y − E(Y))] = σ_xy = cov(X, Y) is called the covariance between X and Y.
Theorem
If (𝑋, 𝑌) is a two-dimensional random variable, then −1 ≤ 𝜌𝑥𝑦 ≤ 1
Proof:
For any random variable W we know that V(W) ≥ 0,
i.e., E(W²) − [E(W)]² ≥ 0,
so E(W²) ≥ [E(W)]² ≥ 0; in particular, E(W²) ≥ 0.
Apply this to W = (X − E(X))/√V(X) ± (Y − E(Y))/√V(Y):
E{(X − E(X))/√V(X) ± (Y − E(Y))/√V(Y)}² ≥ 0
E[(X − E(X))²/V(X) + (Y − E(Y))²/V(Y) ± 2 · (X − E(X))(Y − E(Y))/(√V(X) √V(Y))] ≥ 0
E(X − E(X))²/V(X) + E(Y − E(Y))²/V(Y) ± 2 · E[(X − E(X))(Y − E(Y))]/(√V(X) √V(Y)) ≥ 0
V(X)/V(X) + V(Y)/V(Y) ± 2ρ_xy ≥ 0   (since E(X − E(X))² = V(X), and similarly for Y)
2 ± 2ρ_xy ≥ 0, that is, 1 ± ρ_xy ≥ 0
1 + ρ_xy ≥ 0 and 1 − ρ_xy ≥ 0
ρ_xy ≥ −1 and ρ_xy ≤ 1
Hence −1 ≤ ρ_xy ≤ 1.
Theorem
If 𝑋 and 𝑌 are linearly related, then 𝜌𝑥𝑦 = ±1
Proof:
Let X and Y be linearly related, say Y = aX + b with a ≠ 0.
Cov(X, Y) = E(XY) − E(X)E(Y)
= E[X(aX + b)] − E(X) E(aX + b)
= E[aX² + bX] − E(X)[a E(X) + b]
= a E(X²) + b E(X) − a[E(X)]² − b E(X)
= a[E(X²) − (E(X))²]
= a V(X)
V(Y) = V(aX + b) = a² V(X)
ρ_xy = Cov(X, Y) / √(V(X) V(Y))
= a V(X) / √(V(X) · a² V(X))
= a V(X) / (|a| V(X))
= ±1, namely +1 if a > 0 and −1 if a < 0.
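A small numerical illustration of this theorem (a sketch, assuming NumPy): for simulated data with Y an exact linear function of X, the sample correlation coefficient is +1 or −1 according to the sign of the slope.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(size=10_000)

y_pos = 3 * x + 2     # a > 0  ->  rho = +1
y_neg = -3 * x + 2    # a < 0  ->  rho = -1

print(np.corrcoef(x, y_pos)[0, 1])   # ~1.0
print(np.corrcoef(x, y_neg)[0, 1])   # ~-1.0
```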
Exercise
1. If 𝑈 = 𝑎 + 𝑏𝑋 and 𝑉 = 𝑐 + 𝑑𝑌, then show that 𝜌𝑢𝑣 = ±𝜌𝑥𝑦
4. Show that 𝑉(𝑎𝑋 + 𝑏𝑌) = 𝑎2 𝑉(𝑋) + 𝑏 2 𝑉(𝑌) + 2𝑎𝑏 𝑐𝑜𝑣(𝑋, 𝑌). Also prove that when
𝑋 and 𝑌 are independent 𝑉(𝑎𝑋 + 𝑏𝑌) = 𝑉(𝑎𝑋 − 𝑏𝑌) = 𝑎2 𝑉(𝑋) + 𝑏 2 𝑉(𝑌)
5. Two independent random variables 𝑋1 and 𝑋2 have mean 5 and 10 and variances 4
and 9 respectively. Find the covariance between 𝑈 = 3𝑋1 + 4𝑋2 and 𝑉 = 3𝑋1 − 𝑋2 .
(Ans: 0)
7. Given E(XY) = 43, P(X = x_i) = 1/5 and P(Y = y_j) = 1/5 for the values listed below, find ρ_xy.
X    1    3    4    6    8
Y    1    2    24   12   5
(Ans: E(X) = 22/5, E(Y) = 44/5, E(X²) = 126/5, E(Y²) = 750/5, ρ_xy = 0.2079)
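Since E(XY) = 43 is given and each listed value of X and of Y carries probability 1/5, ρ_xy follows directly; a short check (a sketch in plain Python):

```python
x_vals = [1, 3, 4, 6, 8]
y_vals = [1, 2, 24, 12, 5]

E_X = sum(x_vals) / 5          # 22/5
E_Y = sum(y_vals) / 5          # 44/5
V_X = sum(v**2 for v in x_vals) / 5 - E_X**2
V_Y = sum(v**2 for v in y_vals) / 5 - E_Y**2

cov = 43 - E_X * E_Y           # E(XY) - E(X)E(Y), with E(XY) = 43 as given
rho = cov / (V_X * V_Y) ** 0.5
print(round(rho, 4))           # 0.2079
```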
Suppose that (X, Y) is uniformly distributed over the triangular region R = {(x, y) : 0 ≤ x ≤ y ≤ 1}, i.e.,
f(x, y) = 2, (x, y) ∈ R
        = 0, elsewhere
Find ρ_xy.
Marginal pdf of X: g(x) = ∫_x^1 2 dy = 2(1 − x), 0 ≤ x ≤ 1
Marginal pdf of Y: h(y) = ∫_0^y 2 dx = 2y, 0 ≤ y ≤ 1
E(X) = ∫_0^1 x g(x) dx = 1/3;  E(Y) = ∫_0^1 y h(y) dy = 2/3
E(X²) = ∫_0^1 x² g(x) dx = 1/6;  E(Y²) = ∫_0^1 y² h(y) dy = 1/2;  V(X) = 1/18;  V(Y) = 1/18
E(XY) = ∫_0^1 ∫_0^y xy f(x, y) dx dy = 1/4
ρ_xy = [E(XY) − E(X)E(Y)] / √(V(X) V(Y)) = (1/4 − 2/9) / (1/18) = 1/2
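A simulation cross-check of ρ_xy = 1/2 (a sketch, assuming NumPy): sample points uniformly from the unit square and keep those in the triangle x ≤ y.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2_000_000

# Uniform points in the unit square, restricted to the triangle 0 <= x <= y <= 1.
x = rng.uniform(size=n)
y = rng.uniform(size=n)
keep = x <= y

print(np.corrcoef(x[keep], y[keep])[0, 1])   # ~0.5
```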