UNIT-II
MULTIPLE RANDOM VARIABLES
2. 𝐹𝑋𝑌 (∞, ∞) = 1
Proof:
It is known that
𝐹𝑋𝑌 (𝑥, 𝑦) = P{𝑋 ≤ 𝑥, 𝑌 ≤ 𝑦}
𝐹𝑋𝑌 (∞, ∞) = P{𝑋 ≤ ∞, 𝑌 ≤ ∞}
= P{𝑋 ≤ ∞ ∩ 𝑌 ≤ ∞ }
= P{𝑆 ∩ 𝑆} = P{𝑆} = 1
𝐹𝑋𝑌 (∞, ∞) = 1
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
Proof:
We know that
f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y
Integrating both sides over the entire x–y plane,
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = ∫_{−∞}^{∞} ∫_{−∞}^{∞} [∂²F_XY(x, y) / ∂x ∂y] dx dy
= ∫_{−∞}^{∞} (∂/∂y)[F_XY(∞, y) − F_XY(−∞, y)] dy
= ∫_{−∞}^{∞} (∂/∂y) F_Y(y) dy
= F_Y(∞) − F_Y(−∞) = 1
F_XY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(x, y) dx dy
Proof:
We know that
f_XY(x, y) = ∂²F_XY(x, y) / ∂x ∂y
Integrating on both sides w.r.t. x and y,
∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(x, y) dx dy = ∫_{−∞}^{y} ∫_{−∞}^{x} [∂²F_XY(x, y) / ∂x ∂y] dx dy
= ∫_{−∞}^{y} (∂/∂y)[F_XY(x, y) − F_XY(−∞, y)] dy
= ∫_{−∞}^{y} (∂/∂y) F_XY(x, y) dy
= [F_XY(x, y)]_{−∞}^{y}
= F_XY(x, y) − F_XY(x, −∞)
= F_XY(x, y)
∴ F_XY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(x, y) dx dy
P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = F_XY(x₂, y₂) − F_XY(x₁, y₂) − F_XY(x₂, y₁) + F_XY(x₁, y₁)
Proof:
Consider
P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = ∫_{y₁}^{y₂} ∫_{x₁}^{x₂} f_XY(x, y) dx dy
= ∫_{y₁}^{y₂} ∫_{x₁}^{x₂} [∂²F_XY(x, y) / ∂x ∂y] dx dy
Changing the order of integration,
= ∫_{y₁}^{y₂} (∂/∂y) [∫_{x₁}^{x₂} (∂/∂x) F_XY(x, y) dx] dy
= ∫_{y₁}^{y₂} (∂/∂y) [F_XY(x₂, y) − F_XY(x₁, y)] dy
= [F_XY(x₂, y) − F_XY(x₁, y)]_{y₁}^{y₂}
= F_XY(x₂, y₂) − F_XY(x₁, y₂) − F_XY(x₂, y₁) + F_XY(x₁, y₁)
Marginal density functions:
f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Proof:
∫_{−∞}^{∞} f_XY(x, y) dx = ∫_{−∞}^{∞} [∂²F_XY(x, y) / ∂x ∂y] dx
= (∂/∂y)[F_XY(∞, y) − F_XY(−∞, y)]
= (∂/∂y) F_Y(y)
= f_Y(y)
∴ f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Similarly, f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
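As a quick numerical illustration of this marginal-density property (a sketch only, not part of the notes), the snippet below integrates an assumed joint density f_XY(x, y) = x + y on the unit square over x and compares the result with the analytic marginal f_Y(y) = y + 1/2.

```python
# Numerical check of f_Y(y) = ∫ f_XY(x, y) dx for an assumed example density
# f_XY(x, y) = x + y on 0 < x < 1, 0 < y < 1 (illustrative choice, not from the notes).
from scipy.integrate import quad

def f_xy(x, y):
    """Assumed joint density on the unit square."""
    return x + y

for y in (0.2, 0.5, 0.9):
    # integrate the joint density over x for this fixed value of y
    f_y_numeric, _ = quad(lambda x: f_xy(x, y), 0.0, 1.0)
    f_y_exact = y + 0.5                      # analytic marginal: ∫0^1 (x + y) dx
    print(f"y = {y}: numeric = {f_y_numeric:.4f}, exact = {f_y_exact:.4f}")
```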
CONDITIONAL PROBABILITY
It is the probability of an event ‘A’ based on the occurrence of another event
‘B’.
Let A and B be two events. The conditional probability of A given that B has occurred is
P(A/B) = P(A ∩ B) / P(B)
Similarly, the conditional probability of B given that A has occurred is
P(B/A) = P(A ∩ B) / P(A)
Properties of the conditional distribution function:
1. F_X(−∞/B) = 0
2. F_X(∞/B) = 1
3. 0 ≤ F_X(x/B) ≤ 1
4. F_X(x₂/B) ≥ F_X(x₁/B) when x₂ > x₁
5. P{(x₁ < X ≤ x₂)/B} = F_X(x₂/B) − F_X(x₁/B)
Point conditioning:
Here the conditioning event B confines the random variable Y to an infinitesimally small interval about the point y:
B = {y − Δy ≤ Y ≤ y + Δy}
We know that
P(A/B) = P(A ∩ B) / P(B)
so the conditional distribution function is
F_X(x/B) = P{X ≤ x ∩ B} / P{B}
F_X(x / y − Δy ≤ Y ≤ y + Δy) = P{(X ≤ x) ∩ (y − Δy ≤ Y ≤ y + Δy)} / P(y − Δy ≤ Y ≤ y + Δy)
= [∫_{−∞}^{x} ∫_{y−Δy}^{y+Δy} f_XY(x, y) dy dx] / [∫_{y−Δy}^{y+Δy} f_Y(y) dy]
For small Δy this becomes
= [2Δy ∫_{−∞}^{x} f_XY(x, y) dx] / [2Δy f_Y(y)]
= [∫_{−∞}^{x} f_XY(x, y) dx] / f_Y(y)
The conditional density function is
f_X(x/Y) = d F_X(x/Y) / dx = (d/dx) [∫_{−∞}^{x} f_XY(x, y) dx] / f_Y(y)
∴ f_X(x/Y) = f_XY(x, y) / f_Y(y)
Similarly
f_Y(y/X) = f_XY(x, y) / f_X(x)
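The ratio form of the conditional density can be checked symbolically. The sketch below is an illustration under an assumed joint density f_XY(x, y) = x + y on the unit square; it derives f_X(x/y) with sympy and confirms that it integrates to 1 over x for any fixed y.

```python
# Conditional density f_X(x|y) = f_XY(x, y) / f_Y(y) for an assumed joint
# density f_XY(x, y) = x + y on the unit square (illustrative example).
import sympy as sp

x, y = sp.symbols('x y', positive=True)
f_xy = x + y                               # assumed joint density, 0 < x, y < 1

f_y = sp.integrate(f_xy, (x, 0, 1))        # marginal of Y: y + 1/2
f_x_given_y = sp.simplify(f_xy / f_y)      # conditional density of X given Y = y

print("f_Y(y)   =", f_y)
print("f_X(x|y) =", f_x_given_y)
# Any conditional density must integrate to 1 over x for a fixed y:
print("check    =", sp.simplify(sp.integrate(f_x_given_y, (x, 0, 1))))
```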
Interval conditioning:
Here the conditioning event confines the random variable Y to a range of values,
B = {y_a ≤ Y ≤ y_b}
and the conditional distribution function becomes
F_X(x / y_a ≤ Y ≤ y_b) = [∫_{y_a}^{y_b} ∫_{−∞}^{x} f_XY(x, y) dx dy] / [∫_{y_a}^{y_b} ∫_{−∞}^{∞} f_XY(x, y) dx dy]
SUM OF TWO RANDOM VARIABLES:
Let W = X + Y, where X and Y are statistically independent random variables. The distribution function of W is
F_W(w) = P{X + Y ≤ w} = ∫_{−∞}^{∞} ∫_{−∞}^{w−y} f_XY(x, y) dx dy
Since X and Y are independent, f_XY(x, y) = f_X(x) f_Y(y), so
F_W(w) = ∫_{−∞}^{∞} f_Y(y) [F_X(w − y) − F_X(−∞)] dy = ∫_{−∞}^{∞} f_Y(y) F_X(w − y) dy
Differentiating with respect to w gives the density of the sum,
f_W(w) = ∫_{−∞}^{∞} f_Y(y) f_X(w − y) dy
i.e., the density of the sum of two statistically independent random variables is the convolution of their individual density functions.
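The convolution result can be illustrated numerically. The sketch below assumes, purely as an example, that X and Y are both uniform on (0, 1); it approximates f_W by a discrete convolution and checks it against the known triangular density of the sum of two independent uniforms.

```python
# Numerical sketch of f_W(w) = ∫ f_Y(y) f_X(w − y) dy for independent X and Y.
# Assumed example: X and Y uniform on (0, 1), so f_W is triangular on (0, 2).
import numpy as np

dx = 0.001
grid = np.arange(0.0, 1.0, dx)
f_x = np.ones_like(grid)                  # uniform density on (0, 1)
f_y = np.ones_like(grid)

f_w = np.convolve(f_x, f_y) * dx          # discrete approximation of the convolution
w = np.arange(len(f_w)) * dx

peak = f_w[np.argmin(np.abs(w - 1.0))]    # triangular density peaks at w = 1
print("f_W(1) ≈", round(peak, 3))                 # expect ≈ 1.0
print("total area ≈", round(f_w.sum() * dx, 3))   # expect ≈ 1.0 (valid density)
```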
JOINT MOMENTS:
Moments are the measure of deviation of a random variable from a reference
value.
Joint moments indicate the deviation of multiple random variables from a reference
value.
Joint moments about the origin
The expected value of a function of the form g(X, Y) = Xⁿ Yᵏ is called a joint moment about the origin:
m_nk = E[Xⁿ Yᵏ]
m_nk = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xⁿ yᵏ f_XY(x, y) dx dy
where n + k is the order of the moment.
First order joint moments:
m_10 = E[X¹ Y⁰] = E(X)
m_01 = E[X⁰ Y¹] = E(Y)
Note: The first order joint moments are equal to the respective individual expected values.
Second order joint moments:
𝑚02 = 𝐸 [ 𝑌 2 ]
𝑚20 = 𝐸 [ 𝑋 2 ]
The second order joint moment m_11 is called the correlation of X and Y:
R_XY = m_11 = E[X Y]
m_11 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f_XY(x, y) dx dy
Joint central moments:
The expected value of a function of the form g(X, Y) = (X − X̄)ⁿ (Y − Ȳ)ᵏ is called a joint central moment:
μ_nk = E[(X − X̄)ⁿ (Y − Ȳ)ᵏ]
The first order joint central moments are zero:
μ01 = μ10 = 0
The second order joint central moments μ20 and μ02 are the individual variances:
μ20 = E[(X − X̄)²] = σX²
μ02 = E[(Y − Ȳ)²] = σY²
The second order joint central moment μ11 is called the covariance between the two random variables X and Y:
μ11 = E[(X − X̄)(Y − Ȳ)] = C_XY
Covariance is a measure of how one random variable changes with respect to another;
it indicates how two random variables vary together.
Properties of Covariance:
1. If X and Y are two random variables then the Covariance between them
is given as
𝐶𝑋𝑌 = 𝑅𝑋𝑌 − 𝐸 [𝑋] 𝐸 [𝑌]
2. If X and Y are two statistically independent random variables, then
C_XY = 0
Proof:
C_XY = R_XY − E[X] E[Y]
If the two random variables are statistically independent, then
R_XY = E[X Y] = E[X] E[Y]
∴ C_XY = E[X] E[Y] − E[X] E[Y] = 0
3. Let X and Y be two random variables then
𝑣𝑎𝑟[𝑋 + 𝑌] = 𝑣𝑎𝑟[𝑋] + 𝑣𝑎𝑟[𝑌] + 2𝐶𝑋𝑌
𝑣𝑎𝑟[𝑋 − 𝑌] = 𝑣𝑎𝑟[𝑋] + 𝑣𝑎𝑟[𝑌] − 2𝐶𝑋𝑌
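These variance identities are easy to confirm with simulated data; the sketch below uses numpy samples with a deliberately introduced correlation (the sample size and the 0.6 coupling factor are arbitrary example choices).

```python
# Sample-based check of var[X ± Y] = var[X] + var[Y] ± 2·C_XY (illustrative sketch).
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 0.6 * x + rng.normal(size=100_000)    # Y intentionally correlated with X

c_xy = np.mean(x * y) - x.mean() * y.mean()       # C_XY = R_XY − E[X]E[Y]
print("var[X+Y]:", round(np.var(x + y), 3),
      " vs ", round(np.var(x) + np.var(y) + 2 * c_xy, 3))
print("var[X-Y]:", round(np.var(x - y), 3),
      " vs ", round(np.var(x) + np.var(y) - 2 * c_xy, 3))
```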
Correlation coefficient:
It is defined as the normalized second order joint central moment,
ρ = μ11 / √(μ20 μ02) = C_XY / (σX σY)
JOINT CHARACTERISTIC FUNCTION:
The joint characteristic function of two random variables X and Y is defined as
φ_XY(ω1, ω2) = E[e^{jω1X} e^{jω2Y}]
The joint density function is obtained from it through the inverse transform
f_XY(x, y) = (1/4π²) ∫_{−∞}^{∞} ∫_{−∞}^{∞} φ_XY(ω1, ω2) e^{−jω1x} e^{−jω2y} dω1 dω2
The joint characteristic function and the joint density function are a Fourier transform pair with the signs of the variables reversed.
Properties of the joint characteristic function:
1. The marginal characteristic functions can be obtained from the joint characteristic function:
φ_X(ω1) = φ_XY(ω1, 0)
φ_Y(ω2) = φ_XY(0, ω2)
Proof:
We know that
φ_XY(ω1, ω2) = E[e^{jω1X} e^{jω2Y}]
Let ω2 = 0:
φ_XY(ω1, 0) = E[e^{jω1X}] = φ_X(ω1)
Let ω1 = 0:
φ_XY(0, ω2) = E[e^{jω2Y}] = φ_Y(ω2)
2. The joint characteristic function can be expressed in terms of the joint density function as
φ_XY(ω1, ω2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) e^{jω1x} e^{jω2y} dx dy
Proof:
We know that
φ_XY(ω1, ω2) = E[e^{jω1X} e^{jω2Y}]
Writing this expectation as an integral over the joint density function gives
φ_XY(ω1, ω2) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{jω1x} e^{jω2y} f_XY(x, y) dx dy
3. If X and Y are two statistically independent random variables, then the joint
characteristic function of the sum of the random variables is the product of the
individual characteristic functions.
4. The joint moments of multiple random variables can be obtained from the
joint characteristic function as
m_nk = (−j)^{n+k} [∂^{n+k} φ_XY(ω1, ω2) / ∂ω1ⁿ ∂ω2ᵏ] |_{ω1=0, ω2=0}
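Property 4 can be exercised symbolically. In the sketch below, the joint characteristic function of two zero-mean, unit-variance jointly Gaussian random variables with correlation coefficient ρ is assumed as an example, φ_XY(ω1, ω2) = exp[−(ω1² + 2ρω1ω2 + ω2²)/2]; differentiating it as in property 4 recovers E[X] = E[Y] = 0, E[X²] = 1 and E[XY] = ρ.

```python
# Joint moments from the joint characteristic function (property 4), done with sympy.
# Assumed example: zero-mean, unit-variance jointly Gaussian X and Y with correlation rho.
import sympy as sp

w1, w2, rho = sp.symbols('omega1 omega2 rho', real=True)
phi = sp.exp(-(w1**2 + 2*rho*w1*w2 + w2**2) / 2)

def joint_moment(n, k):
    """m_nk = (-j)^(n+k) * d^(n+k) phi / (d w1^n d w2^k), evaluated at w1 = w2 = 0."""
    d = phi
    if n:
        d = sp.diff(d, w1, n)
    if k:
        d = sp.diff(d, w2, k)
    return sp.simplify((-sp.I) ** (n + k) * d.subs({w1: 0, w2: 0}))

print("m_10 = E[X]   =", joint_moment(1, 0))   # 0
print("m_01 = E[Y]   =", joint_moment(0, 1))   # 0
print("m_20 = E[X^2] =", joint_moment(2, 0))   # 1
print("m_11 = E[XY]  =", joint_moment(1, 1))   # rho
```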
JOINTLY GAUSSIAN RANDOM VARIABLES:
Two random variables:
Two random variables X and Y are said to be jointly Gaussian if their joint density function is of the form
f_XY(x, y) = [1 / (2π σX σY √(1 − ρ²))] exp{ −[ (x − X̄)²/σX² − 2ρ(x − X̄)(y − Ȳ)/(σX σY) + (y − Ȳ)²/σY² ] / [2(1 − ρ²)] }
Here
X̄ = E[X]
Ȳ = E[Y]
σX² = E[(X − X̄)²]
σY² = E[(Y − Ȳ)²]
ρ = E[(X − X̄)(Y − Ȳ)] / (σX σY)
Observe that if ρ = 0, corresponding to uncorrelated X and Y, the density reduces to
f_XY(x, y) = [1 / (2π σX σY)] exp{ −(1/2)[ (x − X̄)²/σX² + (y − Ȳ)²/σY² ] }
which can be written as
f_XY(x, y) = f_X(x) f_Y(y)
Where 𝑓𝑋 (𝑥 ) 𝑎𝑛𝑑 𝑓𝑌 (𝑦) are the marginal density functions of X and Y
f_X(x) = [1 / √(2π σX²)] exp[ −(x − X̄)² / (2σX²) ]
f_Y(y) = [1 / √(2π σY²)] exp[ −(y − Ȳ)² / (2σY²) ]
Note:
If two random variables are statistically independent, then they are uncorrelated. The reverse statement is not true in general. For jointly Gaussian random variables, however, the reverse statement is also true: if they are uncorrelated, they are also statistically independent.
N random variables:
N random variables X1, X2, …, XN are called jointly Gaussian if their joint density function can be written as
f_{X1…XN}(x1, …, xN) = [ |[C_X]⁻¹|^{1/2} / (2π)^{N/2} ] exp{ −[x − X̄]ᵗ [C_X]⁻¹ [x − X̄] / 2 }
where
[x − X̄] = [ x1 − X̄1 ]
           [ x2 − X̄2 ]
           [    ⋮     ]
           [ xN − X̄N ]
is the column vector of deviations from the means, and
[C_X] = [ C11 C12 … C1N ]
        [ C21 C22 … C2N ]
        [  ⋮   ⋮      ⋮  ]
        [ CN1 CN2 … CNN ]
is the covariance matrix, with elements C_ik = E[(X_i − X̄_i)(X_k − X̄_k)].
[·]⁻¹ denotes the matrix inverse and [·]ᵗ the matrix transpose.
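The N-variable density can be evaluated directly with numpy; the sketch below uses a made-up 3-dimensional mean vector and covariance matrix (example values only) and cross-checks the hand-coded formula against scipy.stats.multivariate_normal.

```python
# Evaluating the jointly Gaussian density for N = 3 (illustrative values).
import numpy as np
from scipy.stats import multivariate_normal

x_bar = np.array([1.0, 2.0, 0.0])             # assumed mean vector
C_x = np.array([[2.0, 0.5, 0.0],
                [0.5, 1.0, 0.3],
                [0.0, 0.3, 1.5]])              # assumed covariance matrix
x = np.array([1.5, 1.0, -0.5])                 # point of evaluation

N = len(x)
d = x - x_bar
C_inv = np.linalg.inv(C_x)
# f(x) = |C_X^-1|^(1/2) / (2*pi)^(N/2) * exp(-(x - x_bar)^t C_X^-1 (x - x_bar) / 2)
f_manual = (np.sqrt(np.linalg.det(C_inv)) / (2 * np.pi) ** (N / 2)
            * np.exp(-0.5 * d @ C_inv @ d))
f_scipy = multivariate_normal(mean=x_bar, cov=C_x).pdf(x)

print(f"manual: {f_manual:.6f}   scipy: {f_scipy:.6f}")   # the two should agree
```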
DESCRIPTIVE QUESTIONS
5. The joint density function
f_XY(x, y) = { xy/9 ; 0 < x < 2, 0 < y < 3
             { 0 ; elsewhere
applies to two random variables X and Y. Categorize whether X and Y are statistically independent or not.
6. Given
f_XY(x, y) = { (x + y)²/40 ; −1 < x < 1, −3 < y < 3
             { 0 ; elsewhere
Determine the variances of X and Y.
11. Find and sketch the density of W = X + Y, if X and Y are statistically independent and have the marginal densities
f_X(x) = (1/a)[u(x) − u(x − a)],  f_Y(y) = (1/b)[u(y) − u(y − b)],  assuming b > a.
1. Calculate the value of 'b' and also the joint distribution function for the given joint
density function.
𝑏(𝑥 + 𝑦)2 ; −2 < 𝑥 ≤ 2 , −3 < 𝑦 ≤ 3
𝑓𝑋𝑌 (𝑥, 𝑦) = {
0 ; 𝑒𝑙𝑠𝑒𝑤ℎ𝑒𝑟𝑒
Sol: Given,
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
⇒ ∫_{−2}^{2} ∫_{−3}^{3} b(x + y)² dy dx = 1
⇒ b ∫_{−2}^{2} ∫_{−3}^{3} (x² + y² + 2xy) dy dx = 1
⇒ b ∫_{−2}^{2} [3x² + 27/3 + 9x − (−3x² − 27/3 + 9x)] dx = 1
⇒ b ∫_{−2}^{2} [6x² + 18] dx = 1
⇒ 6b ∫_{−2}^{2} [x² + 3] dx = 1
⇒ 6b [8/3 + 6 − (−8/3 − 6)] = 1
⇒ 6b [(16 + 36)/3] = 1
⇒ 2b [52] = 1
⇒ 104b = 1
∴ b = 1/104
Calculation of the joint distribution function:
We know that,
F_XY(x, y) = ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(x, y) dx dy
= ∫_{−3}^{y} ∫_{−2}^{x} b(x + y)² dx dy
= b ∫_{−2}^{x} ∫_{−3}^{y} (x² + y² + 2xy) dy dx
= b ∫_{−2}^{x} [x²y + y³/3 + xy² − (−3x² − 27/3 + 9x)] dx
= b [x³y/3 + xy³/3 + x²y²/2 + x³ + 9x − 9x²/2 − (−8y/3 − 2y³/3 + 2y² − 8 − 18 − 18)]
∴ F_XY(x, y) = (1/104)[ (x³ + 8)y/3 + (x + 2)y³/3 + (y² − 9)x²/2 + x³ + 9x − 2y² + 44 ]
for −2 < x ≤ 2, −3 < y ≤ 3.
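A quick numerical cross-check of this result (a sketch using scipy, not part of the original solution): integrating (x + y)² over the region gives 104, so b = 1/104, and the closed-form F_XY equals 1 at the upper-right corner of the region, as a joint distribution function must.

```python
# Cross-check of Problem 1: the normalizing constant b and F_XY at the corner (2, 3).
from scipy.integrate import dblquad

# Integral of (x + y)^2 over -2 < x <= 2, -3 < y <= 3 (should be 104, so b = 1/104)
total, _ = dblquad(lambda y, x: (x + y) ** 2, -2, 2, lambda x: -3, lambda x: 3)
print("integral of (x+y)^2 over the region =", round(total, 6))

def F_xy(x, y, b=1 / 104):
    """Closed-form F_XY(x, y) derived above (valid for -2 < x <= 2, -3 < y <= 3)."""
    return b * ((x**3 + 8) * y / 3 + (x + 2) * y**3 / 3
                + (y**2 - 9) * x**2 / 2 + x**3 + 9 * x - 2 * y**2 + 44)

print("F_XY(2, 3) =", F_xy(2, 3))   # should equal 1 at the upper-right corner
```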
The joint density function of two random variables X and Y is f_XY(x, y) = x + y for 0 < x < 1, 0 < y < 1 and zero elsewhere; find the conditional density functions f_X(x/y) and f_Y(y/x).
Sol: Calculation of the marginal density functions:
f_X(x) = ∫_{0}^{1} (x + y) dy
= [xy + y²/2]₀¹
= x + 1/2
∴ f_X(x) = (2x + 1)/2
And
f_Y(y) = ∫_{0}^{1} (x + y) dx
= [x²/2 + xy]₀¹
= y + 1/2
∴ f_Y(y) = (2y + 1)/2
∴ The conditional density functions are
f_X(x/y) = f_XY(x, y) / f_Y(y) = 2(x + y) / (2y + 1)
f_Y(y/x) = f_XY(x, y) / f_X(x) = 2(x + y) / (2x + 1)
3. Two random variables are related as Y = −4X + 20, and the mean of X is 4.
Check whether the given random variables are statistically independent or
not, when the variance of X is 2.
Sol: Given,
𝑌 = −4𝑋 + 20
𝐸 [𝑋] = 4
𝜎𝑋 2 = 2
If two random variables are said to be statistically independent then,
𝑬[𝑿𝒀] = 𝑬[𝑿]𝑬[𝒀]
Calculation of 𝐸 [𝑌]:
𝐸 [𝑌] = 𝐸 [−4𝑋 + 20]
= −4𝐸 [𝑋] + 20
= −4(4) + 20
= −16 + 20
∴ 𝐸 [𝑌] = 4
Evaluation of E[X²]:
We know that,
𝝈 𝑿 𝟐 = 𝒎𝟐 − 𝒎𝟏 𝟐
⇒ 2 = 𝑚2 − 42
⇒ 𝑚2 = 16 + 2
∴ 𝐸 [𝑋 2 ] = 18
Calculation of E[XY]:
𝐸 [𝑋𝑌 ] = 𝐸 [𝑋(−4𝑋 + 20)]
= 𝐸 [−4𝑋 2 + 20𝑋]
= −4𝐸 [𝑋2 ] + 20𝐸 [𝑋]
= −4(18) + 20(4)
∴ 𝐸 [𝑋𝑌] = 8
Now, 𝐸 [𝑋]𝐸 [𝑌] = 4 × 4 = 16
∴ 𝑬[𝑿𝒀] ≠ 𝑬[𝑿]𝑬[𝒀]
Hence, the given random variables 𝑋 and 𝑌 are not statistically independent.
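The same conclusion can be reached by simulation; the sketch below draws samples of X with mean 4 and variance 2 (a Gaussian shape is assumed purely for illustration, since only the moments matter here) and compares E[XY] with E[X]E[Y].

```python
# Monte-Carlo sketch of Problem 3: E[XY] vs E[X]E[Y] for Y = -4X + 20.
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=4.0, scale=np.sqrt(2.0), size=200_000)   # mean 4, variance 2 (assumed Gaussian)
y = -4 * x + 20

print("E[XY]    ≈", round(np.mean(x * y), 1))        # expect ≈ 8
print("E[X]E[Y] ≈", round(x.mean() * y.mean(), 1))   # expect ≈ 16
# Since E[XY] != E[X]E[Y], X and Y cannot be statistically independent.
```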
4. Two random variables 𝑋 and 𝑌 have the joint PDF
𝐴𝑒 −(2𝑥+𝑦) ; 𝑥, 𝑦 ≥ 0
𝑓𝑋𝑌 (𝑥, 𝑦) = {
0 ; 𝑜𝑡ℎ𝑒𝑟𝑤𝑖𝑠𝑒
Evaluate (i) 𝐴 (ii) Marginal pdfs 𝑓𝑋 (𝑥 ) & 𝑓𝑌 (𝑦)
Sol:Given,
𝐴𝑒 −(2𝑥+𝑦) ; 𝑥, 𝑦 ≥ 0
𝑓𝑋𝑌 (𝑥, 𝑦) = {
0 ; 𝑜𝑡ℎ𝑒𝑟𝑤𝑖𝑠𝑒
(i)Calculation of “𝐴” value:
We know that,
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
⇒ ∫_{0}^{∞} ∫_{0}^{∞} A e^{−(2x+y)} dy dx = 1
⇒ A ∫_{0}^{∞} [0 − e^{−2x}/(−1)] dx = 1
⇒ A ∫_{0}^{∞} e^{−2x} dx = 1
⇒ A [0 − (−1/2)] = 1
⇒ A/2 = 1
∴ A = 2
Now,
f_X(x) = ∫_{0}^{∞} A e^{−(2x+y)} dy
= A [e^{−(2x+y)}/(−1)]₀^∞
= 2 [0 − e^{−2x}/(−1)]
∴ f_X(x) = 2e^{−2x}
Now,
f_Y(y) = ∫_{0}^{∞} A e^{−(2x+y)} dx
= A [e^{−(2x+y)}/(−2)]₀^∞
= 2 [0 − e^{−y}/(−2)]
∴ f_Y(y) = e^{−y}
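The value of A and the two marginals can also be obtained symbolically; the sympy sketch below simply repeats the normalization and marginalization steps of this problem.

```python
# Symbolic re-derivation of Problem 4 with sympy (illustrative sketch).
import sympy as sp

x, y, A = sp.symbols('x y A', positive=True)
f_xy = A * sp.exp(-(2 * x + y))

total = sp.integrate(f_xy, (x, 0, sp.oo), (y, 0, sp.oo))   # normalization integral = A/2
A_val = sp.solve(sp.Eq(total, 1), A)[0]
print("A      =", A_val)                                   # 2

f_x = sp.integrate(f_xy.subs(A, A_val), (y, 0, sp.oo))     # marginal of X
f_y = sp.integrate(f_xy.subs(A, A_val), (x, 0, sp.oo))     # marginal of Y
print("f_X(x) =", sp.simplify(f_x))                        # 2*exp(-2*x)
print("f_Y(y) =", sp.simplify(f_y))                        # exp(-y)
```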
Given the joint density function f_XY(x, y) = c(2x + y) for 0 < x < 1, 0 < y < 2 and zero elsewhere, determine (i) the value of c and (ii) the marginal density functions.
Sol: (i) Calculation of the value of c:
We know that,
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
⇒ ∫_{0}^{1} ∫_{0}^{2} c(2x + y) dy dx = 1
⇒ c ∫_{0}^{1} (4x + 2) dx = 1
⇒ c[2 + 2] = 1
∴ c = 1/4
(ii) Calculation of the marginal density functions:
Formulas:
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Now,
f_X(x) = ∫_{0}^{2} c(2x + y) dy
= c[2xy + y²/2]₀²
= (1/4)(4x + 2)
∴ f_X(x) = (2x + 1)/2
And
f_Y(y) = ∫_{0}^{1} c(2x + y) dx
= c[2x²/2 + xy]₀¹
= (1/4)[1 + y]
∴ f_Y(y) = (y + 1)/4
The joint density function
f_XY(x, y) = { xy/9 ; 0 < x < 2, 0 < y < 3
             { 0 ; elsewhere
applies to two random variables X and Y. Check whether X and Y are statistically independent or not.
Sol: Given,
f_XY(x, y) = { xy/9 ; 0 < x < 2, 0 < y < 3
             { 0 ; elsewhere
Condition for statistical independence: 𝒇𝑿𝒀 (𝒙, 𝒚) = 𝒇𝑿 (𝒙)𝒇𝒀 (𝒚)
Calculation of Marginal pdfs:
Formulas: f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Now,
f_X(x) = ∫_{0}^{3} (xy/9) dy
= (1/9)[x(y²/2)]₀³
= (1/9)[9x/2]
∴ f_X(x) = x/2
Now,
f_Y(y) = ∫_{0}^{2} (xy/9) dx
= (1/9)[(x²/2)y]₀²
= (1/9)[2y]
∴ f_Y(y) = 2y/9
∴ f_X(x) f_Y(y) = (x/2)(2y/9)
= xy/9
∴ 𝒇𝑿 (𝒙)𝒇𝒀 (𝒚) = 𝒇𝑿𝒀 (𝒙, 𝒚)
Hence, the random variables 𝑋 and 𝑌 are statistically independent.
Given the joint density function f_XY(x, y) = (5/16) x² y for 0 < x < 2, 0 < y < 2 and zero elsewhere, check whether X and Y are statistically independent or not.
Sol: Calculation of the marginal density functions:
Now,
f_X(x) = ∫_{0}^{2} (5/16) x² y dy
= (5/16)[x²(y²/2)]₀²
= (5/16)[2x²]
∴ f_X(x) = (5/8) x²
Now,
f_Y(y) = ∫_{0}^{2} (5/16) x² y dx
= (5/16)[(x³/3)y]₀²
= (5/16)[8y/3]
∴ f_Y(y) = 5y/6
∴ f_X(x) f_Y(y) = ((5/8) x²)(5y/6)
= (25/48) x² y
∴ 𝒇𝑿 (𝒙)𝒇𝒀 (𝒚) ≠ 𝒇𝑿𝒀 (𝒙, 𝒚)
Hence, the given two random variables are not statistically independent.
Given the joint density function f_XY(x, y) = x + y for 0 < x < 2, 0 < y < 1 and zero elsewhere, check whether X and Y are statistically independent and determine the correlation coefficient.
Sol: Given,
f_XY(x, y) = { x + y ; 0 < x < 2, 0 < y < 1
             { 0 ; elsewhere
Condition for statistical independence: 𝒇𝑿𝒀 (𝒙, 𝒚) = 𝒇𝑿 (𝒙)𝒇𝒀 (𝒚)
Calculation of Marginal pdfs:
Formulas: f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Now,
f_X(x) = ∫_{0}^{1} (x + y) dy
= [xy + y²/2]₀¹
∴ f_X(x) = x + 1/2
Now,
f_Y(y) = ∫_{0}^{2} (x + y) dx
= [x²/2 + xy]₀²
∴ f_Y(y) = 2y + 2
∴ f_X(x) f_Y(y) = (x + 1/2)(2y + 2)
= 2xy + 2x + y + 1
∴ 𝒇𝑿 (𝒙)𝒇𝒀 (𝒚) ≠ 𝒇𝑿𝒀 (𝒙, 𝒚)
Hence, 𝑋 and 𝑌 are not statistically independent.
Calculation of the correlation coefficient:
Formulas:
ρ = μ11 / √(μ20 μ02) = C_XY / (σX σY)
C_XY = R_XY − E[X]E[Y]
To find R_XY:
R_XY = E[XY] = ∫_{0}^{1} ∫_{0}^{2} xy(x + y) dx dy
= ∫_{0}^{1} ∫_{0}^{2} (x²y + xy²) dx dy
= ∫_{0}^{1} (8y/3 + 2y²) dy
= 4/3 + 2/3
∴ R_XY = 2
To find E[X]:
E[X] = ∫_{0}^{2} x f_X(x) dx = ∫_{0}^{2} x(x + 1/2) dx = [x³/3 + x²/4]₀² = 8/3 + 1
∴ E[X] = 11/3
To find E[Y]:
E[Y] = ∫_{0}^{1} y f_Y(y) dy = ∫_{0}^{1} y(2y + 2) dy
= 2 ∫_{0}^{1} (y² + y) dy
= 2 [y³/3 + y²/2]₀¹
= 2 [5/6]
∴ E[Y] = 5/3
Evaluation of C_XY:
C_XY = R_XY − E[X]E[Y]
= 2 − (11/3)(5/3)
= 2 − 6.1111
∴ C_XY = −4.1111
To find E[X²]:
E[X²] = ∫_{0}^{2} x² f_X(x) dx = ∫_{0}^{2} x²(x + 1/2) dx = [x⁴/4 + x³/6]₀² = 4 + 4/3
∴ E[X²] = 16/3
To find E[Y²]:
E[Y²] = ∫_{0}^{1} y² f_Y(y) dy = ∫_{0}^{1} y²(2y + 2) dy
= 2 ∫_{0}^{1} [y³ + y²] dy
= 2 [y⁴/4 + y³/3]₀¹
= 2 [1/4 + 1/3]
∴ E[Y²] = 7/6
Evaluation of σX²:
σX² = E[X²] − (E[X])²
σX² = 16/3 − (11/3)²
∴ σX² = −73/9
Evaluation of σY²:
σY² = E[Y²] − (E[Y])²
σY² = 7/6 − (5/3)²
∴ σY² = −29/18
Correlation coefficient:
ρ = C_XY / (σX σY)
= −4.1111 / √((−73/9)(−29/18))
= −4.1111 / 3.615
∴ ρ = −1.137
9. Given
f_XY(x, y) = { (x + y)²/40 ; −1 < x < 1, −3 < y < 3
             { 0 ; elsewhere
Determine the variances of X and Y.
Sol: Given,
f_XY(x, y) = { (x + y)²/40 ; −1 < x < 1, −3 < y < 3
             { 0 ; elsewhere
Calculation of the marginal density functions:
Formulas:
f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy,  f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx
Now,
f_X(x) = ∫_{−3}^{3} [(x + y)²/40] dy = [(x + y)³/120]_{−3}^{3} = [(x + 3)³ − (x − 3)³]/120 = (18x² + 54)/120
∴ f_X(x) = (3x² + 9)/20
f_Y(y) = ∫_{−1}^{1} [(x + y)²/40] dx = [(x + y)³/120]_{−1}^{1} = [(y + 1)³ − (y − 1)³]/120 = (6y² + 2)/120
∴ f_Y(y) = (3y² + 1)/60
Variance of X:
Since f_X(x) is an even function of x, m₁ = E[X] = 0, and
m₂ = E[X²] = ∫_{−1}^{1} x²(3x² + 9)/20 dx = 0.36
σX² = m₂ − m₁² = 0.36 − 0²
∴ σX² = 0.36
Variance of Y:
Since f_Y(y) is an even function of y, m₁ = E[Y] = 0, and
m₂ = E[Y²] = ∫_{−3}^{3} y²(3y² + 1)/60 dy = 5.16
σY² = m₂ − m₁²
σY² = 5.16 − 0²
∴ σY² = 5.16
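The two variances can be verified numerically; the scipy sketch below computes the first and second moments of X and Y directly from the joint density.

```python
# Numerical verification of the variances in Problem 9 (illustrative sketch).
from scipy.integrate import dblquad

def f_xy(x, y):
    """Joint density (x + y)^2 / 40 on -1 < x < 1, -3 < y < 3."""
    return (x + y) ** 2 / 40

def moment(g):
    """E[g(X, Y)] computed as a double integral over the joint density."""
    val, _ = dblquad(lambda y, x: g(x, y) * f_xy(x, y),
                     -1, 1, lambda x: -3, lambda x: 3)
    return val

EX, EX2 = moment(lambda x, y: x), moment(lambda x, y: x * x)
EY, EY2 = moment(lambda x, y: y), moment(lambda x, y: y * y)
print("var(X) =", round(EX2 - EX ** 2, 4))   # expect 0.36
print("var(Y) =", round(EY2 - EY ** 2, 4))   # expect 5.16
```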
∴ R_XW = 3.25
(iii) R_YW:
R_YW = E[YW]
= E[Y(X − 2Y + 1)]
= E[YX − 2Y² + Y]
= E[Y]E[X] − 2E[Y²] + E[Y]
= (1)(3/4) − 2(5) + 1
= −33/4
∴ R_YW = −8.25
12. Two random variables 𝑋 and 𝑌 have means 𝑋̅ = 1, 𝑌̅ = 3 and
variances 𝜎𝑋 2 = 4 and 𝜎𝑌 2 = 1 and correlation coefficient 𝜌𝑋𝑌 =0.4. New
random variables 𝑊 and 𝑉 are defined such that 𝑊 = 𝑋 + 3𝑌 and 𝑉 =
−𝑋 + 2𝑌.
Find (i) the mean and (ii) the variance of W and V.
Sol: Given,
𝑋̅ = 1,𝑌̅ = 3
𝜎𝑋 2 = 4, 𝜎𝑌 2 = 1
𝜌𝑋𝑌 = 0.4 and also
𝑊 = 𝑋 + 3𝑌, 𝑉 = −𝑋 + 2𝑌
(i) Mean of W:
E[W] = W̄ = E[X + 3Y]
= E[X] + 3E[Y]
= 𝑋̅ + 3𝑌̅
= 1 + 3(3)
∴ 𝑬[𝑾] = 𝟏𝟎
Mean of V:
E[V] = V̄ = E[−X + 2Y]
= − 𝐸 [𝑋] + 2𝐸 [𝑌]
= −𝑋̅ + 2𝑌̅
= −1 + 2(3)
∴ 𝑬[𝑽] = 𝟓
(ii) Variance of W and V:
Given,
ρ_XY = 0.4, and ρ_XY = C_XY / (σX σY)
⇒ C_XY / (σX σY) = 0.4
⇒ (R_XY − E[X]E[Y]) / (σX σY) = 0.4
⇒ E[XY] = 0.4(σX σY) + E[X]E[Y]
⇒ R_XY = E[XY] = 0.4(2 × 1) + (1)(3)
∴ R_XY = E[XY] = 3.8
Now,
𝜎𝑋 2 = 𝑚2 − 𝑚12
⇒ 𝑚2 = 𝜎𝑋 2 + 𝑚12
⇒ 𝑚2 = 4 + 1
∴ 𝐸 [𝑋 2 ] = 5
And
𝜎𝑌 2 = 𝑚2 − 𝑚12
⇒ 𝑚2 = 𝜎𝑌 2 + 𝑚12
⇒ 𝑚2 = 1 + 9
∴ 𝐸 [𝑌 2 ] = 10
To find the variance of W:
Now, the second moment of W is
𝐸 [𝑊 2 ] = 𝐸 [(𝑋 + 3𝑌)2]
= 𝐸 [𝑋2 + 9𝑌 2 + 6𝑋𝑌 ]
= 𝐸 [𝑋 2 ] + 9 𝐸 [𝑌 2 ] + 6𝐸 [𝑋𝑌]
= 5 + 90 + 6(3.8)
∴ 𝐸 [𝑊 2 ] = 117.8
Variance of 𝑊:
𝜎𝑊 2 = 𝑚2 − 𝑚12
= 𝐸 [𝑊 2 ] − 𝐸 [𝑊 ]2
= 117.8 − 102
∴ 𝝈𝑾 𝟐 = 𝟏𝟕. 𝟖
To find the variance of V:
Now, the second moment of V is
𝐸 [𝑉 2 ] = 𝐸 [(−𝑋 + 2𝑌)2]
= 𝐸 [𝑋2 + 4𝑌 2 − 4𝑋𝑌 ]
= 𝐸 [𝑋 2 ] + 4 𝐸 [𝑌 2 ] − 4𝐸 [𝑋𝑌]
= 5 + 40 − 4(3.8)
∴ 𝐸 [𝑉 2 ] = 29.8
Variance of 𝑉:
𝜎𝑉 2 = 𝑚2 − 𝑚12
= 𝐸 [𝑉 2 ] − 𝐸 [𝑉 ]2
= 29.8 − 52
∴ 𝝈𝑽 𝟐 = 𝟒. 𝟖
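As a closing check (a sketch only), the variances of W and V also follow directly from the linear-combination identity var[aX + bY] = a²·var[X] + b²·var[Y] + 2ab·C_XY, which is not derived in these notes but is consistent with the covariance properties above.

```python
# Quick check of Problem 12 using var[aX + bY] = a^2*var[X] + b^2*var[Y] + 2ab*C_XY.
sigma_x2, sigma_y2, rho = 4.0, 1.0, 0.4
c_xy = rho * sigma_x2 ** 0.5 * sigma_y2 ** 0.5          # C_XY = rho*sigma_X*sigma_Y = 0.8

var_w = 1**2 * sigma_x2 + 3**2 * sigma_y2 + 2 * 1 * 3 * c_xy        # W = X + 3Y
var_v = (-1)**2 * sigma_x2 + 2**2 * sigma_y2 + 2 * (-1) * 2 * c_xy  # V = -X + 2Y

print("var(W) =", var_w)   # 17.8
print("var(V) =", var_v)   # 4.8
```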