
UNIT-II
MULTIPLE RANDOM VARIABLES

Multiple Random Variables: Vector random variables, Joint distribution function and properties, Marginal distribution functions, Joint density function and properties, Marginal density functions, Joint conditional distribution and density functions, Statistical independence, Distribution and density of a sum of random variables, Central limit theorem.
Operations on Multiple Random Variables: Expected value of a function of random variables, Joint moments about the origin, Correlation, Joint central moments, Covariance, Correlation coefficient, Joint characteristic function and properties, Jointly Gaussian random variables (two and N random variables), properties.

VECTOR RANDOM VARIABLES


Joint Probability distribution function


Consider two random variables X and Y taking values {x} and {y} in the xy plane. Define the two events A = {X ≤ x} and B = {Y ≤ y}. The joint probability distribution function gives the probability that both events occur jointly:

   F_XY(x, y) = P{X ≤ x, Y ≤ y} = P(A ∩ B)

Properties of the joint probability distribution function

1. i) F_XY(−∞, −∞) = 0
   ii) F_XY(x, −∞) = 0
   iii) F_XY(−∞, y) = 0

Proof:
It is known that F_XY(x, y) = P{X ≤ x, Y ≤ y}.

i) F_XY(−∞, −∞) = P{X ≤ −∞, Y ≤ −∞} = P{X ≤ −∞ ∩ Y ≤ −∞} = 0
   (The event {X ≤ −∞} is the impossible (null) event, so the intersection has probability zero.)

ii) F_XY(x, −∞) = P{X ≤ x, Y ≤ −∞} = P{X ≤ x ∩ Y ≤ −∞} = 0

iii) F_XY(−∞, y) = P{X ≤ −∞, Y ≤ y} = P{X ≤ −∞ ∩ Y ≤ y} = 0


2. 𝐹𝑋𝑌 (∞, ∞) = 1
Proof:
It is known that
𝐹𝑋𝑌 (𝑥, 𝑦) = P{𝑋 ≤ 𝑥, 𝑌 ≤ 𝑦}
𝐹𝑋𝑌 (∞, ∞) = P{𝑋 ≤ ∞, 𝑌 ≤ ∞}
= P{𝑋 ≤ ∞ ∩ 𝑌 ≤ ∞ }
= P{𝑆 ∩ 𝑆} = P{𝑆} = 1
𝐹𝑋𝑌 (∞, ∞) = 1

3. The joint probability distribution function always lies between 0 and 1:

   0 ≤ F_XY(x, y) ≤ 1
4. Marginal distribution functions

   F_X(x) = F_XY(x, ∞)
   F_Y(y) = F_XY(∞, y)

Proof:
It is known that F_XY(x, y) = P{X ≤ x, Y ≤ y}.

   F_XY(x, ∞) = P{X ≤ x, Y ≤ ∞} = P{X ≤ x ∩ Y ≤ ∞} = P{X ≤ x ∩ S} = P{X ≤ x}
   ∴ F_XY(x, ∞) = F_X(x)

   F_XY(∞, y) = P{X ≤ ∞, Y ≤ y} = P{S ∩ Y ≤ y} = P{Y ≤ y}
   ∴ F_XY(∞, y) = F_Y(y)


Joint Probability Density Function:

It gives information about the joint occurrence of events at given values of X and Y.
The joint probability density function is the second partial derivative of the joint distribution function:

   f_XY(x, y) = ∂²F_XY(x, y) / (∂x ∂y)

When X and Y are discrete random variables, the joint density function is

   f_XY(x, y) = Σ_{i=1}^{m} Σ_{j=1}^{n} P(X = x_i, Y = y_j) δ(x − x_i, y − y_j)

Properties of the joint probability density function

1. The joint probability density function is a non-negative quantity:

   f_XY(x, y) ≥ 0

Proof:
From the definition, f_XY(x, y) = ∂²F_XY(x, y) / (∂x ∂y). Since the distribution function is a non-decreasing function of x and y, its slope is always non-negative. Hence the joint probability density function is a non-negative quantity.

2. The total area under the joint probability density function is unity:

   ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1

Proof:
We know that f_XY(x, y) = ∂²F_XY(x, y) / (∂x ∂y).

Integrate both sides with respect to x:

   ∫_{−∞}^{∞} f_XY(x, y) dx = ∫_{−∞}^{∞} ∂²F_XY(x, y)/(∂x ∂y) dx
                            = (∂/∂y) ∫_{−∞}^{∞} (∂/∂x) F_XY(x, y) dx
                            = (∂/∂y) [F_XY(x, y)]_{x=−∞}^{∞}
                            = (∂/∂y) (F_XY(∞, y) − F_XY(−∞, y))
                            = (∂/∂y) F_Y(y)

Now integrate both sides with respect to y:

   ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = ∫_{−∞}^{∞} (∂/∂y) F_Y(y) dy
                                           = [F_Y(y)]_{−∞}^{∞}
                                           = F_Y(∞) − F_Y(−∞) = 1 − 0 = 1

3. The joint probability distribution function can be obtained from the knowledge of the joint density function:

   F_XY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(x, y) dy dx

Proof:
We know that f_XY(x, y) = ∂²F_XY(x, y) / (∂x ∂y).

Integrating both sides with respect to x:

   ∫_{−∞}^{x} f_XY(x, y) dx = ∫_{−∞}^{x} ∂²F_XY(x, y)/(∂x ∂y) dx
                            = (∂/∂y) ∫_{−∞}^{x} (∂/∂x) F_XY(x, y) dx
                            = (∂/∂y) [F_XY(x, y)]_{−∞}^{x}
                            = (∂/∂y) (F_XY(x, y) − F_XY(−∞, y))
                            = (∂/∂y) F_XY(x, y)

Integrating both sides with respect to y:

   ∫_{−∞}^{y} ∫_{−∞}^{x} f_XY(x, y) dx dy = ∫_{−∞}^{y} (∂/∂y) F_XY(x, y) dy
                                          = [F_XY(x, y)]_{−∞}^{y}
                                          = F_XY(x, y) − F_XY(x, −∞)
                                          = F_XY(x, y)

Hence  F_XY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(x, y) dy dx.

4. The probability of the event {x₁ < X ≤ x₂, y₁ < Y ≤ y₂} can be obtained from the knowledge of the joint density function:

   P{x₁ < X ≤ x₂, y₁ < Y ≤ y₂} = ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_XY(x, y) dy dx

Proof:
Consider

   ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} f_XY(x, y) dy dx = ∫_{x₁}^{x₂} ∫_{y₁}^{y₂} ∂²F_XY(x, y)/(∂x ∂y) dy dx

Changing the order of integration,

   = ∫_{y₁}^{y₂} (∂/∂y) ∫_{x₁}^{x₂} (∂/∂x) F_XY(x, y) dx dy
   = ∫_{y₁}^{y₂} (∂/∂y) [F_XY(x, y)]_{x₁}^{x₂} dy
   = ∫_{y₁}^{y₂} (∂/∂y) (F_XY(x₂, y) − F_XY(x₁, y)) dy
   = [F_XY(x₂, y) − F_XY(x₁, y)]_{y₁}^{y₂}
   = F_XY(x₂, y₂) − F_XY(x₁, y₂) − F_XY(x₂, y₁) + F_XY(x₁, y₁)

Marginal density functions

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy

   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Proof:
We know that f_XY(x, y) = ∂²F_XY(x, y) / (∂x ∂y).

Integrate both sides with respect to x:

   ∫_{−∞}^{∞} f_XY(x, y) dx = ∫_{−∞}^{∞} ∂²F_XY(x, y)/(∂x ∂y) dx
                            = (∂/∂y) ∫_{−∞}^{∞} (∂/∂x) F_XY(x, y) dx
                            = (∂/∂y) [F_XY(x, y)]_{x=−∞}^{∞}
                            = (∂/∂y) (F_XY(∞, y) − F_XY(−∞, y))
                            = (∂/∂y) F_Y(y)
                            = f_Y(y)

Hence  f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx.

Integrate both sides with respect to y:

   ∫_{−∞}^{∞} f_XY(x, y) dy = ∫_{−∞}^{∞} ∂²F_XY(x, y)/(∂x ∂y) dy
                            = (∂/∂x) ∫_{−∞}^{∞} (∂/∂y) F_XY(x, y) dy
                            = (∂/∂x) [F_XY(x, y)]_{y=−∞}^{∞}
                            = (∂/∂x) (F_XY(x, ∞) − F_XY(x, −∞))
                            = (∂/∂x) F_X(x)
                            = f_X(x)

Hence  f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy.
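As a quick numerical illustration (a sketch, not part of the original derivation), the marginals can be obtained by integrating the joint density numerically. The density f_XY(x, y) = x + y on the unit square is an assumed example (it also appears in solved problem 2 below).

```python
# Sketch: marginal densities by numerical integration of an assumed joint density
# f_XY(x, y) = x + y on 0 <= x, y <= 1 (the same density used in solved problem 2).
from scipy import integrate

def f_xy(x, y):
    return x + y if (0.0 <= x <= 1.0 and 0.0 <= y <= 1.0) else 0.0

def f_x(x):
    # f_X(x) = integral of f_XY(x, y) over y
    return integrate.quad(lambda y: f_xy(x, y), 0.0, 1.0)[0]

def f_y(y):
    # f_Y(y) = integral of f_XY(x, y) over x
    return integrate.quad(lambda x: f_xy(x, y), 0.0, 1.0)[0]

print(f_x(0.3))   # analytically x + 1/2 = 0.8
print(f_y(0.7))   # analytically y + 1/2 = 1.2
```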

CONDITIONAL PROBABILITY

It is the probability of an event A given the occurrence of another event B.
Let A and B be two events. The conditional probability of A given B is

   P(A/B) = P(A ∩ B) / P(B)

Similarly, the conditional probability of B given A is

   P(B/A) = P(A ∩ B) / P(A)

CONDITIONAL DISTRIBUTION FUNCTION

The concept of conditional probability extends to random variables as well.
Let X be a random variable and B an event with P(B) > 0. The conditional distribution function is defined as

   F_X(x/B) = P{X ≤ x / B} = P{(X ≤ x) ∩ B} / P(B)

Properties of the conditional distribution function

1. F_X(−∞/B) = 0
2. F_X(∞/B) = 1
3. 0 ≤ F_X(x/B) ≤ 1
4. F_X(x₂/B) ≥ F_X(x₁/B) when x₂ > x₁
5. P{(x₁ < X ≤ x₂)/B} = F_X(x₂/B) − F_X(x₁/B)

CONDITIONAL DENSITY FUNCTION

The derivative of the conditional distribution function is called the conditional density function. It gives the conditional probability density of the random variable at a specific value.

   f_X(x/B) = d F_X(x/B) / dx

Properties
1. f_X(x/B) ≥ 0
2. ∫_{−∞}^{∞} f_X(x/B) dx = 1
3. F_X(x/B) = ∫_{−∞}^{x} f_X(x/B) dx
4. P{x₁ < X ≤ x₂ / B} = ∫_{x₁}^{x₂} f_X(x/B) dx

JOINT CONDITIONAL DISTRIBUTION AND DENSITY FUNCTIONS

Two types of conditioning are defined, based on the values taken by the conditioning random variable Y:
1. Point conditioning: Y takes a single value.
2. Interval conditioning: Y takes a range of values.

Point conditioning:

Here the conditioning event is that Y lies in a small interval about a single value y:

   B = {y − Δy ≤ Y ≤ y + Δy}

Using  P(A/B) = P(A ∩ B)/P(B)  and the definition of the conditional distribution function,

   F_X(x / y − Δy ≤ Y ≤ y + Δy) = P{(X ≤ x) ∩ (y − Δy ≤ Y ≤ y + Δy)} / P(y − Δy ≤ Y ≤ y + Δy)

Since the distribution function is the integral of the density function,

   F_X(x / y − Δy ≤ Y ≤ y + Δy) = [∫_{y−Δy}^{y+Δy} ∫_{−∞}^{x} f_XY(x, y) dx dy] / [∫_{y−Δy}^{y+Δy} f_Y(y) dy]

As Δy → 0 the integrands are approximately constant over the interval of width 2Δy, so

   F_X(x / Y = y) = [2Δy ∫_{−∞}^{x} f_XY(x, y) dx] / [2Δy f_Y(y)] = [∫_{−∞}^{x} f_XY(x, y) dx] / f_Y(y)

The conditional density function is obtained by differentiating with respect to x:

   f_X(x/Y) = d F_X(x/Y) / dx = f_XY(x, y) / f_Y(y)

Similarly,

   f_Y(y/X) = f_XY(x, y) / f_X(x)
Interval conditioning:

Here the conditioning event is that Y lies in a range of values, y_a ≤ Y ≤ y_b. Then

   F_X(x / y_a ≤ Y ≤ y_b) = [∫_{y_a}^{y_b} ∫_{−∞}^{x} f_XY(x, y) dx dy] / [∫_{y_a}^{y_b} f_Y(y) dy]

Differentiating with respect to x gives the conditional density

   f_X(x / y_a ≤ Y ≤ y_b) = [∫_{y_a}^{y_b} f_XY(x, y) dy] / [∫_{y_a}^{y_b} f_Y(y) dy]
                          = [∫_{y_a}^{y_b} f_XY(x, y) dy] / [∫_{y_a}^{y_b} ∫_{−∞}^{∞} f_XY(x, y) dx dy]

Statistical independence of random variables:

Consider two random variables X and Y and define the events A = {X ≤ x} and B = {Y ≤ y} for two real numbers x and y. The two random variables are said to be statistically independent if

   P{X ≤ x, Y ≤ y} = P{X ≤ x} P{Y ≤ y}

In terms of distribution functions:
   F_XY(x, y) = F_X(x) F_Y(y)

In terms of density functions:
   f_XY(x, y) = f_X(x) f_Y(y)

In terms of conditional density functions:

   f_X(x/y) = f_XY(x, y) / f_Y(y) = f_X(x) f_Y(y) / f_Y(y) = f_X(x)

   f_Y(y/x) = f_XY(x, y) / f_X(x) = f_X(x) f_Y(y) / f_X(x) = f_Y(y)
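A minimal numerical sketch of the factorization test follows; it uses the density xy/9 from solved problem 6 below, so the values are only illustrative.

```python
# Sketch: numerical check of the factorization f_XY(x, y) = f_X(x) f_Y(y) for the
# density xy/9 on 0 < x < 2, 0 < y < 3 (the density of solved problem 6).
from scipy import integrate

f_xy = lambda x, y: x * y / 9.0

def f_x(x):
    return integrate.quad(lambda y: f_xy(x, y), 0.0, 3.0)[0]   # analytically x/2

def f_y(y):
    return integrate.quad(lambda x: f_xy(x, y), 0.0, 2.0)[0]   # analytically 2y/9

x0, y0 = 1.3, 2.1
print(f_xy(x0, y0), f_x(x0) * f_y(y0))   # both ≈ 0.3033, consistent with independence
```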

DISTRIBUTION AND DENSITY OF A SUM OF RANDOM VARIABLES

In practical applications, the received signal is the sum of the desired original signal and noise. In such a case, knowing the probability description of the combined signal helps in analysing the communication system.

Let W be a random variable equal to the sum of two random variables X and Y:

   W = X + Y

The distribution function of W is the probability of the region of the xy plane where x + y ≤ w:

   F_W(w) = P{X + Y ≤ w} = ∫_{−∞}^{∞} ∫_{−∞}^{w−y} f_XY(x, y) dx dy

When X and Y are statistically independent, f_XY(x, y) = f_X(x) f_Y(y), so

   F_W(w) = ∫_{−∞}^{∞} f_Y(y) ∫_{−∞}^{w−y} f_X(x) dx dy

Differentiating with respect to w on both sides (using Leibniz's rule; only the upper limit depends on w):

   f_W(w) = d F_W(w)/dw = ∫_{−∞}^{∞} f_Y(y) (d/dw) ∫_{−∞}^{w−y} f_X(x) dx dy
          = ∫_{−∞}^{∞} f_Y(y) f_X(w − y) dy

The above expression is recognized as a convolution integral:

   f_W(w) = f_X(x) ∗ f_Y(y)

The density function of the sum of two statistically independent random variables is the convolution of their individual density functions. If there are n statistically independent random variables, then

   f_{X1+X2+⋯+Xn}(x) = f_{X1}(x₁) ∗ f_{X2}(x₂) ∗ f_{X3}(x₃) ∗ ⋯ ∗ f_{Xn}(xₙ)
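A short numerical sketch of this result is given below. The uniform densities on (0, a) and (0, b) are assumed for illustration, echoing problem 11 later in these notes.

```python
# Sketch: density of W = X + Y by numerical convolution, for assumed independent
# uniform densities f_X on (0, a) and f_Y on (0, b).
import numpy as np

a, b, dw = 1.0, 2.0, 0.001
fX = np.full(int(a / dw), 1.0 / a)     # samples of f_X on a grid of step dw
fY = np.full(int(b / dw), 1.0 / b)     # samples of f_Y on the same step

fW = np.convolve(fX, fY) * dw          # f_W = f_X * f_Y (trapezoidal shape)
print(fW.sum() * dw)                   # ≈ 1: the result is again a valid density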

Operations on Multiple Random Variables:

Expected Value or Mean

When more than one random variable is involved, the expectation must be taken with respect to all the variables involved. For a function g(X, Y) of two random variables,

   E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y) f_XY(x, y) dx dy
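This double integral can be evaluated numerically; the sketch below assumes the density f_XY(x, y) = x + y on the unit square and g(x, y) = xy purely for illustration.

```python
# Sketch: E[g(X, Y)] as a numerical double integral with scipy.integrate.dblquad.
# Assumed example: f_XY(x, y) = x + y on the unit square and g(x, y) = xy.
from scipy import integrate

E_g, _ = integrate.dblquad(lambda y, x: (x * y) * (x + y), 0, 1, 0, 1)
print(E_g)   # ≈ 1/3, i.e. the correlation E[XY] for this density
```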

JOINT MOMENTS:

Moments measure the deviation of a random variable from a reference value. Joint moments indicate the deviation of multiple random variables from a reference value.

Joint moments about the origin

The expected value of a function of the form g(X, Y) = X^n Y^k is called the joint moment about the origin:

   m_nk = E[X^n Y^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x^n y^k f_XY(x, y) dx dy

The order of a joint moment is the sum of the individual orders n and k, i.e. order = n + k.

First order joint moments:

   m_10 = E[X¹ Y⁰] = E[X] = m₁ = ∫_{−∞}^{∞} x f_X(x) dx

   m_01 = E[X⁰ Y¹] = E[Y]

Note: The first order joint moments are equal to the respective individual expected values.

Second order joint moments:

   m_20 = E[X²],   m_02 = E[Y²]

The second order joint moment m_11 is called the correlation:

   R_XY = m_11 = E[X Y] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f_XY(x, y) dx dy

Correlation is a measure of similarity between two (or more) random variables.

If the two random variables are statistically independent, then f_XY(x, y) = f_X(x) f_Y(y), and

   R_XY = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f_X(x) f_Y(y) dx dy = ∫_{−∞}^{∞} x f_X(x) dx ∫_{−∞}^{∞} y f_Y(y) dy = E[X] E[Y]

Note: Two random variables are orthogonal when R_XY = 0.

Joint central moments

The expected value of the function g(X, Y) = (X − X̄)^n (Y − Ȳ)^k is called the joint central moment of two random variables:

   μ_nk = E[(X − X̄)^n (Y − Ȳ)^k] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x − X̄)^n (y − Ȳ)^k f_XY(x, y) dx dy

Properties of central moments:

1. The zero order joint central moment is 1:  μ_00 = 1
2. The first order joint central moments are zero:  μ_01 = μ_10 = 0
3. The second order joint central moments μ_20 and μ_02 are the individual variances:

   μ_20 = E[(X − X̄)²] = σ_X²
   μ_02 = E[(Y − Ȳ)²] = σ_Y²

4. The second order joint central moment μ_11 is called the covariance of the two random variables X and Y:

   μ_11 = E[(X − X̄)(Y − Ȳ)] = C_XY

Covariance is a measure of how one random variable changes with another; it indicates how two random variables vary together.

Properties of Covariance:

1. If X and Y are two random variables, then the covariance between them is given as
   C_XY = R_XY − E[X] E[Y]

2. If X and Y are two statistically independent random variables, then C_XY = 0.
   Proof: C_XY = R_XY − E[X] E[Y]. For statistically independent random variables R_XY = E[X Y] = E[X] E[Y], so
   C_XY = E[X] E[Y] − E[X] E[Y] = 0.

3. Let X and Y be two random variables. Then
   var[X + Y] = var[X] + var[Y] + 2 C_XY
   var[X − Y] = var[X] + var[Y] − 2 C_XY

Correlation coefficient:

It is defined as

   ρ = μ_11 / √(μ_20 μ_02) = C_XY / (σ_X σ_Y)
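A small sample-based sketch of these quantities is shown below; the model Y = X + N with Gaussian X and N is an assumed example, not taken from the notes.

```python
# Sketch: covariance and correlation coefficient estimated from samples.
# Assumed model: X ~ N(0, 4), N ~ N(0, 1), Y = X + N.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(0.0, 2.0, 100_000)
Y = X + rng.normal(0.0, 1.0, 100_000)

C_XY = np.mean((X - X.mean()) * (Y - Y.mean()))   # mu_11, the covariance
rho = C_XY / (X.std() * Y.std())                  # correlation coefficient
print(C_XY, rho)   # theory: C_XY = 4, rho = 2/sqrt(5) ≈ 0.894
```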
JOINT CHARACTERISTIC FUNCTION

The expected value of the joint function g(X, Y) = e^{jω₁X} e^{jω₂Y} is called the joint characteristic function:

   φ_XY(ω₁, ω₂) = E[e^{jω₁X} e^{jω₂Y}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{jω₁x} e^{jω₂y} f_XY(x, y) dx dy

The joint density function is recovered by the inverse relation

   f_XY(x, y) = (1/4π²) ∫_{−∞}^{∞} ∫_{−∞}^{∞} φ_XY(ω₁, ω₂) e^{−jω₁x} e^{−jω₂y} dω₁ dω₂

The joint characteristic function and the joint density function are a Fourier transform pair with the signs of the variables reversed.
Properties of the joint characteristic function:

1. The marginal characteristic functions can be obtained from the joint characteristic function:

   φ_X(ω₁) = φ_XY(ω₁, 0)
   φ_Y(ω₂) = φ_XY(0, ω₂)

Proof:
We know that φ_XY(ω₁, ω₂) = E[e^{jω₁X} e^{jω₂Y}].

Let ω₂ = 0:  φ_XY(ω₁, 0) = E[e^{jω₁X}] = φ_X(ω₁)
Let ω₁ = 0:  φ_XY(0, ω₂) = E[e^{jω₂Y}] = φ_Y(ω₂)

2. If X and Y are two statistically independent random variables, then their joint characteristic function is the product of the individual characteristic functions:

   φ_XY(ω₁, ω₂) = φ_X(ω₁) φ_Y(ω₂)

Proof:
We know that

   φ_XY(ω₁, ω₂) = E[e^{jω₁X} e^{jω₂Y}] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} e^{jω₁x} e^{jω₂y} f_XY(x, y) dx dy

If X and Y are statistically independent, f_XY(x, y) = f_X(x) f_Y(y), so

   φ_XY(ω₁, ω₂) = ∫_{−∞}^{∞} e^{jω₁x} f_X(x) dx ∫_{−∞}^{∞} e^{jω₂y} f_Y(y) dy = φ_X(ω₁) φ_Y(ω₂)

3. If X and Y are two statistically independent random variables, then the characteristic function of their sum is the product of the individual characteristic functions:

   φ_{X+Y}(ω) = φ_X(ω) φ_Y(ω)

Proof:
   φ_{X+Y}(ω) = E[e^{jω(X+Y)}] = E[e^{jωX} e^{jωY}]

If X and Y are statistically independent,

   φ_{X+Y}(ω) = E[e^{jωX}] E[e^{jωY}] = φ_X(ω) φ_Y(ω)

4. The joint moments of multiple random variables can be obtained from the joint characteristic function:

   m_nk = (−j)^{n+k} ∂^{n+k} φ_XY(ω₁, ω₂) / (∂ω₁^n ∂ω₂^k) |_{ω₁=0, ω₂=0}
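Property 4 can be checked symbolically. The characteristic function below (two independent zero-mean Gaussians) is an assumed example used only to illustrate the differentiation.

```python
# Sketch of property 4: a joint moment recovered by differentiating the joint
# characteristic function. phi below is an assumed example, not from the notes.
import sympy as sp

w1, w2 = sp.symbols('omega1 omega2', real=True)
sx, sy = sp.symbols('sigma_X sigma_Y', positive=True)

phi = sp.exp(-sx**2 * w1**2 / 2) * sp.exp(-sy**2 * w2**2 / 2)   # phi_XY(w1, w2)

# m_nk = (-j)^(n+k) d^(n+k) phi / (dw1^n dw2^k) at w1 = w2 = 0; here n = 2, k = 0
m20 = ((-sp.I)**2 * sp.diff(phi, w1, 2)).subs({w1: 0, w2: 0})
print(sp.simplify(m20))   # sigma_X**2, which is E[X^2] for a zero-mean X
```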

JOINTLY GAUSSIAN RANDOM VARIABLES:

• Among the various standard density functions, the Gaussian density function is the most widely used in science and engineering.
• In particular, it is used to estimate the noise power while calculating the signal to noise ratio.
• The two-variable case is sometimes called the bivariate Gaussian density.
• Two random variables are said to be jointly Gaussian if their joint density function is of the form

   f_XY(x, y) = [1 / (2π σ_X σ_Y √(1 − ρ²))] exp{ −1/(2(1 − ρ²)) [ (x − X̄)²/σ_X² − 2ρ(x − X̄)(y − Ȳ)/(σ_X σ_Y) + (y − Ȳ)²/σ_Y² ] }

Here
   X̄ = E[X],  Ȳ = E[Y]
   σ_X² = E[(X − X̄)²],  σ_Y² = E[(Y − Ȳ)²]
   ρ = E[(X − X̄)(Y − Ȳ)] / (σ_X σ_Y)

1. The maximum value of the joint Gaussian density function occurs at (x = X̄, y = Ȳ):

   max[f_XY(x, y)] = 1 / (2π σ_X σ_Y √(1 − ρ²))

2. If X and Y are statistically independent random variables, their joint Gaussian density function reduces to

   f_XY(x, y) = [1 / (2π σ_X σ_Y)] exp{ −(1/2) [ (x − X̄)²/σ_X² + (y − Ȳ)²/σ_Y² ] }

Observe that for ρ = 0, corresponding to uncorrelated X and Y, the density can be written as

   f_XY(x, y) = f_X(x) f_Y(y)

where f_X(x) and f_Y(y) are the marginal density functions of X and Y:

   f_X(x) = [1 / √(2π σ_X²)] exp[ −(x − X̄)² / (2σ_X²) ]
   f_Y(y) = [1 / √(2π σ_Y²)] exp[ −(y − Ȳ)² / (2σ_Y²) ]

Note:
If two random variables are statistically independent, they are uncorrelated. The converse is not true in general, but for Gaussian random variables uncorrelatedness also implies statistical independence.
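The bivariate Gaussian density and its peak value (property 1) can be checked numerically; the means, variances and ρ used below are assumed for illustration.

```python
# Sketch: evaluating the bivariate Gaussian density with scipy and checking
# property 1 (the peak value). The numbers below are assumed for illustration.
import numpy as np
from scipy.stats import multivariate_normal

mx, my = 1.0, 2.0
sx, sy, rho = 2.0, 1.0, 0.5
cov = np.array([[sx**2,     rho*sx*sy],
                [rho*sx*sy, sy**2    ]])     # covariance matrix [C_X]

rv = multivariate_normal(mean=[mx, my], cov=cov)
peak = rv.pdf([mx, my])                      # maximum occurs at (X_bar, Y_bar)
print(peak, 1.0 / (2*np.pi*sx*sy*np.sqrt(1 - rho**2)))   # the two values agree
```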

N Random variables

N random variables X₁, X₂, …, X_N are called jointly Gaussian if their joint density function can be written as

   f_{X1…XN}(x₁, …, x_N) = [ |[C_X]⁻¹|^{1/2} / (2π)^{N/2} ] exp{ −(1/2) [x − X̄]ᵗ [C_X]⁻¹ [x − X̄] }

where the column vector of deviations is

   [x − X̄] = [ x₁ − X̄₁,  x₂ − X̄₂,  …,  x_N − X̄_N ]ᵗ

and

   [C_X] = [ C₁₁  C₁₂  …  C₁N
             C₂₁  C₂₂  …  C₂N
              ⋮    ⋮        ⋮
             C_N1 C_N2 …  C_NN ]

Here [.]⁻¹ denotes the matrix inverse, [.]ᵗ the matrix transpose and |[.]| the matrix determinant. The elements of [C_X], called the covariance matrix of the N random variables, are given by

   C_ij = E[(X_i − X̄_i)(X_j − X̄_j)] = σ_{Xi}²   for i = j
                                     = C_{XiXj}  for i ≠ j

Properties of the Gaussian density function for N random variables:

1. Gaussian random variables are completely defined by their means, variances and covariances.
2. All marginal density functions derived from the N-variate Gaussian density function are also Gaussian.
3. All conditional density functions derived from the N-variate Gaussian density function are also Gaussian.
4. Any linear transformation of Gaussian random variables is also Gaussian.

DESCRIPTIVE QUESTIONS

1. Contrast the properties of the joint probability distribution function by using necessary mathematical expressions.
2. Contrast the properties of the joint probability density function.
3. Explain about marginal distribution functions of random variables X and Y.
4. Explain about marginal density functions of random variables X and Y.
5. Infer with necessary expressions that the density function of the sum of two statistically independent random variables is the convolution of their individual density functions.
6. Distinguish between various joint moments.
7. Discuss about joint central moments with necessary mathematical expressions.
8. Interpret the properties of the joint characteristic function with the help of necessary mathematical expressions.
9. Discuss about the two-dimensional Gaussian random variables density function and summarize its properties.

1. Two random variables X and Y have the joint PDF
   f_XY(x, y) = A e^{−(2x+y)}, x, y ≥ 0; 0 otherwise.
   Evaluate (i) A (ii) the marginal pdfs f_X(x) and f_Y(y).

2. The joint density of two random variables X and Y is
   f_XY(x, y) = c(2x + y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 2; 0 elsewhere.
   Compute
   i) The value of c.
   ii) The marginal density functions of X and Y.

3. The density function
   f_XY(x, y) = xy/9, 0 < x < 2, 0 < y < 3; 0 elsewhere
   applies to two random variables X and Y.
   Categorize whether X and Y are statistically independent or not.

4. Differentiate whether two given random variables are statistically independent or not if their joint probability density function is given as
   f_XY(x, y) = (5/16) x² y, 0 < x < 2, 0 < y < 2; 0 otherwise.

5. Two random variables X and Y have the joint density function
   f_XY(x, y) = x + y, 0 < x < 2, 0 < y < 1; 0 elsewhere.
   Categorize whether X and Y are statistically independent or not.
   Calculate the correlation coefficient.

6. Given
   f_XY(x, y) = (x + y)²/40, −1 < x < 1, −3 < y < 3; 0 elsewhere.
   Determine the variances of X and Y.

7. Two statistically independent random variables X and Y have X̄ = 2, E[X²] = 8, Ȳ = 4, E[Y²] = 25. For another random variable given as W = 3X − Y, calculate the variance.

8. Let X and Y be statistically independent random variables with X̄ = 3/4, E[X²] = 4, Ȳ = 1, E[Y²] = 5. If a new random variable is defined as W = X − 2Y + 1, then calculate
   (i) R_XY (ii) R_XW (iii) R_YW.

9. Two random variables X and Y have means X̄ = 1, Ȳ = 3, variances σ_X² = 4 and σ_Y² = 1, and correlation coefficient ρ_XY = 0.4. New random variables W and V are defined such that W = X + 3Y, V = −X + 2Y.
   Find (i) the mean (ii) the variance of W and V.

10. Two random variables X and Y have means X̄ = 1 and Ȳ = 2, variances σ_X² = 4 and σ_Y² = 1, and a correlation coefficient ρ_XY = 0.4. New random variables W and V are defined by V = −X + 2Y, W = X + 3Y.
    Find (i) the means (ii) the variances (iii) the correlations (iv) the correlation coefficient ρ_VW of V and W.

11. Find and sketch the density of W = X + Y, if X and Y are statistically independent and have marginal densities
    f_X(x) = (1/a)[u(x) − u(x − a)],  f_Y(y) = (1/b)[u(y) − u(y − b)];  assume b > a.

1. Calculate the value of b and also the joint distribution function for the given joint density function

   f_XY(x, y) = b(x + y)²  ;  −2 < x ≤ 2, −3 < y ≤ 3
              = 0          ;  elsewhere

Sol: Given,

   f_XY(x, y) = b(x + y)²  ;  −2 < x ≤ 2, −3 < y ≤ 3
              = 0          ;  elsewhere

Calculation of the value of b:
We know that

   ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1

⇒ ∫_{−2}^{2} ∫_{−3}^{3} b(x + y)² dy dx = 1

⇒ b ∫_{−2}^{2} ∫_{−3}^{3} (x² + y² + 2xy) dy dx = 1

Integrating with respect to y:

⇒ b ∫_{−2}^{2} [x²y + y³/3 + xy²]_{−3}^{3} dx = 1

⇒ b ∫_{−2}^{2} [3x² + 9 + 9x − (−3x² − 9 + 9x)] dx = 1

⇒ b ∫_{−2}^{2} [6x² + 18] dx = 1

⇒ 6b ∫_{−2}^{2} [x² + 3] dx = 1

Integrating with respect to x:

⇒ 6b [x³/3 + 3x]_{−2}^{2} = 1

⇒ 6b [8/3 + 6 − (−8/3 − 6)] = 1

⇒ 6b [(16 + 36)/3] = 1

⇒ 2b [52] = 1

⇒ 104 b = 1

∴ b = 1/104

Calculation of the joint distribution function:
We know that

   F_XY(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_XY(x, y) dy dx

             = ∫_{−2}^{x} ∫_{−3}^{y} b(x + y)² dy dx

             = b ∫_{−2}^{x} ∫_{−3}^{y} (x² + y² + 2xy) dy dx

Integrating with respect to y:

   = b ∫_{−2}^{x} [x²y + y³/3 + xy²]_{−3}^{y} dx

   = b ∫_{−2}^{x} [x²y + y³/3 + xy² − (−3x² − 9 + 9x)] dx

Integrating with respect to x:

   = b [ (x³/3)y + x(y³/3) + (x²/2)y² + x³ + 9x − 9x²/2 ]_{−2}^{x}

   = b [ x³y/3 + xy³/3 + x²y²/2 + x³ + 9x − 9x²/2 − (−8y/3 − 2y³/3 + 2y² − 8 − 18 − 18) ]

∴ F_XY(x, y) = (1/104) [ ((x³ + 8)y + (x + 2)y³)/3 + (y² − 9)x²/2 + x³ + 9x − 2y² + 44 ]
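A short numerical check of the normalization constant (a sketch, assuming scipy is available):

```python
# Sketch: numerical check of b = 1/104 with scipy.integrate.dblquad
# (dblquad integrates the inner variable first; the lambda takes (y, x)).
from scipy import integrate

total, _ = integrate.dblquad(lambda y, x: (x + y)**2, -2, 2, -3, 3)
print(total, 1.0 / total)   # 104.0 and b ≈ 0.009615 = 1/104
```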

2. The joint density function of two random variables X and Y is given as

   f_XY(x, y) = x + y  ;  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
              = 0      ;  elsewhere

Find the conditional density functions.

Sol: Given,

   f_XY(x, y) = x + y  ;  0 ≤ x ≤ 1, 0 ≤ y ≤ 1
              = 0      ;  elsewhere

Formulas for the conditional density functions:

   f_X(x/y) = f_XY(x, y) / f_Y(y)
   f_Y(y/x) = f_XY(x, y) / f_X(x)

Calculation of f_X(x) and f_Y(y):
We know that

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy

⇒ f_X(x) = ∫_{0}^{1} (x + y) dy = [xy + y²/2]_{0}^{1} = x + 1/2

∴ f_X(x) = (2x + 1)/2

And

   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

⇒ f_Y(y) = ∫_{0}^{1} (x + y) dx = [x²/2 + xy]_{0}^{1} = y + 1/2

∴ f_Y(y) = (2y + 1)/2

∴ Conditional density functions:

   f_X(x/y) = f_XY(x, y) / f_Y(y) = 2(x + y) / (2y + 1)
   f_Y(y/x) = f_XY(x, y) / f_X(x) = 2(x + y) / (2x + 1)
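As a quick sanity check (a sketch; the points y = 0.4 and x = 0.7 are arbitrary), each conditional density should integrate to 1 over the conditioned variable:

```python
# Sketch: checking that each conditional density above integrates to 1.
from scipy import integrate

f_x_given_y = lambda x, y: 2*(x + y) / (2*y + 1)
f_y_given_x = lambda y, x: 2*(x + y) / (2*x + 1)

print(integrate.quad(lambda x: f_x_given_y(x, 0.4), 0, 1)[0])   # ≈ 1
print(integrate.quad(lambda y: f_y_given_x(y, 0.7), 0, 1)[0])   # ≈ 1
```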
3. Two random variables are such that Y = −4X + 20 and the mean of X is 4.
Check whether the given random variables are statistically independent or not, when the variance of X is 2.

Sol: Given,
   Y = −4X + 20,  E[X] = 4,  σ_X² = 2

If two random variables are statistically independent, then E[XY] = E[X] E[Y].

Calculation of E[Y]:
   E[Y] = E[−4X + 20] = −4E[X] + 20 = −4(4) + 20 = −16 + 20
   ∴ E[Y] = 4

Evaluation of E[X²]:
We know that σ_X² = m₂ − m₁²
⇒ 2 = m₂ − 4²
⇒ m₂ = 16 + 2
∴ E[X²] = 18

Calculation of E[XY]:
   E[XY] = E[X(−4X + 20)] = E[−4X² + 20X] = −4E[X²] + 20E[X] = −4(18) + 20(4)
   ∴ E[XY] = 8

Now, E[X] E[Y] = 4 × 4 = 16

∴ E[XY] ≠ E[X] E[Y]

Hence, the given random variables X and Y are not statistically independent.
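A Monte Carlo sketch of the same check follows; modelling X as Gaussian is an assumption, since only its mean 4 and variance 2 are given in the problem.

```python
# Sketch: Monte Carlo check that E[XY] differs from E[X]E[Y] here.
# Assumed model: X ~ N(4, 2); Y = -4X + 20 as given.
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(4.0, np.sqrt(2.0), 200_000)
Y = -4.0 * X + 20.0

print(np.mean(X * Y), np.mean(X) * np.mean(Y))   # ≈ 8 versus ≈ 16, so not independent
```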
4. Two random variables X and Y have the joint PDF

   f_XY(x, y) = A e^{−(2x+y)}  ;  x, y ≥ 0
              = 0              ;  otherwise

Evaluate (i) A (ii) the marginal pdfs f_X(x) and f_Y(y).

Sol: Given,

   f_XY(x, y) = A e^{−(2x+y)}  ;  x, y ≥ 0
              = 0              ;  otherwise

(i) Calculation of the value of A:
We know that

   ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1

⇒ ∫_{0}^{∞} ∫_{0}^{∞} A e^{−(2x+y)} dy dx = 1

Integrating with respect to y:

⇒ A ∫_{0}^{∞} [e^{−(2x+y)} / (−1)]_{0}^{∞} dx = 1

⇒ A ∫_{0}^{∞} [0 − e^{−2x}/(−1)] dx = 1

⇒ A ∫_{0}^{∞} e^{−2x} dx = 1

Integrating with respect to x:

⇒ A [e^{−2x} / (−2)]_{0}^{∞} = 1

⇒ A [0 − (−1/2)] = 1

⇒ A/2 = 1

∴ A = 2

(ii) Calculation of the marginal pdfs:
Formulas:

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Now,

   f_X(x) = ∫_{0}^{∞} A e^{−(2x+y)} dy = A [e^{−(2x+y)} / (−1)]_{0}^{∞} = 2 [0 − e^{−2x}/(−1)]

∴ f_X(x) = 2 e^{−2x}

Now,

   f_Y(y) = ∫_{0}^{∞} A e^{−(2x+y)} dx = A [e^{−(2x+y)} / (−2)]_{0}^{∞} = 2 [0 − e^{−y}/(−2)]

∴ f_Y(y) = e^{−y}
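A brief numerical check of these results (a sketch; the evaluation point x = 0.5 is arbitrary):

```python
# Sketch: numerical check of A = 2 and of a marginal value for f_XY = A e^{-(2x+y)}.
import numpy as np
from scipy import integrate

total, _ = integrate.dblquad(lambda y, x: np.exp(-(2*x + y)), 0, np.inf, 0, np.inf)
print(1.0 / total)                                   # ≈ 2, the value of A

fx, _ = integrate.quad(lambda y: 2.0 * np.exp(-(2*0.5 + y)), 0, np.inf)
print(fx, 2.0 * np.exp(-2*0.5))                      # f_X(0.5): both ≈ 0.7358
```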

5. The joint density of two random variables X and Y is

   f_XY(x, y) = c(2x + y)  ;  0 ≤ x ≤ 1, 0 ≤ y ≤ 2
              = 0          ;  elsewhere

Compute
(i) The value of c.
(ii) The marginal density functions of X and Y.

Sol: Given,

   f_XY(x, y) = c(2x + y)  ;  0 ≤ x ≤ 1, 0 ≤ y ≤ 2
              = 0          ;  elsewhere

(i) Calculation of the value of c:
We know that

   ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1

⇒ ∫_{0}^{1} ∫_{0}^{2} c(2x + y) dy dx = 1

Integrating with respect to y:

⇒ c ∫_{0}^{1} [2xy + y²/2]_{0}^{2} dx = 1

⇒ c ∫_{0}^{1} (4x + 2) dx = 1

Integrating with respect to x:

⇒ c [4x²/2 + 2x]_{0}^{1} = 1

⇒ c [2 + 2] = 1

∴ c = 1/4

(ii) Calculation of the marginal density functions:
Formulas:

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Now,

   f_X(x) = ∫_{0}^{2} c(2x + y) dy = c [2xy + y²/2]_{0}^{2} = (1/4)[4x + 2]

∴ f_X(x) = x + 1/2

Now,

   f_Y(y) = ∫_{0}^{1} c(2x + y) dx = c [2x²/2 + xy]_{0}^{1} = (1/4)[1 + y]

∴ f_Y(y) = (y + 1)/4

6. The density function

   f_XY(x, y) = xy/9  ;  0 < x < 2, 0 < y < 3
              = 0     ;  elsewhere

applies to two random variables X and Y. Categorize whether X and Y are statistically independent or not.

Sol: Given,

   f_XY(x, y) = xy/9  ;  0 < x < 2, 0 < y < 3
              = 0     ;  elsewhere

Condition for statistical independence:  f_XY(x, y) = f_X(x) f_Y(y)

Calculation of the marginal pdfs:
Formulas:

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Now,

   f_X(x) = ∫_{0}^{3} (xy/9) dy = (1/9)[x(y²/2)]_{0}^{3} = (1/9)(9x/2)

∴ f_X(x) = x/2

Now,

   f_Y(y) = ∫_{0}^{2} (xy/9) dx = (1/9)[(x²/2)y]_{0}^{2} = (1/9)(2y)

∴ f_Y(y) = 2y/9

∴ f_X(x) f_Y(y) = (x/2)(2y/9) = xy/9

∴ f_X(x) f_Y(y) = f_XY(x, y)

Hence, the random variables X and Y are statistically independent.


7. Differentiate whether two given random variables are statistically independent or not, if their joint probability density function is given as

   f_XY(x, y) = (5/16) x² y  ;  0 < x < 2, 0 < y < 2
              = 0            ;  otherwise

Sol: Given,

   f_XY(x, y) = (5/16) x² y  ;  0 < x < 2, 0 < y < 2
              = 0            ;  otherwise

Condition for statistical independence:  f_XY(x, y) = f_X(x) f_Y(y)

Calculation of the marginal pdfs:
Formulas:

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Now,

   f_X(x) = ∫_{0}^{2} (5/16) x² y dy = (5/16)[x²(y²/2)]_{0}^{2} = (5/16)(2x²)

∴ f_X(x) = (5/8) x²

Now,

   f_Y(y) = ∫_{0}^{2} (5/16) x² y dx = (5/16)[(x³/3)y]_{0}^{2} = (5/16)(8y/3)

∴ f_Y(y) = 5y/6

∴ f_X(x) f_Y(y) = ((5/8) x²)(5y/6) = (25/48) x² y

∴ f_X(x) f_Y(y) ≠ f_XY(x, y)

Hence, the given two random variables are not statistically independent.

8. Two random variables X and Y have the joint density function

   f_XY(x, y) = x + y  ;  0 < x < 2, 0 < y < 1
              = 0      ;  elsewhere

Categorize whether X and Y are statistically independent or not, and calculate the correlation coefficient.

Sol: Given,

   f_XY(x, y) = x + y  ;  0 < x < 2, 0 < y < 1
              = 0      ;  elsewhere

Condition for statistical independence:  f_XY(x, y) = f_X(x) f_Y(y)

Calculation of the marginal pdfs:
Formulas:

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Now,

   f_X(x) = ∫_{0}^{1} (x + y) dy = [xy + y²/2]_{0}^{1}

∴ f_X(x) = x + 1/2

Now,

   f_Y(y) = ∫_{0}^{2} (x + y) dx = [x²/2 + xy]_{0}^{2}

∴ f_Y(y) = 2y + 2

∴ f_X(x) f_Y(y) = (x + 1/2)(2y + 2) = 2xy + 2x + y + 1

∴ f_X(x) f_Y(y) ≠ f_XY(x, y)

Hence, X and Y are not statistically independent.

Calculation of the correlation coefficient:
Formulas:

   ρ = μ₁₁ / √(μ₂₀ μ₀₂) = C_XY / (σ_X σ_Y)

   μ₁₁ or C_XY = R_XY − E[X] E[Y]
   μ₂₀ or σ_X² = E[(X − X̄)²] = E[X²] − (E[X])²
   μ₀₂ or σ_Y² = E[(Y − Ȳ)²] = E[Y²] − (E[Y])²

To find R_XY:

   R_XY = m₁₁ = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x y f_XY(x, y) dx dy

⇒ R_XY = ∫_{0}^{2} ∫_{0}^{1} xy(x + y) dy dx = ∫_{0}^{2} ∫_{0}^{1} (x²y + xy²) dy dx

Integrating with respect to y:

⇒ R_XY = ∫_{0}^{2} [x²(y²/2) + x(y³/3)]_{0}^{1} dx = ∫_{0}^{2} [x²/2 + x/3] dx

Integrating with respect to x:

⇒ R_XY = [x³/6 + x²/6]_{0}^{2} = 8/6 + 4/6

∴ R_XY = 2

To find E[X]:

   E[X] = m₁₀ = ∫_{−∞}^{∞} x f_X(x) dx = ∫_{0}^{2} x(x + 1/2) dx = ∫_{0}^{2} (x² + x/2) dx = [x³/3 + x²/4]_{0}^{2}

∴ E[X] = 11/3

To find E[Y]:

   E[Y] = m₀₁ = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{0}^{1} y(2y + 2) dy = 2 ∫_{0}^{1} (y² + y) dy = 2[y³/3 + y²/2]_{0}^{1} = 2(5/6)

∴ E[Y] = 5/3

Evaluation of C_XY:

   C_XY = R_XY − E[X] E[Y] = 2 − (11/3)(5/3) = 2 − 6.1111

∴ C_XY = −4.1111

To find E[X²]:

   E[X²] = m₂₀ = ∫_{−∞}^{∞} x² f_X(x) dx = ∫_{0}^{2} x²(x + 1/2) dx = ∫_{0}^{2} (x³ + x²/2) dx = [x⁴/4 + x³/6]_{0}^{2} = 4 + 4/3

∴ E[X²] = 16/3

To find E[Y²]:

   E[Y²] = m₀₂ = ∫_{−∞}^{∞} y² f_Y(y) dy = ∫_{0}^{1} y²(2y + 2) dy = 2 ∫_{0}^{1} (y³ + y²) dy = 2[y⁴/4 + y³/3]_{0}^{1} = 2(1/4 + 1/3)

∴ E[Y²] = 7/6

Evaluation of σ_X²:

   σ_X² = E[X²] − (E[X])² = 16/3 − (11/3)²

∴ σ_X² = −73/9

Evaluation of σ_Y²:

   σ_Y² = E[Y²] − (E[Y])² = 7/6 − (5/3)²

∴ σ_Y² = −29/18

Correlation coefficient:

   ρ = C_XY / (σ_X σ_Y) = −4.1111 / √((−73/9)(−29/18)) = −4.1111 / 3.615

∴ ρ = −1.137

Note: A variance can never be negative and |ρ| ≤ 1, so these values are not valid; they arise because the given f_XY(x, y) integrates to 3 rather than 1, i.e. it is not a properly normalized density function.

9. Given

   f_XY(x, y) = (x + y)²/40  ;  −1 < x < 1, −3 < y < 3
              = 0            ;  elsewhere

Determine the variances of X and Y.

Sol: Given,

   f_XY(x, y) = (x + y)²/40  ;  −1 < x < 1, −3 < y < 3
              = 0            ;  elsewhere

Calculation of the marginal pdfs:
Formulas:

   f_X(x) = ∫_{−∞}^{∞} f_XY(x, y) dy
   f_Y(y) = ∫_{−∞}^{∞} f_XY(x, y) dx

Now,

   f_X(x) = ∫_{−3}^{3} (x + y)²/40 dy = (1/40) ∫_{−3}^{3} (x² + y² + 2xy) dy
          = (1/40) [x²y + y³/3 + xy²]_{−3}^{3}
          = (1/40) [3x² + 9 + 9x − (−3x² − 9 + 9x)]
          = (1/40) [6x² + 18]

∴ f_X(x) = (3x² + 9)/20

Now,

   f_Y(y) = ∫_{−1}^{1} (x + y)²/40 dx = (1/40) ∫_{−1}^{1} (x² + y² + 2xy) dx
          = (1/40) [x³/3 + xy² + x²y]_{−1}^{1}
          = (1/40) [1/3 + y² + y − (−1/3 − y² + y)]
          = (1/40) [2/3 + 2y²]

∴ f_Y(y) = (3y² + 1)/60

To find m₁₀ (or m₁) of f_X(x):

   m₁ = E[X] = ∫_{−∞}^{∞} x f_X(x) dx = ∫_{−1}^{1} x(3x² + 9)/20 dx = (3/20) ∫_{−1}^{1} (x³ + 3x) dx
      = (3/20) [x⁴/4 + 3x²/2]_{−1}^{1} = (3/20) [1/4 + 3/2 − (1/4 + 3/2)] = 0

∴ m₁ = 0

To find m₂₀ (or m₂) of f_X(x):

   m₂ = E[X²] = ∫_{−∞}^{∞} x² f_X(x) dx = ∫_{−1}^{1} x²(3x² + 9)/20 dx = (3/20) ∫_{−1}^{1} (x⁴ + 3x²) dx
      = (3/20) [x⁵/5 + x³]_{−1}^{1} = (3/20) [1/5 + 1 − (−1/5 − 1)] = (3/20)(12/5)

∴ m₂ = 0.36

Variance of X:

   σ_X² = m₂ − m₁² = 0.36 − 0²

∴ σ_X² = 0.36

To find m₀₁ (or m₁) of f_Y(y):

   m₁ = E[Y] = ∫_{−∞}^{∞} y f_Y(y) dy = ∫_{−3}^{3} y(3y² + 1)/60 dy = (1/60) ∫_{−3}^{3} (3y³ + y) dy
      = (1/60) [3y⁴/4 + y²/2]_{−3}^{3} = (1/60) [(3/4)(81) + 9/2 − ((3/4)(81) + 9/2)] = 0

∴ m₁ = 0

To find m₀₂ (or m₂) of f_Y(y):

   m₂ = E[Y²] = ∫_{−∞}^{∞} y² f_Y(y) dy = ∫_{−3}^{3} y²(3y² + 1)/60 dy = (1/60) ∫_{−3}^{3} (3y⁴ + y²) dy
      = (1/60) [3y⁵/5 + y³/3]_{−3}^{3} = (1/60) [(3/5)(243) + 27/3 − ((3/5)(−243) − 27/3)] = (1/60)(309.6)

∴ m₂ = 5.16

Variance of Y:

   σ_Y² = m₂ − m₁² = 5.16 − 0²

∴ σ_Y² = 5.16
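A numerical check of both variances (a sketch, assuming scipy is available; both means are zero, so E[X²] and E[Y²] equal the variances here):

```python
# Sketch: numerical check of sigma_X^2 = 0.36 and sigma_Y^2 = 5.16 for
# f_XY(x, y) = (x + y)^2 / 40 on -1 < x < 1, -3 < y < 3.
from scipy import integrate

f = lambda y, x: (x + y)**2 / 40.0

EX2, _ = integrate.dblquad(lambda y, x: x**2 * f(y, x), -1, 1, -3, 3)
EY2, _ = integrate.dblquad(lambda y, x: y**2 * f(y, x), -1, 1, -3, 3)
print(EX2, EY2)   # ≈ 0.36 and ≈ 5.16
```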

10. Two statistically independent random variables X and Y have X̄ = 2, E[X²] = 8, Ȳ = 4, E[Y²] = 25.
For another random variable given as W = 3X − Y, calculate the variance.

Sol: Given,
   W = 3X − Y and X̄ = 2, E[X²] = 8, Ȳ = 4, E[Y²] = 25

To find m₁ of W:
   m₁ = E[W] = E[3X − Y] = 3E[X] − E[Y] = 3(2) − 4
   ∴ m₁ = 2

To find m₂ of W:
   m₂ = E[W²] = E[(3X − Y)²] = E[9X² + Y² − 6XY]
      = 9E[X²] + E[Y²] − 6E[XY]
      = 9(8) + 25 − 6E[X]E[Y]   [∵ X and Y are statistically independent]
      = 72 + 25 − 6(2)(4)
   ∴ m₂ = 49

Variance of W:
   σ_W² = m₂ − m₁² = 49 − 2²
   ∴ σ_W² = 45

11. Let X and Y be statistically independent random variables with X̄ = 3/4, E[X²] = 4, Ȳ = 1, E[Y²] = 5. If a new random variable is defined as W = X − 2Y + 1, then calculate
(i) R_XY (ii) R_XW (iii) R_YW

Sol: Given,
   W = X − 2Y + 1 and X̄ = 3/4, E[X²] = 4, Ȳ = 1, E[Y²] = 5

To find the mean of W:
   E[W] = E[X − 2Y + 1] = E[X] − 2E[Y] + 1 = 3/4 − 2(1) + 1
   ∴ E[W] = −1/4

(i) R_XY:
   R_XY = E[XY] = E[X]E[Y]   [∵ X and Y are statistically independent]
   ⇒ R_XY = X̄ Ȳ = (3/4) × 1
   ∴ R_XY = 3/4

(ii) R_XW:
   R_XW = E[XW] = E[X(X − 2Y + 1)] = E[X² − 2XY + X]
        = E[X²] − 2E[X]E[Y] + E[X]
        = 4 − 2(3/4)(1) + 3/4 = 13/4
   ∴ R_XW = 3.25

(iii) R_YW:
   R_YW = E[YW] = E[Y(X − 2Y + 1)] = E[YX − 2Y² + Y]
        = E[Y]E[X] − 2E[Y²] + E[Y]
        = (1)(3/4) − 2(5) + 1 = −33/4
   ∴ R_YW = −8.25
12. Two random variables X and Y have means X̄ = 1, Ȳ = 3, variances σ_X² = 4 and σ_Y² = 1, and correlation coefficient ρ_XY = 0.4. New random variables W and V are defined such that W = X + 3Y and V = −X + 2Y.
Find (i) the mean and (ii) the variance of W and V.

Sol: Given,
   X̄ = 1, Ȳ = 3, σ_X² = 4, σ_Y² = 1, ρ_XY = 0.4, and also W = X + 3Y, V = −X + 2Y

(i) Mean of W:
   E[W] = W̄ = E[X + 3Y] = E[X] + 3E[Y] = X̄ + 3Ȳ = 1 + 3(3)
   ∴ E[W] = 10

Mean of V:
   E[V] = V̄ = E[−X + 2Y] = −E[X] + 2E[Y] = −X̄ + 2Ȳ = −1 + 2(3)
   ∴ E[V] = 5

(ii) Variance of W and V:
Given ρ_XY = 0.4 and ρ_XY = C_XY / (σ_X σ_Y):

   C_XY / (σ_X σ_Y) = 0.4
⇒ (R_XY − E[X]E[Y]) / (σ_X σ_Y) = 0.4
⇒ E[XY] = 0.4 (σ_X σ_Y) + E[X]E[Y]
⇒ R_XY = E[XY] = 0.4(2 × 1) + 3
∴ R_XY = E[XY] = 3.8

Now,
   σ_X² = m₂ − m₁² ⇒ m₂ = σ_X² + m₁² = 4 + 1
   ∴ E[X²] = 5
And
   σ_Y² = m₂ − m₁² ⇒ m₂ = σ_Y² + m₁² = 1 + 9
   ∴ E[Y²] = 10

To find the variance of W:
   m₂ of W:  E[W²] = E[(X + 3Y)²] = E[X² + 9Y² + 6XY] = E[X²] + 9E[Y²] + 6E[XY]
                   = 5 + 90 + 6(3.8)
   ∴ E[W²] = 117.8

Variance of W:
   σ_W² = m₂ − m₁² = E[W²] − (E[W])² = 117.8 − 10²
   ∴ σ_W² = 17.8

To find the variance of V:
   m₂ of V:  E[V²] = E[(−X + 2Y)²] = E[X² + 4Y² − 4XY] = E[X²] + 4E[Y²] − 4E[XY]
                   = 5 + 40 − 4(3.8)
   ∴ E[V²] = 29.8

Variance of V:
   σ_V² = m₂ − m₁² = E[V²] − (E[V])² = 29.8 − 5²
   ∴ σ_V² = 4.8
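A Monte Carlo sketch of this result is given below; drawing X and Y as jointly Gaussian is an assumption, since the problem only specifies the means, variances and ρ = 0.4.

```python
# Sketch: Monte Carlo check of the means and variances of W and V.
import numpy as np

mean = [1.0, 3.0]
cov = [[4.0, 0.4 * 2.0 * 1.0],     # C_XY = rho * sigma_X * sigma_Y = 0.8
       [0.4 * 2.0 * 1.0, 1.0]]

rng = np.random.default_rng(2)
X, Y = rng.multivariate_normal(mean, cov, 500_000).T

W, V = X + 3*Y, -X + 2*Y
print(W.mean(), V.mean())   # ≈ 10 and ≈ 5
print(W.var(), V.var())     # ≈ 17.8 and ≈ 4.8
```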
