
Module – 2: Operations on Multiple Random Variables

Joint Moments for Continuous and Discrete Random Variables –
Joint Central Moments – Joint Characteristic Function – Jointly
Gaussian Random Variables – Transformations of Multiple
Random Variables – Linear Transformation of Gaussian Random
Variables – Complex Random Variables.
Review: Operations on a Single Random Variable
Expectation: the expected value of X is a weighted average of the possible values of X.
Expectation of X = expected value of X = mean of X = statistical average of X = $\bar{X}$

Continuous Random Variable:
$E[X] = \bar{X} = \int_{-\infty}^{\infty} x\, f_X(x)\, dx$

Discrete Random Variable:
$E[X] = \bar{X} = \sum_{i=1}^{N} x_i\, P(x_i)$

If g(·) is a real function of the random variable X, then the expected value of g(X) is:

Continuous Random Variable:
$E[g(X)] = \int_{-\infty}^{\infty} g(x)\, f_X(x)\, dx$

Discrete Random Variable:
$E[g(X)] = \sum_{i=1}^{N} g(x_i)\, P(x_i)$
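As a quick illustration of these formulas, here is a minimal Python sketch (the pmf and density below are assumed examples, not from the text) evaluating E[X] and E[g(X)] numerically:

```python
# A minimal sketch (values assumed, not from the text) of the two expectation
# formulas: a discrete pmf handled with NumPy and a continuous density handled
# with SciPy quadrature, for both E[X] and E[g(X)] with g(x) = x^2 + 1.
import numpy as np
from scipy.integrate import quad

# Discrete: E[X] = sum_i x_i P(x_i)
x = np.array([0.0, 1.0, 2.0])
p = np.array([0.25, 0.50, 0.25])
print(np.sum(x * p))                    # E[X] = 1.0
print(np.sum((x ** 2 + 1.0) * p))       # E[g(X)] = 2.5

# Continuous: E[X] = integral of x f_X(x) dx, with an assumed unit exponential
f = lambda t: np.exp(-t)                # f_X(x) = e^{-x}, x >= 0
print(quad(lambda t: t * f(t), 0, np.inf)[0])               # E[X] = 1
print(quad(lambda t: (t ** 2 + 1.0) * f(t), 0, np.inf)[0])  # E[g(X)] = 3
```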
Conditional Expected Value
The conditional expectation of a random variable X, given an event B, is:

– Continuous Random Variable: $E[X \mid B] = \int_{-\infty}^{\infty} x\, f_X(x \mid B)\, dx$
– Discrete Random Variable: $E[X \mid B] = \sum_{i=1}^{N} x_i\, P(x_i \mid B)$

One way to define the event B so that it depends on the random variable X is

$B = \{X \le b\}, \quad -\infty < b < \infty$
Moments
The “moments” of a random variable (or of its
distribution) are expected values of powers or related
functions of the random variable.

$E[X] = \bar{X} = \int_{-\infty}^{\infty} x\, f_X(x)\, dx$ - first moment about the origin

$E[X^2] = \overline{X^2} = \int_{-\infty}^{\infty} x^2\, f_X(x)\, dx$ - mean square of X, or second moment about the origin

$E[(X - \bar{X})^2] = \int_{-\infty}^{\infty} (x - \bar{X})^2\, f_X(x)\, dx$ - second moment about the mean, or variance

The nth moment about the origin of a random variable X uses $g(X) = X^n$, n = 0, 1, 2, …

Discrete Random Variable, nth moment:
$m_n = E[X^n] = \sum_{i=1}^{N} x_i^n\, P(x_i)$

Continuous Random Variable, nth moment:
$m_n = E[X^n] = \int_{-\infty}^{\infty} x^n\, f_X(x)\, dx$

where $m_0 = 1$, i.e., the area under the function $f_X(x)$, and $m_1 = \bar{X}$, the expected value of X.

n = 0 gives the zeroth moment and n = 1 gives the first-order moment.


Moments (contd.)
Central Moments
• Moments about the mean value of X are called central moments, denoted $\mu_n$.
• Central moments are defined as the expected value of the function
  $g(X) = (X - \bar{X})^n$, n = 0, 1, 2, …
• Continuous Random Variable: $\mu_n = E[(X - \bar{X})^n] = \int_{-\infty}^{\infty} (x - \bar{X})^n\, f_X(x)\, dx$
• Discrete Random Variable: $\mu_n = E[(X - \bar{X})^n] = \sum_{i=1}^{N} (x_i - \bar{X})^n\, P(x_i)$
• $\mu_0 = 1$
• $\mu_1 = E[X - \bar{X}] = E[X] - E[\bar{X}] = \bar{X} - \bar{X} = 0$
Variance
• Variance is the second central moment $\mu_2$, and the notation is $\sigma_X^2$.
• $\mathrm{Var}[X] = \sigma_X^2 = \mu_2 = E[(X - \bar{X})^2]$
  $= E[X^2 - 2\bar{X}X + \bar{X}^2]$
  $= E[X^2] - 2\bar{X}\,E[X] + \bar{X}^2$
  $= m_2 - m_1^2$
• Variance can therefore be found from the first and second moments, as the check below illustrates.
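A short numeric check of this identity (a sketch, using the pmf of Problem 1 later in this module):

```python
# Numeric check (sketch) that Var[X] = m2 - m1^2, using the pmf of Problem 1
# later in this module: P(0) = 0.2, P(1) = 0.3, P(2) = 0.1, P(3) = 0.4.
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
p = np.array([0.2, 0.3, 0.1, 0.4])

m1 = np.sum(x * p)                         # first moment about the origin
m2 = np.sum(x ** 2 * p)                    # second moment about the origin
var_direct = np.sum((x - m1) ** 2 * p)     # E[(X - mean)^2] computed directly

assert np.isclose(m2 - m1 ** 2, var_direct)
print(m1, m2, m2 - m1 ** 2)                # 1.7, 4.3, 1.41
```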
Standard deviation
• The positive square root of the variance is called the standard deviation of X, denoted $\sigma_X$.
• The standard deviation is a measure of the spread in the function $f_X(x)$ about the mean.
• $\sigma_X = +\sqrt{\mathrm{Var}[X]}$
Skewness
• The third central moment $\mu_3 = E[(X - \bar{X})^3]$ is a measure of the asymmetry of $f_X(x)$ about $x = \bar{X} = m_1$.
• It is also called the skew of the density function.
• If the density is symmetric about $x = \bar{X}$, it has zero skew.
• In that case, $\mu_n = 0$ for all odd values of n.
• Skewness / coefficient of skewness: the skewness of the density function is the normalized third central moment, $\dfrac{\mu_3}{\sigma_X^3}$.
• Similarly, the normalized fourth central moment $\dfrac{\mu_4}{\sigma_X^4}$ is the kurtosis, a measure of the sharpness (peakedness) of the density. A sketch of both follows.
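A sketch of both coefficients, for an assumed unit-rate exponential density (whose exact skewness and kurtosis are 2 and 9 respectively):

```python
# A sketch (assumed example: unit-rate exponential density, f_X(x) = e^{-x} for
# x >= 0) of the skewness and kurtosis coefficients computed by quadrature.
# For this density the exact values are skewness = 2 and kurtosis = 9.
import numpy as np
from scipy.integrate import quad

f = lambda t: np.exp(-t)                       # assumed density
mean, _ = quad(lambda t: t * f(t), 0, np.inf)  # first moment, = 1

def central_moment(n):
    # mu_n = E[(X - mean)^n] by numerical integration
    val, _ = quad(lambda t: (t - mean) ** n * f(t), 0, np.inf)
    return val

sigma = np.sqrt(central_moment(2))
print(central_moment(3) / sigma ** 3)   # skewness, approx 2
print(central_moment(4) / sigma ** 4)   # kurtosis, approx 9
```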
Problems
1. Consider a discrete random variable X with
$P_X(x) = \begin{cases} 0.2, & x = 0 \\ 0.3, & x = 1 \\ 0.1, & x = 2 \\ 0.4, & x = 3 \\ 0, & \text{otherwise} \end{cases}$
Find $m_1$, $m_2$, $\mu_1$, $\mu_2$, $\mu_3$, $\mu_4$. (A worked sketch follows this list.)

2. A continuous random variable has probability density function
$f_X(x) = \begin{cases} 2e^{-2x}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
Find $m_1$, $m_2$, $\mu_1$, $\mu_2$.

3. Let $f_X(x) = \begin{cases} \frac{5}{4}(1 - x^4), & 0 < x \le 1 \\ 0, & \text{elsewhere} \end{cases}$
Find a. $E[X]$, b. $E[4X + 2]$, c. $E[X^2]$.

4. Find the first two moments about the origin and about the mean for a random variable X having density function
$f_X(x) = \begin{cases} \frac{4x(9 - x^2)}{81}, & 0 \le x \le 3 \\ 0, & \text{elsewhere} \end{cases}$

5. Let X be the outcome of throwing a fair die. Find $m_1$, $m_2$, $m_3$, $\mu_1$, $\mu_2$.

6. An exponentially distributed density function is given by
$f_X(x) = \begin{cases} \frac{1}{b}\, e^{-(x-a)/b}, & x > a \\ 0, & \text{elsewhere} \end{cases}$
Determine:
(i) the mean
(ii) the conditional expected value $E[X \mid B]$
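A worked sketch of Problem 1, following the definitions above (the rounding is only for display):

```python
# A worked sketch of Problem 1: moments about the origin m_n and central
# moments mu_n for the given pmf (all arithmetic follows the definitions above).
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
p = np.array([0.2, 0.3, 0.1, 0.4])

def m(n):
    # m_n = E[X^n] = sum_i x_i^n P(x_i)
    return np.sum(x ** n * p)

def mu(n):
    # mu_n = E[(X - mean)^n]
    return np.sum((x - m(1)) ** n * p)

print(m(1), m(2))                                # m1 = 1.7, m2 = 4.3
print([round(mu(n), 4) for n in (1, 2, 3, 4)])   # mu1 = 0, mu2 = 1.41, ...
```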
Functions that Give Moments
The characteristic function and the moment generating function allow
moments to be calculated for a random variable X.

Characteristic Function
Continuous Random Variable:
$\varphi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty} f_X(x)\, e^{j\omega x}\, dx$

Discrete Random Variable:
$\varphi_X(\omega) = \sum_{i=1}^{N} e^{j\omega x_i}\, P(x_i)$

Inverse relation:
$f_X(x) = \dfrac{1}{2\pi} \int_{-\infty}^{\infty} \varphi_X(\omega)\, e^{-j\omega x}\, d\omega$

• The nth moment of X, obtained by differentiating $\varphi_X(\omega)$ n times with respect to $\omega$ and then setting $\omega = 0$, is
$m_n = (-j)^n \left.\dfrac{d^n \varphi_X(\omega)}{d\omega^n}\right|_{\omega=0}$
• The advantage of using $\varphi_X(\omega)$ to find moments is that it always exists.
• The maximum magnitude of a characteristic function is unity and occurs at $\omega = 0$, i.e., $|\varphi_X(\omega)| \le \varphi_X(0) = 1$.
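A SymPy sketch of this moment rule, applied to the shifted-exponential density used in the problem further below (evaluating the defining integral for that density gives the closed form used here):

```python
# A SymPy sketch (not from the source) of the moment rule above, applied to the
# shifted-exponential density f_X(x) = (1/b) e^{-(x-a)/b}, x > a. Evaluating
# the defining integral gives phi_X(omega) = e^{j a omega} / (1 - j b omega);
# differentiating it at omega = 0 yields the first moment.
import sympy as sp

w = sp.symbols('omega', real=True)
a, b = sp.symbols('a b', positive=True)

phi = sp.exp(sp.I * a * w) / (1 - sp.I * b * w)   # characteristic function

# m_n = (-j)^n d^n(phi)/d omega^n at omega = 0; here n = 1.
m1 = sp.simplify(((-sp.I) * sp.diff(phi, w)).subs(w, 0))
print(m1)   # -> a + b, the mean of the shifted exponential
```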
Moment Generating Function
The moment generating function is defined by

$M_X(v) = E[e^{vX}]$, where v is a real number, $-\infty < v < \infty$.

Continuous Random Variable:
$M_X(v) = \int_{-\infty}^{\infty} f_X(x)\, e^{vx}\, dx$

Discrete Random Variable:
$M_X(v) = \sum_{i=1}^{N} e^{v x_i}\, P(x_i)$

• The main advantage of the MGF is its ability to give moments:
$m_n = \left.\dfrac{d^n M_X(v)}{dv^n}\right|_{v=0}$
• The disadvantage of the moment generating function compared to the characteristic function is that it does not exist for all random variables or for all values of v.
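A companion SymPy sketch for the same shifted exponential (an assumed example): its MGF exists only for v < 1/b, which illustrates the restriction just noted, while the first moment matches the characteristic-function result:

```python
# A companion sketch (assumed example) for the same shifted exponential: its
# MGF is M_X(v) = e^{a v} / (1 - b v), which exists only for v < 1/b. That
# restriction illustrates the disadvantage noted above, while m1 = M'(0).
import sympy as sp

v = sp.symbols('v', real=True)
a, b = sp.symbols('a b', positive=True)

M = sp.exp(a * v) / (1 - b * v)        # moment generating function, v < 1/b
m1 = sp.simplify(sp.diff(M, v).subs(v, 0))
print(m1)                              # -> a + b, same mean as via phi_X
```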
Transformation of a Random Variable
A transformation is a means to change one random variable X into a new random variable Y:
$Y = T(X)$

1. Monotonic transformations:
$f_Y(y) = f_X(T^{-1}(y)) \left| \dfrac{dT^{-1}(y)}{dy} \right|$  (or)  $f_Y(y) = f_X(x) \left| \dfrac{dx}{dy} \right|$

2. Non-monotonic transformations:
$f_Y(y) = \sum_n \dfrac{f_X(x_n)}{\left| \dfrac{dT(x)}{dx} \right|_{x = x_n}}$, n = 1, 2, 3, …

where the $x_n$ are the real roots of $y = T(x)$.
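A numerical sketch of the non-monotonic rule for an assumed transformation, Y = X² with X standard Gaussian, comparing the formula against a histogram of samples:

```python
# A numerical sketch (assumed example) of the non-monotonic rule for
# Y = T(X) = X^2 with X ~ N(0, 1). The real roots of y = x^2 are
# x_{1,2} = +/- sqrt(y) and |dT/dx| = 2|x|, so
# f_Y(y) = [f_X(sqrt(y)) + f_X(-sqrt(y))] / (2 sqrt(y)).
import numpy as np

rng = np.random.default_rng(0)
ys = rng.standard_normal(200_000) ** 2          # samples of Y = X^2

fX = lambda t: np.exp(-t ** 2 / 2) / np.sqrt(2 * np.pi)
fY = lambda y: (fX(np.sqrt(y)) + fX(-np.sqrt(y))) / (2 * np.sqrt(y))

# Empirical density estimate from a histogram, compared with the formula.
counts, edges = np.histogram(ys, bins=np.linspace(0.1, 4.0, 40))
mid = 0.5 * (edges[:-1] + edges[1:])
emp = counts / (len(ys) * (edges[1] - edges[0]))
print(np.max(np.abs(emp - fY(mid))))            # small -> the rule checks out
```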
Problems
1. Consider a random variable with the exponential density function
$f_X(x) = \begin{cases} \frac{1}{b}\, e^{-(x-a)/b}, & x > a \\ 0, & x \le a \end{cases}$
a) Find the characteristic function and the first moment.
b) Find the moment generating function and the first moment.
Operations on Multiple Random Variables
Joint Moments about the Origin
$m_{nk} = E[X^n Y^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^k\, f_{X,Y}(x, y)\, dx\, dy$

$m_{n0} = E[X^n] = \overline{X^n}$ - the moments $m_n$ of X alone
$m_{0k} = E[Y^k] = \overline{Y^k}$ - the moments $m_k$ of Y alone

The first-order moments, i.e., the expected values of X and Y, are:
$m_{10} = E[X] = \bar{X}$
$m_{01} = E[Y] = \bar{Y}$
• These are also the coordinates of the 'center of gravity' of the function $f_{X,Y}(x, y)$.
• n + k is the order of the moment.
• The second-order moment $m_{11} = E[XY]$ is called the correlation of X and Y and is denoted $R_{XY}$.
• If $R_{XY} = E[XY] = E[X]\,E[Y]$, X and Y are uncorrelated.
• If $R_{XY} = E[XY] = 0$, X and Y are called orthogonal.
• Statistical independence of X and Y is sufficient to guarantee that they are uncorrelated; the opposite is not necessarily true. (A numerical sketch follows.)
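A sample-based sketch (with assumed distributions) of $R_{XY}$ and the independence-implies-uncorrelated statement:

```python
# A sample-based sketch (assumed distributions) of the second-order joint
# moment R_XY = E[XY]. X and Y are generated independently, so the estimate
# of E[XY] should match E[X] E[Y]: independent implies uncorrelated.
import numpy as np

rng = np.random.default_rng(1)
x = rng.exponential(2.0, 500_000)     # E[X] = 2
y = rng.normal(3.0, 1.0, 500_000)     # E[Y] = 3

print(np.mean(x * y))                 # R_XY = m11, approx 6
print(np.mean(x) * np.mean(y))        # E[X] E[Y], also approx 6
```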
Joint Central Moments
• $\mu_{nk} = E[(X - \bar{X})^n (Y - \bar{Y})^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \bar{X})^n (y - \bar{Y})^k\, f_{X,Y}(x, y)\, dx\, dy$
• Second-order central moments, or variances:
$\mu_{20} = E[(X - \bar{X})^2] = \sigma_X^2$
$\mu_{02} = E[(Y - \bar{Y})^2] = \sigma_Y^2$
• $\mu_{11} = E[(X - \bar{X})(Y - \bar{Y})] = C_{XY}$, the covariance of X and Y:
$C_{XY} = E[(X - \bar{X})(Y - \bar{Y})] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x - \bar{X})(y - \bar{Y})\, f_{X,Y}(x, y)\, dx\, dy$
• If we expand $C_{XY}$:
$C_{XY} = E[(X - \bar{X})(Y - \bar{Y})] = R_{XY} - \bar{X}\bar{Y} = R_{XY} - E[X]\,E[Y]$
• $C_{XY} = 0$ if X and Y are independent or uncorrelated.
• If either X or Y has zero mean ($\bar{X} = 0$ or $\bar{Y} = 0$), then $C_{XY} = R_{XY}$.
• If X and Y are orthogonal, $C_{XY} = -E[X]\,E[Y]$.
• For N random variables $X_1, X_2, \ldots, X_N$, the $(n_1 + n_2 + \cdots + n_N)$-order joint central moment is
$\mu_{n_1, n_2, \ldots, n_N} = E[(X_1 - \bar{X}_1)^{n_1} (X_2 - \bar{X}_2)^{n_2} \cdots (X_N - \bar{X}_N)^{n_N}]$
Joint Central Moments (contd.)
• The normalized second-order central moment

$\rho = \dfrac{\mu_{11}}{\sqrt{\mu_{20}\,\mu_{02}}} = \dfrac{C_{XY}}{\sigma_X \sigma_Y}$

$\rho = E\left[ \dfrac{(X - \bar{X})(Y - \bar{Y})}{\sigma_X \sigma_Y} \right]$

is known as the correlation coefficient of X and Y.

• Advantage: $-1 \le \rho \le +1$. (A numerical sketch follows.)
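A sample-based sketch of the correlation coefficient, for an assumed linear model Y = 2X + noise:

```python
# A sample-based sketch (assumed linear model) of the correlation coefficient
# rho = C_XY / (sigma_X sigma_Y). Here Y = 2X + N with independent noise N,
# so the exact value is rho = 2 / sqrt(5) ~ 0.894, inside [-1, +1].
import numpy as np

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 200_000)
y = 2.0 * x + rng.normal(0.0, 1.0, 200_000)

cxy = np.mean((x - x.mean()) * (y - y.mean()))   # covariance C_XY
print(cxy / (x.std() * y.std()))                 # rho, approx 0.894
```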
Joint Characteristic Function
• $\phi_{X,Y}(\omega_1, \omega_2) = E[e^{j\omega_1 X + j\omega_2 Y}] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{X,Y}(x, y)\, e^{j\omega_1 x + j\omega_2 y}\, dx\, dy$
This is a two-dimensional Fourier transform with the signs of $\omega_1$ and $\omega_2$ reversed.
• The inverse relation:
$f_{X,Y}(x, y) = \dfrac{1}{(2\pi)^2} \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} \phi_{X,Y}(\omega_1, \omega_2)\, e^{-j\omega_1 x - j\omega_2 y}\, d\omega_1\, d\omega_2$

$\omega_1$ and $\omega_2$ are real numbers.

• By setting either $\omega_2 = 0$ or $\omega_1 = 0$, the marginal characteristic functions can be obtained:
$\phi_X(\omega_1) = \phi_{X,Y}(\omega_1, 0)$
$\phi_Y(\omega_2) = \phi_{X,Y}(0, \omega_2)$
• The joint moments $m_{nk}$ can be determined from the joint characteristic function:
$m_{nk} = (-j)^{n+k} \left.\dfrac{\partial^{n+k} \phi_{X,Y}(\omega_1, \omega_2)}{\partial \omega_1^{\,n}\, \partial \omega_2^{\,k}}\right|_{\omega_1 = 0,\, \omega_2 = 0}$
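A SymPy sketch of this rule, for assumed independent unit-mean exponential X and Y (independence factorizes the joint characteristic function into the product of the marginals):

```python
# A SymPy sketch (assumed example: X and Y independent unit-mean exponentials)
# of extracting m11 from the joint characteristic function. Independence
# factorizes Phi into the product of the marginal characteristic functions.
import sympy as sp

w1, w2 = sp.symbols('omega1 omega2', real=True)

Phi = (1 / (1 - sp.I * w1)) * (1 / (1 - sp.I * w2))   # phi_X(w1) * phi_Y(w2)

# m_nk = (-j)^(n+k) d^(n+k) Phi / (dw1^n dw2^k) at w1 = w2 = 0; here n = k = 1.
m11 = sp.simplify(((-sp.I) ** 2 * sp.diff(Phi, w1, w2)).subs({w1: 0, w2: 0}))
print(m11)   # -> 1, i.e. E[XY] = E[X] E[Y] for independent X and Y
```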
