Module 2_1
If g(·) is a real function of a random variable X, then its expected value is
– Discrete random variable: $E[g(X)] = \sum_{i=1}^{N} g(x_i)\,P(x_i)$
– Continuous random variable: $E[g(X)] = \int_{-\infty}^{\infty} g(x)\,f_X(x)\,dx$
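As a quick illustration (not part of the original slides), the sketch below evaluates E[g(X)] both ways for the arbitrary choice g(x) = x², reusing the pmf of Problem 1 and the density of Problem 2 from later in this module.

```python
import numpy as np
from scipy import integrate

# Discrete case: E[g(X)] = sum_i g(x_i) P(x_i).
# pmf taken from Problem 1 below.
x = np.array([0, 1, 2, 3])
p = np.array([0.2, 0.3, 0.1, 0.4])
g = lambda t: t**2
E_g_discrete = np.sum(g(x) * p)   # 0.3 + 0.4 + 3.6 = 4.3

# Continuous case: E[g(X)] = integral of g(x) f_X(x) dx over the support.
# Density f_X(x) = 2 e^{-2x}, x >= 0, taken from Problem 2 below.
f_X = lambda t: 2 * np.exp(-2 * t)
E_g_continuous, _ = integrate.quad(lambda t: g(t) * f_X(t), 0, np.inf)

print(E_g_discrete)    # 4.3
print(E_g_continuous)  # E[X^2] = 0.5 for this density
```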
Conditional Expected Value
The conditional expected value of a random variable X, given an event B, is
– Continuous random variable: $E[X|B] = \int_{-\infty}^{\infty} x\,f_X(x|B)\,dx$
– Discrete random variable: $E[X|B] = \sum_{i=1}^{N} x_i\,P(x_i|B)$
A common conditioning event is $B = \{X \le b\}$, $-\infty < b < \infty$.
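A minimal numeric sketch, assuming (purely for illustration) the density f_X(x) = 2e^{-2x}, x ≥ 0, and the event B = {X ≤ 1}; conditioning truncates the density to x ≤ b and renormalizes it by P(B).

```python
import numpy as np
from scipy import integrate

# For B = {X <= b}: f_X(x|B) = f_X(x) / P(X <= b) on x <= b, else 0.
f_X = lambda t: 2 * np.exp(-2 * t)
b = 1.0

P_B, _ = integrate.quad(f_X, 0, b)  # P(X <= b)
E_X_given_B, _ = integrate.quad(lambda t: t * f_X(t) / P_B, 0, b)

print(E_X_given_B)  # ~0.343, below the unconditional mean of 0.5
```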
Moments
The “moments” of a random variable (or of its distribution) are expected values of powers, or related functions, of the random variable.
$E[X] = \bar{X} = \int_{-\infty}^{\infty} x\,f_X(x)\,dx$ - first moment about the origin
$E[X^2] = \overline{X^2} = \int_{-\infty}^{\infty} x^2\,f_X(x)\,dx$ - mean square of X, or second moment about the origin
$E[(X-\bar{X})^2] = \int_{-\infty}^{\infty} (x-\bar{X})^2\,f_X(x)\,dx$ - second moment about the mean, or variance
The nth moment about the origin of a random variable X uses g(X) = X^n, n = 0, 1, 2, …
– Discrete random variable, nth moment: $m_n = E[X^n] = \sum_{i=1}^{N} x_i^n\,P(x_i)$
– Continuous random variable, nth moment: $m_n = E[X^n] = \int_{-\infty}^{\infty} x^n\,f_X(x)\,dx$
Here $m_0 = 1$ (the area under $f_X(x)$) and $m_1 = \bar{X}$, the expected value of X.
$\mathrm{Var}[X] = E[(X-\bar{X})^2] = E[X^2 - 2X\bar{X} + \bar{X}^2]$
$= E[X^2] - 2\bar{X}\,E[X] + \bar{X}^2$
$= m_2 - m_1^2$
• Variance can therefore be found from the first and second moments alone, as illustrated below.
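A small check of the identity Var[X] = m₂ − m₁², using a fair die (Problem 5 below) as the example distribution:

```python
import numpy as np

# Moments of a fair die (see Problem 5 below) and the variance
# obtained from the first and second moments.
x = np.arange(1, 7)
p = np.full(6, 1 / 6)

m1 = np.sum(x * p)       # 3.5
m2 = np.sum(x**2 * p)    # 91/6 ~ 15.167
var = m2 - m1**2         # 35/12 ~ 2.917

print(m1, m2, var)
```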
Standard deviation
• The positive square root of the variance is called the standard deviation of X, denoted by $\sigma_X$
• $\sigma_X = +\sqrt{\mathrm{Var}[X]}$
Skewness
• The third central moment $\mu_3 = E[(X-\bar{X})^3]$ is a measure of the asymmetry of $f_X(x)$ about $x = \bar{X} = m_1$.
• It is also called the skew of the density function.
• If the density is symmetric about $x = \bar{X}$, then it has zero skew.
• In this case, $\mu_n = 0$ for all odd values of n.
• Skewness (coefficient of skewness): the skewness of the density function is the normalized third central moment, $\mu_3 / \sigma_X^3$.
• Similarly, the fourth central moment $\mu_4$, normalized as $\mu_4 / \sigma_X^4$, is the kurtosis, a measure of the sharpness (peakedness) of the density.
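A numeric sketch of these two normalized moments, again assuming the exponential density f_X(x) = 2e^{-2x}, x ≥ 0; any exponential density has skewness 2 and kurtosis 9, which the integrals below reproduce.

```python
import numpy as np
from scipy import integrate

# Central moments of f_X(x) = 2 e^{-2x}, x >= 0, by direct integration.
f_X = lambda t: 2 * np.exp(-2 * t)

m1, _ = integrate.quad(lambda t: t * f_X(t), 0, np.inf)
mu = lambda n: integrate.quad(lambda t: (t - m1)**n * f_X(t), 0, np.inf)[0]

sigma = np.sqrt(mu(2))
skew = mu(3) / sigma**3  # ~2 for any exponential density
kurt = mu(4) / sigma**4  # ~9 for any exponential density

print(skew, kurt)
```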
Problems
1. Consider a discrete random variable X with
$P_X(x) = \begin{cases} 0.2, & x = 0 \\ 0.3, & x = 1 \\ 0.1, & x = 2 \\ 0.4, & x = 3 \\ 0, & \text{otherwise} \end{cases}$
Find $m_1$, $m_2$, $\mu_1$, $\mu_2$, $\mu_3$, $\mu_4$. (A numeric check is sketched after this list.)
2. A continuous random variable has probability density function
$f_X(x) = \begin{cases} 2e^{-2x}, & x \ge 0 \\ 0, & \text{otherwise} \end{cases}$
Find $m_1$, $m_2$, $\mu_1$, $\mu_2$.
3. Let $f_X(x) = \begin{cases} \frac{5}{4}(1 - x^4), & 0 < x \le 1 \\ 0, & \text{elsewhere} \end{cases}$
Find a. $E[X]$, b. $E[4X+2]$, c. $E[X^2]$.
4. Find the first two moments about the origin and about the mean for a random variable X having density function
$f_X(x) = \begin{cases} \frac{4x(9 - x^2)}{81}, & 0 \le x \le 3 \\ 0, & \text{elsewhere} \end{cases}$
5. Let X be the outcome of throwing a fair die. Find $m_1$, $m_2$, $m_3$, $\mu_1$, $\mu_2$, and determine:
(i) the mean
(ii) the conditional expected value $E[X|B]$
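As promised above, a quick numeric check of Problem 1 (the closed-form answers follow directly from the definitions; this sketch only verifies them):

```python
import numpy as np

# Problem 1: moments about the origin and central moments of the pmf.
x = np.array([0, 1, 2, 3])
p = np.array([0.2, 0.3, 0.1, 0.4])

m1 = np.sum(x * p)     # 1.7
m2 = np.sum(x**2 * p)  # 4.3
mu = [np.sum((x - m1)**n * p) for n in range(1, 5)]

print(m1, m2)
print(mu)  # mu_1 = 0 always; mu_2 = m2 - m1**2 = 1.41
```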
Functions that Give Moments
The characteristic function and the moment generating function allow the moments of a random variable X to be calculated.
Characteristic function
– Continuous random variable: $\varphi_X(\omega) = E[e^{j\omega X}] = \int_{-\infty}^{\infty} f_X(x)\,e^{j\omega x}\,dx$
– Discrete random variable: $\varphi_X(\omega) = \sum_{i=1}^{N} e^{j\omega x_i}\,P(x_i)$
– Inverse relation: $f_X(x) = \frac{1}{2\pi}\int_{-\infty}^{\infty} \varphi_X(\omega)\,e^{-j\omega x}\,d\omega$
– Moments: $m_n = (-j)^n \left.\frac{d^n \varphi_X(\omega)}{d\omega^n}\right|_{\omega=0}$
Moment generating function
– Discrete random variable: $M_X(v) = \sum_{i=1}^{N} e^{v x_i}\,P(x_i)$
– Continuous random variable: $M_X(v) = \int_{-\infty}^{\infty} f_X(x)\,e^{v x}\,dx$
– Moments: $m_n = \left.\frac{d^n M_X(v)}{dv^n}\right|_{v=0}$
• The disadvantage of the moment generating function, compared with the characteristic function, is that it does not exist for all random variables and all values of v.
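A symbolic sketch of the MGF route, assuming SymPy and the density of Problem 2 (f_X(x) = 2e^{-2x}, x ≥ 0), for which M_X(v) = 2/(2 − v), v < 2:

```python
import sympy as sp

# m_n = d^n M_X(v)/dv^n at v = 0, for f_X(x) = 2 e^{-2x}, x >= 0.
v, x = sp.symbols('v x')
f_X = 2 * sp.exp(-2 * x)

M_X = sp.integrate(f_X * sp.exp(v * x), (x, 0, sp.oo), conds='none')
m1 = sp.diff(M_X, v, 1).subs(v, 0)  # 1/2
m2 = sp.diff(M_X, v, 2).subs(v, 0)  # 1/2

print(sp.simplify(M_X), m1, m2)
```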
Transformation of Random Variable
A transformation is a means of changing one random variable X into a new random variable Y:
$Y = T(X)$
1. Monotonic transformations:
$f_Y(y) = f_X(T^{-1}(y))\left|\frac{dT^{-1}(y)}{dy}\right|$ (or) $f_Y(y) = f_X(x)\left|\frac{dx}{dy}\right|$
2. Non-monotonic transformations, where the $x_n$, n = 1, 2, 3, …, are the real roots of $y = T(x)$:
$f_Y(y) = \sum_n \frac{f_X(x_n)}{\left|\frac{dT(x)}{dx}\right|_{x=x_n}}$
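A Monte Carlo sanity check of the non-monotonic rule, for the illustrative choice Y = T(X) = X² with X standard normal; the roots of y = x² are x₁ = √y and x₂ = −√y, and |dT/dx| = 2|x|:

```python
import numpy as np

# Formula: f_Y(y) = [f_X(sqrt(y)) + f_X(-sqrt(y))] / (2 * sqrt(y)).
rng = np.random.default_rng(0)
X = rng.standard_normal(1_000_000)
Y = X**2

f_X = lambda t: np.exp(-t**2 / 2) / np.sqrt(2 * np.pi)
f_Y = lambda y: (f_X(np.sqrt(y)) + f_X(-np.sqrt(y))) / (2 * np.sqrt(y))

# Compare the formula against an empirical density estimate near y = 1.
y0, h = 1.0, 0.05
empirical = np.mean((Y > y0 - h) & (Y < y0 + h)) / (2 * h)
print(f_Y(y0), empirical)  # both ~0.242
```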
Problems
1. Consider a random variable with the exponential density function
$f_X(x) = \begin{cases} \frac{1}{b}\,e^{-(x-a)/b}, & x > a \\ 0, & x < a \end{cases}$
a) Find the characteristic function and the first moment.
b) Find the moment generating function and the first moment.
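A symbolic sketch of both parts with SymPy (SymPy's I plays the role of j); assuming the integrals evaluate in closed form, both routes should give the first moment a + b:

```python
import sympy as sp

# f_X(x) = (1/b) e^{-(x-a)/b}, x > a, with b > 0.
x = sp.symbols('x')
w, v = sp.symbols('omega v', real=True)
a, b = sp.symbols('a b', positive=True)
f_X = sp.exp(-(x - a) / b) / b

# a) Characteristic function and m_1 = (1/j) d(phi)/d(omega) at omega = 0.
phi = sp.integrate(f_X * sp.exp(sp.I * w * x), (x, a, sp.oo), conds='none')
m1_cf = sp.simplify(sp.diff(phi, w).subs(w, 0) / sp.I)  # a + b

# b) Moment generating function and m_1 = dM/dv at v = 0.
M = sp.integrate(f_X * sp.exp(v * x), (x, a, sp.oo), conds='none')
m1_mgf = sp.simplify(sp.diff(M, v).subs(v, 0))          # a + b

print(m1_cf, m1_mgf)
```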
Operations on Multiple Random Variables
Joint Moments about Origin
$m_{nk} = E[X^n Y^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x^n y^k\,f_{X,Y}(x,y)\,dx\,dy$
The first-order moments, i.e. the expected values of X and Y, are:
$m_{10} = E[X] = \bar{X}$
$m_{01} = E[Y] = \bar{Y}$
• These are also the coordinates of the ‘center of gravity’ of the function $f_{X,Y}(x,y)$.
• n + k is the order of the moment.
• The second-order moment $m_{11} = E[XY]$ is the correlation of X and Y, denoted by $R_{XY}$.
• If $R_{XY} = E[XY] = E[X]\,E[Y]$, X and Y are uncorrelated.
• If $R_{XY} = E[XY] = 0$, X and Y are called orthogonal.
• Statistical independence of X and Y is sufficient to guarantee that they are uncorrelated; the opposite is not necessarily true.
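A numeric sketch of these joint moments, for the illustrative joint density f_{X,Y}(x, y) = x + y on the unit square (chosen here because it integrates to 1 and is mildly correlated):

```python
from scipy import integrate

# m_nk = E[X^n Y^k] by double integration over the unit square.
f_XY = lambda y, x: x + y  # dblquad passes the inner variable (y) first

def m(n, k):
    val, _ = integrate.dblquad(lambda y, x: x**n * y**k * f_XY(y, x),
                               0, 1, 0, 1)
    return val

m10, m01, m11 = m(1, 0), m(0, 1), m(1, 1)
print(m10, m01)        # E[X] = E[Y] = 7/12
print(m11, m10 * m01)  # R_XY = 1/3 != E[X]E[Y], so X and Y are correlated
```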
Joint Central Moments
• $\mu_{nk} = E[(X-\bar{X})^n (Y-\bar{Y})^k] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x-\bar{X})^n (y-\bar{Y})^k\,f_{X,Y}(x,y)\,dx\,dy$
• The second-order central moments are the variances:
$\mu_{20} = E[(X-\bar{X})^2] = \sigma_X^2$
$\mu_{02} = E[(Y-\bar{Y})^2] = \sigma_Y^2$
$\mu_{11} = E[(X-\bar{X})(Y-\bar{Y})] = C_{XY}$ ⇒ covariance of X and Y
$C_{XY} = E[(X-\bar{X})(Y-\bar{Y})] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} (x-\bar{X})(y-\bar{Y})\,f_{X,Y}(x,y)\,dx\,dy$
• If we expand $C_{XY}$:
$C_{XY} = E[(X-\bar{X})(Y-\bar{Y})] = R_{XY} - \bar{X}\bar{Y} = R_{XY} - E[X]\,E[Y]$
• If X and Y are independent or uncorrelated, $C_{XY} = 0$.
• If either X or Y has zero mean ($\bar{X} = 0$ or $\bar{Y} = 0$), then $C_{XY} = R_{XY}$.
• If X and Y are orthogonal, $C_{XY} = -E[X]\,E[Y]$.
• For N random variables $X_1, X_2, \ldots, X_N$, the $(n_1 + n_2 + \cdots + n_N)$-order joint central moment is
$\mu_{n_1 n_2 \ldots n_N} = E[(X_1-\bar{X}_1)^{n_1} (X_2-\bar{X}_2)^{n_2} \cdots (X_N-\bar{X}_N)^{n_N}]$
• The normalized second-order central moment is the correlation coefficient:
$\rho = \frac{\mu_{11}}{\sqrt{\mu_{20}\,\mu_{02}}} = \frac{C_{XY}}{\sigma_X\,\sigma_Y}$
$\rho = E\left[\frac{(X-\bar{X})(Y-\bar{Y})}{\sigma_X\,\sigma_Y}\right]$
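A Monte Carlo sketch of C_XY and ρ; the construction Y = X + 0.5N with independent standard normals X and N is an arbitrary illustrative choice, for which C_XY = Var[X] = 1 and ρ = 1/√1.25 ≈ 0.894:

```python
import numpy as np

# Sample covariance and correlation coefficient from simulated pairs.
rng = np.random.default_rng(1)
X = rng.standard_normal(100_000)
Y = X + 0.5 * rng.standard_normal(100_000)

C_XY = np.mean((X - X.mean()) * (Y - Y.mean()))
rho = C_XY / (X.std() * Y.std())

print(C_XY)  # ~1.0
print(rho)   # ~0.894
```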