Moment Generating Functions (MGF)
Reference:
1. Meyer, P. L. Introductory Probability and Statistical Applications. Oxford and IBH Publishing, 1965.
2. Johnson, R. A., I. Miller, and J. E. Freund. Probability and Statistics for Engineers. 2000.
Definition:
Let $X$ be a random variable. The moment generating function (mgf) of $X$ is defined by
$$M_X(t) = E(e^{tX}) = \begin{cases} \sum_x e^{tx}\, p(x), & X \text{ discrete} \\ \int_{-\infty}^{\infty} e^{tx} f(x)\, dx, & X \text{ continuous} \end{cases} \quad\text{----------(1)}$$
$M_X(t)$ may not exist for all r.v.'s $X$. In general, $M_X(t)$ will exist only for those values of $t$ for which the sum or integral of Eq. (1) converges absolutely.
Suppose that $M_X(t)$ exists. If we expand $e^{tX}$ formally and take expectations term by term, then
$$M_X(t) = E(e^{tX}) = E\left[1 + tX + \frac{1}{2!}(tX)^2 + \cdots + \frac{1}{k!}(tX)^k + \cdots\right]$$
$$= 1 + tE(X) + \frac{t^2}{2!}E(X^2) + \cdots + \frac{t^k}{k!}E(X^k) + \cdots \quad\text{----------(2)}$$
Note:
$E(X^n)$ is the coefficient of $\dfrac{t^n}{n!}$ in $M_X(t)$.
1. Let $X$ have pdf $f(x) = \dfrac{e^{-|x|}}{2}$, $-\infty < x < \infty$. Find the mgf of $X$ and hence find $E(X)$ and $V(X)$.
Solution:
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \int_{-\infty}^{\infty} e^{tx}\, \frac{e^{-|x|}}{2}\, dx$$
$$= \int_{-\infty}^{0} e^{tx}\, \frac{e^{x}}{2}\, dx + \int_{0}^{\infty} e^{tx}\, \frac{e^{-x}}{2}\, dx$$
$$= \int_{-\infty}^{0} \frac{e^{(1+t)x}}{2}\, dx + \int_{0}^{\infty} \frac{e^{-(1-t)x}}{2}\, dx = \frac{1}{2(1+t)} + \frac{1}{2(1-t)}$$
$$M_X(t) = \frac{1}{1-t^2}, \quad -1 < t < 1$$
$M_X'(0) = 0$ and $M_X''(0) = 2$. Equivalently, expanding $\frac{1}{1-t^2} = 1 + t^2 + t^4 + \cdots$: the coefficient of $t$ gives $E(X) = 0$, and the coefficient of $\frac{t^2}{2!}$ gives $E(X^2) = 2$, so $V(X) = 2 - 0 = 2$.
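As a quick numerical sanity check (a sketch, not part of the original derivation; the step size $h$ is an arbitrary small value), the derivatives $M'(0)$ and $M''(0)$ of $M_X(t) = 1/(1-t^2)$ can be approximated by central differences:

```python
def mgf(t):
    # mgf of f(x) = e^{-|x|}/2, valid for -1 < t < 1
    return 1.0 / (1.0 - t * t)

h = 1e-4  # small step for central differences
mean = (mgf(h) - mgf(-h)) / (2 * h)                # approximates M'(0) = E(X)
second = (mgf(h) - 2 * mgf(0) + mgf(-h)) / h**2    # approximates M''(0) = E(X^2)
variance = second - mean**2
print(round(mean, 6), round(variance, 6))  # ≈ 0 and 2, as derived above
```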
2. Let $X$ be a random variable taking the values $0, 1, 2, \cdots$ with $f(x) = ab^x$, where $a, b > 0$ and $a + b = 1$. Find the mgf of $X$. If $E(X) = m_1$ and $E(X^2) = m_2$, then show that $m_2 = m_1(2m_1 + 1)$.
Solution:
$$M_X(t) = \sum_{x=0}^{\infty} ab^x e^{tx} = a \sum_{x=0}^{\infty} (be^t)^x = \frac{a}{1 - be^t}, \quad be^t < 1$$
$$E(X) = M_X'(0) = \frac{ab}{(1-b)^2}$$
$$E(X^2) = M_X''(0) = \frac{(1+b)ab}{(1-b)^3}$$
Given $E(X) = m_1$ and $E(X^2) = m_2$,
$$m_1(2m_1 + 1) = \frac{ab}{(1-b)^2}\left[\frac{2ab}{(1-b)^2} + 1\right] = \frac{ab}{(1-b)^4}\left(2ab + (1-b)^2\right) = \frac{ab}{(1-b)^4}\left(2b(a-1) + 1 + b^2\right)$$
$$= \frac{ab}{(1-b)^4}\left(2b(-b) + 1 + b^2\right) = \frac{ab(1-b^2)}{(1-b)^4} = \frac{(1+b)ab}{(1-b)^3} = m_2.$$
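The identity $m_2 = m_1(2m_1+1)$ can also be checked numerically for a concrete choice of $b$ ($b = 0.3$ below is an arbitrary illustration), both from the closed forms above and directly from truncated pmf sums:

```python
b = 0.3              # arbitrary choice with 0 < b < 1
a = 1 - b

# closed forms derived from the mgf
m1 = a * b / (1 - b) ** 2
m2 = (1 + b) * a * b / (1 - b) ** 3

# the same moments computed directly from the pmf f(x) = a*b^x (truncated sum)
m1_direct = sum(x * a * b**x for x in range(200))
m2_direct = sum(x * x * a * b**x for x in range(200))

print(abs(m1 - m1_direct) < 1e-9, abs(m2 - m2_direct) < 1e-9)  # True True
print(abs(m2 - m1 * (2 * m1 + 1)) < 1e-9)                      # True
```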
Show that the binomial distribution with parameters $n$ and $p$ has mgf $M_X(t) = (pe^t + q)^n$, where $q = 1 - p$, and hence find its mean and variance.
Solution:
$$M_X(t) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x q^{n-x} = \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x q^{n-x} = (pe^t + q)^n$$
$$E(X) = M_X'(0) = np$$
$$E(X^2) = M_X''(0) = (n-1)np^2 + np$$
Also, the variance is
$$V(X) = E(X^2) - [E(X)]^2 = np(1-p) = npq.$$
Show that the Poisson distribution with parameter $\lambda$ has mgf
$$M(t) = e^{\lambda(e^t - 1)} \quad \text{for all } t.$$
Solution:
$$M_X(t) = \sum_{k=0}^{\infty} e^{tk}\, \frac{e^{-\lambda}\lambda^k}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda e^t)^k}{k!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)}$$
$$E(X) = \lambda$$
$$E(X^2) = \lambda + \lambda^2$$
The variance is
$$Var(X) = E(X^2) - [E(X)]^2 = \lambda$$
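As a numeric cross-check (a sketch; $\lambda = 2.5$ and the tested $t$ values are arbitrary), the closed form $e^{\lambda(e^t-1)}$ can be compared against the defining sum $\sum_k e^{tk} e^{-\lambda}\lambda^k/k!$ truncated at a large $k$:

```python
import math

lam = 2.5  # arbitrary Poisson parameter

def mgf_closed(t):
    return math.exp(lam * (math.exp(t) - 1))

def mgf_sum(t, terms=150):
    # truncated E(e^{tX}) = sum_k e^{tk} e^{-lam} lam^k / k!,
    # built term by term via the ratio of consecutive terms
    total = term = math.exp(-lam)        # k = 0 term
    for k in range(1, terms):
        term *= lam * math.exp(t) / k
        total += term
    return total

for t in (-1.0, 0.0, 0.5, 1.0):
    assert abs(mgf_closed(t) - mgf_sum(t)) < 1e-9
print("closed form matches the defining sum")
```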
Moment generating function for gamma distribution
A continuous random variable $X$ has a gamma distribution if its pdf is
$$f(x) = \begin{cases} \dfrac{\alpha^r x^{r-1} e^{-\alpha x}}{\Gamma(r)}, & x > 0,\ \alpha, r > 0 \\ 0, & \text{elsewhere} \end{cases}$$
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \frac{\alpha^r}{\Gamma(r)} \int_0^{\infty} e^{-(\alpha - t)x} x^{r-1}\, dx$$
Substitute $x(\alpha - t) = v$; then $dx = \dfrac{dv}{\alpha - t}$, so
$$M_X(t) = \frac{\alpha^r}{\Gamma(r)} \int_0^{\infty} e^{-v} \left(\frac{v}{\alpha - t}\right)^{r-1} \frac{dv}{\alpha - t}$$
$$= \frac{\alpha^r}{\Gamma(r)(\alpha - t)^r} \int_0^{\infty} e^{-v} v^{r-1}\, dv$$
$$= \frac{\alpha^r}{\Gamma(r)(\alpha - t)^r}\, \Gamma(r) = \left(\frac{\alpha}{\alpha - t}\right)^r, \quad t < \alpha$$
$$E(X) = \frac{r}{\alpha}, \qquad E(X^2) = \frac{r(r+1)}{\alpha^2}, \qquad V(X) = \frac{r}{\alpha^2}$$
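A quick Monte Carlo check of $E(X) = r/\alpha$ and $V(X) = r/\alpha^2$ (shape $r = 3$ and rate $\alpha = 2$ are arbitrary illustrative values; note that `random.gammavariate` takes a shape and a scale, so the scale is $1/\alpha$):

```python
import random

random.seed(0)
r, alpha = 3.0, 2.0               # arbitrary shape and rate
n = 200_000
# random.gammavariate(shape, scale); scale = 1/alpha for a rate parameterization
sample = [random.gammavariate(r, 1 / alpha) for _ in range(n)]
mean = sum(sample) / n
var = sum((x - mean) ** 2 for x in sample) / n
print(round(mean, 2), round(var, 2))  # should be close to r/alpha = 1.5 and r/alpha^2 = 0.75
```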
Taking $r = 1$ and $\alpha = \lambda$ gives the exponential distribution, so
$$M_X(t) = \frac{\lambda}{\lambda - t}, \quad t < \lambda$$
$$E(X) = \frac{1}{\lambda}, \qquad V(X) = \frac{1}{\lambda^2}$$
Moment generating function for chi square distribution
Special case of gamma distribution: $r = \dfrac{n}{2}$ and $\alpha = \dfrac{1}{2}$ in the $\Gamma$ density gives the $\chi^2$ distribution.
A continuous random variable $X$ is said to have a chi-square distribution with $n$ degrees of freedom if its pdf is given by
$$f(x) = \begin{cases} \dfrac{x^{\frac{n}{2}-1} e^{-\frac{x}{2}}}{\Gamma\!\left(\frac{n}{2}\right) 2^{\frac{n}{2}}}, & x > 0 \\ 0, & \text{elsewhere} \end{cases}$$
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \frac{1}{\Gamma\!\left(\frac{n}{2}\right) 2^{\frac{n}{2}}} \int_0^{\infty} e^{\left(t - \frac{1}{2}\right)x}\, x^{\frac{n}{2}-1}\, dx$$
$$M_X(t) = (1 - 2t)^{-\frac{n}{2}}, \quad t < \tfrac{1}{2}$$
Moment generating function for uniform distribution
Let $X$ be uniformly distributed over $(a, b)$, with pdf $f(x) = \frac{1}{b-a}$, $a < x < b$. Then
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \int_a^b \frac{e^{tx}}{b-a}\, dx = \frac{1}{t(b-a)}\left(e^{bt} - e^{at}\right), \quad t \neq 0$$
Expand and pick the suitable coefficients in the expansion to find $E(X)$ and $V(X)$:
$$\text{Mean } E(X) = \frac{a+b}{2}$$
$$E(X^2) = \frac{a^2 + ab + b^2}{3}$$
$$\text{Variance } V(X) = E(X^2) - [E(X)]^2 = \frac{(b-a)^2}{12}$$
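The closed form can be cross-checked against a direct numerical evaluation of the defining integral (the interval $(1, 4)$ and $t = 0.3$ below are arbitrary; midpoint rule):

```python
import math

a, b, t = 1.0, 4.0, 0.3   # arbitrary interval and t != 0
n = 10_000                 # midpoint-rule panels

dx = (b - a) / n
# numerically integrate e^{tx} * 1/(b-a) over (a, b)
numeric = sum(math.exp(t * (a + (i + 0.5) * dx)) for i in range(n)) * dx / (b - a)
closed = (math.exp(b * t) - math.exp(a * t)) / (t * (b - a))
print(abs(numeric - closed) < 1e-6)  # True
```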
Moment generating function for normal distribution
Show that if $X \sim N(\mu, \sigma^2)$, then $M_X(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}$, which exists for all $t$. Also, verify the first two moments.
Solution:
$$M_X(t) = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{+\infty} e^{tx}\, e^{-\frac{1}{2}\left[\frac{x-\mu}{\sigma}\right]^2} dx$$
Let $\frac{x - \mu}{\sigma} = s$; thus $x = \sigma s + \mu$ and $dx = \sigma\, ds$. Therefore,
$$M_X(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{t(\sigma s + \mu)}\, e^{-\frac{s^2}{2}}\, ds$$
$$= e^{t\mu}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{\sigma t s}\, e^{-\frac{s^2}{2}}\, ds$$
$$= e^{t\mu}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{-\frac{1}{2}\left(s^2 - 2\sigma t s\right)}\, ds$$
$$= e^{t\mu}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{-\frac{1}{2}\left[(s - \sigma t)^2 - \sigma^2 t^2\right]}\, ds$$
$$= e^{t\mu + \frac{\sigma^2 t^2}{2}}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{-\frac{1}{2}(s - \sigma t)^2}\, ds = e^{\mu t + \frac{\sigma^2 t^2}{2}},$$
since the last integral equals $\sqrt{2\pi}$.
(Here we use $\int_{-\infty}^{\infty} e^{-x^2}\, dx = \Gamma\!\left(\tfrac{1}{2}\right) = \sqrt{\pi}$ and $\int_{-\infty}^{\infty} e^{-v^2/2}\, dv = \sqrt{2}\,\Gamma\!\left(\tfrac{1}{2}\right) = \sqrt{2\pi}$.)
To obtain the moments of the normal, we differentiate once to obtain
$$M'(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}\left(\mu + t\sigma^2\right), \quad\text{so } E(X) = M'(0) = \mu,$$
and a second time to get
$$M''(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}\left[\left(\mu + t\sigma^2\right)^2 + \sigma^2\right], \quad\text{so } E(X^2) = M''(0) = \mu^2 + \sigma^2.$$
So $Var(X) = E(X^2) - [E(X)]^2 = \sigma^2$.
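The differentiated form $M'(t) = e^{\mu t + \frac{1}{2}\sigma^2 t^2}(\mu + t\sigma^2)$ can be checked against a central-difference derivative at an arbitrary point ($\mu = 1.2$, $\sigma = 0.7$, $t_0 = 0.5$ below are illustrative values only):

```python
import math

mu, sigma = 1.2, 0.7   # arbitrary parameters

def mgf(t):
    return math.exp(mu * t + 0.5 * sigma**2 * t**2)

t0, h = 0.5, 1e-5
numeric = (mgf(t0 + h) - mgf(t0 - h)) / (2 * h)   # numerical M'(t0)
formula = mgf(t0) * (mu + t0 * sigma**2)          # stated closed form
print(abs(numeric - formula) < 1e-5)  # True
```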
1. Binomial Distributions: $M_X(t) = (pe^t + q)^n$, for all $t$
2. Poisson Distributions: $M_X(t) = e^{\lambda(e^t - 1)}$, for all $t$
3. Normal Distributions: $M_X(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}$, for all $t$
4. Exponential Distributions: $M_X(t) = \dfrac{\lambda}{\lambda - t}$, $t < \lambda$
5. Gamma Distributions: $M_X(t) = \left(\dfrac{\alpha}{\alpha - t}\right)^r$, $t < \alpha$
6. Chi square Distributions: $M_X(t) = (1 - 2t)^{-\frac{n}{2}}$, $t < \frac{1}{2}$
7. Uniform Distributions: $M_X(t) = \dfrac{1}{t(b-a)}\left(e^{bt} - e^{at}\right)$
Properties of MGF
1. Suppose that the random variable $X$ has mgf $M_X$. Let $Y = \alpha X + \beta$. Then $M_Y$, the mgf of the random variable $Y$, is given by
$$M_Y(t) = e^{\beta t} M_X(\alpha t)$$
2. Let $X$ and $Y$ be two random variables with mgf's $M_X(t)$ and $M_Y(t)$, respectively. If $M_X(t) = M_Y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution.
3. Suppose that $X$ and $Y$ are independent random variables. Let $Z = X + Y$. Let $M_X(t)$, $M_Y(t)$ and $M_Z(t)$ be the mgf's of the random variables $X$, $Y$ and $Z$, respectively. Then
$$M_Z(t) = M_X(t)\, M_Y(t)$$
Example: Let $X \sim N(\mu, \sigma^2)$ and $Y = \alpha X + \beta$. By Property 1,
$$M_Y(t) = e^{\beta t} M_X(\alpha t) = e^{\beta t}\, e^{\mu\alpha t + \frac{\sigma^2\alpha^2 t^2}{2}} = e^{(\alpha\mu + \beta)t + \frac{\alpha^2\sigma^2 t^2}{2}}$$
But this is the mgf of a normally distributed random variable with expectation $\alpha\mu + \beta$ and variance $\alpha^2\sigma^2$. Thus the distribution of $Y$ is normal.
Reproductive Properties of mgf
1. Suppose that $X$ and $Y$ are independent random variables with distributions $N(\mu_1, \sigma_1^2)$ and $N(\mu_2, \sigma_2^2)$, respectively. Let $Z = X + Y$. Hence
$$M_Z(t) = M_X(t)\, M_Y(t) = e^{\mu_1 t + \frac{\sigma_1^2 t^2}{2}}\, e^{\mu_2 t + \frac{\sigma_2^2 t^2}{2}} = e^{(\mu_1 + \mu_2)t + \frac{(\sigma_1^2 + \sigma_2^2)t^2}{2}}$$
That is, $Z \sim N(\mu_1 + \mu_2,\ \sigma_1^2 + \sigma_2^2)$.
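A small simulation illustrates this reproductive property (the parameters of the two normals below are arbitrary; `random.gauss` takes the mean and the standard deviation):

```python
import random

random.seed(1)
mu1, s1 = 1.0, 2.0     # X ~ N(1, 4)
mu2, s2 = -0.5, 1.5    # Y ~ N(-0.5, 2.25)
n = 200_000
zs = [random.gauss(mu1, s1) + random.gauss(mu2, s2) for _ in range(n)]
mean = sum(zs) / n
var = sum((z - mean) ** 2 for z in zs) / n
# close to mu1 + mu2 = 0.5 and s1^2 + s2^2 = 6.25
print(round(mean, 1), round(var, 1))
```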
Theorem: (The reproductive property of the normal distribution)
Let $X_1, X_2, \cdots, X_n$ be $n$ independent random variables with distribution $N(\mu_i, \sigma_i^2)$, $i = 1, 2, \cdots, n$. Let $Z = X_1 + \cdots + X_n$. Then $Z$ has distribution $N(\mu, \sigma^2)$, where $\mu = \mu_1 + \cdots + \mu_n$ and $\sigma^2 = \sigma_1^2 + \cdots + \sigma_n^2$.
Theorem:
Suppose that the distribution of $X_i$ is $\chi^2_{n_i}$, $i = 1, 2, \cdots, k$, where the $X_i$'s are independent random variables. Let $Z = X_1 + \cdots + X_k$. Then $Z$ has distribution $\chi^2_n$, where $n = n_1 + \cdots + n_k$.
Proof: $M_{X_i}(t) = (1 - 2t)^{-\frac{n_i}{2}}$, $i = 1, 2, \cdots, k$. Hence
$$M_Z(t) = M_{X_1}(t) \cdots M_{X_k}(t) = (1 - 2t)^{-\frac{n_1 + \cdots + n_k}{2}} = (1 - 2t)^{-\frac{n}{2}},$$
which is the mgf of the $\chi^2_n$ distribution.
1. Let $X$ be uniformly distributed over $(-a, a)$. Show that $E(X^{2n}) = \dfrac{a^{2n}}{2n+1}$.
Solution:
$$M_X(t) = \frac{e^{at} - e^{-at}}{2at} = \sum_{m=0}^{\infty} \frac{a^{2m} t^{2m}}{(2m+1)!}, \text{ by expanding.}$$
$$E(X^{2n}) = \text{coefficient of } \frac{t^{2n}}{(2n)!} = \frac{a^{2n}}{2n+1}$$
2. If $X$ is normally distributed with mean $\mu$ and variance $\sigma^2$, then show that $E(X - \mu)^{2n} = 1 \cdot 3 \cdot 5 \cdots (2n-1)\, \sigma^{2n}$.
Solution: Given $X \sim N(\mu, \sigma^2)$, $M_X(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}$.
Let $Y = X - \mu$. Then $E(Y^{2n})$ is the coefficient of $\frac{t^{2n}}{(2n)!}$ in $M_Y(t)$, where
$$M_Y(t) = E\!\left(e^{t(X-\mu)}\right) = e^{-\mu t} E(e^{tX}) = e^{-\mu t} M_X(t) = e^{-\mu t}\, e^{\mu t + \frac{\sigma^2 t^2}{2}} = e^{\frac{\sigma^2 t^2}{2}}$$
$$= 1 + \left(\frac{\sigma^2 t^2}{2}\right) + \frac{1}{2!}\left(\frac{\sigma^2 t^2}{2}\right)^2 + \frac{1}{3!}\left(\frac{\sigma^2 t^2}{2}\right)^3 + \cdots$$
$$E(Y^{2n}) = \text{coefficient of } \frac{t^{2n}}{(2n)!} \text{ in } M_Y(t) = \frac{\sigma^{2n}}{2^n\, n!}\,(2n)! = \frac{(2n)!}{2^n\, n!}\,\sigma^{2n}$$
$$= \frac{(2n)(2n-1)(2n-2)\cdots(3)(2)(1)}{(2n)(2n-2)(2n-4)\cdots(6)(4)(2)}\,\sigma^{2n} = 1 \cdot 3 \cdot 5 \cdots (2n-1)\,\sigma^{2n}.$$
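The final combinatorial step, $\frac{(2n)!}{2^n n!} = 1 \cdot 3 \cdot 5 \cdots (2n-1)$, is easy to verify exactly for small $n$:

```python
import math

for n in range(1, 8):
    lhs = math.factorial(2 * n) // (2**n * math.factorial(n))
    # double factorial 1 * 3 * 5 * ... * (2n - 1)
    rhs = 1
    for k in range(1, 2 * n, 2):
        rhs *= k
    assert lhs == rhs
print("identity holds for n = 1..7")
```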
3. Let $X_1 \sim \chi^2(3)$ and $X_2 \sim \chi^2(5)$ be independent random variables, and let $Z = X_1 + X_2$. Find the pdf of $Z$.
Solution:
$X_1 \sim \chi^2(3)$ gives $M_{X_1}(t) = (1 - 2t)^{-\frac{3}{2}}$
$X_2 \sim \chi^2(5)$ gives $M_{X_2}(t) = (1 - 2t)^{-\frac{5}{2}}$
$$M_Z(t) = M_{X_1}(t)\, M_{X_2}(t) = (1 - 2t)^{-\frac{8}{2}}, \text{ so } Z \sim \chi^2(8)$$
With $n = 8$, the pdf is
$$f(z) = \begin{cases} \dfrac{z^{\frac{n}{2}-1} e^{-\frac{z}{2}}}{2^{\frac{n}{2}}\,\Gamma(n/2)} = \dfrac{z^3 e^{-\frac{z}{2}}}{2^4\,\Gamma(4)}, & z > 0 \\ 0, & \text{otherwise} \end{cases}$$
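The $\chi^2(8)$ density $z^3 e^{-z/2}/96$ (since $2^4\,\Gamma(4) = 16 \cdot 6 = 96$) can be validated numerically: it should integrate to 1 and have mean 8. A midpoint rule on $[0, 80]$, where the truncated tail is negligible, suffices:

```python
import math

def pdf(z):
    # chi-square pdf with n = 8: z^3 e^{-z/2} / (2^4 * Gamma(4)), Gamma(4) = 6
    return z**3 * math.exp(-z / 2) / 96.0

n, upper = 200_000, 80.0
dz = upper / n
zs = [(i + 0.5) * dz for i in range(n)]
total = sum(pdf(z) for z in zs) * dz        # should be ~1
mean = sum(z * pdf(z) for z in zs) * dz     # should be ~8
print(round(total, 4), round(mean, 4))
```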
4. Let $X_1, X_2$ and $X_3$ be three independent random variables having normal distributions with parameters $(4, 1)$, $(5, 2)$ and $(7, 3)$ (mean, variance), respectively. Let $Y = 2X_1 + 2X_2 + X_3$. Find the pdf of $V = \left(\dfrac{Y - \mu}{\sigma}\right)^2$, where $\mu$ and $\sigma$ are the mean and standard deviation of $Y$.
Solution:
For a normal random variable, $M_X(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}$. Hence
$$Y \sim N(2 \cdot 4 + 2 \cdot 5 + 1 \cdot 7,\ 2^2 \cdot 1 + 2^2 \cdot 2 + 1^2 \cdot 3)$$
$$Y \sim N(25, 15)$$
$$M_Y(t) = e^{25t + \frac{15t^2}{2}}$$
We have, if $X \sim N(\mu, \sigma^2)$, then $Z = \dfrac{X - \mu}{\sigma} \sim N(0, 1)$ and $Z^2 \sim \chi^2(1)$.
$Y$ is normally distributed, so $\dfrac{Y - 25}{\sqrt{15}}$ is standard normal, and its square has a chi-square distribution with $n = 1$ degree of freedom.
Therefore, $V = \left(\dfrac{Y - 25}{\sqrt{15}}\right)^2 = Z^2 \sim \chi^2(1)$, with pdf
$$f(v) = \begin{cases} \dfrac{e^{-\frac{v}{2}}\, v^{-\frac{1}{2}}}{2^{\frac{1}{2}}\,\Gamma\!\left(\frac{1}{2}\right)} = \dfrac{e^{-\frac{v}{2}}\, v^{-\frac{1}{2}}}{\sqrt{2\pi}}, & v > 0 \\ 0, & \text{otherwise} \end{cases}$$
5. If a random variable $X$ has mgf $M_X(t) = \dfrac{3}{3 - t}$, then find the standard deviation of the random variable $X$.
Solution:
$$M_X(t) = \frac{3}{3 - t} = \frac{1}{1 - \frac{t}{3}} = \left(1 - \frac{t}{3}\right)^{-1} = 1 + \frac{t}{3} + \left(\frac{t}{3}\right)^2 + \cdots$$
$$E(X) = \text{coefficient of } t \text{ in } M_X(t) = \frac{1}{3}$$
$$E(X^2) = \text{coefficient of } \frac{t^2}{2!} \text{ in } M_X(t) = 2! \cdot \frac{1}{9} = \frac{2}{9}$$
$$V(X) = E(X^2) - [E(X)]^2 = \frac{2}{9} - \frac{1}{9} = \frac{1}{9}. \text{ Hence } \sigma = \frac{1}{3}.$$
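$M_X(t) = \frac{3}{3-t}$ has the form $\frac{\lambda}{\lambda - t}$ with $\lambda = 3$, i.e. the exponential mgf from the summary list, so the answer can be double-checked by simulation with `random.expovariate`:

```python
import math
import random

random.seed(2)
lam = 3.0
n = 200_000
xs = [random.expovariate(lam) for _ in range(n)]
mean = sum(xs) / n
sd = math.sqrt(sum((x - mean) ** 2 for x in xs) / n)
print(round(mean, 2), round(sd, 2))  # both close to 1/3
```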
6. Let $X$ be a random variable having probability mass function $P(X = k) = p(1-p)^{k-1}$, $k = 1, 2, \cdots$ (the geometric distribution). Find the mgf of $X$ and hence find $E(X)$ and $V(X)$.
Solution:
$$M_X(t) = \sum_{x=1}^{\infty} e^{tx}\, p(1-p)^{x-1} = \frac{p}{1-p} \sum_{x=1}^{\infty} \left(e^t(1-p)\right)^x, \text{ expanding,}$$
$$= \frac{p}{1-p}\left\{\left(e^t(1-p)\right) + \left(e^t(1-p)\right)^2 + \cdots\right\}$$
$$= \frac{p}{1-p}\left(e^t(1-p)\right)\left\{1 + \left(e^t(1-p)\right) + \left(e^t(1-p)\right)^2 + \cdots\right\}$$
$$M_X(t) = \frac{p}{1-p} \cdot \left(e^t(1-p)\right) \cdot \frac{1}{1 - e^t(1-p)} = \frac{pe^t}{1 - e^t(1-p)}, \quad e^t(1-p) < 1$$
$$M_X'(t) = \frac{pe^t}{\left[1 - e^t(1-p)\right]^2},$$
so at $t = 0$, $E(X) = \dfrac{p}{p^2} = \dfrac{1}{p}$, and
$$E(X^2) = M_X''(0) = \frac{1}{p} + \frac{2(1-p)}{p^2}$$
$$V(X) = E(X^2) - [E(X)]^2 = \frac{1-p}{p^2}$$
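The less obvious line, $E(X^2) = \frac{1}{p} + \frac{2(1-p)}{p^2}$, can be confirmed directly from the pmf with a truncated sum ($p = 0.4$ below is an arbitrary illustration):

```python
p = 0.4   # arbitrary 0 < p < 1
# truncated sums over the geometric pmf p(1-p)^(k-1); the tail is negligible
m1 = sum(k * p * (1 - p) ** (k - 1) for k in range(1, 500))
m2 = sum(k * k * p * (1 - p) ** (k - 1) for k in range(1, 500))

print(abs(m1 - 1 / p) < 1e-9)                          # E(X) = 1/p
print(abs(m2 - (1 / p + 2 * (1 - p) / p**2)) < 1e-9)   # E(X^2)
print(abs((m2 - m1**2) - (1 - p) / p**2) < 1e-9)       # V(X) = (1-p)/p^2
```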
7. If $X$ has pdf $f(x) = \lambda e^{-\lambda(x-a)}$, $x \geq a$, find the mgf of $X$ and hence find $V(X)$.
Solution:
$$M_X(t) = \int_a^{\infty} e^{tx}\, \lambda e^{-\lambda(x-a)}\, dx = \lambda e^{\lambda a} \int_a^{\infty} e^{-(\lambda - t)x}\, dx$$
$$= \lambda e^{\lambda a} \left[\frac{e^{-(\lambda - t)x}}{-(\lambda - t)}\right]_a^{\infty} = \lambda e^{\lambda a}\, \frac{e^{-(\lambda - t)a}}{\lambda - t} = \frac{\lambda e^{at}}{\lambda - t}, \quad t < \lambda$$
$$E(X) = a + \frac{1}{\lambda} \quad \text{and} \quad V(X) = \frac{1}{\lambda^2}$$
8. If the mgf of a discrete random variable $X$ is $e^{4(e^t - 1)}$, then find $P(X = \mu + \sigma)$, where $\mu$ and $\sigma$ are the mean and S.D. of $X$.
Solution: $M_X(t) = e^{4(e^t - 1)}$, which is the mgf of a Poisson distribution with parameter $\lambda = 4$ (by the uniqueness property of the mgf).
For $\lambda = 4$,
$$P(X = k) = \frac{e^{-4}\, 4^k}{k!}$$
Here $\mu = \lambda = 4$ and $\sigma = \sqrt{\lambda} = 2$, so $\mu + \sigma = 6$ and
$$P(X = \mu + \sigma) = P(X = 6) = \frac{e^{-4}\, 4^6}{6!} = 0.1042.$$
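The final number is a one-liner to reproduce:

```python
import math

lam = 4
mu, sigma = lam, math.sqrt(lam)   # Poisson mean and s.d.
k = int(mu + sigma)               # = 6
p = math.exp(-lam) * lam**k / math.factorial(k)
print(round(p, 4))  # 0.1042
```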