
Moment generating functions (mgf)

References:
1. Meyer, P. L. Introductory Probability and Statistical Applications. Oxford and IBH Publishing; 1965.
2. Johnson, Richard A., Irwin Miller, and John E. Freund. Probability and Statistics for Engineers. 2000.

Definition:

The moment generating function of a r.v. $X$ is defined by

$M_X(t) = E(e^{tX}) = \begin{cases} \sum_i e^{t x_i}\, p_X(x_i) & \text{(discrete case)} \\ \int_{-\infty}^{\infty} e^{tx} f_X(x)\, dx & \text{(continuous case)} \end{cases}$ ----------(1)

where $t$ is a real variable.


Note:

𝑀𝑋 (𝑡) may not exist for all r.v.’s 𝑋. In general, 𝑀𝑋 (𝑡) will exist only for those values of 𝑡
for which the sum or integral of Eq (1) converges absolutely.

Suppose that $M_X(t)$ exists. If we expand $e^{tX}$ formally and take expectations, then

$M_X(t) = E(e^{tX}) = E\left[1 + tX + \frac{(tX)^2}{2!} + \cdots + \frac{(tX)^k}{k!} + \cdots\right]$

$= 1 + t\,E(X) + \frac{t^2}{2!}E(X^2) + \cdots + \frac{t^k}{k!}E(X^k) + \cdots$ ----------(2)

and the $k$th moment of $X$ is given by

$m_k = E(X^k) = M_X^{(k)}(0), \quad k = 1, 2, \cdots$ ----------(3)

where $M_X^{(k)}(0) = \left.\dfrac{d^k}{dt^k} M_X(t)\right|_{t=0}$ ----------(4)

Note:

$E(X)$ is the coefficient of $t$ in $M_X(t)$.

$E(X^2)$ is the coefficient of $\frac{t^2}{2!}$ in $M_X(t)$.

$E(X^n)$ is the coefficient of $\frac{t^n}{n!}$ in $M_X(t)$; equivalently, $E(X^n) = M_X^{(n)}(0)$.
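The relation $E(X^k) = M_X^{(k)}(0)$ is easy to check numerically. Below is a small sketch (not part of the notes; the Bernoulli example and parameter value are arbitrary choices) that approximates the first two derivatives of the Bernoulli($p$) mgf $M(t) = 1 - p + pe^t$ by finite differences; for a Bernoulli variable, $E(X^k) = p$ for every $k \ge 1$.

```python
import math

# Numerically check E(X^k) = M^(k)(0) for a Bernoulli(p) variable,
# whose mgf is M(t) = 1 - p + p*e^t and whose moments are all p.
p = 0.3
def M(t):
    return 1 - p + p * math.exp(t)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # second difference  ~ M''(0)

assert abs(m1 - p) < 1e-6   # E(X)   = p
assert abs(m2 - p) < 1e-4   # E(X^2) = p
print("ok")
```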


Exercise

1. Suppose that $X$ has pdf $f(x) = \frac{e^{-|x|}}{2}$, $-\infty < x < \infty$. Find $E(X)$ and $V(X)$ using the mgf.

Solution:

$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \int_{-\infty}^{\infty} e^{tx}\, \frac{e^{-|x|}}{2}\, dx$

$= \int_{-\infty}^{0} e^{tx}\, \frac{e^{x}}{2}\, dx + \int_{0}^{\infty} e^{tx}\, \frac{e^{-x}}{2}\, dx$

$= \int_{-\infty}^{0} \frac{e^{(1+t)x}}{2}\, dx + \int_{0}^{\infty} \frac{e^{-(1-t)x}}{2}\, dx$

$M_X(t) = \frac{1}{1-t^2}, \quad -1 < t < 1$

$M_X'(0) = 0$ and $M_X''(0) = 2$, so

$E(X) = 0$ and $V(X) = 2$.


Or: $M_X(t) = \frac{1}{1-t^2} = 1 + t^2 + t^4 + t^6 + \cdots = 1 + 2\,\frac{t^2}{2!} + t^4 + t^6 + \cdots$

The coefficient of $t$ is $E(X) = 0$; the coefficient of $\frac{t^2}{2!}$ is $E(X^2) = 2$, so $V(X) = 2 - 0 = 2$.
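As a cross-check (not part of the original solution; the grid size and truncation limits below are arbitrary choices), the closed form $M_X(t) = 1/(1-t^2)$ can be compared against a direct numerical integration of $e^{tx} f(x)$:

```python
import math

def mgf_numeric(t, a=-40.0, b=40.0, n=200_000):
    # trapezoidal rule for the integral of e^{tx} * e^{-|x|}/2 over [a, b];
    # the truncated tails are negligible for |t| < 1
    h = (b - a) / n
    f = lambda x: math.exp(t * x) * math.exp(-abs(x)) / 2
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

for t in (0.0, 0.3, -0.5):
    assert abs(mgf_numeric(t) - 1 / (1 - t * t)) < 1e-5
print("ok")
```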

2. Let $X$ be a random variable taking the values $0, 1, 2, \cdots$ with $f(x) = ab^x$, where $a, b > 0$ and $a + b = 1$. Find the mgf of $X$. If $E(X) = m_1$ and $E(X^2) = m_2$, show that $m_2 = m_1(2m_1 + 1)$.

Solution:

$M_X(t) = \sum_{x=0}^{\infty} a b^x e^{tx} = a \sum_{x=0}^{\infty} (b e^t)^x = \frac{a}{1 - b e^t}$, valid for $b e^t < 1$.

$E(X) = M_X'(0) = \frac{ab}{(1-b)^2}$

$E(X^2) = M_X''(0) = \frac{(1+b)ab}{(1-b)^3}$

Given $E(X) = m_1$ and $E(X^2) = m_2$; to prove: $m_2 = m_1(2m_1 + 1)$.

Consider

$m_1(2m_1 + 1) = \frac{ab}{(1-b)^2}\left(\frac{2ab}{(1-b)^2} + 1\right) = \frac{ab}{(1-b)^4}\left(2ab + 1 + b^2 - 2b\right)$

$= \frac{ab}{(1-b)^4}\left(2b(a-1) + 1 + b^2\right)$

$= \frac{ab}{(1-b)^4}\left(2b(-b) + 1 + b^2\right) = \frac{ab(1-b^2)}{(1-b)^4} = \frac{(1+b)ab}{(1-b)^3} = m_2$,

using $a - 1 = -b$.
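A quick numerical confirmation of the identity (illustrative only; the value of $b$ is an arbitrary choice), also cross-checking the closed-form moments against truncated series:

```python
# Check m2 = m1*(2*m1 + 1) for f(x) = a*b^x with a + b = 1.
b = 0.4
a = 1 - b
m1 = a * b / (1 - b) ** 2              # E(X)
m2 = (1 + b) * a * b / (1 - b) ** 3    # E(X^2)
assert abs(m2 - m1 * (2 * m1 + 1)) < 1e-12

# cross-check against truncated series for the moments (b^200 is negligible)
s1 = sum(x * a * b**x for x in range(200))
s2 = sum(x * x * a * b**x for x in range(200))
assert abs(s1 - m1) < 1e-9 and abs(s2 - m2) < 1e-9
print("ok")
```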

Moment generating function for binomial distribution

Let $X$ have the binomial distribution with probability distribution

$b(x \mid n, p) = \binom{n}{x} p^x (1-p)^{n-x}$ for $x = 0, 1, \cdots, n$

Show that

$M(t) = (1 - p + p e^t)^n$ for all $t$

$E(X) = np$ and $Var(X) = np(1-p)$

Solution:

By definition of moment generating function


$M(t) = \sum_{x=0}^{n} e^{tx} \binom{n}{x} p^x (1-p)^{n-x} = \sum_{x=0}^{n} \binom{n}{x} (p e^t)^x (1-p)^{n-x} = (p e^t + 1 - p)^n$ for all $t$,

where we have used the binomial formula

$(a + b)^n = \sum_{x=0}^{n} \binom{n}{x} a^x b^{n-x}$

Differentiating 𝑀(𝑡), we find

$M'(t) = n p e^t (p e^t + 1 - p)^{n-1}$

$M''(t) = n(n-1) p^2 e^{2t} (p e^t + 1 - p)^{n-2} + n p e^t (p e^t + 1 - p)^{n-1}$

Evaluating these derivatives at $t = 0$, we obtain the moments

$E(X) = np$

$E(X^2) = n(n-1)p^2 + np$

Also, the variance is

$Var(X) = E(X^2) - [E(X)]^2 = np(1-p)$
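The closed form can be checked against the defining sum for a small example (a sketch, not from the notes; $n$, $p$, $t$ are arbitrary values):

```python
import math

# Check M(t) = (1 - p + p*e^t)^n against the defining sum, and the
# mean formula E(X) = n*p, for one small binomial example.
n, p, t = 10, 0.3, 0.7
lhs = sum(math.exp(t * x) * math.comb(n, x) * p**x * (1 - p)**(n - x)
          for x in range(n + 1))
rhs = (1 - p + p * math.exp(t)) ** n
assert abs(lhs - rhs) < 1e-9 * rhs

mean = sum(x * math.comb(n, x) * p**x * (1 - p)**(n - x) for x in range(n + 1))
assert abs(mean - n * p) < 1e-12
print("ok")
```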

Moment generating function for Poisson distribution

Let $X$ have the Poisson distribution with probability distribution

$f(x) = e^{-\lambda}\, \frac{\lambda^x}{x!}$ for $x = 0, 1, 2, \cdots$

Show that

$M(t) = e^{\lambda(e^t - 1)}$ for all $t$

$E(X) = \lambda$ and $Var(X) = \lambda$

The mean and variance of the Poisson distribution are equal.

Solution:

By definition of the moment generating function,

$M(t) = \sum_{x=0}^{\infty} e^{tx}\, \frac{\lambda^x}{x!}\, e^{-\lambda} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda}\, e^{\lambda e^t} = e^{\lambda(e^t - 1)}$, for $-\infty < t < \infty$

Differentiating 𝑀(𝑡), we find


$M'(t) = \lambda e^t\, e^{\lambda(e^t - 1)}$

$M''(t) = \lambda e^t\, e^{\lambda(e^t - 1)} + \lambda^2 e^{2t}\, e^{\lambda(e^t - 1)}$

Evaluating these derivatives at $t = 0$, we obtain the moments

$E(X) = \lambda$

$E(X^2) = \lambda + \lambda^2$

Variance is

$Var(X) = E(X^2) - [E(X)]^2 = \lambda$
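The mean-equals-variance property can be checked directly from the mgf (a finite-difference sketch, not from the notes; $\lambda = 2.5$ is arbitrary):

```python
import math

# Check that M(t) = exp(lam*(e^t - 1)) yields E(X) = Var(X) = lam.
lam = 2.5
M = lambda t: math.exp(lam * (math.exp(t) - 1.0))
h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)            # ~ E(X)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # ~ E(X^2)
assert abs(m1 - lam) < 1e-4
assert abs((m2 - m1 * m1) - lam) < 1e-3
print("ok")
```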

Moment generating function for Gamma Distribution

$f(x) = \begin{cases} \dfrac{\alpha^r x^{r-1} e^{-\alpha x}}{\Gamma(r)}, & x > 0,\ \alpha, r > 0 \\ 0, & \text{elsewhere} \end{cases}$

$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \frac{\alpha^r}{\Gamma(r)} \int_0^{\infty} e^{-(\alpha - t)x}\, x^{r-1}\, dx$

Substitute $x(\alpha - t) = v$, so $dx = \frac{dv}{\alpha - t}$ (valid for $t < \alpha$). Then

$M_X(t) = \frac{\alpha^r}{\Gamma(r)} \int_0^{\infty} e^{-v} \left(\frac{v}{\alpha - t}\right)^{r-1} \frac{dv}{\alpha - t} = \frac{\alpha^r}{\Gamma(r)(\alpha - t)^r} \int_0^{\infty} e^{-v}\, v^{r-1}\, dv = \frac{\alpha^r}{\Gamma(r)(\alpha - t)^r}\, \Gamma(r) = \left(\frac{\alpha}{\alpha - t}\right)^r$

$E(X) = \frac{r}{\alpha}, \quad E(X^2) = \frac{r(r+1)}{\alpha^2}, \quad V(X) = \frac{r}{\alpha^2}$

Moment generating function for Exponential Distribution

Note: Substituting $r = 1$ (with $\alpha = \lambda$) in the gamma distribution gives the exponential distribution.

$f(x) = \begin{cases} \lambda e^{-\lambda x}, & x > 0 \\ 0, & \text{otherwise} \end{cases}$

$M_X(t) = \int_0^{\infty} e^{tx} f(x)\, dx = \lambda \int_0^{\infty} e^{-(\lambda - t)x}\, dx = \lambda \left[\frac{e^{-(\lambda - t)x}}{-(\lambda - t)}\right]_0^{\infty} = \frac{\lambda}{\lambda - t}$

$M_X(t) = \frac{\lambda}{\lambda - t}, \quad t < \lambda$

$E(X) = \frac{1}{\lambda}$

$V(X) = \frac{1}{\lambda^2}$
Moment generating function for chi square Distribution
Special case of the gamma distribution: taking $r = \frac{n}{2}$ and $\alpha = \frac{1}{2}$ in the gamma density gives the $\chi^2$ distribution.

A continuous random variable $X$ is said to have a chi-square distribution (with $n$ degrees of freedom) if its pdf is given by

$f(x) = \begin{cases} \dfrac{x^{\frac{n}{2}-1}\, e^{-\frac{x}{2}}}{\Gamma\left(\frac{n}{2}\right) 2^{\frac{n}{2}}}, & x > 0 \\ 0, & \text{elsewhere} \end{cases}$

$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \frac{1}{\Gamma\left(\frac{n}{2}\right) 2^{\frac{n}{2}}} \int_0^{\infty} e^{-\left(\frac{1}{2} - t\right)x}\, x^{\frac{n}{2}-1}\, dx$

$M_X(t) = (1 - 2t)^{-\frac{n}{2}}, \quad t < \frac{1}{2}$

𝐸(𝑋) = 𝑛 and 𝑉(𝑋) = 2𝑛

Moment generating function for Uniform Distribution


$f(x) = \begin{cases} \dfrac{1}{b-a}, & a \le x \le b \\ 0, & \text{elsewhere} \end{cases}$

$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \frac{1}{b-a} \int_a^b e^{tx}\, dx = \frac{1}{t(b-a)}\left(e^{bt} - e^{at}\right), \quad t \ne 0$

Expand and pick out the suitable coefficients in the expansion to find $E(X)$ and $V(X)$:

Mean $E(X) = \frac{a+b}{2}$

$E(X^2) = \frac{a^2 + ab + b^2}{3}$

Variance $V(X) = E(X^2) - [E(X)]^2 = \frac{(b-a)^2}{12}$
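A finite-difference sketch (not from the notes; $a$ and $b$ are arbitrary values) checking the uniform moments; at $t = 0$ the mgf formula is the limit $M(0) = 1$, which the second difference uses directly:

```python
import math

# Check uniform moments from M(t) = (e^{bt} - e^{at}) / (t(b - a)).
a, b = 2.0, 5.0
M = lambda t: (math.exp(b * t) - math.exp(a * t)) / (t * (b - a))
h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)        # ~ E(X)
m2 = (M(h) - 2.0 + M(-h)) / h**2     # ~ E(X^2), using M(0) = 1
assert abs(m1 - (a + b) / 2) < 1e-4
assert abs((m2 - m1 * m1) - (b - a) ** 2 / 12) < 1e-2
print("ok")
```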
Moment generating function for normal distribution

Show that the normal distribution, whose probability density function is

$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2/2\sigma^2}$, has $M(t) = e^{t\mu + \frac{1}{2}t^2\sigma^2}$,

which exists for all $t$. Also, verify the first two moments.

Solution:

$M_X(t) = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{+\infty} e^{tx}\, e^{-\frac{1}{2}\left[\frac{x-\mu}{\sigma}\right]^2} dx$

Let $\frac{x-\mu}{\sigma} = s$; thus $x = \sigma s + \mu$ and $dx = \sigma\, ds$. Therefore,

$M_X(t) = \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{t(\sigma s + \mu)}\, e^{-\frac{s^2}{2}}\, ds$

$= e^{t\mu}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{\sigma t s}\, e^{-\frac{s^2}{2}}\, ds$

$= e^{t\mu}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{-\frac{1}{2}(s^2 - 2\sigma t s)}\, ds$

$= e^{t\mu}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{-\frac{1}{2}\left[(s - \sigma t)^2 - \sigma^2 t^2\right]}\, ds$

$= e^{t\mu + \frac{\sigma^2 t^2}{2}}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{-\frac{1}{2}(s - \sigma t)^2}\, ds$

Let $s - \sigma t = v$; then $ds = dv$ and we obtain

$M_X(t) = e^{t\mu + \frac{\sigma^2 t^2}{2}}\, \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{+\infty} e^{-\frac{v^2}{2}}\, dv = e^{t\mu + \frac{\sigma^2 t^2}{2}}$

since $\int_{-\infty}^{\infty} e^{-x^2}\, dx = \Gamma\left(\frac{1}{2}\right) = \sqrt{\pi}$ and $\int_{-\infty}^{\infty} e^{-v^2/2}\, dv = \sqrt{2}\, \Gamma\left(\frac{1}{2}\right) = \sqrt{2\pi}$.
To obtain the moments of the normal, we differentiate once to obtain

$M'(t) = e^{t\mu + \frac{1}{2}t^2\sigma^2}\, (\mu + t\sigma^2)$

and a second time to get

$M''(t) = e^{t\mu + \frac{1}{2}t^2\sigma^2} \left[(\mu + t\sigma^2)^2 + \sigma^2\right]$

Setting $t = 0$: $E(X) = M'(0) = \mu$ and $E(X^2) = M''(0) = \sigma^2 + \mu^2$.

So 𝑉𝑎𝑟(𝑋) = 𝜎 2 .
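The closed form can also be confirmed by numerical integration of $e^{tx}$ against the normal density (a sketch, not from the notes; the parameter values, grid, and integration limits are arbitrary choices):

```python
import math

# Check M(t) = exp(mu*t + sigma^2 t^2 / 2) by integrating e^{tx}
# against the N(mu, sigma^2) density with the trapezoidal rule.
mu, sigma, t = 1.0, 2.0, 0.4
a, b, n = mu - 12 * sigma, mu + 12 * sigma, 200_000
h = (b - a) / n
g = lambda x: (math.exp(t * x)
               * math.exp(-0.5 * ((x - mu) / sigma) ** 2)
               / (sigma * math.sqrt(2 * math.pi)))
s = (0.5 * (g(a) + g(b)) + sum(g(a + i * h) for i in range(1, n))) * h
assert abs(s - math.exp(mu * t + 0.5 * sigma**2 * t**2)) < 1e-6
print("ok")
```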

Summarizing the MGFs of the distributions:

1. Binomial: $M_X(t) = (p e^t + q)^n$, where $q = 1 - p$
2. Poisson: $M_X(t) = e^{\lambda(e^t - 1)}$
3. Normal: $M_X(t) = e^{t\mu + \frac{\sigma^2 t^2}{2}}$
4. Exponential: $M_X(t) = \frac{\lambda}{\lambda - t}$, $t < \lambda$
5. Gamma: $M_X(t) = \left(\frac{\alpha}{\alpha - t}\right)^r$, $t < \alpha$
6. Chi-square: $M_X(t) = (1 - 2t)^{-\frac{n}{2}}$, $t < \frac{1}{2}$
7. Uniform: $M_X(t) = \frac{1}{t(b-a)}\left(e^{bt} - e^{at}\right)$, $t \ne 0$

Properties of MGF
1. Suppose that the random variable 𝑋 has mgf 𝑀𝑋 . Let 𝑌 = 𝛼𝑋 + 𝛽. Then 𝑀𝑌 , the mgf
of the random variable 𝑌, is given by

𝑀𝑌 (𝑡) = 𝑒 𝛽𝑡 𝑀𝑋 (𝛼𝑡)

Proof: 𝑀𝑌 (𝑡) = 𝐸(𝑒 𝑌𝑡 ) = 𝐸[𝑒 (𝛼𝑋+𝛽)𝑡 ]

= 𝑒 𝛽𝑡 𝐸(𝑒 𝛼𝑡𝑋 ) = 𝑒 𝛽𝑡 𝑀𝑋 (𝛼𝑡)

2. Let 𝑋 and 𝑌 be two random variables with mgf’s, 𝑀𝑋 (𝑡) and 𝑀𝑌 (𝑡), respectively. If
𝑀𝑋 (𝑡) = 𝑀𝑌 (𝑡) for all values of 𝑡, then 𝑋 and 𝑌 have the same probability distribution.

3. Suppose that $X$ and $Y$ are independent random variables. Let $Z = X + Y$. Let $M_X(t)$, $M_Y(t)$ and $M_Z(t)$ be the mgf's of the random variables $X$, $Y$ and $Z$, respectively.

Then $M_Z(t) = M_X(t) M_Y(t)$.

4. Suppose that $X$ has distribution $N(\mu, \sigma^2)$. Let $Y = \alpha X + \beta$. Then $Y$ is again normally distributed, and the mgf of $Y$ is $M_Y(t) = e^{\beta t} M_X(\alpha t)$.

Proof: The mgf of the normal distribution is $M_X(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}$.

Then the mgf of $Y$ is $M_Y(t) = e^{\beta t} M_X(\alpha t) = e^{\beta t} \left[e^{\alpha \mu t + \frac{\alpha^2 \sigma^2 t^2}{2}}\right] = e^{(\beta + \alpha\mu)t}\, e^{\frac{\alpha^2 \sigma^2 t^2}{2}}$

But this is the mgf of a normally distributed random variable with expectation $\alpha\mu + \beta$ and variance $\alpha^2 \sigma^2$. Thus the distribution of $Y$ is normal.
Reproductive Properties of mgf

1. Suppose that $X$ and $Y$ are independent random variables with distributions $N(\mu_1, \sigma_1^2)$ and $N(\mu_2, \sigma_2^2)$, respectively. Let $Z = X + Y$. Hence

$M_Z(t) = M_X(t) M_Y(t) = e^{\mu_1 t + \frac{\sigma_1^2 t^2}{2}}\, e^{\mu_2 t + \frac{\sigma_2^2 t^2}{2}} = e^{(\mu_1 + \mu_2)t + \frac{(\sigma_1^2 + \sigma_2^2) t^2}{2}}$
Theorem: (The reproductive property of the normal distribution)
Let $X_1, X_2, \cdots, X_n$ be $n$ independent random variables with distributions $N(\mu_i, \sigma_i^2)$, $i = 1, 2, \cdots, n$.

Let $Z = X_1 + \cdots + X_n$. Then $Z$ has distribution $N\left(\sum_{i=1}^n \mu_i, \sum_{i=1}^n \sigma_i^2\right)$.

Note: The Poisson distribution also possesses a reproductive property.


Theorem:
Let 𝑋1 , ⋯ , 𝑋𝑛 be independent random variables. Suppose that 𝑋𝑖 has a Poisson distribution
with parameter 𝛼𝑖 , 𝑖 = 1,2, ⋯ , 𝑛. Let 𝑍 = 𝑋1 + ⋯ + 𝑋𝑛 . Then 𝑍 has a Poisson distribution
with parameter

𝛼 = 𝛼1 + ⋯ + 𝛼𝑛 .
Theorem:
Suppose that the distribution of $X_i$ is $\chi^2_{n_i}$, $i = 1, 2, \cdots, k$, where the $X_i$'s are independent random variables. Let $Z = X_1 + \cdots + X_k$. Then $Z$ has distribution $\chi^2_n$, where $n = n_1 + \cdots + n_k$.

Proof: $M_{X_i}(t) = (1 - 2t)^{-\frac{n_i}{2}}$, $i = 1, 2, \cdots, k$.

Hence $M_Z(t) = M_{X_1}(t) \cdots M_{X_k}(t) = (1 - 2t)^{-(n_1 + \cdots + n_k)/2}$.
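The chi-square reproductive property reduces to the exponent law for mgfs, which a tiny numerical illustration makes concrete (the degrees of freedom and $t$ below are arbitrary choices):

```python
# Product of (1-2t)^{-n_i/2} equals (1-2t)^{-(n_1+...+n_k)/2}.
ns = [3, 5, 2]
t = 0.1                      # any t < 1/2
prod = 1.0
for n in ns:
    prod *= (1 - 2 * t) ** (-n / 2)
assert abs(prod - (1 - 2 * t) ** (-sum(ns) / 2)) < 1e-12
print("ok")
```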


Exercise
1. Find the mgf of a random variable $X$ which is uniformly distributed over the interval $(-a, a)$, and hence find $E(X^{2n})$.

Solution: $X \sim U(-a, a)$. We know $M_X(t) = \frac{1}{t(b-a)}\left(e^{bt} - e^{at}\right)$, so here

$M_X(t) = \frac{e^{at} - e^{-at}}{2at} = \sum_{n=0}^{\infty} \frac{(at)^{2n}}{(2n+1)!}$, by expanding both exponentials.

$E(X^{2n})$ = coefficient of $\frac{t^{2n}}{(2n)!}$ = $\frac{a^{2n}}{2n+1}$

2. If $X$ is normally distributed with mean $\mu$ and variance $\sigma^2$, then show that $E(X - \mu)^{2n} = 1 \cdot 3 \cdot 5 \cdots (2n-1)\, \sigma^{2n}$.

Solution: Given $X \sim N(\mu, \sigma^2)$, $M_X(t) = e^{\mu t + \frac{\sigma^2 t^2}{2}}$.

Let $Y = X - \mu$. Then $E(Y^{2n})$ is the coefficient of $\frac{t^{2n}}{(2n)!}$ in $M_Y(t)$.

$M_Y(t) = E(e^{tY}) = E(e^{t(X - \mu)}) = e^{-\mu t} E(e^{tX}) = e^{-\mu t} M_X(t)$

$M_Y(t) = e^{-\mu t} \cdot e^{\mu t + \frac{\sigma^2 t^2}{2}} = e^{\frac{\sigma^2 t^2}{2}} = 1 + \left(\frac{\sigma^2 t^2}{2}\right) + \frac{1}{2!}\left(\frac{\sigma^2 t^2}{2}\right)^2 + \frac{1}{3!}\left(\frac{\sigma^2 t^2}{2}\right)^3 + \cdots$

$E(Y^{2n})$ = coefficient of $\frac{t^{2n}}{(2n)!}$ in $M_Y(t)$ = $\frac{\sigma^{2n}}{2^n n!}(2n)! = \frac{(2n)!}{2^n n!}\, \sigma^{2n}$

$= \frac{(2n)(2n-1)(2n-2)\cdots(3)(2)(1)}{(2n)(2n-2)(2n-4)\cdots(6)(4)(2)}\, \sigma^{2n}$

$= 1 \cdot 3 \cdot 5 \cdots (2n-1)\, \sigma^{2n}$

3. Let $X_1 \sim \chi^2(3)$, $X_2 \sim \chi^2(5)$ and $Z = X_1 + X_2$, where $X_1$ and $X_2$ are independent random variables. Find $M_Z(t)$, $V(Z)$ and the pdf of $Z$.

Solution:

$X_1 \sim \chi^2(3)$ gives $M_{X_1}(t) = (1 - 2t)^{-\frac{3}{2}}$

$X_2 \sim \chi^2(5)$ gives $M_{X_2}(t) = (1 - 2t)^{-\frac{5}{2}}$

$M_Z(t) = M_{X_1}(t) M_{X_2}(t) = (1 - 2t)^{-\frac{8}{2}}$, so $Z \sim \chi^2(8)$.

Therefore $V(Z) = 2n = 2(8) = 16$, where $n = 8$.

$f(z) = \begin{cases} \dfrac{z^{\frac{n}{2}-1}\, e^{-\frac{z}{2}}}{2^{\frac{n}{2}}\, \Gamma\left(\frac{n}{2}\right)} = \dfrac{z^3\, e^{-\frac{z}{2}}}{2^4\, \Gamma(4)}, & z > 0 \\ 0, & \text{otherwise} \end{cases}$

4. Let $X_1$, $X_2$ and $X_3$ be three independent random variables having normal distributions with parameters $(4, 1)$, $(5, 2)$ and $(7, 3)$, respectively. Let $Y = 2X_1 + 2X_2 + X_3$. Find the pdf of $V = \left(\frac{Y - \mu}{\sigma}\right)^2$, where $\mu$ and $\sigma$ are the mean and standard deviation of $Y$.

Solution:

Each $X_i$ has mgf $M_{X_i}(t) = e^{\mu_i t + \frac{\sigma_i^2 t^2}{2}}$, so

$Y \sim N(2 \cdot 4 + 2 \cdot 5 + 1 \cdot 7,\; 2^2 \cdot 1 + 2^2 \cdot 2 + 1^2 \cdot 3)$

$Y \sim N(25, 15)$

$M_Y(t) = e^{25t + \frac{15 t^2}{2}}$

We have: if $X \sim N(\mu, \sigma^2)$, then $Z = \frac{X - \mu}{\sigma} \sim N(0, 1)$ and $Z^2 \sim \chi^2(1)$.

Since $Y$ is normal, $\frac{Y - 25}{\sqrt{15}}$ is standard normal, and its square has a chi-square distribution with $n = 1$ degree of freedom.

Therefore $V = \left(\frac{Y - 25}{\sqrt{15}}\right)^2 = Z^2 \sim \chi^2(1)$

$f(v) = \begin{cases} \dfrac{e^{-\frac{v}{2}}\, v^{-\frac{1}{2}}}{2^{\frac{1}{2}}\, \Gamma\left(\frac{1}{2}\right)} = \dfrac{e^{-\frac{v}{2}}\, v^{-\frac{1}{2}}}{\sqrt{2\pi}}, & v > 0 \\ 0, & \text{otherwise} \end{cases}$

5. If a random variable $X$ has mgf $M_X(t) = \frac{3}{3 - t}$, then find the standard deviation of the random variable $X$.

Solution:

$M_X(t) = \frac{3}{3 - t} = \frac{3}{3\left(1 - \frac{t}{3}\right)} = \left(1 - \frac{t}{3}\right)^{-1} = 1 + \frac{t}{3} + \left(\frac{t}{3}\right)^2 + \cdots$

$E(X)$ = coefficient of $t$ in $M_X(t)$ = $\frac{1}{3}$

$E(X^2)$ = coefficient of $\frac{t^2}{2!}$ in $M_X(t)$ = $2! \cdot \frac{1}{9} = \frac{2}{9}$

$V(X) = \frac{2}{9} - \frac{1}{9} = \frac{1}{9}$. Hence $\sigma = \frac{1}{3}$.
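A finite-difference sketch (not from the notes) confirming these values directly from the given mgf:

```python
# Check E(X) = 1/3 and V(X) = 1/9 for M(t) = 3/(3 - t).
M = lambda t: 3.0 / (3.0 - t)
h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)            # ~ E(X)
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2    # ~ E(X^2)
assert abs(m1 - 1 / 3) < 1e-6
assert abs((m2 - m1 * m1) - 1 / 9) < 1e-3
print("ok")
```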
6. Let $X$ be a random variable having probability mass function $P(X = k) = p(1-p)^{k-1}$, $k = 1, 2, 3, \cdots$. Find $M_X(t)$ and $V(X)$.

Solution:

$M_X(t) = \sum_{x=1}^{\infty} e^{tx} P(X = x) = \sum_{x=1}^{\infty} e^{tx}\, p(1-p)^{x-1} = \frac{p}{1-p} \sum_{x=1}^{\infty} \left(e^t(1-p)\right)^x$

$= \frac{p}{1-p} \left\{\left(e^t(1-p)\right) + \left(e^t(1-p)\right)^2 + \cdots\right\}$

$= \frac{p}{1-p} \left(e^t(1-p)\right)\left\{1 + \left(e^t(1-p)\right) + \left(e^t(1-p)\right)^2 + \cdots\right\}$

$M_X(t) = \frac{p}{1-p} \cdot \left(e^t(1-p)\right) \cdot \frac{1}{1 - e^t(1-p)} = \frac{p e^t}{1 - e^t(1-p)}$, for $e^t(1-p) < 1$

$M_X'(t) = \frac{p e^t}{\left[1 - e^t(1-p)\right]^2}$

At $t = 0$, $E(X) = \frac{p}{p^2} = \frac{1}{p}$

and

$E(X^2) = \frac{1}{p} + \frac{2(1-p)}{p^2}$

$V(X) = E(X^2) - [E(X)]^2 = \frac{1-p}{p^2}$

7. If $X$ has pdf $f(x) = \lambda e^{-\lambda(x - a)}$, $x \ge a$, find the mgf of $X$ and hence find $V(X)$.

Ans: This is a shifted exponential distribution.

$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\, dx = \int_a^{\infty} e^{tx}\, \lambda e^{-\lambda(x-a)}\, dx = \lambda e^{\lambda a} \int_a^{\infty} e^{-(\lambda - t)x}\, dx$

$= \lambda e^{\lambda a} \left[\frac{e^{-(\lambda - t)x}}{-(\lambda - t)}\right]_a^{\infty} = \lambda e^{\lambda a}\, \frac{e^{-(\lambda - t)a}}{\lambda - t} = \frac{\lambda e^{at}}{\lambda - t}, \quad t < \lambda$

$E(X) = a + \frac{1}{\lambda}$ and $V(X) = \frac{1}{\lambda^2}$
8. If the mgf of a discrete random variable is $e^{4(e^t - 1)}$, then find $P(X = \mu + \sigma)$, where $\mu$ and $\sigma$ are the mean and standard deviation of $X$.

Solution: $M_X(t) = e^{4(e^t - 1)}$, so $X$ is Poisson with parameter $\alpha = 4$.

Therefore $E(X) = V(X) = 4$, which gives $\mu = 4$ and $\sigma = 2$.

$P(X = k) = \frac{e^{-\alpha}\, \alpha^k}{k!}$, $k = 0, 1, 2, \cdots$

For $\alpha = 4$,

$P(X = k) = \frac{e^{-4}\, 4^k}{k!}$

So $P(X = \mu + \sigma) = P(X = 6) = \frac{e^{-4}\, 4^6}{6!} = 0.1042$.
