
Moment Generating Functions

Objectives

 To learn the definition of a moment-generating function.

 To find the moment-generating function of a binomial random variable.

 To learn how to use a moment-generating function to find the mean and variance of a random variable.

 To learn how to use a moment-generating function to identify which probability mass function a random variable X follows.

 To understand the steps involved in each of the proofs in the lesson.

 To be able to apply the methods learned in the lesson to new problems.

Introduction

We discussed the mean and variance of theoretical probability distributions using appropriate summations for discrete random variables and integrals for continuous random variables. However, these calculations can often be simplified by using a mathematical device called the moment generating function.

For a discrete random variable X, the moment generating function M(t) is defined by:

M(t) = E(e^{tX}) = Σ_x e^{tx} f(x)    (1)

Assuming that e^{tX} can be expanded as a series, the above equation becomes:

M(t) = 1 + t E(X) + (t²/2!) E(X²) + (t³/3!) E(X³) + ...

Differentiating w.r.t. t,

M'(t) = E(X) + t E(X²) + (t²/2!) E(X³) + ...    (2)

and putting t = 0,

M'(0) = E(X)    (3)

The right-hand side of this equation is the expected value (or mean) of X.

Differentiating (2) again,

M''(t) = E(X²) + t E(X³) + ..., so that M''(0) = E(X²)

The variance of X is given by:

Variance(X) = E(X²) − [E(X)]² = M''(0) − [M'(0)]²

The power of the moment generating function lies in how easily it allows us to calculate the mean and variance of any distribution, which is best illustrated by an example.
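As a quick illustration, here is a minimal sketch in Python (using the sympy library; the three-point distribution below is made up purely for illustration) of how the derivatives of M(t) at t = 0 recover E(X) and Var(X):

```python
# Minimal sketch, assuming a toy pmf: P(X=1)=1/5, P(X=2)=1/2, P(X=3)=3/10.
import sympy as sp

t = sp.symbols('t')
pmf = {1: sp.Rational(1, 5), 2: sp.Rational(1, 2), 3: sp.Rational(3, 10)}

# M(t) = sum over the support of e^{tx} f(x)
M = sum(sp.exp(t * x) * fx for x, fx in pmf.items())

mean = sp.diff(M, t).subs(t, 0)               # M'(0) = E(X)
second_moment = sp.diff(M, t, 2).subs(t, 0)   # M''(0) = E(X^2)
variance = second_moment - mean**2            # Var(X) = E(X^2) - [E(X)]^2

print(mean, variance)  # 21/10 and 49/100, matching a direct calculation
```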

The expected values E(X), E(X²), E(X³), ..., E(X^r) are called moments. You know that the mean

μ = E(X)

and the variance

σ² = Var(X) = E(X²) − μ²

which are functions of moments, are sometimes difficult to find. Special functions, called moment-generating functions, can sometimes make finding the mean and variance of a random variable simpler. In this unit, you will first learn what a moment-generating function is, and then you will learn how to use moment generating functions (abbreviated "m.g.f."):

 to find moments and functions of moments, such as μ and σ²

 to identify which probability mass function a random variable X follows

Definition. Let X be a discrete random variable with probability mass function f(x) and support S. Then:

M(t) = E(e^{tX}) = Σ_{x∈S} e^{tx} f(x)

is the moment generating function of X as long as the summation is finite for some interval of t around 0. That is, M(t) is the moment generating function ("m.g.f.") of X if there is a positive number h such that the above summation exists and is finite for −h < t < h.

Example

What is the moment generating function of a binomial random variable X?

M(t) = E(e^{tX}) = Σ_{x=0}^{n} e^{tx} C(n, x) p^x (1 − p)^{n−x} = Σ_{x=0}^{n} C(n, x) (pe^t)^x (1 − p)^{n−x},

where C(n, x) = n!/(x!(n − x)!) is the binomial coefficient. Recall the binomial theorem: Σ_{x=0}^{n} C(n, x) a^x b^{n−x} = (a + b)^n. With a = pe^t and b = 1 − p, this gives:

M(t) = [(1 − p) + pe^t]^n, for −∞ < t < ∞.
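As a sanity check on this closed form, the sketch below (sympy again, with n fixed at 4 purely for concreteness) compares the direct summation against [(1 − p) + pe^t]^n:

```python
# Sketch: verify the binomial mgf by direct summation for a small n.
import sympy as sp

t, p = sp.symbols('t p')
n = 4  # small illustrative choice

direct = sum(sp.binomial(n, x) * p**x * (1 - p)**(n - x) * sp.exp(t * x)
             for x in range(n + 1))
closed_form = (1 - p + p * sp.exp(t))**n

print(sp.expand(direct - closed_form))  # 0, so the two forms agree
```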

Once you find the moment generating function of a random variable, you can use it.

Finding Moments

Proposition. If a moment-generating function exists for a random variable X, then:

(1) The mean of X can be found by evaluating the first derivative of the moment-generating function at t = 0. That is:

μ = E(X) = M'(0)

(2) The variance of X can be found by evaluating the first and second derivatives of the moment-generating function at t = 0. That is:

σ² = E(X²) − [E(X)]² = M''(0) − [M'(0)]²

Before we prove the above proposition, recall that E(X), E(X²), ..., E(X^r) are called moments about the origin. It is for this reason, and the above proposition, that the function M(t) is called a moment-generating function. That is, M(t) generates moments! The proposition actually doesn't tell the whole story. In fact, in general the rth moment about the origin can be found by evaluating the rth derivative of the moment-generating function at t = 0. That is:

E(X^r) = M^(r)(0)

Proof. We begin the proof by recalling that the moment-generating function is defined as follows:

M(t) = E(e^{tX}) = Σ_{x∈S} e^{tx} f(x)

And, by definition, M(t) is finite on some interval of t around 0. That tells you two things:

1. Derivatives of all orders exist at t = 0.

2. It is okay to interchange differentiation and summation.

That said, you can now work on the details of the proof:

M'(t) = Σ_{x∈S} x e^{tx} f(x)  and  M''(t) = Σ_{x∈S} x² e^{tx} f(x)

Setting t = 0 (so that e^{tx} = 1) gives:

M'(0) = Σ_{x∈S} x f(x) = E(X)  and  M''(0) = Σ_{x∈S} x² f(x) = E(X²)

Therefore, μ = M'(0) and σ² = M''(0) − [M'(0)]².

Example

Use the moment-generating function for a binomial random variable X:

M(t) = [(1 − p) + pe^t]^n

to find the mean μ and variance σ² of a binomial random variable.

Solution. Keeping in mind that we need to take the first derivative of M(t) with respect to t, we get:

M'(t) = n[(1 − p) + pe^t]^{n−1} · pe^t

And, setting t = 0, we get the binomial mean μ = np:

M'(0) = n[(1 − p) + p]^{n−1} · p = np

To find the variance, we first need to take the second derivative of M(t) with respect to t. Doing so (using the product rule), we get:

M''(t) = n(n − 1)[(1 − p) + pe^t]^{n−2} (pe^t)² + n[(1 − p) + pe^t]^{n−1} pe^t

And, setting t = 0, and using the formula for the variance, we get the binomial variance σ² = np(1 − p):

σ² = M''(0) − [M'(0)]² = n(n − 1)p² + np − (np)² = np − np² = np(1 − p)
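The same calculation can be checked symbolically; a minimal sketch with sympy, treating n and p as positive symbols:

```python
# Sketch: derive the binomial mean and variance from the mgf symbolically.
import sympy as sp

t, n, p = sp.symbols('t n p', positive=True)
M = (1 - p + p * sp.exp(t))**n

mu = sp.diff(M, t).subs(t, 0)                          # n*p
var = sp.factor(sp.diff(M, t, 2).subs(t, 0) - mu**2)   # equals n*p*(1 - p)
print(mu, var)
```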

Not only can a moment-generating function be used to find moments of a random variable, it can
also be used to identify which probability mass function a random variable follows.

Finding the Distribution

Proposition. A moment-generating function uniquely determines the probability distribution of a random variable.

Proof. If the support S is {b₁, b₂, b₃, ...}, then the moment-generating function:

M(t) = E(e^{tX}) = Σ_{x∈S} e^{tx} f(x)

is given by:

M(t) = e^{tb₁} f(b₁) + e^{tb₂} f(b₂) + e^{tb₃} f(b₃) + ...

Therefore, the coefficient of e^{tbᵢ} is the probability f(bᵢ) = P(X = bᵢ).

This implies necessarily that if two random variables have the same moment-generating function, then they must have the same probability distribution.
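To see this coefficient-matching concretely, here is a small sketch with an assumed toy mgf, M(t) = (3/10)e^t + (7/10)e^{2t}, in which the probabilities are read off as the coefficients of e^{bt}:

```python
# Sketch: the coefficient of e^{bt} in the mgf is P(X = b).
import sympy as sp

t = sp.symbols('t')
M = sp.Rational(3, 10) * sp.exp(t) + sp.Rational(7, 10) * sp.exp(2 * t)

print(M.coeff(sp.exp(t)))      # 3/10 = P(X = 1)
print(M.coeff(sp.exp(2 * t)))  # 7/10 = P(X = 2)
```

So a random variable with this mgf can only be the one taking the value 1 with probability 3/10 and the value 2 with probability 7/10.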

Example

If a random variable X has the following moment-generating function:

M(t) = (3/4 + (1/4)e^t)^20, for all t,

then what is the probability mass function of X?

Solution. We previously determined that the moment generating function of a binomial random variable is:

M(t) = [(1 − p) + pe^t]^n, for −∞ < t < ∞.

Comparing the given moment generating function with that of a binomial random variable, you can see that X must be a binomial random variable with n = 20 and p = 1/4. Therefore, the p.m.f. of X is:

f(x) = C(20, x) (1/4)^x (3/4)^{20−x}, for x = 0, 1, ..., 20.

Example

If a random variable X has the following moment-generating function:

for all t, then what is the p.m.f. of X?


Unit Summary

Moment generating functions (mgfs) are functions of t. You can find an mgf by using the definition of the expectation of a function of a random variable. The moment generating function of X is:

M_X(t) = E(e^{tX})

Note that exp(tX) is just another way of writing e^{tX}. You have seen that the moment-generating function uniquely determines the distribution of a random variable. In other words, if the mgf exists, there is one and only one distribution associated with that mgf. This property is sometimes referred to as the uniqueness property of the mgf.

Suppose we are given the mgf of a random variable Y. Using the information from this unit, you can find E(Y^k) for any k, provided the expectation exists. Let's find E(Y) and E(Y²). You can solve these in a couple of ways: you can use the facts that E(Y) = M'_Y(0) and E(Y²) = M''_Y(0). Then you can find the variance by using Var(Y) = E(Y²) − [E(Y)]².
Unit Activities

1. Suppose that Y has the following mgf.

(a) Find E(Y). (b) Find E(Y²). (Ans: E(Y) = 4, E(Y²) = 28.)

2. Find the mgf of the density f(x) = e^{−x}, for x > 0. (Ans: M(t) = 1/(1 − t), for t < 1.)

3. Find E(X³) using the mgf M(t) = (1 − 2t)^{−10}. (Ans: E(X³) = 10,560; see the symbolic check after this list.)

4. Find the moment generating function of X ~ f(x) = 1, where 0 < x < 1, and thereby confirm that E(X) = 1/2 and Var(X) = 1/12.
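As referenced in Activity 3, here is a sketch of a symbolic check with sympy:

```python
# Sketch: check Activity 3 -- the third derivative of M(t) = (1 - 2t)^(-10)
# evaluated at t = 0 should equal E(X^3).
import sympy as sp

t = sp.symbols('t')
M = (1 - 2 * t)**(-10)

print(sp.diff(M, t, 3).subs(t, 0))  # 10560
```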

MGF of Poisson distribution

The definition of the expectation of a function of a random variable tells us how to derive the mgf of a random variable, since the mgf is obtained by taking the expected value of a function applied to the random variable:

M_X(t) = E(e^{tX})

Example 1

Let X ~ Poisson(λ). Then, the pmf of X is given by:

f(x) = λ^x e^{−λ} / x!, for x = 0, 1, 2, …

Before we derive the mgf for X, we recall from calculus the Taylor series expansion of the exponential function:

e^u = Σ_{n=0}^{∞} u^n / n!

Using this fact, we find:

M_X(t) = E(e^{tX}) = Σ_{x=0}^{∞} e^{tx} λ^x e^{−λ} / x! = e^{−λ} Σ_{x=0}^{∞} (λe^t)^x / x! = e^{−λ} e^{λe^t} = e^{λ(e^t − 1)}

Now we take the first and second derivatives of M_X(t). Remember we are differentiating with respect to t:

M'_X(t) = λe^t e^{λ(e^t − 1)}

M''_X(t) = λe^t e^{λ(e^t − 1)} + (λe^t)² e^{λ(e^t − 1)}

Next we evaluate the derivatives at t = 0 to find the first and second moments of X:

E(X) = M'_X(0) = λ

E(X²) = M''_X(0) = λ + λ²

Finally, in order to find the variance, we use the alternate formula:

Var(X) = E(X²) − [E(X)]² = λ + λ² − λ² = λ

Thus, we have shown that both the mean and the variance of the Poisson(λ) distribution are given by the parameter λ.
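A minimal symbolic check of this result (a sketch; λ is treated as a positive symbol):

```python
# Sketch: confirm that the Poisson mgf e^{lambda(e^t - 1)} gives mean and
# variance both equal to lambda.
import sympy as sp

t, lam = sp.symbols('t lamda', positive=True)
M = sp.exp(lam * (sp.exp(t) - 1))

mean = sp.diff(M, t).subs(t, 0)                           # lamda
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)  # lamda
print(mean, var)
```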

Moment generating function for continuous variables

For a continuous random variable X with probability density function f(x), the moment generating function M(t) is defined by:

M(t) = E(e^{tX}) = ∫_{−∞}^{∞} e^{tx} f(x) dx

Example
Find the moment generating function of the uniform distribution f(x) = 1 (0 ≤ x ≤ 1), f(x) = 0 elsewhere. Use it to find the mean and variance of the distribution.

Solution

M(t) = ∫₀¹ e^{tx} dx = (e^t − 1)/t, for t ≠ 0 (with M(0) = 1).

Expanding e^t as a series gives:

M(t) = (1/t)[(1 + t + t²/2! + t³/3! + ...) − 1] = 1 + t/2! + t²/3! + t³/4! + ...

Differentiating w.r.t. t:

M'(t) = 1/2! + 2t/3! + 3t²/4! + ...

Setting t = 0:

Mean = E(X) = M'(0) = 1/2

Differentiating again, M''(t) = 2/3! + 6t/4! + ..., so E(X²) = M''(0) = 2/3! = 1/3.

Therefore,

Variance = E(X²) − [E(X)]² = 1/3 − (1/2)² = 1/3 − 1/4 = 1/12
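The same answers fall out of a short symbolic check (a sketch; note the t → 0 limits, since the formula (e^t − 1)/t is not defined at t = 0 itself):

```python
# Sketch: moments of the uniform(0, 1) distribution from its mgf.
import sympy as sp

t = sp.symbols('t')
M = (sp.exp(t) - 1) / t  # integral of e^{tx} over 0 <= x <= 1

mean = sp.limit(sp.diff(M, t), t, 0)       # 1/2
second = sp.limit(sp.diff(M, t, 2), t, 0)  # 1/3
print(mean, second - mean**2)              # 1/2 and 1/12
```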
Geometric Distributions

Example

A representative from the National Football League's Marketing Division randomly selects
people on a random street in Kansas City, Kansas until he finds a person who attended the last
home football game. Let p, the probability that he succeeds in finding such a person, equal 0.20.
And, let X denote the number of people he selects until he finds his first success. What is the
probability mass function of X?

p = P(success) = 0.2

1 − p = P(failure) = 0.8

Let X = the number of people selected until the first success. Then, by the definition below, the p.m.f. of X is f(x) = (0.8)^{x−1}(0.2), for x = 1, 2, 3, ...

Definition. Assume Bernoulli trials — that is, (1) there are two possible outcomes, (2) the trials
are independent, and (3) p, the probability of success, remains the same from trial to trial.

Let X denote the number of trials until the first success. Then, the probability mass function of X is:

f(x) = (1 − p)^{x−1} p, for x = 1, 2, ...

In this case, we say that X follows a geometric distribution.

We now establish several properties of a geometric random variable: that its p.m.f. is valid, and the formulas for its mean and variance. In order to prove these properties, we need to recall the sum of the geometric series, so we may as well get that out of the way first. The sum of a geometric series is:

Σ_{x=0}^{∞} ar^x = a/(1 − r), for |r| < 1    (1)

Then, taking the derivative of both sides of (1) with respect to r, the first derivative must be:

Σ_{x=1}^{∞} axr^{x−1} = a/(1 − r)²    (2)

And, taking the derivative of both sides again, the second derivative with respect to r must be:

Σ_{x=2}^{∞} ax(x − 1)r^{x−2} = 2a/(1 − r)³    (3)

You will use the sum of the geometric series (1) in proving that the p.m.f. is valid, the first derivative (2) in proving the formula for the mean, and the second derivative (3) in proving the formula for the variance. Let's jump right in now!

Theorem. The probability mass function:

f(x) = (1 − p)^{x−1} p, for 0 < p < 1 and x = 1, 2, ...

of a geometric random variable X is a valid p.m.f.

Proof. For 0 < p < 1, using the sum of the geometric series (1) with a = p and r = 1 − p:

Σ_{x=1}^{∞} (1 − p)^{x−1} p = p / (1 − (1 − p)) = p/p = 1

Theorem. The mean of a geometric random variable X is:

μ = E(X) = 1/p

Theorem. The variance of a geometric random variable X is:

σ² = Var(X) = (1 − p)/p²

Proof. To find the variance, we are going to use that trick of "adding zero" to the shortcut formula for the variance. Recall that the shortcut formula is:

σ² = Var(X) = E(X²) − [E(X)]²

We "add zero" by adding and subtracting E(X) to get:

σ² = E(X²) − E(X) + E(X) − [E(X)]² = E[X(X − 1)] + E(X) − [E(X)]²

Then, here's how the rest of the proof goes. Using the second derivative (3) with a = 1 and r = 1 − p:

E[X(X − 1)] = Σ_{x=1}^{∞} x(x − 1)(1 − p)^{x−1} p = p(1 − p) Σ_{x=2}^{∞} x(x − 1)(1 − p)^{x−2} = p(1 − p) · 2/p³ = 2(1 − p)/p²

Therefore, with E(X) = 1/p:

σ² = 2(1 − p)/p² + 1/p − 1/p² = (2 − 2p + p − 1)/p² = (1 − p)/p²
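As a closing check, the geometric mean and variance can also be recovered from the geometric mgf, M(t) = pe^t / (1 − (1 − p)e^t) (this mgf is not derived in this unit, so treat the sketch as an independent confirmation):

```python
# Sketch: confirm E(X) = 1/p and Var(X) = (1 - p)/p^2 via the geometric mgf.
import sympy as sp

t, p = sp.symbols('t p', positive=True)
M = p * sp.exp(t) / (1 - (1 - p) * sp.exp(t))

mean = sp.simplify(sp.diff(M, t).subs(t, 0))              # 1/p
var = sp.simplify(sp.diff(M, t, 2).subs(t, 0) - mean**2)  # (1 - p)/p**2
print(mean, var)
```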
