4 Moment Generating Functions: 4.1 Definition and Moments
At t = 0 this becomes
\[
\sum_k k^n f_X(k) = E[X^n]
\]
There was a cheat in the proof: we interchanged derivatives and an infinite sum. You can't always do this, and to justify doing it in the above computation we need some assumptions on $f_X(k)$. We will not worry about this issue.
Setting t = 0 we have
So the variance is
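In general, the first two derivatives at $t = 0$ give the mean and the variance: $E[X] = M_X'(0)$ and $\mathrm{var}(X) = M_X''(0) - M_X'(0)^2$. Here is a minimal sympy sketch of that recipe; the binomial mgf (derived later in this section) is our choice of illustration, not part of the original computation.

```python
import sympy as sp

t, p, n = sp.symbols('t p n', positive=True)
M = (p*sp.exp(t) + 1 - p)**n          # binomial(n, p) mgf, used here only as an illustration

m1 = sp.diff(M, t).subs(t, 0)         # E[X]   = M'(0)
m2 = sp.diff(M, t, 2).subs(t, 0)      # E[X^2] = M''(0)
var = sp.simplify(m2 - m1**2)         # var(X) = E[X^2] - E[X]^2

print(m1)    # n*p
print(var)   # n*p*(1 - p), possibly printed in an equivalent form
```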
The sum is over all points (x, y) subject to the constraint that they lie on
the line x + y = z. This is equivalent to summing over all x and setting
y = z − x. Or we can sum over all y and set x = z − y. So
\[
f_Z(z) = \sum_x f_X(x) f_Y(z-x), \qquad f_Z(z) = \sum_y f_X(z-y) f_Y(y)
\]
Note that this formula looks like a discrete convolution. One can use this formula to compute the pmf of a sum of independent RV's. But computing the mgf is much easier.
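For concreteness, here is a small sketch of that convolution computation in code; representing the pmfs as dictionaries and using two Bernoulli RV's are our own choices for the illustration.

```python
from collections import defaultdict

def pmf_of_sum(f_X, f_Y):
    """pmf of Z = X + Y for independent X, Y, with pmfs given as {value: probability} dicts."""
    f_Z = defaultdict(float)
    for x, px in f_X.items():
        for y, py in f_Y.items():
            f_Z[x + y] += px * py      # the pair (x, y) contributes to z = x + y
    return dict(f_Z)

# Illustration: the sum of two independent Bernoulli(p) RV's is binomial(2, p).
p = 0.3
bern = {0: 1 - p, 1: p}
print(pmf_of_sum(bern, bern))          # approximately {0: 0.49, 1: 0.42, 2: 0.09}
```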
Proof.
So
\[
M_X(t) = \left(pe^t + 1 - p\right)^n
\]
which is of course the same result we obtained before.
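For reference, the direct computation (presumably the one obtained before) sums $e^{tk}$ against the binomial pmf and applies the binomial theorem:
\[
M_X(t) = \sum_{k=0}^{n} e^{tk} \binom{n}{k} p^k (1-p)^{n-k}
       = \sum_{k=0}^{n} \binom{n}{k} (pe^t)^k (1-p)^{n-k}
       = \left(pe^t + 1 - p\right)^n .
\]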
Example: Now suppose X and Y are independent, both are binomial with
the same probability of success, p. X has n trials and Y has m trials. We
argued before that Z = X + Y should be binomial with n + m trials. Now
we can see this from the mgf. The mgf of Z is
\[
M_Z(t) = M_X(t) M_Y(t) = \left(pe^t + 1 - p\right)^n \left(pe^t + 1 - p\right)^m = \left(pe^t + 1 - p\right)^{n+m}
\]
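As a quick numerical sanity check (our own, with arbitrary values of $n$, $m$, $p$), the convolution of the two binomial pmfs agrees with the binomial pmf with $n + m$ trials.

```python
import numpy as np
from scipy.stats import binom

# Check numerically that the sum of independent Bin(n, p) and Bin(m, p) is Bin(n + m, p).
n, m, p = 5, 7, 0.3                                  # arbitrary choices for the check

pmf_X = binom.pmf(np.arange(n + 1), n, p)
pmf_Y = binom.pmf(np.arange(m + 1), m, p)
pmf_Z = np.convolve(pmf_X, pmf_Y)                    # pmf of Z = X + Y via the convolution formula

print(np.allclose(pmf_Z, binom.pmf(np.arange(n + m + 1), n + m, p)))   # True
```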
The natural thing to do next is to factor out an $x^{n-1}$ from the series to turn it into a geometric series. We do something different that will save some computation later. Note that the $(n-1)$th derivative will kill any term $x^k$ with $k < n-1$. So we can replace
\[
\sum_{j=0}^{\infty} x^{n+j-1} \quad \text{by} \quad \sum_{j=0}^{\infty} x^{j}
\]
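A quick sympy check of this replacement, for one specific $n$ (we take $n = 3$): since $\sum_{j=0}^{\infty} x^{n+j-1} = x^{n-1}/(1-x)$ and $\sum_{j=0}^{\infty} x^{j} = 1/(1-x)$ for $|x| < 1$, their $(n-1)$th derivatives should agree.

```python
import sympy as sp

# The two series differ only by the polynomial 1 + x + ... + x^(n-2),
# which the (n-1)th derivative kills, so the two derivatives below must agree.
x = sp.symbols('x')
n = 3                                          # specific n chosen for the check

lhs = sp.diff(x**(n - 1) / (1 - x), x, n - 1)  # (n-1)th derivative of sum_j x^(n+j-1)
rhs = sp.diff(1 / (1 - x), x, n - 1)           # (n-1)th derivative of sum_j x^j

print(sp.simplify(lhs - rhs))                  # 0
```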
Note that this is just $E[s^X]$, and this is our mgf $E[e^{tX}]$ with $t = \ln(s)$. Anything you can do with the probability generating function you can do with the mgf, and we will not use the probability generating function.
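For instance, with the binomial mgf computed above, the probability generating function is
\[
E[s^X] = (ps + 1 - p)^n = M_X(\ln s).
\]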
The mgf need not be defined for all $t$. We saw an example of this with the geometric distribution, where it was defined only if $e^t(1-p) < 1$, i.e., $t < -\ln(1-p)$. In fact, it need not be defined for any $t$ other than $0$. As an example of this, consider the RV $X$ that takes on all integer values with $P(X = k) = c(1+k^2)^{-1}$. The constant $c$ is given by
\[
\frac{1}{c} = \sum_{k=-\infty}^{\infty} \frac{1}{1+k^2}
\]
We leave it to the reader to show that, for any $t \neq 0$,
\[
\sum_{k=-\infty}^{\infty} e^{tk}\, \frac{1}{1+k^2} = \infty
\]
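A numerical illustration (not a substitute for the proof left to the reader): for $t = 0$ the partial sums settle down to $1/c \approx 3.15$, while for any $t \neq 0$ they grow without bound as the cutoff increases. The cutoffs and the value $t = 0.1$ below are arbitrary choices.

```python
import numpy as np

def partial_sum(t, K):
    """Partial sum of e^(t k) / (1 + k^2) over k = -K, ..., K."""
    k = np.arange(-K, K + 1, dtype=float)
    return np.sum(np.exp(t * k) / (1 + k**2))

for K in (10, 100, 1000):
    print(K, partial_sum(0.0, K), partial_sum(0.1, K))   # t = 0 stabilizes; t = 0.1 blows up
```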