Note_7

The document discusses the existence and properties of expectations, moments, and moment generating functions (MGF) for discrete and continuous random variables (RVs). It establishes that the expectation exists if and only if the absolute expectation is finite, and introduces various definitions including moments, absolute moments, and central moments. Additionally, it provides propositions and examples illustrating the calculation of moments and variances using MGFs.


Note 1.191. If $X$ is discrete with p.m.f. $f_X$ such that $EX$ exists, then $E|X| = \sum_{x \in S_X} |x| f_X(x) < \infty$. Similarly, if $X$ is continuous with p.d.f. $f_X$ such that $EX$ exists, then $E|X| = \int_{-\infty}^{\infty} |x| f_X(x)\,dx < \infty$. Therefore $EX$ exists if and only if $E|X| < \infty$. In other words, $EX$ is finite if and only if $E|X|$
is finite.

Note 1.192. Fix $a, b \in \mathbb{R}$ with $a \neq 0$. Let $X$ be a discrete/continuous RV with p.m.f./p.d.f. $f_X$ such that $EX$ exists. Then $Y = aX + b$ is also a discrete/continuous RV. If $X$ is discrete, then
\[ \sum_{x \in S_X} |ax + b| f_X(x) \le |a| \sum_{x \in S_X} |x| f_X(x) + |b| \sum_{x \in S_X} f_X(x) = |a|\,E|X| + |b| < \infty \]
and hence $E(aX + b)$ exists and equals
\[ E(aX + b) = \sum_{x \in S_X} (ax + b) f_X(x) = a \sum_{x \in S_X} x f_X(x) + b \sum_{x \in S_X} f_X(x) = a\,EX + b. \]
If $X$ is continuous, a similar argument shows $E(aX + b) = a\,EX + b$.
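The linearity identity above can be checked numerically. The following sketch uses a small hypothetical three-point p.m.f. (not one appearing in these notes) and exact rational arithmetic:

```python
from fractions import Fraction

# Hypothetical three-point p.m.f., chosen only for illustration.
pmf = {-1: Fraction(1, 4), 0: Fraction(1, 4), 2: Fraction(1, 2)}
a, b = Fraction(3), Fraction(-5)

# E(aX + b) computed directly from the definition...
E_aXb = sum((a * x + b) * p for x, p in pmf.items())
# ...and EX, to compare against the identity E(aX + b) = a*EX + b.
EX = sum(x * p for x, p in pmf.items())

assert E_aXb == a * EX + b
print(EX, E_aXb)  # 3/4 -11/4
```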

Using arguments similar to the above observations, we obtain the next result. We skip the
details for brevity.

Proposition 1.193. Let $X$ be a discrete/continuous RV with p.m.f./p.d.f. $f_X$.

(a) Let $h_i : \mathbb{R} \to \mathbb{R}$ be functions and let $a_i \in \mathbb{R}$ for $i = 1, 2, \cdots, n$. Then
\[ E\left( \sum_{i=1}^{n} a_i h_i(X) \right) = \sum_{i=1}^{n} a_i\,E h_i(X), \]
provided all the expectations above exist.

(b) Let $h_1, h_2 : \mathbb{R} \to \mathbb{R}$ be functions such that $h_1(x) \le h_2(x)$, $\forall x \in S_X$, where $S_X$ denotes the support of $X$. Then,
\[ E h_1(X) \le E h_2(X), \]
provided all the expectations above exist.

(c) Take $h_1(x) := -|x|$, $h_2(x) := x$, $h_3(x) := |x|$, $\forall x \in \mathbb{R}$. If $EX$ exists, then
\[ -E|X| \le EX \le E|X|, \]
i.e. $|EX| \le E|X|$.

(d) If $P(a \le X \le b) = 1$ for some $a, b \in \mathbb{R}$, then $EX$ exists and $a \le EX \le b$.

Note 1.194. Given an RV $X$, by choosing different functions $h : \mathbb{R} \to \mathbb{R}$, we obtain several quantities of interest of the form $Eh(X)$.

Definition 1.195 (Moments). The quantity $\mu'_r := E[X^r]$, if it exists, is called the $r$-th moment of RV $X$ for $r > 0$.

Definition 1.196 (Absolute Moments). The quantity $E[|X|^r]$, if it exists, is called the $r$-th absolute moment of RV $X$ for $r > 0$.

Definition 1.197 (Moments about a point). Let $c \in \mathbb{R}$. The quantity $E[(X - c)^r]$, if it exists, is called the $r$-th moment of RV $X$ about $c$ for $r > 0$.

Definition 1.198 (Absolute Moments about a point). Let $c \in \mathbb{R}$. The quantity $E[|X - c|^r]$, if it exists, is called the $r$-th absolute moment of RV $X$ about $c$ for $r > 0$.

Note 1.199. It is clear from the definitions above that the usual moments and absolute moments are moments and absolute moments about the origin, respectively.

Proposition 1.200. Let $X$ be a discrete/continuous RV such that $E|X|^r < \infty$ for some $r > 0$. Then $E|X|^s < \infty$ for all $0 < s < r$.

Proof. Observe that for all $x \in \mathbb{R}$, we have $|x|^s \le \max\{|x|^r, 1\} \le |x|^r + 1$ and hence
\[ E|X|^s \le E|X|^r + 1 < \infty. \]

Remark 1.201. Suppose that the $m$-th moment $EX^m$ of $X$ exists for some positive integer $m$. Then we have $E|X|^m < \infty$ (see Note 1.191). By Proposition 1.200, we have $E|X|^n < \infty$ for all positive integers $n \le m$ and hence the $n$-th moment $EX^n$ exists for $X$. In particular, the existence of the second moment $EX^2$ implies the existence of the first moment $EX$, which is the expectation of $X$.

Definition 1.202 (Central Moments). Let $X$ be an RV such that $\mu'_1 = EX$ exists. The quantity $\mu_r := E[(X - \mu'_1)^r]$, if it exists, is called the $r$-th moment of RV $X$ about the mean or $r$-th central moment of $X$ for $r > 0$.

Definition 1.203 (Variance). The second central moment $\mu_2$ of an RV $X$, if it exists, is called the variance of $X$ and denoted by $Var(X)$. Note that $Var(X) = \mu_2 = E[(X - \mu'_1)^2]$.

Remark 1.204. The following are some simple observations about the variance of an RV $X$.
(a) We have
\[ Var(X) = E\left[(X - \mu'_1)^2\right] = E[X^2 + (\mu'_1)^2 - 2\mu'_1 X] = \mu'_2 - 2(\mu'_1)^2 + (\mu'_1)^2 = \mu'_2 - (\mu'_1)^2 = EX^2 - (EX)^2. \]
(b) Since the RV $(X - \mu'_1)^2$ takes non-negative values, we have $Var(X) = E(X - \mu'_1)^2 \ge 0$.
(c) We have $(EX)^2 \le EX^2$.
(d) $Var(X) = 0$ if and only if $P(X = \mu'_1) = 1$ (see problem set 5).
(e) For any $a, b \in \mathbb{R}$, we have $Var(aX + b) = a^2 Var(X)$.
(f) Let $Var(X) > 0$. Then $Y := \frac{X - EX}{\sqrt{Var(X)}}$ has the property that $EY = 0$ and $Var(Y) = 1$.

Definition 1.205 (Standard Deviation). The quantity $\sigma(X) = \sqrt{Var(X)}$ is defined to be the standard deviation of $X$.
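Observations (a) and (e) of Remark 1.204 can be verified on a toy example. The sketch below uses a hypothetical two-point (Bernoulli-type) p.m.f., chosen only for illustration, and exact rationals:

```python
from fractions import Fraction

# Hypothetical two-point p.m.f. (Bernoulli(1/2)), for illustration only.
pmf = {0: Fraction(1, 2), 1: Fraction(1, 2)}
a, b = Fraction(4), Fraction(7)

def E(g):
    # expectation of g(X) from the definition: E g(X) = sum g(x) f_X(x)
    return sum(g(x) * p for x, p in pmf.items())

var_X = E(lambda x: x**2) - E(lambda x: x) ** 2                # (a)
var_Y = E(lambda x: (a*x + b)**2) - E(lambda x: a*x + b) ** 2  # Var(aX + b)

assert var_Y == a**2 * var_X                                   # (e)
print(var_X, var_Y)  # 1/4 4
```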

Example 1.206. Let $X$ be a discrete RV with p.m.f.
\[ f_X(x) := \begin{cases} \frac{1}{6}, & \forall x \in \{1, 2, 3, 4, 5, 6\} \\ 0, & \text{otherwise.} \end{cases} \]
Here, existence of $\mu'_1 = EX$ and $\mu'_2 = EX^2$ can be established by standard calculations. Moreover,
\[ EX = \sum_{x \in S_X} x f_X(x) = \frac{1}{6}(1 + 2 + 3 + 4 + 5 + 6) = \frac{7}{2} \]
and
\[ EX^2 = \sum_{x \in S_X} x^2 f_X(x) = \frac{1}{6}(1^2 + 2^2 + 3^2 + 4^2 + 5^2 + 6^2) = \frac{91}{6}. \]
The variance can now be computed using the relation $Var(X) = EX^2 - (EX)^2 = \frac{91}{6} - \frac{49}{4} = \frac{35}{12}$.
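A quick computational check of Example 1.206, using exact rational arithmetic:

```python
from fractions import Fraction

# Fair-die p.m.f.: f_X(x) = 1/6 for x in {1,...,6}.
support = range(1, 7)
p = Fraction(1, 6)

EX = sum(x * p for x in support)       # mu'_1
EX2 = sum(x**2 * p for x in support)   # mu'_2
var = EX2 - EX**2                      # Var(X) = EX^2 - (EX)^2

print(EX, EX2, var)  # 7/2 91/6 35/12
```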

Example 1.207. In Example 1.184, we had shown $EX = \frac{1}{2}$, where $X$ is a continuous RV with the p.d.f.
\[ f_X(x) = \begin{cases} 1, & \text{if } 0 < x < 1, \\ 0, & \text{otherwise.} \end{cases} \]
Now, $EX^2 = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx = \int_0^1 x^2\,dx = \frac{1}{3}$. Then $Var(X) = EX^2 - (EX)^2 = \frac{1}{3} - \frac{1}{4} = \frac{1}{12}$.

Note 1.208. We are familiar with the Laplace transform of a given real-valued function defined on $\mathbb{R}$. We also know that under certain conditions, the Laplace transform of a function determines the function almost uniquely. In probability theory, the Laplace transform of a p.m.f./p.d.f. of a random variable $X$ plays an important role.

Let $X$ be a discrete/continuous RV defined on a probability space $(\Omega, \mathcal{F}, P)$ with DF $F_X$, p.m.f./p.d.f. $f_X$ and support $S_X$.

Definition 1.209 (Moment Generating Function (MGF)). We say that the moment generating function (MGF) of $X$ exists, denoted by $M_X$ and equals $M_X(t) := Ee^{tX}$, provided $Ee^{tX}$ exists for all $t \in (-h, h)$, for some $h > 0$.

Note 1.210. Observe that $e^x > 0$, $\forall x \in \mathbb{R}$.

Note 1.211. If $X$ is discrete/continuous with p.m.f./p.d.f. $f_X$, then following the definition of the expectation of an RV, we write
\[ M_X(t) = Ee^{tX} = \begin{cases} \sum_{x \in S_X} e^{tx} f_X(x), & \text{if } \sum_{x \in S_X} e^{tx} f_X(x) < \infty \text{ for discrete } X,\ \forall t \in (-h, h) \text{ for some } h > 0, \\ \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx, & \text{if } \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx < \infty \text{ for continuous } X,\ \forall t \in (-h, h) \text{ for some } h > 0. \end{cases} \]
In this case, we shall say that the MGF $M_X$ exists on $(-h, h)$.

Remark 1.212. (a) $M_X(0) = 1$ and hence $A := \{t \in \mathbb{R} : E[e^{tX}] \text{ is finite}\} \neq \emptyset$.
(b) $M_X(t) > 0$, $\forall t \in A$, with $A$ as above.
(c) For $c \in \mathbb{R}$, consider the constant/degenerate RV $X$ given by the p.m.f. (see Example 1.179)
\[ f_X(x) = \begin{cases} 1, & \text{if } x = c \\ 0, & \text{otherwise.} \end{cases} \]
Here, the support is $S_X = \{c\}$ and $M_X(t) = Ee^{tX} = \sum_{x \in S_X} e^{tx} f_X(x) = e^{tc}$ exists for all $t \in \mathbb{R}$.
(d) Suppose the MGF $M_X$ exists on $(-h, h)$. Take constants $c, d \in \mathbb{R}$ with $c \neq 0$. Then, the RV $Y = cX + d$ is discrete/continuous, according to $X$ being discrete/continuous and moreover,
\[ M_Y(t) = Ee^{t(cX + d)} = e^{td} M_X(ct) \]
exists for all $t \in \left(-\frac{h}{|c|}, \frac{h}{|c|}\right)$.

Note 1.213. The MGF can be used to compute the moments of an RV, and this is the motivation behind the term 'Moment Generating Function'. This result is stated below. We skip the proof for brevity.

Theorem 1.214. Let $X$ be an RV with MGF $M_X$ which exists on $(-h, h)$ for some $h > 0$. Then, we have the following results.
(a) $\mu'_r = E[X^r]$ is finite for each $r \in \{1, 2, \ldots\}$.
(b) $\mu'_r = E[X^r] = M_X^{(r)}(0)$, where $M_X^{(r)}(0) = \left[\frac{d^r}{dt^r} M_X(t)\right]_{t=0}$ is the $r$-th derivative of $M_X(t)$ at the point $0$ for each $r \in \{1, 2, \ldots\}$.
(c) $M_X$ has the Maclaurin's series expansion around $t = 0$ of the following form
\[ M_X(t) = \sum_{r=0}^{\infty} \frac{t^r}{r!}\,\mu'_r \quad \text{with } t \in (-h, h). \]
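Part (b) of Theorem 1.214 can be illustrated numerically: central finite differences of an MGF at $t = 0$ approximate the first two moments. The sketch below uses the MGF of a fair six-sided die (not computed explicitly in these notes, but immediate from Note 1.211), whose moments were found in Example 1.206:

```python
import math

# MGF of a fair six-sided die: M(t) = (1/6) * sum_{x=1}^{6} e^{tx}.
def M(t):
    return sum(math.exp(t * x) for x in range(1, 7)) / 6.0

h = 1e-4
d1 = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0) = E[X]
d2 = (M(h) - 2 * M(0.0) + M(-h)) / h**2  # ~ M''(0) = E[X^2]

print(round(d1, 4), round(d2, 4))  # ≈ 3.5, 15.1667 (= 7/2 and 91/6)
```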

Proposition 1.215. Continue with the notations and assumptions of Theorem 1.214 and define $\psi_X : (-h, h) \to \mathbb{R}$ by $\psi_X(t) := \ln M_X(t)$, $t \in (-h, h)$. Then
\[ \mu'_1 = E[X] = \psi_X^{(1)}(0) \quad \text{and} \quad \mu_2 = Var(X) = \psi_X^{(2)}(0), \]
where $\psi_X^{(r)}$ denotes the $r$-th ($r \in \{1, 2\}$) derivative of $\psi_X$.

Proof. We have, for $t \in (-h, h)$,
\[ \psi_X^{(1)}(t) = \frac{M_X^{(1)}(t)}{M_X(t)} \quad \text{and} \quad \psi_X^{(2)}(t) = \frac{M_X^{(2)}(t)\,M_X(t) - \left(M_X^{(1)}(t)\right)^2}{(M_X(t))^2}. \]
Evaluating the above equalities at $t = 0$, using $M_X(0) = 1$, gives the required results. □

Example 1.216. Let $X$ be a discrete RV with p.m.f.
\[ f_X(x) = \begin{cases} \frac{e^{-\lambda} \lambda^x}{x!}, & \text{if } x \in \{0, 1, 2, \ldots\} \\ 0, & \text{otherwise,} \end{cases} \]
where $\lambda > 0$. We have
\[ M_X(t) = E\left[e^{tX}\right] = \sum_{x=0}^{\infty} e^{tx}\,\frac{e^{-\lambda} \lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda(e^t - 1)} \quad \forall t \in \mathbb{R}, \]
since $A = \{t \in \mathbb{R} : E(e^{tX}) < \infty\} = \mathbb{R}$. Now,
\[ M_X^{(1)}(t) = \lambda e^t e^{\lambda(e^t - 1)} \quad \text{and} \quad M_X^{(2)}(t) = \lambda e^t e^{\lambda(e^t - 1)}\left(1 + \lambda e^t\right) \quad \forall t \in \mathbb{R}. \]
Then,
\[ \mu'_1 = E(X) = M_X^{(1)}(0) = \lambda, \quad \mu'_2 = E(X^2) = M_X^{(2)}(0) = \lambda(1 + \lambda), \quad Var(X) = \mu_2 = \mu'_2 - (\mu'_1)^2 = \lambda. \]
Again, for $t \in \mathbb{R}$, $\psi_X(t) = \ln(M_X(t)) = \lambda(e^t - 1)$, which yields $\psi_X^{(1)}(t) = \psi_X^{(2)}(t) = \lambda e^t$, $\forall t \in \mathbb{R}$. Then, $\mu'_1 = E(X) = \lambda$, $\mu_2 = Var(X) = \lambda$. Higher order moments can be calculated by looking at higher order derivatives of $M_X$.
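The closed form $e^{\lambda(e^t - 1)}$ and the moment values read off $\psi_X$ can be checked numerically; the choice $\lambda = 2$ below is arbitrary:

```python
import math

lam = 2.0  # arbitrary choice of the parameter lambda
pmf = lambda x: math.exp(-lam) * lam**x / math.factorial(x)

# The series sum of e^{tx} f_X(x) agrees with the closed form e^{lam(e^t - 1)}.
t = 0.3
series = sum(math.exp(t * x) * pmf(x) for x in range(100))
closed = math.exp(lam * (math.exp(t) - 1.0))
assert abs(series - closed) < 1e-9

# psi_X(t) = lam*(e^t - 1): both derivatives at 0 equal lam.
psi = lambda u: lam * (math.exp(u) - 1.0)
h = 1e-4
d1 = (psi(h) - psi(-h)) / (2 * h)              # ~ psi'(0) = E[X]
d2 = (psi(h) - 2 * psi(0.0) + psi(-h)) / h**2  # ~ psi''(0) = Var(X)
print(round(d1, 4), round(d2, 4))  # ≈ 2.0, 2.0
```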

Example 1.217. Let $X$ be a continuous RV with p.d.f.
\[ f_X(x) = \begin{cases} e^{-x}, & \text{if } x > 0 \\ 0, & \text{otherwise.} \end{cases} \]
We have
\[ M_X(t) = E\left(e^{tX}\right) = \int_0^{\infty} e^{tx} e^{-x}\,dx = \int_0^{\infty} e^{-(1-t)x}\,dx = (1 - t)^{-1} < \infty, \quad \text{if } t < 1. \]
In particular, $M_X$ exists on $(-1, 1)$ and $A = \{t \in \mathbb{R} : E(e^{tX}) < \infty\} = (-\infty, 1) \supset (-1, 1)$. Now,
\[ M_X^{(1)}(t) = (1 - t)^{-2} \quad \text{and} \quad M_X^{(2)}(t) = 2(1 - t)^{-3}, \quad t < 1. \]
Then,
\[ \mu'_1 = E(X) = M_X^{(1)}(0) = 1, \quad \mu'_2 = E(X^2) = M_X^{(2)}(0) = 2, \quad Var(X) = \mu_2 = \mu'_2 - (\mu'_1)^2 = 1. \]
Again, for $t < 1$, $\psi_X(t) = \ln(M_X(t)) = -\ln(1 - t)$, which yields $\psi_X^{(1)}(t) = \frac{1}{1-t}$, $\psi_X^{(2)}(t) = \frac{1}{(1-t)^2}$, $\forall t < 1$. Then, $\mu'_1 = E(X) = 1$, $\mu_2 = Var(X) = 1$.
Now, consider the Maclaurin's series expansion for $M_X$ around $t = 0$. We have
\[ M_X(t) = (1 - t)^{-1} = \sum_{r=0}^{\infty} t^r, \quad \forall t \in (-1, 1) \]
and hence $\mu'_r = r!$, which is the coefficient of $\frac{t^r}{r!}$ in the above power series.
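The conclusion $\mu'_r = r!$ can be checked by numerically integrating $x^r e^{-x}$ over $(0, \infty)$. The sketch below truncates the integral at an assumed upper limit where the tail is negligible for small $r$:

```python
import math

def moment(r, n=100000, upper=40.0):
    # midpoint-rule approximation of E[X^r] = integral_0^inf x^r e^{-x} dx,
    # truncated at `upper` (an assumed cutoff; the tail beyond is negligible)
    dx = upper / n
    return sum(((i + 0.5) * dx) ** r * math.exp(-(i + 0.5) * dx) * dx
               for i in range(n))

print([round(moment(r), 3) for r in range(1, 5)])
# ≈ [1.0, 2.0, 6.0, 24.0], i.e. mu'_r = r!
```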

Example 1.218. Let $X$ be a continuous RV with p.d.f.
\[ f_X(x) = \frac{1}{\pi} \cdot \frac{1}{1 + x^2}, \quad \forall x \in \mathbb{R}. \]
As observed earlier in Example 1.186, $EX$ does not exist. Since the existence of moments is a necessary condition for the existence of the MGF, we conclude that the MGF does not exist for this RV $X$.

Remark 1.219 (Identically distributed RVs). Let $X$ and $Y$ be two RVs, possibly defined on different probability spaces.

(a) Recall from Remark 1.112 that their law/distribution may be the same and in this case, we have $F_X = F_Y$, i.e. $F_X(x) = F_Y(x)$, $\forall x \in \mathbb{R}$. The statement '$X$ and $Y$ are equal in law/distribution' is equivalent to '$X$ and $Y$ are identically distributed'.
(b) Recall from Remark 1.113 that the DF uniquely identifies the law/distribution, i.e. if $F_X = F_Y$, then $X$ and $Y$ are identically distributed.

(c) Suppose $X$ and $Y$ are discrete RVs. Recall from Remark 1.127 that the p.m.f. is uniquely determined by the DF and vice versa. In the case of discrete RVs, $X$ and $Y$ are identically distributed if and only if the p.m.f.s are equal (i.e., $f_X = f_Y$).
(d) Suppose $X$ and $Y$ are continuous RVs. Recall from Note 1.138 that the p.d.f.s in this case are uniquely identified up to sets of 'length 0'. We may refer to such an almost equal p.d.f. as a 'version of a p.d.f.'. Recall from Note 1.141 that the p.d.f. is uniquely determined by the DF and vice versa. In the case of continuous RVs, $X$ and $Y$ are identically distributed if and only if the p.d.f.s are versions of each other. In other words, $X$ and $Y$ are identically distributed if and only if there exist versions $f_X$ and $f_Y$ of the p.d.f.s such that $f_X = f_Y$, i.e. $f_X(x) = f_Y(x)$, $\forall x \in \mathbb{R}$.
(e) Suppose $X$ and $Y$ are identically distributed and let $h : \mathbb{R} \to \mathbb{R}$ be a function. Then the RVs $h(X)$ and $h(Y)$ are identically distributed. In particular, $Eh(X) = Eh(Y)$, provided one of the expectations exists.
(f) Suppose $X$ and $Y$ are identically distributed. By (e), $X^2$ and $Y^2$ are identically distributed and $EX^2 = EY^2$, provided one of the expectations exists. More generally, the $n$-th moments $EX^n$ and $EY^n$ of $X$ and $Y$ are the same, provided they exist.
(g) There are examples where $EX^n = EY^n$, $\forall n = 1, 2, \cdots$, but $X$ and $Y$ are not identically distributed. We may discuss such an example later in this course. Consequently, the moments do not uniquely identify the distribution. Under certain sufficient conditions on the moments, such as Carleman's condition, it is however possible to uniquely identify the distribution. This is beyond the scope of this course.
(h) Suppose $X$ and $Y$ are identically distributed and suppose that the MGF $M_X$ exists on $(-h, h)$ for some $h > 0$. By observation (e) above, the MGF $M_Y$ exists and $M_X = M_Y$, i.e. $M_X(t) = M_Y(t)$, $\forall t \in (-h, h)$.
(i) We now state a result without proof. Suppose the MGFs $M_X$ and $M_Y$ exist. If $M_X(t) = M_Y(t)$, $\forall t \in (-h, h)$, then $X$ and $Y$ are identically distributed. Therefore, the MGF uniquely identifies the distribution.

Notation 1.220. We write $X \overset{d}{=} Y$ to denote that $X$ and $Y$ are identically distributed.

Example 1.221. If $Y$ is an RV with the MGF $M_Y(t) = (1 - t)^{-1}$, $\forall t \in (-1, 1)$, then by Example 1.217, we conclude that $Y$ is a continuous RV with p.d.f.
\[ f_Y(x) = \begin{cases} e^{-x}, & \text{if } x > 0 \\ 0, & \text{otherwise.} \end{cases} \]
Example 1.222. If $X$ is a discrete RV with support $S_X$ and p.m.f. $f_X$, then the MGF $M_X$ is of the form
\[ M_X(t) = \sum_{x \in S_X} e^{tx} f_X(x). \]
We can also make a converse statement. Since the MGF uniquely identifies a distribution, if an MGF is given by a sum of the above form, we can immediately identify the corresponding discrete RV with its support and p.m.f.. For example, if $M_X(t) = \frac{1}{2} + \frac{1}{3} e^t + \frac{1}{6} e^{-t}$, then $X$ is discrete with the p.m.f.
\[ f_X(x) := \begin{cases} \frac{1}{2}, & \text{if } x = 0, \\ \frac{1}{3}, & \text{if } x = 1, \\ \frac{1}{6}, & \text{if } x = -1, \\ 0, & \text{otherwise.} \end{cases} \]
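The identification in Example 1.222 can be sanity-checked by comparing the given closed form with the MGF computed from the recovered p.m.f.:

```python
import math

# p.m.f. read off from M_X(t) = 1/2 + (1/3) e^t + (1/6) e^{-t}
pmf = {0: 1/2, 1: 1/3, -1: 1/6}

def mgf(t):
    # M_X(t) = sum over the support of e^{tx} f_X(x)
    return sum(p * math.exp(t * x) for x, p in pmf.items())

# compare against the closed form at a few sample points
for t in (-1.0, 0.0, 0.5, 2.0):
    closed = 1/2 + math.exp(t) / 3 + math.exp(-t) / 6
    assert abs(mgf(t) - closed) < 1e-12
print(mgf(0.0))  # ≈ 1.0, consistent with M_X(0) = 1
```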

Notation 1.223. We may refer to expectations of the form $Ee^{tX}$ as exponential moments of the RV $X$.

Definition 1.224 (Symmetric Distribution). An RV $X$ is said to have a symmetric distribution about a point $\mu \in \mathbb{R}$ if $X - \mu \overset{d}{=} \mu - X$.

Proposition 1.225. Let $X$ be an RV which is symmetric about $0$.

(a) If $X$ is discrete, then the p.m.f. $f_X$ has the property that $f_X(x) = f_X(-x)$, $\forall x \in \mathbb{R}$. Further, $EX^n = 0$, $\forall n = 1, 3, 5, \cdots$, provided the moments exist.
(b) If $X$ is continuous, then the p.d.f. $f_X$ has the property that $f_X(x) = f_X(-x)$, $\forall x \in \mathbb{R}$. Further, $EX^n = 0$, $\forall n = 1, 3, 5, \cdots$, provided the moments exist.

Proof. We prove the statement when $X$ is a continuous RV. The proof for the case when $X$ is discrete is similar.
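For the discrete case, the vanishing of the odd moments can be illustrated with a hypothetical p.m.f. symmetric about 0 (chosen only for this sketch):

```python
from fractions import Fraction

# Hypothetical p.m.f. with f_X(x) = f_X(-x) for all x (symmetric about 0).
pmf = {-2: Fraction(1, 8), -1: Fraction(1, 4), 0: Fraction(1, 4),
       1: Fraction(1, 4), 2: Fraction(1, 8)}

assert sum(pmf.values()) == 1
assert all(pmf[x] == pmf[-x] for x in pmf)  # symmetry of the p.m.f.

# odd moments E[X^n] vanish: the terms x^n f(x) and (-x)^n f(-x) cancel
odd = [sum(x**n * p for x, p in pmf.items()) for n in (1, 3, 5)]
assert all(m == 0 for m in odd)
print(odd)  # all zero
```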
