Lindeberg Condition
See Durrett 2nd ed pages 116-118 for an equivalent formulation and a proof using characteristic
functions. That proof leans on the continuity theorem for characteristic functions, (3.4) on page 99,
which in turn relies on the Helly selection theorem (2.5) on page 88. The present approach, due to
Lindeberg, is more elementary in that it does not require these tools. But note that the basic idea
in both arguments is to estimate the expected value of a smooth function of a sum of independent
variables using a Taylor expansion with error bound.
10.1 Triangular Arrays
Roughly speaking, a sum of many small independent random variables will be nearly normally
distributed. To formulate a limit theorem of this kind, we must consider sums of more and more
smaller and smaller random variables. Therefore, throughout this section we shall study the sequence
of sums
$$S_i = \sum_{j} X_{ij}.$$
Here the row index i should always be taken to range over 1, 2, 3, . . ., while the column index j
ranges from 1 to $n_i$. It is not assumed that the r.v.s in each row are identically distributed. And it is not assumed that different rows are independent. (Different rows could even be defined on different probability spaces.) For motivation, see the applications in Section 10.7 below for how such a triangular array is set up in the most important application, to partial sums $X_1 + X_2 + \cdots + X_n$ obtained from a sequence of independent r.v.s $X_1, X_2, \ldots$
It will usually be the case that $n_1 < n_2 < \cdots$, whence the term triangular. It is not necessary to assume this, however.
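For a concrete example, anticipating Section 10.7: given a sequence of independent r.v.s $X_1, X_2, \ldots$ with $EX_j = 0$ and finite variances, put $s_i^2 = \mathrm{Var}(X_1 + \cdots + X_i)$ and take $n_i = i$ and $X_{ij} = X_j / s_i$. Then
$$S_i = \sum_{j=1}^{i} \frac{X_j}{s_i} = \frac{X_1 + \cdots + X_i}{s_i} \qquad\text{and}\qquad \sum_{j=1}^{i} E X_{ij}^2 = 1,$$
so each row is the same sequence rescaled, and the row sums $S_i$ are the standardized partial sums.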
¹This is not standard terminology, but is used here as a simple referent for these conditions.
10.2
We will write $L(X)$ to denote the law or distribution of a random variable $X$, and $N(0, \sigma^2)$ for the normal distribution with mean 0 and variance $\sigma^2$.
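In what follows, the Triangular Array Conditions on $\{X_{ij}\}$ are taken to be: for each $i$, (1) the r.v.s $X_{i1}, \ldots, X_{i n_i}$ are independent; (2) $EX_{ij} = 0$ for every $j$; (3) $\sum_{j=1}^{n_i} E X_{ij}^2 = 1$. (Condition 3 is the one quoted explicitly after (10.6) below.)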
Theorem 10.1 (Lindeberg's Theorem) Suppose that in addition to the Triangular Array Conditions, the triangular array satisfies Lindeberg's Condition: for every $\epsilon > 0$,
$$\lim_{i \to \infty} \sum_{j=1}^{n_i} E\big[X_{ij}^2\,\mathbf{1}(|X_{ij}| > \epsilon)\big] = 0. \tag{10.1}$$
Then $L(S_i) \to N(0,1)$ as $i \to \infty$.
Let us first record some consequences of the Lindeberg condition. For any $\epsilon > 0$ and any $j$,
$$X_{ij}^2 \le \epsilon^2 + X_{ij}^2\,\mathbf{1}(|X_{ij}| > \epsilon), \tag{10.2}$$
so that
$$E X_{ij}^2 \le \epsilon^2 + E\big[X_{ij}^2\,\mathbf{1}(|X_{ij}| > \epsilon)\big] \tag{10.3}$$
$$\le \epsilon^2 + \sum_{j} E\big[X_{ij}^2\,\mathbf{1}(|X_{ij}| > \epsilon)\big], \quad\text{which is independent of } j, \text{ so} \tag{10.4}$$
$$\max_{1 \le j \le n_i} E X_{ij}^2 \le \epsilon^2 + \sum_{j} E\big[X_{ij}^2\,\mathbf{1}(|X_{ij}| > \epsilon)\big]. \tag{10.5}$$
The Lindeberg condition says that, as we go down the rows (i.e. $i \to \infty$), the summation on the RHS tends to zero. Since inequality (10.5) holds for all $\epsilon > 0$, we get
$$\lim_{i \to \infty} \max_{1 \le j \le n_i} E X_{ij}^2 = 0. \tag{10.6}$$
A consequence of (10.6) and condition 3 ($\sum_j E X_{ij}^2 = 1$ for all $i$) is that $n_i \to \infty$ as $i \to \infty$. Another consequence follows from the application of (10.6) to Chebychev's inequality. We have, for all $\epsilon > 0$,
$$\max_{1 \le j \le n_i} P(|X_{ij}| > \epsilon) \le \frac{1}{\epsilon^2}\,\max_{1 \le j \le n_i} E(X_{ij}^2) \longrightarrow 0 \quad\text{as } i \to \infty. \tag{10.7}$$
An array with property (10.7) is said to be uniformly asymptotically negligible (UAN), and there is a striking converse to Lindeberg's theorem:
Theorem 10.2 (Feller's Theorem) If a triangular array satisfies the Triangular Array Conditions and is UAN, then $L(S_i) \to N(0,1)$ [if and] only if Lindeberg's condition (10.1) holds.
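Two quick examples, not needed in what follows, may help fix ideas. First, take $n_i = i$ and $X_{ij} = \xi_j/\sqrt{i}$, where $\xi_1, \xi_2, \ldots$ are i.i.d. with $P(\xi_j = \pm 1) = 1/2$. Then $\sum_j E X_{ij}^2 = 1$, and for any fixed $\epsilon > 0$ the indicator $\mathbf{1}(|X_{ij}| > \epsilon)$ vanishes once $1/\sqrt{i} \le \epsilon$, so the Lindeberg sum in (10.1) is eventually $0$ and Theorem 10.1 gives $L(S_i) \to N(0,1)$. Second, take $n_i = i$, $X_{i1} = \xi$ with $\xi \sim N(0,1)$, and $X_{ij} = 0$ for $j \ge 2$. Then $S_i \sim N(0,1)$ for every $i$, yet the Lindeberg sum equals $E[\xi^2\,\mathbf{1}(|\xi| > \epsilon)] > 0$ for all $i$, and $P(|X_{i1}| > \epsilon)$ does not tend to $0$: the array is not UAN, which is why this example does not contradict Feller's theorem.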
10.3
A condition stronger than Lindeberg's that is often easier to check is the Lyapounov condition:
$$\exists\,\delta > 0 \ \text{such that}\ \lim_{i \to \infty} \sum_{j} E|X_{ij}|^{2+\delta} = 0. \tag{10.8}$$
That Lyapounov's condition implies Lindeberg's follows from the next lemma.
Lemma 10.3 For any random variable $X$ and any $\epsilon > 0$, $\delta > 0$,
$$E\big[X^2\,\mathbf{1}(|X| > \epsilon)\big] \le E\Big[\frac{|X|^{2+\delta}}{\epsilon^{\delta}}\,\mathbf{1}(|X| > \epsilon)\Big] \le \frac{E|X|^{2+\delta}}{\epsilon^{\delta}}. \tag{10.9}$$
Take $X = X_{ij}$ to be the elements of our triangular array, and take $\delta$ to be the value from Lyapounov's condition. Then we can sum over $j$ on the RHS and take the limit as $i \to \infty$ on both sides to get Lindeberg's condition.
Theorem 10.4 (Lyapounov's Theorem) If a triangular array satisfies the Triangular Array Conditions and the Lyapounov condition (10.8), then $L(S_i) \to N(0,1)$.
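For example, if $X_1, X_2, \ldots$ are i.i.d. with $EX_1 = 0$, $\sigma^2 = EX_1^2 > 0$ and $E|X_1|^3 < \infty$, and we take the normalized array $X_{nj} = X_j/(\sigma\sqrt{n})$, then Lyapounov's condition holds with $\delta = 1$:
$$\sum_{j=1}^{n} E\Big|\frac{X_j}{\sigma\sqrt{n}}\Big|^3 = \frac{n\,E|X_1|^3}{\sigma^3\,n^{3/2}} = \frac{E|X_1|^3}{\sigma^3\,\sqrt{n}} \longrightarrow 0 \quad\text{as } n \to \infty.$$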
10.4
3. use the radial symmetry of the joint density function of i.i.d. $N(0, \sigma^2 + \tau^2)$ r.v.s $U$ and $V$ to argue that $L(U\sin\theta + V\cos\theta) = L(U)$. Take $\sin^2(\theta) = \dfrac{\sigma^2}{\sigma^2 + \tau^2}$.
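Spelling out the conclusion (assuming, as the notation suggests, that $\sigma^2$ and $\tau^2$ are the variances of the two independent normal summands being added): with $\sin^2(\theta) = \sigma^2/(\sigma^2+\tau^2)$ we have $\cos^2(\theta) = \tau^2/(\sigma^2+\tau^2)$, so $U\sin\theta \sim N(0,\sigma^2)$ and $V\cos\theta \sim N(0,\tau^2)$ are independent, while by the rotation step their sum has the law of $U$, namely $N(0, \sigma^2+\tau^2)$. Thus the sum of independent $N(0,\sigma^2)$ and $N(0,\tau^2)$ r.v.s is $N(0,\sigma^2+\tau^2)$, which is the normal addition rule used (via Theorem 10.5) in the next section.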
10.5
This illustrates the general idea and avoids a few tricky details. With n fixed, let X1 , X2 , . . . , Xn
be independent random variables, not necessarily identically distributed.
Suppose $EX_j = 0$ and let $\sigma_j^2 = E(X_j^2) < \infty$. Then for $S = X_1 + \cdots + X_n$ we have $\mathrm{Var}\,S = \sum_{j=1}^{n} \sigma_j^2$. Let $\sigma^2 = \mathrm{Var}\,S$. Note:
1. If $L(X_j)$ is $N(0, \sigma_j^2)$, then $L(S)$ is $N(0, \sigma^2)$ by Theorem 10.5.
Now let $Z_1, Z_2, \ldots, Z_n$ be independent random variables with $L(Z_j) = N(0, \sigma_j^2)$, independent also of the $X_j$'s, and let $T = Z_1 + \cdots + Z_n$. Define the interpolating sums
$$S_0 := X_1 + X_2 + X_3 + \cdots + X_n,$$
$$S_1 := Z_1 + X_2 + X_3 + \cdots + X_n,$$
$$S_2 := Z_1 + Z_2 + X_3 + \cdots + X_n,$$
$$\vdots$$
$$S_n := Z_1 + Z_2 + Z_3 + \cdots + Z_n,$$
so that $S_0 = S$ and $S_n = T$.
We want to show that $L(S)$ is close to $L(T)$, which is $N(0, \sigma^2)$, i.e., that $Ef(S)$ is close to $Ef(T)$ for all $f \in C^3(-\infty, \infty)$ with a uniform bound $K$ on $|f^{(i)}|$, $i = 0, 1, 2, 3$.
Clearly,
$$|Ef(S) - Ef(T)| \le \sum_{j=1}^{n} \big|Ef(S_j) - Ef(S_{j-1})\big|. \tag{10.10}$$
Let $R_j$ be the sum of the common terms in $S_{j-1}$ and $S_j$. Then $S_{j-1} = R_j + X_j$ and $S_j = R_j + Z_j$. Note that by construction
$$R_j \text{ and } X_j \text{ are independent, as are } R_j \text{ and } Z_j. \tag{10.11}$$
We need to compare $Ef(R_j + X_j)$ and $Ef(R_j + Z_j)$. By the Taylor series expansion up to the third term,
$$f(R_j + X_j) = f(R_j) + X_j f^{(1)}(R_j) + \frac{X_j^2}{2!}\,f^{(2)}(R_j) + \frac{X_j^3}{3!}\,f^{(3)}(\xi), \tag{10.12}$$
where $\xi \in (R_j, R_j + X_j)$. And the same is true with $Z_j$ instead of $X_j$ (with some intermediate point $\zeta$). So, assuming that the $X$'s have third moments, we can take expectations in each of these identities and subtract the resulting equations. We get the following:
1. Since $EX_j = 0 = EZ_j$, and by the independence of $X_j$, $R_j$, and $Z_j$ (10.11), we have $E(X_j f^{(1)}(R_j)) = 0 = E(Z_j f^{(1)}(R_j))$.
2. Since $\mathrm{Var}\,X_j = \mathrm{Var}\,Z_j$, and by (10.11), we have $E(X_j^2 f^{(2)}(R_j)) = E(Z_j^2 f^{(2)}(R_j))$.
Thus the first and second order terms cancel, so we are left with the last inequality below (the first
two equalities summarize the previous paragraphs):
$$|Ef(S_j) - Ef(S_{j-1})| = |Ef(R_j + X_j) - Ef(R_j + Z_j)| = \Big| E\Big[\frac{X_j^3}{3!}\,f^{(3)}(\xi)\Big] - E\Big[\frac{Z_j^3}{3!}\,f^{(3)}(\zeta)\Big] \Big| \tag{10.13}$$
$$\le \frac{K}{6}\,\big(E|X_j|^3 + E|Z_j|^3\big), \tag{10.14}$$
where K is the bound on the derivatives of f . Now
$$E|Z_j|^3 = 2\int_0^\infty z^3\,\frac{1}{\sqrt{2\pi}\,\sigma_j}\,\exp\{-z^2/(2\sigma_j^2)\}\,dz \tag{10.15}$$
$$= 2\,\sigma_j^3 \int_0^\infty x^3\,\frac{1}{\sqrt{2\pi}}\,\exp\{-x^2/2\}\,dx \tag{10.16}$$
$$= c\,\sigma_j^3, \tag{10.17}$$
where
$$c = 2\,\frac{1}{\sqrt{2\pi}}\int_0^\infty x^3\,\exp\{-x^2/2\}\,dx = 2\sqrt{\frac{2}{\pi}} < 2,$$
and since $(E|X|^2)^{1/2} \le (E|X|^3)^{1/3}$ for any random variable $X$, we have $\sigma_j^3 \le E|X_j|^3$ for each $j$. Thus $E|Z_j|^3 = c\,\sigma_j^3 \le c\,E|X_j|^3$ for each $j$. Applying this to (10.14), we get
$$\frac{K}{6}\,\big(E|X_j|^3 + E|Z_j|^3\big) \le \frac{K}{6}\,E|X_j|^3\,(1 + c).$$
Now, from (10.10), we get
$$|Ef(S) - Ef(T)| \le \frac{(c+1)K}{6}\sum_{j=1}^{n} E|X_j|^3. \tag{10.18}$$
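As an illustration, here is a minimal numerical sketch of the bound (10.18); the specific choices are assumptions made only for this example: $f = \cos$ (so $K = 1$ bounds $|f^{(i)}|$, $i = 0, \ldots, 3$), $X_j \sim \mathrm{Uniform}(-a_j, a_j)$ (so $\sigma_j^2 = a_j^2/3$ and $E|X_j|^3 = a_j^3/4$), and $E\cos(T) = e^{-\sigma^2/2}$ computed exactly for $T \sim N(0, \sigma^2)$.

    import numpy as np

    # Numerical sketch of (10.18): |Ef(S) - Ef(T)| <= (c+1)K/6 * sum_j E|X_j|^3.
    # Assumed for the example: f = cos (so K = 1), X_j ~ Uniform(-a_j, a_j).
    rng = np.random.default_rng(0)
    n, reps = 20, 500_000
    a = np.linspace(0.2, 0.6, n)                  # half-widths of the uniform summands
    sigma2 = (a ** 2 / 3.0).sum()                 # sigma^2 = Var S = sum_j a_j^2 / 3

    S = (rng.uniform(-1.0, 1.0, size=(reps, n)) * a).sum(axis=1)
    lhs = abs(np.cos(S).mean() - np.exp(-sigma2 / 2.0))   # E cos(T) = exp(-sigma^2/2) exactly

    c = 2.0 * np.sqrt(2.0 / np.pi)                # the constant from (10.17), about 1.596
    rhs = (c + 1.0) / 6.0 * (a ** 3 / 4.0).sum()  # E|X_j|^3 = a_j^3 / 4

    print(f"|Ef(S) - Ef(T)| ~= {lhs:.5f}   bound (10.18) = {rhs:.5f}")

The Monte Carlo estimate of the left-hand side should come out well below the bound, which is crude but requires only third moments.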
10.6
For Lyapounov's version of the CLT, we looked at a triangular array $\{X_{ij}\}$ with $EX_{ij} = 0$, $EX_{ij}^2 = \sigma_{ij}^2$, and $\sum_{j=1}^{n_i} \sigma_{ij}^2 = 1$. Taking $S_i = X_{i1} + X_{i2} + \cdots + X_{i n_i}$, we saw that we could prove $L(S_i) \to N(0,1)$ assuming that
$$\lim_{i \to \infty} \sum_{j=1}^{n_i} E|X_{ij}|^3 = 0.$$
This is a condition on third moments - we would like to see if a weaker condition will suffice. We
used third moments in a Taylor series expansion as follows:
$$f(R + X) = f(R) + X f^{(1)}(R) + \frac{X^2}{2!}\,f^{(2)}(R) + \frac{X^3}{3!}\,f^{(3)}(\xi). \tag{10.19}$$
If instead we expand only to the second order, we get
$$f(R + X) = f(R) + X f^{(1)}(R) + \frac{X^2}{2!}\,f^{(2)}(\eta) \tag{10.20}$$
for some $\eta$ between $R$ and $R + X$. So the error made in replacing $f(R + X)$ by the quadratic $f(R) + X f^{(1)}(R) + \frac{X^2}{2}\,f^{(2)}(R)$ can be written either as
$$\frac{X^2}{2}\,f^{(2)}(\eta) - \frac{X^2}{2}\,f^{(2)}(R) \tag{10.21}$$
or as
$$\frac{X^3}{3!}\,f^{(3)}(\xi). \tag{10.22}$$
Using the first form on the event $\{|X| > \epsilon\}$ and the second on $\{|X| \le \epsilon\}$, the error equals
$$\frac{X^2}{2}\,\big[f^{(2)}(\eta) - f^{(2)}(R)\big]\,\mathbf{1}(|X| > \epsilon) + \frac{X^3}{6}\,f^{(3)}(\xi)\,\mathbf{1}(|X| \le \epsilon), \tag{10.23}$$
which in absolute value is at most
$$K X^2\,\mathbf{1}(|X| > \epsilon) + \frac{K}{6}\,|X|^3\,\mathbf{1}(|X| \le \epsilon) \le K X^2\,\mathbf{1}(|X| > \epsilon) + \frac{K\epsilon}{6}\,X^2. \tag{10.24}$$
After taking expectations, the right-hand side of (10.24) involves only (truncated) second moments of $X$, not third moments.
Now we return to the setup of section 10.5 and use our new result to get more refined bounds. From
(10.10) and (10.13), we had
$$|Ef(S) - Ef(T)| \le \sum_{j=1}^{n} \Big| E\Big[\frac{X_j^3}{6}\,f^{(3)}(\xi)\Big] - E\Big[\frac{Z_j^3}{6}\,f^{(3)}(\zeta)\Big] \Big|.$$
Using the triangle inequality, the new bound (10.24) in place of the $X_j^3$ term, the assumption that $|f^{(3)}| \le K$, and $E|Z_j|^3 = c\,\sigma_j^3$ (10.17), we get
$$|Ef(S) - Ef(T)| \le \sum_{j=1}^{n}\Big[\,K\,E\big[X_j^2\,\mathbf{1}(|X_j| > \epsilon)\big] + \frac{K\epsilon}{6}\,E X_j^2 + \frac{cK}{6}\,\sigma_j^3\,\Big] \tag{10.25}$$
$$= K \sum_{j=1}^{n} E\big[X_j^2\,\mathbf{1}(|X_j| > \epsilon)\big] + \frac{K\epsilon\sigma^2}{6} + \frac{cK}{6}\sum_{j=1}^{n} \sigma_j^3. \tag{10.26}$$
As $i \to \infty$ (i.e. we go down the rows of the triangular array), the first term goes to zero by the Lindeberg condition. The last term goes to zero since
$$\sum_{j=1}^{n(i)} \sigma_{ij}^3 \le \Big(\max_{1 \le j \le n(i)} \sigma_{ij}\Big) \sum_{j=1}^{n(i)} \sigma_{ij}^2 = \sigma^2 \max_{1 \le j \le n(i)} \sigma_{ij},$$
and the maximum tends to zero by (10.6). Finally, the middle term $K\epsilon\sigma^2/6$ can be made as small as we please by choosing $\epsilon$ small, so $Ef(S_i) \to Ef(T)$.
10.7 Applications
The most important application is to partial sums of a single sequence. Let $X_1, X_2, \ldots$ be independent r.v.s with $EX_j = 0$ and finite variances, put $s_n^2 = \mathrm{Var}(X_1 + \cdots + X_n)$, and form the triangular array $X_{nj} = X_j / s_n$, $j = 1, \ldots, n$.
1. Suppose the $X_j$ are i.i.d. with $0 < EX_1^2 < \infty$, so that $s_n^2 = n\,EX_1^2 \to \infty$. The Lindeberg condition for this array requires that for every $\epsilon > 0$,
$$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{j=1}^{n} E\big[X_j^2\,\mathbf{1}(|X_j| > \epsilon s_n)\big] = 0. \tag{10.28}$$
In the i.i.d. case the quantity inside the limit equals $E\big[X_1^2\,\mathbf{1}(|X_1| > \epsilon s_n)\big]/EX_1^2$, and since $EX_1^2 < \infty$, we can use the dominated convergence theorem to conclude that the Lindeberg condition holds. (A small numerical illustration is given after this list.)
2. Lyapounov's condition
$$\lim_{n \to \infty} \frac{1}{s_n^{2+\delta}} \sum_{j=1}^{n} E|X_j|^{2+\delta} = 0$$
implies Lindeberg's condition. The proof of this is given (essentially) in Lemma 10.3.
3. Suppose $X_1, X_2, \ldots$ are uniformly bounded, $|X_j| \le M$ for all $j$, and $s_n \to \infty$. Fix $\epsilon > 0$. For $n$ so large that $s_n \ge M/\epsilon$, we have
$$\mathbf{1}(|X_j| > \epsilon s_n) \le \mathbf{1}(|X_j| > M) = 0 \quad \text{for all } j.$$
Hence the Lindeberg condition is satisfied.
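The numerical illustration promised in item 1: a minimal sketch, assuming for concreteness that $X_1$ is a centered Exponential(1) variable, so $EX_1 = 0$, $EX_1^2 = 1$ and $s_n^2 = n$.

    import numpy as np

    # Lindeberg term of (10.28) in the i.i.d. case of item 1.
    # Assumed for the example: X_1 = Exponential(1) - 1, so EX_1 = 0, EX_1^2 = 1, s_n^2 = n.
    rng = np.random.default_rng(1)
    X = rng.exponential(1.0, size=2_000_000) - 1.0
    eps = 0.1
    for n in (10, 100, 1_000, 10_000, 100_000):
        s_n = np.sqrt(n)
        # In the i.i.d. case the sum in (10.28) reduces to E[X_1^2 1(|X_1| > eps*s_n)] / EX_1^2.
        term = np.mean(X ** 2 * (np.abs(X) > eps * s_n))
        print(f"n = {n:7d}   Lindeberg term ~= {term:.5f}")

The printed values should decrease to zero as $n$ grows, in line with the dominated convergence argument.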