Stat 212 April 10 Notes
1 Final Projects
A list of papers will be sent out soon. Make sure to find a group to work with. If you want to
work on a paper that is not on the provided list, you must convince a full group of people
to work on it with you.
2 Markov Chains
Recall the following statement from last lecture:
Definition 1 (Stationary). Let (X_i) be a sequence of random variables. It is called stationary if for all k,
\[
(X_i) \stackrel{d}{=} (X_{i+k}),
\]
that is, for all i_1, \ldots, i_\ell,
\[
(X_{i_1}, \ldots, X_{i_\ell}) \stackrel{d}{=} (X_{i_1+k}, \ldots, X_{i_\ell+k}).
\]
Definition 2. A chain is irreducible if for all x, y ∈ S, there exists t ∈ N such that p^t(x, y) > 0. An irreducible chain on a finite state space S has a unique stationary distribution
\[
\pi = (\pi(1), \ldots, \pi(|S|)),
\]
characterized by π = πP; if the chain is started from π, then P[X_n = x] = π(x) for all n ∈ N and x ∈ S.
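For concreteness, here is a minimal numerical sketch (not from lecture) of finding the stationary distribution by solving π = πP; the 3-state transition matrix P below is a made-up example.

```python
import numpy as np

# Made-up 3-state transition matrix (rows sum to 1); any irreducible chain works.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# pi = pi P  <=>  pi is a left eigenvector of P for eigenvalue 1,
# i.e. a right eigenvector of P^T.
eigvals, eigvecs = np.linalg.eig(P.T)
idx = np.argmin(np.abs(eigvals - 1.0))   # locate the eigenvalue closest to 1
pi = np.real(eigvecs[:, idx])
pi = pi / pi.sum()                       # normalize to a probability vector

print("stationary distribution:", pi)
print("pi P == pi?", np.allclose(pi @ P, pi))
```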
Unfortunately, the standard central limit theorem does not carry over unchanged to Markov
chains, but we can adapt the argument accordingly. Start with the following definition:
Definition 3. Let (X_i)_{i∈Z} be a stationary sequence. The (α(i)) are called the strong-mixing coefficients, where
\[
\alpha(i) = \sup \bigl\{ \, |P(A \cap B) - P(A)\,P(B)| \; : \; A \in \sigma(X_j : j \le 0), \; B \in \sigma(X_j : j \ge i) \, \bigr\}.
\]
Technically, there exist "stronger" mixing coefficients, for example the above with P(A | B)
instead of P(A, B). However, the stated definition is the canonical strong-mixing
coefficient; we do not claim strongest, after all.
Proposition 1. A Markov chain has α(i) → 0 as i → ∞ if, when its eigenvalues are indexed so that
\[
|\lambda_1| \ge |\lambda_2| \ge \cdots,
\]
we have λ_1 ≠ λ_2.
We did not prove this claim in class; it is left as an exercise. Additionally, since Markov
chains are well behaved, one can argue that there exists β ∈ (0, 1) such that α(i) ≤ c · β^i
for some constant c.
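The eigenvalue condition in Proposition 1 is easy to inspect numerically. The following sketch (my own illustration, again with a made-up transition matrix) orders the eigenvalues of P by modulus; |λ_2| is then a natural candidate for the geometric rate β in α(i) ≤ c · β^i.

```python
import numpy as np

# Made-up irreducible, aperiodic transition matrix.
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.2, 0.2, 0.6],
])

# Index the eigenvalues so that |lambda_1| >= |lambda_2| >= ...
lam = np.linalg.eigvals(P)
lam = lam[np.argsort(-np.abs(lam))]
print("eigenvalues by modulus:", np.round(lam, 4))

# For a stochastic matrix lambda_1 = 1; |lambda_2| < 1 gives a spectral gap,
# and |lambda_2| serves as a plausible rate beta in alpha(i) <= c * beta**i.
beta = np.abs(lam[1])
print("candidate beta =", round(beta, 4), " spectral gap =", round(1 - beta, 4))
```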
Returning to our goal of a central-limit-theorem-type result, we continue with the following:
Theorem 1. Let (X_i) be a stationary sequence and (α(i)) be its strong-mixing coefficients.
Assume that E[X_1] = 0. If there exists ϵ > 0 such that X_1 ∈ L^{2+ϵ} and
\[
\sum_{i=1}^{\infty} \alpha(i)^{\frac{\epsilon}{2+\epsilon}} < \infty,
\]
then
\[
\frac{1}{\sqrt{n}} \sum_{i=1}^{n} X_i \xrightarrow{d} N(0, \sigma^2),
\]
where
\[
\sigma^2 = \operatorname{Var}(X_1) + 2 \sum_{k=2}^{\infty} \operatorname{Cov}(X_1, X_k).
\]
Note that this result is general: it is not restricted to Markov chains, which can make it
quite useful.
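To see the theorem in action, here is a hedged simulation sketch (not something done in class): a stationary two-state chain on {−1, +1} that flips with probability q = 0.1 at each step has Cov(X_1, X_{1+k}) = (1 − 2q)^k, so the formula above predicts σ² = 1 + 2 · (0.8/0.2) = 9.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_chain(n, q=0.1):
    """Stationary two-state chain on {-1, +1} that flips with probability q."""
    signs = np.where(rng.random(n) < q, -1.0, 1.0)
    signs[0] = rng.choice([-1.0, 1.0])   # uniform start = stationary distribution
    return np.cumprod(signs)             # X_i = X_0 * (product of sign flips)

n, reps = 5_000, 400
scaled_sums = np.array([simulate_chain(n).sum() / np.sqrt(n) for _ in range(reps)])

# Theorem 1: Var(n^{-1/2} sum X_i) -> sigma^2 = 1 + 2 * sum_{k>=1} 0.8**k = 9.
print("empirical variance:", scaled_sums.var())
print("predicted sigma^2 :", 1 + 2 * 0.8 / 0.2)
```

The empirical variance should land near 9 for large n, well above Var(X_1) = 1, which is what the naive i.i.d. central limit theorem would suggest.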
3 Stein’s Method
Consider the following lemma:
Lemma 1. Let Z ∼ N(0, σ²) and f ∈ C¹(R) be such that E[|f′(Z)|] < ∞ and E[|Z · f(Z)|] < ∞. Then
\[
E[Z \, f(Z)] = \sigma^2 \, E[f'(Z)].
\]
Proof. We tackle this with integration by parts. For the sake of simplicity, we assume that
σ 2 = 1. Then,
\begin{align*}
E[Z f(Z)] &= \int_{-\infty}^{\infty} t \, f(t) \, \frac{e^{-t^2/2}}{\sqrt{2\pi}} \, dt \\
&= \frac{1}{\sqrt{2\pi}} \Bigl[ -f(t) \, e^{-t^2/2} \Bigr]_{-\infty}^{\infty} + \frac{1}{\sqrt{2\pi}} \int_{-\infty}^{\infty} f'(t) \, e^{-t^2/2} \, dt \\
&= E[f'(Z)].
\end{align*}
The result follows in the assumed case σ² = 1; the general case follows similarly by keeping
track of σ.
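As a quick sanity check of Lemma 1 (my own illustration, not part of the lecture), one can compare both sides of E[Z f(Z)] = σ² E[f′(Z)] by Monte Carlo for an arbitrary smooth test function, here f(z) = sin(z).

```python
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(1_000_000)   # Z ~ N(0, 1), so sigma^2 = 1

f, f_prime = np.sin, np.cos          # arbitrary smooth test function

lhs = np.mean(z * f(z))              # estimates E[Z f(Z)]
rhs = np.mean(f_prime(z))            # estimates sigma^2 * E[f'(Z)]
print(f"E[Z f(Z)] ~ {lhs:.4f}")
print(f"E[f'(Z)]  ~ {rhs:.4f}")
```

Both estimates should be close to E[cos(Z)] = e^{−1/2} ≈ 0.607.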
As a follow-up, define
\[
f_x(\omega) = e^{\omega^2/2} \int_{\omega}^{\infty} e^{-t^2/2} \bigl( I[t \le x] - \Phi(x) \bigr) \, dt,
\]
so that f_x ∈ C¹(R) and
\[
f_x'(\omega) = \omega e^{\omega^2/2} \int_{\omega}^{\infty} e^{-t^2/2} \bigl( I[t \le x] - \Phi(x) \bigr) \, dt - e^{\omega^2/2} e^{-\omega^2/2} \bigl( I[\omega \le x] - \Phi(x) \bigr) = \omega f_x(\omega) + \Phi(x) - I[\omega \le x].
\]
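Finally, here is a hedged numerical sketch (assuming SciPy is available; the test points x and ω are arbitrary choices) that the f_x above does satisfy f_x′(ω) − ω f_x(ω) = Φ(x) − I[ω ≤ x], using quadrature for the integral and a central finite difference for the derivative.

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

def f_x(w, x):
    """f_x(w) = e^{w^2/2} * int_w^inf e^{-t^2/2} * (I[t <= x] - Phi(x)) dt."""
    def integrand(t):
        return np.exp(-t**2 / 2) * (float(t <= x) - norm.cdf(x))
    # The integrand jumps at t = x, so split the integral there.
    part1, _ = quad(integrand, w, x)
    part2, _ = quad(integrand, x, np.inf)
    return np.exp(w**2 / 2) * (part1 + part2)

x, w, h = 0.7, -0.3, 1e-4                            # arbitrary points, w != x
deriv = (f_x(w + h, x) - f_x(w - h, x)) / (2 * h)    # central difference
print(f"f_x'(w) - w*f_x(w) = {deriv - w * f_x(w, x):.4f}")
print(f"Phi(x) - I[w <= x] = {norm.cdf(x) - float(w <= x):.4f}")
```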