
18.445 Introduction to Stochastic Processes


Lecture 13: Countable state space chains 2

Hao Wu

MIT

1 April 2015



Recall. Suppose that P is irreducible.
The Markov chain is recurrent if and only if

$P_x[\tau_x^+ < \infty] = 1$, for some $x$.

The Markov chain is positive recurrent if and only if

$E_x[\tau_x^+] < \infty$, for some $x$.
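As a numerical aside (not part of the original slides): the sketch below estimates $E_x[\tau_x^+]$ by simulation for a small irreducible chain. The 3-state transition matrix and the helper `sample_return_time` are illustrative assumptions, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative irreducible chain on {0, 1, 2} (not from the lecture).
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

def sample_return_time(x, P, rng):
    """Sample tau_x^+ = min{n >= 1 : X_n = x}, starting from X_0 = x."""
    state, n = x, 0
    while True:
        state = rng.choice(len(P), p=P[state])
        n += 1
        if state == x:
            return n

# Monte Carlo estimate of E_0[tau_0^+]; a finite answer is what positive
# recurrence predicts (here the chain is finite, hence positive recurrent).
samples = [sample_return_time(0, P, rng) for _ in range(10_000)]
print("estimated E_0[tau_0^+] ~", np.mean(samples))
```

For this particular matrix the estimate should come out close to 4, which matches $1/\pi(0)$ computed on the next slide.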

Today’s Goal
stationary distribution
convergence to stationary distribution



Stationary distribution

Theorem
An irreducible Markov chain is positive recurrent if and only if there
exists a probability measure π on Ω such that π = πP.

Corollary
If an irreducible Markov chain is positive recurrent, then
there exists a probability measure π such that π = πP;
π(x) > 0 for all x. In fact,

$\pi(x) = \dfrac{1}{E_x[\tau_x^+]}$.
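A minimal sketch of the corollary in practice, assuming the same illustrative 3-state matrix as above: solve π = πP together with Σ_x π(x) = 1 as a linear system, then read off the predicted expected return times 1/π(x). (For an irreducible positive recurrent chain this π is the unique stationary distribution.)

```python
import numpy as np

# Same illustrative 3-state chain as above.
P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
k = len(P)

# pi P = pi and sum(pi) = 1, written as an overdetermined linear system.
A = np.vstack([P.T - np.eye(k), np.ones(k)])
b = np.concatenate([np.zeros(k), [1.0]])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print("stationary distribution pi    =", pi)        # ~ (0.25, 0.5, 0.25)
print("expected return times 1/pi(x) =", 1.0 / pi)  # ~ (4, 2, 4)
```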



Convergence to the stationary distribution

Theorem
If an irreducible Markov chain is positive recurrent and aperiodic, then

$\lim_{n\to\infty} P_x[X_n = y] = \pi(y) > 0$, for all $x, y$.
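One way to watch this convergence numerically (again with the illustrative chain from above): $P^n(x, \cdot)$ is the distribution of $X_n$ started from $x$, so the rows of $P^n$ should approach π as n grows.

```python
import numpy as np

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])

# Row x = 0 of P^n is the law of X_n given X_0 = 0.
for n in (1, 2, 5, 10, 20, 50):
    Pn = np.linalg.matrix_power(P, n)
    print(n, Pn[0])
# The rows settle down to pi ~ (0.25, 0.5, 0.25), as the theorem predicts
# for an irreducible, aperiodic, positive recurrent chain.
```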

Theorem
If an irreducible Markov chain is null recurrent, then

$\lim_{n\to\infty} P_x[X_n = y] = 0$, for all $x, y$.
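A standard example of a null recurrent chain (not discussed on this slide) is the simple symmetric random walk on Z. Using the classical formula $P_0[X_{2n} = 0] = \binom{2n}{n}/4^n$, the return probabilities indeed tend to 0, roughly like $1/\sqrt{\pi n}$:

```python
import math

# Simple symmetric random walk on Z: recurrent but null recurrent.
for n in (1, 10, 100, 1000, 10000):
    p = math.comb(2 * n, n) / 4 ** n       # exact P_0[X_{2n} = 0]
    approx = 1.0 / math.sqrt(math.pi * n)  # Stirling approximation
    print(f"n = {2 * n:6d}   P_0[X_n = 0] = {p:.5f}   ~ {approx:.5f}")
```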



Convergence to the stationary distribution

Recall. Consider an irreducible Markov chain with countable state space Ω and
transition matrix P. For each x ∈ Ω, define

$T(x) = \{n \ge 1 : P^n(x, x) > 0\}$.

Then
$\gcd(T(x)) = \gcd(T(y))$, for all $x, y$.
We say the chain is aperiodic if $\gcd(T(x)) = 1$.
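As a small sketch (the two chains and the helper `period` are illustrative, and T(x) is truncated at a finite horizon N), one can compute gcd(T(x)) directly from powers of P:

```python
import math
from functools import reduce
import numpy as np

def period(P, x, N=50):
    """gcd of the truncated set T(x) = {1 <= n <= N : P^n(x, x) > 0}."""
    times = []
    Pn = np.eye(len(P))
    for n in range(1, N + 1):
        Pn = Pn @ P
        if Pn[x, x] > 0:
            times.append(n)
    return reduce(math.gcd, times) if times else 0

cycle = np.array([[0.0, 1.0],   # deterministic 2-cycle: period 2
                  [1.0, 0.0]])
lazy = np.array([[0.5, 0.5],    # self-loop at state 0 makes it aperiodic
                 [1.0, 0.0]])

print(period(cycle, 0), period(lazy, 0))  # expect 2 and 1
```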
Theorem
Suppose that the Markov chain is irreducible and aperiodic. If the chain
is positive recurrent, then

$\lim_{n\to\infty} \| P^n(x, \cdot) - \pi \|_{TV} = 0$.
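The same statement can be checked numerically for the illustrative chain used earlier, using the identity $\|\mu - \nu\|_{TV} = \frac{1}{2} \sum_y |\mu(y) - \nu(y)|$ on a countable state space:

```python
import numpy as np

P = np.array([[0.50, 0.50, 0.00],
              [0.25, 0.50, 0.25],
              [0.00, 0.50, 0.50]])
pi = np.array([0.25, 0.50, 0.25])  # stationary distribution of this chain

# ||P^n(0, .) - pi||_TV = (1/2) * sum_y |P^n(0, y) - pi(y)|
for n in (1, 2, 5, 10, 20, 40):
    Pn = np.linalg.matrix_power(P, n)
    print(n, 0.5 * np.abs(Pn[0] - pi).sum())
# The distance decreases to 0, as the theorem states for an irreducible,
# aperiodic, positive recurrent chain.
```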



MIT OpenCourseWare
http://ocw.mit.edu

18.445 Introduction to Stochastic Processes


Spring 2015

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
