Lecture 19 Compressed
Logistics
Summary
Markov Chain Review
Gambler’s Ruin
• You and ‘a friend’ repeatedly toss a fair coin. If it comes up heads, you
give your friend $1; if it comes up tails, they give you $1.
• You start with $ℓ₁ and your friend starts with $ℓ₂. When either of you
runs out of money, the game terminates.
• What is the probability that you win $ℓ₂?
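This setup is easy to check empirically. Below is a minimal Monte Carlo sketch (not from the slides; the names gamblers_ruin_win_prob, ell1, ell2, and n_trials are illustrative) that estimates the chance you take all of your friend’s $ℓ₂ before going broke; for a fair coin the estimate should come out close to the standard answer ℓ₁/(ℓ₁ + ℓ₂).

```python
import random

def gamblers_ruin_win_prob(ell1, ell2, n_trials=100_000, p_heads=0.5):
    """Estimate the probability that you win $ell2 (take all of your
    friend's money) before going broke, by direct simulation."""
    wins = 0
    for _ in range(n_trials):
        fortune = ell1
        while 0 < fortune < ell1 + ell2:
            if random.random() < p_heads:
                fortune -= 1   # heads: you pay your friend $1
            else:
                fortune += 1   # tails: your friend pays you $1
        wins += (fortune == ell1 + ell2)
    return wins / n_trials

# For a fair coin this should be close to ell1 / (ell1 + ell2).
print(gamblers_ruin_win_prob(ell1=3, ell2=7))
```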
Gambler’s Ruin Markov Chain
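Presumably this slide draws the chain whose states 0, 1, …, ℓ₁ + ℓ₂ track your current fortune, with 0 and ℓ₁ + ℓ₂ absorbing. A minimal sketch of that transition matrix, under that assumed convention (gamblers_ruin_chain and p_heads are illustrative names):

```python
import numpy as np

def gamblers_ruin_chain(ell1, ell2, p_heads=0.5):
    """Transition matrix on states 0..ell1+ell2 (your current fortune).
    States 0 and ell1+ell2 are absorbing; from any interior state you
    move down with probability p_heads and up with probability 1 - p_heads."""
    n = ell1 + ell2 + 1
    P = np.zeros((n, n))
    P[0, 0] = 1.0                    # you are broke: game over
    P[n - 1, n - 1] = 1.0            # you hold everything: game over
    for s in range(1, n - 1):
        P[s, s - 1] = p_heads        # heads: give your friend $1
        P[s, s + 1] = 1 - p_heads    # tails: receive $1
    return P

P = gamblers_ruin_chain(ell1=3, ell2=7)
assert np.allclose(P.sum(axis=1), 1.0)   # each row is a distribution
```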
Gambler’s Ruin Thought Exercise
What if you always walk away as soon as you win just $1? Then what is
your probability of winning, and what are your expected winnings?
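One way to explore this is to simulate the stop-when-up-$1 strategy directly. The sketch below (illustrative names, fair coin assumed) estimates both the probability of walking away a winner and the average net winnings.

```python
import random

def stop_when_up_one(ell1, n_trials=200_000):
    """Play the fair game but walk away as soon as you are up $1
    (or when you go broke). Returns (win frequency, average net winnings)."""
    wins, total_net = 0, 0
    for _ in range(n_trials):
        fortune = ell1
        while 0 < fortune < ell1 + 1:          # stop at 0 or at ell1 + 1
            fortune += 1 if random.random() < 0.5 else -1
        wins += (fortune == ell1 + 1)
        total_net += fortune - ell1            # either +1 or -ell1
    return wins / n_trials, total_net / n_trials

print(stop_when_up_one(ell1=10))
```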
Stationary Distributions
Claim (Existence of Stationary Distribution)
Any Markov chain with a finite state space and transition
matrix P ∈ [0, 1]^{m×m} has a stationary distribution π ∈ [0, 1]^m
with π = πP.
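Numerically, such a π can be found by solving π = πP, i.e. by taking a left eigenvector of P for eigenvalue 1 and normalizing it. A minimal numpy sketch (the 2-state example matrix is illustrative):

```python
import numpy as np

def stationary_distribution(P):
    """Return pi with pi = pi P, computed as an eigenvector of P.T
    for the eigenvalue closest to 1, normalized to sum to 1."""
    eigvals, eigvecs = np.linalg.eig(P.T)
    k = np.argmin(np.abs(eigvals - 1.0))
    pi = np.real(eigvecs[:, k])
    return pi / pi.sum()

P = np.array([[0.9, 0.1],
              [0.5, 0.5]])
pi = stationary_distribution(P)
print(pi, pi @ P)    # pi and pi P should match
```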
Periodicity
Claim
If a Markov chain is irreducible and has at least one
self-loop, then it is aperiodic.
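Intuition: a state with a self-loop can return to itself in one step, so its period is 1, and irreducibility forces every state to share the same period. For an irreducible chain, that common period can be computed from BFS levels d(·) as the gcd of d(u) + 1 − d(v) over edges u → v. A sketch (the adjacency-list examples are illustrative):

```python
from collections import deque
from math import gcd

def period(adj):
    """Common period of an irreducible chain given as an adjacency list
    {state: [successor states]}: gcd over edges u -> v of |d(u)+1-d(v)|,
    where d is the BFS distance from an arbitrary root."""
    root = next(iter(adj))
    dist = {root: 0}
    queue = deque([root])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    g = 0
    for u in adj:
        for v in adj[u]:
            g = gcd(g, abs(dist[u] + 1 - dist[v]))
    return g

cycle = {0: [1], 1: [2], 2: [0]}            # 3-cycle: period 3
with_loop = {0: [0, 1], 1: [2], 2: [0]}     # add a self-loop: period 1
print(period(cycle), period(with_loop))
```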
Fundamental Theorem
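Assuming this slide states the usual fundamental theorem (an irreducible, aperiodic, finite-state chain has a unique stationary distribution π, and the distribution of the chain converges to π from any starting distribution), the convergence is easy to watch numerically; the 3-state matrix below is illustrative.

```python
import numpy as np

# Illustrative irreducible, aperiodic chain (self-loops ensure aperiodicity).
P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.5, 0.3],
              [0.0, 0.4, 0.6]])

mu = np.array([1.0, 0.0, 0.0])   # start deterministically in state 0
for t in range(50):
    mu = mu @ P                  # distribution after one more step
print(mu)                                  # approximately stationary
print(np.linalg.matrix_power(P, 50)[0])    # same limit, via rows of P^t
```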
Stationary Distribution Example 2
For the simple random walk on an undirected graph G = (V, E), take π(i) = d_i / (2|E|), where d_i is the degree of vertex i. Then for every vertex i,

$$\pi P_{:,i} = \sum_j \pi(j)\, P_{j,i} = \sum_{j:\{j,i\}\in E} \frac{d_j}{2|E|}\cdot\frac{1}{d_j} = \sum_{j:\{j,i\}\in E} \frac{1}{2|E|} = \frac{d_i}{2|E|} = \pi(i).$$
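This calculation (for the walk where P_{j,i} = 1/d_j when j and i are neighbors, and 0 otherwise) can be sanity-checked numerically; the 4-vertex example graph below is illustrative.

```python
import numpy as np

# Undirected example graph on 4 vertices, given as an edge list.
edges = [(0, 1), (0, 2), (1, 2), (2, 3)]
n = 4

# Transition matrix of the simple random walk: from j, move to a
# uniformly random neighbor of j.
A = np.zeros((n, n))
for u, v in edges:
    A[u, v] = A[v, u] = 1.0
deg = A.sum(axis=1)
P = A / deg[:, None]

pi = deg / (2 * len(edges))      # claimed stationary distribution d_i / (2|E|)
print(np.allclose(pi @ P, pi))   # True: pi is indeed stationary
```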