Week 5
2. Proposition
i) If y is transient, then Py (Xn = y i.o.) = 0.
ii) If y is recurrent, then Py (Xn = y i.o.) = 1.
3. Definition
We say that x communicates with y, and write x → y, if ρxy = Px (Ty < ∞) > 0.
4. Theorem
If ρxy > 0 and ρyx < 1, then x is transient.
5. Corollary
If x is recurrent and ρxy > 0, then ρyx = 1.
6. Definition
A set A is closed if it is impossible to get out of it, i.e. p(i, j) = 0 for all i ∈ A and j ∉ A.
7. Definition
A set B is irreducible if ∀i, j ∈ B, i → j.
8. Theorem
If C is a finite closed irreducible set, then all states in C are recurrent.
9. Theorem
If the state space S is finite, then S can be written as a disjoint union
S = T ∪ R1 ∪ · · · ∪ Rk ,
where T is a set of transient states and Ri , 1 ≤ i ≤ k, are closed irreducible sets of recurrent states.
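To make the decomposition concrete, here is a minimal Python sketch (numpy assumed; the function name classify_states and the toy matrix are illustrative choices, not from the notes). It computes which states are reachable from which, groups states into communicating classes, and marks a class recurrent exactly when it is closed, which by Theorem 8 is the correct criterion for a finite chain.

```python
import numpy as np

def classify_states(P):
    """Split a finite chain into transient classes and closed irreducible
    (hence recurrent) classes, as in Theorem 9."""
    n = P.shape[0]
    step = (P > 0).astype(int)
    reach = np.eye(n, dtype=int)              # paths of length 0
    for _ in range(n):                        # extend reachability one step at a time
        reach = np.minimum(1, reach + reach @ step)
    reach = reach.astype(bool)                # reach[i, j]: i -> j in some number of steps
    classes, seen = [], set()
    for i in range(n):
        if i not in seen:
            cls = frozenset(j for j in range(n) if reach[i, j] and reach[j, i])
            seen |= cls
            classes.append(cls)
    # a class is recurrent iff it is closed: no state outside it is reachable
    recurrent = [c for c in classes
                 if not any(reach[i, j] for i in c for j in range(n) if j not in c)]
    transient = [c for c in classes if c not in recurrent]
    return transient, recurrent

# toy matrix (an assumption for illustration): {0, 1} is closed, state 2 leaks into it
P = np.array([[0.5, 0.5, 0.0],
              [0.3, 0.7, 0.0],
              [0.2, 0.3, 0.5]])
print(classify_states(P))   # -> ([frozenset({2})], [frozenset({0, 1})])
```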
10. Lemma
Let
N (y) = # of visits to y at times n ≥ 1.
Then
Ex N (y) = ρxy / (1 − ρyy).
(Intuition: starting from x, the chain reaches y with probability ρxy, and each visit to y is followed by a further visit with probability ρyy, so P (N (y) ≥ k) = ρxy ρyy^{k−1}.)
11. Lemma
Ex N (y) = ∑_{n=1}^{∞} p^n (x, y).
12. Theorem
y is recurrent iff
∑_{n=1}^{∞} p^n (y, y) = Ey N (y) = ∞.
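As a numerical illustration of Theorem 12 (suggestive only, since divergence can never be read off from finitely many terms), the following sketch sums p^n(y, y) for two assumed two-state chains: an irreducible one, where every state is recurrent and the partial sums keep growing, and one with an absorbing state, where state 0 is transient, ρ00 = 0.5, and the series converges to ρ00/(1 − ρ00) = 1, matching Lemma 10.

```python
import numpy as np

def partial_sum_returns(P, y, N=2000):
    """Partial sum of p^n(y, y) for n = 1..N; by Theorem 12 the full
    series diverges iff y is recurrent."""
    total, Pn = 0.0, np.eye(P.shape[0])
    for _ in range(N):
        Pn = Pn @ P                    # after n multiplications Pn[i, j] = p^n(i, j)
        total += Pn[y, y]
    return total

P_rec = np.array([[0.5, 0.5],          # irreducible: both states recurrent
                  [0.5, 0.5]])
P_tra = np.array([[0.5, 0.5],          # state 1 absorbing, so state 0 is transient
                  [0.0, 1.0]])
print(partial_sum_returns(P_rec, 0))   # 1000.0: grows like N/2, unbounded in N
print(partial_sum_returns(P_tra, 0))   # ~1.0: converges, E_0 N(0) = ρ00/(1-ρ00) = 1
```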
13. Lemma
If x is recurrent and x → y, then y is recurrent.
14. Lemma
In a finite closed set there has to be at least one recurrent state.
15. Definition
A Markov chain Xn on a finite state space S = {1, . . . , N} is time-homogeneous if
P (Xn+1 = j | Xn = i) = P (X1 = j | X0 = i)
for all n, i and j. The transition matrix P = (pij) is the N × N matrix of transition probabilities
pij = P (Xn+1 = j | Xn = i)
16. Theorem
Suppose Xn is a Markov chain on S with transition probabilities pij and initial distribution αi =
P {X0 = i}. Then, for any i0 , . . . , in ∈ S and n ⩾ 0,

P (X0 = i0, X1 = i1, . . . , Xn = in) = αi0 pi0,i1 · · · pin−1,in.
Proof. Proceeding by induction, this statement is obvious for n = 0. Now, assume it is true for
some n, and let An = {X0 = i0 , . . . , Xn = in }. Then the statement is true for n + 1, since
P (An+1) = P (An) P {Xn+1 = in+1 | An} = αi0 pi0,i1 · · · pin−1,in pin,in+1,
where the last equality uses the Markov property together with the induction hypothesis.
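As a quick numeric check of this product formula, the sketch below uses an assumed two-state chain, initial distribution, and path (none of which come from the notes) and compares the exact probability αi0 pi0,i1 pi1,i2 pi2,i3 with a Monte Carlo estimate.

```python
import numpy as np

rng = np.random.default_rng(0)
P = np.array([[0.7, 0.3],          # assumed transition matrix
              [0.4, 0.6]])
alpha = np.array([0.5, 0.5])       # assumed initial distribution
path = [0, 1, 1, 0]                # fixed path i0, i1, i2, i3

# exact probability from the theorem: alpha_{i0} * p_{i0,i1} * p_{i1,i2} * p_{i2,i3}
exact = alpha[path[0]] * np.prod([P[a, b] for a, b in zip(path, path[1:])])

# Monte Carlo estimate of P(X0 = i0, ..., X3 = i3)
hits, trials = 0, 200_000
for _ in range(trials):
    x = [rng.choice(2, p=alpha)]
    for _ in range(3):
        x.append(rng.choice(2, p=P[x[-1]]))
    hits += (x == path)
print(exact, hits / trials)        # 0.036 vs. an estimate close to it
```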
17. Consider a Markov chain with transition probabilities Pij. Let A denote a set of states, and
assume that we want to find the probability that this Markov chain enters any state of A by time m.
That is to say, for a given state i ∉ A, we want to determine

β = P (Xk ∈ A for some k = 1, · · · , m | X0 = i).
To determine this probability, we define a new Markov chain {Wn} whose state space consists of the
states not in A together with one additional state, referred to as state A in what follows. Once the
chain {Wn} enters state A, it remains there forever.
The new chain is defined as follows. Let Xn denote the original Markov chain, with transition
probabilities Pi,j, and define

N = min {n : Xn ∈ A},

with N = ∞ if Xn ∉ A for all n. In short, N is the first time the chain enters the set A. Now define

Wn = Xn, if n < N,
Wn = A, if n ⩾ N.
So, until the original chain {Xn} enters a state of A, the process {Wn} coincides with it; from that
moment on, {Wn} stays in state A forever. From this description, we deduce that {Wn, n ≥ 0} is a
Markov chain with states i (i ∉ A) and A, whose transition probabilities Qi,j are

Qi,j = Pi,j, if i ∉ A, j ∉ A,
Qi,A = ∑_{j∈A} Pi,j, if i ∉ A,
QA,A = 1.

Because the original Markov chain enters a state of A by time m if and only if the state of the new
Markov chain at time m is A, we see that

β = P (Wm = A | W0 = i) = Q^m_{i,A}.

That is to say, the required probability equals an m-step transition probability of the new chain.
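A Python sketch of this construction (the chain, the set A, and the function name are assumptions for illustration): build Q on the states outside A plus one absorbing state, raise it to the m-th power, and read off β.

```python
import numpy as np

def hit_probability(P, A, i, m):
    """P(X_k in A for some k = 1..m | X_0 = i): collapse A into one
    absorbing state, then read off the m-step transition probability."""
    n = P.shape[0]
    keep = [s for s in range(n) if s not in A]     # states outside A
    Q = np.zeros((len(keep) + 1, len(keep) + 1))   # last index plays the role of state "A"
    for r, s in enumerate(keep):
        for c, t in enumerate(keep):
            Q[r, c] = P[s, t]                      # Q_{i,j} = P_{i,j} for i, j not in A
        Q[r, -1] = P[s, list(A)].sum()             # Q_{i,A} = sum over j in A of P_{i,j}
    Q[-1, -1] = 1.0                                # Q_{A,A} = 1
    Qm = np.linalg.matrix_power(Q, m)
    return Qm[keep.index(i), -1]

# assumed toy chain and target set
P = np.array([[0.2, 0.3, 0.4, 0.1],
              [0.1, 0.2, 0.3, 0.4],
              [0.3, 0.3, 0.2, 0.2],
              [0.25, 0.25, 0.25, 0.25]])
print(hit_probability(P, A={3}, i=0, m=5))
```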
18. Absorbing State. A closed set containing exactly one state is called absorbing.
19. (First Visit Time) Let Tj = min {n ≥ 1 : Xn = j}, i.e. the first time to visit state j, with
convention that Tj = ∞ if the visit never occurs. We have P (Ti = ∞ | X0 = i) > 0 if and only if i
is transient and P (Ti = ∞ | X0 = i) = 0 if and only if i is recurrent. Recurrence of a state does not
guarantee that E [Ti | X0 = i] < ∞: for example, the simple symmetric random walk on Z is recurrent,
yet E [Ti | X0 = i] = ∞ for every state.
Homework
5. A Markov chain {Xn , n ≥ 0} with states 0, 1, 2, has the transition probability matrix
P = ( 1/2  1/3  1/6 )
    (  0   1/3  2/3 )
    ( 1/2   0   1/2 )

If P (X0 = 0) = P (X0 = 1) = 1/4 and P (X0 = 2) = 1/2, find E (X3).
So,
P (X3 = 1) = ∑_{i=0}^{2} P (X3 = 1 | X0 = i) P (X0 = i)
           = 11/54 × 1/4 + 4/27 × 1/4 + 2/9 × 1/2
           = 43/216.
P (X3 = 2) = ∑_{i=0}^{2} P (X3 = 2 | X0 = i) P (X0 = i)
           = 47/108 × 1/4 + 11/27 × 1/4 + 13/36 × 1/2
           = 169/432.
Hence,
E (X3) = 1 × 43/216 + 2 × 169/432 = 53/54.
8. Suppose that coin 1 has probability 0.7 of coming up heads, and coin 2 has probability 0.6 of coming
up heads. If the coin flipped today comes up heads, then we select coin 1 to flip tomorrow, and if it
comes up tails, then we select coin 2 to flip tomorrow. If the coin initially flipped is equally likely to
be coin 1 or coin 2, then what is the probability that the coin flipped on the third day after the initial
flip is coin 1? Suppose that the coin flipped on Monday comes up heads. What is the probability that
the coin flipped on Friday of the same week also comes up heads?
Solution: There are two ways to do this problem: use H and T (results of flips) as the state, or 1
and 2 (coins used) as the state. We use the first. Note that, with states ordered H, T,

p(H, H) = 0.7, p(H, T) = 0.3, p(T, H) = 0.6, p(T, T) = 0.4,

and P (X0 = H) = 0.5 × 0.7 + 0.5 × 0.6 = 0.65, P (X0 = T) = 0.35. The coin flipped on day 3 is coin 1
exactly when the flip on day 2 comes up heads, so

P (X2 = H) = .65 p^2(H, H) + .35 p^2(T, H) = .65 × .67 + .35 × .66 = .6665.
Note that

P^4 = (P^2)^2 = ( 0.67  0.33 ) ( 0.67  0.33 ) = ( 0.6667  0.3333 )
                ( 0.66  0.34 ) ( 0.66  0.34 )   ( 0.6666  0.3334 )

Hence,
P (X5 = H | X1 = H) = P^4(H, H) = 0.6667.
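The same answers in numpy (states ordered H, T):

```python
import numpy as np

P = np.array([[0.7, 0.3],          # today H -> flip coin 1 tomorrow
              [0.6, 0.4]])         # today T -> flip coin 2 tomorrow
alpha = np.array([0.65, 0.35])     # (P(X0 = H), P(X0 = T))

print(alpha @ np.linalg.matrix_power(P, 2))   # [0.6665, 0.3335]: P(X2 = H) = .6665
print(np.linalg.matrix_power(P, 4)[0, 0])     # P^4(H, H) ≈ 0.6667
```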
13. Let P be the transition probability matrix of a Markov chain. Argue that if for some positive
integer r, P^r has all positive entries, then so does P^n, for all integers n ≥ r.
Proof: Note that

P^n_{ij} = ∑_k P^{n−r}_{ik} P^r_{kj}.

Since ∑_k P^{n−r}_{ik} = 1, there exists at least one k0 such that P^{n−r}_{ik0} > 0. Since P^r has all
positive entries, P^r_{k0,j} > 0, and therefore

P^n_{ij} = ∑_k P^{n−r}_{ik} P^r_{kj} ≥ P^{n−r}_{ik0} P^r_{k0,j} > 0.
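A quick sanity check with an assumed matrix that itself has a zero entry but whose square is strictly positive:

```python
import numpy as np

P = np.array([[0.0, 1.0],           # P itself has a zero entry,
              [0.5, 0.5]])          # but P^2 has all entries positive
for n in range(2, 8):
    Pn = np.linalg.matrix_power(P, n)
    print(n, bool((Pn > 0).all()))  # True for every n >= 2, as the argument predicts
```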
14. Specify the classes of the following Markov chains, and determine whether they are transient
or recurrent. 4 matrices (omitted) with the following communication diagrams.
Solution
1) It is clear that all three states communicate with each other and hence have the same status. For a
finite chain there is at least one recurrent state. Therefore, there is exactly one recurrent class R1 = {1, 2, 3}.
2) From the communication diagram we see this chain is irreducible, and hence the finite chain has
only one recurrent class R1 = {1, 2, 3, 4}.
3) From the communication diagram we see that 1 ↔ 3, 4 ↔ 5, 2 → 1 and 2 → 3. Thus R1 = {1, 3}
and R2 = {4, 5} are irreducible closed sets and hence recurrent. As ρ23 > 0 and ρ32 = 0 < 1, we see
that 2 is transient (see the numerical check after item 4).
4) From the communication diagram we see that 5 → 1 ↔ 2 and 4 → 3. R1 = {1, 2} and R2 = {3}
are irreducible closed sets and hence recurrent. Since ρ51 > 0 and ρ15 = 0 < 1, 5 is transient. Similarly,
4 is transient. Thus, T = {4, 5} is the transient class.
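Since the four matrices are omitted, here is a made-up transition matrix realizing the communication diagram of chain 3) (1 ↔ 3, 4 ↔ 5, 2 → 1, 2 → 3); feeding it to the classify_states sketch given after Theorem 9 recovers the same classification.

```python
import numpy as np

# states 1..5 stored as indices 0..4; the probabilities are assumptions,
# chosen only to realize the diagram 1 <-> 3, 4 <-> 5, 2 -> 1, 2 -> 3
P3 = np.array([[0.0, 0.0, 1.0, 0.0, 0.0],   # 1 -> 3
               [0.5, 0.0, 0.5, 0.0, 0.0],   # 2 -> 1 and 2 -> 3
               [1.0, 0.0, 0.0, 0.0, 0.0],   # 3 -> 1
               [0.0, 0.0, 0.0, 0.0, 1.0],   # 4 -> 5
               [0.0, 0.0, 0.0, 1.0, 0.0]])  # 5 -> 4
# classify_states is defined in the sketch after Theorem 9
print(classify_states(P3))   # -> ([frozenset({1})], [frozenset({0, 2}), frozenset({3, 4})]),
                             # i.e. state 2 transient, {1, 3} and {4, 5} recurrent
```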
Supplementary exercises
1. (Equivalent conditions for Markov property) Show that the Markov property is equiv-
alent to each of the following conditions:
1. For all n, m ≥ 1 and all s, x0 , · · · , xn ∈ S
P (Xn+m = s | X1 = x1 , · · · , Xn = xn ) = P (Xn+m = s | Xn = xn )
Proof. For the first statement, we proceed by induction. For m = 1, the statement is just the
definition of a Markov chain, so it automatically holds. Now assume that for m = d we have

P (Xn+d = s | X1 = x1, · · · , Xn = xn) = P (Xn+d = s | Xn = xn).

Then for m = d + 1,

P (Xn+d+1 = s | X1 = x1, · · · , Xn = xn)
= ∑_k P (Xn+d+1 = s, Xn+d = k | X1 = x1, · · · , Xn = xn)
= ∑_k P (Xn+d+1 = s | Xn+d = k, X1 = x1, · · · , Xn = xn) P (Xn+d = k | X1 = x1, · · · , Xn = xn)
= ∑_k P (Xn+d+1 = s | Xn+d = k) P (Xn+d = k | Xn = xn)          (1)
= ∑_k P (Xn+d+1 = s, Xn+d = k | Xn = xn)
= P (Xn+d+1 = s | Xn = xn),
where step (1) uses the Markov property of X for the first factor and the induction hypothesis for the
second, and the next equality uses the Markov property once more. Therefore, by induction, the
statement holds for all m ∈ N+.
For the second claim, we can use the same strategy.
2. (Subchain from a Markov chain) Assume X = {Xn : n ≥ 0} is a Markov chain and let
{nk : k ≥ 0} be an unbounded increasing sequence of positive integers. Define a new stochastic process
Y = {Yk : k ≥ 0} such that Yk = Xnk. Show that Y is a Markov chain. Is Y a time-homogeneous
Markov chain without additional conditions?
Proof. By the definition of a Markov chain, for Y to be a Markov chain we need to show that

P (Yk+1 = s | Y0 = x0, · · · , Yk = xk) = P (Yk+1 = s | Yk = xk),          (2)

i.e. P (Xnk+1 = s | Xn0 = x0, · · · , Xnk = xk) = P (Xnk+1 = s | Xnk = xk) for the increasing sequence of
integers n0 < n1 < · · · < nk < nk+1. Since X is a Markov chain, (2) automatically follows, because it
is just the alternative definition of a Markov chain (the equivalent condition in Exercise 1 above). Y
may not be homogeneous without further restrictions, because n1, · · · , nk, · · · may not be equally spaced.