Markov Chains
4.1 Introduction
Markov chain:
A discrete-time process {Xn, n = 0, 1, 2, . . .} with discrete state space {0, 1, 2, . . .} is a Markov chain if it has the Markov property:

P{Xn+1 = j | Xn = i, Xn−1 = in−1, . . . , X0 = i0} = P{Xn+1 = j | Xn = i}.

In words, for a Markov chain, the conditional distribution of any future state Xn+1, given the past states X0, X1, . . . , Xn−1 and the present state Xn, is independent of the past states and depends only on the present state.
We consider homogeneous Markov chains, for which

P{Xn+1 = j | Xn = i} = P{X1 = j | X0 = i} for all n ≥ 0.

Define

Pij = P{Xn+1 = j | Xn = i}.

The value Pij represents the probability that the process will, when in state i, next make a transition into state j. Since probabilities are nonnegative and since the process must make a transition into some state, we have that

Pij ≥ 0 for all i, j, and Σj Pij = 1 for all i.
Let P = [Pij] denote the (possibly infinite) matrix of the one-step transition probabilities, so that

P = [ P00  P01  P02  . . . ]
    [ P10  P11  P12  . . . ]
    [  .    .    .         ]

The n-step transition probability matrix is P(n) = [P(n)ij], where

P(n)ij = P{Xn = j | X0 = i}.
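As a quick illustration (a sketch, not from the slides), an n-step transition probability can be estimated by simulating the chain and compared against the nth matrix power; the two-state chain below is an arbitrary choice:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state chain (not from the slides): P[i, j] = P{X_{n+1} = j | X_n = i}
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Estimate P(n)_{0,1} = P{X_n = 1 | X_0 = 0} by simulating the chain
n, trials = 3, 50_000
count = 0
for _ in range(trials):
    x = 0                          # start in state 0
    for _ in range(n):
        x = rng.choice(2, p=P[x])  # one step of the chain
    count += (x == 1)

estimate = count / trials
exact = np.linalg.matrix_power(P, n)[0, 1]  # n-step matrix entry
print(estimate, exact)
```

The simulation estimate should agree with the matrix-power value up to Monte Carlo error.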
Example 4.1.1 (A gambling model)
A gambler either wins 1 EUR with probability p or loses 1 EUR with probability 1 − p. The gambler quits if he either goes broke or attains a fortune of N EUR. Then the gambler's fortune is a Markov chain with state space {0, 1, . . . , N} having transition probabilities

Pi,i+1 = p = 1 − Pi,i−1, i = 1, 2, . . . , N − 1,
P00 = PNN = 1.

Such a Markov chain is a random walk with barriers (states 0 and N). States 0 and N are called absorbing states since once entered they are never left.
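A sketch of this chain's transition matrix in code (the choice N = 4, p = 0.5 below is arbitrary):

```python
import numpy as np

def gambler_matrix(N, p):
    """Transition matrix of the gambling chain on states 0..N.
    States 0 and N are absorbing; from 1..N-1 the fortune moves
    up with probability p and down with probability 1 - p."""
    P = np.zeros((N + 1, N + 1))
    P[0, 0] = 1.0          # broke: absorbing
    P[N, N] = 1.0          # fortune N: absorbing
    for i in range(1, N):
        P[i, i + 1] = p
        P[i, i - 1] = 1 - p
    return P

P = gambler_matrix(4, 0.5)
print(P)
```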
Example 4.1.2 (The simple random walk)
The Markov chain whose state space is the set of all integers and whose transition probabilities are

Pi,i+1 = p = 1 − Pi,i−1, i = 0, ±1, ±2, . . . ,

where 0 < p < 1, is called the simple random walk. Write Sn for the position at time n and q = 1 − p. Then {|Sn|, n ≥ 1} is also a Markov chain, which follows from the proposition below.

Proposition.
P{Sn = i | |Sn| = i, |Sn−1|, . . . , |S1|} = p^i / (p^i + q^i).

• From the proposition, it follows upon conditioning on whether Sn = +i or −i that

P{|Sn+1| = i + 1 | |Sn| = i, |Sn−1|, . . . , |S1|}
= p · p^i/(p^i + q^i) + q · q^i/(p^i + q^i)
= (p^{i+1} + q^{i+1}) / (p^i + q^i),

which depends on the past only through |Sn| = i, so {|Sn|} is a Markov chain.
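A Monte Carlo check of this conditioning step (a sketch; for the simple random walk, P{|Sn+1| = i + 1 | |Sn| = i} = (p^{i+1} + q^{i+1})/(p^i + q^i) with q = 1 − p, and the parameters p, n, i below are arbitrary choices):

```python
import random

random.seed(1)
p, n, i = 0.6, 20, 2   # arbitrary step probability, horizon, and level
q = 1 - p

hits = ups = 0
for _ in range(100_000):
    s = 0
    for _ in range(n):                       # simulate S_n
        s += 1 if random.random() < p else -1
    step = 1 if random.random() < p else -1  # the (n+1)st step
    if abs(s) == i:                          # condition on |S_n| = i
        hits += 1
        if abs(s + step) == i + 1:           # did |S| move up?
            ups += 1

estimate = ups / hits
theory = (p**(i + 1) + q**(i + 1)) / (p**i + q**i)
print(estimate, theory)  # the two values should be close
```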
4.2 Chapman-Kolmogorov Equations
The n-step transition probabilities satisfy the Chapman-Kolmogorov equations:

P(n+m)ij = Σk P(n)ik P(m)kj for all n, m ≥ 0 and all i, j,

which follow by conditioning on the state at time n. In matrix form, P(n+m) = P(n) · P(m), and hence the n-step transition matrix is the nth matrix power of the one-step matrix: P(n) = P^n.
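A numerical sanity check of the matrix form of the Chapman-Kolmogorov equations (a sketch with an arbitrary two-state chain):

```python
import numpy as np

P = np.array([[0.9, 0.1],    # arbitrary two-state transition matrix
              [0.5, 0.5]])

P2 = np.linalg.matrix_power(P, 2)   # P(2)
P3 = np.linalg.matrix_power(P, 3)   # P(3)
P5 = np.linalg.matrix_power(P, 5)   # P(5)

# Chapman-Kolmogorov in matrix form: P(2+3) = P(2) . P(3)
print(np.allclose(P5, P2 @ P3))
```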
Example 4.2.1 (Forecasting the weather)
Suppose that the chance of rain tomorrow depends on previous conditions only through whether or not it is raining today.
• If it rains today, it will rain tomorrow with probability α.
• If it does not rain today, it will rain tomorrow with probability β.
The process is in state 0 if it rains and in state 1 if it does not rain. Then we have a two-state Markov chain whose transition probabilities are given by

P = [ α  1−α ]
    [ β  1−β ]
Example 4.2.1 (Four-day weather forecast)
Still consider the example: state 0: rain, state 1: no rain. Take α = 0.7, β = 0.4. Given that it is raining today, what is the chance that it will rain 4 days from today?
Solution: The desired probability is P(4)00, the (0, 0) entry of the four-step matrix P^4. With

P = [ 0.7  0.3 ]
    [ 0.4  0.6 ]

we get

P^2 = [ 0.61  0.39 ]     P^4 = (P^2)^2 = [ 0.5749  0.4251 ]
      [ 0.52  0.48 ],                    [ 0.5668  0.4332 ],

so the probability that it will rain four days from today is P(4)00 = 0.5749.
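The same computation in code (a sketch using numpy):

```python
import numpy as np

alpha, beta = 0.7, 0.4
P = np.array([[alpha, 1 - alpha],   # state 0: rain
              [beta,  1 - beta]])   # state 1: no rain

# Four-step transition matrix; entry (0, 0) is P{rain in 4 days | rain today}
P4 = np.linalg.matrix_power(P, 4)
print(round(P4[0, 0], 4))  # -> 0.5749
```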
Example 4.2.2 An urn always contains 2 balls. Ball colors are red and blue. At each stage a ball is randomly chosen and then replaced by a new ball, which with probability 0.8 is the same color, and with probability 0.2 is the opposite color, as the ball it replaces. If initially both balls are red, find the probability that the fifth ball selected is red.
Solution: Let us define Xn to be the number of red balls in the urn after the nth selection and subsequent replacement. Then Xn, n ≥ 0, is a Markov chain with states 0, 1, 2 and with transition probability matrix P given by

P = [ 0.8  0.2  0   ]
    [ 0.1  0.8  0.1 ]
    [ 0    0.2  0.8 ]

For instance, from state 1 the chosen ball is red with probability 1/2 and is then replaced by a blue ball with probability 0.2, so P10 = (1/2)(0.2) = 0.1.
Example 4.2.2 (cont.)
To determine the probability that the fifth selection is red, condition on the number of red balls in the urn after the fourth selection. This yields

P{fifth selection is red} = Σi P{fifth is red | X4 = i} P{X4 = i | X0 = 2} = Σi (i/2) P(4)2,i.

To calculate the preceding we compute P^4. Doing so yields

P^4 = [ 0.4872  0.4352  0.0776 ]
      [ 0.2176  0.5648  0.2176 ]
      [ 0.0776  0.4352  0.4872 ]

giving the answer P(fifth selection is red) = (1/2)(0.4352) + 0.4872 = 0.7048.
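The conditioning argument can be checked in code (a sketch using numpy):

```python
import numpy as np

P = np.array([[0.8, 0.2, 0.0],
              [0.1, 0.8, 0.1],
              [0.0, 0.2, 0.8]])

P4 = np.linalg.matrix_power(P, 4)

# Condition on X4, the number of red balls after the fourth replacement;
# a selection is red with probability i/2 when the urn holds i red balls.
prob_red = sum((i / 2) * P4[2, i] for i in range(3))
print(round(prob_red, 4))  # -> 0.7048
```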
Another weather example
Solution: P(n)ij is the probability that the state at time n is j given that the initial state at time 0 is i. The R code below sums the probabilities of all three-step paths through states 2 and 3, i.e. it computes the probability that the chain, starting in state 3, stays out of state 1 for the next three steps:

p = matrix(c(0.4, 0.2, 0.1, 0.6, 0.5, 0.7, 0, 0.3, 0.2), nrow = 3)  # transition matrix (R fills column by column)
pp = 0
for (i in 2:3) {
  for (j in 2:3) {
    for (k in 2:3) {
      pp = pp + p[3, i] * p[i, j] * p[j, k]  # path 3 -> i -> j -> k
    }
  }
}
pp
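The same quantity can be cross-checked (a sketch in Python) by restricting the chain to states 2 and 3 and cubing the resulting submatrix:

```python
import numpy as np

# Same transition matrix as the R code above (R fills matrices column by column)
P = np.array([[0.4, 0.6, 0.0],
              [0.2, 0.5, 0.3],
              [0.1, 0.7, 0.2]])

# Probability of staying in states 2 and 3 for three steps, starting from state 3
# (0-based indices: state 3 is row 2, states {2, 3} are indices {1, 2}):
pp = sum(P[2, i] * P[i, j] * P[j, k]
         for i in (1, 2) for j in (1, 2) for k in (1, 2))

# Equivalent: cube the submatrix Q = P[{2,3}, {2,3}] and sum the row for state 3
Q = P[1:, 1:]
pp_sub = np.linalg.matrix_power(Q, 3)[1].sum()

print(round(pp, 3), round(pp_sub, 3))  # both -> 0.617
```

Summing over restricted paths and powering the restricted submatrix are the same computation, which is a convenient way to validate the triple loop.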