Chapter 3, Part 1
Markov Chains
4.1 Introduction
Markov chain:
A discrete-time process $\{X_n,\ n = 0, 1, 2, \ldots\}$ with discrete state space $X_n \in \{0, 1, 2, \ldots\}$ is a Markov chain if it has the Markov property:
$$P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_1 = i_1, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i)$$
for all states $i_0, i_1, \ldots, i_{n-1}, i, j$ and all $n \geq 0$.
Define
$$P_{ij} = P(X_{n+1} = j \mid X_n = i).$$
The value $P_{ij}$ represents the probability that the process will, when in state $i$, next make a transition into state $j$. Since probabilities are nonnegative and since the process must make a transition into some state, we have
$$P_{ij} \geq 0, \quad i, j \geq 0; \qquad \sum_{j=0}^{\infty} P_{ij} = 1, \quad i = 0, 1, 2, \ldots$$
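For illustration, a transition matrix can be stored in R as an ordinary matrix whose rows sum to 1, and one step of the chain is simulated by drawing the next state from the row of the current state. The 3-state matrix below is a made-up example, not one from these slides.

# Hypothetical 3-state transition matrix; rows are the current state, columns the next state
P <- matrix(c(0.5, 0.4, 0.1,
              0.3, 0.4, 0.3,
              0.2, 0.3, 0.5),
            nrow = 3, byrow = TRUE)
rowSums(P)                       # each row sums to 1: the process must move to some state

# One step of the chain: from state i, the next state j is drawn with probability P[i, j]
next_state <- function(i) sample(1:3, 1, prob = P[i, ])
set.seed(1)
next_state(1)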
Such a Markov chain is a random walk with barriers (states 0 and N). States 0 and N are called absorbing states since, once entered, they are never left.
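As a rough illustration (the parameters below are assumed, not taken from the slides), the following R sketch simulates such a walk on {0, 1, ..., N}, stepping up with probability p and down with probability 1 - p, and stopping once it reaches an absorbing barrier.

# Hypothetical sketch: random walk on 0..N with absorbing barriers 0 and N
simulate_walk <- function(start, N, p, max_steps = 10000) {
  state <- start
  for (step in 1:max_steps) {
    if (state == 0 || state == N) break      # absorbing: once entered, never left
    state <- state + sample(c(1, -1), 1, prob = c(p, 1 - p))
  }
  state
}

set.seed(1)
# Fraction of runs absorbed at N, with assumed values start = 3, N = 10, p = 0.5
mean(replicate(2000, simulate_walk(start = 3, N = 10, p = 0.5)) == 10)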
Example 4.1.1 (The simple random walk)
Example 4.2.1 (Four-day weather forecast)
Take α = 0.7 and β = 0.4. Given that it is raining today, what is the probability that it will rain four days from today?
Solution:
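One way to carry out the computation in R, assuming the usual two-state convention that α is the probability of rain tomorrow given rain today and β is the probability of rain tomorrow given no rain today: build the one-step matrix P, raise it to the fourth power, and read off the (rain, rain) entry.

# Sketch, assuming state 1 = rain, state 2 = no rain
alpha <- 0.7
beta  <- 0.4
P <- matrix(c(alpha, 1 - alpha,
              beta,  1 - beta),
            nrow = 2, byrow = TRUE)

P4 <- P %*% P %*% P %*% P   # four-step transition matrix P^4
P4[1, 1]                    # P(rain in 4 days | rain today), approximately 0.5749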
Example 4.2.2. An urn always contains 2 balls. Ball colors are red and blue. At each stage a ball is randomly chosen and then replaced by a new ball, which with probability 0.8 is the same color as the ball it replaces and with probability 0.2 is the opposite color. If initially both balls are red, find the probability that the fifth ball selected is red.
Solution: $P^{n}_{ij}$ is the probability that the state at time $n$ is $j$ given that the initial state at time 0 is $i$.
# Transition matrix p, filled column-wise: rows are the current state, columns the next state
p <- matrix(c(0.4, 0.2, 0.1, 0.6, 0.5, 0.7, 0, 0.3, 0.2), nrow = 3)
# pp = total probability of the three-step paths that start in state 3 and stay in states 2 and 3
pp <- 0
for (i in 2:3) {
  for (j in 2:3) {
    for (k in 2:3) pp <- pp + p[3, i] * p[i, j] * p[j, k]
  }
}
pp
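Returning to Example 4.2.2, one way to carry out the computation in R is sketched below: take the state to be the number of red balls in the urn, build the one-step transition matrix from the 0.8/0.2 replacement rule, and average the chance of drawing a red ball over the four-step probabilities from the all-red state; the fifth ball is red with probability $\sum_j P^{4}_{2,j}\,(j/2)$. Treat this as an illustrative sketch rather than the slides' own solution.

# Sketch: state = number of red balls in the urn (0, 1, or 2), indexed 1..3 in R
P <- matrix(c(0.8, 0.2, 0.0,    # from 0 red: blue ball drawn, replaced by blue (0.8) or red (0.2)
              0.1, 0.8, 0.1,    # from 1 red: composition changes only if the color flips (prob 0.1 each way)
              0.0, 0.2, 0.8),   # from 2 red: red ball drawn, replaced by red (0.8) or blue (0.2)
            nrow = 3, byrow = TRUE)

P4 <- P %*% P %*% P %*% P       # four-step transition probabilities
# The fifth selection draws from the composition after 4 stages; it is red with prob (#red)/2
sum(P4[3, ] * c(0, 1, 2) / 2)   # approximately 0.7048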