Markov Chains
Definition 1
A stochastic process $\{X_n : n = 0, 1, 2, \ldots\}$ with a finite or countable state space is called a Markov chain if, for all states $i_0, \ldots, i_{n-1}, i, j$ and all $n \ge 0$,
$$\Pr\{X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0\} = \Pr\{X_{n+1} = j \mid X_n = i\},$$
that is, given the present state, the future is independent of the past.
Definition 2
Consider a Markov chain whose one-step conditional probabilities do not depend on $n$, i.e. $\Pr\{X_{n+1} = j \mid X_n = i\} = p_{ij}$ for all $n$. Such a chain is called time-homogeneous, and $p_{ij}$ is its (one-step) transition probability from state $i$ to state $j$.
Definition 3
The values $p_{ij}$ are called transition probabilities, and the matrix
$$P = \begin{pmatrix} p_{00} & p_{01} & p_{02} & \cdots \\ p_{10} & p_{11} & p_{12} & \cdots \\ p_{20} & p_{21} & p_{22} & \cdots \\ \vdots & \vdots & \vdots & \ddots \end{pmatrix}$$
is called the transition (probability) matrix of the chain. Its entries are nonnegative, and each row is a probability distribution: $\sum_j p_{ij} = 1$ for every state $i$.
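As a concrete illustration of Definitions 1-3, here is a minimal sketch (Python with NumPy, used for all code sketches below; the 3-state matrix in this block is invented for illustration, not taken from the lecture) of storing a transition matrix, checking that it is stochastic, and sampling one step of the chain:

```python
import numpy as np

# Illustrative 3-state transition matrix (values invented for this sketch).
P = np.array([
    [0.5, 0.3, 0.2],
    [0.1, 0.6, 0.3],
    [0.4, 0.4, 0.2],
])

# A valid transition matrix has nonnegative entries and rows summing to 1.
assert (P >= 0).all()
assert np.allclose(P.sum(axis=1), 1.0)

# One step of the chain: given the current state i, the next state is drawn
# from the distribution in row i of P.
rng = np.random.default_rng(0)
state = 0
state = rng.choice(len(P), p=P[state])
print("next state:", state)
```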
Example 1
The Land of Oz is blessed by many things, but not by
good weather. They never have two nice days in a row.
If they have a nice day, they are just as likely to have
snow as rain the next day. If they have snow or rain,
they have an even chance of having the same the next
day. If the weather does change from snow or rain, only
half of the time is the change to a nice day. Represent the
successive weather in the Land of Oz by a Markov
chain.
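Reading the transition probabilities straight from the description, with states R (rain), N (nice) and S (snow): a nice day is followed by rain or snow with probability 1/2 each, while rain and snow each persist with probability 1/2 and change to each of the other two states with probability 1/4. A sketch of this chain, with a short simulation for illustration:

```python
import numpy as np

states = ["R", "N", "S"]          # rain, nice, snow
P = np.array([
    [1/2, 1/4, 1/4],              # after rain
    [1/2, 0,   1/2],              # after a nice day (never two in a row)
    [1/4, 1/4, 1/2],              # after snow
])
assert np.allclose(P.sum(axis=1), 1.0)

# Simulate a week of Oz weather starting from a nice day.
rng = np.random.default_rng(42)
day = 1                           # index of "N"
weather = [states[day]]
for _ in range(6):
    day = rng.choice(3, p=P[day])
    weather.append(states[day])
print(" -> ".join(weather))
```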
Example 3 (continued)
Suppose that a trait is determined by a pair of genes, one inherited from each parent. Consider a process of continued matings and assume that there is always at least one offspring. An offspring is chosen at random and is mated with a hybrid, and this process is repeated through a number of generations. Represent the genetic type of the chosen offspring in successive generations by a Markov chain.
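With states AA, Aa (hybrid) and aa, the transition probabilities are the Mendelian proportions for a cross with a hybrid: AA x Aa gives AA or Aa with probability 1/2 each; Aa x Aa gives AA, Aa, aa with probabilities 1/4, 1/2, 1/4; and aa x Aa gives Aa or aa with probability 1/2 each. A sketch (the state labels are mine):

```python
import numpy as np

states = ["AA", "Aa", "aa"]        # dominant, hybrid, recessive genotypes
# Row i = distribution of the offspring's genotype when an individual of
# genotype states[i] is mated with a hybrid (Aa).
P = np.array([
    [1/2, 1/2, 0  ],               # AA x Aa
    [1/4, 1/2, 1/4],               # Aa x Aa
    [0,   1/2, 1/2],               # aa x Aa
])
assert np.allclose(P.sum(axis=1), 1.0)

# Genotype distribution after n generations, starting from a hybrid.
n = 5
dist = np.array([0.0, 1.0, 0.0]) @ np.linalg.matrix_power(P, n)
print(dict(zip(states, dist.round(4))))
```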
Example 5 (continued)
Given the following Bonus-Malus system:

State | Annual premium | Next state if 0 claims | 1 claim | 2 claims | >2 claims
1     | 200            | 1                      | 2       | 3        | 4
2     | 250            | 1                      | 3       | 4        | 4
3     | 400            | 2                      | 4       | 4        | 4
4     | 600            | 3                      | 4       | 4        | 4
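In the usual version of this system (it appears, e.g., in Ross's Introduction to Probability Models) the number of claims a policyholder files in a year is Poisson distributed. Assuming Poisson($\lambda$) claims, the transition matrix follows mechanically from the table: $p_{ij}$ is the total probability of the claim counts that move state $i$ to state $j$. A sketch, with $\lambda = 0.5$ as an arbitrary illustrative value:

```python
import math
import numpy as np

lam = 0.5                                   # assumed Poisson claim rate
next_state = [                              # next_state[i][k]: from state i+1
    [1, 2, 3, 4],                           # with k = 0, 1, 2 claims
    [1, 3, 4, 4],                           # (last entry: more than 2 claims)
    [2, 4, 4, 4],
    [3, 4, 4, 4],
]

def a(k):                                   # P(k claims) under Poisson(lam)
    return math.exp(-lam) * lam**k / math.factorial(k)

P = np.zeros((4, 4))
for i, row in enumerate(next_state):
    for k in range(3):                      # 0, 1 or 2 claims
        P[i][row[k] - 1] += a(k)
    P[i][row[3] - 1] += 1 - a(0) - a(1) - a(2)   # more than 2 claims
assert np.allclose(P.sum(axis=1), 1.0)
print(P.round(4))
```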
Example 6
Let $Z_n$ represent the outcome during the $n$th roll of a fair die. Define $X_n$ to be the maximum outcome obtained so far after the $n$th roll, i.e., $X_n = \max\{Z_1, Z_2, \ldots, Z_n\}$. Specify the transition matrix for $\{X_n\}$.
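Since $X_{n+1} = \max(X_n, Z_{n+1})$, from state $i$ the chain stays at $i$ exactly when the next roll is at most $i$ (probability $i/6$) and moves to each $j > i$ with probability $1/6$. A short sketch that builds this matrix:

```python
import numpy as np

# States 1..6; P[i-1][j-1] = Pr{maximum becomes j | current maximum is i}.
P = np.zeros((6, 6))
for i in range(1, 7):
    P[i - 1][i - 1] = i / 6          # roll <= i: maximum unchanged
    for j in range(i + 1, 7):
        P[i - 1][j - 1] = 1 / 6      # roll j > i: new maximum is j
assert np.allclose(P.sum(axis=1), 1.0)
print(P.round(3))
```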
Remark 1
To give a full probabilistic description of a Markov chain, the transition matrix alone is not enough: one must also specify the initial distribution $p_i = \Pr\{X_0 = i\}$. Together these determine all finite-dimensional probabilities of the process, since by repeated conditioning
$$\Pr\{X_0 = i_0, X_1 = i_1, \ldots, X_n = i_n\} = p_{i_0}\, p_{i_0 i_1} \cdots p_{i_{n-1} i_n}.$$
Example 7
A Markov chain has the transition matrix
$$P = \begin{pmatrix} 0.1 & 0.2 & 0.7 \\ 0.9 & 0.1 & 0 \\ 0.1 & 0.8 & 0.1 \end{pmatrix}$$
and initial distribution $p_0 = 0.3$, $p_1 = 0.4$ and $p_2 = 0.3$.
Determine $\Pr\{X_0 = 0, X_1 = 2, X_2 = 2\}$, $\Pr\{X_2 = 2, X_3 = 1 \mid X_1 = 0\}$ and $\Pr\{X_2 = 1 \mid X_0 = 2\}$.
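All three quantities follow from Remark 1 and from the two-step matrix $P^2$; a quick check of the arithmetic:

```python
import numpy as np

P = np.array([
    [0.1, 0.2, 0.7],
    [0.9, 0.1, 0.0],
    [0.1, 0.8, 0.1],
])
p0 = np.array([0.3, 0.4, 0.3])       # initial distribution

# Pr{X0=0, X1=2, X2=2} = p_0 * p_{02} * p_{22}
print(p0[0] * P[0, 2] * P[2, 2])     # = 0.021

# Pr{X2=2, X3=1 | X1=0} = p_{02} * p_{21}  (Markov property)
print(P[0, 2] * P[2, 1])             # = 0.56

# Pr{X2=1 | X0=2} = (P^2)[2, 1]
print((P @ P)[2, 1])                 # = 0.18
```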
Example 8
A Markov chain has the transition matrix
$$P = \begin{pmatrix} 1/2 & 0 & 1/2 \\ 1/3 & 1/2 & 1/6 \\ 1/3 & 0 & 2/3 \end{pmatrix}$$
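As in Example 7, multi-step probabilities for this chain come from powers of $P$. A minimal sketch that checks the rows of this $P$ are stochastic and computes its two-step matrix $P^2$:

```python
import numpy as np

P = np.array([
    [1/2, 0,   1/2],
    [1/3, 1/2, 1/6],
    [1/3, 0,   2/3],
])
assert np.allclose(P.sum(axis=1), 1.0)         # each row is a distribution
print(np.linalg.matrix_power(P, 2).round(4))   # two-step transition matrix
```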