ST3236 Note5
Somabha Mukherjee
Outline
1 Introduction
2 The Chapman-Kolmogorov Equation
3 Examples
What is a Markov Chain?
A (discrete-time) Markov chain is a stochastic process X0 , X1 , X2 , . . .
satisfying:
Given the present state (time n), the future state (time n + 1) is independent
of the past states (times n − 1, n − 2, . . . , 0).
State Space: The set S of all possible values the Markov chain can take is
called the state space. If S is discrete, the Markov chain is called
discrete-state. We will work with discrete-state Markov chains only.
In this case, pij := P(X1 = j|X0 = i) are called 1-step transition probabilities,
and the matrix P := ((pij ))i,j∈S is called the transition matrix of the Markov
chain.
The transition matrix is stochastic, i.e. ∑_{j∈S} pij = 1 for all i ∈ S, since

∑_{j∈S} pij = ∑_{j∈S} P(X1 = j|X0 = i) = P(X1 ∈ S|X0 = i) = 1 .
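This row-sum property is easy to check numerically. A minimal Python sketch (the two-state matrix below is only an illustrative example):

```python
def is_stochastic(P, tol=1e-12):
    """Check that every row of P has non-negative entries summing to 1."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

# Illustrative two-state transition matrix
P = [[0.6, 0.4],
     [0.3, 0.7]]
```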
State Diagrams
[State diagram of a two-state chain; edge labels 1, 2/3, and 1/3.]
The Chapman-Kolmogorov Equation
An easy calculation
What does the Chapman-Kolmogorov equation say?
Let us denote by P^(k) := ((p^(k)_ij))_{i,j∈S} the k-step transition matrix.

Chapman-Kolmogorov Equation: p^(k)_ij = ∑_{i1,i2,...,i_{k−1}∈S} p_{i i1} p_{i1 i2} · · · p_{i_{k−1} j} .

Remember this!
The k-step transition matrix is just the k-th power of the (1-step) transition matrix.
That is how you can easily compute the k-step transition probabilities on
any programming platform that can handle matrix computations (for example,
MATLAB). Of course, S needs to be finite for this!
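A minimal pure-Python sketch of this computation, using repeated matrix multiplication (a linear-algebra library would do the same job in one call):

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, k):
    """Return P^k, the k-step transition matrix (k >= 1)."""
    Q = P
    for _ in range(k - 1):
        Q = mat_mul(Q, P)
    return Q
```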
The Weather Model
Remember the simple weather model we discussed during our first lecture?
If it rains today, then there is a 60% chance that tomorrow is rainy, and 40%
chance that tomorrow is sunny. If it is sunny today, then there is a 70%
chance that tomorrow is sunny and 30% chance that tomorrow is rainy.
This Markov chain has state space S := {1, 2}, where 1 denotes rainy and 2
denotes sunny.
Question: Given that today is rainy, what is the probability that the same day
next week will be sunny?
k-Step Transition Probabilities for the Weather Model
If X0 denotes today’s weather condition and X7 denotes the weather
condition the same day next week, then we are interested in
P(X7 = 2|X0 = 1) = p^(7)_12.
Calculating P^7 in R:

P^7 =  0.4286964  0.5713036
       0.4284777  0.5715223

So, the required probability is 0.5713036.
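The same computation can be reproduced outside R; a pure-Python sketch using the weather transition matrix:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# 1 = rainy, 2 = sunny (rows/columns indexed 0 and 1 here)
P = [[0.6, 0.4],
     [0.3, 0.7]]

# P^7: seven transitions from today to the same day next week
P7 = P
for _ in range(6):
    P7 = mat_mul(P7, P)

prob_sunny = P7[0][1]  # P(X7 = sunny | X0 = rainy)
```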
A Chess Player’s Psychology
Grandmaster and reigning world champion Emmanuel’s mood in a particular
chess game is affected only by his performance in the immediately previous
game. He has trained his mind to forget the results of earlier games,
given the result of the immediately previous one.
1 If he wins a game, he will win the next one with probability 0.4 and draw the
next one with probability 0.3.
2 If he draws a game, he will win the next one with probability 0.3 and draw the
next one with probability 0.4.
3 If he loses a game, he will win the next one with probability 0.2 and draw the
next one with probability 0.3.
The first game ends in a draw. What is the probability that Emmanuel
wins the last (14th) game?
It appears that no matter what the first game’s result is, Emmanuel’s
probability of winning the last game is 0.2916667.
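The claimed value can be checked by raising the transition matrix to the 13th power (game 1 to game 14 is 13 transitions). A Python sketch, with the "lose" probabilities filled in as the complements of the given win/draw probabilities:

```python
def mat_mul(A, B):
    """Multiply two square matrices given as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# States: 0 = win, 1 = draw, 2 = lose.
# Each "lose" entry is 1 minus the stated win and draw probabilities.
P = [[0.4, 0.3, 0.3],
     [0.3, 0.4, 0.3],
     [0.2, 0.3, 0.5]]

# Game 1 is a draw; game 14 is 13 steps later.
P13 = P
for _ in range(12):
    P13 = mat_mul(P13, P)

prob_win = P13[1][0]  # P(win game 14 | game 1 was a draw)
```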
Random Walks
Let Y1 , Y2 , . . . be independent random variables, and define
Xn := Y1 + . . . + Yn .
Then Xn is a Markov chain, since Xn+1 = Xn + Yn+1 and Yn+1 is
independent of (X1 , . . . , Xn ).
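A quick simulation sketch of such a random walk, assuming (for illustration only) i.i.d. ±1 steps:

```python
import random

def random_walk(n, p=0.5, seed=0):
    """Simulate n steps of a simple random walk started at 0:
    each step is +1 with probability p, and -1 otherwise."""
    rng = random.Random(seed)
    x, path = 0, [0]
    for _ in range(n):
        x += 1 if rng.random() < p else -1
        path.append(x)
    return path
```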
Branching Process
Why is the Branching Process a Time-Homogeneous
Markov Chain?
Since (X1 , . . . , Xn ) is a function of {Yi,j }i≥1,j≤n , it is independent of
{Yi,n+1 }i≥1 . Hence, (Xn ) is a time-homogeneous Markov chain.
The state space is clearly the set of all non-negative integers. Also,

pij = P( ∑_{k=1}^{i} Yk,1 = j ) .
Transition Probabilities of a Branching Process
Note that

∑_{k=1}^{i} Yk,1 ∼ Poisson(iλ) .

Hence, we have:

pij = P( ∑_{k=1}^{i} Yk,1 = j ) = e^{−iλ} (iλ)^j / j! .
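A Python sketch evaluating these transition probabilities, with λ the mean of the Poisson offspring distribution assumed above:

```python
import math

def branching_pij(i, j, lam):
    """P(X_{n+1} = j | X_n = i) when offspring counts are i.i.d. Poisson(lam):
    the sum of i independent Poisson(lam) variables is Poisson(i*lam)."""
    if i == 0:
        return 1.0 if j == 0 else 0.0  # state 0 (extinction) is absorbing
    mean = i * lam
    return math.exp(-mean) * mean**j / math.factorial(j)
```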