Markov Chains (Part 2)
More Examples and Chapman-Kolmogorov Equations
A Stock Price Stochastic Process
• Consider a stock whose price either goes up or down every day. Let
Xt be a random variable that is:
– 0 if the stock price goes up on day t, and
– 1 if the stock price goes down on day t.
• The probability that the stock price goes up tomorrow, given it goes
up today, is 0.7. If the stock goes down today, the probability that it
goes up tomorrow is 0.5.
• Does the stochastic process Xt possess the Markovian property?
• What is the one-step transition probability matrix?
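• Answer sketch: with only the information given, tomorrow's movement depends only on today's, so the process as described is Markovian. Writing state 0 = up and 1 = down, the one-step transition matrix is

$$P = \begin{pmatrix} 0.7 & 0.3 \\ 0.5 & 0.5 \end{pmatrix}$$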
A Stock Price Stochastic Process
• We can expand the state space to include a little bit of history, and create a
Markov chain.
• Let Xt be a random variable that has four states:
– (0,0) if the stock price went up yesterday (day t-1) and up today (day t),
– (1,0) if the stock price went down yesterday (day t-1) and up today (day t),
– (0,1) if the stock price went up yesterday (day t-1) and down today (day t),
– (1,1) if the stock price went down yesterday (day t-1) and down today (day t).
• Intuitively, the stochastic process Xt now satisfies the Markovian property, since the state itself records the relevant history
A Stock Price Stochastic Process
• Now, the one-step transition probability matrix is 4x4
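A sketch of that matrix, assuming tomorrow's movement still depends only on today's with the probabilities given earlier (0.7 of an up day after an up day, 0.5 after a down day), and ordering the states (0,0), (1,0), (0,1), (1,1):

$$P = \begin{pmatrix} 0.7 & 0 & 0.3 & 0 \\ 0.7 & 0 & 0.3 & 0 \\ 0 & 0.5 & 0 & 0.5 \\ 0 & 0.5 & 0 & 0.5 \end{pmatrix}$$

The zeros are structural: today's movement becomes tomorrow's "yesterday," so from a state ending in 0 (up today) the chain can only move to (0,0) or (0,1).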
Multi-step Transition Probabilities
• So far, we have focused only on one-step transition probabilities $p_{ij}$
– But these don't directly answer some interesting questions
• For example, if it is sunny today, what is the probability that it will be sunny the day after tomorrow?
• If the stock went down today, what is the probability that it will go down three days later?
– These are called multi-step (or n-step) transition probabilities.
– In particular, we want to find $P(X_{t+n} = j \mid X_t = i)$, which is denoted by $p_{ij}^{(n)}$
• The Chapman-Kolmogorov (C-K) equation is a formula to calculate n-step
transition probabilities.
n-step Transition Probabilities
• Interpretation:
[Figure: sample paths of Xt starting in state i at time t and ending in state j at time t + n.]
Inventory Example
n-step Transition Probabilities
[Figure: a sample path of Xt for the inventory example, with the state (0-3 cameras) on the vertical axis and the week t = 0, 1, 2, 3 on the horizontal axis.]
Two-step Transition Probabilities for
the Weather Example
• Intuition: to go from state 0 to 0 in two steps we can either
– Go from 0 to 0 in one step and then go from 0 to 0 in one step OR
– Go from 0 to 1 in one step and then go from 1 to 0 in one step
• Therefore, $p_{00}^{(2)} = P(X_2 = 0 \mid X_0 = 0) = p_{00}p_{00} + p_{01}p_{10}$
• In short,
$$p_{00}^{(2)} = \sum_{k=0}^{1} p_{0k} p_{k0}$$
• You just wrote down your first Chapman-Kolmogorov equation using
intuition
• Now use the same intuition to write down the other 2-step transition probabilities $p_{01}^{(2)}$, $p_{10}^{(2)}$, $p_{11}^{(2)}$:
$$P^{(2)} = \begin{pmatrix} p_{00}^{(2)} & p_{01}^{(2)} \\ p_{10}^{(2)} & p_{11}^{(2)} \end{pmatrix} = \begin{pmatrix} p_{00}p_{00} + p_{01}p_{10} & p_{00}p_{01} + p_{01}p_{11} \\ p_{10}p_{00} + p_{11}p_{10} & p_{10}p_{01} + p_{11}p_{11} \end{pmatrix}$$
Two-step Transition Probabilities for
General Markov Chains
• For a general Markov chain with states 0, 1, …, M, to make a two-step transition from i to j, we go from i to some state k in one step and then from k to j in one step. Therefore, the two-step transition probability matrix is $P^{(2)} = P^2$:
$$P^{(2)} = \begin{pmatrix} p_{00}^{(2)} & p_{01}^{(2)} & \cdots & p_{0M}^{(2)} \\ p_{10}^{(2)} & p_{11}^{(2)} & \cdots & p_{1M}^{(2)} \\ \vdots & \vdots & & \vdots \\ p_{M0}^{(2)} & p_{M1}^{(2)} & \cdots & p_{MM}^{(2)} \end{pmatrix} \quad \text{with } p_{ij}^{(2)} = \sum_{k=0}^{M} p_{ik} p_{kj}$$
n-step Transition Probabilities for
General Markov Chains
• For a general Markov chain with states 0,1,…,M, the n-step transition
from i to j means the process goes from i to j in n time steps
• Let m be a non-negative integer not bigger than n. The Chapman-Kolmogorov equation is:

$$p_{ij}^{(n)} = \sum_{k=0}^{M} p_{ik}^{(m)} p_{kj}^{(n-m)}$$
• Interpretation: if the process goes from state i to state j in n steps then
it must go from state i to some state k in m (less than n) steps and
then go from k to j in the remaining n-m steps.
• In matrix notation, $P^{(n)} = P^{(m)} P^{(n-m)}$. This implies that the n-step transition matrix is the nth power of the one-step transition matrix (why? substitute m = 1 and see what happens!)
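A quick numerical check of this identity (a minimal sketch in Python/NumPy, using the stock example's one-step matrix from above):

import numpy as np

# One-step matrix from the stock example (state 0 = up, 1 = down)
P = np.array([[0.7, 0.3],
              [0.5, 0.5]])

# Chapman-Kolmogorov in matrix form: P(5) = P(2) P(3) = P^5
lhs = np.linalg.matrix_power(P, 5)
rhs = np.linalg.matrix_power(P, 2) @ np.linalg.matrix_power(P, 3)
print(np.allclose(lhs, rhs))  # True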
Chapman-Kolmogorov Equations
$$p_{ij}^{(n)} = \sum_{k=0}^{M} p_{ik}^{(m)} p_{kj}^{(n-m)} \quad \text{for all } i, j, n \text{ and } 0 \le m \le n$$
[Figure: to go from state i at time 0 to state j at time n, the chain first moves from i to some state k in one step (probability $p_{ik}$), then from k to j in the remaining n-1 steps (probability $p_{kj}^{(n-1)}$); timeline 0, 1, …, n.]
Chapman-Kolmogorov Equations
$$P^{(n)} = P \cdot P^{(n-1)} = P \cdot P \cdot P^{(n-2)} = \cdots = \underbrace{P \cdot P \cdots P}_{n\ \text{factors}} = P^n$$
How to use C-K Equations
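• In practice, using the C-K equations boils down to matrix multiplication: compute $P^n$ (or $P^{(m)} P^{(n-m)}$ for any split) and read off the entry $p_{ij}^{(n)}$ you need.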
Weather Example
n-step Transitions
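As an illustration (the actual one-step matrix for the weather example is given in Part 1; the values below are hypothetical), a sketch of a two-step computation:

import numpy as np

# Hypothetical weather matrix -- illustrative values only
# (state 0 = sunny, state 1 = not sunny)
P = np.array([[0.8, 0.2],
              [0.6, 0.4]])

P2 = np.linalg.matrix_power(P, 2)
print(P2[0, 0])  # P(sunny the day after tomorrow | sunny today) = 0.76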
Inventory Example
n-step Transitions
Question:
Assuming the store starts with 3 cameras, find the probability that there will be 0 cameras in 2 weeks.
Answer: $p_{30}^{(2)} = 0.249$
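A sketch of this computation, assuming the one-step matrix from Part 1 of these notes (camera store with Poisson(1) weekly demand, ordering up to 3 cameras whenever the week ends with none):

import numpy as np

# One-step matrix assumed from Part 1 (states = cameras on hand, 0..3)
P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])

P2 = P @ P                 # two-step matrix, P(2) = P^2
print(round(P2[3, 0], 3))  # p30(2) = 0.249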
(Unconditional) Probability in state j at time n
• The transition probability $p_{ij}^{(n)} = P(X_n = j \mid X_0 = i)$ is a conditional probability
• How do we un-condition the probabilities?
• That is, how do we find the (unconditional) probability of being in state j at time n, $P(X_n = j)$?
• The probabilities $P(X_0 = i)$ define the initial state distribution
$$P(X_n = j) = \sum_{i=0}^{M} P(X_n = j \mid X_0 = i)\, P(X_0 = i) = \sum_{i=0}^{M} p_{ij}^{(n)}\, P(X_0 = i)$$
Inventory Example
Unconditional Probabilities
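A sketch of an unconditional computation, reusing the assumed inventory matrix above and further assuming, for illustration, that every starting stock level is equally likely:

import numpy as np

P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])

q0 = np.array([0.25, 0.25, 0.25, 0.25])  # assumed initial distribution P(X0 = i)
q2 = q0 @ np.linalg.matrix_power(P, 2)   # distribution of X2
print(q2[0])                             # unconditional P(X2 = 0), about 0.283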
Steady-State Probabilities
• In the long run (e.g., after 8 or more weeks), the probability of being in state j is

$$\lim_{n \to \infty} p_{ij}^{(n)} = \pi_j$$
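A sketch of this limit with the assumed inventory matrix: raising P to the 8th power already makes all rows (nearly) identical, and that common row approximates the steady-state probabilities $\pi_j$.

import numpy as np

P = np.array([[0.080, 0.184, 0.368, 0.368],
              [0.632, 0.368, 0.000, 0.000],
              [0.264, 0.368, 0.368, 0.000],
              [0.080, 0.184, 0.368, 0.368]])

P8 = np.linalg.matrix_power(P, 8)
print(P8)  # every row is (nearly) the same vector: the steady-state distribution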