
Markov Chains

(Part 2)
More Examples and Chapman-Kolmogorov Equations

A Stock Price Stochastic Process
• Consider a stock whose price either goes up or down every day. Let
Xt be a random variable that is:
– 0 if the stock price goes up on day t, and
– 1 if the stock price goes down on day t.
• The probability that the stock price goes up tomorrow, given it goes
up today, is 0.7. If the stock goes down today, the probability that it
goes up tomorrow is 0.5.
• Does the stochastic process Xt possess the Markovian property?
• What is the one-step transition probability matrix?


• Yes: tomorrow's movement depends only on today's state (not on any earlier history), so Xt possesses the Markovian property. The one-step transition probabilities are:

    Stock behavior today    P(up tomorrow)    P(down tomorrow)
    Up                      0.7               0.3
    Down                    0.5               0.5

• The one-step transition probability matrix is therefore

    P = \begin{pmatrix} 0.7 & 0.3 \\ 0.5 & 0.5 \end{pmatrix}
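As a numerical companion, here is a minimal NumPy sketch (the encoding and the simulation below are illustrative, not from the slides) that writes down this chain and simulates a short run of up/down days:

    import numpy as np

    # One-step transition matrix: state 0 = price goes up, state 1 = price goes down
    P = np.array([[0.7, 0.3],
                  [0.5, 0.5]])
    assert np.allclose(P.sum(axis=1), 1.0)  # each row is a probability distribution

    # Simulate ten days of up/down moves, starting from an "up" day
    rng = np.random.default_rng(seed=0)
    state, path = 0, [0]
    for _ in range(10):
        state = int(rng.choice(2, p=P[state]))
        path.append(state)
    print(path)  # e.g. a list of 0s (up days) and 1s (down days)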
A Stock Price Stochastic Process
• Now, suppose the probability that the stock goes up or down tomorrow depends on the stock's behavior both today and yesterday.
• Intuitively, can we define a stochastic process Xt that possesses the
Markovian property?

    Stock behavior yesterday    Stock behavior today    P(up tomorrow)
    Up                          Up                      0.9
    Down                        Up                      0.6
    Up                          Down                    0.5
    Down                        Down                    0.3

A Stock Price Stochastic Process
• We can expand the state space to include a little bit of history, and create a
Markov chain.
• Let Xt be a random variable with four states:
– (0,0) if the stock price went up on day t-1 and up on day t
– (1,0) if the stock price went down on day t-1 and up on day t
– (0,1) if the stock price went up on day t-1 and down on day t
– (1,1) if the stock price went down on day t-1 and down on day t
• Intuitively, the stochastic process Xt now satisfies the Markovian property, since the state carries all of the history that matters.

A Stock Price Stochastic Process
• Now, the one-step transition probability matrix is 4×4. Rows give the state (yesterday, today) at (t-1, t); columns give the state (today, tomorrow) at (t, t+1):

    From \ To           0,0 (up, up)   1,0 (down, up)   0,1 (up, down)   1,1 (down, down)
    0,0 (up, up)        0.9            0                0.1              0
    1,0 (down, up)      0.6            0                0.4              0
    0,1 (up, down)      0              0.5              0                0.5
    1,1 (down, down)    0              0.3              0                0.7
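In code (a sketch; the state ordering is an assumption of this example, not fixed by the slides), the expanded chain is just a 4×4 stochastic matrix:

    import numpy as np

    # States encode (yesterday, today): 0 = (up,up), 1 = (down,up), 2 = (up,down), 3 = (down,down)
    P = np.array([[0.9, 0.0, 0.1, 0.0],   # from (up, up)
                  [0.6, 0.0, 0.4, 0.0],   # from (down, up)
                  [0.0, 0.5, 0.0, 0.5],   # from (up, down)
                  [0.0, 0.3, 0.0, 0.7]])  # from (down, down)
    assert np.allclose(P.sum(axis=1), 1.0)

Note that half of each row is structurally zero: a pair (yesterday, today) can only move to a pair whose first coordinate matches today's value.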

Multi-step Transition Probabilities
• So far, we have only focused on one-step transition probabilities pij
– But these don’t directly provide answers to some interesting questions
• For example, if it is sunny today, what is the probability that it will
be sunny day after tomorrow?
• If the stock went down today, what is the probability that it will go
down three days later?
– These are called multi-step (or n-step) transition probabilities.
– In particular, we want to find P(Xt+n=j | Xt= i), which is denoted by pij(n)
• The Chapman-Kolmogorov (C-K) equation is a formula to calculate n-step
transition probabilities.
n-step Transition Probabilities

• If the one-step transition probabilities are stationary, then for all t the n-step transition probabilities can be written:

    P(X_{t+n} = j \mid X_t = i) = P(X_n = j \mid X_0 = i) = p_{ij}^{(n)}

• Interpretation:

[Figure: two sample paths of Xt from state i to state j, one over times 0, 1, 2, …, n and one over times t, t+1, t+2, …, t+n; by stationarity, both n-step transitions have the same probability.]

Inventory Example
n-step Transition Probabilities

• p12(3) = conditional probability that, starting with one camera, there will be two cameras after three weeks
• Four ways that could happen, namely the four three-step paths from state 1 to state 2 with nonzero probability (verified numerically in the sketch below):
– 1 → 0 → 0 → 2
– 1 → 0 → 2 → 2
– 1 → 0 → 3 → 2
– 1 → 1 → 0 → 2

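A small sketch (using the inventory one-step matrix that appears later in these slides) confirms both the value of p12(3) and that exactly these four paths contribute:

    import numpy as np
    from itertools import product

    # One-step matrix for the inventory example (states 0-3 cameras)
    P = np.array([[0.080, 0.184, 0.368, 0.368],
                  [0.632, 0.368, 0.000, 0.000],
                  [0.264, 0.368, 0.368, 0.000],
                  [0.080, 0.184, 0.368, 0.368]])

    # p12(3) as a sum over the intermediate states (k, l): 1 -> k -> l -> 2
    p12_3 = sum(P[1, k] * P[k, l] * P[l, 2] for k, l in product(range(4), repeat=2))
    print(p12_3, np.linalg.matrix_power(P, 3)[1, 2])  # the two values agree

    # The nonzero contributions are exactly the four paths listed above
    paths = [(k, l) for k, l in product(range(4), repeat=2)
             if P[1, k] * P[k, l] * P[l, 2] > 0]
    print(paths)  # [(0, 0), (0, 2), (0, 3), (1, 0)]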
Two-step Transition Probabilities for
the Weather Example
• Intuition: to go from state 0 to 0 in two steps we can either
– Go from 0 to 0 in one step and then go from 0 to 0 in one step OR
– Go from 0 to 1 in one step and then go from 1 to 0 in one step
• Therefore, p00(2) = P(X2 = 0 | X0 = 0) = p00 p00 + p01 p10
• In short,

    p_{00}^{(2)} = \sum_{k=0}^{1} p_{0k}\, p_{k0}

• You just wrote down your first Chapman-Kolmogorov equation using intuition
• Now use the same intuition to write down the other two-step transition probabilities p01(2), p10(2), p11(2)
• These four two-step transition probabilities can be arranged in a matrix P(2), called the two-step transition matrix
Two-step Transition Probabilities for
the Weather Example
"p (2)

00
p % "p p + p p
(2)

01 00 00 01 10
p p +p p %
00 01 01 11
P =$
(2)
'=$ '
#p p & #p p + p p p p +p p &
(2) (2)
10 11 10 00 11 10 10 01 11 11

• Interpretation: p01(2) is the probability that the weather the day after tomorrow will be rainy, given that the weather today is sunny.
• An interesting observation: the two-step transition matrix is the square of the one-step transition matrix! That is, P(2) = P²
• Why? Write out the matrix product P·P and confirm that it equals P(2) above.
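A quick numerical check (a sketch using the weather matrix that appears later in these slides; sunny = 0, rainy = 1):

    import numpy as np

    P = np.array([[0.5, 0.5],
                  [0.2, 0.8]])

    # Two-step matrix via the Chapman-Kolmogorov sums, entry by entry
    P2_ck = np.array([[sum(P[i, k] * P[k, j] for k in range(2)) for j in range(2)]
                      for i in range(2)])

    print(np.allclose(P2_ck, P @ P))  # True: P(2) equals the matrix square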

Two-step Transition Probabilities for
General Markov Chains
• For a general Markov chain with states 0, 1, …, M, to make a two-step transition from i to j, the process goes from i to some state k in one step and then from k to j in one step. Therefore, the two-step transition probability matrix is P(2) = P²:
    P^{(2)} = \begin{pmatrix} p_{00}^{(2)} & p_{01}^{(2)} & \cdots & p_{0M}^{(2)} \\ p_{10}^{(2)} & p_{11}^{(2)} & \cdots & p_{1M}^{(2)} \\ \vdots & & & \vdots \\ p_{M0}^{(2)} & p_{M1}^{(2)} & \cdots & p_{MM}^{(2)} \end{pmatrix}
    \quad \text{with} \quad p_{ij}^{(2)} = \sum_{k=0}^{M} p_{ik}\, p_{kj}

n-step Transition Probabilities for
General Markov Chains
• For a general Markov chain with states 0,1,…,M, the n-step transition
from i to j means the process goes from i to j in n time steps
• Let m be a non-negative integer no bigger than n. The Chapman-Kolmogorov equation is:

    p_{ij}^{(n)} = \sum_{k=0}^{M} p_{ik}^{(m)}\, p_{kj}^{(n-m)}
• Interpretation: if the process goes from state i to state j in n steps then
it must go from state i to some state k in m (less than n) steps and
then go from k to j in the remaining n-m steps.
• In matrix notation, P(n)=P(m) P(n-m). This implies that the n-step
transition matrix is the nth power of the one-step transition matrix
(Why? - substitute m=1 and see what happens!)
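The matrix form is easy to verify numerically. A sketch, reusing the weather chain (any stochastic matrix would do) with the split m = 2, n - m = 3:

    import numpy as np

    P = np.array([[0.5, 0.5],
                  [0.2, 0.8]])

    # Chapman-Kolmogorov in matrix form: P(5) = P(2) P(3)
    lhs = np.linalg.matrix_power(P, 5)
    rhs = np.linalg.matrix_power(P, 2) @ np.linalg.matrix_power(P, 3)
    print(np.allclose(lhs, rhs))  # True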

Chapman-Kolmogorov Equations
    p_{ij}^{(n)} = \sum_{k=0}^{M} p_{ik}^{(m)}\, p_{kj}^{(n-m)} \quad \text{for all } i, j, n \text{ and } 0 \le m \le n

• Consider the case when m = 1:

    p_{ij}^{(n)} = \sum_{k=0}^{M} p_{ik}\, p_{kj}^{(n-1)} \quad \text{i.e.,} \quad P^{(n)} = P \cdot P^{(n-1)}

[Figure: a sample path of Xt that moves from state i to an intermediate state k in the first step, then from k to j in the remaining n-1 steps.]
Chapman-Kolmogorov Equations

• The pij(n) are the elements of the n-step transition matrix P(n)
• Note, though, that

    P^{(n)} = P \cdot P^{(n-1)} = P \cdot P \cdot P^{(n-2)} = \cdots = P \cdot P \cdots P = P^n
How to use C-K Equations

• To answer the question "starting in state i, what is the probability that the Markov chain will be in state j after n steps?":
– First write down the one-step transition probability matrix.
– Then compute (e.g., with your calculator) the nth power of this one-step transition probability matrix.
– Read off the (i, j)th entry of this nth-power matrix.
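The same recipe in NumPy (a sketch; the helper name is ours, not standard):

    import numpy as np

    def n_step_probability(P, i, j, n):
        # p_ij(n) is the (i, j) entry of the nth power of the one-step matrix P
        return np.linalg.matrix_power(P, n)[i, j]

    # Weather example: starting sunny (state 0), probability of sunny in 2 days
    P = np.array([[0.5, 0.5], [0.2, 0.8]])
    print(n_step_probability(P, 0, 0, 2))  # 0.35 (up to floating point), matching the next slide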

Weather Example
n-step Transitions

Two-step transition probability matrix (states ordered Sunny, Rainy):

    P^{(2)} = \begin{pmatrix} 0.5 & 0.5 \\ 0.2 & 0.8 \end{pmatrix}^{2} = \begin{pmatrix} 0.35 & 0.65 \\ 0.26 & 0.74 \end{pmatrix}

Inventory Example
n-step Transitions

Two-step transition probability matrix:

    P^{(2)} = \begin{pmatrix} 0.080 & 0.184 & 0.368 & 0.368 \\ 0.632 & 0.368 & 0 & 0 \\ 0.264 & 0.368 & 0.368 & 0 \\ 0.080 & 0.184 & 0.368 & 0.368 \end{pmatrix}^{2}
            = \begin{pmatrix} 0.249 & 0.286 & 0.300 & 0.165 \\ 0.283 & 0.252 & 0.233 & 0.233 \\ 0.351 & 0.319 & 0.233 & 0.097 \\ 0.249 & 0.286 & 0.300 & 0.165 \end{pmatrix}

Note: even though p12 = 0, p12(2) > 0
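The same computation in NumPy (a sketch):

    import numpy as np

    P = np.array([[0.080, 0.184, 0.368, 0.368],
                  [0.632, 0.368, 0.000, 0.000],
                  [0.264, 0.368, 0.368, 0.000],
                  [0.080, 0.184, 0.368, 0.368]])

    P2 = P @ P
    print(np.round(P2, 3))    # matches the two-step matrix above
    print(P[1, 2], P2[1, 2])  # 0.0 vs ~0.233: a zero one-step entry can be positive after two steps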


Inventory Example
n-step Transitions

p13(2) = probability that the inventory goes from 1 camera to 3 cameras in two weeks = 0.233
(note: even though p13 = 0)

Question:
Assuming the store starts with 3 cameras, find the
probability there will be 0 cameras in 2 weeks

p30(2) = 0.249

(Unconditional) Probability in state j at time n
• The transition probability pij(n) is a conditional probability,
P(Xn=j | X0=i)
• How do we un-condition the probabilities?
• That is, how do we find the (unconditional) probability of
being in state j at time n, P(Xn=j)?
• The probabilities P(X0 = i) define the initial state distribution, and

    P(X_n = j) = \sum_{i=0}^{M} P(X_n = j \mid X_0 = i)\, P(X_0 = i) = \sum_{i=0}^{M} p_{ij}^{(n)}\, P(X_0 = i)
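In matrix terms, the unconditional distribution at time n is the initial distribution (as a row vector) times P^n. A sketch using the inventory chain with a uniform initial distribution, anticipating the calculation on the next slide:

    import numpy as np

    P = np.array([[0.080, 0.184, 0.368, 0.368],
                  [0.632, 0.368, 0.000, 0.000],
                  [0.264, 0.368, 0.368, 0.000],
                  [0.080, 0.184, 0.368, 0.368]])

    init = np.full(4, 0.25)                      # P(X0 = i) = 1/4 for every state
    dist2 = init @ np.linalg.matrix_power(P, 2)  # unconditional distribution of X2
    print(round(float(dist2[0]), 3))             # 0.283 = P(X2 = 0)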
Inventory Example
Unconditional Probabilities

• If the initial conditions were unknown, we might assume it is equally likely to be in any initial state:
P(X0=0) = P(X0=1) = P(X0=2) = P(X0=3) = ¼
• Then, what is the probability that we order (any) camera in
two weeks?
P(order in 2 weeks) = P(in state 0 at time 2)
= P(X0=0)p00(2)+P(X0=1)p10(2)+P(X0=2)p20(2) +P(X0=3)p30(2)
= ¼(0.249) + ¼(0.283) + ¼(0.351) + ¼(0.249)
= 0.283

Steady-State Probabilities

• As n gets large, what happens?
• What is the probability of being in any state?
(e.g., in the inventory example, what happens as more and more weeks go by?)
• Consider the 8-step transition probability matrix for the inventory example:

    P^{(8)} = P^{8} = \begin{pmatrix} 0.286 & 0.285 & 0.264 & 0.166 \\ 0.286 & 0.285 & 0.264 & 0.166 \\ 0.286 & 0.285 & 0.264 & 0.166 \\ 0.286 & 0.285 & 0.264 & 0.166 \end{pmatrix}
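Computing the power directly (a sketch) shows the rows becoming identical:

    import numpy as np

    P = np.array([[0.080, 0.184, 0.368, 0.368],
                  [0.632, 0.368, 0.000, 0.000],
                  [0.264, 0.368, 0.368, 0.000],
                  [0.080, 0.184, 0.368, 0.368]])

    P8 = np.linalg.matrix_power(P, 8)
    print(np.round(P8, 3))  # all four rows agree to three decimals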

Steady-State Probabilities
• In the long run (e.g., after 8 or more weeks), the probability of being in state j is essentially the same regardless of the starting state i
• These probabilities are called the steady-state probabilities:

    \lim_{n \to \infty} p_{ij}^{(n)} = \pi_j

• Another interpretation is that πj is the fraction of time the process is in state j (in the long run)
• This limit exists for any irreducible ergodic Markov chain
(Next, we will define these terms, then return to steady-state probabilities)
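A numerical preview (a sketch; the formal treatment of the πj comes next): any row of a high power of P approximates the steady-state distribution, and that row is left unchanged by one more step of the chain.

    import numpy as np

    P = np.array([[0.080, 0.184, 0.368, 0.368],
                  [0.632, 0.368, 0.000, 0.000],
                  [0.264, 0.368, 0.368, 0.000],
                  [0.080, 0.184, 0.368, 0.368]])

    pi = np.linalg.matrix_power(P, 100)[0]  # any row works once n is large
    print(np.round(pi, 3))                  # approximately [0.286, 0.285, 0.264, 0.166]
    print(np.allclose(pi, pi @ P))          # True: pi is stationary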

