SF2863 Systems Engineering, 7.5 HP: Intro to Markov Chains
Consider a stochastic process X0, X1, X2, · · ·
The Markovian property says that
“The conditional probability of a future event depends only on the
present state and not on all past states”
Definition
A stochastic process {Xt} is said to be a Markov Chain if it has the
Markovian property.
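Formally (a standard statement of the property, added here for reference), the Markovian property for a discrete-time process reads
Pr(Xt+1 = j | X0 = k0, X1 = k1, · · · , Xt−1 = kt−1, Xt = i) = Pr(Xt+1 = j | Xt = i)
for all states i, j, k0, · · · , kt−1 and all times t.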
Definition
A stochastic process {Xt} has stationary transition probabilities if
pij = Pr(Xt+1 = j | Xt = i) does not depend on t.
The two-step transition probabilities then satisfy
pij^(2) = Σ_{k=0}^{m} Pr(Xt+2 = j | Xt+1 = k ∩ Xt = i) Pr(Xt+1 = k | Xt = i)
        = Σ_{k=0}^{m} Pr(Xt+2 = j | Xt+1 = k) Pr(Xt+1 = k | Xt = i) = Σ_{k=0}^{m} pik pkj,
where the second equality uses the Markovian property.
Transition matrices
Define the one-step transition matrix

P = [pij] =
[ p00  p01  · · ·  p0m ]
[ p10  p11  · · ·  p1m ]
[  ·    ·   · · ·   ·  ]
[ pm0  pm1  · · ·  pmm ]

Define the two-step transition matrix

P^(2) = [pij^(2)] =
[ p00^(2)  p01^(2)  · · ·  p0m^(2) ]
[ p10^(2)  p11^(2)  · · ·  p1m^(2) ]
[    ·        ·     · · ·     ·    ]
[ pm0^(2)  pm1^(2)  · · ·  pmm^(2) ]
Theorem
For any n ≥ 0, m ≥ 0, it holds that P^(n+m) = P^(n) P^(m).
In particular P^(n) = P^n, where
pij^(n) = Pr(Xt+n = j | Xt = i)
and P^(n) = [pij^(n)] is the n-step transition matrix.
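To make the theorem concrete, here is a minimal numerical sketch (assuming numpy is available; the matrix values are illustrative, not from the lecture) verifying P^(2) = P·P:

import numpy as np

P = np.array([[0.0, 1.0],
              [0.4, 0.6]])

P2 = P @ P  # two-step transition matrix P^(2) = P^2
print(P2)
# P2[i, j] = Pr(X_{t+2} = j | X_t = i); each row still sums to 1:
print(P2.sum(axis=1))  # [1. 1.]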
Definition
State j is said to be accessible from state i if pij^(n) > 0 for some n ≥ 0.
Sometimes denoted i → j.
Definition
States i and j communicate if
i is accessible from j and
j is accessible from i.
Sometimes denoted i ↔ j.
Communication is an equivalence relation, and it partitions the state space into (communicating) classes: [a] = {b ∈ X | a ↔ b}.
Definition
The Markov Chain is irreducible if there is only one class, i.e., all states
communicate with each other.
Definition
A state is said to be transient if,
upon entering this state, there is a probability > 0 that the process
never returns to this state.
State i is transient if, and only if, there exists a state j that is accessible
from i, but i is not accessible from j, i.e., i → j, j ↛ i.
If one state in a class is transient, then all states in the class are
transient.
Definition
A state is said to be recurrent if
upon entering this state, the process definitely will return to this state
again.
Definition
A state is said to be absorbing if
upon entering this state, the process never will leave this state.
Definition
The period of state i is defined by
d(i) = gcd(N(i)) if N(i) ≠ ∅, and d(i) = 0 if N(i) = ∅,
where N(i) = {n ≥ 1 : pii^(n) > 0} is the set of times at which a return to state i is possible.
Definition
State i is aperiodic if d(i) = 1.
Consider a Markov Chain with three states where p12 = p23 = p31 = 1.
Then N(1) = {3, 6, 9, · · · } and d(1) = gcd(N(1)) = 3.
Consider a Markov Chain with ten states where arcs in the transition
graph (not shown here) indicate positive transition probabilities.
Then N(1) = {8, 10, 16, 18, 20, 24, 26, 28, 30, · · · }, and
d(1) = gcd(N(1)) = 2.
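The period can also be computed numerically. A small sketch (assuming numpy; truncating N(i) to n ≤ n_max is an approximation that works here because the gcd stabilizes quickly), applied to the 3-state cyclic chain above:

import numpy as np
from math import gcd
from functools import reduce

def period(P, i, n_max=50):
    """Return gcd of {n >= 1 : P^n[i, i] > 0}, or 0 if the set is empty."""
    returns = []
    Pn = np.eye(len(P))
    for n in range(1, n_max + 1):
        Pn = Pn @ P
        if Pn[i, i] > 1e-12:
            returns.append(n)
    return reduce(gcd, returns) if returns else 0

# p12 = p23 = p31 = 1 (states 1, 2, 3 mapped to indices 0, 1, 2):
P_cycle = np.array([[0, 1, 0],
                    [0, 0, 1],
                    [1, 0, 0]], dtype=float)
print(period(P_cycle, 0))  # 3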
Definition
In a finite-state Markov Chain, recurrent states that are aperiodic are
called ergodic states.
A Markov Chain is said to be ergodic if all states are ergodic states.
Theorem
For any irreducible ergodic Markov Chain, lim_{n→∞} pij^(n) exists and is
independent of i.
Furthermore, lim_{n→∞} pij^(n) = πj > 0, where π = (π0, π1, · · · , πM) satisfies
the steady state equations
π = πP,   Σ_{j=0}^{M} πj = 1.
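A minimal sketch (assuming numpy) of solving the steady state equations by replacing one balance equation with the normalization constraint; the two-state matrix matches the example equations below (p01 = 1, p10 = 0.4, p11 = 0.6):

import numpy as np

P = np.array([[0.0, 1.0],
              [0.4, 0.6]])
m = P.shape[0]

# (P^T - I) pi^T = 0 together with sum(pi) = 1:
A = np.vstack([P.T - np.eye(m), np.ones(m)])
b = np.zeros(m + 1)
b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # approximately [2/7, 5/7] = [0.2857..., 0.7142...]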
Yes: first, p01, p10 > 0, so the states communicate and the chain is
irreducible.
Second, note that p11 > 0, so state 1 is aperiodic, and then the chain is
aperiodic; hence it is ergodic.
Do the steady state equations
π0 = 0.4π1,  π1 = π0 + 0.6π1,  π0 + π1 = 1
have a solution? Yes: the unique solution is π = (2/7, 5/7).
For a chain with an absorbing state, do the steady state equations still
have a solution?
There is a unique solution π = (0, 0, 1, 0).
In the long run the chain will end up in the absorbing state.
Since the chain is not irreducible, πj > 0 will not hold for all j.
    [ p11  p12  p13  p14 ]   [ 0   qA  pA  0  ]
P = [ p21  p22  p23  p24 ] = [ qB  0   0   pB ]
    [ p31  p32  p33  p34 ]   [ 0   0   1   0  ]
    [ p41  p42  p43  p44 ]   [ 0   0   0   1  ]
Does π = πP, Σ_{j} πj = 1 have a solution for this chain?
Theorem
If P is the transition matrix of a finite irreducible chain with period d,
then
1. λ0 = 1 is an eigenvalue of P,
2. the d complex roots of unity
λ0 = 1, λ1 = ω, · · · , λd−1 = ω^(d−1), where ω = e^(2πi/d),
are eigenvalues of P.
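A short sketch (assuming numpy) illustrating the theorem on the 3-state cyclic chain from before (period d = 3): the eigenvalues of P are the three cube roots of unity.

import numpy as np

P = np.array([[0, 1, 0],
              [0, 0, 1],
              [1, 0, 0]], dtype=float)
eigvals = np.linalg.eigvals(P)
print(np.sort_complex(eigvals))
# [-0.5-0.866j, -0.5+0.866j, 1+0j], i.e. 1, omega, omega^2 with omega = e^(2*pi*i/3)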
Let µij = the expected number of steps from state i to the first visit to state j.
Then for fixed j and i = 0, 1, · · · , m,
µij = E(# steps) = Σ_{k=0}^{m} E(# steps | first step is i to k) pik
    = 1·pij + Σ_{k≠j} (1 + µkj) pik = 1 + Σ_{k≠j} µkj pik.
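These equations form a linear system for fixed j. A sketch (assuming numpy; the 2×2 matrix reuses the earlier illustrative example) of solving it by zeroing out column j of P:

import numpy as np

def mean_passage_times(P, j):
    """mu[i] = expected number of steps from i to the first visit of j
    (mu[j] is the mean recurrence time of j)."""
    m = P.shape[0]
    Pj = P.copy()
    Pj[:, j] = 0.0                      # remove transitions that end the walk
    return np.linalg.solve(np.eye(m) - Pj, np.ones(m))

P = np.array([[0.0, 1.0],
              [0.4, 0.6]])
print(mean_passage_times(P, 0))  # mu_00 = 3.5 (= 1/pi_0 = 7/2), mu_10 = 2.5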
Let fik = the probability of absorption in state k given that the system starts
in state i.
Then for fixed k and i = 0, 1, · · · , m,
fik = Σ_{j=0}^{m} Pr(absorption in k | first step is i to j) pij = Σ_{j=0}^{m} pij fjk,
with the boundary conditions fkk = 1 and fik = 0 for any other absorbing state i.
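A sketch (assuming numpy) of solving these equations for the 4-state matrix above, with illustrative values qA = 0.6, pA = 0.4, qB = 0.7, pB = 0.3 (states 3 and 4 absorbing):

import numpy as np

qA, pA, qB, pB = 0.6, 0.4, 0.7, 0.3
P = np.array([[0.0, qA, pA, 0.0],
              [qB, 0.0, 0.0, pB],
              [0.0, 0.0, 1.0, 0.0],
              [0.0, 0.0, 0.0, 1.0]])

transient, absorbing = [0, 1], [2, 3]
Q = P[np.ix_(transient, transient)]   # transitions among transient states
R = P[np.ix_(transient, absorbing)]   # transient -> absorbing transitions
# f = Q f + R  =>  (I - Q) f = R:
F = np.linalg.solve(np.eye(len(transient)) - Q, R)
print(F)  # row i: absorption probabilities from transient state i; rows sum to 1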
Markov processes
The Markovian property in continuous time says that
“The conditional probability of a future event depends only on the
present state and not on all past states”
Definition
A stochastic process {X(t)} is said to be a Markov process if it has the
Markovian property.
Stationarity
Definition
A stochastic process {X(t)} has stationary transition probabilities if
pij(t) = Pr(X(s + t) = j | X(s) = i) does not depend on s.
Theorem (Chapman-Kolmogorov)
For any s, t ≥ 0, it holds that P(s + t) = P(s)P(t).
Let
pi(t) = Pr(X(t) = i).
Then
pi(t + s) = Σ_{k=0}^{m} Pr(X(t + s) = i | X(t) = k) Pr(X(t) = k) = Σ_{k=0}^{m} pki(s) pk(t),
i.e., p(t + s) = p(t)P(s) if p(t) = (p0(t), p1(t), · · · , pm(t)).
Hence p(t) = p(0)P(t).
Let Q = [qij] be the transition rate matrix, where qij = p′ij(0). Then, for small h,
pii(h) ≈ 1 + h qii
is the probability of no jump in the next time interval of length h.
Note: qii ≤ 0 is necessary to get a probability.
And
pij(h) ≈ h qij
is the probability of a jump from i to j in the next time interval of length h.
Note: qij ≥ 0 is necessary to get a probability.
Transition rates
qij can be interpreted as the transition rate from state i to j,
i.e. the average number of jumps from i to j in one time unit.
Since qii is negative, one usually defines the transition rate out of i as
qi = −qii = Σ_{j≠i} qij,
i.e. the average number of jumps out from i in one time unit.
Stationarity
The steady state probabilities π of a Markov process satisfy πQ = 0.
Definition
The Markov Chain is irreducible if all states communicate with each
other.
Theorem
For a finite irreducible Markov process there always exist unique
steady state probabilities π that solve the steady state equations
πQ = 0,   Σ_{j=0}^{m} πj = 1.
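A minimal sketch (assuming numpy) of solving πQ = 0, Σ πj = 1; the rate matrix is an illustrative two-state example (rows of Q sum to zero, qii = −qi):

import numpy as np

# jump rate 3 from state 0 to 1, rate 1 from state 1 to 0:
Q = np.array([[-3.0,  3.0],
              [ 1.0, -1.0]])
m = Q.shape[0]

A = np.vstack([Q.T, np.ones(m)])      # pi Q = 0 plus normalization
b = np.zeros(m + 1); b[-1] = 1.0
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)  # [0.25, 0.75]: balance 3*pi_0 = 1*pi_1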
Theorem
If the state space is not finite, the existence of steady-state probabilities is
equivalent to the existence of solutions to the steady state equations.
Let Ti be the time the process stays in state i before the next transition,
and let FTi be its distribution function. Then
F′Ti(t) = lim_{h→0+} (1 − FTi(t)) Pr(Ti ≤ t + h | Ti > t) / h.
The time to next transition
Using that Pr(Ti ≤ t + h | Ti > t) ≈ h qi for small h
gives that
F′Ti(t) = (1 − FTi(t)) qi,
and solving this differential equation gives FTi(t) = 1 − e^(−qi t), i.e., Ti ∈ Exp(qi).
Then, letting h → 0+, we get pij = qij / (−qii) = qij / qi.
Each time the process enters state i it stays there for a stochastic time
Ti before it jumps to a new state.
where
1. Ti ∈ Exp(qi),
2. pij is the probability that the next state is j,
3. the next state visited after state i is independent of the time spent
in state i.
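A simulation sketch (assuming numpy) of exactly this description: hold an Exp(qi) time in state i, then jump to j with probability pij = qij/qi, independently of the holding time. It reuses the illustrative Q from before; the long-run state fractions approach π.

import numpy as np

rng = np.random.default_rng(1)
Q = np.array([[-3.0,  3.0],
              [ 1.0, -1.0]])

def simulate(Q, t_end, state=0):
    time_in_state = np.zeros(Q.shape[0])
    t = 0.0
    while t < t_end:
        q_i = -Q[state, state]
        hold = rng.exponential(1.0 / q_i)              # T_i ~ Exp(q_i)
        time_in_state[state] += min(hold, t_end - t)
        t += hold
        p = Q[state].copy(); p[state] = 0.0; p /= q_i  # jump probabilities
        state = rng.choice(len(p), p=p)
    return time_in_state / t_end

print(simulate(Q, 10_000.0))  # approximately [0.25, 0.75]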
If T ∈ Exp(q), then its density is
fT(t) = q e^(−qt) if t ≥ 0, and fT(t) = 0 if t < 0.
If the event has not happened at time t, the probability that it will
happen in the next ∆t time units is the same as it was when we started
waiting.
Proof: (Property 2 - 17.4)
Pr(T > t + ∆t | T > t) = Pr((T > t + ∆t) ∩ (T > t)) / Pr(T > t) = Pr(T > t + ∆t) / Pr(T > t)
= e^(−q(t+∆t)) / e^(−qt) = e^(−q∆t) = Pr(T > ∆t).
Consider a birth process with constant birth rate q.
Then Pr(X(t) = n) = (qt)^n e^(−qt) / n!,
i.e., X(t) is Poisson distributed with expected value E(X(t)) = qt.
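A sketch (assuming numpy; q = 2 and t = 3 are illustrative) checking that the number of Exp(q) interarrivals completed by time t has mean and variance ≈ qt, as a Poisson distribution should:

import numpy as np

rng = np.random.default_rng(0)
q, t, reps = 2.0, 3.0, 20_000

counts = np.empty(reps)
for r in range(reps):
    # count Exp(q) interarrival times until the total exceeds t:
    total, n = 0.0, 0
    while True:
        total += rng.exponential(1.0 / q)
        if total > t:
            break
        n += 1
    counts[r] = n

print(counts.mean(), counts.var())  # both approximately q*t = 6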
If {Xt }t≥0 is a Poisson process with rate λX and {Yt }t≥0 is a Poisson
process with rate λY , then {Zt = Xt + Yt }t≥0 is a Poisson process with
rate λZ = λX + λY .
If {Zt }t≥0 is a Poisson process with rate λZ , and two new arrival
processes {Xt }t≥0 and {Yt }t≥0 are created by allocating each arrival in
the Z process to the X process with probability pX and otherwise to the
Y process, then the X and Y processes are Poisson processes with
rates λX = pX λZ and λY = (1 − pX )λZ .
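A simulation sketch of the thinning result (assuming numpy; the rate λZ = 5, split probability pX = 0.3, and horizon are illustrative):

import numpy as np

rng = np.random.default_rng(2)
lam_Z, p_X, t_end = 5.0, 0.3, 10_000.0

n_Z = rng.poisson(lam_Z * t_end)     # total number of Z-arrivals by t_end
to_X = rng.random(n_Z) < p_X         # each arrival goes to X w.p. p_X
rate_X = to_X.sum() / t_end
rate_Y = (~to_X).sum() / t_end
print(rate_X, rate_Y)  # approximately p_X*lam_Z = 1.5 and (1-p_X)*lam_Z = 3.5
# Superposition is the reverse direction: rate_X + rate_Y recovers lam_Z.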
What is the mean waiting time until the next car passes the hitchhiker?
W ≈ (1/(n E(Ti))) Σ_{i=1}^{n} (1/2) Ti²  →  (1/(n E(Ti))) (1/2) n E(Ti²) = (1/(2 E(Ti))) 2 E(Ti)² = E(Ti) = 10 min,
using that E(Ti²) = 2 E(Ti)² for exponentially distributed Ti. By memorylessness the mean wait equals the full mean time between cars, E(Ti), not E(Ti)/2.
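A simulation sketch of this calculation (assuming numpy; gaps with mean 10 minutes, as in the example): the length-biased mean wait evaluates the same sum as the display above.

import numpy as np

rng = np.random.default_rng(3)
gaps = rng.exponential(10.0, size=1_000_000)   # T_i ~ Exp with mean 10

# W ~ (1/(n*E(T))) * sum_i T_i^2 / 2, exactly the sum displayed above:
W = (gaps**2 / 2).sum() / (len(gaps) * gaps.mean())
print(W)  # approximately 10, not 5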