Chapter 1.3 Markov Chain
Recap
• Brief review of the deterministic model
• Unique nature of the deterministic model
Contents:
• Characteristics of Markov analysis
• Applications of Markov analysis
• States and transition probabilities
In a Markov chain, the future depends only upon the present:
NOT upon the past.
Introduction
• The textbook image of a Markov chain is a flea
hopping about at random on the vertices of the
transition diagram, according to the probabilities
shown. The transition diagram below shows a system
with 7 possible states:
• state space S = { 1,2,3,4,5,6,7}
Introduction
• Questions of interest
• Starting from state 1, what is the probability of ever
reaching state 7?
• Starting from state 2, what is the expected time
taken to reach state 4?
• Starting from state 2, what is the long-run proportion
of time spent in state 3?
• Starting from state 1, what is the probability of being
in state 2 at time t ? Does the probability converge
as t→∞, and if so, to what?
Introduction
• A Markov chain is a mathematical model of a
random phenomenon evolving with time in a way
that the past affects the future only through the
present.
• The “time” can be discrete (e.g., the integers),
continuous (e.g., the real numbers), or, more
generally, a totally ordered set.
• In Mathematics, a phenomenon which evolves with
time in a way that only the present affects the
future is called a dynamical system.
Markov chains and Markov processes
• Important classes of stochastic processes are
Markov chains and Markov processes.
• A Markov chain is a discrete-time process for which
the future behavior, given the past and the present,
only depends on the present and not on the past.
• A Markov process is the continuous-time version of
a Markov chain.
• Many queuing models are in fact Markov processes.
Markov chain: characteristics
It is a particular class of probabilistic models known as stochastic
processes.
The transition probabilities between Xt and Xt+n are
Pij(n) = P(Xt+n = j | Xt = i), with Σj Pij = 1 and 0 ≤ Pij ≤ 1.
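These two defining properties are easy to check mechanically. Below is a minimal Python sketch (using NumPy, and using the brand-switching matrix that appears later in this chapter as the test case) that verifies them for a given matrix:

    import numpy as np

    # Transition matrix of the brand-switching example
    # (rows: current brand, columns: next brand).
    P = np.array([[0.60, 0.40],
                  [0.20, 0.80]])

    # Property 1: every entry satisfies 0 <= Pij <= 1.
    assert np.all((P >= 0) & (P <= 1))
    # Property 2: each row sums to 1.
    assert np.allclose(P.sum(axis=1), 1.0)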
Applications of Markov Analysis
Production: helpful in evaluating alternative maintenance
policies, certain classes of inventory and queuing problems,
and inspection analysis.
Marketing: useful in analyzing and predicting customers' buying
behavior in terms of loyalty to a particular product brand,
switching patterns to other brands, and the market share of the
company versus its competitors.
Personnel: determining future staffing requirements of an
organization, taking into consideration retirements, deaths,
resignations, etc.
Finance: customer accounts receivable behavior.
Example: Markov chain in weather prediction
• Design a Markov chain to predict tomorrow's weather using
information from previous days.
Example: the brand-switching problem
Transition probabilities from this month's brand to next month's
brand:

This month \ Next month    Petroco    National
Petroco                      .60        .40
National                     .20        .80

Fig. 1: Transition diagram for the brand-switching problem
Tree of outcomes over three months (shifts) for a customer who
trades with Petroco in month 1:

Month 1    Month 2                 Month 3                 Joint probability
Petroco -- Petroco (P11 = .60) --- Petroco (P11 = .60)     .36
                               --- National (P12 = .40)    .24
        -- National (P12 = .40) -- Petroco (P21 = .20)     .08
                                -- National (P22 = .80)    .32
                                                   Sum =  1.00

Fig. 2: Tree diagram
If a customer trades with Petroco in month 1, the probability that
the customer purchases gasoline from Petroco in month 3 is
.36 + .08 = .44
and the probability of the customer trading with National in
month 3 is
.24 + .32 = .56
Month 1     Month 2                 Month 3                 Joint probability
National -- Petroco (P21 = .20) --- Petroco (P11 = .60)     .12
                                --- National (P12 = .40)    .08
         -- National (P22 = .80) -- Petroco (P21 = .20)     .16
                                 -- National (P22 = .80)    .64
                                                    Sum =  1.00

Fig. 3: Tree diagram
If a customer trades with National in month 1, the probability that
the customer purchases gasoline from Petroco in month 3 is
.12 + .16 = .28
and the probability of the customer trading with National in
month 3 is
.08 + .64 = .72
Solution… cont’d
Vn = Vn−1 P
• Let V represent the vector of state probabilities (for example, V1 =
vector of state probabilities at period n = 1). For a customer who
trades with Petroco in month 1, V1 = [1  0], and

V2 (month 2) = V1 P = [1  0] [.60 .40; .20 .80] = [.60  .40]

V3 (month 3) = V2 P = [.60  .40] [.60 .40; .20 .80] = [.44  .56]

These match the month-3 probabilities .44 and .56 obtained from the
tree diagram in Fig. 2.
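The same month-by-month propagation can be carried out numerically. A minimal NumPy sketch, with the matrix and starting vector taken from the example above:

    import numpy as np

    P = np.array([[0.60, 0.40],   # Petroco  -> (Petroco, National)
                  [0.20, 0.80]])  # National -> (Petroco, National)

    v1 = np.array([1.0, 0.0])     # customer trades with Petroco in month 1
    v2 = v1 @ P                   # [0.60, 0.40]
    v3 = v2 @ P                   # [0.44, 0.56], matching Fig. 2
    print(v2, v3)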
Solution
To determine the steady-state probabilities, we solve the
following equations.
Steady-state probability …cont’d
• From our previous discussion, Vi+1 = Vi P.
For the first row of the matrix:
P11(i+1) = .60 P11(i) + .20 P12(i)
P12(i+1) = .40 P11(i) + .80 P12(i)
Once steady state is reached, P11(i+1) = P11(i) and P12(i+1) = P12(i), so:
P11 = .6 P11 + .2 P12 ………………(1)
P12 = .4 P11 + .8 P12 ………………(2)
P11 + P12 = 1.0 ………………(3)
P11 = 1.0 − P12 ………………(4)
Substituting (eq. 4) into (eq. 2) gives P11 = .33 and P12 = .67.
For the second row of the matrix:
P21(i+1) = .60 P21(i) + .20 P22(i)
P22(i+1) = .40 P21(i) + .80 P22(i)
Solving in the same manner as for row 1, the values for row 2 are:
P21 = .33, P22 = .67
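These steady-state equations can also be solved numerically. A small NumPy sketch, stacking the balance equations with the normalization condition and solving the resulting linear system:

    import numpy as np

    P = np.array([[0.60, 0.40],
                  [0.20, 0.80]])

    # Stack (P^T - I) pi = 0 with the normalization pi_1 + pi_2 = 1,
    # then solve the overdetermined system by least squares.
    A = np.vstack([P.T - np.eye(2), np.ones((1, 2))])
    b = np.array([0.0, 0.0, 1.0])
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    print(pi)  # approximately [0.333, 0.667]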
• Example 3: inventory model
• A camera store stocks a particular model camera that can be
ordered weekly. Let D1, D2, … represent the demand for this
camera (the number of units that would be sold if the inventory
is not depleted) during the first week, second week, …,
respectively. It is assumed that the Di’s are independent and
identically distributed random variables having a Poisson
distribution with a mean of 1. Let X0 represent the number of
cameras on hand at the outset, X1 the number of cameras on
hand at the end of week 1, X2 the number of cameras on hand
at the end of week 2, and so on.
– Assume that X0 = 3.
– On Saturday night the store places an order that is delivered
in time for the next opening of the store on Monday.
– The store uses the following order policy: if there are no
cameras in stock, 3 cameras are ordered. Otherwise, no
order is placed.
– Sales are lost when demand exceeds the inventory on hand.
• Draw the transition diagram and show the one-step transition
matrix.
Inventory model continued
• A random variable X satisfies the Poisson Distribution if
• 1. The mean number of occurrences of the event, m, over a
fixed interval of time or space is a constant. This is called
the average characteristic. (This implies that the number of
occurrences of the event over an interval is proportional to
the size of the interval.)
• 2. The occurrence of the event over any interval is
independent of what happens in any other non-
overlapping interval.
If X follows a Poisson distribution with mean m,
the probability of x events occurring is given
by the formula P(X = x) = e^(−m) m^x / x!, for x = 0, 1, 2, …
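A short Python sketch of this formula, reproducing the demand probabilities used below in the inventory example (the helper name poisson_pmf is ours, for illustration):

    from math import exp, factorial

    def poisson_pmf(x: int, m: float) -> float:
        # P(X = x) = e^(-m) m^x / x!
        return exp(-m) * m**x / factorial(x)

    # Demand probabilities for the inventory example (mean m = 1):
    print(round(poisson_pmf(0, 1), 3))  # 0.368
    print(round(poisson_pmf(1, 1), 3))  # 0.368
    print(round(poisson_pmf(2, 1), 3))  # 0.184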
Inventory model continued
• Xt is the number of cameras in stock at the end
of week t (as defined earlier), where Xt
represents the state of the system at time t
• Given that Xt = i, Xt+1 depends only on Dt+1 and Xt
(Markovian property)
• A Poisson random variable can take an infinite
number of values. Since the sum of the
probabilities of all the outcomes is 1 and if, for
example, you require the probability of 2 or
more events, you may obtain this from the
identity P(X ≥ 2)= 1- p(0) – p(1)
Inventory model continued
• Dt has a Poisson distribution with mean equal
to one. This means that P(Dt+1 = n) = e^(−1) 1^n / n!
for n = 0, 1, …
• P(Dt = 0) = e^(−1) = 0.368
• P(Dt = 1) = e^(−1) = 0.368
• P(Dt = 2) = (1/2) e^(−1) = 0.184
• P(Dt ≥ 3) = 1 − P(Dt ≤ 2) = 1 − (.368 + .368 + .184) = 0.080
• Xt+1 = max(3 − Dt+1, 0) if Xt = 0, and
Xt+1 = max(Xt − Dt+1, 0) if Xt ≥ 1, for t = 0, 1, 2, ….
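The state-update rule above translates directly into code. A minimal simulation sketch in Python (the function name next_state and the random seed are our own choices, for illustration):

    import numpy as np

    def next_state(x: int, demand: int) -> int:
        # Order policy: if stock is 0, three cameras arrive before the
        # week starts; otherwise nothing is ordered.
        # Sales beyond the available stock are lost.
        stock = 3 if x == 0 else x
        return max(stock - demand, 0)

    rng = np.random.default_rng(0)
    x = 3                          # X_0 = 3
    for t in range(1, 6):
        d = rng.poisson(1.0)       # weekly demand D_t, Poisson with mean 1
        x = next_state(x, d)
        print(f"week {t}: demand = {d}, stock = {x}")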
• For the first row of P, we are dealing with a transition
from state Xt = 0 to some state Xt+1.
• As indicated above, Xt+1 = max{3 − Dt+1, 0} if
Xt = 0.
• Therefore, for the transitions to Xt+1 = 3, Xt+1 = 2, or
Xt+1 = 1:
• p03 = P{Dt+1 = 0} = 0.368,
• p02 = P{Dt+1 = 1} = 0.368,
• p01 = P{Dt+1 = 2} = 0.184.
• A transition from Xt = 0 to Xt+1 = 0 implies that the
demand for cameras in week t + 1 is 3 or more after 3
cameras are added to the depleted inventory at the
beginning of the week, so
• p00 = P{Dt+1 ≥ 3} = 0.080. For the other rows of P, the
formula given above for the next state is
Xt+1 = max{Xt − Dt+1, 0} if Xt ≥ 1. This implies that
Xt+1 ≤ Xt, so p12 = 0, p13 = 0, and p23 = 0. For the other
transitions: p10 = P{Dt+1 ≥ 1} = 0.632, p11 = P{Dt+1 = 0} = 0.368,
p20 = P{Dt+1 ≥ 2} = 0.264, p21 = P{Dt+1 = 1} = 0.368,
p22 = P{Dt+1 = 0} = 0.368, and row 3 is identical to row 0, since
3 cameras are likewise on hand at the start of the week.
Inventory Example: (One-Step) Transition Matrix

          0      1      2      3
P =   0  .080   .184   .368   .368
      1  .632   .368   0      0
      2  .264   .368   .368   0
      3  .080   .184   .368   .368
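This matrix can also be generated programmatically from the Poisson demand probabilities and the order policy. A Python sketch (the helper names pmf, tail, and row are ours, for illustration):

    import numpy as np
    from math import exp, factorial

    def pmf(n):                    # P(D = n) for Poisson demand with mean 1
        return exp(-1.0) / factorial(n)

    def tail(n):                   # P(D >= n)
        return 1.0 - sum(pmf(k) for k in range(n))

    def row(s):
        # Effective stock after ordering: 3 if the shelf is empty, else s.
        stock = 3 if s == 0 else s
        r = np.zeros(4)
        r[0] = tail(stock)         # demand of `stock` or more empties the shelf
        for j in range(1, stock + 1):
            r[j] = pmf(stock - j)  # end the week with exactly j cameras
        return r

    P = np.array([row(s) for s in range(4)])
    print(P.round(3))              # matches the matrix above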
Representation of a Markov Chain as a Digraph

       A      B      C      D
A     0.95   0      0.05   0
B     0.2    0.5    0      0.3
C     0      0.2    0      0.8
D     0      0      1      0

The edge labels of the digraph (A→A: 0.95, A→C: 0.05, B→A: 0.2,
B→B: 0.5, B→D: 0.3, C→B: 0.2, C→D: 0.8, D→C: 1) are exactly the
nonzero entries of this matrix.
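The same chain can be stored as an adjacency map, which mirrors the digraph more directly than a dense matrix. A minimal Python sketch:

    # The digraph as an adjacency map: state -> {successor: probability}.
    graph = {
        "A": {"A": 0.95, "C": 0.05},
        "B": {"A": 0.2, "B": 0.5, "D": 0.3},
        "C": {"B": 0.2, "D": 0.8},
        "D": {"C": 1.0},
    }

    # Each state's outgoing probabilities must sum to 1.
    for state, edges in graph.items():
        assert abs(sum(edges.values()) - 1.0) < 1e-9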
a) Develop a Markovian model for the system.
b) Find the transition matrix.
• The transition diagram of the Markov chain
To model the process as a Markov chain, we first define the
states:

State   Description
1       Machine 1
2       Inspection 1
3       Machine 2
4       Inspection 2
5       Machine 3
6       Inspection 3
7       Pack and Ship (absorbing)
8       Scrap Bin (absorbing)
Inventory Example: Two-Step Transition Matrix

             0      1      2      3
P =      0  .080   .184   .368   .368
         1  .632   .368   0      0
         2  .264   .368   .368   0
         3  .080   .184   .368   .368

P(2) = PP:

             0      1      2      3
P(2) =   0  .249   .286   .300   .165
         1  .283   .252   .233   .233
         2  .351   .319   .233   .097
         3  .249   .286   .300   .165
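P(2) is simply the matrix product PP, and higher powers give longer horizons. A NumPy sketch that reproduces the two-step matrix above and also computes the four-step matrix referenced on the next slide:

    import numpy as np

    P = np.array([[.080, .184, .368, .368],
                  [.632, .368, .000, .000],
                  [.264, .368, .368, .000],
                  [.080, .184, .368, .368]])

    P2 = P @ P                           # two-step transition matrix P(2)
    P4 = np.linalg.matrix_power(P, 4)    # four-step transition matrix P(4)
    print(P2.round(3))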
Transition Matrix: Four-Step
State 3 is a transient state: the system will move out of state 3
to state 1 (with a 1.0 probability), but once state 3 is left it
will never be visited again.