
Markov Chains: Stochastic Models

The document discusses stochastic processes and Markov chains. Some key points are: 1) Stochastic processes involve random variables that may depend on previous values, violating independence assumptions. Markov chains are a type of stochastic process where the future state depends only on the present state. 2) Markov chains can be classified by whether states are absorbing, transient, recurrent, or periodic. Communicating states can reach each other. 3) Transition probabilities between states can be represented using matrices. The n-step transition probabilities are given by raising the one-step transition matrix to the nth power.


Stochastic Models

Markov Chains
Note: Random variables are often treated as independent
– Often not the case
– Dependence exists between successive outcomes
• Stochastic Process: an indexed collection of random
variables {Xt}
Xt = state of system (some measure or
characteristic) at time t
t ∈ T, and T is often the set of non-negative integers
Ex. Daily sequence of high quotations of a particular stock
– Not series of IID random variables
– More like the following: Xt+1 = Xt + St
St’s may be IID, but Xt’s are not
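As an illustrative sketch (Python, not part of the original slides), the Xt+1 = Xt + St process above can be simulated directly: the increments St are drawn IID, but the resulting states Xt are dependent on one another.

```python
# Minimal sketch of X_{t+1} = X_t + S_t: IID increments, dependent states.
import random

random.seed(42)  # fixed seed so the example is reproducible

def simulate_prices(x0, n_steps):
    """Simulate a random-walk 'price' path: each step adds an IID increment."""
    path = [x0]
    for _ in range(n_steps):
        s_t = random.choice([-1, 0, 1])  # IID increment S_t
        path.append(path[-1] + s_t)      # X_{t+1} depends on X_t
    return path

path = simulate_prices(100, 10)
# Successive values are dependent: each differs from the last by at most 1.
assert all(a - b in (-1, 0, 1) for a, b in zip(path[1:], path))
```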

Stochastic Models

Time in Stochastic Processes

• Discrete Time
– Time can be regularly spaced (daily, weekly, etc.)
– Can be embedded - occurrences of some phenomenon
in the system (each time the stock price reaches 100,
or each time the inventory level reaches 0)
– Sequence of realizations (i.e., outcomes) is called a
Time Series

• Continuous Time - t can take on any value


Stochastic Models
Markov Property
• For the general case of a discrete-time stochastic process
with Xt = it,

there must be a probability distribution on the sequence

P[X0 = i0, X1 = i1, X2 = i2, . . . Xt = it, . . .]


• Markov Property: means that the state of the system at
time t+1 only depends on the state of the system at t

P[Xt+1 = it+1| Xt = it, Xt-1 = it-1, . . . , X1 = i1, X0 = i0]


= P[Xt+1 = it+1| Xt = it]
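The Markov property can be seen in simulation (an illustrative Python sketch, using a hypothetical two-state chain): the next state is sampled using only the current state's row of transition probabilities, so the history before time t never enters the computation.

```python
# Sketch: one step of a Markov chain depends only on the current state.
import random

random.seed(0)

# Hypothetical two-state chain (states 0 and 1) for illustration.
P = [[0.4, 0.6],
     [0.2, 0.8]]

def next_state(current):
    """Sample X_{t+1} given only X_t = current (the Markov property)."""
    return random.choices([0, 1], weights=P[current])[0]

# Simulate a short trajectory; each step looks only at the last state.
state = 0
trajectory = [state]
for _ in range(5):
    state = next_state(state)
    trajectory.append(state)
```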

Stochastic Models

Stationarity Assumption

• Probabilities are independent of t when the process is
“stationary”

So, P[Xt+1 = j | Xt = i] = pij

This means that if the system is in state i, the probability
that the system will transition to state j is pij no matter
what the value of t is
Stochastic Models
Probabilities
• Initial Probability distribution
– Defines the probability that the system starts in a
particular state
– Represented as: P[X0 = i] = qi and
q = (q1, q2, ..., qs), a vector,
where s = number of states
• Transition Probabilities displayed as an s x s matrix

        [ p11  p12  ...  p1s ]
    P = [ p21  p22  ...  p2s ]
        [  :    :         :  ]
        [ ps1  ps2  ...  pss ]
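A short Python sketch of the objects just defined (the specific numbers are hypothetical): an initial distribution q over s states and an s x s transition matrix P. The distribution over states after one step is the vector-matrix product qP.

```python
import numpy as np

q = np.array([0.5, 0.3, 0.2])           # initial distribution, s = 3 states
P = np.array([[0.7, 0.2, 0.1],          # hypothetical transition matrix
              [0.1, 0.8, 0.1],
              [0.3, 0.3, 0.4]])

assert np.isclose(q.sum(), 1.0)         # q is a probability vector
assert np.allclose(P.sum(axis=1), 1.0)  # each row of P sums to 1

q1 = q @ P                              # P[X1 = j] = sum over i of qi * pij
# q1 is again a probability vector over the s states.
assert np.isclose(q1.sum(), 1.0)
```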

Stochastic Models

Simple Example
Weather:

– If it is raining today, it will rain tomorrow with
probability prr = 0.4
– If it is raining today, it will not rain tomorrow with
probability prn = 0.6
– If it is not raining today, it will rain tomorrow with
probability pnr = 0.2
– If it is not raining today, it will not rain tomorrow
with probability pnn = 0.8
Stochastic Models

Transition Matrix for Example

    P = [ .4  .6 ]
        [ .2  .8 ]
• Note that rows sum to 1
• Such a matrix is called a Stochastic Matrix
• If both the rows and the columns of a matrix sum to 1,
we have a Doubly Stochastic Matrix
• We’ll use this matrix later
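These definitions are easy to check numerically (an illustrative Python sketch): the weather matrix's rows sum to 1, so it is stochastic, but its columns do not, so it is not doubly stochastic.

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

assert np.allclose(P.sum(axis=1), 1.0)      # rows sum to 1: stochastic
assert not np.allclose(P.sum(axis=0), 1.0)  # columns sum to .6 and 1.4: not doubly stochastic

# A hypothetical doubly stochastic matrix for contrast:
D = np.array([[0.5, 0.5],
              [0.5, 0.5]])
assert np.allclose(D.sum(axis=0), 1.0) and np.allclose(D.sum(axis=1), 1.0)
```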

Stochastic Models
N Step Transition Probabilities
• Often the probability that the process is in a certain state
n steps from now needs to be computed
– Denoted Pij(n) = Prob{transition from i to j in n steps}

= P[Xn+m = j|Xm = i]

(Timeline: time 0 → time m, state i → time n+m, state j)


Stochastic Models

Chapman-Kolmogorov Equations

Pij(n + m) = Σk Pik(n) Pkj(m),  for all n, m ≥ 0, and all i, j

• This leads to a very important result:


– Pij(n) is the ijth element of the matrix Pn

– Pn obtained by matrix multiplication of matrix P

– Pn called the n step probability matrix


• Back to weather example:

    P2 = [ .4  .6 ][ .4  .6 ]  =  [ .16+.12  .24+.28 ]  =  [ .28  .72 ]
         [ .2  .8 ][ .2  .8 ]     [ .08+.16  .12+.64 ]     [ .24  .76 ]

    ∴ Prr(2) = 0.28
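The hand computation of P2 can be verified by matrix multiplication (an illustrative Python sketch):

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# P^2 gives the two-step transition probabilities.
P2 = np.linalg.matrix_power(P, 2)
assert np.allclose(P2, [[0.28, 0.72],
                        [0.24, 0.76]])

# Prr(2): probability of rain two days from now, given rain today.
assert np.isclose(P2[0, 0], 0.28)
```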

Stochastic Models

Classification of States

• We now know probabilities associated with states


• We can classify the states of the system
– Whether you can get from one state to another
– Whether you can return to a state
• To help in classifying states, we use a state diagram
From the weather example:
[State diagram: states r and n; arcs r→r = .4, r→n = .6, n→r = .2, n→n = .8]
Stochastic Models

Classification of States - Definitions


• Def: A Path from state i to state j is a sequence of
transitions from i to j that has positive probability, i.e.,
Pij(n) > 0 for some n.
• Def: State j is Reachable from state i if there is a path
from i to j
• Def: Two states, i and j, Communicate if j is reachable
from i, and i is reachable from j.
• Def: A set of states S in a Markov Chain is a closed set if
no state outside of S is reachable from any state in S.
• Def: A state i is an absorbing state if pii = 1 (closed set
with 1 member)
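Reachability and communication can be tested mechanically (an illustrative Python sketch on a hypothetical 3-state chain): j is reachable from i if Pij(n) > 0 for some n, and for an s-state chain it suffices to check paths of length up to s - 1, e.g. via powers of the boolean adjacency matrix.

```python
import numpy as np

# Hypothetical 3-state chain: state 2 is absorbing (p22 = 1).
P = np.array([[0.5, 0.3, 0.2],
              [0.4, 0.4, 0.2],
              [0.0, 0.0, 1.0]])

s = P.shape[0]
A = (P > 0).astype(int)                 # adjacency: which one-step moves have positive probability
# (I + A)^(s-1) > 0 marks all pairs (i, j) with a path of length <= s-1.
R = np.linalg.matrix_power(np.eye(s, dtype=int) + A, s - 1) > 0

def communicate(i, j):
    """States i and j communicate if each is reachable from the other."""
    return R[i, j] and R[j, i]

assert communicate(0, 1)            # 0 and 1 reach each other
assert R[0, 2] and not R[2, 0]      # 2 is reachable from 0, but not back
assert P[2, 2] == 1.0               # state 2 is absorbing
```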

Stochastic Models
Classification of States - Definitions
(continued)

• Example of Absorbing State - The Gambler’s Ruin


– At each play we have the following:
• Gambler wins $1 with probability p, or
• Gambler loses $1 with probability 1-p
– Game ends when gambler goes broke, or gains a
fortune of $N
– Then both $0 and $N are absorbing states
• Def: A state i is a transient state if there exists a state j
that is reachable from i, but i is not reachable from j.
• Def: A state is recurrent if it is not transient
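The Gambler's Ruin chain above can be written out for a small fortune, say N = 4 with p = 0.5 (an illustrative Python sketch): the states are the gambler's fortune 0..N, with $0 and $N absorbing and the interior states transient.

```python
import numpy as np

N, p = 4, 0.5
P = np.zeros((N + 1, N + 1))
P[0, 0] = 1.0            # broke: absorbing state
P[N, N] = 1.0            # fortune of $N reached: absorbing state
for i in range(1, N):
    P[i, i + 1] = p      # win $1 with probability p
    P[i, i - 1] = 1 - p  # lose $1 with probability 1-p

assert np.allclose(P.sum(axis=1), 1.0)    # stochastic matrix
assert P[0, 0] == 1.0 and P[N, N] == 1.0  # both endpoints absorbing
# Interior states are transient: state 0 is reachable from state 1,
# but no state is reachable from state 0.
assert P[1, 0] > 0 and np.all(P[0, 1:] == 0)
```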
Stochastic Models

Classification of States - Definitions


(continued)

• Def: State i is periodic with period k>1 if k is the


smallest number such that all paths leading from state i
back to state i have a length which is a multiple of k (a
recurrent state that is not periodic is called aperiodic)

• Def: If all states in a Markov Chain are recurrent,


aperiodic, and communicate with one another (a “nice”
chain), then the Markov Chain is said to be Ergodic
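The period definition can be checked numerically (an illustrative Python sketch; the `period` helper is hypothetical): the period of state i is the gcd of all n with Pii(n) > 0. A two-state chain that alternates deterministically has period 2, while the weather chain is aperiodic.

```python
from math import gcd
import numpy as np

def period(P, i, max_n=20):
    """gcd of all step counts n <= max_n at which a return to state i is possible."""
    returns = [n for n in range(1, max_n + 1)
               if np.linalg.matrix_power(P, n)[i, i] > 0]
    g = 0
    for n in returns:
        g = gcd(g, n)  # gcd(0, n) == n, so the first return seeds g
    return g

flip = np.array([[0.0, 1.0],
                 [1.0, 0.0]])   # alternates deterministically between two states
weather = np.array([[0.4, 0.6],
                    [0.2, 0.8]])

assert period(flip, 0) == 2     # returns only possible in even numbers of steps
assert period(weather, 0) == 1  # prr > 0 allows a return in one step: aperiodic
```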
