Markov Chains: Stochastic Models
Markov Chains
Note: Random variables are often treated as independent
– Often not the case
– Dependence exists between successive outcomes
• Stochastic Process: an indexed collection of random
variables {X_t}
X_t = state of the system (some measure or
characteristic) at time t
t ∈ T, and T is often the set of non-negative integers
Ex. Daily sequence of high quotations of a particular stock
– Not a series of IID random variables
– More like the following: X_{t+1} = X_t + S_t
The S_t's may be IID, but the X_t's are not
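A minimal Python sketch of this idea (illustrative only; the normal step distribution and the starting price are assumptions, not from the slides). Each X_{t+1} is built from X_t, so the sequence of prices is not independent even though the steps are:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random walk: X_{t+1} = X_t + S_t with IID steps S_t.
# The step distribution (normal) and starting price 100 are assumptions
# for this sketch, not part of the slides.
x0 = 100.0                       # assumed starting price
steps = rng.normal(0, 1, 30)     # S_t's: IID increments
prices = x0 + np.cumsum(steps)   # X_t's: successive values are dependent

print(prices[:5])
```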
Stochastic Models
• Discrete Time
– Time can be regularly spaced (daily, weekly, etc.)
– Can be embedded - occurrences of some phenomenon
in the system (each time the stock price reaches 100,
or each time the inventory level reaches 0)
– Sequence of realizations (i.e., outcomes) is called a
Time Series
Stochastic Models
Stationarity Assumption
P = \begin{pmatrix}
p_{11} & p_{12} & \cdots & p_{1s} \\
p_{21} & p_{22} & \cdots & p_{2s} \\
\vdots & \vdots & \ddots & \vdots \\
p_{s1} & p_{s2} & \cdots & p_{ss}
\end{pmatrix}
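The slide shows only the matrix; for reference, the standard statement of the stationarity (time-homogeneity) assumption behind it is that the one-step transition probabilities do not depend on t:

p_{ij} = P[X_{t+1} = j \mid X_t = i] \quad \text{for all } t,
\qquad \sum_{j=1}^{s} p_{ij} = 1 \ \text{for each } i.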
Stochastic Models
Simple Example
Weather:
P = \begin{pmatrix}
0.4 & 0.6 \\
0.2 & 0.8
\end{pmatrix}
• Note that rows sum to 1
• Such a matrix is called a Stochastic Matrix
• If the rows and the columns of a matrix all sum to 1,
we have a Doubly Stochastic Matrix
• We’ll use this matrix later
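A minimal Python sketch using this matrix (the state labels "rain"/"no rain" and the run length are assumptions for illustration): it checks that the rows sum to 1 and simulates the chain for a few steps.

```python
import numpy as np

# Weather transition matrix from the slide (rows sum to 1).
P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

assert np.allclose(P.sum(axis=1), 1.0)   # stochastic matrix check

# Simulate the chain; state labels are an assumption for illustration.
states = ["rain", "no rain"]
rng = np.random.default_rng(0)
x = 0                                    # start in state 0 ("rain"), assumed
path = [x]
for _ in range(10):
    x = rng.choice(2, p=P[x])            # next state drawn from row x of P
    path.append(x)

print([states[i] for i in path])
```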
Stochastic Models
N Step Transition Probabilities
• Often the probability that the process is in a certain state
n steps from now needs to be computed
– Denoted p_{ij}^{(n)} = Prob{transition from i to j in n steps}
= P[X_{n+m} = j | X_m = i]
Chapman-Kolmogorov Equations
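The equations are not written out on the slide; the standard statement, and the matrix form that follows from it, are:

p_{ij}^{(n)} = \sum_{k=1}^{s} p_{ik}^{(m)}\, p_{kj}^{(n-m)}, \qquad 0 < m < n,

so the n-step transition matrix is the n-th matrix power, P^{(n)} = P^n. A quick Python check with the weather matrix above (illustrative only):

```python
import numpy as np

P = np.array([[0.4, 0.6],
              [0.2, 0.8]])

# Chapman-Kolmogorov in matrix form: the n-step transition matrix is P**n.
P2 = np.linalg.matrix_power(P, 2)
print(P2)   # entry [i, j] = probability of going from i to j in 2 steps
```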
Stochastic Models
Classification of States
[State-transition diagram (figure); only the labels n, r and the transition probabilities .2 and .4 remain from it]
Stochastic Models
Classification of States - Definitions (continued)