Markov Chains

This document introduces Markov chains and stochastic processes. It defines a stochastic process as a collection of random variables indexed by a set T. If T is countable, it is a discrete-time process, and if T is an interval of real numbers, it is continuous. The document then provides the formal definitions of a Markov chain as a stochastic process where the future state depends only on the present state. Several examples are given to illustrate Markov chains, including weather patterns, genetic inheritance, random walks, and insurance risk classification. Transition matrices and probability calculations involving Markov chains are also demonstrated.


Markov Chains

Math 182 Introduction to Stochastic Processes


Lecture: 01

Definition 1

A stochastic process {X(t), t ∈ T} is a collection of random variables.

The set T is called the index set of the process. When T is countable, the
process is said to be a discrete-time stochastic process. If T is an interval
of the real line, the process is said to be a continuous-time stochastic
process. The set of all possible values that the random variables can assume
is called the state space of the process. The elements of the state space are
referred to as the states of the system.

Definition 2

Consider a discrete-time stochastic process {X_n, n = 0, 1, 2, ...} with state
space {0, 1, 2, ...}. The stochastic process is said to be a Markov chain if

    Pr{X_{n+1} = j | X_n = i, X_{n-1} = i_{n-1}, ..., X_0 = i_0}
        = Pr{X_{n+1} = j | X_n = i} = p_ij

for all states i_0, ..., i_{n-1}, i, j and all n >= 0.
Note that p_ij >= 0 and sum_j p_ij = 1 for every state i.


Definition 3

The values p_ij are called transition probabilities and the matrix

        | p_00  p_01  p_02  ... |
    P = | p_10  p_11  p_12  ... |
        | p_20  p_21  p_22  ... |
        |  ...   ...   ...      |

is called the transition matrix for the Markov chain.
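The definition above says each row of a transition matrix collects the probabilities of the next state given the current one, so its entries are nonnegative and each row sums to 1. A minimal validity check in Python (the function name is illustrative):

```python
def is_stochastic(P, tol=1e-9):
    """Return True if P is a valid transition matrix:
    nonnegative entries and each row summing to 1."""
    return all(
        all(p >= 0 for p in row) and abs(sum(row) - 1.0) < tol
        for row in P
    )

# A valid 3-state transition matrix (each row sums to 1)
P = [[0.1, 0.2, 0.7],
     [0.9, 0.1, 0.0],
     [0.1, 0.8, 0.1]]
```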



Example 1
The Land of Oz is blessed by many things, but not by
good weather. They never have two nice days in a row.
If they have a nice day, they are just as likely to have
snow as rain the next day. If they have snow or rain,
they have an even chance of having the same the next
day. If there is a change from snow or rain, only half
of the time is this a change to a nice day. Represent the
successive weather in the Land of Oz by a Markov
chain.
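The description translates into a 3x3 transition matrix. A sketch, with the state ordering (rain, nice, snow) chosen here purely for illustration:

```python
# States: 0 = rain, 1 = nice, 2 = snow (this ordering is a choice made here).
P_oz = [
    [1/2, 1/4, 1/4],  # after rain: same weather 1/2; a change is nice only half the time
    [1/2, 0.0, 1/2],  # after a nice day: never nice again, rain and snow equally likely
    [1/4, 1/4, 1/2],  # after snow: same weather 1/2; a change is nice only half the time
]
```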


Example 2 (Ehrenfest Model)


We have two urns that, between them, contain four
balls. At each step, one of the four balls is chosen at
random and moved from the urn that it is in into the
other urn. Let Xn be the number of balls in the first urn
after n steps. Specify the transition matrix.
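From state i, the ball chosen at random is in the first urn with probability i/4 (moving the chain to i-1) and in the other urn with probability (4-i)/4 (moving it to i+1). A sketch of the resulting transition matrix:

```python
# Ehrenfest model with N = 4 balls; state i = number of balls in the first urn.
N = 4
P_ehrenfest = [[0.0] * (N + 1) for _ in range(N + 1)]
for i in range(N + 1):
    if i > 0:
        P_ehrenfest[i][i - 1] = i / N        # chosen ball was in the first urn
    if i < N:
        P_ehrenfest[i][i + 1] = (N - i) / N  # chosen ball was in the other urn
```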


Example 3 (Gene Model)


The simplest type of inheritance of traits in animals
occurs when a trait is governed by a pair of genes, each
of which may be of two types, say G and g. An
individual is called dominant if he or she has GG genes,
recessive if he or she has gg, and hybrid with a Gg
mixture. In the mating of a dominant (recessive) and a
hybrid animal, each offspring must get a G (g) gene
from the former and has an equal chance of getting G or
g from the latter. In the mating of two hybrids, the
offspring has an equal chance of getting G or g from
each parent. Consider a process of continued matings
and assume that there is at least one offspring. An
offspring is chosen at random and is mated with a
hybrid and this process repeated through a number of
generations. Represent the genetic type of the chosen
offspring in successive generations by a Markov chain.
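Tracking the genotype of the chosen offspring when it is always mated with a hybrid gives a 3-state chain. A sketch, with states ordered (GG, Gg, gg):

```python
# States: 0 = GG (dominant), 1 = Gg (hybrid), 2 = gg (recessive).
# Each row gives the offspring-type probabilities when that type is mated with a hybrid.
P_gene = [
    [1/2, 1/2, 0.0],  # GG x Gg: G from the GG parent, G or g equally from the hybrid
    [1/4, 1/2, 1/4],  # Gg x Gg: G or g equally and independently from each parent
    [0.0, 1/2, 1/2],  # gg x Gg: g from the gg parent, G or g equally from the hybrid
]
```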


Example 4 (Drunkard's Walk)


A man walks along a four-block stretch of Park Avenue.
If he is at corner 1,2, or 3, then he walks to the left or
right with equal probability. He continues until he
reaches corner 4, which is a bar, or corner 0, which is
his home. If he reaches either home or bar, he stays
there. Represent his location at each period by a Markov
chain.
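With the five corners 0 through 4 as states, corners 0 (home) and 4 (bar) are absorbing, and each interior corner moves left or right with probability 1/2. A sketch:

```python
# States 0..4 are the five corners; 0 (home) and 4 (bar) are absorbing.
P_walk = [[0.0] * 5 for _ in range(5)]
P_walk[0][0] = 1.0           # home: he stays there
P_walk[4][4] = 1.0           # bar: he stays there
for i in (1, 2, 3):          # interior corners: left or right with equal probability
    P_walk[i][i - 1] = 1/2
    P_walk[i][i + 1] = 1/2
```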


Example 5 (Bonus-Malus)


One way automobile insurance premiums are
determined is through the Bonus-Malus system. Each
policyholder is given an integer-valued state and the
annual premium is a function of this state. A
policyholder's state changes from year to year in
response to the number of claims made by that
policyholder. Because lower-numbered states
correspond to lower premiums, a policyholder's state
will usually decrease if he/she had no accidents the
previous year and increase otherwise.
Given the following Bonus-Malus system:

                         Next state if
    State   Premium   0 claims   1 claim   2 claims   >2 claims
      1       200         1          2          3          4
      2       250         1          3          4          4
      3       400         2          4          4          4
      4       600         3          4          4          4

If the number of claims is Poisson distributed with mean
1, specify the transition matrix for a policyholder's state
from one year to the next.
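With mean-1 Poisson claims, the probability of k claims in a year is a_k = e^(-1)/k!. A sketch of the resulting transition matrix, with the next-state rules from the table above encoded in a dictionary (variable names are illustrative):

```python
import math

# a[k] = probability of k claims in a year, Poisson with mean 1: e^(-1) / k!
a = [math.exp(-1) / math.factorial(k) for k in range(3)]  # a_0, a_1, a_2

# next_state[i] = next state from state i after 0, 1, 2, and >2 claims
next_state = {
    1: [1, 2, 3, 4],
    2: [1, 3, 4, 4],
    3: [2, 4, 4, 4],
    4: [3, 4, 4, 4],
}

P_bm = [[0.0] * 4 for _ in range(4)]
for i in range(1, 5):
    for k in range(3):
        P_bm[i - 1][next_state[i][k] - 1] += a[k]
    # all remaining probability mass (3 or more claims) follows the ">2 claims" rule
    P_bm[i - 1][next_state[i][3] - 1] += 1 - sum(a)
```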

Example 6
Let Zn represent the outcome during the nth roll of a fair
die. Define Xn to be the maximum outcome obtained so
far after the nth roll, i.e., Xn = max{Z1, Z2, ..., Zn}. Specify
the transition matrix for {Xn}.
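From state i, the maximum stays at i whenever the next roll shows i or less (probability i/6), and jumps to each j > i with probability 1/6. A sketch using exact fractions:

```python
from fractions import Fraction

# State i = current maximum of the rolls so far, i = 1..6 (index i-1).
P_max = [[Fraction(0)] * 6 for _ in range(6)]
for i in range(1, 7):
    P_max[i - 1][i - 1] = Fraction(i, 6)      # next roll <= i: maximum unchanged
    for j in range(i + 1, 7):
        P_max[i - 1][j - 1] = Fraction(1, 6)  # next roll j > i: new maximum is j
```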


Remark 1

To give probabilities for a Markov chain, we need to give an initial
probability distribution α_i = Pr{X0 = i} and the transition matrix P,
for then

    Pr{X0 = i_0, X1 = i_1, ..., Xn = i_n}
        = α_{i_0} p_{i_0 i_1} p_{i_1 i_2} ... p_{i_{n-1} i_n}.


Example 7

A Markov chain has the transition matrix

        | .1  .2  .7 |
    P = | .9  .1   0 |
        | .1  .8  .1 |

and initial distribution α_0 = 0.3, α_1 = 0.4, and α_2 = 0.3.
Determine Pr{X0 = 0, X1 = 2, X2 = 2}, Pr{X2 = 2, X3 = 1 | X1 = 0},
and Pr{X2 = 1 | X0 = 2}.
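These probabilities can be computed directly from the matrix and initial distribution; a quick numerical check in Python:

```python
P = [[0.1, 0.2, 0.7],
     [0.9, 0.1, 0.0],
     [0.1, 0.8, 0.1]]
alpha = [0.3, 0.4, 0.3]

# Pr{X0=0, X1=2, X2=2} = alpha_0 * p_02 * p_22
p_a = alpha[0] * P[0][2] * P[2][2]

# Pr{X2=2, X3=1 | X1=0} = p_02 * p_21 by the Markov property
p_b = P[0][2] * P[2][1]

# Pr{X2=1 | X0=2} = (P^2)_21, the two-step transition probability
p_c = sum(P[2][k] * P[k][1] for k in range(3))
```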

Example 8

A Markov chain has the transition matrix

        | 1/2  1/3  1/6 |
    P = |  0   1/3  2/3 |
        | 1/2   0   1/2 |

If Pr{X0 = 0} = Pr{X0 = 1} = , find E[X2].
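The initial probabilities above are not fully legible; assuming for illustration that Pr{X0 = 0} = Pr{X0 = 1} = 1/4 (so Pr{X0 = 2} = 1/2), E[X2] follows by propagating the distribution two steps and taking the expectation:

```python
from fractions import Fraction as F

P = [[F(1, 2), F(1, 3), F(1, 6)],
     [F(0),    F(1, 3), F(2, 3)],
     [F(1, 2), F(0),    F(1, 2)]]

# Assumed initial distribution (the value is not given above): 1/4, 1/4, 1/2
dist = [F(1, 4), F(1, 4), F(1, 2)]

for _ in range(2):  # advance the distribution two steps: dist <- dist * P
    dist = [sum(dist[i] * P[i][j] for i in range(3)) for j in range(3)]

E_X2 = sum(j * dist[j] for j in range(3))  # expectation of X2 over states 0, 1, 2
```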

