PSP Presentation

The document defines a discrete time, discrete state stochastic process and a Markov chain. A Markov chain has the Markov property: future states depend only on the present state, not on past states. The transition probability matrix describes the probabilities of moving between states. Higher-order transition probabilities can be calculated using the Chapman-Kolmogorov equation.


Definition

If we consider a discrete time, discrete state stochastic process {Xn, n ≥ 0}, assume that Xn takes a finite or countable number of possible values; this set of possible values will be denoted by the set of non-negative integers S = {0, 1, 2, ...}.

Here S is the state space. Suppose that

P{Xn+1 = j | X0 = i0, X1 = i1, ..., Xn-1 = in-1, Xn = i} = P{Xn+1 = j | Xn = i}

for all states i0, i1, ..., in-1, i, j and for all n ≥ 0. If this property is satisfied, then this discrete time, discrete state stochastic process is known as a discrete time Markov chain.

This is the Markov property: given the present state, the future state does not depend on the past states. If this property is satisfied by all the states and for all n, the stochastic process is called a Markov process; and since both the parameter (time) space and the state space are discrete, it is called a discrete time Markov chain.
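To make the definition concrete, the following Python sketch (an illustrative example, not taken from the document; the matrix values and the function name are assumptions) simulates a discrete time Markov chain on the state space S = {0, 1, 2}. The next state is sampled using only the row of the transition matrix for the present state, which is exactly the Markov property.

import numpy as np

# Hypothetical transition matrix for a 3-state chain; row i is the
# conditional distribution of X_{n+1} given X_n = i.
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

def simulate_chain(P, x0, n_steps, seed=0):
    # Draw each next state from the present state's row only (Markov property).
    rng = np.random.default_rng(seed)
    states = [x0]
    for _ in range(n_steps):
        states.append(int(rng.choice(len(P), p=P[states[-1]])))
    return states

print(simulate_chain(P, x0=0, n_steps=10))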

Transition probability matrix

Let {Xn, n ≥ 0} be a homogeneous Markov chain with a discrete infinite state space E = {0, 1, 2, . . .}. Then

P{Xn+1 = j | Xn = i} = pij

regardless of the value of n. The transition probability matrix of {Xn, n ≥ 0} is defined as the matrix P = [pij] whose (i, j)-th element is pij, for i, j in E.

In the case where the state space S is finite and equal to {1, 2, . . . , m}, P is m x m dimensional:

P = [ p11  p12  . . .  p1m
      p21  p22  . . .  p2m
      . . .
      pm1  pm2  . . .  pmm ]

The elements of P satisfy

(1)  pij ≥ 0 for all i, j
(2)  the sum of pij over j equals 1 for every i

A square matrix whose elements satisfy (1) and (2) is called a Markov matrix or stochastic matrix.
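As a quick illustration of conditions (1) and (2), the following Python sketch (an assumed helper, not part of the document) checks whether a given square matrix is a stochastic (Markov) matrix: all entries non-negative and every row summing to 1.

import numpy as np

def is_stochastic(P, tol=1e-9):
    # Condition (1): all entries non-negative; condition (2): each row sums to 1.
    P = np.asarray(P, dtype=float)
    return (P.ndim == 2 and P.shape[0] == P.shape[1]
            and bool(np.all(P >= -tol))
            and bool(np.allclose(P.sum(axis=1), 1.0)))

P = [[0.5, 0.3, 0.2],
     [0.1, 0.6, 0.3],
     [0.4, 0.4, 0.2]]
print(is_stochastic(P))   # True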

Higher-Order Transition Probabilities - Chapman-Kolmogorov Equation:

The tractability of Markov chain models is based on the fact that the probability distribution of {Xn, n ≥ 0} can be computed by matrix manipulations. Let P = [pij] be the transition probability matrix of a Markov chain {Xn, n ≥ 0}. Matrix powers of P are defined by

P^n = P^(n-1) P, with P^0 = I (the identity matrix),

and the (i, j)-th element of P^n is the n-step transition probability

pij^(n) = P{Xm+n = j | Xm = i}.

The Chapman-Kolmogorov equation states that

pij^(m+n) = the sum over all k of pik^(m) pkj^(n), that is, P^(m+n) = P^m P^n.

Probability distribution of the Markov chain

If π0 denotes the initial distribution of X0, with π0(i) = P{X0 = i}, then the distribution of Xn is obtained by matrix multiplication:

P{Xn = j} = the sum over all i of π0(i) pij^(n), that is, πn = π0 P^n.
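The following Python sketch (with an assumed 3-state transition matrix and initial distribution) illustrates these matrix manipulations: n-step transition probabilities are obtained as matrix powers of P, the Chapman-Kolmogorov identity is verified numerically, and the distribution of Xn is computed as π0 P^n.

import numpy as np

P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.4, 0.4, 0.2]])

# n-step transition probabilities are the entries of the matrix power P^n.
P2 = np.linalg.matrix_power(P, 2)

# Chapman-Kolmogorov: P^(m+n) = P^m P^n, checked here with m = n = 1.
assert np.allclose(P2, P @ P)

# Distribution of X_n from the initial distribution pi_0: pi_n = pi_0 P^n.
pi_0 = np.array([1.0, 0.0, 0.0])            # assume X_0 = 0 with probability 1
pi_3 = pi_0 @ np.linalg.matrix_power(P, 3)

print(P2)     # two-step transition probabilities p_ij^(2)
print(pi_3)   # distribution of X_3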

CLASSIFICATION OF STATES
