
Chapter 12

This document introduces Markov chains: discrete-value random sequences in which the next value depends only on the current value, not on earlier values. A Markov chain is a discrete-value random sequence in which Xn+1 depends on Xn but not on the earlier values X0, . . . , Xn-1. It focuses on the case where each Xn is a discrete random variable that can take the values 0, 1, 2, and so on.


Random Signals and Processes

Chapter 12: Markov Chains

Dr. Mohammad Rakibul Islam


Professor, EEE Department,
Islamic University of Technology
• Introduction:
• We consider a discrete-value random sequence {Xn, n = 0, 1, 2, . . .} that is not an iid random sequence.
• In a Markov chain, Xn+1 depends on Xn but not on the earlier values X0, . . . , Xn-1 of the sequence.
• To keep things reasonably simple, we restrict our attention to the case where each Xn is a discrete random variable with range SX = {0, 1, 2, . . .}.
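The dependence structure above can be illustrated with a short simulation sketch. The three-state transition probabilities below are illustrative assumptions, not values from the chapter; the point is only that the next state is drawn using the current state's row alone, which is exactly the Markov property.

```python
import random

# Hypothetical transition matrix (an assumption for illustration):
# transition_probs[i][j] is P(X_{n+1} = j | X_n = i). Each row sums to 1.
transition_probs = [
    [0.5, 0.3, 0.2],  # transitions out of state 0
    [0.1, 0.6, 0.3],  # transitions out of state 1
    [0.2, 0.2, 0.6],  # transitions out of state 2
]

def simulate_chain(initial_state, num_steps, probs, rng=random):
    """Generate X_0, X_1, ..., X_num_steps for a discrete-value Markov chain."""
    states = [initial_state]
    for _ in range(num_steps):
        current = states[-1]
        # The next state depends only on the current state (not on the
        # earlier history): we sample from row `current` of the matrix.
        next_state = rng.choices(range(len(probs)), weights=probs[current])[0]
        states.append(next_state)
    return states

sample_path = simulate_chain(0, 10, transition_probs)
print(sample_path)  # e.g. a length-11 list of states drawn from {0, 1, 2}
```

Note that the simulator never looks at `states[:-1]` when choosing the next value; deleting the history would not change the distribution of the path, which is the defining feature of a Markov chain.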
