Unit 3
RANDOM PROCESSES
B. Thilaka
Applied Mathematics
Random Processes
A family of random variables {X(t, s) : t ∈ T, s ∈ S}, defined over a given probability space and indexed by the parameter t, where t varies over the index set T, is known as a random process/ chance process/ stochastic process.
X : S × T → R
X(s, t) = x
Notation: X(t)
Random Processes
The values assumed by the random
variables X(t) are called the states and the
set of all possible values is called the state
space of the random process.
Observations:
1. s and t are fixed : X(t) real number
2. t is fixed : X(t) random variable
3. s is fixed : X(t) - function of time
called the sample function or realisation of
the process
4. s and t are varying : X(t) random
process
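Illustration (a minimal Python sketch; the process X(t) = A cos t, with the amplitude A depending on the outcome s, is an assumed example):

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 200)            # index set T (discretised)
A = rng.normal(size=5)                     # one amplitude per outcome s
X = np.array([a * np.cos(t) for a in A])   # X[s_index, t_index]

print(X[2])       # s fixed: one sample function (realisation)
print(X[:, 50])   # t fixed: one random variable (5 observed values)
print(X[2, 50])   # s and t fixed: a single real number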
Classification
Three types
Type I
State Space, Parameter Space:
Discrete/ Continuous
Classification
S →          Discrete                     Continuous
T ↓
Discrete     Discrete random sequence     Continuous random sequence
Continuous   Discrete random process      Continuous random process
Classification
Predictable/ Deterministic:
Future values can be predicted from past values.
e.g. X(t) = A cos(ωt + θ), where any non-empty combination of A, ω, θ may be random variables
Unpredictable/ Non-deterministic:
Future values cannot be predicted from past values.
e.g. Brownian motion
Classification
Stationarity
First order distribution function:
F_X(x1; t1) = P[X(t1) ≤ x1]
First order density function:
f_X(x1; t1) = ∂F_X(x1; t1)/∂x1
Statistical Average
The statistical average of the random
process {X(t)} is defined as
μ(t) = E[X(t)] = ∫_{-∞}^{∞} x1 f_X(x1; t) dx1
provided the quantity on the RHS exists.
First order stationary process
The random process {X(t)} is said to be a
first order stationary process / stationary to
order one if the first order density/
distribution function is invariant with respect
to a shift in the time origin
f_X(x1; t1) = f_X(x1; t1 + Δ)   ∀ x1, t1, Δ
(or)
F_X(x1; t1) = F_X(x1; t1 + Δ)   ∀ x1, t1, Δ
First order stationary process
Result:
The statistical mean of a first order
stationary random process is a constant
with respect to time.
Proof:
Let the random process {X(t)} be a first
order stationary process .
Then its first order density function
satisfies the property that
First order stationary process
f_X(x1; t1) = f_X(x1; t1 + Δ)   ∀ x1, t1, Δ
The mean of the random process {X(t)} at time t1 is defined as
E[X(t1)] = ∫_{-∞}^{∞} x1 f_X(x1; t1) dx1
First order stationary process
Consider
E[X(t2)] = ∫_{-∞}^{∞} x2 f_X(x2; t2) dx2
Let t2 = t1 + Δ. Then
E[X(t2)] = ∫_{-∞}^{∞} x2 f_X(x2; t1 + Δ) dx2
= ∫_{-∞}^{∞} x2 f_X(x2; t1) dx2   (first order stationarity)
E[X(t2)] = E[X(t1)]
Hence E[X(t)] is a constant with respect to
time.
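A Monte Carlo check of this result (the first order stationary process X(t) = cos(t + θ), with θ ~ U(0, 2π), is an assumed example):

import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0.0, 2.0 * np.pi, size=200_000)

# E[X(t)] estimated at several times: all values are close to the same
# constant (here 0), as the result above predicts.
for t in (0.0, 1.7, 42.0):
    print(t, np.cos(t + theta).mean())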
Second order Distribution
function
The second order joint distribution function
of a random process {X(t)} is defined as
F_X(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2]
The second order joint density function of
the random process {X(t)} is defined as
f_X(x1, x2; t1, t2) = ∂²F_X(x1, x2; t1, t2)/∂x1∂x2
Second order stationary
process
A random process {X(t)} is said to be a
second order stationary process/
stationary to order two if its second order
distribution/ density function is invariant
with respect to a shift in the time origin.
In other words,
F_X(x1, x2; t1, t2) = F_X(x1, x2; t1 + Δ, t2 + Δ)   ∀ x1, x2, t1, t2, Δ
(and / or)
f_X(x1, x2; t1, t2) = f_X(x1, x2; t1 + Δ, t2 + Δ)   ∀ x1, x2, t1, t2, Δ
Second order Processes
Auto-correlation function:
The auto-correlation function of a random
process {X(t)} is defined as
R_XX(t1, t2) = E[X(t1) X(t2)]
provided the quantity on the RHS exists.
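A sketch of estimating R_XX(t1, t2) by averaging over many realisations (the process X(t) = A cos t with A ~ N(0, 1) is an assumed example, for which R_XX(t1, t2) = E[A²] cos t1 cos t2 = cos t1 cos t2):

import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=500_000)       # one amplitude per realisation
t1, t2 = 0.5, 2.0

R_hat = np.mean((A * np.cos(t1)) * (A * np.cos(t2)))   # E[X(t1)X(t2)]
print(R_hat, np.cos(t1) * np.cos(t2))                  # estimate vs exact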
Second order Processes
Significance of auto-correlation function
1. It provides a measure of similarity
between two observations of the random
process {X(t)} at different points of time t1
and t2.
2. It also describes how similar a signal is to a time-shifted version of itself.
Second order Processes
Auto-covariance function:
The auto-covariance function of a random process {X(t)} is defined as
C_XX(t1, t2) = E{[X(t1) - E(X(t1))][X(t2) - E(X(t2))]}
             = R_XX(t1, t2) - E[X(t1)] E[X(t2)]
Wide sense stationary (WSS) process:
A random process {X(t)} is WSS if E[X(t)] is a constant and R_XX(t, t+τ) = R_XX(τ), a function of the time difference τ alone.
Remark: A second order stationary process
is a WSS process, but the converse need
not be true.
nth order distribution/ density
function
The nth order joint density function of a random process {X(t)} is defined, in terms of the nth order joint distribution function, as
f_X(x1, x2, .., xn; t1, t2, .., tn) = ∂ⁿF_X(x1, x2, .., xn; t1, t2, .., tn)/∂x1∂x2…∂xn
nth order stationary process
A random process {X(t)} is said to be nth order
stationary/ stationary to order n if the nth order
density/ distribution function is invariant with
respect to a shift in the time origin
(i.e.)
f_X(x1, x2, .., xn; t1, t2, .., tn) = f_X(x1, x2, .., xn; t1 + Δ, t2 + Δ, .., tn + Δ)
∀ x1, x2, .., xn, t1, t2, .., tn, Δ
(and / or)
F_X(x1, x2, .., xn; t1, t2, .., tn) = F_X(x1, x2, .., xn; t1 + Δ, t2 + Δ, .., tn + Δ)
∀ x1, x2, .., xn, t1, t2, .., tn, Δ
Strictly stationary Process
Remark :
If a random process is stationary to order n, then it is stationary to all orders k ≤ n.
WHY ?
Evolutionary Process
Random processes which are not stationary to any order are called non-stationary/ evolutionary processes.
Further Properties
Cross correlation function:
The cross correlation function of two
random processes {X(t)} and {Y(t)} is
defined as RXY(t1,t2)=E[X(t1)Y(t2)]
Cross covariance function:
The cross covariance function of two random processes {X(t)} and {Y(t)} is defined as
CXY(t1,t2)=E{[X(t1)-E(X(t1))][Y(t2)-E(Y(t2))]}
=RXY(t1,t2)-E[X(t1)]E[Y(t2)]
Further Properties
Two random processes {X(t)} and {Y(t)}
are said to be jointly WSS if
(i) both {X(t)} and {Y(t)} are each WSS
(ii) the cross-correlation function
RXY(t1,t2)=E[X(t1)Y(t2)] is exclusively a
function of the length of the time interval
(t2-t1)
Independent Random Process
A random process {X(t)} is said to be an independent random process if its nth order joint distribution function satisfies the property that
F_X(x1, x2, .., xn; t1, t2, .., tn) = F_X(x1; t1) F_X(x2; t2) … F_X(xn; tn)  ∀ x1, .., xn, t1, .., tn
Time averages:
With t1 = t and t2 = t + τ, the time-average correlation of a sample function x(t) is
A[X(t1) X(t2)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) x(t+τ) dt
Ergodic Process
A random process is said to be ergodic if
its time averages are all equal to the
corresponding statistical averages.
Examples ???
Random Process
Correlation coefficient of a random process :
The correlation coefficient (normalized
auto-covariance function) of a random
process {X(t)} is defined as the correlation
coefficient between its random variables
X(t1) and X(t2) for arbitrary t1 and t2 .
In other words,
ρ_XX(t1, t2) = C_XX(t1, t2) / √(Var[X(t1)] Var[X(t2)])
Binomial Process
P[Sn = k] = nCk p^k (1 - p)^{n-k},  k = 0, 1, …, n
E[Sn] = np
Var[Sn] = np(1 - p) = npq
Can you classify the binomial process even further?
Binomial Process
Consider Sn+1 = Y1 + Y2 + … + Yn+1
= (Y1 + Y2 + … + Yn) + Yn+1
= Sn + Yn+1
Hence
P[Sn+1 = k / Sn = k] = P[Yn+1 = 0] = 1 - p
P[Sn+1 = k+1 / Sn = k] = P[Yn+1 = 1] = p
G_{Sn}(z) = (q + pz)^n
P[Sm = k, Sn = l] = mCk (n-m)C(l-k) p^l (1 - p)^{n-l},  k = 0, 1, .., m;  k ≤ l;  l = 0, 1, .., n;  m ≤ n
Binomial Process
Observation:
The total number of trials T from the
beginning of the process until and
including the first success is a geometric
random variable.
The number of trials after the (i-1)th success up to and including the ith success has the same distribution as T.
Poisson Processes - pmf
Result: For a Poisson process {N(t)} with rate λ, pn(t) = P[N(t) = n] = e^{-λt}(λt)^n/n!.
Proof: Let {N(t)} be a Poisson process with parameter λ. We now consider
pn(t + Δt) = P[N(t + Δt) = n]
= Σ_{k=0}^{n} P[N(t + Δt) = n / N(t) = k] P[N(t) = k]
= Σ_{k=0}^{n} P[N(Δt) = n - k] P[N(t) = k]   (homogeneity)
= P[N(Δt) = 0] P[N(t) = n] + P[N(Δt) = 1] P[N(t) = n - 1] + Σ_{k=0}^{n-2} P[N(Δt) = n - k] P[N(t) = k]
= pn(t) p0(Δt) + pn-1(t) p1(Δt) + Σ_{k=0}^{n-2} pk(t) pn-k(Δt)
= (1 - λΔt) pn(t) + λΔt pn-1(t) + Σ_{k=0}^{n-2} pk(t) o(Δt) + o(Δt)   (regularity)
Dividing throughout by Δt,
[pn(t + Δt) - pn(t)]/Δt = -λ pn(t) + λ pn-1(t) + o(Δt)/Δt
Poisson Processes- pmf
Taking limits as Δt → 0 on both sides of the above equation,
dpn(t)/dt = -λ pn(t) + λ pn-1(t),  n ≥ 1  --------(1)
At n=0, we have
p0(t + Δt) = P[N(t + Δt) = 0]
= P[N(t + Δt) = 0 / N(t) = 0] P[N(t) = 0]
= P[N(t + Δt) - N(t) = 0] P[N(t) = 0]
= P[N(t, t + Δt) = 0] P[N(t) = 0]
= P[N(Δt) = 0] P[N(t) = 0]   (homogeneity)
= [1 - λΔt + o(Δt)] p0(t)   (regularity)
[p0(t + Δt) - p0(t)]/Δt = -λ p0(t) + o(Δt)/Δt
Taking limits as Δt → 0 on both sides of the above equation,
Poisson Processes- pmf
dp0(t)/dt = -λ p0(t)  -----(2)
We now solve the above system of differential-difference equations (1) and (2) subject to the initial conditions
p0(0) = 1, pn(0) = 0 ∀ n ≥ 1  -------(3)
Consider equation (2), namely
dp0(t)/dt = -λ p0(t), subject to p0(0) = 1
dp0(t)/p0(t) = -λ dt
log p0(t) = -λt + c
p0(t) = e^{-λt + c} = K e^{-λt}
At t = 0, p0(0) = 1 = K
p0(t) = e^{-λt}  -----(4a)
Poisson Processes- pmf
Substituting n = 1 in equation (1), we have
dp1(t)/dt = -λ p1(t) + λ p0(t), subject to p1(0) = 0
dp1(t)/dt + λ p1(t) = λ e^{-λt}
Multiplying by the integrating factor e^{∫λ dt} = e^{λt},
p1(t) e^{λt} = ∫ λ e^{-λt} e^{λt} dt + c = λt + c
p1(t) = λt e^{-λt} + c e^{-λt}
At t = 0, p1(0) = 0 gives c = 0, so
p1(t) = e^{-λt}(λt)/1!  -----(4b)
Substituting n = 2 in equation (1) and solving in the same way,
p2(t) e^{λt} = ∫ λ (λt e^{-λt}) e^{λt} dt + c = λ²t²/2 + c
p2(t) = e^{-λt} λ²t²/2 + c e^{-λt}
At t = 0, p2(0) = 0 = c
p2(t) = e^{-λt}(λt)²/2!  -----(4c)
Proceeding in the same manner (or by induction), pn(t) = e^{-λt}(λt)^n/n!, n = 0, 1, 2, …
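A numerical check of this derivation: integrating equations (1) and (2) by Euler steps and comparing with the closed form pn(t) = e^{-λt}(λt)^n/n! (the rate and horizon are illustrative choices):

import math

lam, dt, T, N = 2.0, 1e-4, 1.5, 6
p = [1.0] + [0.0] * N              # p[n] = pn(t); initial conditions (3)

for _ in range(int(T / dt)):
    new = [p[0] - lam * p[0] * dt]                        # Euler step for (2)
    new += [p[n] + (-lam * p[n] + lam * p[n - 1]) * dt    # Euler step for (1)
            for n in range(1, N + 1)]
    p = new

for n in range(N + 1):
    exact = math.exp(-lam * T) * (lam * T) ** n / math.factorial(n)
    print(n, round(p[n], 4), round(exact, 4))             # the columns agree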
Note that E[N(t)/t] = λ for every t, while Var[N(t)/t] = λ/t → 0 as t → ∞.
Hence λ is called the arrival rate of the process.
Poisson Process
Characterization: If {N(t)} is a Poisson process with mean λt, then the occurrence/ inter-arrival time follows an exponential distribution with mean 1/λ.
Proof: Let {N(t)} be a Poisson process with parameter λ. Then
pn(t) = e^{-λt}(λt)^n/n!,  n = 0, 1, 2, …
Poisson Process
If W denotes the time between two successive arrivals/
occurrences of the event, the CDF of W is
F_W(w) = P[W ≤ w] = 1 - P[W > w]
= 1 - P[N(w) = 0]
F_W(w) = 1 - e^{-λw}, w ≥ 0
which is the CDF of an exponential distribution with mean 1/λ. Hence the proof.
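A simulation sketch of this characterization: build the arrival stream from i.i.d. exponential inter-arrival times with mean 1/λ and check that N(t) has mean and variance close to λt (λ and t are illustrative choices):

import numpy as np

rng = np.random.default_rng(4)
lam, t, paths = 3.0, 2.0, 50_000

gaps = rng.exponential(1.0 / lam, size=(paths, 40))   # 40 gaps suffice here
arrivals = gaps.cumsum(axis=1)                        # arrival epochs
N_t = (arrivals <= t).sum(axis=1)                     # N(t) on each path

print(N_t.mean(), N_t.var(), lam * t)                 # both close to λt = 6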
Poisson Process - Properties
Superposition: The sum of n independent Poisson processes {Nk(t)} with arrival rates λk, k = 1, 2, .., n, is a Poisson process with arrival rate λ1 + λ2 + … + λn.
The MGF of Nk(t) is
M_{Nk(t)}(θ) = E[e^{θ Nk(t)}] = e^{λk t (e^θ - 1)}
By the property of moment generating functions, the m.g.f. of the sum N(t) = Σ_{k=1}^{n} Nk(t) is given by
M_{N(t)}(θ) = Π_{k=1}^{n} M_{Nk(t)}(θ) = Π_{k=1}^{n} e^{λk t (e^θ - 1)} = e^{(Σ_{k=1}^{n} λk) t (e^θ - 1)}
which is the MGF of a Poisson process with rate Σ λk.
Proof: HW
Poisson Process
Note:
If N(t) is a Poisson process with mean arrival rate λ, then the time between k successive arrivals/ occurrences follows an Erlang (gamma) distribution with parameters k and λ.
WHY??
Poisson Process
Second order joint p.m.f.:
Given a Poisson process with mean arrival rate λ, the second order joint p.m.f. is obtained as follows:
P[N(t1) = n1, N(t2) = n2]
= P[N(t1) = n1] P[N(t2) - N(t1) = n2 - n1],  t1 < t2, n1 ≤ n2
= P[N(t1) = n1] P[N(t1, t2) = n2 - n1]
= P[N(t1) = n1] P[N(t2 - t1) = n2 - n1]   (homogeneity)
= [e^{-λt1}(λt1)^{n1}/n1!] [e^{-λ(t2-t1)}(λ(t2 - t1))^{n2-n1}/(n2 - n1)!],  t1 < t2, n1 ≤ n2
Poisson Process
Hence
P[N(t1) = n1, N(t2) = n2] = e^{-λt2} λ^{n2} t1^{n1} (t2 - t1)^{n2-n1} / [n1!(n2 - n1)!],  t1 < t2, n1 ≤ n2
= 0, elsewhere
Poisson Process
Similarly the third order joint p.m.f. of the Poisson process with mean arrival rate λ is given by
Poisson Process
P[N(t1) = n1, N(t2) = n2, N(t3) = n3]
= e^{-λt3} λ^{n3} t1^{n1} (t2 - t1)^{n2-n1} (t3 - t2)^{n3-n2} / [n1!(n2 - n1)!(n3 - n2)!],  t1 < t2 < t3, n1 ≤ n2 ≤ n3
= 0, elsewhere
Poisson Process
Auto-correlation function:
If N(t) is a Poisson process with mean arrival rate λ, then
E[N(t)] = λt
Var[N(t)] = λt
E[N²(t)] = Var[N(t)] + {E[N(t)]}² = λt + λ²t²
Poisson Process
The auto-correlation function of N(t) is now
obtained as follows:
R_NN(t1, t2) = E[N(t1) N(t2)]   (by definition)
= E[N(t1){N(t2) - N(t1)} + N²(t1)],  t1 ≤ t2
= E[N(t1) N(t1, t2)] + E[N²(t1)],  t1 ≤ t2
= E[N(t1) N(t2 - t1)] + E[N²(t1)]   (homogeneity)
= E[N(t1)] E[N(t2 - t1)] + E[N²(t1)]   (independent increments)
= λt1 · λ(t2 - t1) + λt1 + λ²t1²
R_NN(t1, t2) = λ²t1t2 + λt1,  t1 ≤ t2
In general, R_NN(t1, t2) = λ²t1t2 + λ min(t1, t2), and hence
C_NN(t1, t2) = R_NN(t1, t2) - E[N(t1)] E[N(t2)] = λ min(t1, t2)
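A Monte Carlo check of R_NN(t1, t2) = λ²t1t2 + λ min(t1, t2), using the exponential-gap construction from the earlier sketch (λ, t1, t2 are illustrative choices):

import numpy as np

rng = np.random.default_rng(5)
lam, t1, t2, paths = 2.0, 1.0, 3.0, 100_000

arrivals = rng.exponential(1.0 / lam, size=(paths, 60)).cumsum(axis=1)
N1 = (arrivals <= t1).sum(axis=1)     # N(t1)
N2 = (arrivals <= t2).sum(axis=1)     # N(t2)

print(np.mean(N1 * N2))                          # estimated R_NN(t1, t2)
print(lam**2 * t1 * t2 + lam * min(t1, t2))      # theory: 12 + 2 = 14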
Poisson Process
The correlation coefficient of a Poisson process N(t) with mean arrival rate λ is given by
ρ_NN(t1, t2) = C_NN(t1, t2) / √(Var[N(t1)] Var[N(t2)])
= λ min(t1, t2) / √(λt1 · λt2)
= min(t1, t2) / √(t1 t2)
Hence ρ_NN(t1, t2) = √(t1/t2),  t1 ≤ t2
Random Processes
Process with stationary increments:
A random process {X(t)} is said to be a
process with stationary increments if the
distribution of the increments X(t+h)-X(t)
depends only on the length h of the
interval and not on end points.
Random Processes
Wiener process/ Wiener-Einstein Process/
Brownian Motion Process:
A stochastic process {X(t)} is said to be a Wiener process with drift μ and variance σ², if
(i) X(0) = 0,
(ii) {X(t)} has stationary independent increments, and
(iii) every increment X(t) - X(s), s < t, is normally distributed with mean μ(t - s) and variance σ²(t - s).
Problems
1. For a Poisson process {N(t)}, show that for s < t the conditional distribution of N(s) given N(t) = n is binomial.
Solution: Consider
P[N(s) = k / N(t) = n] = P[N(s) = k, N(t) = n] / P[N(t) = n]   (defn)
= P[N(s) = k, N(t) - N(s) = n - k] / P[N(t) = n]
= P[N(s) = k] P[N(t - s) = n - k] / P[N(t) = n]   (independent increments, homogeneity)
= [e^{-λs}(λs)^k/k!] [e^{-λ(t-s)}(λ(t - s))^{n-k}/(n - k)!] / [e^{-λt}(λt)^n/n!]
= [n!/(k!(n - k)!)] s^k (t - s)^{n-k} / t^n
= nCk (s/t)^k ((t - s)/t)^{n-k}
P[N(s) = k / N(t) = n] = nCk (s/t)^k (1 - s/t)^{n-k},  k = 0, 1, 2, …, n
which is binomial with parameters n and s/t.
Hence the proof.
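A simulation sketch of the result just proved: conditioned on N(t) = n, the count N(s) behaves like a Binomial(n, s/t) variable (the rate, times and n are illustrative choices):

import numpy as np
from math import comb

rng = np.random.default_rng(6)
lam, s, t, paths = 2.0, 1.0, 4.0, 200_000

arrivals = rng.exponential(1.0 / lam, size=(paths, 60)).cumsum(axis=1)
N_s = (arrivals <= s).sum(axis=1)
N_t = (arrivals <= t).sum(axis=1)

n = 8
sel = N_s[N_t == n]                   # condition on N(t) = n
for k in (0, 2, 4):
    emp = (sel == k).mean()
    exact = comb(n, k) * (s / t) ** k * (1 - s / t) ** (n - k)
    print(k, round(emp, 4), round(exact, 4))    # empirical vs binomial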
Poisson Process
2. Suppose that customers arrive at a
bank according to a Poisson process with
a mean rate of 3 per minute. Find the
probability that during a time interval of 2
minutes (a) exactly 4 customers arrive
(b) more than 4 customers arrive.
Solution: Let N(t) be a Poisson process with mean arrival rate λ.
Given that λ = 3.
Hence P[N(t) = k] = e^{-λt}(λt)^k/k!,  k = 0, 1, 2, …
(a) P[N(2) = 4] = e^{-3(2)}(3 · 2)⁴/4!
= e⁻⁶ 6⁴/4!
≈ 0.1339
Poisson Process
(b) P[N(2) > 4] = 1 - P[N(2) ≤ 4]
= 1 - {P[N(2) = 0] + P[N(2) = 1] + P[N(2) = 2] + P[N(2) = 3] + P[N(2) = 4]}
= 1 - e⁻⁶(1 + 6 + 18 + 36 + 54)
= 1 - 115e⁻⁶ ≈ 0.7149
(For the next parts, the inter-arrival time T is exponential with parameter λ = 2, i.e. f_T(t) = 2e^{-2t}, t ≥ 0.)
(a) P(T > 1) = ∫_1^∞ 2e^{-2t} dt = [-e^{-2t}]_1^∞ = e⁻² = 0.1353
Poisson Process
(b) 2
P(1 T 2) 2e 2t dt
1
2
e 2t
2
2 1
1 e 4 e 2
2 4
e e
0.1353 0.0183
0.1170
Poisson Process
(c) P(T ≤ 4) = 1 - e^{-2(4)} = 1 - e⁻⁸ = 0.9996
The waiting time up to the second arrival, T1 + T2, has the Erlang density
f(t) = λ² t e^{-λt} = 4t e^{-2t},  t ≥ 0
Poisson Process
P[T1 + T2 ≤ 3] = ∫_0^3 4t e^{-2t} dt
= [-2t e^{-2t}]_0^3 + ∫_0^3 2e^{-2t} dt   (integration by parts)
= -6e⁻⁶ + [-e^{-2t}]_0^3
= -6e⁻⁶ - e⁻⁶ + 1
= 1 - 7e⁻⁶
≈ 0.9826
Poisson Process
(iv) Since E[N(t)] = λt, the average number of customers arriving in one hour is E[N(60)] = 2 × 60 = 120.
Hence
P[N1(4) = 10] = e^{-3(4)}(3 × 4)^{10}/10! = e^{-12}(12)^{10}/10!
Poisson Process
5. A machine goes out of order whenever
a component fails. The failure of this part
follows a Poisson process with a mean
rate of 1 per week. Find the probability that
2 weeks have elapsed since last failure. If
there are 5 spare parts of this component
in an inventory and that the next supply is
not due in 10 weeks, find the probability
that the machine will not be out of order in
the next 10 weeks.
Poisson Process
Solution:
Let X(t) denote the number of failures of
the component in t units of time.
Then X(t) follows a Poisson process with mean failure rate = mean number of failures in a week = λ = 1.
P[2 weeks have elapsed since last failure]
= P[zero failures in the 2 weeks since last failure] = P[X(2) = 0]
P[X(2) = 0] = e^{-λ(2)}(2λ)⁰/0! = e⁻² ≈ 0.1353
P[the machine will not be out of order in the next 10 weeks]
= P[at most 5 failures in 10 weeks] = P[X(10) ≤ 5] = Σ_{k=0}^{5} e^{-10} 10^k/k! ≈ 0.067
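The probabilities in problems 2 and 5 can be checked directly from the pmf e^{-λt}(λt)^k/k!:

import math

def poisson_pmf(k, mean):
    return math.exp(-mean) * mean ** k / math.factorial(k)

print(poisson_pmf(4, 6.0))                              # P[N(2)=4], close to 0.1339
print(1 - sum(poisson_pmf(k, 6.0) for k in range(5)))   # P[N(2)>4], close to 0.7149
print(sum(poisson_pmf(k, 10.0) for k in range(6)))      # P[X(10)<=5], close to 0.067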
Stationary Processes
6. The random process {X(t)} has the probability distribution
P[X(t) = n] = (at)^{n-1}/(1 + at)^{n+1},  n = 1, 2, 3, …
            = at/(1 + at),  n = 0
Determine the mean and variance of {X(t)}, and verify whether {X(t)} is stationary or not.
Stationary Processes
Solution:
Consider E[X(t)] = Σ_{n=0}^{∞} n P[X(t) = n]
= 0 · at/(1+at) + 1 · 1/(1+at)² + 2 · at/(1+at)³ + 3 · (at)²/(1+at)⁴ + …
= [1/(1+at)²] [1 + 2(at/(1+at)) + 3(at/(1+at))² + …]
= [1/(1+at)²] [1 - at/(1+at)]^{-2}
= [1/(1+at)²] (1+at)²
E[X(t)] = 1
We now compute Var[X(t)].
Stationary Processes
Var[X(t)] = E[X²(t)] - {E[X(t)]}².
Consider E[X²(t)] = Σ_{n=0}^{∞} n² P[X(t) = n]
= Σ_{n=0}^{∞} [n(n+1) - n] P[X(t) = n]
= Σ_{n=0}^{∞} n(n+1) P[X(t) = n] - Σ_{n=0}^{∞} n P[X(t) = n]
= [1·2 · 1/(1+at)² + 2·3 · at/(1+at)³ + 3·4 · (at)²/(1+at)⁴ + …] - 1
= [2/(1+at)²] [1 + 3(at/(1+at)) + 6(at/(1+at))² + 10(at/(1+at))³ + …] - 1
= [2/(1+at)²] [1 - at/(1+at)]^{-3} - 1
= [2/(1+at)²] (1+at)³ - 1
= 2(1 + at) - 1
E[X²(t)] = 2at + 1
Var[X(t)] = 2at + 1 - 1
Var[X(t)] = 2at.
Since the mean of {X(t)} is constant but the variance 2at depends on t, the process is not stationary (it is evolutionary).
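A numerical check of E[X(t)] = 1 and Var[X(t)] = 2at by direct summation of the series (the value of the product at is an illustrative choice):

at = 1.7
P = [at / (1 + at)]                                        # P[X(t) = 0]
P += [at ** (n - 1) / (1 + at) ** (n + 1) for n in range(1, 2000)]

mean = sum(n * p for n, p in enumerate(P))
second = sum(n * n * p for n, p in enumerate(P))
print(sum(P), mean, second - mean ** 2)   # close to 1, 1 and 2at = 3.4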
7. Show that the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is uniformly distributed in (0, 2π), is wide sense stationary.
Solution: Consider E[X(t)] = E[A cos(ωt + θ)]
= A E[cos(ωt + θ)]
= A ∫_0^{2π} cos(ωt + θ) (1/2π) dθ
= (A/2π) [sin(ωt + θ)]_0^{2π}
= (A/2π) [sin(ωt + 2π) - sin ωt]
E[X(t)] = 0, a constant with respect to time.  ----------(1)
Stationary Processes
We next consider the auto-correlation function R_XX(t, t+τ) = E[X(t) X(t+τ)]
= E[A cos(ωt + θ) · A cos(ω(t+τ) + θ)]
= (A²/2) E[cos(ω(2t+τ) + 2θ) + cos ωτ]
= (A²/2) E[cos(ω(2t+τ) + 2θ)] + (A²/2) E[cos ωτ]
Now
E[cos(ω(2t+τ) + 2θ)] = ∫_0^{2π} cos(ω(2t+τ) + 2θ) (1/2π) dθ
= (1/4π) [sin(ω(2t+τ) + 2θ)]_0^{2π} = 0   ------(2)
Substituting (2) in the expression for R_XX(t, t+τ), we have
Stationary Processes
R_XX(t, t+τ) = (A²/2) cos ωτ  -----(3)
Consider E[X²(t)] = R_XX(t, t) = R_XX(0) = A²/2 < ∞.
Hence, the given random process is WSS.
To verify ergodicity, we compare the time averages with the statistical averages. The time-average mean is
A[X(t)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} A cos(ωt + θ) dt
= lim_{T→∞} (A/2T) [sin(ωt + θ)/ω]_{-T}^{T}
= lim_{T→∞} (A/2Tω) [sin(ωT + θ) - sin(-ωT + θ)] = 0   ----(4)
since the sine terms are bounded while 2Tω → ∞; this agrees with (1).
The time-average auto-correlation is
A[X(t) X(t+τ)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) x(t+τ) dt
Stationary Processes
Consider
A[X(t) X(t+τ)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} A cos(ωt + θ) · A cos(ω(t+τ) + θ) dt
= (A²/2) lim_{T→∞} (1/2T) ∫_{-T}^{T} [cos(ω(2t+τ) + 2θ) + cos ωτ] dt
= (A²/2) lim_{T→∞} (1/2T) ∫_{-T}^{T} cos(ω(2t+τ) + 2θ) dt + (A²/2) cos ωτ lim_{T→∞} (1/2T) ∫_{-T}^{T} dt
Stationary Processes
= (A²/2) lim_{T→∞} (1/2T)(1/2ω) [sin(ω(2T+τ) + 2θ) - sin(ω(-2T+τ) + 2θ)] + (A²/2) cos ωτ lim_{T→∞} (1/2T)[T - (-T)]
= 0 + (A²/2) cos ωτ   (since |sin(·)| ≤ 1)
A_XX(t, t+τ) = (A²/2) cos ωτ  ----(5)
The time averages (4) and (5) coincide with the statistical averages (1) and (3); hence the process is ergodic.
Stationary Processes
10. Two random processes X(t) and Y(t) are defined by X(t) = A cos ω0t + B sin ω0t and Y(t) = B cos ω0t - A sin ω0t. Show that X(t) and Y(t) are jointly wide sense stationary, if A and B are uncorrelated zero mean random variables having the same variance and ω0 is a constant.
Solution: HW
Stationary Processes
11. If X(t) is a WSS process with autocorrelation function R_XX(τ) = A e^{-2|τ|}, where A is any constant, obtain the second order moment of the random variable X(8) - X(5).
Solution : HW
Solution:
A random process {X(t)} is said to be WSS if it satisfies the following conditions:
1. E[X(t)] is a constant with respect to time.
2. The auto-correlation function R_XX(t1, t2) is a function only of the time difference, i.e. R_XX(t1, t2) = R_XX(t1 - t2).
3. E[X²(t)] < ∞.
Stationary Processes
Since φ(ω) is the characteristic function of the random variable Y, we have
φ(ω) = E[e^{iωY}] = E[cos ωY + i sin ωY]
Solution: HW
Stationary Processes
15. Verify whether the random process X(t) = Y cos ωt, where ω is a constant and Y is a uniformly distributed random variable in (0, 1), is a strict sense stationary process.
Solution : HW
Sine Wave Processes
Definition: A random process {X(t)} of the form
X(t) = A cos(ωt + θ) or X(t) = A sin(ωt + θ),
where any non-empty combination of A, ω, θ are random variables, is called a sine wave process.
A is called the amplitude,
ω is called the frequency,
θ is called the phase.
Random Processes
Orthogonal Processes:
Two random processes {X(t)} and {Y(t)}
are said to be orthogonal if their cross
correlation function RXY(t1,t2)=0.
Uncorrelated Processes:
Two random processes {X(t)} and {Y(t)}
are said to be uncorrelated if their cross
covariance function CXY(t1,t2)=0.
i.e. RXY(t1,t2)=E[X(t1)]E[Y(t2)]
Random Processes
Note: Two independent random processes
are uncorrelated but the converse need
not be true. (Can you give a Counter-
example???)
Normal/ Gaussian Process
A random process {X(t)} is a normal/ Gaussian process if, for every n and all t1, t2, .., tn, its nth order joint density is
f_X(x1, x2, .., xn; t1, t2, .., tn) = [1/√((2π)^n |C_X|)] e^{-(1/2)(X - μ_X)^T C_X^{-1} (X - μ_X)}
where
X - μ_X = (X1 - μ_{X1}, X2 - μ_{X2}, .., Xn - μ_{Xn})^T, with Xi = X(ti) and μ_{Xi} = E[X(ti)],
and C_X = [C_XX(ti, tj)] is the n × n auto-covariance matrix.
Remarks:
1. A Gaussian process is completely
determined by its mean and auto-
covariance functions. (WHY???)
Normal/ Gaussian Process
2. If a Gaussian process is WSS, then it is
strictly stationary. (WHY???)
Example: {X(t)} is a Gaussian process with E[X(t)] = 10 and C_XX(t1, t2) = 16 e^{-|t1 - t2|}. Find P[X(10) ≤ 8].
P[X(10) ≤ 8] = P[(X(10) - E[X(10)])/√C_XX(10, 10) ≤ (8 - 10)/√16]
= P[Z ≤ -2/4] = P[Z ≤ -0.5]
= 1 - 0.6915 = 0.3085
Normal/ Gaussian Process
Consider the random variable X(10)-X(8) .
Then E[X(10)-X(8)]=E[X(10)]-E[X(8)]=0.
Var[X(10) - X(8)]
= Var[X(10)] + Var[X(8)] - 2 Cov[X(8), X(10)]
= 16 + 16 - 2 × 16 e^{-|10-8|}
= 32 - 32e⁻²
= 32(0.8647)
≈ 27.67
Normal/ Gaussian Process
P[|X(10) - X(8)| ≤ 4] = P[|Z| ≤ 4/√27.67]
= P[|Z| ≤ 0.76]
= 2 P[0 ≤ Z ≤ 0.76]
= 2[P(Z ≤ 0.76) - 0.5]
= 2[0.7764 - 0.5]
= 0.5528
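The two normal probabilities above can be recomputed with the standard normal CDF Φ(z) = (1 + erf(z/√2))/2 (mean 10, variance 16 and covariance 16e^{-|t1-t2|} are as assumed in the problem):

import math

def Phi(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

print(Phi((8 - 10) / 4.0))              # P[X(10) <= 8], close to 0.3085

var = 32.0 * (1.0 - math.exp(-2.0))     # Var[X(10) - X(8)], close to 27.67
z = 4.0 / math.sqrt(var)
print(2.0 * (Phi(z) - 0.5))             # P[|X(10)-X(8)| <= 4], close to 0.5528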
Random Telegraph Process
Let {X(t)} be a random process satisfying
the following conditions:
1. {X(t)} assumes only one of the 2
possible levels +1 or -1 at any time.
2. X(t) switches back and forth between its
two levels randomly with time.
3. The number of level transitions in any time interval of length τ is a Poisson random variable, i.e. the probability of exactly k transitions, with average rate of transitions λ, is given by
e^{-λτ}(λτ)^k/k!,  k = 0, 1, 2, …
4. Transitions occurring in any time
interval are statistically independent of the
transitions in any other interval.
5. The levels at the start of any interval are
equally probable.
P[Y(t) = 1] = e^{-λt}(e^{λt} + e^{-λt})/2
P[Y(t) = 1] = e^{-λt} cosh λt  -------(3)
P[Y(t) = -1, Y(t, t+τ) = 1]
= e^{-λτ} sinh λτ · e^{-λt} sinh λt  -----(7c)
Σ_j pij = 1,  i = 1, 2, .., n
P = [ p00 p01 p02 …
      p10 p11 p12 …
      .   .   .   …
      pi0 pi1 pi2 … ]
The row sum is 1 for each row of P (stochastic matrix).
All the entries lie in [0, 1].
The stochastic matrix is said to be doubly stochastic iff all the column entries add up to 1 for every column.
A stochastic matrix is said to be regular if some power of it has all its entries positive.
Discrete-Time Markov Chain
The joint PMF of Xn, Xn-1, …, X0 is given by
P[Xn = in, Xn-1 = in-1, .., X0 = i0] = p_{in-1,in} … p_{i0,i1} p_{i0}(0)
For the computer repair example:
[State-transition diagram: states 1 and 2 with arc probabilities 0.6, 0.1, 0.3, 0.8, 0.2 and 1.]
Discrete-Time Markov Chain
The n-step transition probabilities :
Let P(n)=[pij(n)] be the matrix of n-step
transition probabilities, where
pij(n) = P[Xn+k = j / Xk = i] = P[Xn = j / X0 = i] for all n ≥ 0 and k ≥ 0, since the transition probabilities do not depend on time.
Then P(n)=Pn
Discrete-Time Markov Chain
Consider P(2):
pij(2) = P[X2 = j / X0 = i] = Σ_k P[X2 = j, X1 = k / X0 = i]
P[X2 = j, X1 = k / X0 = i] = P[X2 = j, X1 = k, X0 = i] / P[X0 = i]
= P[X2 = j / X1 = k] P[X1 = k / X0 = i] P[X0 = i] / P[X0 = i]
= P[X2 = j / X1 = k] P[X1 = k / X0 = i] = pik(1) pkj(1)
Hence pij(2) = Σ_k pik(1) pkj(1)  ∀ i, j
Discrete-Time Markov Chain
P(2)=P(1)P(1)=P2
P(n)=P(n-1)P
=P(n-2)PP
=P(n-2)P2
=Pn
Hence the n-step transition probability
matrix is the nth power of the one-step
transition probability matrix.
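Illustration with NumPy (the 3-state TPM from problem 19 below is used as the example):

import numpy as np

P = np.array([[0.1, 0.5, 0.4],
              [0.6, 0.2, 0.2],
              [0.3, 0.4, 0.3]])

P2 = np.linalg.matrix_power(P, 2)   # P(2) = P^2
print(P2)                           # matches the matrix computed below
print(P2.sum(axis=1))               # each row still sums to 1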
Discrete-Time Markov Chain
Chapman- Kolmogorov Equations:
pij(n) = Σ_k pik(r) pkj(n - r)
Interpretation:
The RHS term pik(r) pkj(n - r) is the probability of going from i to k in r steps and then going from k to j in the remaining n - r steps, summed over all possible intermediate states k.
Discrete-Time Markov Chain
State Probabilities:
The state probabilities at time n are given by the row vector p(n) = {pj(n)}. Now,
pj(n) = Σ_i P[Xn = j / Xn-1 = i] P[Xn-1 = i] = Σ_i pij pi(n-1)
Therefore p(n) = p(n-1) P.
Similarly, p(n) = p(0) P(n) = p(0) P^n, n = 1, 2, …
Hence the state PMF at time n is obtained by multiplying the initial PMF by P^n.
Discrete-Time Markov Chain
Limiting State/ Steady-state Probabilities:
Let {Xn} be a discrete-time Markov chain with N
states
P[Xn=j] - the probability that the process is in
state j at the end of the first n transitions,
j=1,2,..,N.
Then
P[Xn = j] = pj(n) = Σ_{i=1}^{N} P[X0 = i] pij(n)
Discrete-Time Markov Chain - Steady-state
Probabilities
As n → ∞, the n-step transition probability pij(n) does not depend on i, which means that P[Xn = j] approaches a constant as n → ∞.
The limiting-state probabilities are defined as
lim_{n→∞} P[Xn = j] = πj,  j = 1, 2, …, N
They satisfy πP = π and Σ_j πj = 1.
First passage probabilities: fij = Σ_{n=1}^{∞} fij(n), where fij(n) is the probability that the first passage from state i to state j occurs in exactly n steps.
State j is called
transient (non-recurrent) if there is a
positive probability that the process will
never return to j again if it leaves j
recurrent (persistent) if with probability
1, the process will eventually return to j
after it leaves j
A set of recurrent states forms a single
chain if every member of the set communicates
with all the members of the set.
Discrete-Time Markov Chain - Classification of
States
Recurrent state j is called a periodic state if there exists an integer d, d > 1, such that pjj(n) is zero for all values of n other than d, 2d, 3d, …; d is called the period. If d = 1, j is called aperiodic.
[Diagram: classification of the states of a Markov chain; ergodic states.]
Continuous-Time Markov Chain
pij(t) = P[X(t + s) = j / X(s) = i],  pj(t) = P[X(t) = j]
In other words, pij(t) is the probability that a MC presently in state i will be in state j after an additional time t, and pj(t) is the probability that a MC is in state j at time t.
The transition probabilities satisfy
0 ≤ pij(t) ≤ 1,  Σ_j pij(t) = 1
Further, Σ_j pj(t) = 1
Continuous-Time Markov Chain
Chapman-Kolmogorov equation:
pij(t + s) = Σ_k pik(t) pkj(s)
For the train/car example, with states (train, car), P = [0 1; 1/2 1/2] and p(2) = (1/12, 11/12):
p(3) = p(2) P = (1/12, 11/12) [0 1; 1/2 1/2] = (11/24, 13/24)
Therefore,
P(the man travels by train on the third day) = 11/24
Let π = (π1, π2) be the limiting form of the state probability distribution or stationary state distribution of the Markov chain.
By the property of π, πP = π:
(π1, π2) [0 1; 1/2 1/2] = (π1, π2)
(1/2) π2 = π1  -----(1) and
π1 + (1/2) π2 = π2  ------(2)
Equations (1) and (2) are the same. Along with the equation π1 + π2 = 1, ---(3)
Markov Chains
since π is a probability distribution, we obtain
(1/2) π2 + π2 = 1
(3/2) π2 = 1
π2 = 2/3 and π1 = (1/2)(2/3) = 1/3
Hence π1 = 1/3, π2 = 2/3.
Therefore P[man travels by car in the long run] = 2/3
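The stationary distribution can also be obtained numerically by solving πP = π together with π1 + π2 = 1 as a linear system:

import numpy as np

P = np.array([[0.0, 1.0],
              [0.5, 0.5]])

# Rows: (P^T - I) pi = 0, plus the normalisation row sum(pi) = 1.
A = np.vstack([P.T - np.eye(2), np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)
print(pi)                           # approximately [1/3, 2/3]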
Markov Chains
18. Three boys A,B and C are throwing a
ball to each other. A always throws the ball
to B and B always throws the ball to C, but
C is just as likely to throw the ball to B as
to A. Show that the process is Markovian.
Find the transition probability matrix and
classify the states.
19. A Markov chain {Xn} with three states 1, 2, 3 has the one-step TPM P given below and the initial probability distribution p(0) = (0.7, 0.2, 0.1). Find
(i) P[X2 = 3]  (ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2].
Solution: Consider
Markov Chains
P(2) = [pij(2)] = P² =
[0.1 0.5 0.4]   [0.1 0.5 0.4]   [0.43 0.31 0.26]
[0.6 0.2 0.2] × [0.6 0.2 0.2] = [0.24 0.42 0.34]
[0.3 0.4 0.3]   [0.3 0.4 0.3]   [0.36 0.35 0.29]
P[X2 = 3] = Σ_{i=1}^{3} P[X2 = 3 / X0 = i] P[X0 = i]
= P[X2 = 3 / X0 = 1] P[X0 = 1] + P[X2 = 3 / X0 = 2] P[X0 = 2] + P[X2 = 3 / X0 = 3] P[X0 = 3]
= p13(2) P[X0 = 1] + p23(2) P[X0 = 2] + p33(2) P[X0 = 3]
Markov Chains
P[X2 = 3] = 0.26 × 0.7 + 0.34 × 0.2 + 0.29 × 0.1
= 0.182 + 0.068 + 0.029 = 0.279
(ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2]
= P[X3 = 2 / X2 = 3, X1 = 3, X0 = 2] P[X2 = 3, X1 = 3, X0 = 2]   (conditional probability)
= P[X3 = 2 / X2 = 3] P[X2 = 3 / X1 = 3] P[X1 = 3 / X0 = 2] P[X0 = 2]   (Markov property)
= p32(1) p33(1) p23(1) P[X0 = 2]
= 0.4 × 0.3 × 0.2 × 0.2 = 0.0048
Markov Chains
20. A gambler has Rs.2. He bets Re. 1 at a
time and wins Re. 1 with probability 0.5.
He stops playing if he loses Rs. 2 or wins
Rs. 4. What is the transition probability
matrix of the related Markov chain?
Solution : HW
Markov Chains
21. There are 2 white marbles in urn A and
3 red marbles in urn B. At each step of the
process, a marble is selected from each
urn and the 2 marbles selected are
interchanged. Let the state ai of the
system be the number of red marbles in A
after i changes. What is the probability that
there are 2 red marbles in A after 3 steps?
In the long run, what is the probability that
there are 2 red marbles in urn A?
Solution : HW
Markov Chains
22.Find the nature of the states of the
Markov chain with the TPM,
P = [  0    1    0
      1/2   0   1/2
       0    1    0  ]
Solution : HW
Thank You