
Unit - III

RANDOM PROCESSES

B. Thilaka
Applied Mathematics
Random Processes
A family of random variables {X(t,s) : t ∈ T, s ∈ S}
defined over a given probability space and indexed
by the parameter t, where t varies over the index
set T, is known as a random process/ chance
process/ stochastic process.

X : S × T → R
X(s,t) = x

Notation: X(t)
Random Processes
The values assumed by the random
variables X(t) are called the states and the
set of all possible values is called the state
space of the random process.

Since the random process {X(t)} is a


function of both s and t, we have the
following
Random Processes

Observations:
1. s and t are fixed : X(t) is a real number
2. t is fixed : X(t) is a random variable
3. s is fixed : X(t) is a function of time,
called the sample function or realisation of
the process
4. s and t are varying : X(t) is a random
process
Classification
Three types

Type I
State Space, Parameter Space:
Discrete/ Continuous
Classification
T discrete,   S discrete   : discrete random sequence
T discrete,   S continuous : continuous random sequence
T continuous, S discrete   : discrete random process
T continuous, S continuous : continuous random process
Classification
Predictable/ Deterministic :
Future values can be predicted from past
values.
eg: X(t) = A cos(ωt + θ), where any non-
empty combination of A, ω, θ may be
random variables
Unpredictable/ Non- deterministic:
Future values cannot be predicted from
past values.
eg: Brownian motion
Classification
Stationarity

Probability for a random process: For a


fixed time t1, X(t1) is a random variable that
describes the state of the process at time
t1.
The first order distribution function of a
random process is defined as
First order Distribution and
Density Function

F_X(x1; t1) = P[X(t1) ≤ x1]

The first order density function of a


random process is defined as


f_X(x1; t1) = ∂F_X(x1; t1) / ∂x1
First order Distribution Function
Statistical Average
The statistical average of the random
process {X(t)} is defined as

μ(t) = E[X(t)] = ∫_{-∞}^{∞} x1 f_X(x1; t1) dx1

provided the quantity on the RHS exists.
First order stationary process
The random process {X(t)} is said to be a
first order stationary process / stationary to
order one if the first order density/
distribution function is invariant with respect
to a shift in the time origin
f_X(x1; t1) = f_X(x1; t1 + δ)   ∀ x1, t1, δ
(or)
F_X(x1; t1) = F_X(x1; t1 + δ)   ∀ x1, t1, δ
First order stationary process
Result:
The statistical mean of a first order
stationary random process is a constant
with respect to time.
Proof:
Let the random process {X(t)} be a first
order stationary process .
Then its first order density function
satisfies the property that
First order stationary process

f_X(x1; t1) = f_X(x1; t1 + δ)   ∀ x1, t1, δ
The mean of the random process {X(t)} at
time t1 is defined as


E[X(t1)] = ∫_{-∞}^{∞} x1 f_X(x1; t1) dx1

First order stationary process
Consider

E[X(t2)] = ∫_{-∞}^{∞} x2 f(x2; t2) dx2

Let t2 = t1 + δ. Then

E[X(t2)] = ∫_{-∞}^{∞} x2 f(x2; t1 + δ) dx2

         = ∫_{-∞}^{∞} x2 f(x2; t1) dx2
First order stationary process

E[X(t2)] = E[X(t1)]
Hence E[X(t)] is a constant with respect to
time.
Second order Distribution
function
The second order joint distribution function
of a random process {X(t)} is defined as

F_X(x1, x2; t1, t2) = P[X(t1) ≤ x1, X(t2) ≤ x2]
The second order joint density function of
the random process {X(t)} is defined as
f_X(x1, x2; t1, t2) = ∂²F_X(x1, x2; t1, t2) / (∂x1 ∂x2)
Second order stationary
process
A random process {X(t)} is said to be a
second order stationary process/
stationary to order two if its second order
distribution/ density function is invariant
with respect to a shift in the time origin.
In other words,
F_X(x1, x2; t1, t2) = F_X(x1, x2; t1+δ, t2+δ)   ∀ x1, x2, t1, t2, δ
(and / or)
f_X(x1, x2; t1, t2) = f_X(x1, x2; t1+δ, t2+δ)   ∀ x1, x2, t1, t2, δ
Second order Processes
Auto-correlation function:
The auto-correlation function of a random
process {X(t)} is defined as

R_XX(t1, t2) = E[X(t1)X(t2)]
provided the quantity on the RHS exists.
Second order Processes
Significance of auto-correlation function
1. It provides a measure of similarity
between two observations of the random
process {X(t)} at different points of time t1
and t2.
2. It also defines how a signal is similar to a
time-shifted version of itself
Second order Processes
Auto-covariance function:
The auto-covariance function of a random
process {X(t)} is defined as

C_XX(t1, t2) = E{[X(t1) - E[X(t1)]][X(t2) - E[X(t2)]]}

= E{X(t1)X(t2) - X(t1)E[X(t2)] - E[X(t1)]X(t2) + E[X(t1)]E[X(t2)]}

= E[X(t1)X(t2)] - E[X(t1)]E[X(t2)] - E[X(t1)]E[X(t2)] + E[X(t1)]E[X(t2)]

C_XX(t1, t2) = R_XX(t1, t2) - E[X(t1)]E[X(t2)]
Wide-sense Stationary Process
A random process {X(t)} is said to be WSS/
weakly stationary/ covariance stationary
process if it satisfies the following
conditions:
1. E[X(t)] is a constant with respect to time.
2. RXX(t1,t2) is a function of the length of the
time difference.
i.e. RXX(t1,t2)= RXX(t1-t2)
3. E[X²(t)] < ∞.
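As a quick numerical illustration of these conditions (not part of the original notes), the sketch below simulates many realisations of the random-phase cosine X(t) = A cos(ωt + θ), θ uniform on (0, 2π), which is shown later (problem 7) to be WSS, and estimates the ensemble mean and R_XX(t, t+τ); the parameter values are assumptions made only for the example.

```python
import numpy as np

rng = np.random.default_rng(0)
A, w = 2.0, 1.5                      # assumed amplitude and angular frequency
t = np.arange(0.0, 20.0, 0.05)       # time grid
theta = rng.uniform(0, 2*np.pi, size=(5000, 1))
X = A*np.cos(w*t + theta)            # 5000 sample functions, one per row

mean_t = X.mean(axis=0)              # ensemble mean at each t (should be ~0 for all t)
print("max |E[X(t)]| over t:", np.abs(mean_t).max())

tau = 1.0                            # a fixed lag
k = int(tau/0.05)
R = (X[:, :-k]*X[:, k:]).mean(axis=0)   # estimate of R_XX(t, t+tau) for each t
print("spread of R over t:", R.std(), " theory:", A**2/2*np.cos(w*tau))
```

The near-constant mean and the near-constant lag-τ correlation across t are exactly conditions 1 and 2 above.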
Wide-sense Stationary Process
Remark:
Since C_XX(t1, t2) = R_XX(t1, t2) - E[X(t1)]E[X(t2)],

C_XX(t1, t2) = C_XX(t1 - t2); that is, the auto-covariance
function C_XX(t1, t2) is a function of the length
of the time difference.
Hence, a WSS process is also called a
covariance stationary process.
Wide-sense Stationary Process
Alternately, a random process (second
order) is said to be WSS/ weakly stationary
process if it satisfies

E[X(t)] = μ, a constant
R_XX(t, t+τ) = R_XX(τ)
Remark: A second order stationary process
is a WSS process, but the converse need
not be true.
nth order distribution/ density
function
The nth order joint distribution function of a
random process {X(t)} is defined as

F_X(x1, x2, .., xn; t1, t2, .., tn) = P[X(t1) ≤ x1, X(t2) ≤ x2, .., X(tn) ≤ xn]


The nth order joint density function of a random
process {X(t)} is defined as

f_X(x1, x2, .., xn; t1, t2, .., tn) = ∂ⁿF_X(x1, x2, .., xn; t1, t2, .., tn) / (∂x1 ∂x2 ... ∂xn)
nth order stationary process
A random process {X(t)} is said to be nth order
stationary/ stationary to order n if the nth order
density/ distribution function is invariant with
respect to a shift in the time origin
(i.e.)
f_X(x1, x2, .., xn; t1, t2, .., tn) = f_X(x1, x2, .., xn; t1+δ, t2+δ, .., tn+δ),
∀ x1, x2, .., xn, t1, t2, .., tn, δ

(and / or)
F_X(x1, x2, .., xn; t1, t2, .., tn) = F_X(x1, x2, .., xn; t1+δ, t2+δ, .., tn+δ),
∀ x1, x2, .., xn, t1, t2, .., tn, δ
Strictly stationary Process

A random process is said to be strictly


stationary (SSS)/ stationary in the strict
sense if it is stationary to all orders n.
Aside: For a SSS- CDF/PDF (nth order) is
invariant with respect to a shift in the time
origin
Mean-constant
Auto-correlation, Auto-covariance-
functions of length of time intervals.
Strictly stationary Process

Remark :
If a random process is stationary to order
n, then it is stationary to all orders kn.

WHY ?
Evolutionary Process
Random processes which are not
stationary to any order are called non-
stationary/ evolutionary processes
Further Properties
Cross correlation function:
The cross correlation function of two
random processes {X(t)} and {Y(t)} is
defined as RXY(t1,t2)=E[X(t1)Y(t2)]
Cross covariance function:
The cross covariance function of two
random processes {X(t)} and {Y(t)} is
defined as
CXY(t1,t2)=E{[X(t1)-E(X(t1))][Y(t2)-E(Y(t2))]}
=RXY(t1,t2)-E[X(t1)]E[Y(t2)]
Further Properties
Two random processes {X(t)} and {Y(t)}
are said to be jointly WSS if
(i) both {X(t)} and {Y(t)} are each WSS
(ii) the cross-correlation function
RXY(t1,t2)=E[X(t1)Y(t2)] is exclusively a
function of the length of the time interval
(t2-t1)
Independent Random Process
A random process {X(t)} is said to be an
independent random process if its nth
order joint distribution function satisfies the
property that

F_X(x1, x2, .., xn; t1, t2, .., tn) = F_X(x1; t1) F_X(x2; t2) ... F_X(xn; tn),

∀ x1, x2, .., xn, t1, t2, .., tn

A similar condition holds for joint p.m.f./


p.d.f.
Process with independent
increments

A random process {X(t)} is defined to be a


process with independent increments if, for
all 0 < t1 < t2 < ... < tn < t, the random variables
X(t2) - X(t1), X(t3) - X(t2), .., X(tn) - X(tn-1) are
independent.
Time averages of a random
process
Prelude:
The time average of a quantity f(t) (t-
time) is defined as
A[f(t)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} f(t) dt

Time average of a random process:


The time average of a random process
{X(t)} is defined as
Time averages of a random
process
A[X(t)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) dt

Time auto-correlation function :


The time auto-correlation function of a
random process {X(t)} is defined as

A[X(t1)X(t2)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) x(t+τ) dt,   t1 = t, t2 = t + τ
Ergodic Process
A random process is said to be ergodic if
its time averages are all equal to the
corresponding statistical averages.

Examples ???
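To make the idea concrete (an illustrative sketch, not from the original notes), one can compare the time average of a single long realisation with the ensemble average over many realisations; for the random-phase cosine treated later both equal 0, consistent with mean ergodicity. The parameter values below are assumptions for the example only.

```python
import numpy as np

rng = np.random.default_rng(1)
A, w = 1.0, 2.0
t = np.arange(-500.0, 500.0, 0.01)        # long window approximating T -> infinity

# time average of one sample function (one fixed outcome s, i.e. one theta)
theta0 = rng.uniform(0, 2*np.pi)
time_avg = np.mean(A*np.cos(w*t + theta0))

# ensemble average at a fixed time t1 = 0.7, over many outcomes
theta = rng.uniform(0, 2*np.pi, 100000)
ensemble_avg = np.mean(A*np.cos(w*0.7 + theta))

print(time_avg, ensemble_avg)             # both close to 0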
Random Process
Correlation coefficient of a random process :
The correlation coefficient (normalized
auto-covariance function) of a random
process {X(t)} is defined as the correlation
coefficient between its random variables
X(t1) and X(t2) for arbitrary t1 and t2 .
In other words,
ρ_XX(t1, t2) = C_XX(t1, t2) / √(Var[X(t1)] Var[X(t2)])

Note: Var[X(t)] = CXX(t, t)


Markov Processes
A random process {X(t)} is called a Markov
process if for all t0<t1< t2< <tn<t, the
conditional distribution of X(t) given the
values of X(t0), X(t1), X(t2),.., X(tn) depends
only on X(tn).
P[X(t) ≤ x | X(t0) = x0, X(t1) = x1, ..., X(tn) = xn] = P[X(t) ≤ x | X(tn) = xn]

P[a < X(t) ≤ b | X(t0) = x0, X(t1) = x1, ..., X(tn) = xn] = P[a < X(t) ≤ b | X(tn) = xn]


Markov Processes
In other words, a random process {X(t)} is
called a Markov process if the future
values of the process depend only on the
present and are independent of the past.

Examples: Binomial process, Poisson


process, Random telegraph process
Markov Processes
Note:
If the state space of a Markov process is
discrete, then the Markov process is called
a Markov chain.
Markov Processes
State space discrete,   time discrete   : Discrete-Time Markov Chain
State space continuous, time discrete   : Discrete-Time Markov Process
State space discrete,   time continuous : Continuous-Time Markov Chain
State space continuous, time continuous : Continuous-Time Markov Process
Markov Processes
If in a Markov chain, the conditional distribution is
invariant with respect to a shift in the time origin, the
Markov chain is said to be time homogeneous.

In a homogeneous Markov chain, the past history of


the process is completely summarized in the current
state.

Hence, the distribution of time that the process


spends in the current state must be memoryless.
Counting Process
A random process {X(t)} is called a
counting process if X(t) represents the
total number of events that have
occurred in the interval [0,t).
1. X(t) ≥ 0; X(0) = 0 - the count begins at time t = 0
2. X(t) is integer valued
3. If s ≤ t, then X(s) ≤ X(t)
4. X(t) - X(s) : number of events in (s, t]
Counting Process
Alternately, the random process {N(t)} is
called a counting process if it assumes
only integer values and it is an increasing
function of time.
Types of Random Processes
Bernoulli process : {Xn : n ≥ 1}, where the Xn are i.i.d.
Bernoulli variates with parameter p.
Consider a sequence of independent
and identical Bernoulli trials.
Let the random variable Yi denote the
outcome of the ith trial so that the event
{Yi=1} indicates success with probability p
on the ith trial for all i and the event {Yi=0}
indicates failure with probability (1-p)=q on
the ith trial for all i.
Bernoulli Process
Hence the Yis may be considered as
independently and identically distributed
random variables.
The random process {Yn} is called a
Bernoulli process with P[Yn=1]=p and
P[Yn=0]=1-p, for all n.

Discussion: Classify Bernoulli process ( 3


types)
Binomial Process
Binomial process :
Consider a Bernoulli process {Yi} where
the Yis are independently and identically
distributed Bernoulli random variables with
parameter p. Form another stochastic
process {Sn} with Sn = Y1 + Y2 + ... + Yn.
The random variable Sn follows a
Binomial distribution with parameters n
and p.
Binomial Process
The random process {Sn:n1} is called a
Binomial process.

Discussion: Preliminary classification of


Binomial Process
Binomial Process
First order pmf:
The first order p.m.f. of the random
process {Sn} is given by

P[Sn = k] = nCk p^k (1-p)^(n-k),   k = 0, 1, ..., n
E[Sn] = np
Var[Sn] = np(1-p) = npq
Can you classify Binomial process even
further??
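A short simulation (illustrative only; parameter values assumed) of the Binomial counting process as a cumulative sum of i.i.d. Bernoulli trials, checking E[Sn] = np and Var[Sn] = npq:

```python
import numpy as np

rng = np.random.default_rng(2)
p, n, reps = 0.3, 50, 20000
Y = rng.random((reps, n)) < p        # i.i.d. Bernoulli(p) trials, one row per realisation
S = Y.cumsum(axis=1)                 # S_k = Y_1 + ... + Y_k : the Binomial process

print(S[:, -1].mean(), n*p)          # sample mean of S_n vs np
print(S[:, -1].var(),  n*p*(1-p))    # sample variance of S_n vs npq
```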
Binomial Process
Consider Sn+1 = Y1 + Y2 + ... + Yn + Yn+1
             = Sn + Yn+1
Hence
P[Sn+1 = k   | Sn = k] = P[Yn+1 = 0] = 1 - p
P[Sn+1 = k+1 | Sn = k] = P[Yn+1 = 1] = p

Can you now classify Binomial process


even further?
Binomial Process
The Binomial process is also called the
Binomial counting process.
Probability generating function:

G_Sn(z) = (q + pz)^n

Second order joint p.m.f.

P[Sm = k, Sn = l] = mCk · (n-m)C(l-k) · p^l (1-p)^(n-l),
                    k = 0, 1, .., m;  l = k, k+1, .., n;  m ≤ n, k ≤ l
Binomial Process
Observation:
The total number of trials T from the
beginning of the process until and
including the first success is a geometric
random variable.
The number of trials after (i-1)th success
upto and including the ith success will have
the same distribution as T

Can you generalise?


Binomial Process
If Tn is the number of trials up to and
including the nth success, then Tn is the n-
fold convolution of T with itself and follows
a negative binomial distribution with
E[Tn] = n/p and Var[Tn] = n(1-p)/p².
Binomial Process
If in the Binomial process {Sn}, n is large
and p is small such that np is finite, then
the Binomial process approaches a
Poisson process with parameter λ = np.
Poisson Process
Let E be any random event and {N(t)}
denote the number of occurrences of the
event E in an interval of length t.
Let pn(t)= P[N(t)=n].
The counting process {N(t)} is called a
Poisson process if it satisfies the following
postulates:
Poisson Process
1. Independence : {N(t)} is independent of the
number of occurrences of E in an interval prior to
(0,t) i.e. future changes in N{t} are independent
of the past changes.
2. Homogeneity in time: pn(t) depends only on
the length t of the time interval and is
independent of where the interval is situated i.e.
pn(t) gives the probability of the number of
occurrences of E in the interval (t0, t0 +t) for all t0
3. Regularity: In an interval of infinitesimal length
h, the probability of exactly one event occurring
is λh + o(h), the probability of zero events
occurring is 1 - λh + o(h), and the probability of more
than one event occurring is o(h).
Poisson Process
Relax the postulates:
3. Regularity- Compound Poisson Process
(multiple occurrences at any instant)
2. Homogeneity in time: λ is a function of
time (λ(t)) - non-homogeneous Poisson process
1. Independence: future depends on
present- Markov process
Poisson Process
Result : The Poisson process defined above
follows a Poisson distribution with mean λt,
i.e.

p_n(t) = P[N(t) = n] = e^{-λt} (λt)^n / n!,   n = 0, 1, 2, .....

Proof:
Let {N(t)} be a Poisson process with
parameter λ. We now consider
Poisson Processes- pmf
p_n(t + Δt) = P[N(t + Δt) = n]

            = Σ_{k=0}^{n} P[N(t + Δt) = n | N(t) = k] P[N(t) = k]

(theorem on total probability)

            = Σ_{k=0}^{n} P[N(t + Δt) - N(t) = n - k] P[N(t) = k]

            = Σ_{k=0}^{n} P[N(t, t + Δt) = n - k] P[N(t) = k]

(homogeneity)
Poisson Processes- pmf
            = Σ_{k=0}^{n} P[N(Δt) = n - k] P[N(t) = k]

            = P[N(t) = n] P[N(Δt) = 0] + P[N(t) = n-1] P[N(Δt) = 1]
              + Σ_{k=0}^{n-2} P[N(t) = k] P[N(Δt) = n - k]

p_n(t + Δt) = p_n(t) p_0(Δt) + p_{n-1}(t) p_1(Δt) + Σ_{k=0}^{n-2} p_k(t) p_{n-k}(Δt)

Assuming Δt to be of infinitesimal length,
we have
Poisson Processes- pmf
p_n(t + Δt) = p_n(t)[1 - λΔt + o(Δt)] + p_{n-1}(t)[λΔt + o(Δt)] + Σ_{k=0}^{n-2} p_k(t) o(Δt)

(Regularity)

p_n(t + Δt) = p_n(t) - λΔt p_n(t) + λΔt p_{n-1}(t) + o(Δt)

Dividing throughout by Δt,

[p_n(t + Δt) - p_n(t)] / Δt = -λ p_n(t) + λ p_{n-1}(t) + o(Δt)/Δt
Poisson Processes- pmf
Taking limits as Δt → 0 on both sides of the
above equation,

dp_n(t)/dt = -λ p_n(t) + λ p_{n-1}(t),   n ≥ 1    --------(1)

At n = 0, we have

p_0(t + Δt) = P[N(t + Δt) = 0]

            = P[N(t + Δt) = 0 | N(t) = 0] P[N(t) = 0]
Poisson Processes- pmf

            = P[N(t + Δt) - N(t) = 0] P[N(t) = 0]
            = P[N(t, t + Δt) = 0] P[N(t) = 0]
            = P[N(Δt) = 0] P[N(t) = 0]
(homogeneity)
            = [1 - λΔt + o(Δt)] p_0(t)   (regularity)

[p_0(t + Δt) - p_0(t)] / Δt = -λ p_0(t) + o(Δt)/Δt

Taking limits as Δt → 0 on both sides of the
above equation,
Poisson Processes- pmf
dp_0(t)/dt = -λ p_0(t)    -----(2)

We now solve the above system of
differential difference equations (1) and (2)
subject to the initial conditions

p_0(0) = 1,  p_n(0) = 0 for all n ≥ 1    -------(3)

Consider equation (2), namely,
Poisson Processes- pmf
dp_0(t)/dt = -λ p_0(t),  subject to p_0(0) = 1

dp_0(t)/p_0(t) = -λ dt

log p_0(t) = -λt + c

p_0(t) = e^{-λt + c}

p_0(t) = K e^{-λt}

At t = 0,  p_0(0) = 1 = K

p_0(t) = e^{-λt}    -----(4a)
Poisson Processes- pmf
Substituting n = 1 in equation (1), we have

dp_1(t)/dt = -λ p_1(t) + λ p_0(t),  subject to p_1(0) = 0

dp_1(t)/dt + λ p_1(t) = λ e^{-λt},   p_1(0) = 0

On solving the above linear equation (integrating factor e^{λt}), we have

p_1(t) = e^{-λt} [ ∫ λ e^{λt} e^{-λt} dt + c ]
Poisson Processes- pmf
p_1(t) = e^{-λt}(λt) + c e^{-λt}

At t = 0,   p_1(0) = 0 = c

c = 0
p_1(t) = e^{-λt}(λt)    -----(4b)

Substituting n = 2 in equation (1), we have

dp_2(t)/dt = -λ p_2(t) + λ p_1(t)
Poisson Processes- pmf
dp_2(t)/dt + λ p_2(t) = λ e^{-λt}(λt),   p_2(0) = 0

p_2(t) = e^{-λt} [ ∫ λ²t e^{λt} e^{-λt} dt + c ]

p_2(t) = e^{-λt} [ λ²t²/2 + c ]
Poisson Processes- pmf
p_2(t) = e^{-λt} λ²t²/2 + c e^{-λt}

At t = 0,
p_2(0) = 0 = c
c = 0

p_2(t) = e^{-λt}(λt)²/2!    -----(4c)

Proceeding in this manner, we have
Poisson Processes- pmf
p_n(t) = P[N(t) = n] = e^{-λt}(λt)^n / n!,   n = 0, 1, 2, .....

Hence, the number of events {N(t)} in an
interval of length t follows a Poisson
distribution with mean λt.

Can you classify Poisson process??
Poisson Processes
Poisson process is an evolutionary
process.
E[N(t)] = λt and Var[N(t)] = λt. Further,

lim_{t→∞} E[N(t)/t] = λ,    lim_{t→∞} Var[N(t)/t] = 0

Hence λ is called the arrival rate of the
process.
Poisson Process
Characterization: If {N(t)} is a Poisson
process with mean λt, then the
occurrence/ inter-arrival time follows an
exponential distribution with mean 1/λ.
Proof: Let {N(t)} be a Poisson process with
parameter λ. Then

p_n(t) = e^{-λt}(λt)^n / n!,   n = 0, 1, 2, .....
Poisson Process
If W denotes the time between two successive arrivals/
occurrences of the event, the CDF of W is

F_W(w) = P[W ≤ w] = 1 - P[W > w]
       = 1 - P[N(w) = 0]

F_W(w) = 1 - e^{-λw}

The above is the CDF of an exponential variate with
mean 1/λ.
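A minimal sketch (parameter values assumed for illustration) that uses this characterisation the other way round: exponential(1/λ) gaps are accumulated into arrival times, and the count in [0, t] is compared with the Poisson mean and variance λt.

```python
import numpy as np

rng = np.random.default_rng(3)
lam, t, reps = 2.0, 5.0, 20000
counts = []
for _ in range(reps):
    gaps = rng.exponential(1/lam, size=int(4*lam*t))   # more gaps than ever needed in [0, t]
    arrivals = gaps.cumsum()
    counts.append(np.searchsorted(arrivals, t))        # N(t): number of arrivals up to time t

counts = np.array(counts)
print(counts.mean(), lam*t)     # E[N(t)] = lambda*t
print(counts.var(),  lam*t)     # Var[N(t)] = lambda*t
```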
Poisson Process
If the inter-arrival times are i.i.d. (not
necessarily exponential), then we have a
renewal process.
Poisson Process- Properties
Superposition
Decomposition
Markov Process
Difference of two Poisson processes is not
Poisson.
Poisson Process- Properties
Result:
The superposition of n independent
Poisson processes with means λ1t, λ2t, ...,
λnt, respectively is a Poisson process with
mean λ1t + λ2t + ... + λnt.
Proof:
Consider n independent Poisson
processes N1(t), N2(t), ..., Nn(t), with
respective means λ1t, λ2t, ..., λnt.
Poisson Process- Properties
The moment generating function (m.g.f.) of
each Nk(t) is given by

M_{Nk(t)}(θ) = E[e^{θ Nk(t)}] = e^{-λk t (1 - e^θ)}

By the property of moment generating
functions, the m.g.f. of the sum N(t) = Σ_{k=1}^{n} Nk(t)
is given by
Poisson Process- Properties
M_{N(t)}(θ) = Π_{k=1}^{n} M_{Nk(t)}(θ)

           = Π_{k=1}^{n} e^{-λk t (1 - e^θ)}

           = e^{-(Σ_{k=1}^{n} λk) t (1 - e^θ)}

which is the moment generating function
of a Poisson distribution with mean λ1t +
λ2t + ... + λnt.
Poisson Process- Properties
Hence by the uniqueness property, the sum of
n independent Poisson processes with
means λ1t, λ2t, ..., λnt, respectively is a
Poisson process with mean λt = λ1t + λ2t +
... + λnt.
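The superposition property can also be checked numerically (an illustrative sketch with assumed rates): merging two independent Poisson streams gives counts whose mean and variance both match (λ1 + λ2)t.

```python
import numpy as np

rng = np.random.default_rng(4)
lam1, lam2, t, reps = 1.0, 2.5, 3.0, 50000
n1 = rng.poisson(lam1*t, reps)       # counts of the first process in [0, t]
n2 = rng.poisson(lam2*t, reps)       # counts of the second, independent process
n = n1 + n2                          # superposed count

print(n.mean(), (lam1 + lam2)*t)
print(n.var(),  (lam1 + lam2)*t)     # mean = variance, as for a Poisson((lam1+lam2)t) variate
```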
Poisson Process- Properties
Decomposition of Poisson process:
A Poisson process N(t) with mean arrival
rate λ can be decomposed into n mutually
independent Poisson processes with
arrival rates λp1, λp2, .., λpn such that
p1 + p2 + ... + pn = 1
Proof: HW
Poisson Process
Note:
If N(t) is a Poisson process with mean
arrival rate λ, then the time between
k successive arrivals/ occurrences follows
a

(k-1) Erlang distribution .

WHY??
Poisson Process
Second order joint p.m.f. :
Given a Poisson process with mean arrival
rate λ, the second order joint p.m.f. is
obtained as follows:

P[N(t1) = n1, N(t2) = n2] = P[N(t2) = n2 | N(t1) = n1] P[N(t1) = n1],   t1 < t2, n1 ≤ n2

= P[N(t1) = n1] P[N(t2) - N(t1) = n2 - n1],   t1 < t2, n1 ≤ n2

= P[N(t1) = n1] P[N(t1, t2) = n2 - n1],   t1 < t2, n1 ≤ n2
Poisson Process
= P[N(t1) = n1] P[N(t2 - t1) = n2 - n1],   t1 < t2, n1 ≤ n2

(homogeneity)

= [e^{-λt1}(λt1)^{n1} / n1!] · [e^{-λ(t2 - t1)}(λ(t2 - t1))^{n2 - n1} / (n2 - n1)!],   t1 < t2, n1 ≤ n2
Poisson Process
Hence

P[N(t1) = n1, N(t2) = n2] = e^{-λt2} λ^{n2} t1^{n1} (t2 - t1)^{n2 - n1} / [n1!(n2 - n1)!],   t1 < t2, n1 ≤ n2

                          = 0, elsewhere
Poisson Process
Similarly the third order joint p.m.f. of the
Poisson process with mean arrival rate λ is
given by
Poisson Process
P[N(t1) = n1, N(t2) = n2, N(t3) = n3]
  = e^{-λt3} λ^{n3} t1^{n1} (t2 - t1)^{n2 - n1} (t3 - t2)^{n3 - n2} / [n1!(n2 - n1)!(n3 - n2)!],
    t1 < t2 < t3, n1 ≤ n2 ≤ n3
  = 0, elsewhere
Poisson Process
Auto-correlation function :
If N(t) is a Poisson process with mean
arrival rate λ, then
E[N(t)] = λt
Var[N(t)] = λt
E[N²(t)] = Var[N(t)] + {E[N(t)]}² = λt + λ²t²
Poisson Process
The auto-correlation function of N(t) is now
obtained as follows:
R_NN(t1, t2) = E[N(t1) N(t2)]   (by definition)

= E[N(t1){N(t2) - N(t1) + N(t1)}],   t1 < t2

= E[N(t1){N(t2) - N(t1)}] + E[N²(t1)],   t1 < t2

= E[N(t1) N(t1, t2)] + E[N²(t1)],   t1 < t2
Poisson Process

= E[N(t1) N(t2 - t1)] + E[N²(t1)],   t1 < t2

= E[N(t1)] E[N(t2 - t1)] + E[N²(t1)],   t1 < t2
  (independence of increments)

R_NN(t1, t2) = λt1[λ(t2 - t1)] + λt1 + λ²t1²,   t1 < t2

             = λ²t1t2 + λt1,   t1 < t2

R_NN(t1, t2) = λ²t1t2 + λ min(t1, t2)
Poisson Process
Auto-covariance function of a Poisson
process :
The auto-covariance function of a Poisson
process N(t) with mean arrival rate λ is
C_NN(t1, t2) = R_NN(t1, t2) - E[N(t1)]E[N(t2)]
(by definition)
             = λ²t1t2 + λ min(t1, t2) - λ²t1t2

C_NN(t1, t2) = λ min(t1, t2)
Poisson Process
The correlation coefficient of a Poisson
process N(t) with mean arrival rate λ is
given by

ρ_NN(t1, t2) = C_NN(t1, t2) / √(Var[N(t1)] Var[N(t2)])

             = λ min(t1, t2) / √(λt1 · λt2)

             = min(t1, t2) / √(t1 t2)
Poisson Process

Hence   ρ_NN(t1, t2) = √(t1 / t2),   t1 < t2
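A quick simulation check of C_NN(t1, t2) = λ min(t1, t2), built from the independent-increments property (parameter values assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(5)
lam, t1, t2, reps = 1.5, 2.0, 5.0, 200000
N_t1 = rng.poisson(lam*t1, reps)                 # N(t1)
N_t2 = N_t1 + rng.poisson(lam*(t2 - t1), reps)   # N(t2) = N(t1) + independent increment

cov = np.cov(N_t1, N_t2)[0, 1]
print(cov, lam*min(t1, t2))                      # estimated vs theoretical auto-covariance
```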
Random Processes
Process with stationary increments:
A random process {X(t)} is said to be a
process with stationary increments if the
distribution of the increments X(t+h)-X(t)
depends only on the length h of the
interval and not on end points.
Random Processes
Wiener process/ Wiener-Einstein Process/
Brownian Motion Process:
A stochastic process {X(t)} is said to be
a Wiener process with drift μ and variance
σ², if

(i) X(t) has independent increments

(ii) every increment X(t) - X(s) is normally
distributed with mean μ(t-s) and variance
σ²(t-s)
Poisson Process
1. For a Poisson process with parameter λ
and for s < t, show that

P[N(s) = k | N(t) = n] = nCk (s/t)^k (1 - s/t)^{n-k},   k = 0, 1, 2, ..., n

Solution : Consider

P[N(s) = k | N(t) = n] = P[N(s) = k, N(t) = n] / P[N(t) = n]   (defn)
Poisson Process
= P[N(s) = k, N(t) - N(s) = n - k] / P[N(t) = n]

= P[N(s) = k] P[N(t - s) = n - k] / P[N(t) = n]   (independent increments, homogeneity)

= [e^{-λs}(λs)^k / k!] · [e^{-λ(t-s)}(λ(t-s))^{n-k} / (n-k)!] / [e^{-λt}(λt)^n / n!]
Poisson Process
= [n! / (k!(n-k)!)] · [e^{-λs} λ^k s^k · e^{-λt} e^{λs} λ^{n-k} (t-s)^{n-k}] / [e^{-λt} λ^n t^n]

= nCk · s^k (t-s)^{n-k} / t^n

= nCk · s^k t^{n-k} (1 - s/t)^{n-k} / t^n
Poisson Process
= nCk (s/t)^k (1 - s/t)^{n-k}

P[N(s) = k | N(t) = n] = nCk (s/t)^k (1 - s/t)^{n-k},   k = 0, 1, 2, ...., n

Hence the proof.
Poisson Process
2. Suppose that customers arrive at a
bank according to a Poisson process with
a mean rate of 3 per minute. Find the
probability that during a time interval of 2
minutes (a) exactly 4 customers arrive
(b) more than 4 customers arrive.
Solution: Let N(t) be a Poisson process
with mean arrival rate .
Poisson Process
Given that λ = 3.
Hence   P[N(t) = k] = e^{-λt}(λt)^k / k!,   k = 0, 1, 2, ..,

(a) P[N(2) = 4] = e^{-3(2)}(3·2)^4 / 4!

                = e^{-6} 6^4 / 4! ≈ 0.133
Poisson Process
(b) P[N(2) > 4] = 1 - P[N(2) ≤ 4]

= 1 - {P[N(2) = 0] + P[N(2) = 1] + P[N(2) = 2] + P[N(2) = 3] + P[N(2) = 4]}

= 1 - e^{-6}[ 6^0/0! + 6^1/1! + 6^2/2! + 6^3/3! + 6^4/4! ]

= 0.715
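The same two probabilities can be obtained directly from the Poisson(λt) distribution, e.g. with scipy (a cross-check, not part of the original solution):

```python
from scipy.stats import poisson

lam, t = 3, 2                      # rate 3 per minute, interval of 2 minutes
mu = lam*t                         # N(2) ~ Poisson(6)

print(poisson.pmf(4, mu))          # (a) exactly 4 customers  ~ 0.134
print(1 - poisson.cdf(4, mu))      # (b) more than 4 customers ~ 0.715
```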
Poisson Process
3. If a customer arrives at a counter
according to a Poisson process with a
mean rate of 2 per minute, find the
probability that (i) 5 customers arrive in a
10 minute period (ii) the interval between 2
successive arrivals is (a) more than 1
minute (b) between 1 and 2 minutes (c) 4
minutes or less (iii) the first two customers
arrive within 3 minutes (iv) the average
number of customers arriving in 1 hour.
Poisson Process
Solution:
Let N(t) denote the number of customers
who arrive at the counter in an interval of
length t. We are given that N(t) follows a
Poisson process with mean arrival rate
λ = 2 per minute.
Therefore we have,

P[N(t) = n] = e^{-λt}(λt)^n / n!,   n = 0, 1, 2, .....
Poisson Process
(i) P[N(10) = 5] = e^{-2(10)}(2×10)^5 / 5!

                 = e^{-20}(20)^5 / 5!

(ii) Since N(t) is a Poisson process with
mean arrival rate λ = 2, the inter-arrival time
T between 2 successive arrivals follows an
exponential distribution with p.d.f.
Poisson Process
f(t) = 2e^{-2t},   t > 0

(a) P(T > 1) = ∫_{1}^{∞} 2e^{-2t} dt

             = [-e^{-2t}]_{1}^{∞}

             = 0 + e^{-2}
             = 0.1353
Poisson Process
(b) P(1 < T < 2) = ∫_{1}^{2} 2e^{-2t} dt

                 = [-e^{-2t}]_{1}^{2}

                 = e^{-2} - e^{-4}
                 = 0.1353 - 0.0183
                 = 0.1170
Poisson Process
(c) P(T ≤ 4) = 1 - e^{-2(4)} = 1 - e^{-8} ≈ 0.9997

(iii) Since N(t) is a Poisson process with


mean arrival rate λ, we know that the inter-
arrival time follows an exponential
distribution with mean 1/λ.
If Ti denotes the inter arrival time
between the (i-1)th customer and the ith
customer, then the time taken for the first
Poisson Process
2 customers to arrive is given by T1 + T2.
Since T1 and T2 are independently and
identically distributed exponential variates
with parameter λ, T1 + T2 follows a second
order Erlang distribution with p.d.f.

f(t) = λ² t e^{-λt} / Γ(2) = 4t e^{-2t},   t > 0
Poisson Process
P[T1 + T2 ≤ 3] = ∫_{0}^{3} 4t e^{-2t} dt

               = 4{ [-t e^{-2t}/2]_{0}^{3} + (1/2)∫_{0}^{3} e^{-2t} dt }

               = 4{ -3e^{-6}/2 + [-e^{-2t}/4]_{0}^{3} }

               = -6e^{-6} - e^{-6} + 1

               = 1 - 7e^{-6}
               = 0.9826
Poisson Process
(iv) Since E[N(t)] = λt,
the average number of customers arriving
in one hour is E[N(60)]= 2x60 = 120.
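For reference, parts (i)-(iii) can be cross-checked numerically (an illustrative cross-check using scipy's Poisson, exponential and Erlang/gamma distributions):

```python
from scipy.stats import poisson, expon, gamma

lam = 2                                   # arrivals per minute
print(poisson.pmf(5, lam*10))             # (i)  P[N(10) = 5]
print(expon.sf(1, scale=1/lam))           # (ii)(a) P[T > 1]        ~ 0.1353
print(expon.cdf(2, scale=1/lam) - expon.cdf(1, scale=1/lam))   # (ii)(b) ~ 0.1170
print(expon.cdf(4, scale=1/lam))          # (ii)(c) P[T <= 4]
print(gamma.cdf(3, a=2, scale=1/lam))     # (iii) P[T1 + T2 <= 3], 2nd order Erlang ~ 0.9826
```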

4. A radioactive source emits particles at


the rate of 5 per minute in accordance with
a Poisson process. Each particle has a
probability of 0.6 of being recorded. Find
the probability that 10 particles are
recorded in a 4 minute period.
Poisson Process
Solution:
Given that the emission of particles follows
a Poisson process with arrival rate λ = 5 per
minute. From the decomposition property
of the Poisson process, the number of recorded
particles N1(t) follows a Poisson process
with mean arrival rate λ1 = λp = 5×0.6 per minute,
i.e. λ1 = 3 per minute.
Poisson Process
P[N1(t) = n] = e^{-λ1 t}(λ1 t)^n / n!,   n = 0, 1, 2, .....

Hence

P[N1(4) = 10] = e^{-3(4)}(3×4)^{10} / 10! = e^{-12}(12)^{10} / 10!
Poisson Process
5. A machine goes out of order whenever
a component fails. The failure of this part
follows a Poisson process with a mean
rate of 1 per week. Find the probability that
2 weeks have elapsed since last failure. If
there are 5 spare parts of this component
in an inventory and that the next supply is
not due in 10 weeks, find the probability
that the machine will not be out of order in
the next 10 weeks.
Poisson Process
Solution:
Let X(t) denote the number of failures of
the component in t units of time.
Then X(t) follows a Poisson process with
mean failure rate = mean number of
failures in a week = λ = 1.
P[2 weeks have elapsed since last failure]
= P[ Zero failures in the 2 weeks since last
failure] =P[X(2)=0]
Poisson Process
P[X(2) = 0] = e^{-2λ}(2λ)^0 / 0!

            = e^{-2(1)} = e^{-2} = 0.135


There are only 5 spare parts and the
machine should not go out of order in the
next 10 weeks.
Hence
P[ the machine will not be out of order in
the next 10 weeks] = P[X(10) ≤ 5]
Poisson Process
P[X(10) ≤ 5] = e^{-10}[ 10^0/0! + 10^1/1! + 10^2/2! + 10^3/3! + 10^4/4! + 10^5/5! ]

             = e^{-10}[ 1 + 10 + 100/2 + 1000/6 + 10000/24 + 100000/120 ]

P[X(10) ≤ 5] ≈ 0.067
Stationary Processes
6. Determine the mean and variance of the
random process {X(t)} given by

P[X(t) = n] = (at)^{n-1} / (1 + at)^{n+1},   n = 1, 2, 3, ..
            = at / (1 + at),                n = 0

Verify whether {X(t)} is stationary or not.
Stationary Processes
Solution:
Consider the random process {X(t)}
defined by

P[X(t) = n] = (at)^{n-1} / (1 + at)^{n+1},   n = 1, 2, 3, ..
            = at / (1 + at),                n = 0

Since the first order p.m.f. of X(t) is a
function of t, the random process {X(t)}
Stationary Processes
is not a stationary process. It is an
evolutionary process.
Now, the mean of the process is given by

E[X(t)] = Σ_{n=0}^{∞} n P[X(t) = n]   (by definition)

= 0·(at/(1+at)) + 1·(1/(1+at)²) + 2·(at/(1+at)³) + 3·((at)²/(1+at)⁴) + ...

= [1/(1+at)²][ 1 + 2(at/(1+at)) + 3(at/(1+at))² + .... ]
Stationary Processes
= [1/(1+at)²] [1 - at/(1+at)]^{-2}

= [1/(1+at)²] [(1 + at - at)/(1+at)]^{-2}

= [1/(1+at)²] (1+at)²

E[X(t)] = 1
We now compute Var[X(t)].
Stationary Processes
Var[X(t)] = E[X²(t)] - {E[X(t)]}².
Consider E[X²(t)] = Σ_{n=0}^{∞} n² P[X(t) = n]

= Σ_{n=0}^{∞} [n(n+1) - n] P[X(t) = n]

= Σ_{n=0}^{∞} n(n+1) P[X(t) = n] - Σ_{n=0}^{∞} n P[X(t) = n]

= [ 0 + 1·2·(1/(1+at)²) + 2·3·(at/(1+at)³) + 3·4·((at)²/(1+at)⁴) + .... ] - 1

= [2/(1+at)²][ 1 + 3(at/(1+at)) + 6(at/(1+at))² + 10(at/(1+at))³ + .... ] - 1
Stationary Processes
= [2/(1+at)²] [1 - at/(1+at)]^{-3} - 1

= [2/(1+at)²] (1+at)³ - 1

E[X²(t)] = 2(1+at) - 1
         = 2at + 1
Stationary Processes
Var[X(t)]=2at+1-1
Var[X(t)]=2at.

7. Show that the random process {X(t)}
defined by X(t) = A cos(ωt + θ), where A and
ω are constants and θ is a uniform random
variable over (0, 2π), is wide sense
stationary. Further, determine whether
{X(t)} is mean ergodic and correlation ergodic.
Stationary Processes
Solution :
A random process {X(t)} is said to be WSS
if it satisfies the following conditions:
1. E[X(t)] is a constant with respect to
time.
2. The auto-correlation function RXX(t1,t2) is
a function of the length of the time
difference. i.e. RXX(t1,t2)= RXX(t1-t2)
3. E[X²(t)] < ∞.
Stationary Processes
Consider the random process given by
X(t) = A cos(ωt + θ) where A and ω are
constants and θ is uniformly distributed in
(0, 2π). Hence, the p.d.f. of θ is given by

f(θ) = 1/2π,   0 < θ < 2π
     = 0,      elsewhere

Consider E[X(t)] = E[A cos(ωt + θ)]
Stationary Processes
= A E[cos(ωt + θ)]

= A ∫_{0}^{2π} cos(ωt + θ) (1/2π) dθ

= (A/2π) [sin(ωt + θ)]_{0}^{2π}

= (A/2π) [sin(ωt + 2π) - sin(ωt)]

E[X(t)] = 0, a constant with respect to time.
----------(1)
Stationary Processes
We next consider the auto-correlation
function R_XX(t, t+τ) = E[X(t)X(t+τ)]

= E[A cos(ωt + θ) · A cos(ω{t+τ} + θ)]

= (A²/2) E[cos(ωt + θ + ω{t+τ} + θ) + cos(ωt + θ - ω{t+τ} - θ)]

= (A²/2) E[cos(2ωt + ωτ + 2θ) + cos(ωτ)]

= (A²/2) E[cos(2ωt + ωτ + 2θ)] + (A²/2) E[cos(ωτ)]
Stationary Processes
= (A²/2) E[cos(2ωt + ωτ + 2θ)] + (A²/2) cos(ωτ)

Consider E[cos(2ωt + ωτ + 2θ)] = ∫_{0}^{2π} cos(2ωt + ωτ + 2θ) (1/2π) dθ

= (1/2π) [sin(2ωt + ωτ + 2θ)/2]_{0}^{2π}

= (1/4π) [sin(2ωt + ωτ + 4π) - sin(2ωt + ωτ)] = 0
------(2)
Substituting the above expression in
R_XX(t, t+τ), we have
Stationary Processes
R_XX(t, t+τ) = (A²/2) cos(ωτ)    -----(3)

Consider E[X²(t)] = R_XX(t, t) = R_XX(0) = A²/2 < ∞.
Hence, the given random process is WSS.

Now, the random process {X(t)} is said to
be mean ergodic if E[X(t)] = A[X(t)],
where A[X(t)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) dt
Stationary Processes
Consider
A[X(t)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} A cos(ωt + θ) dt

        = A lim_{T→∞} (1/2T) ∫_{-T}^{T} cos(ωt + θ) dt

        = A lim_{T→∞} (1/2T) [sin(ωt + θ)/ω]_{-T}^{T}

        = A lim_{T→∞} (1/2ωT) [sin(ωT + θ) - sin(-ωT + θ)]

        = 0   (since |sin φ| ≤ 1 for all φ)    ----(4)
Stationary Processes
From equations (1) and (4), we see that
E[X(t)] = A[X(t)].
Hence {X(t)} is mean ergodic.

The random process {X(t)} is said to be
correlation ergodic if
E[X(t)X(t+τ)] = A[X(t)X(t+τ)], where

A[X(t)X(t+τ)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} x(t) x(t+τ) dt
Stationary Processes
Consider
A[X(t)X(t+τ)] = lim_{T→∞} (1/2T) ∫_{-T}^{T} A cos(ωt + θ) A cos(ω{t+τ} + θ) dt

= A² lim_{T→∞} (1/2T) ∫_{-T}^{T} [cos(2ωt + ωτ + 2θ) + cos(ωτ)]/2 dt

= (A²/2) lim_{T→∞} (1/2T) ∫_{-T}^{T} cos(2ωt + ωτ + 2θ) dt
  + (A²/2) lim_{T→∞} (1/2T) ∫_{-T}^{T} cos(ωτ) dt

= (A²/2) lim_{T→∞} (1/2T) [sin(2ωt + ωτ + 2θ)/2ω]_{-T}^{T}
  + (A²/2) cos(ωτ) lim_{T→∞} (1/2T) [t]_{-T}^{T}
Stationary Processes
= (A²/4ω) lim_{T→∞} (1/2T) [sin(2ωT + ωτ + 2θ) - sin(-2ωT + ωτ + 2θ)]
  + (A²/2) cos(ωτ) lim_{T→∞} (1/2T) [T - (-T)]

= 0 + (A²/2) cos(ωτ)    (since |sin φ| ≤ 1)

A_XX(t, t+τ) = (A²/2) cos(ωτ)    ----(5)

From equations (3) and (5), we see that
E[X(t)X(t+τ)] = A[X(t)X(t+τ)]
Stationary Processes
Hence the given random process {X(t)} is
correlation ergodic.

Note: Since the above process is WSS, it is


first order stationary.

8. Show that the process X(t) = A cos ωt + B sin ωt,
where A and B are uncorrelated random
variables, is wide sense stationary if
Stationary Processes
E[A] = E[B] = 0;   E[A²] = E[B²]

Solution : HW

9. Determine the mean and the variance of


the random process X(t) = A cos(ωt + θ)
where A and ω are constants and θ is
uniformly distributed in (0, π/2).

Solution : HW
Stationary Processes
10. Two random processes X(t) and Y(t)
are defined by X(t) = A cos ω0t + B sin ω0t and
Y(t) = B cos ω0t - A sin ω0t. Show that X(t) and
Y(t) are jointly wide sense stationary, if A
and B are uncorrelated zero mean random
variables having the same variance and ω0
is a constant.

Solution: HW
Stationary Processes
11. If X(t) is a WSS process with
autocorrelation function R_XX(τ) = A e^{-2|τ|},
where A is any constant, obtain the
second order moment of the random
variable X(8) - X(5).

Solution : HW

12. Given a random variable Y with
characteristic function φ(ω) = E[e^{iωY}] and
a random process X(t) = cos(λt + Y). Show
Stationary Processes
that X(t) is WSS if φ(1) = φ(2) = 0.

Solution :
A random process {X(t)} is said to be WSS if
it satisfies the following conditions:
1. E[X(t)] is a constant with respect to time.
2. The auto-correlation function RXX(t1,t2) is
a function of the length of the time
difference. i.e. RXX(t1,t2)= RXX(t1-t2)
3. E[X²(t)] < ∞.
Stationary Processes
Since φ(ω) is the characteristic function of
the random variable Y, we have
φ(ω) = E[e^{iωY}] = E[cos(ωY) + i sin(ωY)]

Since φ(1) = 0, we have

E[cos Y + i sin Y] = 0
E[cos Y] + i E[sin Y] = 0   (WHY??)
This implies that
E[cos Y] = 0 and E[sin Y] = 0   (WHY??)
-------(1)
Stationary Processes
Also, φ(2) = 0 yields
E[cos 2Y + i sin 2Y] = 0
E[cos 2Y] + i E[sin 2Y] = 0
This implies that
E[cos 2Y] = 0 and E[sin 2Y] = 0
------(2)
Consider E[X(t)] = E[cos(λt + Y)]
= E[cos λt cos Y - sin λt sin Y]
= cos λt E[cos Y] - sin λt E[sin Y]
= 0 (from (1)), a constant.
Stationary Processes
Now consider R_XX(t, t+τ) = E[X(t)X(t+τ)]
Therefore,
R_XX(t, t+τ) = E[cos(λt + Y) cos(λ[t+τ] + Y)]
= E[{cos({λt+Y} + {λ[t+τ]+Y}) + cos({λt+Y} - {λ[t+τ]+Y})}/2]
= E[cos(2λt + λτ + 2Y)]/2 + E[cos(-λτ)]/2   (why ??)
= E[cos(2λt + λτ) cos 2Y - sin(2λt + λτ) sin 2Y]/2
  + E[cos(λτ)]/2   (why???)
= cos(2λt + λτ) E[cos 2Y]/2
  - sin(2λt + λτ) E[sin 2Y]/2 + E[cos(λτ)]/2
Stationary Processes
Hence R_XX(t, t+τ) = E[cos(λτ)]/2 = cos(λτ)/2 (from (2)),
is exclusively a function of the length of
the time interval and not the end points.

Further, E[X²(t)] = R_XX(t, t) = cos(0)/2
                  = 1/2 < ∞.

Hence the given random process {X(t)} is
WSS.
Stationary Processes
HW
13. Verify whether the random process
X(t) = A cos(ωt + θ) is WSS given that A and
ω are constants and θ is uniformly
distributed in (i) (-π, π)  (ii) (0, π)  (iii) (0, π/2).
Stationary Processes
14. If X(t) = Y cos ωt + Z sin ωt, where Y
and Z are two independent normal RVs
with E[Y] = E[Z] = 0, E[Y²] = E[Z²] = σ² and ω is
a constant, prove that {X(t)} is a stationary
process of order 2.

Solution: HW
Stationary Processes
15. Verify whether the random process
X(t) = Y cos ωt, where ω is a constant and Y
is a uniformly distributed random variable
in (0,1), is a strict sense stationary process.

Solution : HW
Sine Wave Processes
Definition: A random process {X(t)} of the form
X(t) = A cos(ωt + θ) or X(t) = A sin(ωt + θ),
where any non-empty combination of A, ω, θ
are random variables, is called a sine wave
process.
A is called the amplitude,
ω is called the frequency,
θ is called the phase.
Random Processes
Orthogonal Processes:
Two random processes {X(t)} and {Y(t)}
are said to be orthogonal if their cross
correlation function RXY(t1,t2)=0.

Uncorrelated Processes:
Two random processes {X(t)} and {Y(t)}
are said to be uncorrelated if their cross
covariance function CXY(t1,t2)=0.
i.e. RXY(t1,t2)=E[X(t1)]E[Y(t2)]
Random Processes
Note: Two independent random processes
are uncorrelated but the converse need
not be true. (Can you give a Counter-
example???)

Remark: If two random processes {X(t)}


and {Y(t)} are statistically independent,
then their cross-correlation function is
given by RXY(t1,t2)=E[X(t1)]E[Y(t2)]
Random Processes
Remark:
If two such independent random processes {X(t)}
and {Y(t)} are also at least WSS, then
R_XY(t1, t2) = μ_X μ_Y, a constant with respect to time.
Normal/ Gaussian Process
Definition:
A random process {X(t)} is called a Normal/
Gaussian process if its nth order joint density
function is given by

f_X(x1, x2, .., xn; t1, t2, .., tn) = [1/√((2π)^n |C_X|)] exp{ -½ [X - μ_X] C_X^{-1} [X - μ_X]^T }
Normal/ Gaussian Process
where [X - μ_X] is the row vector
[x1 - μ_{X1}, x2 - μ_{X2}, .., xn - μ_{Xn}],

and C_X is the covariance matrix given by

       | C11  C12  ..  C1n |
C_X =  | C21  C22  ..  C2n |
       | ..   ..   ..  ..  |
       | Cn1  Cn2  ..  Cnn |
Normal/ Gaussian Process
where C_ij = C_XX(t_i, t_j) is the auto-covariance
function of X(t). Also, [X - μ_X]^T is the
transpose of the matrix [X - μ_X], where
μ_{X_i} = E[X(t_i)].

Remarks:
1. A Gaussian process is completely
determined by its mean and auto-
covariance functions. (WHY???)
Normal/ Gaussian Process
2. If a Gaussian process is WSS, then it is
strictly stationary. (WHY???)

(Hint: If {X(t)} is WSS, then its mean is a


constant with respect to time and its auto-
covariance function is only a function of
the length of the time interval and not on
the end points.)
Normal/ Gaussian Process
16. If {X(t)} is a Gaussian process with mean
μ(t) = 10 and auto-covariance function
C_XX(t1, t2) = 16 e^{-|t1 - t2|}, find the probability
that X(10) ≤ 8 and |X(10) - X(8)| ≤ 4.

Solution: Consider the Gaussian process
with mean μ(t) = 10, i.e. E[X(t)] = 10, and the
auto-covariance function
C_XX(t1, t2) = 16 e^{-|t1 - t2|}.
Normal/ Gaussian Process
The random variable [X(ti) - E[X(ti)]] / √(C_XX(ti, ti)), where ti
is any fixed time point, follows a standard
normal distribution.

P[X(10) ≤ 8] = P[ (X(10) - E[X(10)])/√(C_XX(10,10)) ≤ (8 - 10)/√16 ]

             = P[Z ≤ -2/4] = P[Z ≤ -0.5]
             = 1 - 0.6915 = 0.3085
Normal/ Gaussian Process
Consider the random variable X(10) - X(8).
Then E[X(10) - X(8)] = E[X(10)] - E[X(8)] = 0.
Var[X(10) - X(8)]
= Var[X(10)] + Var[X(8)] - 2Cov[X(8), X(10)]
= 16 + 16 - 2×16 e^{-|10-8|}
= 32 - 32 e^{-2}
= 32(0.8647)
= 27.67
Normal/ Gaussian Process
P[|X(10) - X(8)| ≤ 4] = P[ |Z| ≤ 4/√27.67 ]
                      = P[|Z| ≤ 0.76]
                      = 2P[0 ≤ Z ≤ 0.76]
                      = 2[P(Z < 0.76) - 0.5]
                      = 2[0.7764 - 0.5]
                      = 0.5528
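The same normal-distribution look-ups can be done in code (a cross-check of the solution above, not part of the original notes):

```python
from scipy.stats import norm
import numpy as np

# P[X(10) <= 8] with E[X(10)] = 10 and Var[X(10)] = C_XX(10,10) = 16
print(norm.cdf((8 - 10)/np.sqrt(16)))                              # ~ 0.3085

# P[|X(10) - X(8)| <= 4]: the difference is normal with mean 0 and
# variance 16 + 16 - 2*16*exp(-2) ~ 27.67
var_d = 32 - 32*np.exp(-2)
print(norm.cdf(4/np.sqrt(var_d)) - norm.cdf(-4/np.sqrt(var_d)))    # ~ 0.553
```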
Random Telegraph Process
Let {X(t)} be a random process satisfying
the following conditions:
1. {X(t)} assumes only one of the 2
possible levels +1 or -1 at any time.
2. X(t) switches back and forth between its
two levels randomly with time.
3. The number of level transitions in any
time interval of length τ is a Poisson
random variable, i.e. the probability of
exactly k transitions, with average rate λ of
transitions, is given by
e^{-λτ}(λτ)^k / k!,   k = 0, 1, 2, ....
4. Transitions occurring in any time
interval are statistically independent of the
transitions in any other interval.
5. The levels at the start of any interval are
equally probable.

X(t) is called a semi-random telegraph


signal process.
Random Telegraph Process
Alternately, if N(t) represents the number
of occurrences of a specified event in (0,t),
N(t) is Poisson with parameter λt and
Y(t) = (-1)^{N(t)}, then the random process Y(t)
is called a semi-random telegraph signal
process.

If Y(t) is a semi-random telegraph signal


process, A is a random variable which is
Random Telegraph Process
independent of Y(t) and assumes values
+1 and -1 with equal probability, then the
random process {X(t)} defined by
X(t)=AY(t) is called a random telegraph
signal process
We now show that the random telegraph
process defined above is a WSS process.
To show that X(t) is WSS, we need to
prove the following:
Random Telegraph Process
1. E[X(t)] is a constant with respect to
time.
2. The auto-correlation function RXX(t1,t2) is
exclusively a function of the length of the
time difference. i.e. RXX(t1,t2)= RXX(t1-t2)
3. E[X²(t)] < ∞.

Consider E[X(t)] = E[AY(t)]


Random Telegraph Process
E[X(t)] = E[A]E[Y(t)] (A and Y(t) are
independent).
Consider E[A] = (1) P(A=1) + (-1) P(A=-1)
              = (1)(½) + (-1)(½)
              = 0    -------------(1)
Now, E[A²] = (1)² P(A=1) + (-1)² P(A=-1)
           = (1)(½) + (1)(½)
           = 1    --------------(2)
Random Telegraph Process
By definition of Y(t), Y(t) takes the value
+1 whenever the number of level
transitions N(t) in the interval (0,t) is even
and Y(t) assumes the value -1 whenever
the number of level transitions N(t) in the
interval (0,t) is odd.

Hence P[Y(t) = 1] = P[N(t) = even]
                  = P[N(t) = 0] + P[N(t) = 2]
                    + P[N(t) = 4] + ...
Random Telegraph Process
= e^{-λt} + e^{-λt}(λt)²/2! + e^{-λt}(λt)⁴/4! + ....

= e^{-λt}[ 1 + (λt)²/2! + (λt)⁴/4! + .... ]

= e^{-λt} (e^{λt} + e^{-λt})/2

P[Y(t) = 1] = e^{-λt} cosh(λt)    -------(3)
Random Telegraph Process

Also, P[Y(t) = -1] = P[N(t) = odd]
                   = P[N(t) = 1] + P[N(t) = 3]
                     + P[N(t) = 5] + ...

= e^{-λt}(λt)¹/1! + e^{-λt}(λt)³/3! + .....

= e^{-λt}[ (λt)¹/1! + (λt)³/3! + .... ]

P[Y(t) = -1] = e^{-λt} sinh(λt)    -------(4)


Random Telegraph Process
E[Y(t)] = (1) e^{-λt} cosh(λt) + (-1) e^{-λt} sinh(λt)

        = e^{-λt}[ (e^{λt} + e^{-λt})/2 - (e^{λt} - e^{-λt})/2 ]

E[Y(t)] = e^{-2λt}    ------(5)

Substituting expressions (1) and (5) in
E[X(t)], we obtain E[X(t)] = 0, which is a
constant with respect to time.
Random Telegraph Process
Hence E[X(t)] is a constant with respect to
time.

We next consider the auto-correlation


function of X(t), viz,
R_XX(t, t+τ) = E[X(t)X(t+τ)]
             = E[AY(t) · AY(t+τ)]
             = E[A²Y(t)Y(t+τ)]
Random Telegraph Process
R_XX(t, t+τ) = E[A²] E[Y(t)Y(t+τ)]
( A is independent of Y(t))
             = (1) R_YY(t, t+τ)

We now compute R_YY(t, t+τ) = E[Y(t)Y(t+τ)].

If Y(t) = 1, then Y(t+τ) = 1 if the number of
level transitions in the interval (t, t+τ) is
even.
Random Telegraph Process
Hence
P[Y(t+τ) = 1 | Y(t) = 1] = P[number of
level transitions in (t, t+τ) is even]
= e^{-λτ} cosh(λτ)
P[Y(t) = 1, Y(t+τ) = 1]
= P[Y(t+τ) = 1 | Y(t) = 1] P[Y(t) = 1]
= e^{-λτ} cosh(λτ) · e^{-λt} cosh(λt)    -----(7a)
Random Telegraph Process
Similarly,
P[Y(t) = 1, Y(t+τ) = -1]
= e^{-λτ} sinh(λτ) · e^{-λt} cosh(λt)    -----(7b)

P[Y(t) = -1, Y(t+τ) = 1]
= e^{-λτ} sinh(λτ) · e^{-λt} sinh(λt)    -----(7c)

P[Y(t) = -1, Y(t+τ) = -1]
= e^{-λτ} cosh(λτ) · e^{-λt} sinh(λt)    -----(7d)
Random Telegraph Process
Hence,
R_YY(t, t+τ) = (1)(1) e^{-λτ} cosh(λτ) e^{-λt} cosh(λt)
             + (1)(-1) e^{-λτ} sinh(λτ) e^{-λt} cosh(λt)
             + (-1)(1) e^{-λτ} sinh(λτ) e^{-λt} sinh(λt)
             + (-1)(-1) e^{-λτ} cosh(λτ) e^{-λt} sinh(λt)
= e^{-λτ} e^{-λt} {cosh(λτ) cosh(λt)
  - sinh(λτ) cosh(λt) - sinh(λτ) sinh(λt)
  + cosh(λτ) sinh(λt)}
Random Telegraph Process
R_YY(t, t+τ) = e^{-λτ} e^{-λt} {cosh(λτ) [cosh(λt)
+ sinh(λt)] - sinh(λτ) [cosh(λt) + sinh(λt)]}
= e^{-λτ} e^{-λt} [cosh(λτ) - sinh(λτ)] [cosh(λt) + sinh(λt)]
= e^{-λτ} e^{-λt} e^{-λτ} e^{λt}
R_YY(t, t+τ) = e^{-2λτ}    -------(8)

Substituting equation (8) in
R_XX(t, t+τ) = R_YY(t, t+τ), we have
Random Telegraph Process
R_XX(t, t+τ) = e^{-2λτ}, which is exclusively a
function of τ.

Further E[X²(t)] = R_XX(t, t) = 1 < ∞.

Hence, the random telegraph process X(t)
is a WSS process.
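An illustrative simulation of the random telegraph signal built exactly as defined above (random level A times (-1) to the power of a Poisson count), checking R_XX(τ) ≈ e^{-2λτ}; the numerical values are assumptions for the example only.

```python
import numpy as np

rng = np.random.default_rng(6)
lam, t, tau, reps = 1.0, 5.0, 0.8, 200000
A = rng.choice([-1, 1], size=reps)              # random start level, +/-1 with equal probability
N_t   = rng.poisson(lam*t, reps)                # transitions in (0, t)
N_tau = rng.poisson(lam*tau, reps)              # transitions in (t, t+tau), independent increment
X_t    = A*(-1)**N_t                            # X(t)   = A * Y(t)
X_ttau = A*(-1)**(N_t + N_tau)                  # X(t+tau)

print((X_t*X_ttau).mean(), np.exp(-2*lam*tau))  # estimated R_XX(tau) vs theory
```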
Random Telegraph Process
Remark:
From equations (3), (4), (5) and (8) we see
that even though the auto-correlation
function of the semi-random telegraph
signal {Y(t)} is exclusively a function of τ,
since the first order p.m.f. is itself a function
of t, {Y(t)} is not even a first order
stationary process. It is an evolutionary
process.
Markov Chains
Discrete-Time Markov Chains:
Without loss of generality, assume that the
parameter space T = {0, 1, 2, ...}. Hence the
state of the system is observed at the time
points 0, 1, 2, ... These observations are
denoted by X0, X1, X2, ...

If Xn=j, then the state of the system at time


step n is said to be j.
Let pj(n)=P[Xn=j] denote the probability
that Xn is in state j.
Markov Chains
In other words, pj(n) denotes the p.m.f. of
the random variable Xn.

The conditional p.m.f.


p_jk(m,n) = P[Xn = k | Xm = j] is called the
transition p.m.f.
Markov Chains
Discrete-Time Markov Chains:
Let {Xn} be a discrete-time integer valued
Markov chain (starting at n=0) with initial
PMF
p_j(0) = P[X0 = j],   j = 0, 1, 2, ...

The joint PMF for the first n+1 values of


the process is
P[Xn = in, Xn-1 = in-1, .., X0 = i0] = P[Xn = in | Xn-1 = in-1] ... P[X1 = i1 | X0 = i0] P[X0 = i0]
Discrete-Time Markov Chain
If the one-step state transition probabilities
are fixed and do not change with time, i.e.
P[Xn+1 = j | Xn = i] = p_ij for all n,
{Xn} is said to have homogeneous
transition probabilities.

A Markov chain is said to be a


homogeneous Markov chain if the
transition p.m.f. pjk(m,n) depends only on
the difference n-m.
Discrete-Time Markov Chain
The homogeneous state transition
probabilities satisfy the following
conditions:

0 ≤ p_ij ≤ 1

Σ_j p_ij = 1,   i = 1, 2, .., n

since the states are mutually exclusive
and collectively exhaustive.
Discrete-Time Markov Chain
The transition probability matrix P : The one-step transition probabilities of a discrete
parameter Markov Chain are completely specified in the form of a transition
probability matrix (t.p.m.) given by P = [p_ij]

     | p00  p01  p02  ... |
P =  | p10  p11  p12  ... |
     | .    .    .    ... |
     | pi0  pi1  pi2  ... |

The row sum is 1 for each row of P : stochastic matrix
All the entries lie in [0,1]
The stochastic matrix is said to be doubly stochastic iff all the column entries add up
to 1 for every column.
A stochastic matrix P is said to be regular if all the entries of some power Pⁿ are positive.
Discrete-Time Markov Chain
The joint PMF of Xn, Xn-1, ..., X0 is given
by
P[Xn = in, Xn-1 = in-1, .., X0 = i0] = p_{i_{n-1}, i_n} ... p_{i_0, i_1} p_{i_0}(0)

Thus {Xn} is completely specified by the


initial PMF and the matrix of the one-step
transition probabilities P
Discrete-Time MC Computer Repair Example

Two aging computers are used for word processing.


When both are working in morning, there is a 30%
chance that one will fail by the evening and a 10%
chance that both will fail.
If only one computer is working at the beginning of the
day, there is a 20% chance that it will fail by the close of
business.
If neither is working in the morning, the office sends all
work to a typing service.
Computers that fail during the day are picked up the
following morning, repaired, and then returned the next
morning.
The system is observed after the repaired computers
have been returned and before any new failures occur.
States for Computer Repair Example

Index State State definitions

0 s = (0) No computer has failed. The office


starts the day with both computers
functioning properly.
1 s = (1) One computer has failed. The
office starts the day with one
working computer and the other in
the shop until the next morning.
2 s = (2) Both computers have failed. All
work must be sent out for the day.
Events and Probabilities for Computer Repair
Index Current Events Probability Next state
state

0 s0 = (0) Neither computer fails. 0.6 s' = (0)

One computer fails. 0.3 s' = (1)

Both computers fail. 0.1 s' = (2)

1 s1 = (1) Remaining computer does 0.8 s' = (0)


not fail and the other is
returned.
Remaining computer fails 0.2 s' = (1)
and the other is returned.

2 s2 = (2) Both computers are 1.0 s' = (2)


returned.
State-Transition Matrix and
Network
The events associated with a Markov chain can be
described by the m × m matrix P = (p_ij).
For computer repair example, we have:

     | 0.6  0.3  0.1 |
P =  | 0.8  0.2  0   |
     | 1    0    0   |
State-Transition Matrix and
Network
State-Transition Network
Node for each state
Arc from node i to node j if pij > 0.

For the computer repair example: [state-transition
network with nodes 0, 1, 2 and arcs labelled with
the one-step probabilities 0.6, 0.3, 0.1, 0.8, 0.2, 1 from P]
Discrete-Time Markov Chain
The n-step transition probabilities :
Let P(n)=[pij(n)] be the matrix of n-step
transition probabilities, where
p_ij(n) = P[Xn+k = j | Xk = i] = P[Xn = j | X0 = i] for all
n ≥ 0 and k ≥ 0, since the transition
probabilities do not depend on time.
Then P(n)=Pn
Discrete-Time Markov Chain
Consider P(2):

P[X2 = j, X1 = k | X0 = i] = P[X2 = j, X1 = k, X0 = i] / P[X0 = i]

= P[X2 = j | X1 = k] P[X1 = k | X0 = i] P[X0 = i] / P[X0 = i]

= P[X2 = j | X1 = k] P[X1 = k | X0 = i] = p_ik(1) p_kj(1)

p_ij(2) = Σ_k p_ik(1) p_kj(1)   ∀ i, j
Discrete-Time Markov Chain
P(2) = P(1)P(1) = P²
P(n) = P(n-1)P
     = P(n-2)P·P
     = P(n-2)P²
     = Pⁿ
Hence the n-step transition probability
matrix is the nth power of the one-step
transition probability matrix.
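For example (using the computer-repair chain above; a small numpy sketch), the n-step matrix is just the matrix power of P:

```python
import numpy as np

P = np.array([[0.6, 0.3, 0.1],
              [0.8, 0.2, 0.0],
              [1.0, 0.0, 0.0]])     # one-step t.p.m. of the computer repair example

P2 = np.linalg.matrix_power(P, 2)   # two-step transition probabilities P(2) = P^2
P5 = np.linalg.matrix_power(P, 5)
print(P2)
print(P5.sum(axis=1))               # each row still sums to 1 (stochastic matrix)
```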
Discrete-Time Markov Chain
Chapman-Kolmogorov Equations:

p_ij(n) = Σ_k p_ik(r) p_kj(n - r),   0 < r < n

Interpretation:
RHS is the probability of going from i to k in r steps and
then going from k to j in the remaining n - r steps,
summed over all possible intermediate states k.
Discrete-Time Markov Chain
State Probabilities:
The state probabilities at time n are given by
the row vector p(n) = {p_j(n)}. Now,

p_j(n) = Σ_i P[Xn = j | Xn-1 = i] P[Xn-1 = i] = Σ_i p_ij p_i(n-1)

Therefore p(n) = p(n-1)P.
Similarly, p(n) = p(0)P(n) = p(0)Pⁿ,   n = 1, 2, ...
Hence the state PMF at time n is obtained by
multiplying the initial PMF by Pⁿ.
Discrete-Time Markov Chain
Limiting State/ Steady-state Probabilities:
Let {Xn} be a discrete-time Markov chain with N
states.
P[Xn = j] - the probability that the process is in
state j at the end of the first n transitions,
j = 1, 2, .., N.
Then

P[Xn = j] = p_j(n) = Σ_{i=1}^{N} P[X0 = i] p_ij(n)
Discrete-Time Markov Chain - Steady-state
Probabilities
As n → ∞, the n-step transition probability p_ij(n)
does not depend on i, which means that
P[Xn = j] approaches a constant as n → ∞.
The limiting-state probabilities are defined as

lim_{n→∞} P[Xn = j] = π_j,   j = 1, 2, ..., N

Since p_ij(n) = Σ_k p_ik(n-1) p_kj,

lim_{n→∞} p_ij(n) = lim_{n→∞} Σ_k p_ik(n-1) p_kj = Σ_k π_k p_kj
Discrete-Time Markov Chain - Steady-state
Probabilities
Defining the steady-state/ limiting-state
probability vector π = [π_1, π_2, ..., π_N], we have

π_j = Σ_k π_k p_kj

π = πP

Σ_j π_j = 1

The last equation is due to the law of total
probability.
The probability π_j is interpreted as the long-run
proportion of time that the MC spends in
state j.
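A small sketch computing the limiting-state vector π by solving πP = π together with Σ_j π_j = 1 (again for the computer-repair matrix; the least-squares solve is just one convenient way to handle the overdetermined system):

```python
import numpy as np

P = np.array([[0.6, 0.3, 0.1],
              [0.8, 0.2, 0.0],
              [1.0, 0.0, 0.0]])

n = P.shape[0]
A = np.vstack([P.T - np.eye(n), np.ones(n)])    # equations pi(P - I) = 0 and sum(pi) = 1
b = np.append(np.zeros(n), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)      # solve the stacked linear system
print(pi)                                       # long-run proportion of time in each state
```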
Discrete-Time Markov Chain
Classification of States:
A state j is said to be accessible from state
i (j can be reached from i) if, starting from
state i, it is possible that the process will
ever enter state j.
pij(n)>0 for some n>0.
Two states that are accessible from each
other are said to communicate with each
other.
Discrete-Time Markov Chain - Classification of
States

Communication induces a partition of


states
States that communicate belong to the
same class
All members of a class communicate with
each other.
If a class is not accessible from any state
outside the class, the class is said to be a
closed communicating class.
Discrete-Time Markov Chain - Classification of
States

A Markov chain in which all the states


communicate is called an irreducible
Markov Chain.
In an irreducible Markov chain, there is
only one class.
Discrete-Time Markov Chain - Classification of
States

States that the process enters infinitely


often and states that the process enters
finitely often.
Process will be found in those states that it
enters infinitely often
Discrete-Time Markov Chain - Classification of
States

Probability of first passage from state i to


state j in n transitions - fij(n)
The conditional probability that given
that the process is in state i, the first time
the process enters state j occurs in exactly
n transitions.
Discrete-Time Markov Chain - Classification of
States
Probability of first passage from state i to
state j : f_ij

f_ij = Σ_{n=1}^{∞} f_ij(n)

Conditional probability that the process will


ever enter state j given that it was initially in
state i
Discrete-Time Markov Chain - Classification of
States
Clearly f_ij(1) = p_ij

and f_ij(n) = Σ_{l≠j} p_il f_lj(n-1)

f_ii denotes the probability that a process that
starts at state i will ever return to state i.
If f_ii = 1, then state i is called a recurrent state.
If f_ii < 1, then state i is called a transient state.
Discrete-Time Markov Chain - Classification of
States

State j is called
transient (non-recurrent) if there is a
positive probability that the process will
never return to j again if it leaves j
recurrent (persistent) if with probability
1, the process will eventually return to j
after it leaves j
A set of recurrent states forms a single
chain if every member of the set communicates
with all the members of the set.
Discrete-Time Markov Chain - Classification of
States
Recurrent state j is called a
periodic state if there exists an integer d,
d>1, such that pjj(n) is zero for all values
of n other than d, 2d, 3d,; d is called the
period. If d=1, j is called aperiodic.

positive recurrent state if, starting at state


j the expected time until the process
returns to state j is finite; otherwise it is
called a null-recurrent state.
Discrete-Time Markov Chain - Classification of
States

Positive recurrent states are called ergodic


states
A chain consisting of ergodic states is
called an ergodic chain.
A state j is called an absorbing (trapping)
state if p_jj = 1. Thus, once the process
enters an absorbing/ trapping state, it
never leaves the state.
Discrete-Time Markov Chain - Classification of
States

[Classification tree: states are either Transient/ Non-recurrent or
Recurrent/ persistent; recurrent states are further classified as
Null recurrent or Positive recurrent, and as Periodic or Aperiodic;
aperiodic positive recurrent states are Ergodic.]
Discrete-Time Markov Chain -
Classification of States

If a Markov chain is irreducible, all its


states are of the same type. They are
either all transient or all null persistent or
all non-null persistent.
Further, all the states are either aperiodic
or periodic with the same period.
If a Markov chain is finite and irreducible,
then all its states are non-null persistent.
Continuous-Time Markov Chain
A random process {X(t), t ≥ 0} is a continuous-time Markov
chain if, for all s, t ≥ 0 and nonnegative integers i, j, k,

P[X(t+s) = j | X(s) = i, X(u) = k, 0 ≤ u < s] = P[X(t+s) = j | X(s) = i]

In a continuous time Markov chain, the conditional
probability of the future state at time t+s, given the present
state at s and all past states, depends only on the present
state and not on the past.
Continuous-Time Markov Chain
If, in addition, P[X(t+s) = j | X(s) = i] is
independent of s, then the process
{X(t), t ≥ 0} is said to be time-homogeneous
or have the time-homogeneity property.
Time-homogeneous Markov chains
have stationary (or homogeneous)
transition probabilities.
Let p_ij(t) = P[X(t+s) = j | X(s) = i]
    p_j(t) = P[X(t) = j]
Continuous-Time Markov Chain
In other words, pij(t) is the probability that a MC
presently in state i will be in state j after an
additional time t and pj(t) is the probability that a
MC is in state j at time t.
The transition probabilities satisfy

0 ≤ p_ij(t) ≤ 1

Σ_j p_ij(t) = 1

Further   Σ_j p_j(t) = 1
Continuous-Time Markov Chain
Chapman-Kolmogorov equation

p_ij(t + s) = Σ_k p_ik(t) p_kj(s)
Markov Chains
17. A man either drives a car or catches a
train to go to office each day. He never
goes 2 days in a row by train, but if he
drives one day , then the next day he is
just as likely to drive again as he is to
travel by train. Now suppose that on the
first day of the week, the man tossed a fair
die and drove to work if and only if a 6
appeared, find (i) the probability that he
takes a train on the third day and (ii) the
probability that he drives to work in the
long run.
Markov Chains
The travel pattern forms a Markov chain, with
state space = {train, car}.

The t.p.m. of the chain is given by

     |  0    1  |
P =  | 1/2  1/2 |

The initial state probability distribution is given
by p(1) = [5/6, 1/6], since
P(traveling by car) = P(getting a 6 in the
toss of the die) = 1/6.
Also, P(traveling by train) = 5/6.
Markov Chains
Now,
p(2) = p(1)P = [5/6, 1/6] |  0    1  | = [1/12, 11/12]
                          | 1/2  1/2 |

p(3) = p(2)P = [1/12, 11/12] |  0    1  | = [11/24, 13/24]
                             | 1/2  1/2 |

Therefore,
P(the man travels by train on the third day) =
11/24.
Let π = (π1, π2) be the limiting form of the
state probability distribution or stationary
Markov Chains
state distribution of the Markov chain.
By the property of π, πP = π:

(π1, π2) |  0    1  | = (π1, π2)
         | 1/2  1/2 |

(1/2)π2 = π1    -----(1)   and
π1 + (1/2)π2 = π2    ------(2)

Equations (1) and (2) are the same.
Along with the equation π1 + π2 = 1, ---(3)
Markov Chains
since π is a probability distribution, we
obtain
(1/2)π2 + π2 = 1
(3/2)π2 = 1
π2 = 2/3,   π1 = 1 - 2/3 = 1/3

Hence π1 = 1/3, π2 = 2/3.
Therefore P[the man travels by car in the long
run] = 2/3
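The answers to problem 17 can be verified with a few lines of numpy (a cross-check of the hand computation above):

```python
import numpy as np

P = np.array([[0.0, 1.0],
              [0.5, 0.5]])               # states: (train, car)
p1 = np.array([5/6, 1/6])                # initial distribution from the die toss

p3 = p1 @ np.linalg.matrix_power(P, 2)   # distribution on the third day (two transitions)
print(p3)                                # [11/24, 13/24] -> P(train on day 3) = 11/24

pi = p1 @ np.linalg.matrix_power(P, 200) # long-run distribution
print(pi)                                # ~ [1/3, 2/3] -> P(car in the long run) = 2/3
```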
Markov Chains
18. Three boys A,B and C are throwing a
ball to each other. A always throws the ball
to B and B always throws the ball to C, but
C is just as likely to throw the ball to B as
to A. Show that the process is Markovian.
Find the transition probability matrix and
classify the states.

Solution: Let A,B,C denote the states of


the Markov chain.
Markov Chains
The transition probability matrix of {Xn} is
given by

     |  0    1    0  |
P =  |  0    0    1  |
     | 1/2  1/2   0  |
Since the state Xn depends only on Xn-1
and not on Xn-2, Xn-3, ..., the process {Xn}
is a Markov chain.
We first observe that the chain is finite.
(Draw the tpm/ network)
Markov Chains
We observe that all the states
communicate with each other

Hence, the MC is irreducible

Since the MC is finite, all the states are


positive recurrent.

Further since state A is aperiodic


(WHY???) all the states are aperiodic,
ergodic
Markov Chains
19. The transition probability matrix of a
Markov chain {Xn} with three states
1, 2 and 3 is

     | 0.1  0.5  0.4 |
P =  | 0.6  0.2  0.2 |
     | 0.3  0.4  0.3 |

and the initial distribution is p(0) = (0.7, 0.2, 0.1).
Find (i) P[X2 = 3]  (ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2].

Solution: Consider
Markov Chains
            | 0.1  0.5  0.4 |   | 0.1  0.5  0.4 |   | 0.43  0.31  0.26 |
P(2) = P² = | 0.6  0.2  0.2 | × | 0.6  0.2  0.2 | = | 0.24  0.42  0.34 |
            | 0.3  0.4  0.3 |   | 0.3  0.4  0.3 |   | 0.36  0.35  0.29 |

P[X2 = 3] = Σ_{i=1}^{3} P[X2 = 3 | X0 = i] P[X0 = i]

= P[X2 = 3 | X0 = 1]P[X0 = 1] + P[X2 = 3 | X0 = 2]P[X0 = 2]
  + P[X2 = 3 | X0 = 3]P[X0 = 3]

= p13(2) P[X0 = 1] + p23(2) P[X0 = 2] + p33(2) P[X0 = 3]
Markov Chains
P[X2 = 3] = 0.26 × 0.7 + 0.34 × 0.2 + 0.29 × 0.1
          = 0.182 + 0.068 + 0.029 = 0.279

(ii) P[X3 = 2, X2 = 3, X1 = 3, X0 = 2]
= P[X3 = 2 | X2 = 3, X1 = 3, X0 = 2] P[X2 = 3, X1 = 3, X0 = 2]
  (conditional probability)
= P[X3 = 2 | X2 = 3] P[X2 = 3 | X1 = 3, X0 = 2] P[X1 = 3, X0 = 2]
  (Markov property)
= p32(1) · p33(1) · p23(1) · P[X0 = 2]
= 0.4 × 0.3 × 0.2 × 0.2 = 0.0048
Markov Chains
20. A gambler has Rs.2. He bets Re. 1 at a
time and wins Re. 1 with probability 0.5.
He stops playing if he loses Rs. 2 or wins
Rs. 4. What is the transition probability
matrix of the related Markov chain?

Solution : HW
Markov Chains
21. There are 2 white marbles in urn A and
3 red marbles in urn B. At each step of the
process, a marble is selected from each
urn and the 2 marbles selected are
interchanged. Let the state ai of the
system be the number of red marbles in A
after i changes. What is the probability that
there are 2 red marbles in A after 3 steps?
In the long run, what is the probability that
there are 2 red marbles in urn A?

Solution : HW
Markov Chains
22. Find the nature of the states of the
Markov chain with the TPM

     |  0    1    0  |
P =  | 1/2   0   1/2 |
     |  0    1    0  |

Solution : HW
Thank You
