Random Process

The document compares random variables and random processes. A random variable is a function of possible outcomes of an experiment, while a random process is also a function of time. Random processes are classified as discrete or continuous based on whether the time and outcomes are discrete or continuous. There are four main classifications: discrete random process, continuous random process, discrete random sequence, and continuous random sequence. The document also defines stationary processes as those whose statistical properties do not change over time, including first-order stationary, second-order stationary, and strictly/widely stationary processes. Evolutionary processes are non-stationary. Auto-correlation measures the correlation between values of a random process at different times.

UNIT – 3 : RANDOM PROCESSES

Comparison of random variables and random processes

Random variable: a function of the possible outcomes of an experiment, i.e., X(s). Each outcome is mapped into a number.

Random process: a function of the possible outcomes of an experiment and also of time, i.e., {X(s, t)}. Each outcome is mapped into a waveform that is a function of time.

RANDOM PROCESS (OR) STOCHASTIC PROCESS


A random process is a collection (ensemble) of random variables {X(s, t)} that are functions of a real
variable, namely time t, where s ∈ S (sample space) and t ∈ T (parameter set or index set).
Example : The wireless signal received by a cell phone over time.

NOTE
If s and t are both fixed, X(s, t) is a number.
If t is fixed, X(s, t) is a random variable.
If s is fixed, X(s, t) is a single time function.
If s and t are both variable, {X(s, t)} is a collection of random variables that are time functions.

3.1 CLASSIFICATION OF RANDOM PROCESSES


Random processes are classified by whether time t (the index set T) is continuous or discrete, and whether the state space S is continuous or discrete:

t continuous, S discrete    -  discrete random process
t continuous, S continuous  -  continuous random process
t discrete,   S discrete    -  discrete random sequence
t discrete,   S continuous  -  continuous random sequence

1. Discrete random process


If T is continuous and S is discrete, the random process is called a discrete random process.
Example : X(t), the number of telephone calls received in the interval (0, t); here S = {0, 1, 2, …}.

2. Continuous random process


If both T and S are continuous, the random process is called a continuous random process.
Example : X(t), the maximum temperature at a place in the interval (0, t).

3. Discrete random sequence


If both T and S are discrete, the random process is called a discrete random sequence.
Example : If X_n represents the outcome of the nth toss of a fair die, then
{X_n, n ≥ 1} is a discrete random sequence with S = {1, 2, 3, 4, 5, 6}.

4. Continuous random sequence:


If T is discrete and S is continuous, the random process is called a continuous random sequence.
Example : X_n, the temperature at the end of the nth hour of a day, 1 ≤ n ≤ 24.
Problem : 1 Define a random process. Explain the classifications of random processes. Give an example of each
case.
Solution : Refer to the topics above.

3.2 STATIONARY PROCESS

A random process {X(t)} is said to be stationary if all its statistical properties do not change with time
(i.e., the statistical properties are constants). Statistical properties means moments: mean, variance, etc.

FIRST ORDER STATIONARY PROCESS


A random process {X(t)} is said to be first order stationary (stationary to order one) if its first order density
function does not change with a shift in time origin, i.e., f(x1; t1) = f(x1; t1 + c) must be true for any t1
and any real number c.

SECOND ORDER STATIONARY PROCESS


A random process is said to be second order stationary (stationary to order two) if its second order density function
does not change with a shift in time origin, i.e., f(x1, x2; t1, t2) = f(x1, x2; t1 + c, t2 + c) for any c.

STRONGLY STATIONARY PROCESS (or) STRICT SENSE STATIONARY PROCESS (SSS)


A random process is said to be strongly stationary if all its finite dimensional distributions are
invariant under a translation of time t,
i.e., f(x1, …, xn; t1, …, tn) = f(x1, …, xn; t1 + c, …, tn + c) for any c and any n.

WIDE-SENSE STATIONARY (WSS) (or) WEAKLY STATIONARY PROCESS (or) COVARIANCE STATIONARY
PROCESS
A random process {X(t)} is said to be wide sense stationary if the following conditions are satisfied:
(i) Mean = E[X(t)] = constant.
(ii) E[X(t) X(t + τ)] = R(τ), i.e., the autocorrelation depends only on τ, where τ = t2 − t1.

EVOLUTIONARY PROCESS
A random process that is not stationary in any sense is called an evolutionary process.

AUTO CORRELATION OF A RANDOM PROCESS


Let X(t1) and X(t2) be two members of the random process {X(t)}. The autocorrelation of the
random process {X(t)} is the expected value of the product of the random variables X(t1) and X(t2), and
is denoted by R_XX(t1, t2) (or) R(t1, t2) (or) R(τ).

R(t1, t2) = E{X(t1) X(t2)} = ∫ ∫ x1 x2 f(x1, x2; t1, t2) dx1 dx2.

NOTE :

First order stationary ⇒ E[X(t)] = a constant.

Stationary of second order ⇒ E[X(t)] = a constant and E[X²(t)] = a constant.

Strict-sense stationary ⇒ E[X(t)] = a constant and Var[X(t)] = a constant.

Wide-sense stationary ⇔ E[X(t)] = a constant and R(t, t + τ) = E[X(t) X(t + τ)] depends only on τ.

Jointly wide-sense stationary ⇔ X(t) is W.S.S., Y(t) is W.S.S., and R_XY(t, t + τ) depends only on τ.
Formula

Discrete case:    E[X(t)] = Σ x_i P(x_i)        E[X²(t)] = Σ x_i² P(x_i)

Continuous case:  E[X(t)] = ∫ x f(x) dx         E[X²(t)] = ∫ x² f(x) dx

Var[X(t)] = E[X²(t)] − (E[X(t)])²

SOLVED PROBLEMS BASED ON STATIONARY PROCESS


Problem : 1 Examine whether the Poisson process {X(t)} given by the probability law
P{X(t) = n} = e^(−λt) (λt)^n / n!, n = 0, 1, 2, … is covariance stationary.

Solution: The probability distribution of X(t) is a Poisson distribution with parameter λt. Therefore,


E[X(t)] = λt, a function of t, not a constant. Therefore, the Poisson process is not covariance stationary
(or) WSS.

Problem : 2 The process {X(t)} whose probability distribution under a certain condition is given by

P{X(t) = n} = (at)^(n−1) / (1 + at)^(n+1),  n = 1, 2, …

P{X(t) = 0} = at / (1 + at)

Show that {X(t)} is not stationary.

Solution:

The probability distribution of {X(t)} is

X(t) = n :  0           1            2             3              …
P(n)     :  at/(1+at)   1/(1+at)²    at/(1+at)³    (at)²/(1+at)⁴  …

Now, E[X(t)] = Σ (n = 0 to ∞) n P(n) = Σ (n = 1 to ∞) n (at)^(n−1) / (1+at)^(n+1)

= 1/(1+at)² [1 + 2θ + 3θ² + …],  where θ = at/(1+at)

= 1/(1+at)² (1 − θ)^(−2)

= 1/(1+at)² · (1+at)²   [since 1 − θ = 1/(1+at)]

Therefore E[X(t)] = 1, a constant.

Now, E[X²(t)] = Σ (n = 1 to ∞) n² P(n) = Σ (n = 1 to ∞) [n(n+1) − n] (at)^(n−1)/(1+at)^(n+1)

= 1/(1+at)² [Σ n(n+1) θ^(n−1) − Σ n θ^(n−1)]

= 1/(1+at)² [2(1 − θ)^(−3) − (1 − θ)^(−2)]

= 1/(1+at)² [2(1+at)³ − (1+at)²]

= 2(1+at) − 1 = 1 + 2at

Var[X(t)] = E[X²(t)] − (E[X(t)])² = 1 + 2at − 1 = 2at, which is not a constant.

Since Var[X(t)] is a function of t, {X(t)} is not a stationary process.

Problem : 3 Show that the random process X(t) = A sin(ωt + θ), where A and ω are constants and θ is a
random variable uniformly distributed in (0, 2π), is first order stationary.
Solution: Given X(t) = A sin(ωt + θ).
Since θ is a uniformly distributed R.V. over (0, 2π), we have

f(θ) = 1/(2π),  0 < θ < 2π

Now E[X(t)] = ∫₀^{2π} X(t) f(θ) dθ

= ∫₀^{2π} A sin(ωt + θ) (1/(2π)) dθ

= (A/(2π)) [−cos(ωt + θ)]₀^{2π}

= (A/(2π)) [cos(ωt) − cos(ωt + 2π)]

= (A/(2π)) [cos ωt − cos ωt] = 0

E[X(t)] = 0, a constant. Therefore, the mean is a constant.


Hence {X(t)} is a first order stationary process.

Problem : 4 Consider the random process X(t) = cos(t + φ), where φ is uniformly distributed in the


interval (0, 2π). Check whether X(t) is stationary or not. Find the first and second moments of the process
and check whether it is S.S.S.
Solution:
Uniform distribution in (a, b): f(x) = 1/(b − a), a < x < b.
Given X(t) = cos(t + φ).
Since φ is a uniformly distributed R.V. over (0, 2π), we have
f(φ) = 1/(2π), 0 < φ < 2π; 0 otherwise.

(i) E[X(t)] = ∫₀^{2π} cos(t + φ) (1/(2π)) dφ

= (1/(2π)) [sin(t + φ)]₀^{2π}

= (1/(2π)) [sin(t + 2π) − sin t]

= (1/(2π)) [sin t − sin t] = 0

E[X(t)] = 0, a constant.

(ii) E[X²(t)] = E[cos²(t + φ)] = E[(1 + cos(2t + 2φ))/2]

= 1/2 + (1/2) ∫₀^{2π} cos(2t + 2φ) (1/(2π)) dφ

= 1/2 + (1/(8π)) [sin(2t + 2φ)]₀^{2π}

= 1/2 + (1/(8π)) [sin(2t + 4π) − sin 2t] = 1/2

E[X²(t)] = 1/2.

Var[X(t)] = E[X²(t)] − (E[X(t)])² = 1/2, a constant. Hence X(t) is an S.S.S. process.

Problem : 5 Show that the random process X(t) = A cos(ωt + θ) is wide-sense stationary (WSS) if A


and ω are constants and θ is a uniformly distributed R.V. in (0, 2π).
Solution:
Given that X(t) = A cos(ωt + θ).
Since θ is a uniformly distributed R.V. in (0, 2π), we have

f(θ) = 1/(2π), 0 < θ < 2π; 0 otherwise.

Now, E[X(t)] = ∫₀^{2π} X(t) f(θ) dθ

= ∫₀^{2π} A cos(ωt + θ) (1/(2π)) dθ

= (A/(2π)) [sin(ωt + θ)]₀^{2π}

= (A/(2π)) [sin(ωt + 2π) − sin ωt]

= (A/(2π)) [sin ωt − sin ωt] = 0

E[X(t)] = 0, a constant.

R(t, t + τ) = E[X(t) X(t + τ)] = E[A cos(ωt + θ) · A cos(ω(t + τ) + θ)]

= (A²/2) E[cos ωτ + cos(2ωt + ωτ + 2θ)]

= (A²/2) cos ωτ + (A²/2) ∫₀^{2π} cos(2ωt + ωτ + 2θ) (1/(2π)) dθ

= (A²/2) cos ωτ + (A²/(8π)) [sin(2ωt + ωτ + 2θ)]₀^{2π}

= (A²/2) cos ωτ + 0 = (A²/2) cos ωτ, a function of τ only.

So, both the conditions are satisfied.


Hence, X(t) is a WSS process.
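A quick numerical sanity check of Problem 5 (an illustrative sketch, not part of the text): averaging over many random phases θ, the mean of X(t) should be near 0 for any t, and E[X(t)X(t+τ)] should be near (A²/2) cos ωτ regardless of t.

```python
import math, random

# Monte Carlo check: X(t) = A*cos(w*t + theta), theta ~ Uniform(0, 2*pi).
A, w = 2.0, 3.0
rng = random.Random(0)
thetas = [rng.uniform(0.0, 2.0 * math.pi) for _ in range(100000)]

def mean_x(t):
    return sum(A * math.cos(w * t + th) for th in thetas) / len(thetas)

def corr(t, tau):
    return sum(A * math.cos(w * t + th) * A * math.cos(w * (t + tau) + th)
               for th in thetas) / len(thetas)

tau = 0.7
theory = (A * A / 2.0) * math.cos(w * tau)   # (A^2/2) cos(w*tau)
print(round(mean_x(1.0), 2), round(corr(1.0, tau), 2),
      round(corr(5.0, tau), 2), round(theory, 2))
```

The autocorrelation estimate comes out essentially the same at t = 1 and t = 5, as WSS requires.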

Problem : 6 Given a random variable Y with characteristic function φ(ω) = E[e^{iωY}] = E[cos ωY + i sin ωY]


and X(t) = cos(λt + Y), show that {X(t)} is WSS (or) stationary in the wide sense if φ(1) = φ(2) = 0.
Solution:
Given that φ(ω) = E[e^{iωY}] = E[cos ωY + i sin ωY], and also X(t) = cos(λt + Y).
Since φ(ω) = E[cos ωY] + i E[sin ωY]:
φ(1) = 0 gives E[cos Y] = 0 and E[sin Y] = 0 ……… (1)
φ(2) = 0 gives E[cos 2Y] = 0 and E[sin 2Y] = 0 ------ (2)
X(t) = cos(λt + Y) ⇒ E[X(t)] = E[cos(λt + Y)]
= E[cos λt cos Y − sin λt sin Y]
= cos λt E[cos Y] − sin λt E[sin Y]
= 0, by using (1)
E[X(t)] = 0, a constant.
Now, R(t, t + τ) = E[X(t) X(t + τ)] = E[cos(λt + Y) cos(λ(t + τ) + Y)]

= (1/2) E[cos λτ + cos(λ(2t + τ) + 2Y)]

= (1/2) cos λτ + (1/2) E[cos(λ(2t + τ)) cos 2Y − sin(λ(2t + τ)) sin 2Y]

= (1/2) cos λτ + (1/2) {cos(λ(2t + τ)) E[cos 2Y] − sin(λ(2t + τ)) E[sin 2Y]}

= (1/2) cos λτ, by using (2); a function of τ only.

Both conditions are satisfied. Hence, X(t) is a WSS process.

REMARK
Every SSS process of order 2 is a WSS process, but a WSS process need not be an SSS process of order 2.
For an SSS-type check it is enough to show that E[X(t)] and Var[X(t)] are constants, but in the case of a
WSS process the autocorrelation condition must also be verified.

Problem : 7 If X(t) = A cos ωt + B sin ωt, where A and B are two independent normal random variables


with E(A) = E(B) = 0, E(A²) = E(B²) = σ², and ω is a constant, prove that {X(t)} is a strict sense
stationary process of order 2 (or) WSS.
Solution:
Given that X(t) = A cos ωt + B sin ωt.
Now, E[X(t)] = E[A cos ωt + B sin ωt]
= cos ωt E[A] + sin ωt E[B]   [since A and B are random variables and ω is a constant]
E[X(t)] = 0, since E(A) = E(B) = 0.
R(t1, t2) = E[X(t1) X(t2)] = E[(A cos ωt1 + B sin ωt1)(A cos ωt2 + B sin ωt2)]
= E[A²] cos ωt1 cos ωt2 + E[B²] sin ωt1 sin ωt2
  + E[AB] (cos ωt1 sin ωt2 + sin ωt1 cos ωt2) ---(1)
If A and B are independent variables, then
E[AB] = E[A] E[B] = 0   [since E[A] = E[B] = 0]
Therefore (1) becomes
R(t1, t2) = σ² (cos ωt1 cos ωt2 + sin ωt1 sin ωt2)
= σ² cos ω(t1 − t2)
= σ² cos ωτ
Both conditions are satisfied. Hence {X(t)} is a WSS process.

Problem : 8 Two random processes X(t) and Y(t) are defined by X(t) = A cos ωt + B sin ωt and


Y(t) = B cos ωt − A sin ωt. Show that X(t) and Y(t) are jointly wide sense stationary if A and B are
uncorrelated r.v.'s with mean zero and the same variance σ², and ω is a constant.
Solution: To show that X(t) and Y(t) are jointly wide sense stationary, we have to show that
(i) X(t) is a WSS process
(ii) Y(t) is a WSS process
(iii) the cross correlation R_XY(t, t + τ) depends only on τ.

As in Problem 7, X(t) and Y(t) are each wide sense stationary.


Now we show that the cross correlation R_XY(t, t + τ) depends only on τ.
Consider R_XY(t, t + τ) = E[X(t) Y(t + τ)]

= E{(A cos ωt + B sin ωt)(B cos ω(t + τ) − A sin ω(t + τ))}
= E[AB] cos ωt cos ω(t + τ) − E[A²] cos ωt sin ω(t + τ)
  + E[B²] sin ωt cos ω(t + τ) − E[AB] sin ωt sin ω(t + τ)
= σ² [sin ωt cos ω(t + τ) − cos ωt sin ω(t + τ)]   [since E[AB] = 0, E[A²] = E[B²] = σ²]
= σ² sin[ωt − ω(t + τ)]
= −σ² sin ωτ
R_XY(t, t + τ) depends only on τ. Hence X(t) and Y(t) are jointly wide sense stationary.

Problem : 9 If X(t) = Y cos t + Z sin t for all t, where Y and Z are independent binary random variables,

each of which assumes the values −1 and 2 with probabilities 2/3 and 1/3 respectively, prove that X(t) is wide

sense stationary.
Solution:
Y and Z are discrete random variables which assume the values:

Y      −1    2
P(Y)   2/3   1/3

E(Y) = Σ y P(y) = (−1)(2/3) + (2)(1/3) = 0.

E(Y²) = Σ y² P(y) = (1)(2/3) + (4)(1/3) = 2.

Var(Y) = E(Y²) − [E(Y)]² = 2 − 0 = 2.
Similarly E(Z) = 0, E(Z²) = 2 and Var(Z) = 2.
Y and Z are independent random variables.
⇒ E[YZ] = E[Y] E[Z] = 0.
Given X(t) = Y cos t + Z sin t,
E[X(t)] = E(Y) cos t + E(Z) sin t
= 0, a constant.
E[X(t) X(t + τ)] = E[(Y cos t + Z sin t)(Y cos(t + τ) + Z sin(t + τ))]
= E[Y²] cos t cos(t + τ) + E[Z²] sin t sin(t + τ)
  + E[YZ] [cos t sin(t + τ) + sin t cos(t + τ)]
= 2 [cos t cos(t + τ) + sin t sin(t + τ)]
= 2 cos[t − (t + τ)]
= 2 cos τ, which is a function of τ only.
Both conditions are satisfied. Hence {X(t)} is a WSS process.
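The result of Problem 9 can be spot-checked by simulation (an illustrative sketch, not part of the text): draw many independent (Y, Z) pairs taking the values −1 and 2 with probabilities 2/3 and 1/3, and compare the empirical mean and autocorrelation with 0 and 2 cos τ.

```python
import math, random

# Y, Z in {-1, 2} with P(-1) = 2/3, P(2) = 1/3; X(t) = Y cos t + Z sin t.
rng = random.Random(1)

def draw():
    return -1.0 if rng.random() < 2.0 / 3.0 else 2.0

pairs = [(draw(), draw()) for _ in range(200000)]

def x(t, y, z):
    return y * math.cos(t) + z * math.sin(t)

t, tau = 0.9, 0.4
mean = sum(x(t, y, z) for y, z in pairs) / len(pairs)
acorr = sum(x(t, y, z) * x(t + tau, y, z) for y, z in pairs) / len(pairs)
print(round(mean, 2), round(acorr, 2), round(2 * math.cos(tau), 2))
```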

Problem : 10 If X(t) is a wide sense stationary process with autocorrelation R(τ) = A e^{−α|τ|}, determine the
second-order moment of the RV X(8) − X(5).
Solution:
Given the autocorrelation function R(τ) = A e^{−α|τ|}.
Since X(t) is W.S.S.,
E[X(t) X(t + τ)] = R(τ). Put τ = 0:
E[X²(t)] = R(0) = A.
The second moment of X(8) − X(5) is given by
E[(X(8) − X(5))²] = E[X²(8)] + E[X²(5)] − 2 E[X(8) X(5)] ---------- [1]
E[X²(8)] = E[X²(5)] = R(0) = A --------------- [2]
E[X(8) X(5)] = R(3) = A e^{−3α} --------------- [3]
Using [2] and [3] in [1], we get
E[(X(8) − X(5))²] = 2A − 2A e^{−3α} = 2A(1 − e^{−3α}).

Note: mean of X(t) = √(lim (τ→∞) R(τ)).

Problem : 11 The autocorrelation function of a stationary process is given by R(τ) = 9 + 2 e^{−|τ|}. Find the

mean value of the random variable Y = ∫₀² X(t) dt and the variance of X(t).

Solution: Given R(τ) = 9 + 2 e^{−|τ|}.

We know (E[X(t)])² = lim (τ→∞) R(τ)
= lim (τ→∞) [9 + 2 e^{−|τ|}] = 9
⇒ E[X(t)] = 3.
Var[X(t)] = E[X²(t)] − (E[X(t)])²
E[X²(t)] = R(0) = 9 + 2(1) = 11.
Var X(t) = 11 − 9 = 2.
Given Y = ∫₀² X(t) dt,

E[Y] = ∫₀² E[X(t)] dt = ∫₀² 3 dt = 6.

Problems for practice


1. Verify whether the sine wave random process X(t) = Y sin ωt, where Y is uniformly distributed in the
interval −1 to 1, is WSS or not.
Solution: E[X(t)] = 0; E[X(t) X(t + τ)] = E[Y²] sin ωt sin ω(t + τ), which depends on t.

⇒ {X(t)} is not a WSS process.

2. Consider a random process X(t) = B cos(ωt + φ), where B and φ are independent r.v.'s, B is a random


variable with mean 0 and variance 1, and φ is uniformly distributed in the interval (−π, π). Show that X(t) is
WSS.
Solution: Since B and φ are independent, E[X(t)] = E[B] E[cos(ωt + φ)] = 0, a constant, and
E[X(t) X(t + τ)] = (1/2) cos ωτ, which is a function of τ only.

3. Let the random process be X(t) = cos(t + φ), where φ is a random variable with density function
f(φ) = 1/π, −π/2 < φ < π/2. Check whether the process is stationary or not.

Solution: E[X(t)] = (2/π) cos t, not a constant. ⇒ X(t) is not a stationary process.


4. Given an RV Y with density f(y), and another RV φ uniformly distributed in (−π, π) and
independent of Y, with X(t) = a cos(Yt + φ), prove that {X(t)} is a wide sense stationary (WSS) process.

Solution: E[X(t)] = 0; E[X(t) X(t + τ)] = (a²/2) E[cos Yτ], a function of τ only. ⇒ {X(t)} is a WSS process.

3.3 MARKOV PROCESS AND MARKOV CHAIN

Markov Process
A random process {X(t)} is said to be a Markov process if
P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n, X(t_{n−1}) = x_{n−1}, …, X(t_0) = x_0]
= P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n] for all t_0 ≤ t_1 ≤ … ≤ t_n ≤ t_{n+1}.
In other words, if the future behaviour depends only on the present value and not on the past, the process is
called a Markov process.

EXAMPLE
The probability of rain today depends only on the weather conditions of yesterday and not on the weather
conditions of earlier days.

Markov Chain
If P{X_n = a_n | X_{n−1} = a_{n−1}, X_{n−2} = a_{n−2}, …, X_0 = a_0} = P{X_n = a_n | X_{n−1} = a_{n−1}}
for all n, then the process {X_n} is called a Markov chain. Here a_1, a_2, …, a_n are called the states of the
Markov chain.

One Step Transition Probability


The conditional probability P[X_n = a_j | X_{n−1} = a_i] is called the one step transition
probability from state a_i to state a_j at the nth step (trial) and is denoted by p_ij(n − 1, n).

Homogeneous Markov Chain


If the one-step transition probability does not depend on the step,
i.e., p_ij(n − 1, n) = p_ij(m − 1, m), the Markov chain is called a homogeneous Markov chain,
i.e., the one-step transition probabilities are independent of the step.

Transition Probability Matrix (T.P.M)


The matrix P = (p_ij) is called the transition probability matrix, satisfying the conditions p_ij ≥ 0 and
Σ_j p_ij = 1 for all i. That is, the sum of the elements of any row of the t.p.m. is 1.

EXAMPLE
Three boys A, B, C are throwing a ball to each other. A always throws the ball to B, and B always throws the

ball to C; C is just as likely to throw the ball to A as to B. The t.p.m. of this Markov chain is

        A    B    C
A   [   0    1    0  ]
B   [   0    0    1  ]
C   [  1/2  1/2   0  ]

n-Step Transition Probability


The conditional probability that the process is in state a_j at step n, given that it was in state a_i at step 0,
that is P[X_n = a_j | X_0 = a_i], is called the n-step transition probability and is denoted by p_ij(n).

Chapman – Kolmogorov Theorem


If P is the t.p.m. of a homogeneous Markov chain, then the n-step t.p.m. P(n) is equal to Pⁿ, i.e.,
[p_ij(n)] = [p_ij]ⁿ.
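The Chapman–Kolmogorov theorem can be illustrated with a small numeric sketch (using the ball-throwing t.p.m. assumed in the example above): the n-step transition matrix is simply the nth matrix power of P.

```python
# n-step transition probabilities via matrix powers (illustrative sketch).
def mat_mul(a, b):
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(p, n):
    result = p
    for _ in range(n - 1):
        result = mat_mul(result, p)
    return result

# Ball-throwing chain: A -> B, B -> C, C -> A or B with prob 1/2 each.
P = [[0.0, 1.0, 0.0],
     [0.0, 0.0, 1.0],
     [0.5, 0.5, 0.0]]

P2 = mat_pow(P, 2)   # two-step transition probabilities
print(P2[0])         # starting from A: certain to be at C after 2 steps
```

Each row of Pⁿ still sums to 1, as a transition matrix must.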

Regular Markov Chain


A stochastic matrix P is said to be a regular matrix if all the entries of P^m (for some positive integer m)
are positive. A homogeneous Markov chain is said to be regular if its t.p.m. is regular.
Steady State Distribution
If a homogeneous Markov chain is regular, then every sequence of state probability distributions
approaches a unique fixed distribution, called the steady state distribution of the Markov chain,
i.e., lim (n→∞) p^(n) = π, where the state probability distribution at step n, p^(n) = (p_1^(n), p_2^(n), …, p_k^(n)), and
the stationary distribution π = (π_1, π_2, …, π_k) are row vectors.
If P is the t.p.m. of the regular Markov chain and π = (π_1, π_2, …, π_k) is the steady state distribution, then
πP = π and Σ_i π_i = 1.

Classification of States of a Markov Chain:


(i) Irreducible Markov Chain
A Markov chain is said to be irreducible if p_ij(n) > 0 for some n and for all i and j; then every
state can be reached from every other state. The t.p.m. of an irreducible chain is an irreducible matrix.
Otherwise the chain is said to be reducible.
(ii) Return State
State i of a Markov chain is called a return state if p_ii(n) > 0 for some n > 1,
i.e., the system comes back to i, starting from i.
(iii) Period
The period d_i of a return state i is defined as the greatest common divisor of all integers m such
that p_ii(m) > 0, i.e., d_i = GCD{m : p_ii(m) > 0}.
State i is said to be periodic with period d_i if d_i > 1, and aperiodic if d_i = 1.
(iv) Persistent, transient, non-null persistent and null persistent:
A state i is said to be persistent or recurrent if the eventual return to state i is certain, i.e., if
F_ii = Σ (n = 1 to ∞) f_ii(n) = 1, where f_ii(n) is the probability of first return to i at step n.

The state i is said to be transient if the return to state i is uncertain, i.e., if F_ii < 1.


The state i is said to be non-null persistent if its mean recurrence time μ_ii is finite, i.e., μ_ii < ∞, and
null persistent if μ_ii = ∞.

NOTE
If state i is persistent non-null, then lim (n→∞) p_ii(n) > 0; if state i is persistent null or transient,
then lim (n→∞) p_ii(n) = 0.

Ergodic State
A persistent, non-null and aperiodic state i is called an ergodic state.

Absorbing State
A state i is called an absorbing state if and only if p_ii = 1.

Problem: 1 The transition probability matrix of a Markov chain {X_n}, n = 1, 2, 3, …, having 3 states 1, 2 and

3 is

P = 
    [ 0.1  0.5  0.4 ]
    [ 0.6  0.2  0.2 ]
    [ 0.3  0.4  0.3 ]

and the initial distribution is p^(0) = (0.7, 0.2, 0.1).
Find (i) P[X_2 = 3] and (ii) P[X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2].
Solution:

P² = P · P =
    [ 0.43  0.31  0.26 ]
    [ 0.24  0.42  0.34 ]
    [ 0.36  0.35  0.29 ]

Given that p^(0) = (0.7, 0.2, 0.1),
i.e., P[X_0 = 1] = 0.7; P[X_0 = 2] = 0.2; P[X_0 = 3] = 0.1.
(i) P[X_2 = 3] = Σ_i P[X_2 = 3 | X_0 = i] P[X_0 = i]
= p_13(2) P[X_0 = 1] + p_23(2) P[X_0 = 2] + p_33(2) P[X_0 = 3]
= (0.26)(0.7) + (0.34)(0.2) + (0.29)(0.1)
= 0.182 + 0.068 + 0.029 = 0.279.

(ii) P[X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2]
= P[X_3 = 2 | X_2 = 3] P[X_2 = 3 | X_1 = 3] P[X_1 = 3 | X_0 = 2] P[X_0 = 2]
= p_32 p_33 p_23 P[X_0 = 2]
= (0.4)(0.3)(0.2)(0.2) = 0.0048.

Formulae to remember
(i) For events A and B with P(B) ≠ 0, P(A | B) = P(A ∩ B) / P(B).

(ii) Total probability: P(A) = Σ_i P(A | B_i) P(B_i).

(iii) Bayes theorem: P(B_i | A) = P(A | B_i) P(B_i) / Σ_j P(A | B_j) P(B_j).

(iv) P(A ∩ B) = P(A | B) P(B) = P(B | A) P(A).

Problem: 2 The t.p.m. of a Markov chain {X_n}, n = 0, 1, 2, …, having 3 states 0, 1 and 2 is


P = 
    [ 3/4  1/4   0  ]
    [ 1/4  1/2  1/4 ]
    [  0   3/4  1/4 ]

and the initial distribution is p^(0) = (1/3, 1/3, 1/3).
Find (i) P[X_2 = 2], (ii) P[X_2 = 2, X_1 = 1, X_0 = 2] and (iii) P[X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2].
Solution:

P² = P · P =
    [ 5/8   5/16  1/16 ]
    [ 5/16  1/2   3/16 ]
    [ 3/16  9/16  1/4  ]

Given p^(0) = (1/3, 1/3, 1/3),

i.e., P[X_0 = 0] = P[X_0 = 1] = P[X_0 = 2] = 1/3.

(i) P[X_2 = 2] = Σ_i P[X_2 = 2 | X_0 = i] P[X_0 = i]
= p_02(2) P[X_0 = 0] + p_12(2) P[X_0 = 1] + p_22(2) P[X_0 = 2]
= (1/16)(1/3) + (3/16)(1/3) + (1/4)(1/3)
= (1/3)(1/16 + 3/16 + 4/16) = 1/6.

(ii) P[X_2 = 2, X_1 = 1, X_0 = 2] = P[X_2 = 2 | X_1 = 1] P[X_1 = 1 | X_0 = 2] P[X_0 = 2]
= p_12 p_21 P[X_0 = 2]
= (1/4)(3/4)(1/3) = 1/16.


(iii) P[X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2]
= P[X_3 = 1 | X_2 = 2] P[X_2 = 2 | X_1 = 1] P[X_1 = 1 | X_0 = 2] P[X_0 = 2]
= p_21 p_12 p_21 P[X_0 = 2]
= (3/4)(1/4)(3/4)(1/3) = 3/64.

Problem: 3 A college student X has the following study habits. If he studies one night, he is 70% sure
not to study the next night. If he does not study one night, he is only 60% sure not to study the next night also.
Find (i) the t.p.m., and (ii) how often he studies in the long run.
Solution
Since the study pattern depends only on the present night and not on the past, it is a Markov chain with
states studying (S) and not studying (N).
The t.p.m. is

              State of X_{n+1}
               S     N
State of X_n
S          [  0.3   0.7 ]
N          [  0.4   0.6 ]

(If he studies one night, the next night he is 70% sure not to study, so P(S→N) = 0.7 and P(S→S) = 0.3;
similarly P(N→N) = 0.6 and P(N→S) = 0.4.)
If π = (π_1, π_2) is the long run (or limiting) state distribution, then πP = π, where π_1 + π_2 = 1.

(π_1, π_2) [ 0.3  0.7 ; 0.4  0.6 ] = (π_1, π_2)

0.3π_1 + 0.4π_2 = π_1 ⇒ 0.7π_1 = 0.4π_2

With π_2 = 1 − π_1: 0.7π_1 = 0.4(1 − π_1) ⇒ 1.1π_1 = 0.4

π_1 = 4/11. i.e., he studies 4/11 of the time.
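A numeric sketch of the long-run distribution in Problem 3 (using the transition probabilities stated in the problem): repeatedly multiplying the state distribution by P converges to the steady state, which can be compared with the analytic answer π_S = 4/11.

```python
# Power iteration toward the steady state of the study-habits chain.
P = [[0.3, 0.7],   # S -> (S, N)
     [0.4, 0.6]]   # N -> (S, N)

p = [1.0, 0.0]                      # start in state S
for _ in range(200):                # iterate p <- p P until it settles
    p = [p[0] * P[0][0] + p[1] * P[1][0],
         p[0] * P[0][1] + p[1] * P[1][1]]

print(round(p[0], 4))               # compare with 4/11 = 0.3636...
```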

Problem : 4 A man either drives a car or catches a train to go to office each day. He never goes 2 days in a
row by train; but if he drives one day, then the next day he is just as likely to drive again as he is to travel by
train. Now suppose that on the first day of the week, the man tossed a fair die and drove to work if and
only if a 6 appeared. Find (i) the probability that he takes a train on the third day and (ii) the probability
that he drives to work in the long run.
Solution:
Since the mode of transport for the next day is decided on the basis of today alone, the travel pattern is a
Markov chain with states train (T) and car (C).
The t.p.m. is

              State of X_{n+1}
               T     C
State of X_n
T          [   0     1  ]
C          [  1/2   1/2 ]

(If today he goes by train, the next day he will not go by train, so P(T→T) = 0 and P(T→C) = 1.)

The initial state probability distribution is obtained from the throw of the die:

P(C) = P(getting 6) = 1/6,  P(T) = 5/6.

The first day state distribution is p^(1) = (5/6, 1/6).

Second day state probability distribution:

p^(2) = p^(1) P = (5/6, 1/6) [ 0  1 ; 1/2  1/2 ] = (1/12, 11/12)

Third day state probability distribution:

p^(3) = p^(2) P = (1/12, 11/12) [ 0  1 ; 1/2  1/2 ] = (11/24, 13/24)

(i) P[train on the third day] = 11/24.

(ii) Let π = (π_1, π_2) be the limiting long run probability distribution, with π_1 + π_2 = 1. …(1)


We know that πP = π:

(π_1, π_2) [ 0  1 ; 1/2  1/2 ] = (π_1, π_2)

(1/2)π_2 = π_1;  π_1 + (1/2)π_2 = π_2

Putting π_1 = π_2/2 in (1), we get (3/2)π_2 = 1 ⇒ π_2 = 2/3, π_1 = 1/3.

P[he drives to work in the long run] = 2/3.
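The day-by-day calculation in Problem 4 can be checked exactly with rational arithmetic (a sketch using the chain stated in the problem: P(T→C) = 1, P(C→T) = P(C→C) = 1/2).

```python
from fractions import Fraction as F

# States ordered (T, C).
P = [[F(0), F(1)],
     [F(1, 2), F(1, 2)]]

def step(p):
    return [p[0] * P[0][0] + p[1] * P[1][0],
            p[0] * P[0][1] + p[1] * P[1][1]]

p1 = [F(5, 6), F(1, 6)]   # day 1: drives iff the die shows 6
p2 = step(p1)
p3 = step(p2)
print(p3[0])              # P(train on day 3) = 11/24

# long-run distribution by iterating many steps
p = p1
for _ in range(60):
    p = step(p)
print(float(p[1]))        # ≈ 2/3 (probability he drives)
```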

Problem : 5 An engineer analyzing a series of digital signals generated by a testing system observes that
only 1 out of 15 highly distorted signals follows a highly distorted signal, with no recognizable signal
between, whereas 20 out of 23 recognizable signals follow recognizable signals, with no highly distorted signal
between. Given that only highly distorted signals are not recognizable, find the fraction of signals that are
highly distorted.
Solution : The state space is {highly distorted, recognizable}.
We shall denote the states by 0 and 1.
Let X_n = 0 if the nth signal generated is highly distorted,
and X_n = 1 if the nth signal generated is recognizable.
{X_n, n ≥ 1} is a Markov chain with state space {0, 1}.
The t.p.m. is

P = [ 1/15   14/15 ]
    [ 3/23   20/23 ]

Let π = (π_0, π_1) be the limiting long run probability distribution, with π_0 + π_1 = 1. ……… [1]
We know that πP = π:

(1/15)π_0 + (3/23)π_1 = π_0 ⇒ (14/15)π_0 = (3/23)π_1 ⇒ π_1 = (322/45)π_0

Substituting in [1], we get π_0 (1 + 322/45) = 1 ⇒ π_0 = 45/367 ≈ 0.123.

The fraction of signals that are highly distorted is 45/367.


In other words, about 12.3% of the signals generated by the testing system are highly distorted.

Problem: 6 Find the limiting state probabilities associated with a given probability matrix P
(the entries of the matrix were not recovered in this copy).

Solution:

Let π = (π_1, π_2, π_3) be the limiting probability distribution, with

π_1 + π_2 + π_3 = 1.
We know that πP = π; writing this out gives three linear equations, of which any two, together with the
normalizing condition π_1 + π_2 + π_3 = 1, determine π. Eliminating one unknown at a time (by
substitution or subtraction of equations) yields the limiting probabilities.

Problem : 7 A fair die is tossed repeatedly. If X_n denotes the maximum of the numbers occurring in the first
n tosses, find the transition probability matrix P of the Markov chain {X_n}. Find also P² and P[X_2 = 6].
Solution : The state space is {1, 2, 3, 4, 5, 6}.
The t.p.m. is formed using the following analysis.
Let X_n be the maximum of the numbers occurring in the first n trials, and suppose X_n = i.
Then X_{n+1} = i if the (n+1)th toss shows any of 1, 2, …, i (probability i/6), and
X_{n+1} = j with probability 1/6 for each j > i:

p_ij = 0 for j < i,  p_ii = i/6,  p_ij = 1/6 for j > i.

The transition probability matrix of the chain is

P = (1/6) ×
    [ 1  1  1  1  1  1 ]
    [ 0  2  1  1  1  1 ]
    [ 0  0  3  1  1  1 ]
    [ 0  0  0  4  1  1 ]
    [ 0  0  0  0  5  1 ]
    [ 0  0  0  0  0  6 ]

P² = P · P is computed entrywise; for example, the (1, 6) entry is Σ_k p_1k p_k6 = 5(1/36) + (1/6)(1) = 11/36.

The initial state probability distribution is p^(0) = (1/6, 1/6, 1/6, 1/6, 1/6, 1/6),

since all the faces are equally likely on the first toss.


Now, P[X_2 = 6] = Σ_i P[X_2 = 6 | X_1 = i] P[X_1 = i]

= (1/6)(1/6) + (1/6)(1/6) + (1/6)(1/6) + (1/6)(1/6) + (1/6)(1/6) + (1)(1/6)

= 5/36 + 1/6 = 11/36.

Problem: 8 Suppose that the probability of a dry day following a rainy day is 1/3 and that the
probability of a rainy day following a dry day is 1/2. Given that May 1 is a dry day, find the probability that
(i) May 3 is a dry day and
(ii) May 5 is a dry day.
Solution: To find the t.p.m.:
The state space is {D, R}, where D – dry day and R – rainy day.
The t.p.m. of the Markov chain is

         D     R
D   [  1/2   1/2 ]
R   [  1/3   2/3 ]

The initial probability distribution (May 1) is p^(1) = (1, 0), since May 1 is a dry day.

Next, to find the probability that May 3 is a dry day:

p^(2) = p^(1) P = (1, 0) P = (1/2, 1/2)

p^(3) = p^(2) P = (1/2, 1/2) P = (1/4 + 1/6, 1/4 + 1/3) = (5/12, 7/12)

The probability that May 3 is a dry day = 5/12.

p^(4) = p^(3) P = (5/12, 7/12) P = (5/24 + 7/36, 5/24 + 7/18) = (29/72, 43/72)

p^(5) = p^(4) P = (29/72, 43/72) P = (29/144 + 43/216, 29/144 + 43/108) = (173/432, 259/432)

The probability that May 5 is a dry day = 173/432 ≈ 0.4005.
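The arithmetic in Problem 8 can be verified exactly with rational arithmetic (a sketch using the probabilities stated in the problem: P(D→R) = 1/2 and P(R→D) = 1/3, with May 1 dry).

```python
from fractions import Fraction as F

# States ordered (D, R).
P = [[F(1, 2), F(1, 2)],   # D -> (D, R)
     [F(1, 3), F(2, 3)]]   # R -> (D, R)

def step(p):
    return [p[0] * P[0][0] + p[1] * P[1][0],
            p[0] * P[0][1] + p[1] * P[1][1]]

p = [F(1), F(0)]            # May 1 is dry
for day in range(2, 6):     # advance one day at a time up to May 5
    p = step(p)
    if day == 3:
        may3_dry = p[0]
print(may3_dry, p[0])       # 5/12 and 173/432
```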

Problem: 9 Three boys A, B and C are throwing a ball to each other. A always throws the ball to B, and B
always throws the ball to C; C is just as likely to throw the ball to B as to A. Show that the process is
Markovian. Find the t.p.m. and classify the states.
Solution: The t.p.m. of the process {X_n} is

         A     B     C
A   [   0     1     0  ]
B   [   0     0     1  ]
C   [  1/2   1/2    0  ]

The state of X_{n+1} depends only on the state of X_n, but not on the states of X_{n−1}, X_{n−2}, …


Therefore the process {X_n} is a Markov chain.

Now P² = 
    [   0     0     1  ]
    [  1/2   1/2    0  ]
    [   0    1/2   1/2 ]

P³ = P² · P =
    [  1/2   1/2    0  ]
    [   0    1/2   1/2 ]
    [  1/4   1/4   1/2 ]

We observe that
p_11(3) > 0, p_12(1) > 0, p_13(2) > 0,
p_21(2) > 0, p_22(2) > 0, p_23(1) > 0,
p_31(1) > 0, p_32(1) > 0, p_33(3) > 0.

The chain is irreducible.

Computing further powers P⁴, P⁵, P⁶, …, we find that
p_22(n) > 0 and p_33(n) > 0 for n = 2, 3, 4, …, so the GCD of these n is 1.
The states 2 and 3 (i.e., B and C) have period 1, i.e., they are aperiodic.
Also p_11(n) > 0 for n = 3, 5, 6, …, and GCD{3, 5, 6, …} = 1,
so state 1 (i.e., A) also has period 1, i.e., it is aperiodic.
Since the chain is finite and irreducible, all its states are non-null persistent.
Moreover, all the states are ergodic.

Problem : 10 Find the nature of the states of the Markov chain with the t.p.m.

         0     1     2
0   [   0     1     0  ]
P =  1  [  1/2    0    1/2 ]
2   [   0     1     0  ]

Solution:

P² = 
    [  1/2    0    1/2 ]
    [   0     1     0  ]
    [  1/2    0    1/2 ]

We observe that
p_00(2) > 0, p_01(1) > 0, p_02(2) > 0,

p_10(1) > 0, p_11(2) > 0, p_12(1) > 0,

p_20(2) > 0, p_21(1) > 0, p_22(2) > 0.

The chain is irreducible.


Now, P³ = P² · P = P, P⁴ = P², P⁵ = P, …

so p_ii(n) > 0 only for n = 2, 4, 6, … for every state i.
Here all the states of the chain are periodic, with period 2.
Since the chain is finite and irreducible, all its states are non-null persistent.
The states are not aperiodic, hence the states are not ergodic.

Problem 11 Consider a Markov chain with state space {0, 1} and the TPM P = [ 1  0 ; 1/2  1/2 ].

(i) Draw a transition diagram.


(ii) Show that state 0 is recurrent.
(iii) Show that state 1 is transient.
(iv) Is state 1 periodic? If so, what is the period?
(v) Is the chain irreducible?
(vi) Is the chain ergodic? Explain.
Solution:

The state space S = {0, 1} is finite. Given P = [ 1  0 ; 1/2  1/2 ].

(i) Transition diagram: state 0 loops to itself with probability 1; state 1 goes to state 0 with
probability 1/2 and loops to itself with probability 1/2.

(ii) State 0 is recurrent. [p_00 = 1, i.e., 0 is an absorbing state, and all absorbing states are recurrent.]


(iii) State 1 is transient. [A state i is transient iff there is a positive probability that the
process will never return to that state; from state 1 the chain can move to the absorbing state 0 and never return.]
(iv) Since p_11(1) = 1/2 > 0, the period of state 1 is 1; state 1 is aperiodic.

(v) States 0 and 1 do not communicate with each other (0 cannot reach 1). The Markov chain is not irreducible.
(vi) The chain is not irreducible, and state 1 is not non-null persistent.
Hence the chain is not ergodic.

Problems for Practice:


1. A salesman's territory consists of three cities A, B, C. He never sells in the same city on successive days. If
he sells in A, then the next day he sells in city B. However, if he sells in either B or C, then the next day he
is twice as likely to sell in city A as in the other city. In the long run, how often does he sell in each of the
cities?
Solution: P = [ 0  1  0 ; 2/3  0  1/3 ; 2/3  1/3  0 ], π = (2/5, 9/20, 3/20).

2. Let {X_n, n ≥ 0} be a Markov chain with state space S = {1, 2, 3} and a given one-step transition matrix P
(the entries were not recovered in this copy).

(i) Sketch the transition diagram.


(ii) Is the chain irreducible? Explain.
(iii) Is the chain ergodic? Explain.

3.4 POISSON PROCESS


DEFINITION: POISSON PROCESS
If X(t) represents the number of occurrences of a certain event in (0, t), then the discrete random
process {X(t)} is called the Poisson process with rate λ, provided the following postulates are satisfied:
(i) P[1 occurrence in (t, t + Δt)] = λΔt + o(Δt)
(ii) P[0 occurrences in (t, t + Δt)] = 1 − λΔt + o(Δt)
(iii) P[2 or more occurrences in (t, t + Δt)] = o(Δt)
(iv) X(t) is independent of the number of occurrences of the event in any interval prior to and after
the interval (0, t)
(v) The probability that the event occurs a specified number of times in (t0, t0 + t) depends only
on t, but not on t0.

State and prove the probability law for the Poisson process:
Let λ be the number of occurrences of the event per unit time (i.e., λ is the rate of occurrence). Then
the probability of exactly n occurrences in a time interval of length t, X(t), follows a Poisson distribution with
parameter λt:

P[X(t) = n] = e^(−λt) (λt)^n / n!,  n = 0, 1, 2, …

P[X(t) = n] is denoted by P_n(t).
Proof: Let X(t) be a Poisson process with rate of occurrence λ.
Let P_n(t) = P[X(t) = n]. Then
P_n(t + Δt) = P[X(t + Δt) = n]
= P[n occurrences in (0, t) and 0 occurrences in (t, t + Δt)]
  + P[(n − 1) occurrences in (0, t) and 1 occurrence in (t, t + Δt)] + o(Δt)
= P_n(t)(1 − λΔt) + P_{n−1}(t) λΔt + o(Δt),  by the postulates of the Poisson process.

[P_n(t + Δt) − P_n(t)] / Δt = −λ P_n(t) + λ P_{n−1}(t) + o(Δt)/Δt

Letting Δt → 0, we get

P_n′(t) + λ P_n(t) = λ P_{n−1}(t)

This is a linear differential equation in P_n(t), with integrating factor e^(λt).
The solution is

e^(λt) P_n(t) = λ ∫ e^(λt) P_{n−1}(t) dt + C    (1)

For n = 0 there is no P_{−1} term, so
P_0′(t) = −λ P_0(t)

P_0′(t) / P_0(t) = −λ

Integrating with respect to t, we get log P_0(t) = −λt + C.
Since P_0(0) = P[X(0) = 0] = 1, C = 0, and

P_0(t) = e^(−λt)

Put n = 1 in (1):

e^(λt) P_1(t) = λ ∫ e^(λt) e^(−λt) dt = λt + C

Since P_1(0) = 0, C = 0, so

P_1(t) = λt e^(−λt)

Put n = 2 in (1):

e^(λt) P_2(t) = λ ∫ e^(λt) · λt e^(−λt) dt = λ²t²/2 + C

Since P_2(0) = 0, C = 0, so

P_2(t) = (λt)² e^(−λt) / 2!

In general, by induction,

P_n(t) = e^(−λt) (λt)^n / n!,  n = 0, 1, 2, …

Thus the Poisson process has the Poisson distribution with parameter λt.

Mean and Variance of the Poisson Process


The probability function of the Poisson process X(t) with rate λ is given by

P_n(t) = P[X(t) = n] = e^(−λt) (λt)^n / n!,  n = 0, 1, 2, …

Mean = E[X(t)]

= Σ (n = 0 to ∞) n P_n(t) = Σ (n = 1 to ∞) n e^(−λt) (λt)^n / n!

= e^(−λt) Σ (n = 1 to ∞) (λt)^n / (n − 1)!

= λt e^(−λt) Σ (n = 1 to ∞) (λt)^(n−1) / (n − 1)!

= λt e^(−λt) [1 + λt + (λt)²/2! + …]

= λt e^(−λt) e^(λt)

Mean = E[X(t)] = λt.
Now,

E[X²(t)] = Σ (n = 0 to ∞) n² P_n(t) = Σ (n = 1 to ∞) [n(n − 1) + n] e^(−λt) (λt)^n / n!

= Σ (n = 2 to ∞) e^(−λt) (λt)^n / (n − 2)! + Σ (n = 1 to ∞) n e^(−λt) (λt)^n / n!

= (λt)² e^(−λt) Σ (n = 2 to ∞) (λt)^(n−2) / (n − 2)! + λt

= (λt)² e^(−λt) e^(λt) + λt

E[X²(t)] = (λt)² + λt
Var[X(t)] = E[X²(t)] − (E[X(t)])² = (λt)² + λt − (λt)² = λt.
Autocorrelation of the Poisson process
For t2 ≥ t1,
R(t1, t2) = E[X(t1) X(t2)] = E[X(t1){X(t2) − X(t1)} + X²(t1)]
= E[X(t1)] E[X(t2) − X(t1)] + E[X²(t1)]   [by independent increments]
= λt1 · λ(t2 − t1) + (λt1)² + λt1

R(t1, t2) = λ²t1t2 + λt1,  t2 ≥ t1.
Similarly, for t1 ≥ t2 we get R(t1, t2) = λ²t1t2 + λt2.
In general, R(t1, t2) = λ²t1t2 + λ min(t1, t2).
Auto Covariance of the Poisson Process
C(t1, t2) = R(t1, t2) − E[X(t1)] E[X(t2)]

= λ²t1t2 + λ min(t1, t2) − λ²t1t2 = λ min(t1, t2).

PROPERTIES OF POISSON PROCESS:


PROPERTY: 1 Poisson process is a Markov process.
PROOF : By the definition of the Poisson process,

P[X(t) = n] = e^(−λt) (λt)^n / n!   …(1)

Consider P[X(t3) = n3 | X(t2) = n2, X(t1) = n1]
= P[X(t1) = n1, X(t2) = n2, X(t3) = n3] / P[X(t1) = n1, X(t2) = n2]
= e^(−λ(t3 − t2)) [λ(t3 − t2)]^(n3 − n2) / (n3 − n2)!   [using independent increments and (1)]
= P[X(t3) = n3 | X(t2) = n2]
This means that the conditional probability distribution of X(t3), given all the past values X(t1) = n1,
X(t2) = n2, depends only on the most recent value X(t2) = n2.
The Poisson process possesses the Markov property. Hence the Poisson process is a Markov process.

PROPERTY : 2 (Additive Property) The sum of two independent Poisson processes is a Poisson process.
PROOF:
Let X(t) = X1(t) + X2(t), where X1(t) and X2(t) are independent Poisson processes with rates λ1 and λ2.

P[X(t) = n] = Σ (k = 0 to n) P[X1(t) = k] P[X2(t) = n − k]

= Σ (k = 0 to n) [e^(−λ1 t) (λ1 t)^k / k!] [e^(−λ2 t) (λ2 t)^(n−k) / (n − k)!]

= [e^(−(λ1+λ2)t) / n!] Σ (k = 0 to n) [n! / (k!(n − k)!)] (λ1 t)^k (λ2 t)^(n−k)

= [e^(−(λ1+λ2)t) / n!] (λ1 t + λ2 t)^n   [by the binomial theorem]

= e^(−(λ1+λ2)t) [(λ1 + λ2)t]^n / n!

Therefore X(t) = X1(t) + X2(t) is a Poisson process with parameter (λ1 + λ2)t.

PROPERTY : 3 The difference of two independent Poisson processes is not a Poisson process.


PROOF: Let X(t) = X1(t) − X2(t).
Then E[X(t)] = E[X1(t)] − E[X2(t)]
E[X(t)] = (λ1 − λ2)t.
Also, E[X²(t)] = E[{X1(t) − X2(t)}²]
= E[X1²(t)] + E[X2²(t)] − 2 E[X1(t)] E[X2(t)]   [by independence]
= (λ1 t)² + λ1 t + (λ2 t)² + λ2 t − 2 λ1 λ2 t²

E[X²(t)] = (λ1 − λ2)²t² + (λ1 + λ2)t
≠ (λ1 − λ2)²t² + (λ1 − λ2)t, which a Poisson process with rate (λ1 − λ2) would require.
Hence the difference of two Poisson processes is not a Poisson process.

PROPERTY: 4 The interarrival time of a Poisson process, that is, the interval between two successive
occurrences of a Poisson process with parameter λ, has an exponential distribution with mean 1/λ.
PROOF: Let two consecutive occurrences of the event be E_i and E_{i+1}.
Let E_i take place at the time instant t_i, and let T be the interval between the occurrences of E_i and E_{i+1}.
Then T is a continuous random variable.
P(T > t) = P{E_{i+1} did not occur in (t_i, t_i + t)}
= P{no event occurs in an interval of length t}
= P[X(t) = 0]
= e^(−λt).
The c.d.f. of T is given by
F(t) = P(T ≤ t) = 1 − P(T > t)
= 1 − e^(−λt),  t > 0.

Therefore, the p.d.f. of T is given by

f(t) = d/dt F(t)

= λ e^(−λt),  t > 0,

which is an exponential distribution with mean 1/λ.
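Property 4 can be demonstrated by simulation (an illustrative sketch, not part of the text): scatter the events of a rate-2 Poisson process over a long interval and check that the gaps between successive events behave like an Exponential(2) variable, with mean about 1/2 and P(gap > 1) about e^(−2).

```python
import math, random

lam, T = 2.0, 10000.0
rng = random.Random(7)
n = int(lam * T)                       # expected number of events in [0, T]
# Conditioned on the count, Poisson event times are uniform on [0, T].
times = sorted(rng.uniform(0.0, T) for _ in range(n))
gaps = [b - a for a, b in zip(times, times[1:])]

mean_gap = sum(gaps) / len(gaps)
p_gap_gt_1 = sum(1 for g in gaps if g > 1.0) / len(gaps)
print(round(mean_gap, 2), round(p_gap_gt_1, 3), round(math.exp(-2.0), 3))
```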

Problems Based on Poisson Process


Problem: 1 Suppose that customers arrive at a bank according to a Poisson process with a mean rate of 3 per
minute. Find the probability that during a time interval of 2 min (i) exactly 4 customers arrive, (ii) more
than 4 customers arrive, and (iii) fewer than 4 customers arrive.
Solution: Here the mean arrival rate λ = 3 per minute and the time interval t = 2 min, so λt = 6.
By the Poisson process,

P[X(t) = n] = e^(−λt) (λt)^n / n! = e^(−6) 6^n / n!

(i) P[exactly 4 customers arrive at the bank in the interval of 2 min] = P[X(2) = 4]


= e^(−6) 6⁴ / 4! = 54 e^(−6) ≈ 0.1339.

(ii) P[more than 4 customers arrive at the bank in the interval of 2 min] = P[X(2) > 4]


= 1 − P[X(2) ≤ 4]
= 1 − {P[X(2) = 0] + P[X(2) = 1] + P[X(2) = 2] + P[X(2) = 3] + P[X(2) = 4]}

= 1 − e^(−6) [1 + 6 + 18 + 36 + 54]

= 1 − 115 e^(−6) ≈ 0.7149.

(iii) P[fewer than 4 customers in the 2 min interval] = P[X(2) < 4]


= P[X(2) = 0] + P[X(2) = 1] + P[X(2) = 2] + P[X(2) = 3]

= e^(−6) [1 + 6 + 18 + 36]

= 61 e^(−6) ≈ 0.1512.
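The three answers in Problem 1 can be checked directly from the Poisson pmf with parameter λt = 6 (rate 3 per minute over 2 minutes, as stated in the problem):

```python
import math

# Poisson pmf: P[X = n] = exp(-mu) * mu^n / n!
def poisson_pmf(n, mu):
    return math.exp(-mu) * mu ** n / math.factorial(n)

mu = 3 * 2                 # lambda * t = 6
p_exactly_4 = poisson_pmf(4, mu)
p_fewer_4 = sum(poisson_pmf(n, mu) for n in range(4))
p_more_4 = 1.0 - p_exactly_4 - p_fewer_4
print(round(p_exactly_4, 4), round(p_more_4, 4), round(p_fewer_4, 4))
# prints 0.1339 0.7149 0.1512
```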

Problem: 2 A machine goes out of order whenever a component fails. The failure of this part follows a
Poisson process with a mean rate of 2 per week. Find the probability that 2 weeks have elapsed since the last
failure. If there are 5 spare parts of this component in an inventory and the next supply is not due for 10
weeks, find the probability that the machine will not be out of order in the next 10 weeks.
Solution: Here the unit time is one week.
Mean arrival rate λ = 2 per week.
By the Poisson process, P[X(t) = n] = e^(−λt)(λt)ⁿ/n!

(i) P[2 weeks have elapsed since the last failure] = P[no failure in 2 weeks] = P[X(2) = 0]
= e⁻⁴ ≈ 0.0183.

(ii) There are only 5 spare parts, so the machine will not go out of order in the next 10 weeks
only if at most 5 components fail in that period.
P[the machine will not be out of order in the next 10 weeks] = P[X(10) ≤ 5]
= e⁻²⁰ Σₙ₌₀⁵ (20)ⁿ/n!
= e⁻²⁰[1 + 20 + 200 + 4000/3 + 20000/3 + 80000/3]
≈ 0.0000719.
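The two answers of Problem 2 can be verified with a small Poisson c.d.f. helper:

```python
from math import exp, factorial

# Check of Problem 2: failures are Poisson with lam = 2 per week.
def poisson_cdf(n, mu):
    """P[X <= n] for a Poisson count with mean mu."""
    return sum(exp(-mu) * mu**k / factorial(k) for k in range(n + 1))

no_failure_2wk = exp(-2 * 2)           # P[X(2) = 0] = e^-4
survive_10wk = poisson_cdf(5, 2 * 10)  # at most 5 failures in 10 weeks
print(round(no_failure_2wk, 4))        # → 0.0183
print(abs(survive_10wk - 7.19e-5) < 1e-6)   # → True
```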

Problem: 3 A radioactive source emits particles at a rate of 5 per minute in accordance with a Poisson
process. Each particle emitted has a probability 0.6 of being recorded. Find the probability that 10 particles
are recorded in a 4 min period.
Solution: The number of recorded particles N(t) follows a Poisson process with parameter λp. Here the mean
rate λ = 5 per min, the recording probability is p = 0.6, the time interval is t = 4 min,
and the number of records is n = 10.
λpt = (5)(0.6)(4) = 12
By the property of the Poisson process, P[N(t) = n] = e^(−λpt)(λpt)ⁿ/n!
P[10 particles are recorded in a 4 min period] = P[N(4) = 10] = e⁻¹²(12)¹⁰/10! ≈ 0.1048.
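The thinning property used in Problem 3 (recorded particles form a Poisson process of rate λp) gives a one-line numeric check:

```python
from math import exp, factorial

# Check of Problem 3: recorded particles are a thinned Poisson process
# of rate lam*p = 5 * 0.6 = 3 per minute, so in t = 4 minutes the
# recorded count is Poisson with mean lam*p*t = 12.
lam, p, t, n = 5, 0.6, 4, 10
mu = lam * p * t
prob = exp(-mu) * mu**n / factorial(n)
print(round(prob, 4))   # → 0.1048
```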

Problem : 4 A radioactive source emits particles at the rate of 6 per minute in a Poisson process. Each

emitted particle has a probability of 1/3 of being recorded. Find the probability that at least 5 particles are

recorded in a 5 minute period.

Solution: N(t) is a Poisson process with parameter λp.

Here the mean rate λ = 6 per minute, the recording probability p = 1/3 and the time interval t = 5 min,

so λpt = (6)(1/3)(5) = 10.
By the property of the Poisson process, P[N(t) = n] = e^(−λpt)(λpt)ⁿ/n!.
P[at least 5 particles are recorded in a 5 min period] = P[N(5) ≥ 5]
= 1 − P[N(5) ≤ 4]
= 1 − {P[N(5) = 0] + P[N(5) = 1] + P[N(5) = 2] + P[N(5) = 3] + P[N(5) = 4]}
= 1 − e⁻¹⁰[1 + 10 + 50 + 500/3 + 1250/3]
= 1 − (1933/3)e⁻¹⁰

≈ 0.9707.

Problem : 5 If customers arrive at a counter in accordance with a Poisson process with a mean rate of 2 per
minute, find the probability that the interval between 2 consecutive arrivals is (i) more than 1 minute, (ii)
between 1 and 2 min, (iii) 4 min or less.
Solution : By the property of the Poisson process, the interval T between 2 consecutive arrivals follows an
exponential distribution with parameter λ = 2.
Now f(t) = 2e^(−2t), t ≥ 0.
(i) P(T > 1) = ∫₁^∞ 2e^(−2t) dt = [−e^(−2t)]₁^∞ = e⁻² ≈ 0.1353.

(ii) P(1 < T < 2) = ∫₁² 2e^(−2t) dt = [−e^(−2t)]₁² = e⁻² − e⁻⁴ ≈ 0.1170.

(iii) P(T ≤ 4) = ∫₀⁴ 2e^(−2t) dt = [−e^(−2t)]₀⁴ = 1 − e⁻⁸ ≈ 0.9997.
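Since the interarrival time is exponential with survival function P(T > t) = e^(−λt), all three parts of Problem 5 reduce to evaluating that function:

```python
from math import exp

# Check of Problem 5: interarrival time T ~ Exponential(lam), lam = 2,
# with survival function P(T > t) = e^(-lam*t).
lam = 2.0

def surv(t):
    return exp(-lam * t)

more_than_1 = surv(1)                 # e^-2
between_1_and_2 = surv(1) - surv(2)   # e^-2 - e^-4
at_most_4 = 1 - surv(4)               # 1 - e^-8
print(round(more_than_1, 4), round(between_1_and_2, 4), round(at_most_4, 4))
# → 0.1353 0.117 0.9997
```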

Problem: 6 If X₁(t) and X₂(t) are two independent Poisson processes, show that the conditional distribution
of X₁(t) given {X₁(t) + X₂(t) = n} is binomial.
Solution : P[X₁(t) = k | X₁(t) + X₂(t) = n]
= P[X₁(t) = k, X₂(t) = n − k] / P[X₁(t) + X₂(t) = n]

= P[X₁(t) = k] P[X₂(t) = n − k] / P[X₁(t) + X₂(t) = n]   (by independence)

Since X₁(t) + X₂(t) is a Poisson process with parameter λ₁ + λ₂,

= {e^(−λ₁t)(λ₁t)ᵏ/k!}{e^(−λ₂t)(λ₂t)^(n−k)/(n−k)!} / {e^(−(λ₁+λ₂)t)((λ₁+λ₂)t)ⁿ/n!}

= [n!/(k!(n−k)!)] λ₁ᵏ λ₂^(n−k) / (λ₁ + λ₂)ⁿ

= nCk [λ₁/(λ₁+λ₂)]ᵏ [λ₂/(λ₁+λ₂)]^(n−k), k = 0, 1, 2, …, n,

which is a binomial distribution with parameters n and p = λ₁/(λ₁ + λ₂).
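The identity proved in Problem 6 can be verified numerically; the rates, time and total count below are arbitrary illustrative values, not part of the problem:

```python
from math import exp, factorial, comb

# Check of Problem 6: P[X1 = k | X1 + X2 = n] equals the Binomial(n, p)
# p.m.f. with p = lam1/(lam1 + lam2), for illustrative lam1, lam2, t, n.
def poisson_pmf(k, mu):
    return exp(-mu) * mu**k / factorial(k)

lam1, lam2, t, n = 2.0, 3.0, 1.5, 7
p = lam1 / (lam1 + lam2)
ok = all(
    abs(poisson_pmf(k, lam1 * t) * poisson_pmf(n - k, lam2 * t)
        / poisson_pmf(n, (lam1 + lam2) * t)
        - comb(n, k) * p**k * (1 - p)**(n - k)) < 1e-12
    for k in range(n + 1)
)
print(ok)   # → True
```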

Problems for Practice


1. Queries presented in a computer data base follow a Poisson process of rate λ = 6 queries per
minute. An experiment consists of monitoring the data base for m minutes and recording N, the number of
queries presented.
(i) What is the probability that no queries arrive in a one minute interval?
(ii) What is the probability that exactly 6 queries arrive in a one minute interval?
(iii) What is the probability of fewer than 3 queries arriving in a half minute interval?
Solution : Mean rate λ = 6 per min; P[X(t) = n] = e^(−λt)(λt)ⁿ/n!
(i) P[X(1) = 0] = e⁻⁶ ≈ 0.0025, (ii) P[X(1) = 6] = e⁻⁶6⁶/6! ≈ 0.1606,
(iii) P[X(0.5) < 3] = e⁻³[1 + 3 + 9/2] ≈ 0.4232.

2. A machine goes out of order whenever a component fails. The failure of this part follows a Poisson
process with a mean rate of 1 per week. Find the probability that 2 weeks have elapsed since the last failure. If
there are 5 spare parts of this component in an inventory and the next supply is not due for 10 weeks,
find the probability that the machine will not be out of order in the next 10 weeks.
Solution: Mean rate λ = 1 per week.
(i) P[X(2) = 0] = e⁻² ≈ 0.1353, (ii) P[X(10) ≤ 5] = e⁻¹⁰ Σₙ₌₀⁵ 10ⁿ/n! ≈ 0.0671.

3. VLSI chips, essential to the running of a computer system, fail in accordance with a Poisson
distribution with a rate of one chip in about 5 weeks. If there are two spare chips on hand and a new
supply will arrive in 8 weeks, what is the probability that during the next 8 weeks the system will be down
for a week or more, owing to the lack of chips?
Solution: Here λ = 1/5 per week. The system is down for a week or more if the third failure occurs by the
end of week 7, i.e. P[X(7) ≥ 3] = 1 − e^(−1.4)[1 + 1.4 + (1.4)²/2] ≈ 0.1665.
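The three practice answers can be checked in a few lines. Note that the rate in Exercise 1 is assumed here to be λ = 6 queries per minute (the value used in the standard version of that exercise); the other parameters come from the problem statements:

```python
from math import exp, factorial

def poisson_pmf(n, mu):
    return exp(-mu) * mu**n / factorial(n)

# Exercise 1 (assumed lam = 6 per minute)
ex1_i = poisson_pmf(0, 6)                          # no queries in 1 min
ex1_ii = poisson_pmf(6, 6)                         # exactly 6 in 1 min
ex1_iii = sum(poisson_pmf(k, 3) for k in range(3)) # < 3 in half min, lam*t = 3

# Exercise 2 (lam = 1 per week)
ex2_i = poisson_pmf(0, 2)                          # no failure in 2 weeks
ex2_ii = sum(poisson_pmf(k, 10) for k in range(6)) # at most 5 in 10 weeks

# Exercise 3 (lam = 0.2 per week; third failure by the end of week 7)
ex3 = 1 - sum(poisson_pmf(k, 0.2 * 7) for k in range(3))

print(round(ex1_i, 4), round(ex1_ii, 4), round(ex1_iii, 4))
print(round(ex2_i, 4), round(ex2_ii, 4), round(ex3, 4))
```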

3.5 RANDOM TELEGRAPH PROCESS


Definition: Random telegraph process
A random telegraph process is a discrete random process X(t) satisfying the following conditions.
(i) X(t) assumes only one of the two possible values 1 or −1 at any time t, randomly.
(ii) X(0) = 1 or −1 with equal probability ½.
(iii) The number of level transitions or flips, N(t), of X(t) from one value to another occurring in any
interval of length t is a Poisson process with rate λ, so that the probability of exactly k transitions is
P[N(t) = k] = e^(−λt)(λt)ᵏ/k!, k = 0, 1, 2, ……

Properties of a random telegraph process

(i) P[X(t) = 1] = P[X(t) = −1] = ½ for any t.

(ii) E[X(t)] = 0 and Var[X(t)] = 1.

(iii) X(t) is a W.S.S process.

Semi random telegraph signal process

If N(t) represents the number of occurrences of a specified event in (0, t) and X(t) = (−1)^N(t),
where N(t) is a Poisson process, then
X(t) is called a semi random telegraph signal process.

Problem: 1 Define random telegraph process. Prove that it is stationary in the wide sense.
Solution:
It is defined as a discrete-state, continuous-parameter process {X(t), t ≥ 0} with the state
space {1, −1}. Assume that these two values are equally likely,

i.e., P[X(t) = 1] = P[X(t) = −1] = ½.

(A typical sample function of the process alternates between the levels +1 and −1 at random instants.)

Assume that the number of flips N(τ) from one value to another occurring in an interval of length τ is
Poisson distributed with parameter λτ:
P[N(τ) = k] = e^(−λτ)(λτ)ᵏ/k!, k = 0, 1, 2, …

where λ is the number of flips per unit time.

Finally assume that the number of flips in a given time interval is stochastically independent of the value
assumed by the stochastic process X(t) at the beginning of the interval.
For the telegraph process,

E[X(t)] = (1)(½) + (−1)(½) = 0 for all t.

R(t, t + τ) = E[X(t)X(t + τ)]
= (1)P[X(t)X(t + τ) = 1] + (−1)P[X(t)X(t + τ) = −1]
where
P[X(t)X(t + τ) = 1] = P[X(t) = 1, X(t + τ) = 1] + P[X(t) = −1, X(t + τ) = −1]
P[X(t)X(t + τ) = −1] = P[X(t) = 1, X(t + τ) = −1] + P[X(t) = −1, X(t + τ) = 1]
The events [X(t) = 1, X(t + τ) = 1] and [X(t) = −1, X(t + τ) = −1] are equally likely;
similarly the events [X(t) = 1, X(t + τ) = −1] and [X(t) = −1, X(t + τ) = 1] are equally likely.
We observe that [X(t)X(t + τ) = 1] is equivalent to the event "an even number of flips in the
interval (t, t + τ)".
Let τ > 0.
Then P[X(t)X(t + τ) = 1] = P[N(τ) is even]
= Σ_{k = 0, 2, 4, …} e^(−λτ)(λτ)ᵏ/k!

= e^(−λτ) Σ_{k = 0, 2, 4, …} (λτ)ᵏ/k!
= e^(−λτ) [(e^(λτ) + e^(−λτ))/2]

= e^(−λτ) cosh(λτ)

= (1 + e^(−2λτ))/2

P[X(t)X(t + τ) = −1] = P[N(τ) is odd]
= Σ_{k = 1, 3, 5, …} e^(−λτ)(λτ)ᵏ/k!

= e^(−λτ) Σ_{k = 1, 3, 5, …} (λτ)ᵏ/k!
= e^(−λτ) [(e^(λτ) − e^(−λτ))/2]

= e^(−λτ) sinh(λτ)

= (1 − e^(−2λτ))/2

R(t, t + τ) = (1 + e^(−2λτ))/2 − (1 − e^(−2λτ))/2 = e^(−2λτ)

For τ < 0 the same argument applied to the interval (t + τ, t) gives R(t, t + τ) = e^(2λτ), so in general
R(τ) = e^(−2λ|τ|).

Since E[X(t)] = 0 is a constant, R(t, t + τ) depends only on τ, and E[X²(t)] = R(0) = 1 is finite, we
conclude that the random telegraph process is wide sense stationary.
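The result R(τ) = e^(−2λτ) can be checked by simulation: X(t)X(t + τ) = (−1)^(number of flips in an interval of length τ), and the flip count is Poisson with mean λτ. The sketch below draws Poisson counts with Knuth's product-of-uniforms method (an arbitrary implementation choice; the seed and tolerance are also illustrative):

```python
import random
from math import exp

# Monte Carlo check that the telegraph ACF is R(tau) = e^(-2*lam*tau):
# estimate E[(-1)^N] with N ~ Poisson(lam*tau).
def poisson_sample(mu, rng):
    """Knuth's method: count uniforms until their product drops below e^-mu."""
    limit, k, prod = exp(-mu), 0, rng.random()
    while prod > limit:
        prod *= rng.random()
        k += 1
    return k

rng = random.Random(3)
lam, tau, trials = 1.0, 0.5, 100_000
r_hat = sum((-1) ** poisson_sample(lam * tau, rng) for _ in range(trials)) / trials
print(abs(r_hat - exp(-2 * lam * tau)) < 0.02)   # estimate is close to e^-1
```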

Problem : 2 Show that the semi random telegraph signal process is evolutionary, i.e., it is not an SSS and it is
not a WSS process. [OR]
Prove that a random telegraph signal process X(t) = A·Y(t) is a WSS process when A is a random variable
which is independent of Y(t), assumes the values −1 and +1 with equal probability ½, and
R_YY(t₁, t₂) = e^(−2λ|t₁ − t₂|).
[OR]
Let {N(t), t ≥ 0}, where N(t) = total number of points in the interval (0, t) = k, say, and

Y(t) = (−1)^N(t). Find the ACF of Y(t). Also, if P[A = 1] = P[A = −1] = ½ and A is

independent of Y(t), find the ACF of X(t) = A·Y(t) and prove that {X(t)} is a wide sense stationary process.
Solution:
(i) The probability law of N(t) is given by P[N(t) = k] = e^(−λt)(λt)ᵏ/k!, k = 0, 1, 2, 3 ……

Then P[Y(t) = 1] = P[N(t) is even]
= Σ_{k = 0, 2, 4, …} e^(−λt)(λt)ᵏ/k! = e^(−λt) Σ_{k = 0, 2, 4, …} (λt)ᵏ/k!

= e^(−λt) [(e^(λt) + e^(−λt))/2]
= e^(−λt) cosh(λt) = (1 + e^(−2λt))/2

P[Y(t) = −1] = P[N(t) is odd]

= Σ_{k = 1, 3, 5, …} e^(−λt)(λt)ᵏ/k! = e^(−λt) Σ_{k = 1, 3, 5, …} (λt)ᵏ/k!

= e^(−λt) [(e^(λt) − e^(−λt))/2]
= e^(−λt) sinh(λt) = (1 − e^(−2λt))/2

E[Y(t)] = (1)P[Y(t) = 1] + (−1)P[Y(t) = −1]
= (1 + e^(−2λt))/2 − (1 − e^(−2λt))/2
= e^(−2λt), which depends on t.

Hence Y(t) is not an S.S.S process and Y(t) is not a W.S.S process; it is evolutionary.


(ii) R_YY(t, t + τ) = E[Y(t)Y(t + τ)]
= (1)P[Y(t)Y(t + τ) = 1] + (−1)P[Y(t)Y(t + τ) = −1]
Now Y(t)Y(t + τ) = 1 iff an even number of points occur in (t, t + τ), and
Y(t)Y(t + τ) = −1 iff an odd number of points occur in (t, t + τ) (take τ > 0).
Then P[Y(t)Y(t + τ) = 1] = e^(−λτ) cosh(λτ) = (1 + e^(−2λτ))/2
And P[Y(t)Y(t + τ) = −1] = e^(−λτ) sinh(λτ) = (1 − e^(−2λτ))/2
R_YY(τ) = e^(−2λτ), τ > 0
and in general R_YY(τ) = e^(−2λ|τ|).
(iii) To find the ACF of X(t) = A·Y(t):
E[A] = (1)(½) + (−1)(½) = 0 and E[A²] = (1)²(½) + (−1)²(½) = 1.

E[X(t)] = E[A·Y(t)] = E[A]E[Y(t)] = 0, a constant (since A is independent of Y(t)).

R_XX(t, t + τ) = E[X(t)X(t + τ)]
= E[A²Y(t)Y(t + τ)]
= E[A²]E[Y(t)Y(t + τ)]

= R_YY(τ)
= e^(−2λ|τ|), a function of τ only. Hence {X(t)} is a wide sense stationary process.

Problem : 3 The auto correlation of the random telegraph signal process is given by R(τ) = e^(−2λ|τ|).
Determine the power density spectrum of the random telegraph signal.
Solution: S(ω) = ∫_{−∞}^{∞} R(τ) e^(−iωτ) dτ

= ∫_{−∞}^{0} e^(2λτ) e^(−iωτ) dτ + ∫_{0}^{∞} e^(−2λτ) e^(−iωτ) dτ

= ∫_{0}^{∞} e^(−(2λ − iω)τ) dτ + ∫_{0}^{∞} e^(−(2λ + iω)τ) dτ
= 1/(2λ − iω) + 1/(2λ + iω)

= [(2λ + iω) + (2λ − iω)] / [(2λ − iω)(2λ + iω)]

= 4λ/(4λ² + ω²).