Random Process
A random process {X(s, t)} is a function of two arguments: the sample point s and time t.
NOTE
If s and t are both fixed, X(s, t) is a number.
If t is fixed, X(s, t) is a random variable.
If s is fixed, X(s, t) is a single time function (a sample function).
If s and t are both variables, {X(s, t)} is a collection of random variables that are time functions.
A random process {X(t)} is said to be stationary if all its statistical properties do not change with time
(i.e., the statistical properties are constants). Statistical properties means moments, mean, variance, etc.
WIDE-SENSE STATIONARY (WSS) (or) WEAKLY STATIONARY PROCESS (or) COVARIANCE STATIONARY
PROCESS
A random process {X(t)} is said to be wide-sense stationary if the following conditions are satisfied:
(i) Mean: E[X(t)] = a constant.
(ii) E[X(t) X(t + τ)] = R(τ), i.e., the autocorrelation depends only on τ, where τ = t₁ − t₂.
EVOLUTIONARY PROCESS
A random process that is not stationary in any sense is called an evolutionary process.
R(t₁, t₂) = E{X(t₁) X(t₂)} = ∫∫ x₁ x₂ f(x₁, x₂; t₁, t₂) dx₁ dx₂.
NOTE:
The cross-correlation is R_XY(t, t + τ) = E[X(t) Y(t + τ)].
Two processes X(t) and Y(t) are jointly wide-sense stationary if
(i) X(t) is WSS and Y(t) is WSS, and
(ii) the cross-correlation R_XY(t, t + τ) depends only on τ.
Formula
E[X(t)] = Σₙ xₙ p(xₙ) (discrete),  E[X(t)] = ∫ x f(x) dx (continuous)
E[X²(t)] = Σₙ xₙ² p(xₙ) (discrete),  E[X²(t)] = ∫ x² f(x) dx (continuous)
Var[X(t)] = E[X²(t)] − {E[X(t)]}²,  R(τ) = E[X(t) X(t + τ)],  C(τ) = R(τ) − E[X(t)] E[X(t + τ)]
Problem : 2 The process {X(t)} whose probability distribution under certain conditions is given by

P{X(t) = n} = (at)^(n−1) / (1 + at)^(n+1),  n = 1, 2, 3, …
P{X(t) = 0} = at / (1 + at).

Show that {X(t)} is not stationary.

Solution:
E[X(t)] = Σ_{n=0}^∞ n P{X(t) = n}
  = Σ_{n=1}^∞ n (at)^(n−1) / (1 + at)^(n+1)
  = 1/(1 + at)² · Σ_{n=1}^∞ n [at/(1 + at)]^(n−1)
  = 1/(1 + at)² · [1 − at/(1 + at)]^(−2)    … (1)
    [since Σ n r^(n−1) = (1 − r)^(−2) for |r| < 1]
  = 1/(1 + at)² · (1 + at)²
  = 1.
Therefore E[X(t)] = 1, a constant.

Now, E[X²(t)] = Σ_{n=0}^∞ n² P{X(t) = n}
  = Σ_{n=1}^∞ [n(n + 1) − n] (at)^(n−1) / (1 + at)^(n+1)
  = Σ_{n=1}^∞ n(n + 1) (at)^(n−1) / (1 + at)^(n+1) − Σ_{n=1}^∞ n (at)^(n−1) / (1 + at)^(n+1)
  = 1/(1 + at)² · 2[1 − at/(1 + at)]^(−3) − 1    [by (1), and since Σ n(n + 1) r^(n−1) = 2(1 − r)^(−3)]
  = 2(1 + at)³/(1 + at)² − 1
  = 2(1 + at) − 1
  = 1 + 2at.

Therefore Var[X(t)] = E[X²(t)] − {E[X(t)]}² = 1 + 2at − 1 = 2at, which depends on t.
Since the variance is not a constant, {X(t)} is not stationary.
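The two moments derived above can be checked numerically; the sketch below sums the series directly for a few sample values of the product at (chosen here purely for illustration).

```python
# Illustrative numeric check: for P{X(t)=n} = (at)^(n-1)/(1+at)^(n+1), n >= 1,
# and P{X(t)=0} = at/(1+at), the probabilities sum to 1, E[X(t)] = 1 and
# E[X^2(t)] = 1 + 2at. Terms are built iteratively to avoid overflow.
def moments(at, nmax=5000):
    r = at / (1 + at)                 # common ratio of the series
    total, mean, msq = r, 0.0, 0.0    # P{X(t)=0} = at/(1+at) = r
    p = 1.0 / (1 + at) ** 2           # P{X(t)=1}
    for n in range(1, nmax + 1):
        total += p
        mean += n * p
        msq += n * n * p
        p *= r                        # P{X(t)=n+1} = r * P{X(t)=n}
    return total, mean, msq

for at in (0.5, 1.0, 3.0):
    total, mean, msq = moments(at)
    print(round(total, 6), round(mean, 6), round(msq, 6), 1 + 2 * at)
```

For every at the mean stays at 1 while the second moment tracks 1 + 2at, confirming that only the variance carries the time dependence.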
Problem : 3 Show that the random process X(t) = A sin(ωt + θ), where A and ω are constants and θ is a
random variable uniformly distributed in (0, 2π), is first-order stationary.
Solution: Given X(t) = A sin(ωt + θ).
Since θ is a uniformly distributed R.V. over (0, 2π), we have
f(θ) = 1/2π,  0 < θ < 2π.
Now E[X(t)] = ∫₀^{2π} X(t) f(θ) dθ
  = ∫₀^{2π} A sin(ωt + θ) (1/2π) dθ
  = (A/2π) [−cos(ωt + θ)]₀^{2π}
  = (A/2π) [cos ωt − cos(ωt + 2π)]
  = (A/2π) [cos ωt − cos ωt]
  = 0.
E[X(t)] = 0, a constant.
E[X²(t)] = A² E[sin²(ωt + θ)] = (A²/2) E[1 − cos 2(ωt + θ)] = (A²/2){1 − E[cos 2(ωt + θ)]},
and E[cos 2(ωt + θ)] = ∫₀^{2π} cos 2(ωt + θ) (1/2π) dθ
  = (1/4π) [sin 2(ωt + θ)]₀^{2π}
  = (1/4π) [sin(2ωt + 4π) − sin 2ωt] = 0.
Therefore E[X²(t)] = A²/2, a constant.
Since the first-order statistics do not depend on t, {X(t)} is first-order stationary.
Problem : Show that the process X(t) = A cos(ωt + θ), where A and ω are constants and θ is uniformly
distributed in (0, 2π), is wide-sense stationary.
Solution:
f(θ) = 1/2π,  0 < θ < 2π;  0 otherwise.
Now, E[X(t)] = ∫₀^{2π} X(t) f(θ) dθ
  = ∫₀^{2π} A cos(ωt + θ) (1/2π) dθ
  = (A/2π) [sin(ωt + θ)]₀^{2π}
  = (A/2π) [sin(ωt + 2π) − sin ωt]
  = (A/2π) [sin ωt − sin ωt] = 0.
E[X(t)] = 0, a constant.
R(t, t + τ) = E[X(t) X(t + τ)] = E[A cos(ωt + θ) · A cos{ω(t + τ) + θ}]
  = (A²/2) E[cos{ω(2t + τ) + 2θ} + cos ωτ]
  = (A²/2) E[cos{ω(2t + τ) + 2θ}] + (A²/2) cos ωτ,
and E[cos{ω(2t + τ) + 2θ}] = ∫₀^{2π} cos{ω(2t + τ) + 2θ} (1/2π) dθ
  = (1/4π) [sin{ω(2t + τ) + 2θ}]₀^{2π} = 0.
Therefore R(t, t + τ) = (A²/2) cos ωτ, a function of τ only.
Both conditions are satisfied. Hence {X(t)} is a WSS process.
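Both WSS conditions for X(t) = A cos(ωt + θ) with θ uniform on (0, 2π) can be checked deterministically by averaging over θ with a midpoint rule; the parameter values below are sample choices, not part of the original problem.

```python
import math

# Deterministic check: for X(t) = A cos(wt + theta), theta ~ U(0, 2*pi),
# the mean should be 0 and E[X(t) X(t+tau)] should equal (A^2/2) cos(w*tau),
# independent of t. Midpoint-rule average over theta on a full period.
def mean_and_acf(A, w, t, tau, steps=20000):
    h = 2 * math.pi / steps
    m = r = 0.0
    for k in range(steps):
        th = (k + 0.5) * h
        x1 = A * math.cos(w * t + th)
        x2 = A * math.cos(w * (t + tau) + th)
        m += x1 * h
        r += x1 * x2 * h
    return m / (2 * math.pi), r / (2 * math.pi)

m1, r1 = mean_and_acf(2.0, 3.0, t=0.4, tau=0.7)
m2, r2 = mean_and_acf(2.0, 3.0, t=5.1, tau=0.7)
print(round(m1, 6), round(r1, 6), round(r2, 6), round(2.0 * math.cos(3.0 * 0.7), 6))
```

The autocorrelation comes out the same for both values of t and matches (A²/2) cos ωτ = 2 cos 2.1.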
REMARK
Every SSS process of order 2 is a WSS process, but every WSS process need not be an SSS process of order 2.
For an SSS process, E[X(t)] = constant holds automatically; for a WSS process, the constancy of the mean
must be verified together with the condition on the autocorrelation.
Problem : If X(t) = A cos(ωt + θ) and Y(t) = A sin(ωt + θ), where A and ω are constants and θ is
uniformly distributed in (0, 2π), show that X(t) and Y(t) are jointly wide-sense stationary.
Solution: Each process is WSS (each has zero mean and an autocorrelation depending only on τ).
R_XY(t, t + τ) = E[X(t) Y(t + τ)] = E[A cos(ωt + θ) · A sin{ω(t + τ) + θ}]
  = (A²/2) E[sin{ω(2t + τ) + 2θ} + sin ωτ]
  = (A²/2) E[sin{ω(2t + τ) + 2θ}] + (A²/2) sin ωτ
  = (A²/2) sin ωτ,  since E[sin{ω(2t + τ) + 2θ}] = 0.
R_XY(t, t + τ) depends only on τ. Hence X(t) and Y(t) are jointly wide-sense stationary.
Problem : 9 If X(t) = Y cos t + Z sin t for all t, where Y and Z are independent binary random variables,
each of which assumes the values −1 and 2 with probabilities 2/3 and 1/3 respectively, prove that {X(t)} is wide-
sense stationary.
Solution:
Y and Z are discrete random variables which assume the values
Y      −1    2
P(Y)   2/3   1/3
E(Y) = Σ y P(y)
E(Y) = (−1)(2/3) + (2)(1/3) = 0.
E(Y²) = Σ y² P(y)
E(Y²) = (1)(2/3) + (4)(1/3) = 2.
Var(Y) = E(Y²) − [E(Y)]² = 2 − 0 = 2.
Similarly E(Z) = 0 and Var(Z) = 2.
Y and Z are independent random variables.
E[YZ] = E[Y] E[Z] = 0.
Given X(t) = Y cos t + Z sin t,
E[X(t)] = E(Y) cos t + E(Z) sin t = 0, a constant.
E[X(t) X(t + τ)] = E[(Y cos t + Z sin t)(Y cos(t + τ) + Z sin(t + τ))]
 = E[Y²] cos t cos(t + τ) + E[Z²] sin t sin(t + τ) + E[YZ] {cos t sin(t + τ) + sin t cos(t + τ)}
 = 2 cos t cos(t + τ) + 2 sin t sin(t + τ)
 = 2 cos(t + τ − t)
 = 2 cos τ, which is a function of τ only.
Both conditions are satisfied. Hence {X(t)} is a WSS process.
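Because Y and Z take only two values each, the two WSS conditions can be verified exactly by enumerating the four (Y, Z) combinations; the sample points t and τ below are illustrative choices.

```python
import math

# Exact check by enumeration: Y, Z independent, each -1 (prob 2/3) or 2 (prob 1/3);
# X(t) = Y cos t + Z sin t should give E[X(t)] = 0 and E[X(t) X(t+tau)] = 2 cos(tau).
vals = [(-1, 2 / 3), (2, 1 / 3)]

def stats(t, tau):
    m = r = 0.0
    for y, py in vals:
        for z, pz in vals:
            p = py * pz                       # independence: joint prob is product
            x1 = y * math.cos(t) + z * math.sin(t)
            x2 = y * math.cos(t + tau) + z * math.sin(t + tau)
            m += p * x1
            r += p * x1 * x2
    return m, r

m1, r1 = stats(0.3, 0.9)
m2, r2 = stats(1.7, 0.9)
print(round(m1, 10), round(r1, 10), round(r2, 10), round(2 * math.cos(0.9), 10))
```

The mean is 0 at both time origins and the correlation matches 2 cos τ regardless of t.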
Note: μ_X = √(lim_{|τ|→∞} R(τ)).
Problem : 11 The autocorrelation function of a stationary process is given by R(τ) = 9 + 2e^{−|τ|}. Find the
mean and variance of the process {X(t)}. If Y = ∫₀² X(t) dt, find the mean of Y.
Solution: Given R(τ) = 9 + 2e^{−|τ|}.
We know μ_X = √(lim_{|τ|→∞} R(τ)).
lim_{|τ|→∞} R(τ) = lim_{|τ|→∞} [9 + 2e^{−|τ|}] = 9.
Therefore E[X(t)] = √9 = 3.
Var[X(t)] = E[X²(t)] − {E[X(t)]}².
E[X²(t)] = R(0) = 9 + 2(1) = 11.
Var[X(t)] = 11 − 9 = 2.
Given Y = ∫₀² X(t) dt.
E[Y] = ∫₀² E[X(t)] dt = ∫₀² 3 dt = 6.
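The three quantities just computed follow mechanically from R(τ); a short sketch:

```python
import math

# Check of Problem 11: R(tau) = 9 + 2 e^{-|tau|} gives mean 3, variance 2,
# and E[Y] = integral of E[X(t)] over (0, 2) = 6.
def R(tau):
    return 9 + 2 * math.exp(-abs(tau))

mean_sq = R(1e6)       # lim R(tau) as |tau| -> inf (exponential term vanishes)
mean = math.sqrt(mean_sq)
var = R(0) - mean_sq   # E[X^2] = R(0)
EY = mean * (2 - 0)    # E[int_0^2 X(t) dt] = int_0^2 E[X(t)] dt
print(mean, var, EY)
```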
3. Let the random process be X(t) = cos(ωt + θ), where θ is a random variable with density function
f(θ) = 1/π, −π/2 < θ < π/2. Check whether the process is stationary or not.
Markov Process
A random process {X(t)} is said to be a Markov process if
P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n, X(t_{n−1}) = x_{n−1}, …, X(t_0) = x_0] = P[X(t_{n+1}) ≤ x_{n+1} | X(t_n) = x_n]
for all t_0 < t_1 < … < t_n < t_{n+1}.
In other words, if the future behavior depends only on the present value and not on the past, then the process is
called a Markov process.
EXAMPLE
The probability of raining today depends only on the weather conditions of the previous day and
not on earlier weather conditions.
Markov Chain
If P[X_n = a_n | X_{n−1} = a_{n−1}, X_{n−2} = a_{n−2}, …, X_0 = a_0] = P[X_n = a_n | X_{n−1} = a_{n−1}] for all n, then the
process {X_n} is called a Markov chain. Here a_1, a_2, …, a_n are called the states of the Markov chain.
EXAMPLE
Three boys A, B, C are throwing a ball to each other. A always throws the ball to B and B always throws the
ball to C; C is just as likely to throw the ball to B as to A. The t.p.m. of this Markov chain is
      A    B    C
A  [  0    1    0 ]
B  [  0    0    1 ]
C  [ 1/2  1/2   0 ]
NOTE
If the state i is persistent non-null and aperiodic, then lim_{n→∞} p_ii^{(n)} > 0, and if the state i is persistent null or transient,
then lim_{n→∞} p_ii^{(n)} = 0.
Ergodic State
A persistent, non-null and aperiodic state i is called an ergodic state.
Absorbing State
A state i is called an absorbing state if and only if p_ii = 1.
Problem: 1 The transition probability matrix of a Markov chain {X_n}, n = 1, 2, 3, …, having 3 states 1, 2 and
3 is
P = [ 0.1  0.5  0.4
      0.6  0.2  0.2
      0.3  0.4  0.3 ]
and the initial distribution is P(0) = (0.7, 0.2, 0.1). Find (i) P[X_2 = 3] and (ii) P[X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2].
Solution:
Given P as above,
P² = [ 0.1  0.5  0.4 ] [ 0.1  0.5  0.4 ]   [ 0.43  0.31  0.26 ]
     [ 0.6  0.2  0.2 ] [ 0.6  0.2  0.2 ] = [ 0.24  0.42  0.34 ]
     [ 0.3  0.4  0.3 ] [ 0.3  0.4  0.3 ]   [ 0.36  0.35  0.29 ]
Given that P(0) = (0.7, 0.2, 0.1),
i.e., P[X_0 = 1] = 0.7; P[X_0 = 2] = 0.2; P[X_0 = 3] = 0.1.
(i) P[X_2 = 3] = Σ_i P[X_2 = 3 | X_0 = i] P[X_0 = i]
 = P[X_2 = 3 | X_0 = 1] P[X_0 = 1] + P[X_2 = 3 | X_0 = 2] P[X_0 = 2] + P[X_2 = 3 | X_0 = 3] P[X_0 = 3]
 = (0.26)(0.7) + (0.34)(0.2) + (0.29)(0.1)
 = 0.182 + 0.068 + 0.029 = 0.279.
(ii) P[X_3 = 2, X_2 = 3, X_1 = 3, X_0 = 2]
 = P[X_3 = 2 | X_2 = 3] P[X_2 = 3 | X_1 = 3] P[X_1 = 3 | X_0 = 2] P[X_0 = 2]
 = (0.4)(0.3)(0.2)(0.2)
 = 0.0048.
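Both answers can be checked with a few lines of matrix arithmetic. The t.p.m. entries and initial distribution below are the commonly printed values for this exercise (an assumption, since the symbols did not survive in this copy).

```python
# Check of Problem 1, assuming the commonly printed t.p.m. and initial distribution.
P = [[0.1, 0.5, 0.4],
     [0.6, 0.2, 0.2],
     [0.3, 0.4, 0.3]]
p0 = [0.7, 0.2, 0.1]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

P2 = matmul(P, P)
prob_X2_eq_3 = sum(p0[i] * P2[i][2] for i in range(3))
# P[X3=2, X2=3, X1=3, X0=2] = P(X0=2) * p_23 * p_33 * p_32
path = p0[1] * P[1][2] * P[2][2] * P[2][1]
print(round(prob_X2_eq_3, 4), round(path, 6))
```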
Formulae to remember
(i) Conditional probability: P(A/B) = P(A ∩ B)/P(B), provided P(B) ≠ 0; similarly P(B/A) = P(A ∩ B)/P(A).
(ii) Bayes' theorem: P(A_i/B) = P(B/A_i) P(A_i) / Σ_j P(B/A_j) P(A_j).
(iii) Total probability: P(B) = Σ_i P(B/A_i) P(A_i).
(iv) Multiplication rule: P[A ∩ B ∩ C] = P(A / B ∩ C) P(B / C) P(C).
Problem: 2 A Markov chain {X_n}, n = 0, 1, 2, …, with states 0, 1, 2 has the transition probability matrix
P = [ 3/4  1/4   0
      1/4  1/2  1/4
       0   3/4  1/4 ]
and the initial distribution P(0) = (1/3, 1/3, 1/3). Find (i) P[X_2 = 2], (ii) P[X_2 = 2, X_1 = 1, X_0 = 2] and
(iii) P[X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2].
Solution:
P² = [ 3/4  1/4   0  ] [ 3/4  1/4   0  ]   [ 5/8   5/16  1/16 ]
     [ 1/4  1/2  1/4 ] [ 1/4  1/2  1/4 ] = [ 5/16  1/2   3/16 ]
     [  0   3/4  1/4 ] [  0   3/4  1/4 ]   [ 3/16  9/16  1/4  ]
Given P(0) = (1/3, 1/3, 1/3),
i.e., P[X_0 = 0] = 1/3, P[X_0 = 1] = 1/3, P[X_0 = 2] = 1/3.
(i) P[X_2 = 2] = Σ_i P[X_2 = 2 | X_0 = i] P[X_0 = i]
 = (1/16)(1/3) + (3/16)(1/3) + (1/4)(1/3)
 = (1/3)(1/16 + 3/16 + 4/16)
 = (1/3)(1/2) = 1/6.
(ii) P[X_2 = 2, X_1 = 1, X_0 = 2] = P[X_2 = 2 | X_1 = 1] P[X_1 = 1 | X_0 = 2] P[X_0 = 2]
 = (1/4)(3/4)(1/3) = 1/16.
(iii) P[X_3 = 1, X_2 = 2, X_1 = 1, X_0 = 2]
 = P[X_3 = 1 | X_2 = 2] P[X_2 = 2 | X_1 = 1] P[X_1 = 1 | X_0 = 2] P[X_0 = 2]
 = (3/4)(1/4)(3/4)(1/3) = 3/64.
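A sketch with exact fractions confirms the arithmetic. The t.p.m. below is the commonly printed matrix for this exercise (an assumption, since the fractions were lost in extraction here).

```python
from fractions import Fraction as F

# Check of Problem 2 with exact fractions; states 0, 1, 2, uniform start.
P = [[F(3, 4), F(1, 4), F(0)],
     [F(1, 4), F(1, 2), F(1, 4)],
     [F(0),    F(3, 4), F(1, 4)]]
p0 = [F(1, 3)] * 3

P2 = [[sum(P[i][k] * P[k][j] for k in range(3)) for j in range(3)] for i in range(3)]
prob_X2_eq_2 = sum(p0[i] * P2[i][2] for i in range(3))
# P[X3=1, X2=2, X1=1, X0=2] = P(X0=2) * p_21 * p_12 * p_21
path = p0[2] * P[2][1] * P[1][2] * P[2][1]
print(prob_X2_eq_2, path)
```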
Problem: 3 A college student X has the following study habits. If he studies one night, he is 70% sure
not to study the next night. If he does not study one night, he is only 60% sure not to study the next night also.
Find (i) the t.p.m. and (ii) how often he studies in the long run.
Solution:
Since the study pattern depends only on the present night and not on the past, it is a Markov chain with
states studying (S) and not studying (N).
(i) The t.p.m. is
                State of X_{n+1}
                   S     N
State of X_n  S [ 0.3   0.7 ]
              N [ 0.4   0.6 ]
(ii) Let π = (π_S, π_N) be the long-run (stationary) distribution. Then πP = π and π_S + π_N = 1:
0.3 π_S + 0.4 π_N = π_S  ⇒  0.4 π_N = 0.7 π_S  ⇒  π_N = (7/4) π_S.
π_S + (7/4) π_S = 1  ⇒  π_S = 4/11, π_N = 7/11.
In the long run he studies on 4/11 of the nights.
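The stationary distribution of a two-state chain has the closed form π_S = p_NS / (1 − p_SS + p_NS); a short exact-fraction sketch for the study-habits chain:

```python
from fractions import Fraction as F

# Long-run distribution for states S (study) and N (not study):
# P(S->S) = 3/10, P(S->N) = 7/10, P(N->S) = 2/5, P(N->N) = 3/5.
P = [[F(3, 10), F(7, 10)],
     [F(2, 5),  F(3, 5)]]

# Solving pi P = pi with pi_S + pi_N = 1:
pi_S = P[1][0] / (1 - P[0][0] + P[1][0])
pi_N = 1 - pi_S
# Verify stationarity directly:
assert [pi_S, pi_N] == [pi_S * P[0][j] + pi_N * P[1][j] for j in range(2)]
print(pi_S, pi_N)
```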
Problem : 4 A man either drives a car or catches a train to go to office each day. He never goes 2 days in a
row by train, but if he drives one day, then the next day he is just as likely to drive again as he is to travel by
train. Now suppose that on the first day of the week, the man tossed a fair die and drove to work if and
only if 6 appeared. Find (i) the probability that he takes a train on the third day and (ii) the probability
that he drives to work in the long run.
Solution:
Since the mode of transport for the next day is decided on the basis of today alone, the travel pattern is a Markov
chain with states train (T) and car (C).
The t.p.m. is
                State of X_{n+1}
                   T     C
State of X_n  T [  0     1  ]
              C [ 1/2   1/2 ]
He drove on the first day if and only if a 6 appeared, so P(C) = 1/6 and P(T) = 5/6.
The first day state distribution is P(1) = (5/6, 1/6).
Second day state probability:
P(2) = P(1) P = (5/6, 1/6) [ 0 1; 1/2 1/2 ] = (1/12, 11/12).
Third day state probability:
P(3) = P(2) P = (1/12, 11/12) [ 0 1; 1/2 1/2 ] = (11/24, 13/24).
(i) P[he travels by train on the third day] = 11/24.
(ii) Long run: let π = (π_T, π_C) with πP = π and π_T + π_C = 1:
π_T = (1/2) π_C and π_T + (1/2) π_C = π_C  ⇒  π_C = 2/3, π_T = 1/3.
P[he drives to work in the long run] = 2/3.
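Both parts of the car/train problem reduce to repeated vector-matrix products; an exact-fraction sketch:

```python
from fractions import Fraction as F

# States (T, C): never train twice in a row; after driving, equally likely
# to drive or take the train the next day.
P = [[F(0),    F(1)],
     [F(1, 2), F(1, 2)]]
day = [F(5, 6), F(1, 6)]   # day 1: drives iff a fair die shows 6
for _ in range(2):         # advance to day 3
    day = [sum(day[i] * P[i][j] for i in range(2)) for j in range(2)]
# Long run: pi = (1/3, 2/3) should satisfy pi P = pi.
pi_T, pi_C = F(1, 3), F(2, 3)
assert [pi_T, pi_C] == [pi_T * P[0][j] + pi_C * P[1][j] for j in range(2)]
print(day[0], pi_C)
```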
Problem : 5 An engineer analyzing a series of digital signals generated by a testing system observes that
only 1 out of 15 highly distorted signals follows a highly distorted signal with no recognizable signal
between, whereas 20 out of 23 recognizable signals follow recognizable signals with no highly distorted signal
between. Given that only highly distorted signals are not recognizable, find the fraction of signals that are
highly distorted.
Solution: Let X_n = 0 if the nth signal generated is highly distorted and X_n = 1 if it is recognizable.
{X_n, n = 1, 2, …} is a Markov chain with state space {0, 1}.
The t.p.m. is
P = [ 1/15  14/15
      3/23  20/23 ]
Let π = (π_0, π_1) be the limiting form of the long-run probability distribution, with π_0 + π_1 = 1.
We know that πP = π:
(1/15) π_0 + (3/23) π_1 = π_0  ⇒  (3/23) π_1 = (14/15) π_0  ⇒  π_1 = (322/45) π_0.
π_0 + (322/45) π_0 = 1  ⇒  π_0 = 45/367.
Hence the fraction of signals that are highly distorted is 45/367 ≈ 0.1226.
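The same two-state stationary-distribution formula applies here; an exact-fraction sketch:

```python
from fractions import Fraction as F

# Long-run fraction of highly distorted signals (state 0) for the t.p.m.
# built from the data: 1/15 of distorted follow distorted, 20/23 of
# recognizable follow recognizable.
P = [[F(1, 15), F(14, 15)],
     [F(3, 23), F(20, 23)]]
pi0 = P[1][0] / (1 - P[0][0] + P[1][0])
pi1 = 1 - pi0
assert [pi0, pi1] == [pi0 * P[0][j] + pi1 * P[1][j] for j in range(2)]
print(pi0, float(pi0))
```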
Problem: 6 Find the limiting state probabilities associated with the given transition probability matrix P.
Solution:
Let π = (π_1, π_2, π_3) be the limiting distribution. It satisfies πP = π together with π_1 + π_2 + π_3 = 1.
Writing out πP = π gives three linear equations, one of which is redundant. Solving any two of them by
elimination, together with the normalizing condition π_1 + π_2 + π_3 = 1, yields the limiting state
probabilities.
Problem : 7 A fair die is tossed repeatedly. If X_n denotes the maximum of the numbers occurring in the first
n tosses, find the transition probability matrix P of the Markov chain {X_n}. Find also P² and P[X_2 = 6].
Solution: The state space is {1, 2, 3, 4, 5, 6}.
The t.p.m. is formed using the following analysis.
Let X_n = the maximum of the numbers occurring in the first n trials.
Assume this number is 3.
P[X_{n+1} = 3 | X_n = 3] = 3/6 (there are 3 possibilities: the next toss shows 1, 2 or 3, so the maximum stays 3).
P[X_{n+1} = j | X_n = 3] = 1/6 when j = 4, 5, 6, and P[X_{n+1} = j | X_n = 3] = 0 when j < 3.
In general, p_ij = i/6 if j = i, p_ij = 1/6 if j > i, and p_ij = 0 if j < i:
P = (1/6) [ 1 1 1 1 1 1
            0 2 1 1 1 1
            0 0 3 1 1 1
            0 0 0 4 1 1
            0 0 0 0 5 1
            0 0 0 0 0 6 ]
The initial state probability distribution is P(1) = (1/6, 1/6, 1/6, 1/6, 1/6, 1/6).
P² = (1/36) [ 1 3 5 7  9 11
              0 4 5 7  9 11
              0 0 9 7  9 11
              0 0 0 16 9 11
              0 0 0 0 25 11
              0 0 0 0  0 36 ]
P[X_2 = 6] = Σ_i P[X_2 = 6 | X_1 = i] P[X_1 = i] = (1/6)(1/6 + 1/6 + 1/6 + 1/6 + 1/6 + 1) = 11/36.
Problem: 8 Suppose that the probability of a dry day following a rainy day is 1/3 and that the
probability of a rainy day following a dry day is 1/2. Given that May 1 is a dry day, find the probability that
(i) May 3 is a dry day and
(ii) May 5 is a dry day.
Solution: To find the t.p.m.:
The state space is {D, R} where D – dry day and R – rainy day.
The t.p.m. of the Markov chain is
      D     R
D [ 1/2   1/2 ]
R [ 1/3   2/3 ]
The initial probability distribution is (1, 0), since May 1 is a dry day.
May 2: P(2) = (1, 0) P = (1/2, 1/2).
May 3: P(3) = P(2) P = (1/2, 1/2) [ 1/2 1/2; 1/3 2/3 ] = (1/4 + 1/6, 1/4 + 1/3) = (5/12, 7/12).
(i) P[May 3 is a dry day] = 5/12 ≈ 0.4167.
May 4: P(4) = P(3) P = (5/12, 7/12) P = (5/24 + 7/36, 5/24 + 7/18) = (29/72, 43/72).
May 5: P(5) = P(4) P = (29/72, 43/72) P = (29/144 + 43/216, 29/144 + 43/108) = (173/432, 259/432).
(ii) P[May 5 is a dry day] = 173/432 ≈ 0.4005.
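The day-by-day distributions are just repeated products with the t.p.m.; an exact-fraction sketch:

```python
from fractions import Fraction as F

# States (D, R): P(dry | rainy) = 1/3, P(rainy | dry) = 1/2; May 1 is dry.
P = [[F(1, 2), F(1, 2)],
     [F(1, 3), F(2, 3)]]
day = [F(1), F(0)]          # May 1
history = []
for _ in range(4):          # May 2, 3, 4, 5
    day = [sum(day[i] * P[i][j] for i in range(2)) for j in range(2)]
    history.append(day)
print(history[1][0], history[3][0])  # P[May 3 dry], P[May 5 dry]
```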
Problem: 9 Three boys A, B and C are throwing a ball to each other. A always throws the ball to B and B
always throws the ball to C, and C is just as likely to throw the ball to B as to A. Show that the process is
Markovian. Find the t.p.m. and classify the states.
Solution: Since the next throw depends only on who currently holds the ball, the process {X_n} is a Markov
chain. Its t.p.m. is
       A    B    C
A  [   0    1    0 ]
B  [   0    0    1 ]
C  [ 1/2  1/2    0 ]
Now
P² = [  0    0    1
       1/2  1/2   0
        0   1/2  1/2 ]
P³ = P² · P = [ 1/2  1/2   0
                 0   1/2  1/2
                1/4  1/4  1/2 ]
P⁴ = [  0   1/2  1/2
       1/4  1/4  1/2
       1/4  1/2  1/4 ]   and so on.
We observe that
p_22^(2) = 1/2 > 0, p_33^(2) = 1/2 > 0 and p_22^(3) = 1/2 > 0, p_33^(3) = 1/2 > 0,
and GCD of {2, 3, 4, …} = 1.
The states 2 and 3 (i.e., B and C) have period 1, i.e., they are aperiodic.
Also p_11^(3) = 1/2, p_11^(5) = 1/4, p_11^(6) = 1/4, … are > 0 and GCD of {3, 5, 6, …} = 1.
The state 1 (i.e., state A) has period 1, i.e., it is aperiodic.
Since the chain is finite and irreducible, all its states are non-null persistent.
Moreover all the states are ergodic.
Problem : 10 Find the nature of the states of the Markov chain with the t.p.m.
       0    1    2
0  [   0    1    0 ]
1  [ 1/2   0   1/2 ]
2  [   0    1    0 ]
Solution:
P² = [ 1/2   0   1/2
        0    1    0
       1/2   0   1/2 ]
P³ = P² · P = [  0    1    0
                1/2   0   1/2
                 0    1    0 ] = P,
so P⁴ = P², P⁵ = P, and so on.
We observe that p_ii^(n) > 0 only when n is even:
p_00^(2) = 1/2, p_11^(2) = 1, p_22^(2) = 1/2, while p_ii^(n) = 0 for all odd n.
Here GCD of {2, 4, 6, …} = 2 for all i, so all the states of the chain are periodic, with period 2.
Since the chain is finite and irreducible, all its states are non-null persistent.
Since the states are periodic, the states are not ergodic.
Problem 11: Consider a Markov chain with state space {0, 1} and a TPM P in which the states do not
communicate.
(i) Since states 0 and 1 do not communicate with each other, the Markov chain is not irreducible.
(ii) Since the chain is not irreducible, its states are not all non-null persistent.
(iii) Hence the chain is not ergodic.
State and prove the probability law for the Poisson process:
Let λ be the number of occurrences of the event in unit time (i.e., λ is the rate of occurrence). Then
the probability of exactly n occurrences in a time interval of length t, say X(t), follows a Poisson distribution with
parameter λt:
P[X(t) = n] = e^{−λt} (λt)^n / n!,  n = 0, 1, 2, …
P[X(t) = n] is denoted by P_n(t).
Proof: Let X(t) be a Poisson process with rate of occurrence λ (i.e., λ is the expected number of occurrences of the
event in unit time).
Let P_n(t) = P[X(t) = n].
P_n(t + Δt) = P[X(t + Δt) = n]
 = P[X(t) = n and no occurrence in (t, t + Δt)] + P[X(t) = n − 1 and one occurrence in (t, t + Δt)]
 = P_n(t)(1 − λΔt) + P_{n−1}(t) λΔt,
by the postulates of the Poisson process, neglecting terms of order (Δt)².
[P_n(t + Δt) − P_n(t)] / Δt = −λ P_n(t) + λ P_{n−1}(t).
Taking the limit as Δt → 0,
P_n′(t) + λ P_n(t) = λ P_{n−1}(t).    (1)
This is a linear differential equation with integrating factor e^{λt}; its solution may be written as
P_n(t) e^{λt} = λ ∫ P_{n−1}(t) e^{λt} dt + C.    (2)
For n = 0 there is no previous occurrence, so (1) becomes
P_0′(t) = −λ P_0(t), i.e., dP_0(t)/P_0(t) = −λ dt.
Integrating the above w.r.t. t, we get
log P_0(t) = −λt + C.
With P_0(0) = 1 we get C = 0, so
P_0(t) = e^{−λt}.
Put n = 1 in (2):
P_1(t) e^{λt} = λ ∫ e^{−λt} e^{λt} dt + C = λt + C.
With P_1(0) = 0, C = 0, so P_1(t) = λt e^{−λt}.
Put n = 2 in (2):
P_2(t) e^{λt} = λ ∫ λt dt + C = (λt)²/2 + C, and C = 0,
so P_2(t) = e^{−λt} (λt)² / 2!.
In general,
P_n(t) = e^{−λt} (λt)^n / n!,  n = 0, 1, 2, …
Thus the Poisson process has the Poisson distribution with parameter λt.
Mean: E[X(t)] = Σ_{n=0}^∞ n P_n(t)
 = Σ_{n=1}^∞ n e^{−λt} (λt)^n / n!
 = e^{−λt} Σ_{n=1}^∞ (λt)^n / (n − 1)!
 = e^{−λt} λt Σ_{n=1}^∞ (λt)^{n−1} / (n − 1)!
 = e^{−λt} λt [1 + λt + (λt)²/2! + …]
 = e^{−λt} λt e^{λt}
 = λt.
Mean: E[X(t)] = λt.
Now,
E[X²(t)] = Σ_{n=0}^∞ n² P_n(t) = Σ_{n=1}^∞ [n(n − 1) + n] e^{−λt} (λt)^n / n!
 = Σ_{n=2}^∞ n(n − 1) e^{−λt} (λt)^n / n! + Σ_{n=1}^∞ n e^{−λt} (λt)^n / n!
 = e^{−λt} (λt)² Σ_{n=2}^∞ (λt)^{n−2} / (n − 2)! + λt
 = e^{−λt} (λt)² e^{λt} + λt
 = (λt)² + λt.
Var[X(t)] = E[X²(t)] − {E[X(t)]}² = (λt)² + λt − (λt)² = λt.
Autocorrelation of the Poisson process
For t₁ ≤ t₂,
R_XX(t₁, t₂) = E[X(t₁) X(t₂)] = E[X(t₁){X(t₂) − X(t₁) + X(t₁)}]
 = E[X(t₁){X(t₂) − X(t₁)}] + E[X²(t₁)]
 = E[X(t₁)] E[X(t₂) − X(t₁)] + E[X²(t₁)]    (by independent increments)
 = λt₁ · λ(t₂ − t₁) + λ²t₁² + λt₁
 = λ² t₁ t₂ + λt₁.
Similarly, for t₂ ≤ t₁ we get R_XX(t₁, t₂) = λ² t₁ t₂ + λt₂.
In general, R_XX(t₁, t₂) = λ² t₁ t₂ + λ min(t₁, t₂).
Auto Covariance of the Poisson Process
C_XX(t₁, t₂) = R_XX(t₁, t₂) − E[X(t₁)] E[X(t₂)]
 = λ² t₁ t₂ + λ min(t₁, t₂) − λ² t₁ t₂
 = λ min(t₁, t₂).
PROPERTY : 1 (Markov Property)
Consider, for t₁ < t₂ < t₃,
P[X(t₃) = n₃ | X(t₂) = n₂, X(t₁) = n₁]
 = P[X(t₃) = n₃, X(t₂) = n₂, X(t₁) = n₁] / P[X(t₂) = n₂, X(t₁) = n₁]
 = e^{−λ(t₃ − t₂)} [λ(t₃ − t₂)]^{n₃ − n₂} / (n₃ − n₂)!    (by independent increments)
 = P[X(t₃) = n₃ | X(t₂) = n₂].
This means that the conditional probability distribution of X(t₃) given all the past values X(t₁) = n₁,
X(t₂) = n₂ depends only on the most recent value X(t₂) = n₂.
The Poisson process possesses the Markov property. Hence the Poisson process is a Markov process.
PROPERTY : 2 (Additive Property) The sum of two independent Poisson processes is a Poisson process.
PROOF:
Let X(t) = X₁(t) + X₂(t), where X₁(t) and X₂(t) are independent Poisson processes with parameters λ₁ and λ₂.
P[X(t) = n] = Σ_{r=0}^{n} P[X₁(t) = r] P[X₂(t) = n − r]
 = Σ_{r=0}^{n} [e^{−λ₁t} (λ₁t)^r / r!] [e^{−λ₂t} (λ₂t)^{n−r} / (n − r)!]
 = e^{−(λ₁+λ₂)t} / n! · Σ_{r=0}^{n} n! / (r!(n − r)!) (λ₁t)^r (λ₂t)^{n−r}
 = e^{−(λ₁+λ₂)t} (λ₁t + λ₂t)^n / n!    (by the binomial theorem)
 = e^{−(λ₁+λ₂)t} [(λ₁ + λ₂)t]^n / n!.
Hence X₁(t) + X₂(t) is a Poisson process with parameter (λ₁ + λ₂)t.
PROPERTY : 3 The difference of two independent Poisson processes is not a Poisson process.
PROOF: Let X(t) = X₁(t) − X₂(t).
E[X(t)] = E[X₁(t)] − E[X₂(t)] = (λ₁ − λ₂)t.
E[X²(t)] = E[X₁²(t)] + E[X₂²(t)] − 2E[X₁(t)] E[X₂(t)]
 = (λ₁t + λ₁²t²) + (λ₂t + λ₂²t²) − 2λ₁λ₂t²
 = (λ₁ + λ₂)t + (λ₁ − λ₂)²t²,
which is not of the form λt + λ²t² for any single rate λ.
Hence the difference of two Poisson processes is not a Poisson process.
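The additive property can be checked numerically: the convolution of two independent Poisson pmfs must equal the Poisson pmf with the summed mean (the rates below are sample values).

```python
import math

# Check of the additive property: convolving Poisson(l1*t) with Poisson(l2*t)
# gives Poisson((l1 + l2)*t).
def pois(n, mu):
    return math.exp(-mu) * mu ** n / math.factorial(n)

l1t, l2t = 1.2, 2.5
for n in range(8):
    conv = sum(pois(r, l1t) * pois(n - r, l2t) for r in range(n + 1))
    assert abs(conv - pois(n, l1t + l2t)) < 1e-12
print("additive property verified for n = 0..7")
```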
PROPERTY: 4 The interarrival time of a Poisson process, that is, the interval between two successive
occurrences of a Poisson process with parameter λ, has an exponential distribution with mean 1/λ.
PROOF: Let two consecutive occurrences of the event be E_i and E_{i+1}.
Let E_i take place at the time instant t_i. Let T be the interval between the occurrences of E_i and E_{i+1}.
Then T is a continuous random variable.
P(T > t) = P{E_{i+1} did not occur in (t_i, t_i + t)}
 = P{no event occurs in an interval of length t}
 = P[X(t) = 0]
 = e^{−λt}.
The c.d.f. of T is given by
F(t) = P(T ≤ t) = 1 − P(T > t) = 1 − e^{−λt},  t ≥ 0.
The p.d.f. is
f(t) = dF(t)/dt = λ e^{−λt},  t ≥ 0,
which is an exponential distribution with mean 1/λ.
E[T] = 1/λ and Var[T] = 1/λ².
Problem: 2 A machine goes out of order whenever a component fails. The failure of this part follows a
Poisson process with a mean rate of 2 per week. Find the probability that 2 weeks have elapsed since the last
failure. If there are 5 spare parts of this component in inventory and the next supply is not due for 10
weeks, find the probability that the machine will not be out of order in the next 10 weeks.
Solution: Here the unit time is one week.
The mean arrival rate is λ = 2.
By the Poisson process, P[X(t) = n] = e^{−2t} (2t)^n / n!.
(i) P[2 weeks have elapsed since the last failure] = P[X(2) = 0] = e^{−4} ≈ 0.0183.
(ii) There are only 5 spare parts, so the machine will not go out of order in the next 10 weeks only if at most
5 failures occur:
P[X(10) ≤ 5] = Σ_{n=0}^{5} e^{−20} (20)^n / n!
 = e^{−20} [1 + 20 + 20²/2! + 20³/3! + 20⁴/4! + 20⁵/5!]
 ≈ 0.0000719.
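Both answers are plain Poisson tail computations; a short sketch:

```python
import math

# Check of Problem 2: failures ~ Poisson process with rate 2 per week.
def poisson_cdf(n, mu):
    return sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(n + 1))

p_no_failure_2w = math.exp(-2 * 2)     # P[X(2) = 0]
p_ok_10w = poisson_cdf(5, 2 * 10)      # P[X(10) <= 5], only 5 spares on hand
print(round(p_no_failure_2w, 4), p_ok_10w)
```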
Problem: 3 A radioactive source emits particles at a rate of 5 per minute in accordance with a Poisson
process. Each particle emitted has a probability 0.6 of being recorded. Find the probability that 10 particles
are recorded in a 4-minute period.
Solution: The number of recorded particles N(t) follows a Poisson process with parameter λp. Here the mean
rate is λ = 5, the recording probability is p = 0.6, the time interval is t = 4 minutes, and the
number of records is n = 10.
λp = 5 × 0.6 = 3 per minute.
By the property of the Poisson process, P[N(t) = n] = e^{−λpt} (λpt)^n / n!.
P[N(4) = 10] = e^{−12} (12)^{10} / 10! ≈ 0.1048.
Problem : 4 A radioactive source emits particles at the rate of 6 per minute in a Poisson process. Each
emitted particle has a probability of 1/3 of being recorded. Find the probability that at least 5 particles are
recorded in a 5-minute period.
Solution: Here the mean rate is λ = 6 per minute, the recording probability is p = 1/3, and the time interval
is t = 5 minutes, so λpt = 6 × (1/3) × 5 = 10.
By the property of the Poisson process, P[N(t) = n] = e^{−λpt} (λpt)^n / n!.
P[N(5) ≥ 5] = 1 − P[N(5) < 5]
 = 1 − {P[N = 0] + P[N = 1] + P[N = 2] + P[N = 3] + P[N = 4]}
 = 1 − e^{−10} [1 + 10 + 10²/2! + 10³/3! + 10⁴/4!]
 = 1 − 0.0293
 = 0.9707.
Problem : 5 Customers arrive at a counter in accordance with a Poisson process with a mean rate of 2 per
minute. Find the probability that the interval between 2 consecutive arrivals is (i) more than 1 minute, (ii)
between 1 and 2 minutes, (iii) 4 minutes or less.
Solution : By the property of the Poisson process, the interval T between 2 consecutive arrivals follows an
exponential distribution with parameter λ = 2.
Now f(t) = 2e^{−2t}, t ≥ 0.
(i) P(T > 1) = ∫₁^∞ 2e^{−2t} dt = [−e^{−2t}]₁^∞ = e^{−2} ≈ 0.1353.
(ii) P(1 < T < 2) = ∫₁² 2e^{−2t} dt = [−e^{−2t}]₁² = e^{−2} − e^{−4} ≈ 0.1170.
(iii) P(T ≤ 4) = ∫₀⁴ 2e^{−2t} dt = [−e^{−2t}]₀⁴ = 1 − e^{−8} ≈ 0.9997.
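All three parts follow from the exponential c.d.f. F(t) = 1 − e^{−λt}; a short sketch:

```python
import math

# Check of Problem 5: interarrival time T ~ Exponential(lambda = 2).
lam = 2.0
def cdf(t):
    return 1 - math.exp(-lam * t)

p_more_than_1 = 1 - cdf(1)            # e^{-2}
p_between_1_and_2 = cdf(2) - cdf(1)   # e^{-2} - e^{-4}
p_at_most_4 = cdf(4)                  # 1 - e^{-8}
print(round(p_more_than_1, 4), round(p_between_1_and_2, 4), round(p_at_most_4, 4))
```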
Problem: 6 If X₁(t) and X₂(t) are two independent Poisson processes with parameters λ₁ and λ₂, show that the
conditional distribution of X₁(t) given {X₁(t) + X₂(t)} is binomial.
Solution :
P[X₁(t) = k | X₁(t) + X₂(t) = n] = P[X₁(t) = k, X₁(t) + X₂(t) = n] / P[X₁(t) + X₂(t) = n]
 = P[X₁(t) = k, X₂(t) = n − k] / P[X₁(t) + X₂(t) = n]
 = P[X₁(t) = k] P[X₂(t) = n − k] / P[X₁(t) + X₂(t) = n]    (by independence)
 = [e^{−λ₁t} (λ₁t)^k / k!] [e^{−λ₂t} (λ₂t)^{n−k} / (n − k)!] / [e^{−(λ₁+λ₂)t} {(λ₁ + λ₂)t}^n / n!]
 = n! / (k!(n − k)!) · (λ₁t)^k (λ₂t)^{n−k} / {(λ₁ + λ₂)t}^n
 = nC_k [λ₁/(λ₁ + λ₂)]^k [λ₂/(λ₁ + λ₂)]^{n−k},
which is a binomial distribution with parameters n and p = λ₁/(λ₁ + λ₂).
2. A machine goes out of order whenever a component fails. The failure of this part follows a Poisson
process with a mean rate of 1 per week. Find the probability that 2 weeks have elapsed since the last failure. If
there are 5 spare parts of this component in inventory and the next supply is not due for 10 weeks,
find the probability that the machine will not be out of order in the next 10 weeks.
Solution: Mean rate λ = 1.
(i) P[X(2) = 0] = e^{−2} ≈ 0.1353.  (ii) P[X(10) ≤ 5] = Σ_{n=0}^{5} e^{−10} (10)^n / n! ≈ 0.0671.
3. VLSI chips, essential to the running of a computer system, fail in accordance with a Poisson
distribution with the rate of one chip in about 5 weeks. If there are two spare chips on hand and if a new
supply will arrive in 8 weeks, what is the probability that during the next 8 weeks the system will be down
for a week or more, owing to the lack of chips?
Solution: λ = 1/5 per week. The system is down for a week or more only if all three chips (the one in use
plus two spares) fail within the first 7 weeks, i.e., if 3 or more failures occur in 7 weeks:
P[X(7) ≥ 3] = 1 − Σ_{n=0}^{2} e^{−1.4} (1.4)^n / n! ≈ 0.1665.
Problem: 1 Define the random telegraph process. Prove that it is stationary in the wide sense.
Solution:
It is defined as a discrete-state, continuous-parameter process {Y(t), t ≥ 0} with the state
space {−1, 1}. Assume that these two values are equally likely,
i.e., P[Y(t) = 1] = P[Y(t) = −1] = 1/2.
Assume that the number of flips N(t) from one value to the other occurring in an interval of time is
Poisson distributed with parameter λ:
P[N(t) = k] = e^{−λt} (λt)^k / k!,  k = 0, 1, 2, …
E[Y(t)] = (1)(1/2) + (−1)(1/2) = 0 for all t.
R_YY(t, t + τ) = E[Y(t) Y(t + τ)]
 = (1) P[Y(t) Y(t + τ) = 1] + (−1) P[Y(t) Y(t + τ) = −1]
 = P[Y(t) = 1, Y(t + τ) = 1] + P[Y(t) = −1, Y(t + τ) = −1] − P[Y(t) = 1, Y(t + τ) = −1] − P[Y(t) = −1, Y(t + τ) = 1].
The events [Y(t) = 1, Y(t + τ) = 1] and [Y(t) = −1, Y(t + τ) = −1] are equally likely;
similarly the events [Y(t) = 1, Y(t + τ) = −1] and [Y(t) = −1, Y(t + τ) = 1] are equally likely.
R_YY(t, t + τ) = 2 P[Y(t + τ) = 1 | Y(t) = 1] P[Y(t) = 1] − 2 P[Y(t + τ) = −1 | Y(t) = 1] P[Y(t) = 1].
We observe that [Y(t + τ) = 1 | Y(t) = 1] is equivalent to the event "an even number of flips in the
interval (t, t + τ)".
Let τ > 0.
Then P[Y(t + τ) = 1 | Y(t) = 1] = P[N(τ) is even]
 = Σ_{k even} e^{−λτ} (λτ)^k / k!
 = e^{−λτ} [1 + (λτ)²/2! + (λτ)⁴/4! + …]
 = e^{−λτ} cosh λτ,
and P[Y(t + τ) = −1 | Y(t) = 1] = P[N(τ) is odd]
 = Σ_{k odd} e^{−λτ} (λτ)^k / k!
 = e^{−λτ} [λτ + (λτ)³/3! + …]
 = e^{−λτ} sinh λτ.
R_YY(t, t + τ) = 2(1/2) e^{−λτ} cosh λτ − 2(1/2) e^{−λτ} sinh λτ
 = e^{−λτ} (cosh λτ − sinh λτ)
 = e^{−2λτ}, and for general τ, R_YY(τ) = e^{−2λ|τ|}.
Since the mean is a constant and the autocorrelation depends only on τ, the random telegraph process is
wide-sense stationary.
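The key step above, P[N(τ) even] − P[N(τ) odd] = e^{−2λτ}, can be checked by summing the Poisson series directly (λ and τ below are sample values):

```python
import math

# Check that P[N even] - P[N odd] = e^{-2*lam*tau} for N ~ Poisson(lam*tau),
# which is the ACF of the random telegraph signal.
lam, tau = 0.8, 1.7
mu = lam * tau
p_even = sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(0, 60, 2))
p_odd = sum(math.exp(-mu) * mu ** k / math.factorial(k) for k in range(1, 60, 2))
print(round(p_even - p_odd, 12), round(math.exp(-2 * mu), 12))
```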
Problem : 2 Show that the semi-random telegraph signal process is evolutionary, i.e., it is not an SSS process and it is
not a WSS process. [OR]
Prove that a random telegraph signal process Y(t) = α X(t) is a WSS process when α is a random variable
which is independent of X(t), assumes the values −1 and +1 with equal probability, and R_XX(t₁, t₂) = e^{−2λ|t₁ − t₂|}.
[OR]
Let {N(t), t ≥ 0} be a Poisson process with N(t) = total number of points in the interval
(0, t) = k, say, and X(t) = (−1)^{N(t)}. Find the ACF of X(t). Also, if P[α = 1] = P[α = −1] = 1/2 and α is
independent of N(t), find the ACF of Y(t) = α X(t) and prove that {Y(t)} is a wide-sense stationary process.
Solution:
(i) The probability law of N(t) is given by P[N(t) = k] = e^{−λt} (λt)^k / k!, k = 0, 1, 2, 3, …
The semi-random telegraph signal is X(t) = (−1)^{N(t)}, so X(t) = 1 if N(t) is even and X(t) = −1 if N(t) is odd.
Then P[X(t) = 1] = P[N(t) is even]
 = Σ_{k = 0, 2, 4, …} e^{−λt} (λt)^k / k!
 = e^{−λt} [1 + (λt)²/2! + (λt)⁴/4! + …]
 = e^{−λt} cosh λt,
and P[X(t) = −1] = P[N(t) is odd]
 = Σ_{k = 1, 3, 5, …} e^{−λt} (λt)^k / k!
 = e^{−λt} [λt + (λt)³/3! + …]
 = e^{−λt} sinh λt.
E[X(t)] = (1) e^{−λt} cosh λt + (−1) e^{−λt} sinh λt = e^{−λt} (cosh λt − sinh λt) = e^{−2λt},
which depends on t. Hence the semi-random telegraph signal process is not stationary; it is evolutionary.
As for the random telegraph process, its ACF is R_XX(t, t + τ) = e^{−2λ|τ|}.
(ii) Now let Y(t) = α X(t).
E[α] = (1)(1/2) + (−1)(1/2) = 0 and E[α²] = (1)(1/2) + (1)(1/2) = 1.
E[Y(t)] = E[α] E[X(t)] = 0, a constant.
R_YY(t, t + τ) = E[α² X(t) X(t + τ)] = E[α²] R_XX(t, t + τ) = e^{−2λ|τ|}.
Since the mean is a constant and the autocorrelation depends only on τ, {Y(t)} is a wide-sense stationary process.
Problem : 3 The autocorrelation of the random telegraph signal process is given by R(τ) = a² e^{−2λ|τ|}.
Determine the power density spectrum of the random telegraph signal.
Solution: S(ω) = ∫_{−∞}^{∞} R(τ) e^{−iωτ} dτ
 = ∫_{−∞}^{0} a² e^{2λτ} e^{−iωτ} dτ + ∫_{0}^{∞} a² e^{−2λτ} e^{−iωτ} dτ
 = a² [ e^{(2λ − iω)τ} / (2λ − iω) ]_{−∞}^{0} + a² [ e^{−(2λ + iω)τ} / −(2λ + iω) ]_{0}^{∞}
 = a² [ 1/(2λ − iω) + 1/(2λ + iω) ]
 = a² · 4λ / (4λ² + ω²).
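The closed form S(ω) = 4λa²/(4λ² + ω²) can be checked against a direct numerical Fourier integral of R(τ) (the λ, a and ω below are sample values; since R is even, the integral reduces to twice the cosine transform over (0, ∞)):

```python
import math

# Numeric Fourier check of S(w) = 4*lam*a^2 / (4*lam^2 + w^2)
# for R(tau) = a^2 * e^{-2*lam*|tau|}, using a midpoint rule.
lam, a, w = 1.5, 2.0, 3.0
h, T = 0.001, 40.0
s, tau = 0.0, h / 2
while tau < T:
    s += a * a * math.exp(-2 * lam * tau) * math.cos(w * tau) * h
    tau += h
S_numeric = 2 * s                # even integrand: double the (0, inf) part
S_formula = 4 * lam * a * a / (4 * lam * lam + w * w)
print(round(S_numeric, 4), round(S_formula, 4))
```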