Note 1469965023
RANDOM PROCESSES
Introduction
In Chapter 1 we discussed random variables. A random variable is a function of the
possible outcomes of an experiment; it does not, however, involve the concept of time. In
real situations we come across many time-varying functions that are random in
nature. In electrical and electronics engineering, such functions arise as signals.
Generally, signals are classified into two types.
(i) Deterministic
(ii) Random
Both deterministic and random signals are functions of time. For a deterministic
signal it is possible to determine the value of the signal at any given time, but this is not
possible in the case of a random signal, since some element of uncertainty is always
associated with it. The probability model used for characterizing a random signal is called
a random process or stochastic process.
REMARK
i) If t is fixed, then {X(s, t)} is a random variable.
ii) If both s and t are fixed, {X(s, t)} is a number.
iii) If s is fixed, {X(s, t)} is a single time function (a sample function).
NOTATION
Hereafter we denote the random process {X(s, t)} by {X(t)}, where the index set T is assumed to
be continuous. A discrete-time process is denoted by {X(n)} or {Xn}.
CLASSIFICATION OF RANDOM PROCESSES
We can classify a random process according to the characteristics of the time t and the
random variable X = X(t), where t and x take values in the ranges
−∞ < t < ∞ and −∞ < x < ∞.
Deterministic Process
A process is called deterministic if the future value of any sample function can be predicted
from its past values.
STATIONARY PROCESS
A random process is said to be stationary if its mean, variance, moments, etc. are constant
in time. Other processes are called non-stationary.
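As an illustrative sketch (our own example, not from the text), this contrast can be checked numerically: an i.i.d. Gaussian noise sequence has time-constant ensemble mean and variance, while a random walk built from the same noise has a variance that grows with time and is therefore non-stationary.

```python
import numpy as np

# Illustrative sketch: an i.i.d. noise sequence has constant variance
# across time (stationary), while a random walk's variance grows with
# time (non-stationary). All names here are ours, not the text's.
rng = np.random.default_rng(0)
paths = rng.normal(0.0, 1.0, size=(100_000, 50))  # 100k sample paths, 50 steps
walk = np.cumsum(paths, axis=1)                    # random walk built from the noise

noise_var = paths.var(axis=0)  # ensemble variance of the noise at each time
walk_var = walk.var(axis=0)    # ensemble variance of the walk at each time

assert np.allclose(noise_var, 1.0, atol=0.05)  # constant: stationary
assert walk_var[-1] > 10 * walk_var[0]         # growing: non-stationary
```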
Example :1
Show that a first order stationary process has a constant mean.
Solution
Let us consider a random process {X(t)} at two different times t1 and t2. Then

E[X(t1)] = ∫ x f(x, t1) dx
E[X(t2)] = ∫ x f(x, t2) dx

where f(x, t2) is the first-order density of the random process at time t2.
Let t2 = t1 + C. For a first-order stationary process the density is invariant under a time shift, so f(x, t1 + C) = f(x, t1), and

E[X(t2)] = ∫ x f(x, t1 + C) dx = ∫ x f(x, t1) dx = E[X(t1)].

Thus E[X(t2)] = E[X(t1)], i.e.
Mean of the process {X(t1)} = mean of the random process {X(t2)}; the mean is constant.
Definition 2:
If the process is first order stationary, then
Mean = E[X(t)] = constant.
Second Order Stationary Process
A random process is said to be second-order stationary if its second-order density
function is invariant under a time shift:
f(x1, x2; t1, t2) = f(x1, x2; t1 + C, t2 + C) for all x1, x2 and C.
In that case E[X1²], E[X2²] and E[X1X2] do not change with time, where
X1 = X(t1) and X2 = X(t2).
Strongly Stationary Process
A random process is called a strongly stationary process or Strict Sense Stationary
Process (SSS process) if all its finite-dimensional distributions are invariant under a translation of
time 't'.
fX(x1, x2; t1, t2) = fX(x1, x2; t1+C, t2+C)
fX(x1, x2, x3; t1, t2, t3) = fX(x1, x2, x3; t1+C, t2+C, t3+C)
In general
fX(x1, x2, …, xn; t1, t2, …, tn) = fX(x1, x2, …, xn; t1 + C, t2 + C, …, tn + C) for all t1, …, tn and any real number C.
The autocovariance is
CXX(t1, t2) = E{[X(t1) − E(X(t1))][X(t2) − E(X(t2))]}
= RXX(t1, t2) − E[X(t1)] E[X(t2)].
Correlation Coefficient
The correlation coefficient of the random process {X(t)} is defined as
ρXX(t1, t2) = CXX(t1, t2) / √(Var[X(t1)] · Var[X(t2)])
where CXX(t1, t2) denotes the autocovariance.
CROSS CORRELATION
The cross correlation of the two random process {X(t)} and {Y(t)} is defined by
RXY (t1, t2) = E[X(t1) Y (t2)]
EVOLUTIONARY PROCESS
A random process that is not stationary in any sense is called an evolutionary process.
ERGODIC RANDOM PROCESS
Time Average
The time average of a random process {X(t)} is defined as
X̄T = (1/2T) ∫ from −T to T of X(t) dt
Ensemble Average
The ensemble average of a random process {X(t)} is the expected value of the random
variable X at time t
Ensemble Average = E[X(t)]
Ergodic Random Process
{X(t)} is said to be mean ergodic if
lim (T→∞) X̄T = lim (T→∞) (1/2T) ∫ from −T to T of X(t) dt = E[X(t)].
Mean Ergodic Theorem
Let {X(t)} be a random process with constant mean and let X̄T be its time average.
Then {X(t)} is mean ergodic if
lim (T→∞) Var(X̄T) = 0.
Correlation Ergodic Process
The stationary process {X(t)} is said to be correlation ergodic if the process {Y(t)} is
mean ergodic, where
Y(t) = X(t) X(t + τ).
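A minimal numerical sketch of mean ergodicity (our own illustration, assuming the model X(t) = cos(ωt + θ) with θ uniform on (0, 2π)): the time average of a single long realization should approach the ensemble mean E[X(t)] = 0.

```python
import numpy as np

# Sketch: for X(t) = cos(w*t + theta), theta ~ Uniform(0, 2*pi), the time
# average over (-T, T) of one realization approaches the ensemble mean 0,
# illustrating mean ergodicity. Parameter values are our own choices.
rng = np.random.default_rng(1)
w = 2.0
theta = rng.uniform(0.0, 2.0 * np.pi)      # one fixed realization
T = 500.0
t = np.linspace(-T, T, 1_000_001)
dt = t[1] - t[0]

# Riemann-sum approximation of (1/2T) * integral of X(t) dt over (-T, T)
time_avg = np.sum(np.cos(w * t + theta)) * dt / (2 * T)
assert abs(time_avg) < 1e-2  # close to the ensemble mean 0
```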
WORKED EXAMPLES
Example 2: Give an example of a stationary random process and justify your claim.
Solution:
Let us consider the random process X(t) = A cos(ωt + θ), where A and ω are constants and θ is a
random variable uniformly distributed in the interval (0, 2π).
Since θ is uniformly distributed in (0, 2π), we have
f(θ) = 1/2π, 0 ≤ θ ≤ 2π
     = 0, otherwise.
E[X(t)] = ∫ X(t) f(θ) dθ
= ∫ from 0 to 2π of A cos(ωt + θ) · (1/2π) dθ
= (A/2π) [sin(ωt + θ)] evaluated from 0 to 2π
= (A/2π) [sin(ωt + 2π) − sin ωt]
= (A/2π) [sin ωt − sin ωt]
= 0, a constant.
Since E[X(t)] = 0, a constant, the mean condition for stationarity is satisfied (the autocorrelation of this process can likewise be shown to depend only on the time difference), so X(t) is a stationary random process.
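The conclusion above can be spot-checked by Monte Carlo (a sketch with our own parameter values): averaging A cos(ωt + θ) over many draws of θ should give approximately 0 at every time t.

```python
import numpy as np

# Monte Carlo check: for X(t) = A*cos(w*t + theta), theta ~ Uniform(0, 2*pi),
# the ensemble mean E[X(t)] should be 0 at every t. A and w are arbitrary.
rng = np.random.default_rng(8)
A, w = 2.0, 1.5
theta = rng.uniform(0.0, 2.0 * np.pi, size=500_000)

means = [np.mean(A * np.cos(w * t + theta)) for t in (0.0, 1.0, 2.5)]
assert all(abs(m) < 0.02 for m in means)  # constant (zero) mean at each time
```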
Example 3: Examine whether the Poisson process {X(t)}, given by the probability law
P{X(t) = n} = e^(−λt) (λt)^n / n!,  n = 0, 1, 2, …,
is stationary.
Solution
We know that the mean is given by
E[X(t)] = Σ (n = 0 to ∞) n Pn(t)
= Σ (n = 0 to ∞) n e^(−λt) (λt)^n / n!
= Σ (n = 1 to ∞) e^(−λt) (λt)^n / (n − 1)!
= λt e^(−λt) Σ (n = 1 to ∞) (λt)^(n−1) / (n − 1)!
= λt e^(−λt) [ (λt)^0/0! + (λt)^1/1! + (λt)^2/2! + … ]
= λt e^(−λt) e^(λt)
= λt, which depends on t.
Hence the Poisson process is not a stationary process.
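A quick simulation (our own sketch, with an arbitrary rate λ) confirms the result: the empirical mean of X(t) grows like λt, so it cannot be constant in time.

```python
import numpy as np

# Sketch: sample Poisson counts X(t) ~ Poisson(lam*t) at several times and
# check that the empirical mean matches lam*t, i.e. it depends on t.
rng = np.random.default_rng(2)
lam = 3.0
times = np.array([1.0, 2.0, 4.0])

counts = rng.poisson(lam * times, size=(200_000, 3))  # one column per time
emp_mean = counts.mean(axis=0)
assert np.allclose(emp_mean, lam * times, rtol=0.02)  # mean grows with t
```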
CORRELATION AND SPECTRAL DENSITY
Introduction
The power spectrum of a time series x(t) describes how the variance of the data x(t) is
distributed over the frequency components into which x(t) may be decomposed. This
distribution of the variance may be described by a cumulative distribution function
S(f), the power contributed by frequencies from 0 up to f. Given a band of
frequencies [a, b), the amount of variance contributed to x(t) by frequencies lying
within the interval [a, b) is S(b) − S(a). S is then called the spectral distribution
function of x.
The spectral density at a frequency f gives the rate of variance contributed by
frequencies in the immediate neighbourhood of f to the variance of x per unit frequency.
PROPERTY: 1
The mean-square value of the random process may be obtained from the autocorrelation
function RXX(τ) by putting τ = 0:
E[X²(t)] = RXX(0).
This is known as the average power of the random process {X(t)}.
PROPERTY: 2
RXX(τ) is an even function of τ:
RXX(τ) = RXX(−τ).
PROPERTY: 3
If the process {X(t)} contains a periodic component, then its autocorrelation function RXX(τ) also contains a periodic component of the same period.
PROPERTY: 4
If a random process {X(t)} has no periodic components and E[X(t)] = μX, then
lim (|τ|→∞) RXX(τ) = μX²,  i.e.  μX = √( lim (|τ|→∞) RXX(τ) ).
That is, as |τ| → ∞, the autocorrelation function approaches the square of the mean of the random
process.
PROPERTY: 5
The auto correlation function of a random process cannot have an arbitrary shape.
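The properties above can be observed empirically (a sketch with a process of our own choosing): for the MA(1)-type stationary sequence x(n) = e(n) + e(n−1) built from unit-variance white noise, the theoretical autocorrelation is R(0) = 2, R(±1) = 1, and R(k) = 0 for |k| ≥ 2, and no lag exceeds R(0) in magnitude.

```python
import numpy as np

# Sketch: empirical autocorrelation of a simple stationary sequence,
# checking that |R(k)| <= R(0) (the shape is constrained, Property 5)
# and that the values match the known MA(1) answer R(0)=2, R(1)=1, R(k>=2)=0.
rng = np.random.default_rng(3)
e = rng.normal(size=500_000)
x = e[1:] + e[:-1]  # MA(1)-type stationary sequence

def acf(x, k):
    """Empirical R(k) = E[x(n) x(n+k)] for integer lag k (symmetric in k)."""
    k = abs(k)
    return np.mean(x[:len(x) - k] * x[k:]) if k else np.mean(x * x)

lags = range(-5, 6)
r = {k: acf(x, k) for k in lags}
assert abs(r[0] - 2.0) < 0.05                        # R(0) = variance = 2
assert abs(r[1] - 1.0) < 0.05                        # R(1) = 1
assert abs(r[3]) < 0.05                              # R(3) = 0
assert all(abs(r[k]) <= r[0] + 1e-9 for k in lags)   # maximum at lag 0
```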
Auto Covariance
The auto covariance of the process {X(t)} denoted by CXX(t1, t2) or C(t1, t2) is defined as
CXX(t1, t2) = E{[X(t1) − E(X(t1))][X(t2) − E(X(t2))]}
CORRELATION COEFFICIENT
ρXX(t1, t2) = CXX(t1, t2) / √(Var[X(t1)] · Var[X(t2)])
Where CXX(t1, t2) denotes the auto covariance.
CROSS CORRELATION
Cross correlation between the two random process {X(t)} and {Y(t)} is defined as
RXY (t1, t2) = E[X(t1) Y(t2)] where X(t1) Y(t2) are random variables.
CROSS COVARIANCE
Let {X(t)} and {Y(t)} be any two random processes. Then the cross covariance is defined
as
CXY(t1, t2) = E{[X(t1) − E(X(t1))][Y(t2) − E(Y(t2))]}.
The relation between the cross correlation and the cross covariance is as follows:
CXY(t1, t2) = RXY(t1, t2) − E[X(t1)] E[Y(t2)].
Definition
Two random processes {X(t)} and {Y(t)} are said to be uncorrelated if
CXY(t1, t2) = 0 for all t1, t2.
Hence, from the above relation, for uncorrelated processes we have
RXY(t1, t2) = E[X(t1)] E[Y(t2)].
CROSS CORRELATION COEFFICIENT
ρXY(t1, t2) = CXY(t1, t2) / √(Var[X(t1)] · Var[Y(t2)])
CROSS CORRELATION AND ITS PROPERTIES
Let {X(t)} and {Y(t)} be two random processes. Then the cross correlation between them is also
defined as
RXY(t, t + τ) = E[X(t) Y(t + τ)]
= RXY(τ).
PROPERTY : 1
RXY(τ) = RYX(−τ)
PROPERTY : 2
If {X(t)} and {Y(t)} are two random processes, then |RXY(τ)| ≤ √(RXX(0) RYY(0)), where
RXX(τ) and RYY(τ) are their respective autocorrelation functions.
PROPERTY : 3
If {X(t)} and {Y(t)} are two random processes, then
|RXY(τ)| ≤ (1/2)[RXX(0) + RYY(0)].
SPECTRAL REPRESENTATION
Let x(t) be a deterministic signal. The Fourier transform of x(t) is defined as
X(ω) = F[x(t)] = ∫ from −∞ to ∞ of x(t) e^(−iωt) dt.
Here X(ω) is called the "spectrum of x(t)".
Hence x(t) = inverse Fourier transform of X(ω)
= (1/2π) ∫ from −∞ to ∞ of X(ω) e^(iωt) dω.
Definition
The average power P(T) of x(t) over the interval (−T, T) is given by
P(T) = (1/2T) ∫ from −T to T of x²(t) dt
     = (1/2π) ∫ from −∞ to ∞ of (|XT(ω)|² / 2T) dω   (1)
where XT(ω) is the Fourier transform of x(t) truncated to (−T, T).
Definition
The average power PXX for the random process {X(t)} is given by
PXX = lim (T→∞) (1/2T) ∫ from −T to T of E[X²(t)] dt
    = (1/2π) ∫ from −∞ to ∞ of lim (T→∞) (E[|XT(ω)|²] / 2T) dω.
The autocorrelation is recovered from the spectral density by
RXX(τ) = ∫ from −∞ to ∞ of SXX(f) e^(i2πfτ) df
[inverse Fourier transform of SXX(f)].
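The power relation can be illustrated for a sampled realization (our own discrete-time sketch): by Parseval's theorem the average power computed in the time domain equals the mean of the periodogram |X[k]|²/N over frequency bins, mirroring PXX as an integral of the spectral density.

```python
import numpy as np

# Sketch of the power relation for a sampled signal: average power in the
# time domain equals the frequency-domain average of |X[k]|^2 / N
# (Parseval's theorem), the discrete analogue of integrating the PSD.
rng = np.random.default_rng(4)
x = rng.normal(size=4096)  # one realization of a white-noise process

power_time = np.mean(x ** 2)                       # (1/N) * sum of x[n]^2
X = np.fft.fft(x)
power_freq = np.mean(np.abs(X) ** 2) / len(x)      # mean periodogram value

assert np.isclose(power_time, power_freq)          # equal up to float error
```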
TUTORIAL QUESTIONS
1. Find the ACF of {Y(t)} = A X(t) cos(ω0 t + θ), where X(t) is a zero-mean stationary random
process with ACF RXX(τ), A and ω0 are constants, and θ is uniformly distributed over (0, 2π) and
independent of X(t).
2. Find the ACF of the periodic time function X(t) = A sin ωt.
3. If X(t) is a WSS process and Y(t) = X(t + a) − X(t − a), prove that
RYY(τ) = 2RXX(τ) − RXX(τ + 2a) − RXX(τ − 2a).
4. If X(t) = A sin(ωt + θ), where A and ω are constants and θ is a random variable uniformly
distributed over (−π, π), find the ACF of {Y(t)}, where Y(t) = X²(t).
5. Let X(t) and Y(t) be defined by X(t) = A cos ωt + B sin ωt and Y(t) = B cos ωt − A sin ωt,
where ω is a constant and A and B are independent random variables, both having zero mean and
variance σ². Find the cross correlation of X(t) and Y(t). Are X(t) and Y(t) jointly WSS
processes?
6. Two random processes X(t) and Y(t) are given by X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ),
where A and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation of X(t) and Y(t) and verify that RXY(τ) = RYX(−τ).
7. If U(t) = X cos t + Y sin t and V(t) = Y cos t + X sin t, where X and Y are independent random
variables such that E(X) = 0 = E(Y), E[X²] = E[Y²] = 1, show that U(t) and V(t) are not jointly
WSS but are individually stationary in the wide sense.
8. Random processes X(t) and Y(t) are defined by X(t) = A cos(ωt + θ), Y(t) = B cos(ωt + θ),
where A, B and ω are constants and θ is uniformly distributed over (0, 2π). Find the cross
correlation and show that X(t) and Y(t) are jointly WSS.
WORKED-OUT EXAMPLES
Example 1. Check whether the following functions are valid autocorrelation functions:
(i) 5 sin nτ
(ii) 1/(1 + 9τ²)
Solution:
(i) Given RXX(τ) = 5 sin nτ.
RXX(−τ) = 5 sin n(−τ) = −5 sin nτ.
Since RXX(τ) ≠ RXX(−τ), the function is not even in τ, so it is not a valid autocorrelation function.
(ii) Given RXX(τ) = 1/(1 + 9τ²).
RXX(−τ) = 1/(1 + 9(−τ)²) = 1/(1 + 9τ²) = RXX(τ).
The function is even in τ, so it can be an autocorrelation function.
Example : 2
Find the mean and variance of a stationary random process whose autocorrelation
function is given by
RXX(τ) = 18 + 2/(6 + τ²).
Solution
μX² = lim (|τ|→∞) RXX(τ)
    = lim (|τ|→∞) [18 + 2/(6 + τ²)]
    = 18 + 0
    = 18,
so E[X(t)] = μX = √18.
We know that
E[X²(t)] = RXX(0) = 18 + 2/6 = 55/3.
Var{X(t)} = E[X²(t)] − {E[X(t)]}²
          = 55/3 − 18
          = 1/3.
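The arithmetic of Example 2 can be verified with exact rational arithmetic (our own check, using Python's standard `fractions` module):

```python
from fractions import Fraction

# Exact check of Example 2: R(tau) = 18 + 2/(6 + tau^2).
# As |tau| -> infinity the 2/(6 + tau^2) term vanishes, so mu^2 = 18;
# R(0) = 55/3 is the mean-square value, and the variance is 1/3.
def R(tau):
    return Fraction(18) + Fraction(2) / (6 + Fraction(tau) ** 2)

mean_sq = 18                     # lim R(tau) as |tau| -> infinity
second_moment = R(0)             # E[X^2] = R(0)
variance = second_moment - mean_sq

assert second_moment == Fraction(55, 3)
assert variance == Fraction(1, 3)
assert abs(R(1000) - 18) < Fraction(1, 100_000)  # R(tau) -> 18 for large tau
```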
Example : 3
Express the autocorrelation function of the process {X′(t)} in terms of the autocorrelation
function of the process {X(t)}.
Solution
Consider RXX′(t1, t2) = E[X(t1) X′(t2)]
= E[ X(t1) lim (h→0) (X(t2 + h) − X(t2)) / h ]
= lim (h→0) E[ (X(t1) X(t2 + h) − X(t1) X(t2)) / h ]
= lim (h→0) [RXX(t1, t2 + h) − RXX(t1, t2)] / h
= ∂/∂t2 RXX(t1, t2).   (1)
Similarly,
RX′X′(t1, t2) = ∂/∂t1 RXX′(t1, t2)
             = ∂²/(∂t1 ∂t2) RXX(t1, t2), by (1).
Example :4
Two random processes {X(t)} and {Y(t)} are given by
X(t) = A cos(ωt + θ), Y(t) = A sin(ωt + θ), where A and ω are constants and θ is a uniform
random variable over 0 to 2π. Find the cross correlation function.
Solution
By definition, we have
RXY(τ) = RXY(t, t + τ)
Now, RXY(t, t + τ) = E[X(t) Y(t + τ)]
= E[A cos(ωt + θ) · A sin(ω(t + τ) + θ)]
= A² E[sin(ω(t + τ) + θ) cos(ωt + θ)]   (1)
Since θ is a uniformly distributed random variable, we have
f(θ) = 1/2π, 0 ≤ θ ≤ 2π.
Now E[sin(ω(t + τ) + θ) cos(ωt + θ)]
= (1/2π) ∫ from 0 to 2π of sin(ω(t + τ) + θ) cos(ωt + θ) dθ
= (1/2π) ∫ from 0 to 2π of (1/2)[sin(2ωt + ωτ + 2θ) + sin ωτ] dθ
[using sin A cos B = (1/2)[sin(A + B) + sin(A − B)]]
= (1/4π) [ −cos(2ωt + ωτ + 2θ)/2 + θ sin ωτ ] evaluated from 0 to 2π
= (1/4π) [ (cos(2ωt + ωτ) − cos(2ωt + ωτ + 4π))/2 + 2π sin ωτ ]
= (1/4π) [0 + 2π sin ωτ]
= (1/2) sin ωτ.   (3)
Substituting (3) in (1), we get
RXY(t, t + τ) = (A²/2) sin ωτ.
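The closed-form result of Example 4 can be checked by Monte Carlo (a sketch with our own arbitrary choices of A, ω, t and τ):

```python
import numpy as np

# Monte Carlo check of Example 4: with X(t) = A*cos(w*t + theta) and
# Y(t) = A*sin(w*t + theta), theta ~ Uniform(0, 2*pi), the cross-correlation
# R_XY(tau) should equal (A^2 / 2) * sin(w * tau).
rng = np.random.default_rng(5)
A, w, t, tau = 2.0, 1.5, 0.8, 0.6
theta = rng.uniform(0.0, 2.0 * np.pi, size=1_000_000)

X = A * np.cos(w * t + theta)
Y = A * np.sin(w * (t + tau) + theta)
r_emp = np.mean(X * Y)                      # empirical E[X(t) Y(t+tau)]
r_theory = (A ** 2 / 2) * np.sin(w * tau)   # closed-form result

assert abs(r_emp - r_theory) < 0.02
```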
It is true that if X and Y are independent, then Cov(X, Y) = 0. But if Cov(X, Y) = 0, it cannot
immediately be concluded that X and Y are independent.
Proof (counterexample):
Let Z be a standard normal random variable, and take X = Z and Y = Z². Then
Cov(X, Y) = E[XY] − E[X] E[Y] = E[Z³] − E[Z] E[Z²] = E[Z³].
To see that E[Z³] = 0, notice that z³ is an odd function and e^(−z²/2) is an even function, so
z³ e^(−z²/2) is an odd function and its integral over the whole real line vanishes. Hence
Cov(X, Y) = 0 − 0 = 0, even though Y is completely determined by X, so X and Y are not
independent.
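A numerical version of the counterexample (our own sketch): with Z standard normal, X = Z and Y = Z² have sample covariance near zero while Y remains a deterministic function of X.

```python
import numpy as np

# Numerical counterexample: X = Z, Y = Z^2 with Z ~ N(0, 1).
# Cov(X, Y) = E[Z^3] = 0, yet Y is fully determined by X.
rng = np.random.default_rng(6)
z = rng.normal(size=1_000_000)
x, y = z, z ** 2

cov = np.mean(x * y) - np.mean(x) * np.mean(y)  # sample Cov(X, Y)
assert abs(cov) < 0.02          # uncorrelated ...
assert np.all(y == x ** 2)      # ... but completely dependent
```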
Discrete time processes and sequences
Consider the process x(n) = A cos(nω0), where the amplitude A is a random variable that assumes
any integer value between one and six, each with equal probability Pr(A = k) = 1/6 (k = 1, 2, …, 6).
This random process consists of an ensemble of six different discrete-time signals
x1(n) = cos(nω0), x2(n) = 2 cos(nω0), …, x6(n) = 6 cos(nω0), each of which shows up with equal
probability.
The discrete-time version of a continuous-time signal x(t) is here denoted by x[n], which
corresponds to x(t) sampled at the time instant t = nT. The sampling rate is Fs = 1/T in Hz, and T
is the sampling interval in seconds. The index n can be interpreted as time normalized to the
sampling rate. As in the deterministic case, it is in principle possible to reconstruct a bandlimited
continuous-time stochastic process x(t) from its samples {x[n]}, n = −∞, …, ∞. In the random case, the
reconstruction must be given a more precise statistical meaning, such as convergence in the
mean-square sense. To simplify matters, it will throughout be assumed that all processes are
(wide-sense) stationary, real valued and zero mean, i.e. E{x[n]} = 0. The autocorrelation
function is then defined as
rx[k] = E{x[n] x[n − k]}.
The cross-correlation rxy[k] = E{x[n] y[n + k]} measures how related the two processes are. This is useful, for
example, for determining how well one can predict a desired but unmeasurable signal y[n] from
the observed x[n]. If rxy[k] = 0 for all k, we say that x[n] and y[n] are uncorrelated. It should be
noted that the cross-correlation function is not necessarily symmetric, and it does not necessarily
have its maximum at k = 0. For example, if y[n] is a time-delayed version of x[n], y[n] = x[n − l],
then rxy[k] peaks at lag k = l. Thus, the cross-correlation can be used to estimate the time delay
between two measurements of the same signal, perhaps taken at different spatial locations. This
can be used, for example, for synchronization in a communication system.
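The delay-estimation idea can be sketched in a few lines (our own example; signal length, delay and noise level are arbitrary choices): the empirical cross-correlation of x[n] with a delayed, noisy copy peaks at the true lag.

```python
import numpy as np

# Sketch: estimate the delay l between x[n] and y[n] = x[n - l] + noise by
# locating the peak of the empirical cross-correlation
# r_xy[k] = E{x[n] y[n + k]} (the convention used in the text above).
rng = np.random.default_rng(7)
n, true_delay = 10_000, 37
x = rng.normal(size=n)
y = np.roll(x, true_delay) + 0.1 * rng.normal(size=n)  # y[n] = x[n - l] + noise

max_lag = 100
lags = np.arange(-max_lag, max_lag + 1)
# np.roll(y, -k)[m] = y[m + k] (circularly), so this averages x[n] * y[n + k]
r_xy = np.array([np.mean(x * np.roll(y, -k)) for k in lags])

est_delay = lags[np.argmax(r_xy)]
assert est_delay == true_delay  # the peak sits at lag k = l
```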