PTSP Notes Unit 3

UNIT-3: RANDOM PROCESSES: TEMPORAL CHARACTERISTICS

Random processes, also called stochastic processes, deal with randomly varying time waveforms such as message signals and noise. Since complete knowledge of their origin is not available, they are described statistically: the probability distribution and probability density functions give the complete statistical characterization of random signals. A random process is a function of both the sample-space variable and time, and can be represented as {X(s,t)}.

Deterministic and Non-deterministic processes: In general, a random process may be deterministic or non-deterministic. A process is called a deterministic random process if future values of any sample function can be predicted from its past values. For example, X(t) = A sin(ω0t + θ), where the parameters A, ω0 and θ may be random variables, is a deterministic random process because future values of a sample function can be predicted from its known shape. If future values of a sample function cannot be predicted from observed past values, the process is called a non-deterministic process.

Classification of random processes: Random processes are classified into four types, based on the nature of the time variable t and the random variable X, as follows.
1. Continuous Random Process: A random process is said to be continuous if both the random variable X and time t are continuous. The below figure shows a continuous random process. The fluctuation of noise voltage in any network is a continuous random process.

2. Discrete Random Process: In a discrete random process, the random variable X takes only discrete values while time t is continuous. The below figure shows a discrete random process. A digitally encoded signal has only two discrete values, a positive level and a negative level, but time is continuous, so it is a discrete random process.

DEPT OF ECE, GPCET Page 35


PROBABILITY THEORY & STOCHASTIC PROCESSES

3. Continuous Random Sequence: A random process for which the random variable X is continuous but time t takes only discrete values is called a continuous random sequence. A continuous random signal is then defined only at discrete (sample) time instants. It is also called a discrete-time random process and can be represented as a set of random variables {X(tk)} for samples tk, k = 0, 1, 2, ….

4. Discrete Random Sequence: In a discrete random sequence both the random variable X and time t are discrete. It can be obtained by sampling and quantizing a random signal. This type of process is mostly used in digital signal processing applications. The amplitude of the sequence can be quantized into two levels or multiple levels, as shown in figures (d) and (e) below.
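As a minimal sketch (assuming NumPy; the variable names are illustrative, not from the notes), a discrete random sequence can be obtained by sampling a continuous noise signal at discrete instants tk and quantizing the amplitude to two levels:

```python
import numpy as np

rng = np.random.default_rng(0)
t_k = 0.01 * np.arange(100)            # discrete sample instants tk, k = 0, 1, 2, ...
x = rng.normal(size=t_k.size)          # continuous-amplitude noise sampled at tk
x_q = np.where(x >= 0, 1.0, -1.0)      # two-level quantization -> discrete random sequence
```

Replacing the two-level rule with a multi-level quantizer (e.g. rounding to a finite grid) gives the multi-level case mentioned above.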


Joint distribution functions of a random process: Consider a random process X(t). For a single random variable X1 = X(t1) at time t1, the cumulative distribution function is defined as FX(x1; t1) = P{X(t1) ≤ x1}, where x1 is any real number. The function FX(x1; t1) is known as the first-order distribution function of X(t). For two random variables X1 = X(t1) and X2 = X(t2) at time instants t1 and t2, the second-order joint distribution function of the random process X(t) is given by FX(x1, x2; t1, t2) = P{X(t1) ≤ x1, X(t2) ≤ x2}. In general, for N random variables X(ti) = Xi, i = 1, 2, …, N, at N time instants, the Nth-order joint distribution function of X(t) is defined as FX(x1, x2, …, xN; t1, t2, …, tN) = P{X(t1) ≤ x1, X(t2) ≤ x2, …, X(tN) ≤ xN}.

Joint density functions of a random process: The joint density functions of a random process are obtained by differentiating the corresponding distribution functions, e.g. fX(x1; t1) = dFX(x1; t1)/dx1 and fX(x1, x2; t1, t2) = ∂²FX(x1, x2; t1, t2)/∂x1∂x2.

Independent random processes: Consider a random process X(t). Let X(ti) = xi, i = 1, 2, …, N, be N random variables defined at time instants t1, t2, …, tN with density functions fX(x1; t1), fX(x2; t2), …, fX(xN; tN). If the random process X(t) is statistically independent, then the Nth-order joint density function is equal to the product of the individual density functions, i.e. fX(x1, x2, …, xN; t1, t2, …, tN)

= fX(x1; t1) fX(x2; t2) … fX(xN; tN). Similarly, the joint distribution function will be the product of the individual distribution functions.

Statistical properties of Random Processes: The following are the statistical properties of random
processes.

Stationary Processes: A random process is said to be stationary if its statistical properties, such as mean, moments and variance, do not change with time. Stationarity has different levels or orders, depending on which density functions are invariant to a shift in time.
1. First-order stationary process: A random process is said to be stationary to order one (first-order stationary) if its first-order density function does not change with a shift in time, i.e. if X(t) is a first-order stationary process then fX(x1; t1) = fX(x1; t1+Δt) for any time t1, where Δt is the shift in time. Consequently, a necessary condition for a process to be first-order stationary is that its mean value be constant at every time instant, i.e. E[X(t)] = constant.
2. Second-order stationary process: A random process is said to be stationary to order two (second-order stationary) if its second-order joint density function does not change with a shift in time, i.e. fX(x1, x2; t1, t2) = fX(x1, x2; t1+Δt, t2+Δt) for all t1, t2 and Δt. The second-order density is then a function of the time difference (t2 − t1) and not of absolute time. Note that a second-order stationary process is also a first-order stationary process. A necessary condition for second-order stationarity is that the autocorrelation function depend only on the time difference and not on absolute time: if RXX(t1, t2) = E[X(t1)X(t2)] is the autocorrelation function and τ = t2 − t1, then RXX(t1, t1+τ) = E[X(t1)X(t1+τ)] = RXX(τ), independent of absolute time t.
3. Wide-sense stationary (WSS) process: If a random process X(t) satisfies the mean and autocorrelation conditions of second-order stationarity, it is called a wide-sense (or weak-sense) stationary process. A second-order stationary process is always WSS, but the converse is not true. The conditions for a wide-sense stationary process are: 1. E[X(t)] = constant; 2. E[X(t)X(t+τ)] = RXX(τ) is independent of absolute time t.
Jointly wide-sense stationary processes: Consider two random processes X(t) and Y(t). If they are jointly WSS, their cross-correlation function is a function of the time difference τ = t2 − t1 only and not of absolute time, i.e. if RXY(t1, t2) = E[X(t1)Y(t2)] and τ = t2 − t1, then RXY(t, t+τ) = E[X(t)Y(t+τ)] = RXY(τ). Therefore the conditions for two processes to be jointly wide-sense stationary are: 1. E[X(t)] = constant; 2. E[Y(t)] = constant; 3. E[X(t)Y(t+τ)] = RXY(τ) is independent of absolute time t.
4. Strict-sense stationary (SSS) processes: A random process X(t) is said to be strict-sense stationary if its Nth-order joint density function does not change with a shift in time, i.e. fX(x1, x2, …, xN; t1, t2, …, tN) = fX(x1, x2, …, xN; t1+Δt, t2+Δt, …, tN+Δt) for all t1, t2, …, tN and Δt. A process that is stationary to all orders N = 1, 2, … is called a strict-sense stationary process. Note that an SSS process is also a WSS process, but the reverse is not true.
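As a numerical sketch of the two WSS conditions (assuming NumPy; the random-phase cosine X(t) = A cos(ω0t + Θ) with Θ uniform on (0, 2π) is a standard illustrative process, not one defined in these notes), one can estimate the ensemble mean and autocorrelation at different absolute times:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = rng.uniform(0, 2*np.pi, 200_000)   # random phase: one value per ensemble member
A, w, tau = 1.0, 2*np.pi, 0.1

def X(t):
    """Ensemble of process values at absolute time t."""
    return A * np.cos(w*t + theta)

# Condition 1: E[X(t)] is the same (here ~0) at different absolute times.
m1, m2 = X(0.3).mean(), X(0.7).mean()

# Condition 2: E[X(t)X(t+tau)] depends only on tau, not on t;
# both estimates are near (A**2/2)*cos(w*tau).
r1 = (X(0.2) * X(0.2 + tau)).mean()
r2 = (X(0.6) * X(0.6 + tau)).mean()
```

Both autocorrelation estimates agree with each other and with (A²/2)cos(ωτ), while both mean estimates are near zero, as WSS requires.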


Ergodic Theorem and Ergodic Processes: The ergodic theorem states that for a random process X(t), all time averages of a sample function x(t) are equal to the corresponding statistical (ensemble) averages of X(t), i.e. x̄ = X̄ and Rxx(τ) = RXX(τ). Random processes that satisfy the ergodic theorem are called ergodic processes.

Jointly Ergodic Processes: Let X(t) and Y(t) be two random processes with sample functions x(t) and y(t) respectively. The two random processes are said to be jointly ergodic if they are individually ergodic and their time cross-correlation function is equal to the statistical cross-correlation function, i.e. 1. x̄ = X̄ and ȳ = Ȳ; 2. Rxx(τ) = RXX(τ), Rxy(τ) = RXY(τ) and Ryy(τ) = RYY(τ).

Mean Ergodic Random Process: A random process X(t) is said to be mean ergodic if the time average of any sample function x(t) is equal to its statistical average X̄, which is constant, with probability one for all sample functions, i.e. E[X(t)] = X̄ = A[x(t)] = x̄ with probability one for all x(t).

Autocorrelation Ergodic Process: A stationary random process X(t) is said to be autocorrelation ergodic if and only if the time autocorrelation function of any sample function x(t) equals the statistical autocorrelation function of X(t), i.e. A[x(t)x(t+τ)] = E[X(t)X(t+τ)], or Rxx(τ) = RXX(τ).

Cross-Correlation Ergodic Processes: Two stationary random processes X(t) and Y(t) are said to be cross-correlation ergodic if and only if the time cross-correlation function of their sample functions x(t) and y(t) equals the statistical cross-correlation function of X(t) and Y(t), i.e. A[x(t)y(t+τ)] = E[X(t)Y(t+τ)], or Rxy(τ) = RXY(τ).
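A minimal sketch of the ergodic idea (assuming NumPy; the random-phase cosine is again an illustrative textbook process, not one defined in these notes): the time averages of a single sample function match the ensemble values E[X(t)] = 0 and RXX(τ) = (1/2)cos(ωτ):

```python
import numpy as np

rng = np.random.default_rng(2)
w, tau = 2*np.pi, 0.1
theta = rng.uniform(0, 2*np.pi)        # one fixed phase -> one sample function x(t)
t = np.arange(0, 1000, 0.01)           # long observation interval

x_t = np.cos(w*t + theta)                              # sample function x(t)
time_mean = x_t.mean()                                 # A[x(t)], time average
time_acf = np.mean(x_t * np.cos(w*(t + tau) + theta))  # A[x(t)x(t+tau)]
# Both agree with the ensemble averages: E[X(t)] = 0 and RXX(tau) = 0.5*cos(w*tau)
```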

Properties of the Autocorrelation Function: Consider a random process X(t) that is at least WSS, so that its autocorrelation is a function of the time difference τ = t2 − t1 only. The principal properties of the autocorrelation function of X(t) are:
1. RXX(−τ) = RXX(τ): the autocorrelation function is an even function of τ.
2. |RXX(τ)| ≤ RXX(0): the magnitude is maximum at τ = 0.
3. RXX(0) = E[X²(t)]: the value at the origin is the mean-square value of the process.
4. If X(t) has nonzero mean X̄ and no periodic component, then lim |τ|→∞ RXX(τ) = X̄².
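These properties can be checked numerically; a minimal sketch (assuming NumPy; the random-phase cosine process is an illustrative choice, not from the notes) estimates RXX(τ) over a range of lags and verifies the even symmetry RXX(−τ) = RXX(τ), the maximum |RXX(τ)| ≤ RXX(0), and RXX(0) = E[X²(t)]:

```python
import numpy as np

rng = np.random.default_rng(3)
theta = rng.uniform(0, 2*np.pi, 100_000)
w = 2*np.pi

def R(tau):
    """Ensemble estimate of RXX(tau) for X(t) = cos(w*t + theta), evaluated at t = 0."""
    return np.mean(np.cos(theta) * np.cos(w*tau + theta))

taus = np.linspace(-1.0, 1.0, 41)
r = np.array([R(tau) for tau in taus])
# r is even in tau, peaks at tau = 0, and R(0) ~ E[X^2] = 1/2
```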


Properties of the Cross-Correlation Function: Consider two random processes X(t) and Y(t) that are at least jointly WSS, so that the cross-correlation function is a function of the time difference τ = t2 − t1 only. Then the cross-correlation function has the following properties.
1. RXY(τ) = RYX(−τ) (symmetry property).
Proof: We know that RXY(τ) = E[X(t)Y(t+τ)] and RYX(τ) = E[Y(t)X(t+τ)]. Replacing τ by −τ gives RYX(−τ) = E[Y(t)X(t−τ)]. Let u = t − τ, so t = u + τ; then RYX(−τ) = E[Y(u+τ)X(u)] = E[X(u)Y(u+τ)] = RXY(τ). Hence RYX(−τ) = RXY(τ), as required.
2. If RXX(τ) and RYY(τ) are the autocorrelation functions of X(t) and Y(t) respectively, then the cross-correlation function satisfies the inequality |RXY(τ)|² ≤ RXX(0) RYY(0).


3. If RXX(τ) and RYY(τ) are the autocorrelation functions of X(t) and Y(t) respectively, then the cross-correlation function satisfies the inequality |RXY(τ)| ≤ (1/2)[RXX(0) + RYY(0)].

Proof: The geometric mean of two positive numbers cannot exceed their arithmetic mean. Since RXX(0) and RYY(0) are positive quantities, √(RXX(0) RYY(0)) ≤ (1/2)[RXX(0) + RYY(0)]. Combining this with property 2, |RXY(τ)| ≤ √(RXX(0) RYY(0)) ≤ (1/2)[RXX(0) + RYY(0)].

4. If two random processes X(t) and Y(t) are statistically independent and at least jointly WSS, then RXY(τ) = X̄Ȳ.
Proof: Since X(t) and Y(t) are jointly WSS, RXY(τ) = E[X(t)Y(t+τ)]. Because X(t) and Y(t) are independent, RXY(τ) = E[X(t)] E[Y(t+τ)] = X̄Ȳ.
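A short numerical sketch of the symmetry and independence properties (assuming NumPy; the three random-phase cosine processes below are illustrative, with X and Y sharing a phase and Z independent of both):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
theta = rng.uniform(0, 2*np.pi, n)       # phase shared by X and Y
phi = rng.uniform(0, 2*np.pi, n)         # independent phase for Z
w, tau = 2*np.pi, 0.15

X = lambda t: 1.0 + np.cos(w*t + theta)  # mean Xbar = 1
Y = lambda t: 2.0 + np.cos(w*t + theta)  # correlated with X, mean Ybar = 2
Z = lambda t: 2.0 + np.cos(w*t + phi)    # independent of X, mean Zbar = 2

Rxy = np.mean(X(0.0) * Y(tau))           # RXY(tau)
Ryx_neg = np.mean(Y(0.0) * X(-tau))      # RYX(-tau): equals RXY(tau) (symmetry)
Rxz = np.mean(X(0.0) * Z(tau))           # independent processes: RXZ(tau) = Xbar*Zbar = 2
```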


Covariance functions of random processes: Auto-covariance function: Consider a random process X(t) at the two time instants t and t+τ. The auto-covariance function is defined as CXX(t, t+τ) = E[(X(t) − E[X(t)])(X(t+τ) − E[X(t+τ)])] = RXX(t, t+τ) − E[X(t)] E[X(t+τ)].

At τ = 0, the auto-covariance function becomes the variance of the random process. The autocorrelation coefficient of the random process X(t) is defined as ρXX(t, t+τ) = CXX(t, t+τ) / √(CXX(t, t) CXX(t+τ, t+τ)); for a WSS process this reduces to ρXX(τ) = CXX(τ)/CXX(0).

Cross-covariance function: For two random processes X(t) and Y(t) with random variables X(t) and Y(t+τ), the cross-covariance function is defined as CXY(t, t+τ) = E[(X(t) − E[X(t)])(Y(t+τ) − E[Y(t+τ)])], or CXY(t, t+τ) = RXY(t, t+τ) − E[X(t)] E[Y(t+τ)].
If X(t) and Y(t) are jointly WSS, then CXY(τ) = RXY(τ) − X̄Ȳ. If X(t) and Y(t) are uncorrelated, then CXY(t, t+τ) = 0.
The cross-correlation coefficient of the random processes X(t) and Y(t) is defined as ρXY(τ) = CXY(τ) / √(CXX(0) CYY(0)).
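A minimal sketch (assuming NumPy; X(t) = 3 + cos(ωt + Θ) is an illustrative WSS process with mean 3 and variance 1/2, not from the notes) of estimating the auto-covariance and the autocorrelation coefficient:

```python
import numpy as np

rng = np.random.default_rng(5)
theta = rng.uniform(0, 2*np.pi, 200_000)
w = 2*np.pi
X = lambda t: 3.0 + np.cos(w*t + theta)    # WSS: mean 3, variance 1/2

def C(tau):
    """Ensemble estimate of CXX(tau) = RXX(tau) - Xbar**2."""
    return np.mean(X(0.0) * X(tau)) - np.mean(X(0.0)) * np.mean(X(tau))

var = C(0.0)              # CXX(0) = variance of the process (~0.5)
rho = C(0.25) / C(0.0)    # autocorrelation coefficient, ~cos(w*0.25) = 0 here
```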

Gaussian Random Process: Consider a continuous random process X(t), and let the N random variables X1 = X(t1), X2 = X(t2), …, XN = X(tN) be defined at time instants t1, t2, …, tN. If these random variables are jointly Gaussian for every N = 1, 2, … and for every choice of time instants t1, t2, …, tN, then X(t) is called a Gaussian random process. The Nth-order Gaussian density function is given by
fX(x1, …, xN; t1, …, tN) = exp{−(1/2)[x − X̄]^T [CX]^−1 [x − X̄]} / √((2π)^N |[CX]|),
where x = [x1, …, xN]^T is the column vector of values, X̄ is the vector of means, and [CX] is the N×N covariance matrix with elements CXX(ti, tj).
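A small sketch (assuming NumPy; the exponential covariance CXX(ti, tj) = exp(−|ti − tj|) is an assumed example, not from the notes) of drawing jointly Gaussian samples X(t1), …, X(tN) and checking their covariance:

```python
import numpy as np

rng = np.random.default_rng(6)
t = np.array([0.0, 0.5, 1.0])                  # time instants t1, t2, t3
C = np.exp(-np.abs(t[:, None] - t[None, :]))   # assumed covariance matrix CXX(ti, tj)
samples = rng.multivariate_normal(np.zeros(3), C, size=200_000)
C_hat = np.cov(samples.T)                      # sample covariance, close to C
```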


Poisson Random Process: The Poisson process X(t) is a discrete random process that represents the number of times some event has occurred as a function of time. If the number of occurrences of the event in any finite time interval is described by a Poisson distribution with average rate of occurrence λ, then the probability of exactly k occurrences over a time interval (0, t) is
P[X(t) = k] = (λt)^k e^(−λt) / k!, k = 0, 1, 2, ….

And the corresponding probability density function is
fX(x; t) = Σ (k = 0 to ∞) [(λt)^k e^(−λt) / k!] δ(x − k).
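A short sketch (assuming NumPy; k = 5 and the rate/interval values are illustrative) comparing simulated Poisson counts over (0, t) with the formula P[X(t) = k] = (λt)^k e^(−λt)/k!:

```python
import math

import numpy as np

rng = np.random.default_rng(7)
lam, t = 2.0, 3.0                            # rate lambda, observation interval (0, t)
counts = rng.poisson(lam*t, size=200_000)    # X(t): number of occurrences in (0, t)

k = 5
emp = np.mean(counts == k)                                   # empirical P[X(t) = k]
theory = (lam*t)**k * math.exp(-lam*t) / math.factorial(k)   # Poisson formula
```

The empirical frequency of exactly k occurrences matches the Poisson probability, and the mean count matches λt.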
