
Digital Communication

Unit No. 3
Random Processes

Dr. Suvarna S. Patil


Professor in E&TC Engg., BVCOEW, Pune-43
Mail id: [email protected]
Unit 3: Contents
• Introduction
• Mathematical definition of a random process.
• Stationary processes.
• Mean, Correlation and Covariance Functions
• Ergodic process
• Transmission of a random process through an LTI filter
• Power spectral density (PSD)
• Gaussian process
• Narrow band Noise
• In-phase and Quadrature components of noise



References

• T1: Simon Haykin, Digital Communication Systems, 4th Ed.
• R3: B. P. Lathi, Modern Digital & Analog Communication Systems, 4th Ed.


Unit 3: Objectives
• To understand random signals and random processes.
• To define and calculate the mean, correlation and covariance functions of a random process.
• To describe and define various types of random processes, e.g. ergodic and Gaussian processes.
• To understand transmission of a random process through a linear filter.
• To understand the concept of narrowband noise and study its in-phase and quadrature components.


Pre-requisites
• Signals and their classification.
• Concepts of deterministic and random processes: stationarity, ergodicity.
• Basic properties of a single random process: mean, standard deviation, autocorrelation, spectral density.
• Joint properties of two or more random processes: correlation, covariance, cross spectral density, simple input-output relations.
• LTI systems.
• Noise and its types.
Importance of Random Processes
• Random variables and random processes describe quantities and signals that are not known in advance.
• The data sent through a communication system is modeled as a random variable.
• The noise, interference, and fading introduced by the channel can all be modeled as random processes.
• Even the measure of performance (probability of bit error) is expressed in terms of a probability.


Deterministic and random processes :
• Both are (usually) continuous functions of time.

• Deterministic processes:
  the physical process is represented by an explicit mathematical relation.
• Random processes:
  the result of a large number of separate causes; described in probabilistic terms and by properties which are averages.


Random Processes: significance
• In practical problems, we deal with time-varying waveforms whose value at any given time is random in nature. Randomness, or unpredictability, is a fundamental property of information.
For example,
• the speech waveform recorded by a microphone,
• the signal received by a communication receiver,
• the temperature of a certain city at noon, or
• the daily record of stock-market data
represent random variables that change with time.
• How do we characterize such data? Such data are characterized as random or stochastic processes.
Random Processes: significance contd.

• A random variable maps each sample point in the sample space to a point on the real line.

• A random process maps each sample point to a waveform.

• Thus a random process is a function of the sample point 's' and the index variable 't', and may be written as X(t, s).
• For fixed t = t0, X(t0, s) is a random variable.


e.g. Temperature records for the day
[Figure: an ensemble of sample functions X(t, ω1), X(t, ω2), …, X(t, ωk), …, X(t, ωn) plotted against time t, with observation instants t1 and t2 marked.]
Ensemble and sample function
• The collection of all possible waveforms is known as the ensemble (the counterpart of the sample space of a random variable).

• Each waveform in the collection is a sample function (the counterpart of a sample point).

• The amplitudes of all the sample functions at t = t0 give the ensemble statistics at that instant (see the sketch below).
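A minimal numerical sketch of the ensemble idea (Python with NumPy is assumed here; the slides contain no code, and the process below is a made-up example): each row is one sample function, and fixing t = t0 picks out a random variable whose ensemble statistics can be estimated.

import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 201)          # time axis
n_samples = 1000                        # number of sample functions in the ensemble

# Hypothetical process: X(t, s) = A(s) * cos(2*pi*5*t) + N(t, s), with A ~ N(0, 1) and additive noise
A = rng.standard_normal((n_samples, 1))
noise = 0.2 * rng.standard_normal((n_samples, t.size))
ensemble = A * np.cos(2 * np.pi * 5.0 * t) + noise   # one row = one sample function

# Fixing t = t0 picks one column: a random variable described by ensemble statistics
i0 = 50                                  # index of t0
x_t0 = ensemble[:, i0]
print("t0 =", t[i0])
print("ensemble mean at t0:", x_t0.mean())
print("ensemble std  at t0:", x_t0.std())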


Classification of random processes

• Stationary and Nonstationary

• Wide-Sense or Weakly Stationary

• Ergodic



Strictly Stationary random process:
• Ensemble averages do not vary with time.
• The statistical characterization of the process is time invariant.
• The joint PDFs obtained at any set of instants must be unchanged by a shift of the time origin.
• The autocorrelation function must satisfy

  R_X(t1, t2) = R_X(t2 − t1)


Wide-sense Stationary (WSS) random process:

• The mean value is constant: μ_X(t) = μ_X.

• The autocorrelation function is independent of time shifts:

  R_X(t1, t2) = R_X(t2 − t1) = R_X(τ)

• All strictly stationary processes are wide-sense stationary, but the converse is not true.
Stationarity
[Figure slide]


Ergodic random process:
• Ensemble averages are equal to time averages of any sample function:

  E[X(t)] = ⟨x(t)⟩ = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt = μ_X

• A stationary process in which averages obtained from a single record are the same as those obtained by averaging over the ensemble.
• Most stationary random processes can be treated as ergodic.
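A rough check of ergodicity in the mean (a sketch assuming NumPy; the random-phase cosine is a made-up but standard example): the time average of one long record and the ensemble average at a fixed instant should both come out near zero.

import numpy as np

rng = np.random.default_rng(1)
fs, T = 100.0, 200.0                    # assumed sampling rate and record length
t = np.arange(0.0, T, 1.0 / fs)

# Random-phase cosine: a classic WSS process that is ergodic in the mean
def sample_function(rng, t):
    theta = rng.uniform(0.0, 2.0 * np.pi)        # one random phase per sample function
    return np.cos(2.0 * np.pi * 1.0 * t + theta)

# Time average over a single record
x = sample_function(rng, t)
time_avg = x.mean()

# Ensemble average at one instant t0 over many sample functions
t0 = 3.7
ensemble_avg = np.mean([sample_function(rng, np.array([t0]))[0] for _ in range(5000)])

print("time average    :", time_avg)       # both should be close to 0
print("ensemble average:", ensemble_avg)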
Terminology Describing Random Processes
• A stationary random process has statistical properties which do not change with time.
• A wide-sense stationary (WSS) process has a mean and an autocorrelation function which do not change with time.
• A random process is ergodic if the time average converges to the statistical (ensemble) average.
• Unless specified, we will assume that all random processes are WSS and ergodic.
Ergodic
[Figure slide]


Mean, Correlation and Covariance Functions
• Mean value:

[Figure: a sample function x(t) fluctuating about its mean level μ_x over a record of length T.]

  μ_X(t) = E[X(t)] = ∫_{−∞}^{∞} x f_{X(t)}(x) dx

  where f_{X(t)}(x) is the first-order PDF of the process at time t.


Mean, Correlation and Covariance Functions
• The mean value of a stationary random process is a constant: μ_X(t) = μ_X.

• The autocorrelation function of a random process X(t) is given as

  R_X(t1, t2) = E[X(t1) X(t2)]


• Autocorrelation:

[Figure: a sample function x(t) over a record of length T, with its values compared at times t and t + τ.]

• The autocorrelation (or autocovariance) describes the general dependency of x(t) on its value a short time later, x(t + τ).


Autocorrelation properties

1. Symmetry: R_X(τ) = R_X(−τ)

2. Power of a W.S.S. process: R_X(0) = E[X²(t)], the mean-square value (average power)

3. Maximum value: |R_X(τ)| ≤ R_X(0)
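These properties can be checked numerically on a sampled record; a small sketch assuming NumPy, using a white Gaussian record as a stand-in for a WSS process and a biased time-average estimate of R_X.

import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal(200_000)                 # zero-mean record (made-up example)

def autocorr(x, max_lag):
    # Biased time-average estimate R(k) = (1/N) * sum_n x[n] x[n+k]
    N = x.size
    return np.array([np.dot(x[:N - k], x[k:]) / N for k in range(max_lag + 1)])

R = autocorr(x, 20)
print("R(0) vs E[X^2]   :", R[0], np.mean(x**2))      # property 2: power
print("max at zero lag? :", np.all(np.abs(R) <= R[0])) # property 3: |R(k)| <= R(0)
# Property 1 (symmetry) holds by definition for a real WSS process: R(-k) = R(k).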


Autocorrelation function of slowly and rapidly fluctuating random processes
[Figure: autocorrelation functions of a slowly and a rapidly fluctuating process, illustrating the correlation (decorrelation) time concept.]
Mean, Correlation and Covariance Functions
• The autocovariance function of a stationary random process X(t) is given as

  C_X(t1, t2) = E[(X(t1) − μ_X)(X(t2) − μ_X)]
              = R_X(t2 − t1) − μ_X²


Probable Questions on this topic
• Numericals based on calculation of the expectation of a given random process.
• Properties of the autocorrelation function.
• Find the mean, autocorrelation and covariance of a given random process (a worked sketch follows below).
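As a worked sketch of the last question (assuming NumPy, with made-up amplitude and frequency), take X(t) = A cos(2π f0 t + Θ) with Θ uniform on [0, 2π). Analytically μ_X = 0 and R_X(τ) = (A²/2) cos(2π f0 τ), and since the mean is zero the autocovariance equals the autocorrelation; the Monte Carlo estimates below should agree with these values.

import numpy as np

rng = np.random.default_rng(3)
A, f0 = 2.0, 5.0                       # assumed amplitude and frequency
t1 = 1.0                               # reference instant
n_trials = 200_000

theta = rng.uniform(0.0, 2.0 * np.pi, n_trials)
x_t1 = A * np.cos(2 * np.pi * f0 * t1 + theta)
print("estimated mean:", x_t1.mean(), "(theory: 0)")

for tau in np.linspace(0.0, 0.4, 9):
    x_t2 = A * np.cos(2 * np.pi * f0 * (t1 + tau) + theta)
    R_hat = np.mean(x_t1 * x_t2)                       # ensemble-average autocorrelation
    R_theory = (A**2 / 2) * np.cos(2 * np.pi * f0 * tau)
    print(f"tau={tau:.2f}  R_hat={R_hat:+.3f}  theory={R_theory:+.3f}")
# Since the mean is zero, the autocovariance C_X(tau) equals R_X(tau).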


Cross-correlation function of two random processes X(t) and Y(t):

  R_XY(t1, t2) = E[X(t1) Y(t2)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 y2 f_{XY}(x1, y2) dx1 dy2


Jointly Stationary Properties

• Uncorrelated: the cross-covariance is zero, C_XY(t1, t2) = 0, i.e. R_XY(t1, t2) = μ_X μ_Y.

• Orthogonal: the cross-correlation is zero, R_XY(t1, t2) = 0 (see the sketch below).
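A quick numerical illustration of the two definitions (a sketch assuming NumPy, with made-up independent Gaussian samples): with zero means the cross-correlation is near zero (orthogonal); after adding nonzero means, R_XY ≈ μ_X μ_Y while the cross-covariance stays near zero (uncorrelated but no longer orthogonal).

import numpy as np

rng = np.random.default_rng(4)
N = 500_000
x = rng.standard_normal(N)             # zero-mean samples of X at some instant
y = rng.standard_normal(N)             # independent of x

print("R_xy (zero means)   :", np.mean(x * y))                       # ~0 -> orthogonal
y2 = y + 3.0                                                          # give Y a mean of 3
x2 = x + 2.0                                                          # give X a mean of 2
print("R_xy (means 2 and 3):", np.mean(x2 * y2))                      # ~6 = mu_x * mu_y
print("C_xy (means 2 and 3):", np.mean((x2 - 2.0) * (y2 - 3.0)))      # ~0 -> uncorrelated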


If the cross-correlation is zero at τ = 0, i.e. R_XY(0) = 0, then the random variables X(t) and Y(t) are orthogonal.


Spectral density
• The power spectral density (PSD) S_X(f) of a WSS process and its autocorrelation function R_X(τ) form a Fourier-transform pair (the Einstein–Wiener–Khintchine relations):

  S_X(f) = ∫_{−∞}^{∞} R_X(τ) e^{−j2πfτ} dτ
  R_X(τ) = ∫_{−∞}^{∞} S_X(f) e^{j2πfτ} df

• The total area under S_X(f) equals the average power E[X²(t)] = R_X(0).
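A sketch of the area property just stated (assuming NumPy/SciPy, with a made-up coloured-noise process and Welch's method as the PSD estimator): the area under the estimated S_X(f) should match the average power R_X(0) = E[X²(t)].

import numpy as np
from scipy import signal

rng = np.random.default_rng(5)
fs = 100.0                                   # assumed sampling rate, Hz
white = rng.standard_normal(400_000)
b, a = signal.butter(4, 10.0, fs=fs)         # assumed low-pass filter used to colour the noise
x = signal.lfilter(b, a, white)

# Welch estimate of the one-sided PSD S_X(f)
f, Sxx = signal.welch(x, fs=fs, nperseg=1024)

# Sanity check: area under the PSD equals the average power R_X(0) = E[X^2(t)]
power_from_psd = np.sum(Sxx) * (f[1] - f[0])
power_in_time = np.mean(x**2)
print("area under S_X(f):", power_from_psd)
print("E[X^2(t)]        :", power_in_time)   # the two should be close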


Transmission of a random process through a linear (LTI) filter
• Let a WSS random process X(t) be applied to an LTI filter with impulse response h(t), producing the output random process Y(t):

  Y(t) = ∫_{−∞}^{∞} h(τ) X(t − τ) dτ

• The mean of the output Y(t) is given as

  μ_Y(t) = E[Y(t)]
         = E[ ∫_{−∞}^{∞} h(τ) X(t − τ) dτ ]
         = ∫_{−∞}^{∞} h(τ) E[X(t − τ)] dτ
         = μ_X ∫_{−∞}^{∞} h(τ) dτ
         = μ_X H(0)

  where H(0) is the zero-frequency (dc) response of the filter.
Points to remember

• The mean of the random process Y(t) produced at the output of an LTI system in response to an input random process X(t) equals the mean of X(t) multiplied by the dc response H(0) of the system.
• If the input X(t) is WSS, the mean of Y(t) is a constant and the autocorrelation function of Y(t) depends only on the time difference τ; hence Y(t) is also WSS.


Filtering of random signals
[Figure slides]
• Key relation: if a WSS process X(t) with PSD S_X(f) is applied to an LTI filter with frequency response H(f), the PSD of the output is

  S_Y(f) = |H(f)|² S_X(f)

(See the sketch below.)
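A sketch (assuming NumPy/SciPy, with a made-up filter and input) checking the two relations above: the output mean equals μ_X H(0), and the output PSD is approximately |H(f)|² times the input PSD.

import numpy as np
from scipy import signal

rng = np.random.default_rng(6)
fs = 100.0                                         # assumed sampling rate, Hz
mu_x = 1.5                                         # assumed input mean
x = mu_x + rng.standard_normal(400_000)            # WSS input: constant mean + white noise

b, a = signal.butter(2, 15.0, fs=fs)               # assumed LTI filter
y = signal.lfilter(b, a, x)

# Mean relation: mu_Y = mu_X * H(0)
_, H0 = signal.freqz(b, a, worN=[0.0], fs=fs)
print("mu_Y (measured):", y[2000:].mean())          # skip the start-up transient
print("mu_X * H(0)    :", mu_x * np.abs(H0[0]))

# PSD relation: S_Y(f) = |H(f)|^2 * S_X(f)
f, Sxx = signal.welch(x - mu_x, fs=fs, nperseg=1024)
_, Syy = signal.welch(y[2000:] - y[2000:].mean(), fs=fs, nperseg=1024)
_, Hf = signal.freqz(b, a, worN=f, fs=fs)
for i in (5, 60, 154):                              # a few frequency bins
    print(f"f={f[i]:6.2f} Hz  Syy/Sxx={Syy[i] / Sxx[i]:.3f}  |H(f)|^2={np.abs(Hf[i])**2:.3f}")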
Gaussian process
• A stochastic process is said to be a normal or Gaussian process if its joint probability distribution is normal.

• A wide-sense stationary Gaussian process is also strictly stationary, because the normal distribution is completely characterized by its first two moments.

• First-order Gaussian PDF, and its normalized (zero-mean, unit-variance) form:

  f_Y(y) = (1 / (√(2π) σ_Y)) exp( −(y − μ_Y)² / (2σ_Y²) ),    f_Y(y) = (1 / √(2π)) exp( −y² / 2 )


• Central limit theorem
  – The sum of a large number of independent and identically distributed (i.i.d.) random variables gets closer and closer to a Gaussian distribution.
• Thermal noise can be closely modeled by a Gaussian process.

• More precisely, the theorem states that the probability distribution of the (suitably normalized) sum of N i.i.d. random variables approaches a normalized Gaussian distribution in the limit as N approaches infinity (see the sketch below).
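A small numerical illustration (assuming NumPy; the uniform distribution and trial counts are arbitrary choices): the normalized sum of N i.i.d. uniform variables behaves more and more like a standard Gaussian, for which P(|Z| < 1) ≈ 0.683.

import numpy as np

rng = np.random.default_rng(7)
n_trials = 100_000

for N in (1, 2, 10, 50):
    # Sum of N i.i.d. uniform(-0.5, 0.5) variables, normalized to unit variance
    s = rng.uniform(-0.5, 0.5, size=(n_trials, N)).sum(axis=1)
    z = s / np.sqrt(N / 12.0)                       # variance of one uniform term is 1/12
    # Fraction of outcomes within one standard deviation; the Gaussian value is ~0.6827
    print(f"N={N:3d}  P(|Z|<1) = {np.mean(np.abs(z) < 1.0):.4f}")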


Gaussian Random Processes
• Gaussian random processes have some special properties:

  – If a Gaussian random process is wide-sense stationary, then it is also strictly stationary.
  – If the input to a linear system is a Gaussian random process, then the output is also a Gaussian random process.
  – If the random variables X(t1), X(t2), …, X(tn) obtained by sampling a Gaussian process X(t) at times t1, t2, …, tn are uncorrelated, then these random variables are statistically independent.


Noise
• Unwanted waves that tend to disturb the transmission and processing of signals in communication systems.

Types of noise
• Shot noise as a random process:

  X(t) = Σ_k h(t − τ_k)

  where h(t) is the current pulse generated at the time instant τ_k.
• The number of pulses occurring in an interval of duration t0 follows a Poisson distribution:

  P(k pulses in t0) = ((λ t0)^k / k!) e^{−λ t0}

[Figure: sample function of a Poisson counting process and its autocovariance plot.]


WHITE NOISE
• The primary spectral characteristic of thermal noise is that its power spectral density is the same for all frequencies of interest in most communication systems.
• A thermal noise source emanates an equal amount of noise power per unit bandwidth at all frequencies, from dc to about 10^12 Hz.
• Power spectral density: G(f) = N0/2 for all f (two-sided PSD).
• Autocorrelation function of white noise: R(τ) = (N0/2) δ(τ).
• The average power P of white noise is infinite (see the sketch below).
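A sketch of sampled white noise (assuming NumPy/SciPy, with made-up N0 and sampling rate): the estimated PSD is flat at roughly N0 (one-sided), the autocorrelation is concentrated at zero lag, and the in-band power (N0/2 times the bandwidth) grows without limit as the bandwidth does, which is why ideal white noise has infinite average power.

import numpy as np
from scipy import signal

rng = np.random.default_rng(8)
fs = 1_000.0                              # assumed sampling rate (models a bandwidth of fs/2)
N0 = 2e-3                                 # assumed noise density, so the two-sided PSD is N0/2
# Sampled (band-limited) white Gaussian noise: variance = (N0/2) * fs
w = rng.normal(0.0, np.sqrt(N0 / 2 * fs), size=500_000)

f, Sww = signal.welch(w, fs=fs, nperseg=1024)
print("mean one-sided PSD level:", Sww.mean(), "(expected about N0 =", N0, ")")

# Autocorrelation estimate: essentially an impulse concentrated at zero lag
R = np.array([np.dot(w[:w.size - k], w[k:]) / w.size for k in range(4)])
print("R(0..3):", np.round(R, 4))          # R(0) = (N0/2)*fs is the in-band power; the rest ~ 0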




Narrowband Noise Representation
• The noise process appearing at the output of a narrowband filter is called narrowband noise.
• The spectral components of narrowband noise are concentrated about some midband frequency ±f_c.
• A sample function n(t) of such a process appears somewhat similar to a sine wave of frequency f_c.

• Representations of narrowband noise:
  – A pair of components called the in-phase and quadrature components.
  – Two other components called the envelope and phase.

Representation of Narrowband Noise in Terms of In-Phase and Quadrature Components
• Consider a narrowband noise n(t) of bandwidth 2B centered on frequency f_c, as illustrated in the figure.
• We may represent n(t) in the canonical (standard) form:

  n(t) = n_I(t) cos(2πf_c t) − n_Q(t) sin(2πf_c t)

  where n_I(t) is the in-phase component of n(t) and n_Q(t) is the quadrature component of n(t). (A sketch of extracting these components follows below.)
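One way to obtain n_I(t) and n_Q(t) is coherent demodulation: multiply n(t) by 2cos(2πf_c t) and −2sin(2πf_c t) and low-pass filter to remove the components at 2f_c. The sketch below (assuming NumPy/SciPy, with made-up sample rate, centre frequency and bandwidth) generates narrowband noise, extracts the two components, rebuilds n(t) from the canonical form, and checks the zero-mean and equal-variance properties listed later in the unit.

import numpy as np
from scipy import signal

rng = np.random.default_rng(9)
fs, fc, B = 8_000.0, 1_000.0, 100.0                  # assumed sample rate, midband frequency, half-bandwidth

# Narrowband noise: white Gaussian noise passed through a bandpass filter of width 2B around fc
white = rng.standard_normal(400_000)
b_bp, a_bp = signal.butter(4, [fc - B, fc + B], btype="bandpass", fs=fs)
n = signal.lfilter(b_bp, a_bp, white)

# Coherent extraction of the in-phase and quadrature components:
# multiply by 2cos / -2sin at fc and low-pass filter away the 2*fc terms
t = np.arange(n.size) / fs
b_lp, a_lp = signal.butter(4, 2 * B, fs=fs)
nI = signal.filtfilt(b_lp, a_lp, 2.0 * n * np.cos(2 * np.pi * fc * t))
nQ = signal.filtfilt(b_lp, a_lp, -2.0 * n * np.sin(2 * np.pi * fc * t))

# Rebuild n(t) from the canonical form n(t) = nI(t)cos(2*pi*fc*t) - nQ(t)sin(2*pi*fc*t)
n_hat = nI * np.cos(2 * np.pi * fc * t) - nQ * np.sin(2 * np.pi * fc * t)
sl = slice(5_000, -5_000)                             # ignore filter edge effects
print("relative reconstruction error:", np.std(n[sl] - n_hat[sl]) / np.std(n[sl]))
print("means (both ~0)  :", nI[sl].mean(), nQ[sl].mean())
print("variances (close):", n[sl].var(), nI[sl].var(), nQ[sl].var())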
PSD of narrow-band noise
[Figure slide]


Representation of Narrowband Noise in Terms of Envelope and Phase Components
• Equivalently, n(t) = r(t) cos(2πf_c t + ψ(t)), where the envelope is r(t) = √(n_I²(t) + n_Q²(t)) and the phase is ψ(t) = tan⁻¹( n_Q(t) / n_I(t) ).
Properties of narrow-band noise
• The in-phase and quadrature components of narrow-band noise have zero mean.
• If the narrow-band noise is Gaussian, then its in-phase and quadrature components are jointly Gaussian.
• If the narrow-band noise is wide-sense stationary (w.s.s.), then its in-phase and quadrature components are jointly w.s.s.
• Both quadrature components have the same power spectral density, obtained from the PSD of the original narrow-band noise by shifting its positive- and negative-frequency portions to baseband.
• The quadrature components have the same variance as the narrow-band noise.
Thank You

