Unit I Lect 8 Onwards

Principles of Communication Engineering I

• Course No: ELC2420
• Credits: 4
• Course Category: Departmental Core
• Pre-requisite(s): ELC2410 (Signals and Systems)
• Contact Hours (L-T-P): 3-1-0
• Type of Course: Theory

Course Outcomes
1. Understand random variables and random processes.
2. Analyse different amplitude modulation schemes.
3. Analyse different angle modulation schemes.
4. Explain sampling processes and reconstruction.
5. Analyse the behaviour of a communication system in the presence of noise.

Books
1. Simon Haykin, Communication Systems, 4th Edition, John Wiley & Sons, 2001.
2. G. R. Cooper and C. D. McGillem, Probabilistic Methods of Signals and Systems Analysis, Oxford University Press, 1998.
3. H. Taub, D. L. Schilling and G. Saha, Principles of Communication Systems, 3rd Edition, Tata McGraw Hill, 2008.
4. A. B. Carlson, Communication Systems, McGraw Hill, 2002.
5. J. G. Proakis and M. Salehi, Communication Systems Engineering, 2nd Edition, Pearson Education, 2006.

Syllabus

Unit I: Random Variables and Stochastic Processes
• Review of Random Variables; Probability Distribution and Probability Density Functions; Uniform, Gaussian, Exponential and Poisson Random Variables; Statistical Averages; Random Processes; Correlation; Power Spectral Density; Analysis of Linear Time-Invariant Systems with Random Input; Noise and Its Representations

Unit II: Amplitude Modulation
• Introduction to Modulation; Amplitude Modulation Systems (AM, DSBSC, SSBSC, VSB Modulation/Demodulation); Frequency Division Multiplexing; Superheterodyne Radio Receiver; Equivalent Receiver Model; Noise in CW Receivers Using Coherent Detection; Noise in CW Receivers Using Envelope Detector

Unit III: Angle Modulation
• Angle Modulation: Frequency and Phase Modulation; Generation and Demodulation of Narrowband and Wideband FM; FM Broadcasting; Non-linear Effects in FM Systems; Noise in FM Receivers; FM Threshold Effect

Unit IV: Sampling and Pulse Modulation
• Sampling Theorem; Various Sampling Techniques; Sampling of Low-Pass and Bandpass Signals; Time Division Multiplexing; Generation and Recovery of PAM, PWM and PPM Signals

Unit I: Random Variables and Processes
Random Processes

• Random processes are functions of time.
• Random processes are random in the sense that it is not possible to predict exactly what waveform will be observed in the future.
• Random processes can be represented as a sample space where each outcome of the experiment is associated with a sample point.
• Each sample point represents a time-varying function.
• With a random variable, the outcome of a random experiment is mapped to a real number.
• With a random process, the outcome of a random experiment is mapped into a waveform that is a function of time.
• At any given time instant, the value of a stochastic process is a random variable indexed by the parameter $t$. We denote such a process by $X(t)$.
• In general, the parameter $t$ is continuous, whereas $X$ may be either continuous or discrete, depending on the characteristics of the source that generates the stochastic process.
• The noise voltage generated by a single resistor or a single information source represents a single realization of the stochastic process. It is called a sample function.
Fig. An ensemble of sample functions.
• For a fixed time instant $t_k$,
$$\{x_1(t_k), x_2(t_k), \ldots, x_n(t_k)\} = \{X(t_k, s_1), X(t_k, s_2), \ldots, X(t_k, s_n)\}$$
constitutes a random variable.

A MATLAB sketch of this idea, cleaned up from the slide: each row of x is one sample function of a noise process, and a fixed column such as x(:,1) is the random variable obtained by sampling every member of the ensemble at one time instant.

clear all
clc
for i = 1:1000
    x(i,:) = randn(1,10000);   % i-th sample function of the ensemble
    y(i,:) = 5*i + x(i,:);     % a shifted variant whose mean differs per realization
end
z = sum(x);                    % sum across the ensemble at each time instant
figure
hist(x(:,1))                   % distribution of X(t_1) across the ensemble

Ensemble Average

• Averages for a stochastic process are called ensemble averages. The $n$th moment of the random variable $X(t_i)$ is defined as:
$$E[X^n(t_i)] = \int_{-\infty}^{\infty} x^n f_{X(t_i)}(x)\, dx$$
• In general, the value of the $n$th moment will depend on the time instant $t_i$ if the PDF of $X(t_i)$ depends on $t_i$.

Stationary Random Processes

• With real-world random processes, we often find that the statistical characterization of a process is independent of the time at which the observations occur.
• If a random process is divided into a number of time intervals, the various sections of the process exhibit the same statistical properties.
• Such a process is said to be a stationary random process; otherwise, it is said to be nonstationary.

Stationary Random Processes

• Let $F_{X(t_1)}(x)$ be the probability distribution function associated with observations of the different sample functions of the random process at time $t_1$. Suppose the same random process is observed at time $t_1 + \tau$ and the corresponding distribution function is $F_{X(t_1+\tau)}(x)$. Then if
$$F_{X(t_1)}(x) = F_{X(t_1+\tau)}(x)$$
for all $t_1$ and all $\tau$, we say the process is stationary to the first order.
• A first-order stationary random process has a distribution function that is independent of time.
• Statistical parameters such as the mean and variance are also independent of time for such a process:
$$\mu_X = \int_{-\infty}^{\infty} x f_{X(t_1)}(x)\, dx = \int_{-\infty}^{\infty} x f_{X(t_1+\tau)}(x)\, dx$$

Strictly Stationary Process

• If the equivalence between distribution functions holds for all time shifts $\tau$, all $k$, and all possible observation times $t_1, t_2, \ldots, t_k$, then we say the process $X(t)$ is strictly stationary.
• In other words, a random process is strictly stationary if the joint distribution of any set of random variables obtained by observing the random process $X(t)$ is invariant with respect to the location of the origin $t = 0$.

Wide-Sense Stationary (WSS) Process

• If a random process has the following properties:
1. The mean value of the process is independent of time (a constant).
2. The autocorrelation function of the random process depends only on the time difference, i.e. it satisfies the condition
$$R_X(t_1, t_2) = R_X(t_2 - t_1) = R_X(\tau) \quad \text{for all } t_1 \text{ and } \tau$$
• Then the process is said to be wide-sense stationary or weakly stationary.

Autocorrelation Function

• We define the autocorrelation function of the process $X(t)$ as the expectation of the product of two random variables $X(t_1)$ and $X(t_2)$:
$$R_X(t_1, t_2) = E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2 f_{X(t_1), X(t_2)}(x_1, x_2)\, dx_1\, dx_2$$

Second Order Stationary Process

• Consider sampling the random process $X(t)$ at two points in time $t_1$ and $t_2$ with the corresponding joint distribution function $F_{X(t_1), X(t_2)}(x_1, x_2)$.
• Suppose a second set of observations is made at times $t_1 + \tau$ and $t_2 + \tau$ and the corresponding joint distribution is $F_{X(t_1+\tau), X(t_2+\tau)}(x_1, x_2)$. Then if, for all $t_1$, $t_2$ and $\tau$, we find that
$$F_{X(t_1), X(t_2)}(x_1, x_2) = F_{X(t_1+\tau), X(t_2+\tau)}(x_1, x_2)$$
we say the process is stationary to the second order.
• It implies that statistical quantities like covariance and correlation do not depend upon absolute time.
• A random process $X(t)$ is stationary to second order if the joint distribution $F_{X(t_1), X(t_2)}(x_1, x_2)$ depends only on the difference between the observation times $t_1$ and $t_2$:
$$R_X(t_1, t_2) = R_X(t_2 - t_1) \quad \text{for all } t_1 \text{ and } t_2$$
• The autocovariance function of a stationary random process $X(t)$ is written as:
$$C_X(t_1, t_2) = E[(X(t_1) - \mu_X)(X(t_2) - \mu_X)] = R_X(t_2 - t_1) - \mu_X^2$$

Autocorrelation Function Properties

• Power of a wide-sense stationary process: the second moment or mean-square value of a real-valued random process is given by
$$R_X(0) = E[X(t) X(t)] = E[X^2(t)]$$
which is equivalent to the average power.
• Symmetry: the autocorrelation of a real-valued wide-sense stationary process has even symmetry:
$$R_X(-\tau) = R_X(\tau)$$
• Maximum value: the autocorrelation function of a wide-sense stationary random process is a maximum at the origin. (Prove it!)
Proof

• Symmetry:
$$R_X(\tau) = E[X(t+\tau) X(t)] = E[X(t) X(t+\tau)] = R_X(-\tau)$$
• Maximum value at the origin: since the expectation of a squared quantity is non-negative,
$$E[(X(t+\tau) \pm X(t))^2] \geq 0$$
Expanding the square,
$$E[X^2(t+\tau)] \pm 2E[X(t+\tau) X(t)] + E[X^2(t)] \geq 0$$
$$2R_X(0) \pm 2R_X(\tau) \geq 0$$
and therefore
$$|R_X(\tau)| \leq R_X(0)$$

A MATLAB illustration from the slide, cleaned up (autocorr requires the Econometrics Toolbox): the sample autocorrelation of the noisier sinusoid decays more sharply away from the origin, but both estimates peak at zero lag.

t = 0.001:0.001:1;
x1 = sin(2*pi*50*t) + 1.1*randn(1,length(t));      % very noisy 50 Hz sinusoid
x2 = sin(2*pi*50*t) + 0.1*randn(1,length(t));      % slightly noisy 50 Hz sinusoid
plot(x1)
ac1 = autocorr(x1,'NumLags',floor(length(x1)/2));  % sample autocorrelation
ac2 = autocorr(x2,'NumLags',floor(length(x2)/2));
hold on
plot(x2,'m')
plot(ac1,'k')
plot(ac2,'r')

Autocorrelation Function Significance

• The physical significance of the autocorrelation function $R_X(\tau)$ is that it provides a means of describing the "interdependence" of two random variables obtained by observing a random process $X(t)$ at times $\tau$ seconds apart.
• Discrete-time illustration: for $x[n] = [3, 4, 5]$, sliding $x[n]$ against a copy of itself gives the autocorrelation sequence $[15, 32, 50, 32, 15]$, with its maximum $R_x[0] = 3^2 + 4^2 + 5^2 = 50$ at zero lag.

Autocorrelation of a Sinusoid with Random Phase

• Find the autocorrelation of a sinusoidal signal with random phase:
$$X(t) = A\cos(2\pi f_c t + \Theta), \qquad f_\Theta(\theta) = \begin{cases} \dfrac{1}{2\pi}, & -\pi \leq \theta \leq \pi \\ 0, & \text{elsewhere} \end{cases}$$
$$R_X(\tau) = E[X(t+\tau) X(t)] = \frac{A^2}{2} E[\cos(4\pi f_c t + 2\pi f_c \tau + 2\Theta)] + \frac{A^2}{2} E[\cos(2\pi f_c \tau)]$$
$$= \frac{A^2}{2}\cdot\frac{1}{2\pi}\int_{-\pi}^{\pi} \cos(4\pi f_c t + 2\pi f_c \tau + 2\theta)\, d\theta + \frac{A^2}{2}\cos(2\pi f_c \tau)$$
• The integral over a full period vanishes, leaving
$$R_X(\tau) = \frac{A^2}{2}\cos(2\pi f_c \tau)$$
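A small MATLAB sketch of this result (the parameter values A, fc, fs and the estimator loop are illustrative assumptions, not part of the slides): it estimates R_X(tau) by averaging X(t+tau)X(t) over many random phases and compares the estimate with (A^2/2)cos(2*pi*fc*tau).

A = 2; fc = 50; fs = 1000;              % assumed amplitude, carrier, sampling rate
t = 0:1/fs:1;                           % time grid
tau = 0:1/fs:0.05;                      % lags at which to estimate R_X(tau)
N = 5000;                               % number of ensemble members
Rhat = zeros(size(tau));
for n = 1:N
    theta = -pi + 2*pi*rand;            % Theta ~ Uniform(-pi, pi)
    x  = A*cos(2*pi*fc*t + theta);      % one realization X(t)
    xs = A*cos(2*pi*fc*(t + tau') + theta);  % shifted copies, one row per lag
    Rhat = Rhat + mean(xs.*x, 2)'/N;    % average of X(t+tau)X(t) over the ensemble
end
plot(tau, Rhat, 'b', tau, (A^2/2)*cos(2*pi*fc*tau), 'r--')
legend('ensemble estimate', '(A^2/2)cos(2\pi f_c\tau)')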
Ensemble Average

• The expectation of a random process at a particular point in time requires separate independent realizations of the random process.
• For a random process $X(t)$ with $N$ equiprobable realizations $\{x_j(t) : j = 1, 2, \ldots, N\}$, the expected value and second moment of the random process at time $t = t_k$ are respectively given by the ensemble averages
$$E[X(t_k)] = \frac{1}{N}\sum_{j=1}^{N} x_j(t_k)$$
$$E[X^2(t_k)] = \frac{1}{N}\sum_{j=1}^{N} x_j^2(t_k)$$
• If the process is wide-sense stationary, then the mean value and second moment computed by these two equations do not depend upon the time $t_k$.

Ergodic Processes

• In many instances, it is difficult or impossible to observe all sample functions of a random process at a given time.
• It is often more convenient to observe a single sample function for a long period of time.
• For many stochastic processes of interest in communications, the time averages and ensemble averages are equal, a property known as ergodicity.
• This property implies that whenever an ensemble average is required, we may estimate it by using a time average, as the sketch below illustrates.
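A minimal MATLAB sketch of ergodicity (the 5 Hz test signal and the sizes are assumptions chosen for illustration): for a stationary random-phase sinusoid, the ensemble average of X(t_k) over many realizations and the time average of a single realization both come out near zero.

fs = 1000; t = 0:1/fs:10;               % assumed sampling rate and duration
N = 2000;                               % ensemble size
tk = 500;                               % index of the fixed time instant t_k
theta = -pi + 2*pi*rand(N,1);           % one random phase per realization
xk = zeros(N,1);
for j = 1:N
    xj = cos(2*pi*5*t + theta(j));      % j-th sample function
    xk(j) = xj(tk);                     % its value at t_k
end
ensemble_mean = mean(xk)                % ensemble average E[X(t_k)], ~0
time_mean = mean(cos(2*pi*5*t + theta(1)))  % time average of one realization, ~0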

Time Average

• In practice, the random process itself is not available to the user, but only one of its sample functions, $x(t)$.
• In such cases, the most easily measurable parameters are time averages.
• For example, the time average of a continuous sample function drawn from a real-valued process is given by
$$\varepsilon[x] = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\, dt$$
and the time-autocorrelation of the sample function is given by:
$$R_x(\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\, x(t+\tau)\, dt$$
• If the statistics of the random process do not change with time, then we expect the time averages and ensemble averages to be equivalent.

Cyclostationary Processes

• There is another important class of random processes commonly encountered in practice: cyclostationary processes (in the wide sense), the mean and autocorrelation function of which exhibit periodicity:
$$\mu_X(t_1 + T) = \mu_X(t_1)$$
$$R_X(t_1 + T, t_2 + T) = R_X(t_1, t_2)$$
for all $t_1$ and $t_2$.
• Modeling the process $X(t)$ as cyclostationary adds a new dimension, namely the period $T$, to the partial description of the process.
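A minimal sketch of a measured time-autocorrelation (the test signal is an assumption; xcorr is from the Signal Processing Toolbox): over a finite record, the 'biased' sample autocorrelation plays the role of the (1/2T) integral above.

fs = 1000; t = 0:1/fs:10;                   % assumed sampling rate and duration
x = sin(2*pi*5*t) + 0.5*randn(size(t));     % one sample function: tone plus noise
maxlag = 500;
[Rx, lags] = xcorr(x, maxlag, 'biased');    % finite-record time-autocorrelation
plot(lags/fs, Rx)
xlabel('\tau (s)'); ylabel('R_x(\tau)')     % the periodic part of x survives in R_x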
Transmission of a Random Process Through a Linear Filter

• Suppose that a random process $X(t)$ is applied as input to a linear time-invariant filter of impulse response $h(t)$, producing a new random process $Y(t)$ at the filter output.
• Assume that $X(t)$ is a wide-sense stationary random process.
• The mean of the output random process $Y(t)$ is given by:
$$\mu_Y(t) = E[Y(t)] = E\left[\int_{-\infty}^{\infty} h(\tau) X(t-\tau)\, d\tau\right] = \int_{-\infty}^{\infty} h(\tau) E[X(t-\tau)]\, d\tau = \int_{-\infty}^{\infty} h(\tau)\, \mu_X(t-\tau)\, d\tau$$
• When the input random process $X(t)$ is wide-sense stationary, the mean $\mu_X(t)$ is a constant $\mu_X$, so the mean $\mu_Y(t)$ is also a constant $\mu_Y$:
$$\mu_Y(t) = \mu_X \int_{-\infty}^{\infty} h(\tau)\, d\tau = \mu_X H(0)$$
where $H(0)$ is the zero-frequency (dc) response of the system.
• The autocorrelation function of the output random process $Y(t)$ is given by:
$$R_Y(t, u) = E[Y(t) Y(u)] = E\left[\int_{-\infty}^{\infty} h(\tau_1) X(t-\tau_1)\, d\tau_1 \int_{-\infty}^{\infty} h(\tau_2) X(u-\tau_2)\, d\tau_2\right]$$
$$= \int_{-\infty}^{\infty} d\tau_1\, h(\tau_1) \int_{-\infty}^{\infty} d\tau_2\, h(\tau_2)\, E[X(t-\tau_1) X(u-\tau_2)]$$
$$= \int_{-\infty}^{\infty} d\tau_1\, h(\tau_1) \int_{-\infty}^{\infty} d\tau_2\, h(\tau_2)\, R_X(t-\tau_1, u-\tau_2)$$
• When the input $X(t)$ is a wide-sense stationary random process, the autocorrelation function of $Y(t)$ is only a function of the difference between the observation times:
$$R_Y(\tau) = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} h(\tau_1)\, h(\tau_2)\, R_X(\tau - \tau_1 + \tau_2)\, d\tau_1\, d\tau_2$$
• If the input to a stable linear time-invariant filter is a wide-sense stationary random process, then the output of the filter is also a wide-sense stationary random process.

Power Spectral Density

• The Fourier transform of the autocorrelation function $R_X(\tau)$ is called the power spectral density $S_X(f)$ of the random process $X(t)$:
$$S_X(f) = \int_{-\infty}^{\infty} R_X(\tau) \exp(-j2\pi f\tau)\, d\tau$$
$$R_X(\tau) = \int_{-\infty}^{\infty} S_X(f) \exp(j2\pi f\tau)\, df$$
• These equations are the basic relationships in the theory of spectral analysis of random processes, and together they are usually called the Einstein-Wiener-Khintchine relations.

Properties of the Power Spectral Density

• Property 1: $S_X(0) = \int_{-\infty}^{\infty} R_X(\tau)\, d\tau$
• Property 2: $E[X^2(t)] = \int_{-\infty}^{\infty} S_X(f)\, df$
• Property 3: $S_X(f) \geq 0$ for all $f$
• Property 4: $S_X(-f) = S_X(f)$
• Property 5: If a stationary random process $X(t)$ with spectrum $S_X(f)$ is passed through a linear filter with frequency response $H(f)$, the spectrum of the stationary output random process $Y(t)$ is given by
$$S_Y(f) = S_X(f)\, |H(f)|^2 \quad \text{(Prove it)}$$
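Before the proofs, a quick numerical check of Property 5 (a minimal sketch; the 8-tap averaging filter and the crude segment-averaged periodogram are assumptions chosen for brevity): white noise with a flat S_X(f) is filtered, and the measured output PSD is compared with S_X(f)|H(f)|^2.

fs = 1000; Nsamp = 2^16;
x = randn(1,Nsamp);                    % white noise, S_X(f) ~ 1/fs per Hz
b = ones(1,8)/8;                       % impulse response h(t): 8-tap averager
y = filter(b,1,x);                     % output process Y(t)
Nfft = 1024;
f = (0:Nfft-1)*fs/Nfft;
H = fft(b,Nfft);                       % frequency response H(f)
nseg = floor(Nsamp/Nfft);              % average periodograms over segments
Sy = zeros(1,Nfft);
for k = 1:nseg
    seg = y((k-1)*Nfft+1 : k*Nfft);
    Sy = Sy + abs(fft(seg)).^2/(Nfft*fs*nseg);
end
plot(f(1:Nfft/2), Sy(1:Nfft/2), 'b', ...
     f(1:Nfft/2), (1/fs)*abs(H(1:Nfft/2)).^2, 'r--')
legend('measured S_Y(f)', 'S_X(f)|H(f)|^2')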
Proof

• From Property 5 it can be shown that $S_Y(f) = S_X(f)|H(f)|^2$, so
$$R_Y(\tau) = \int_{-\infty}^{\infty} S_Y(f) \exp(j2\pi f\tau)\, df = \int_{-\infty}^{\infty} S_X(f)\, |H(f)|^2 \exp(j2\pi f\tau)\, df$$
$$R_Y(0) = E[Y^2(t)] = \int_{-\infty}^{\infty} S_X(f)\, |H(f)|^2\, df \geq 0 \quad \text{for any } H(f)$$
• Suppose we let $|H(f)| = 1$ for an arbitrarily small interval $f_1 \leq f \leq f_2$, and $H(f) = 0$ outside this interval. Then we have:
$$\int_{f_1}^{f_2} S_X(f)\, df \geq 0$$
• This is possible if and only if $S_X(f) \geq 0$ for all $f$.
• Conclusion: $S_X(f) \geq 0$ for all $f$.

Power Spectral Density: Mixing of a Random Process with a Sinusoidal Process

• A situation that often arises in practice is that of mixing (i.e., multiplication) of a WSS random process $X(t)$ with a sinusoidal signal $\cos(2\pi f_c t + \Theta)$, where the phase $\Theta$ is a random variable that is uniformly distributed over the interval $(0, 2\pi)$.
• We determine the power spectral density of the random process $Y(t)$ defined by:
$$Y(t) = X(t)\cos(2\pi f_c t + \Theta)$$
• We note that the random variable $\Theta$ is independent of $X(t)$.

Sinusoidal Signal with Random Phase

• Consider the random process $X(t) = A\cos(2\pi f_c t + \Theta)$, where $\Theta$ is a uniformly distributed random variable over the interval $(-\pi, \pi)$.
• The autocorrelation function of this random process is:
$$R_X(\tau) = \frac{A^2}{2}\cos(2\pi f_c \tau)$$
• Taking the Fourier transform of both sides of this relation:
$$S_X(f) = \frac{A^2}{4}[\delta(f - f_c) + \delta(f + f_c)]$$

Power Spectral Density: Mixing of a Random Process with a Sinusoidal Process (continued)

• The autocorrelation function of $Y(t)$ is given by:
$$R_Y(\tau) = E[Y(t+\tau) Y(t)]$$
$$= E[X(t+\tau)\cos(2\pi f_c t + 2\pi f_c \tau + \Theta)\, X(t)\cos(2\pi f_c t + \Theta)]$$
$$= E[X(t+\tau) X(t)]\, E[\cos(2\pi f_c t + 2\pi f_c \tau + \Theta)\cos(2\pi f_c t + \Theta)]$$
$$= \frac{1}{2} R_X(\tau)\, E[\cos(2\pi f_c \tau) + \cos(4\pi f_c t + 2\pi f_c \tau + 2\Theta)]$$
$$= \frac{1}{2} R_X(\tau)\cos(2\pi f_c \tau)$$
• Taking the Fourier transform:
$$S_Y(f) = \frac{1}{4}[S_X(f - f_c) + S_X(f + f_c)]$$

Central Limit Theorem

• Let $X_i$, $i = 1, 2, \ldots, N$, be a set of random variables that satisfies the following requirements:
1. The $X_i$ are statistically independent.
2. The $X_i$ have the same probability density function.
3. Both the mean and the variance exist for each $X_i$.
• Let $Y$ be a new random variable defined as
$$Y = \sum_{i=1}^{N} X_i$$
• Then, according to the central limit theorem, the normalized random variable
$$Z = \frac{Y - E[Y]}{\sigma_Y}$$
approaches a Gaussian random variable with zero mean and unit variance as the number of random variables increases without limit.
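A minimal MATLAB sketch of the theorem (Uniform(0,1) summands and the sample sizes are assumptions): the histogram of the normalized sum Z is compared with the standard Gaussian density.

N = 30;                              % number of summed random variables
M = 100000;                          % number of independent trials
Y = sum(rand(N,M));                  % each column is Y = X_1 + ... + X_N
Z = (Y - N*0.5)/sqrt(N/12);          % E[X_i] = 1/2, var(X_i) = 1/12 for Uniform(0,1)
histogram(Z, 'Normalization', 'pdf')
hold on
z = -4:0.01:4;
plot(z, exp(-z.^2/2)/sqrt(2*pi), 'r', 'LineWidth', 1.5)
legend('normalized sum', 'N(0,1) density')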
Central Limit Theorem (continued)

• That is, as $N$ becomes large, the distribution of $Z$ approaches that of a zero-mean Gaussian random variable with unit variance, as shown by
$$F_Z(z) \rightarrow \int_{-\infty}^{z} \frac{1}{\sqrt{2\pi}} \exp\left(-\frac{s^2}{2}\right) ds$$
• The normalized distribution of the sum of independent, identically distributed random variables approaches a Gaussian distribution as the number of random variables increases, regardless of the individual distributions.

Gaussian Process

• A random process $X(t)$, with $t$ taking values in the set $T$, is said to be a Gaussian process if, for any integer $k$ and any subset $\{t_1, t_2, \ldots, t_k\}$ of $T$, the $k$ random variables $X(t_1), X(t_2), \ldots, X(t_k)$ are jointly Gaussian distributed.

Gaussian Process Properties

• Property 1: If a Gaussian process $X(t)$ is applied to a stable linear filter, then the output $Y(t)$ is also Gaussian.
• Property 2: Consider the set of random variables or samples $X(t_1), X(t_2), \ldots, X(t_n)$, obtained by observing a random process $X(t)$ at times $t_1, t_2, \ldots, t_n$. If the process $X(t)$ is Gaussian, then this set of random variables is jointly Gaussian for any $n$, with their $n$-fold joint probability density function being completely determined by specifying the set of means
$$\mu_{X(t_i)} = E[X(t_i)], \quad i = 1, 2, \ldots, n$$
and the set of autocovariance functions
$$C_X(t_k, t_i) = E[(X(t_k) - \mu_{X(t_k)})(X(t_i) - \mu_{X(t_i)})], \quad k, i = 1, 2, \ldots, n$$
• Property 3: If a Gaussian process is wide-sense stationary, then the process is also stationary in the strict sense.

Noise

• The sources of noise may be external to the system (e.g., atmospheric noise, galactic noise, man-made noise), or internal to the system.
• The second category includes an important type of noise that arises from spontaneous fluctuations of current or voltage in electrical circuits. This type of noise represents a basic limitation on the transmission or detection of signals in communication systems involving the use of electronic devices.
• The two most common examples of spontaneous fluctuations in electrical circuits are:
• Shot noise
• Thermal noise

Shot Noise

• Shot noise arises in electronic devices such as diodes and transistors because of the discrete nature of current flow in these devices.
• For example, in a photodetector circuit a current pulse is generated every time an electron is emitted by the cathode due to incident light from a source of constant intensity. The electrons are naturally emitted at random times denoted by $\tau_k$.
• If the random emissions of electrons have been going on for a long time, then the total current flowing through the photodetector may be modeled as an infinite sum of current pulses, as shown by
$$X(t) = \sum_{k=-\infty}^{\infty} h(t - \tau_k)$$
where $h(t - \tau_k)$ is the current pulse generated at time $\tau_k$.
• The process $X(t)$ is a stationary process, called shot noise.
• The number of electrons, $N(t)$, emitted in the time interval $(0, t)$ constitutes a discrete stochastic process, the value of which increases by one each time an electron is emitted.
• Let the mean value of the number of electrons, $\nu$, emitted between times $t$ and $t + t_0$ be
$$E[\nu] = \lambda t_0$$
where $\lambda$ is a constant called the rate of the process.
• The total number of electrons emitted in the interval $(t, t + t_0)$ is
$$\nu = N(t + t_0) - N(t)$$
and it follows a Poisson distribution with a mean value equal to $\lambda t_0$.
• The probability that $k$ electrons are emitted in the interval $(t, t + t_0)$ is
$$P(\nu = k) = \frac{(\lambda t_0)^k e^{-\lambda t_0}}{k!}, \quad k = 0, 1, \ldots$$
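A minimal sketch of these counting statistics (the rate lambda and window t0 are assumed values): emission times are generated with exponential inter-arrival times, and the count per window is compared with the Poisson pmf of mean lambda*t0.

lambda = 200; t0 = 0.05;                % assumed rate (1/s) and window length (s)
M = 20000;                              % number of independent windows
counts = zeros(M,1);
for m = 1:M
    arrivals = cumsum(-log(rand(100,1))/lambda);  % Exponential(1/lambda) gaps
    counts(m) = sum(arrivals <= t0);    % nu = N(t+t0) - N(t)
end
k = 0:max(counts);
pk = (lambda*t0).^k .* exp(-lambda*t0) ./ factorial(k);   % Poisson pmf
histogram(counts, 'BinMethod', 'integers', 'Normalization', 'pdf')
hold on; stem(k, pk, 'r')
legend('simulated counts', 'Poisson pmf, mean \lambda t_0')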
Thermal Noise

• Thermal noise is the name given to the electrical noise arising from the random motion of electrons in a conductor.
• The mean-square value of the thermal noise voltage $V_{TN}$, appearing across the terminals of a resistor, measured in a bandwidth of $\Delta f$ Hertz, is given by:
$$E[V_{TN}^2] = 4kTR\Delta f \ \text{volts}^2$$
$k$: Boltzmann's constant $= 1.38 \times 10^{-23}$ joules per degree Kelvin.
$T$: Absolute temperature in degrees Kelvin.
$R$: The resistance in ohms.

White Noise

• Noise analysis is customarily based on an idealized form of noise called white noise, the power spectral density of which is independent of the operating frequency.
• "White" is used in the sense that white light contains equal amounts of all frequencies within the visible band of electromagnetic radiation.
• The power spectral density of white noise, with a sample function denoted by $w(t)$, is expressed as
$$S_W(f) = \frac{N_0}{2}, \qquad N_0 = kT_e$$
• The dimensions of $N_0$ are watts per Hertz; $k$ is Boltzmann's constant and $T_e$ is the equivalent noise temperature of the receiver.
• The equivalent noise temperature of a system is defined as the temperature at which a noisy resistor has to be maintained such that, by connecting the resistor to the input of a noiseless version of the system, it produces the same available noise power at the output of the system as that produced by all the sources of noise in the actual system.
• The autocorrelation function is the inverse Fourier transform of the power spectral density:
$$R_W(\tau) = \frac{N_0}{2}\delta(\tau)$$
• Any two different samples of white noise, no matter how closely together in time they are taken, are uncorrelated.
• If the white noise $w(t)$ is also Gaussian, then the two samples are statistically independent.

Ideal Low-Pass Filtered White Noise

• Suppose that a white Gaussian noise $w(t)$ of zero mean and power spectral density $N_0/2$ is applied to an ideal low-pass filter of bandwidth $B$ Hz and passband amplitude response of one.
• The PSD of the filtered noise $n(t)$ is
$$S_N(f) = \begin{cases} \dfrac{N_0}{2}, & -B < f < B \\ 0, & |f| > B \end{cases}$$
• The autocorrelation function of $n(t)$ is the inverse Fourier transform of this PSD:
$$R_N(\tau) = N_0 B\, \mathrm{sinc}(2B\tau)$$

Narrowband Noise

• The receiver of a communication system usually includes some provision for preprocessing the received signal.
• The preprocessing may take the form of a narrowband filter whose bandwidth is just large enough to pass the modulated component of the received signal essentially undistorted, but not so large as to admit excessive noise through the receiver.
• The noise process appearing at the output of such a filter is called narrowband noise.
Fig. (a) PSD of narrowband noise. (b) Sample function of narrowband noise, which appears somewhat similar to a sine wave of frequency $f_c$ that undulates slowly in both amplitude and phase.

Representation of Narrowband Noise in Terms of In-phase and Quadrature Components

• Consider a narrowband noise $n(t)$ of bandwidth $2B$ centered at frequency $f_c$; it can be represented as
$$n(t) = n_I(t)\cos(2\pi f_c t) - n_Q(t)\sin(2\pi f_c t)$$
$n_I(t)$: in-phase component of $n(t)$
$n_Q(t)$: quadrature component of $n(t)$
• Both $n_I(t)$ and $n_Q(t)$ are low-pass signals.
Fig. (a) Extraction of in-phase and quadrature components of a narrowband process. (b) Generation of a narrowband process from its in-phase and quadrature components.
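A minimal sketch of the extraction scheme in Fig. (a) (the filter choices, fc, B and fs are assumptions; butter and filtfilt require the Signal Processing Toolbox): narrowband noise is made by band-pass filtering white noise, n_I and n_Q are recovered by mixing with 2cos and -2sin followed by low-pass filtering, and n(t) is rebuilt from them as in Fig. (b).

fs = 8000; fc = 1000; B = 100;                 % assumed rates and bandwidth
t = 0:1/fs:2;
w = randn(size(t));                            % white Gaussian noise
[bb, ab] = butter(4, [fc-B, fc+B]/(fs/2));     % band-pass around fc, width 2B
n = filtfilt(bb, ab, w);                       % narrowband noise (zero-phase filtering)
[bl, al] = butter(4, B/(fs/2));                % low-pass of bandwidth B
nI = filtfilt(bl, al, 2*n.*cos(2*pi*fc*t));    % in-phase component
nQ = filtfilt(bl, al, -2*n.*sin(2*pi*fc*t));   % quadrature component
nrec = nI.*cos(2*pi*fc*t) - nQ.*sin(2*pi*fc*t);% regenerated narrowband process
plot(t(1:400), n(1:400), 'b', t(1:400), nrec(1:400), 'r--')
legend('n(t)', 'n_I cos - n_Q sin')            % the two curves should nearly coincide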
Representation of Narrowband Noise in Terms of In-phase and Quadrature Components

• $n_I(t)$ and $n_Q(t)$ of a narrowband noise $n(t)$ have some important properties:
1) The $n_I(t)$ and $n_Q(t)$ of $n(t)$ have zero mean.
2) If $n(t)$ is Gaussian, then $n_I(t)$ and $n_Q(t)$ are jointly Gaussian.
3) If $n(t)$ is stationary, then $n_I(t)$ and $n_Q(t)$ are jointly stationary.
4) Both $n_I(t)$ and $n_Q(t)$ have the same power spectral density, which is related to the power spectral density $S_N(f)$ of $n(t)$ as:
$$S_{N_I}(f) = S_{N_Q}(f) = \begin{cases} S_N(f - f_c) + S_N(f + f_c), & -B \leq f \leq B \\ 0, & \text{otherwise} \end{cases}$$
5) $n_I(t)$ and $n_Q(t)$ have the same variance as the narrowband noise $n(t)$.
6) The cross-spectral density of $n_I(t)$ and $n_Q(t)$ of $n(t)$ is purely imaginary:
$$S_{N_I N_Q}(f) = -S_{N_Q N_I}(f) = \begin{cases} j[S_N(f + f_c) - S_N(f - f_c)], & -B \leq f \leq B \\ 0, & \text{otherwise} \end{cases}$$
7) If $n(t)$ is Gaussian and its power spectral density $S_N(f)$ is symmetric about the mid-band frequency $f_c$, then $n_I(t)$ and $n_Q(t)$ are statistically independent.

Ideal Band-Pass Filtered White Noise

• Consider a white Gaussian noise of zero mean and power spectral density $N_0/2$, which is passed through an ideal band-pass filter of passband magnitude response equal to one, mid-band frequency $f_c$, and bandwidth $2B$.
• The power spectral density characteristic of the filtered noise $n(t)$ is shown in Fig. (a); the power spectral density characteristics of $n_I(t)$ and $n_Q(t)$ are shown in Fig. (b).
• The autocorrelation function of $n(t)$ is the inverse Fourier transform of the power spectral density characteristic:
$$R_N(\tau) = \int_{-f_c-B}^{-f_c+B} \frac{N_0}{2} \exp(j2\pi f\tau)\, df + \int_{f_c-B}^{f_c+B} \frac{N_0}{2} \exp(j2\pi f\tau)\, df$$
$$= N_0 B\, \mathrm{sinc}(2B\tau)[\exp(-j2\pi f_c\tau) + \exp(j2\pi f_c\tau)]$$
$$= 2N_0 B\, \mathrm{sinc}(2B\tau)\cos(2\pi f_c\tau)$$
• The autocorrelation function of $n_I(t)$ and $n_Q(t)$ is given by:
$$R_{N_I}(\tau) = R_{N_Q}(\tau) = 2N_0 B\, \mathrm{sinc}(2B\tau)$$

Representation of Narrowband Noise in Terms of Envelope and Phase Components

• The narrowband noise $n(t)$ can be represented in terms of its envelope and phase components:
$$n(t) = r(t)\cos(2\pi f_c t + \Psi(t))$$
$$r(t) = [n_I^2(t) + n_Q^2(t)]^{1/2}, \qquad \Psi(t) = \tan^{-1}\left[\frac{n_Q(t)}{n_I(t)}\right]$$
$r(t)$: envelope of $n(t)$; $\Psi(t)$: phase of $n(t)$.
• Both $r(t)$ and $\Psi(t)$ are sample functions of low-pass random processes.
• The probability distributions of $r(t)$ and $\Psi(t)$ may be obtained from those of $n_I(t)$ and $n_Q(t)$.
• Let $N_I$ and $N_Q$ denote the random variables obtained by observing the random processes represented by the sample functions $n_I(t)$ and $n_Q(t)$, respectively.
• $N_I$ and $N_Q$ are independent Gaussian random variables of zero mean and variance $\sigma^2$. Their joint probability density function is given by:
$$f_{N_I, N_Q}(n_I, n_Q) = \frac{1}{2\pi\sigma^2} \exp\left(-\frac{n_I^2 + n_Q^2}{2\sigma^2}\right)$$
• Define $n_I = r\cos\psi$ and $n_Q = r\sin\psi$, so that $dn_I\, dn_Q = r\, dr\, d\psi$.
• The joint probability density function of $R$ and $\Psi$ is:
$$f_{R,\Psi}(r, \psi) = \frac{r}{2\pi\sigma^2} \exp\left(-\frac{r^2}{2\sigma^2}\right)$$
• The phase $\psi$ is uniformly distributed inside the range $0$ to $2\pi$.
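A minimal numerical check (sigma and the sample size are assumed): the envelope r = (n_I^2 + n_Q^2)^(1/2) of two independent zero-mean Gaussians should follow the Rayleigh density derived next.

sigma = 1.5; M = 100000;                 % assumed sigma and sample size
nI = sigma*randn(M,1);                   % in-phase samples
nQ = sigma*randn(M,1);                   % quadrature samples
r = sqrt(nI.^2 + nQ.^2);                 % envelope samples
histogram(r, 'Normalization', 'pdf')
hold on
rr = 0:0.01:6*sigma;
plot(rr, (rr/sigma^2).*exp(-rr.^2/(2*sigma^2)), 'r', 'LineWidth', 1.5)
legend('simulated envelope', 'Rayleigh density')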
Rayleigh Distribution

• The probability density function of the random variable $R$ is:
$$f_R(r) = \begin{cases} \dfrac{r}{\sigma^2} \exp\left(-\dfrac{r^2}{2\sigma^2}\right), & r \geq 0 \\ 0, & \text{elsewhere} \end{cases}$$
• A random variable having this probability density function is said to be Rayleigh distributed.
• The Rayleigh distribution in the normalized form, with $\vartheta = r/\sigma$:
$$f_V(\vartheta) = \begin{cases} \vartheta \exp\left(-\dfrac{\vartheta^2}{2}\right), & \vartheta \geq 0 \\ 0, & \text{elsewhere} \end{cases}$$
Fig. Normalized Rayleigh distribution.

Sinusoidal Signal Plus Narrowband Noise

• A sample function of the sinusoidal signal $A\cos(2\pi f_c t)$ plus narrowband noise $n(t)$ is given by:
$$x(t) = A\cos(2\pi f_c t) + n(t)$$
• Representing $n(t)$ in terms of its in-phase and quadrature components around the carrier frequency $f_c$:
$$x(t) = n_I'(t)\cos(2\pi f_c t) - n_Q(t)\sin(2\pi f_c t), \qquad n_I'(t) = A + n_I(t)$$
• Assume that $n(t)$ is Gaussian with zero mean and variance $\sigma^2$. Then:
• Both $n_I'(t)$ and $n_Q(t)$ are Gaussian and statistically independent.
• The mean of $n_I'(t)$ is $A$ and that of $n_Q(t)$ is zero.
• The variance of both $n_I'(t)$ and $n_Q(t)$ is $\sigma^2$.
• The joint probability density function of the random variables $N_I'$ and $N_Q$, corresponding to $n_I'(t)$ and $n_Q(t)$, is
$$f_{N_I', N_Q}(n_I', n_Q) = \frac{1}{2\pi\sigma^2} \exp\left(-\frac{(n_I' - A)^2 + n_Q^2}{2\sigma^2}\right)$$
• Let $r(t)$ denote the envelope of $x(t)$ and $\psi(t)$ denote its phase:
$$r(t) = [n_I'^2(t) + n_Q^2(t)]^{1/2}, \qquad \psi(t) = \tan^{-1}\left[\frac{n_Q(t)}{n_I'(t)}\right]$$
• The joint probability density function of the random variables $R$ and $\Psi$ is given by:
$$f_{R,\Psi}(r, \psi) = \frac{r}{2\pi\sigma^2} \exp\left(-\frac{r^2 + A^2 - 2Ar\cos\psi}{2\sigma^2}\right)$$
• The function $f_{R,\Psi}(r, \psi)$ cannot be expressed as a product $f_R(r) f_\Psi(\psi)$, because we now have a term involving the values of both random variables multiplied together as $r\cos\psi$.

Rician Distribution

• Integrating out the phase gives
$$f_R(r) = \int_0^{2\pi} f_{R,\Psi}(r, \psi)\, d\psi = \frac{r}{2\pi\sigma^2} \exp\left(-\frac{r^2 + A^2}{2\sigma^2}\right) \int_0^{2\pi} \exp\left(\frac{Ar}{\sigma^2}\cos\psi\right) d\psi$$
• Using the modified Bessel function of the first kind of zeroth order,
$$I_0(x) = \frac{1}{2\pi}\int_0^{2\pi} \exp(x\cos\psi)\, d\psi$$
the density becomes
$$f_R(r) = \frac{r}{\sigma^2} \exp\left(-\frac{r^2 + A^2}{2\sigma^2}\right) I_0\left(\frac{Ar}{\sigma^2}\right)$$
• This is the Rician distribution. In normalized form, with $\vartheta = r/\sigma$ and $a = A/\sigma$:
$$f_V(\vartheta) = \vartheta \exp\left(-\frac{\vartheta^2 + a^2}{2}\right) I_0(a\vartheta)$$
• The Rician distribution reduces to the Rayleigh distribution for small $a$, and reduces to an approximate Gaussian distribution when $a$ is large.
Fig. Normalized Rician distribution.
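A minimal numerical check of the Rician density (A, sigma and the sample size are assumptions): the envelope of a constant A plus zero-mean Gaussian in-phase/quadrature noise is compared with f_R(r) above, using MATLAB's besseli for I_0.

A = 3; sigma = 1; M = 100000;            % assumed signal amplitude and noise sigma
nIp = A + sigma*randn(M,1);              % n_I' = A + n_I, mean A
nQ  = sigma*randn(M,1);                  % n_Q, zero mean
r = sqrt(nIp.^2 + nQ.^2);                % envelope of x(t)
histogram(r, 'Normalization', 'pdf')
hold on
rr = 0:0.01:A + 6*sigma;
fr = (rr/sigma^2).*exp(-(rr.^2 + A^2)/(2*sigma^2)).*besseli(0, A*rr/sigma^2);
plot(rr, fr, 'r', 'LineWidth', 1.5)
legend('simulated envelope', 'Rician density')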