Module 1: Analog Communication
Analog Communication / 23ECPC208
Syllabus
• Module 1:
Random Variables and Processes
Random Variables: Introduction, Probability, Conditional
probability, Random variables, Statistical Averages, Function of a
random variable, Moments
Random Process: Mean, Correlation and covariance function,
Properties of autocorrelation function, Cross–correlation functions,
Gaussian process, Gaussian distribution function
Laboratory Components: (Using Simulation Tools)
• Generate the probability density function of the Gaussian
distribution function
• To plot the correlation between waves
• Module 2:
Amplitude Modulation
Communication System: Introduction to communication System,
AM concepts, Modulation index and percentage of modulation,
Sidebands and the frequency domain, AM power, DSB-SC
modulation, Single sideband modulation, VSB modulation,
Balanced modulators
Laboratory Components: (Using Discrete Components)
• To design and verify amplitude modulation for a given modulation
index
• To design and verify DSB-SC modulation
• Module 3:
Noise in Modulation Systems
Noise: Introduction, Shot noise, Thermal noise, White noise, Noise
equivalent bandwidth, Narrowband noise, Noise figure, Equivalent
noise temperature
Noise in continuous wave modulation systems: Receiver model,
Noise in DSB-SC receivers, Noise in SSB receivers, Noise in AM
receivers, Threshold effect, Noise in FM receivers
Laboratory Components: (Using Simulation Tools)
• To find the Signal to Noise Ratio of a given signal
• Module 4:
Angle Modulation
Fundamentals of Frequency Modulation: Basic principles of
frequency modulation, Principles of phase modulation, Modulation
index and sidebands, Noise suppression effects of FM, Frequency
modulation versus Amplitude modulation. Ratio discriminator, FM
threshold effect, Pre-emphasis and De-emphasis
Laboratory Components: (Using Discrete Components)
• To design and verify Frequency modulation
• To design and verify Pre-emphasis and De-emphasis
• Module 5:
Radio Transmission Concepts
• Radio Transmitters: Transmitter fundamentals, Transmitter
configurations, Carrier generators, Crystal oscillators, Frequency
synthesizers, Phase locked loop synthesizers
• Communication Receivers: Basic principles of signal
reproduction, Superheterodyne receivers, Squelch, Frequency
conversion: Mixing principles, Mixer and converter designs, Local
oscillators and frequency synthesizers, Intermediate frequency and
images
Laboratory Components: (Using Simulation Tools / Discrete
Components)
• Design and test a mixer used for frequency translation
Text Books:
1. Herbert Taub, Donald L. Schilling, “Principles of Communication
Systems”, 2nd Edition, McGraw-Hill, 2017
2. Rodger E. Ziemer, William H. Tranter, “Principles of Communications:
Systems, Modulation and Noise”, 7th Edition, Wiley Publications,
2015
3. George Kennedy, “Electronic Communication Systems”, 5th Edition,
McGraw-Hill, 2012
4. Simon Haykin, Michael Moher, “Communication Systems”, 5th Edition,
John Wiley India Pvt. Ltd, 2010
Reference Books:
1. B P Lathi, Zhi Ding, “Modern Digital and Analog Communication
Systems”, 4th Edition, Oxford University Press, 2010
2. Masoud Salehi, John G. Proakis, “Communication System
Engineering”, 2nd Edition, Prentice Hall, 2018
3. A. Michael Noll, “Introduction to Telephones and Telephone
Systems” 3rd Edition, Artech House Publishers, 2005
4. Louis E Frenzel, “Principles of Electronic Communication Systems”,
3rd Edition, McGraw-Hill (India), 2016
Introduction
• There are two types of signals:
• Deterministic signals: the class of signals that may be
modeled as completely specified functions of time.
• Random signals: encountered in every practical
communication system. A signal is random if it is not possible to
predict its precise value in advance.
Example: In a radio communication system, the received signal
consists of an information-bearing signal, a random interference
component, and receiver noise.
Since it is not possible to predict the precise value of a random
signal in advance, it is described in terms of its statistical
properties, such as the average power in the random signal or the
average spectral distribution of this power.
• The mathematical discipline that deals with the statistical
characterization of random signals is probability theory.
• Random process: It consists of an ensemble (family) of sample
functions, each of which varies randomly with time.
• Random variable: It is obtained by observing a random
process at a fixed instant of time.
Probability
• Probability theory is rooted in phenomena that can be
modeled by an experiment with an outcome that is subject to
chance.
• If the experiment is repeated, the outcome can differ because
of the influence of an underlying random phenomenon or chance
mechanism. Such an experiment is referred to as a random
experiment.
Example: In tossing a fair coin, the possible outcomes of the
trials are “heads” or “tails”.
• There are two approaches to the definition of probability.
• The first approach is based on relative frequency of occurrence:
if, in “n” trials of a random experiment, an event “A” occurs “m”
times, then we assign the probability “m/n” to the event “A”
(a short simulation of this convergence is sketched below).
• In the second approach, a definition based on set theory and a
set of related mathematical axioms is used.
• In situations where experiments are repeatable, the set theory
approach agrees completely with the relative frequency of
occurrence approach.
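The syllabus specifies simulation tools without naming one; the following minimal sketch assumes Python with NumPy. It simulates n tosses of a fair coin and shows the relative frequency m/n of the event A = {heads} approaching P[A] = 0.5 as n grows.

import numpy as np

rng = np.random.default_rng(seed=1)
for n in (10, 100, 1000, 100000):
    tosses = rng.integers(0, 2, size=n)   # 1 = heads, 0 = tails
    m = tosses.sum()                      # number of trials in which A occurred
    print(f"n = {n:6d}   m/n = {m / n:.4f}")   # tends toward 0.5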
• Consider an experiment and its possible outcomes as defining
a space and its points.
• If an experiment has K possible outcomes, then for the kth
possible outcome we have a point called the sample point, denoted
by s_k.
• With these, we have the following definitions (axioms):
The set of all possible outcomes of the experiment is called
the sample space, denoted by S.
An event corresponds to either a single sample point or a set
of sample points in the space S.
A single sample point is called an elementary event.
The entire sample space S is called the sure event; the null
set ø is called the null or impossible event.
Two events are mutually exclusive if the occurrence of one
event precludes the occurrence of the other event.
• The sample space S may be discrete with a countable number of
outcomes, such as the outcomes of tossing a die.
• The sample space S may be continuous, such as the voltage
measured at the output of a noise source.
• A probability measure “P” is a function that assigns a non-negative
number to an event “A” in the sample space “S” and satisfies the
following three properties (axioms):
1. 0 ≤ P[A] ≤ 1
2. P[S] = 1
3. If “A” and “B” are two mutually exclusive events, then
P[A ∪ B] = P[A] + P[B]
• Consider the probability system illustrated in Figure 5.1.
• The sample space S is mapped to events via the random experiment.
• The probability function assigns a value between 0 and 1 to each of
these events.
• The probability assigned is not unique to the event:
mutually exclusive events may be assigned the same probability.
• The probability of the union of all events, the sure event, is always unity.
• The three axioms and their relationship to the relative frequency
interpretation may be illustrated by the Venn diagram of Figure 5.1.
• If we equate P to a measure of area in the Venn diagram, with
the total area of “S” equal to 1, then the axioms are simple
statements of familiar geometric results regarding area.
• The following properties of the probability measure P may be
derived from the above axioms:
1. P[A′] = 1 − P[A], where A′ is the complement of the event A.
2. When events A and B are not mutually exclusive, the
probability of the union event “A or B” satisfies
P[A ∪ B] = P[A] + P[B] − P[A ∩ B]
where P[A ∩ B] is the probability of the joint event “A and B”
(a quick numerical check of this property follows the list).
3. If A1, A2, ..., Am are mutually exclusive events
that include all possible outcomes of the random
experiment, then
P[A1] + P[A2] + ... + P[Am] = 1
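As a sanity check on property 2, this minimal sketch (again assuming Python with NumPy, an assumption not stated in the notes) estimates both sides of the union rule for one roll of a fair die, with A = {roll is even} and B = {roll ≥ 4}, which are not mutually exclusive since they share 4 and 6.

import numpy as np

rng = np.random.default_rng(seed=2)
rolls = rng.integers(1, 7, size=200_000)   # fair die rolls

A = rolls % 2 == 0                         # event A: roll is even
B = rolls >= 4                             # event B: roll >= 4
p_union = np.mean(A | B)                                 # estimate of P[A ∪ B]
p_rule = np.mean(A) + np.mean(B) - np.mean(A & B)        # P[A] + P[B] − P[A ∩ B]
print(p_union, p_rule)                     # both close to 4/6 ≈ 0.667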
Conditional Probability:
• Suppose that in an experiment two events ‘A’ and ‘B’ can
occur.
• Let P[B|A] denote the probability of event ‘B’ given that ‘A’ has
occurred; P[B|A] is the conditional probability of B given A:
P[B|A] = P[A ∩ B] / P[A]
where P[A ∩ B] is the joint probability of A and B. Hence,
P[A ∩ B] = P[B|A] P[A]
P[A ∩ B] = P[A|B] P[B]
• The joint probability of two events may thus be expressed as the
product of the conditional probability of one event given the other
and the elementary probability of the other.
• Provided P[A] is not equal to 0,
P[B|A] = P[A|B] P[B] / P[A]
which is a special form of Bayes' rule.
• Suppose P[B|A] is equal to the elementary probability of
occurrence of event B, i.e.,
P[B|A] = P[B]
• Then the joint probability P[A ∩ B] becomes
P[A ∩ B] = P[A] P[B]
and the events A and B are said to be statistically independent
(the sketch below estimates a conditional probability by simulation).
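A minimal sketch, assuming Python with NumPy (no tool is specified in the notes): it estimates P[B|A] as P[A ∩ B] / P[A] for one roll of a fair die, with A = {roll > 3} and B = {roll is even}; the exact answer is 2/3.

import numpy as np

rng = np.random.default_rng(seed=3)
rolls = rng.integers(1, 7, size=200_000)   # fair die rolls

A = rolls > 3                              # event A: roll in {4, 5, 6}
B = rolls % 2 == 0                         # event B: roll is even
p_B_given_A = np.mean(A & B) / np.mean(A)  # P[A ∩ B] / P[A]
print(p_B_given_A)                         # close to 2/3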
Random Variables:
• A function whose domain is a sample space and whose range is a set
of real numbers is called a random variable of the
experiment.
• Example: In tossing a coin, heads corresponds to ‘1’ and tails to
‘0’.
• These numbers are assigned to the outcomes of the experiment.
• If the outcome of the experiment is ‘s’, the random variable is
written X(s), or simply X.
• A particular outcome of the random experiment yields X(s_k) = x.
• Note: there may be more than one random variable associated
with the same random experiment.
• A random variable can take discrete values or continuous
values.
• If the random variable takes only a countable number of discrete
values, it is called a discrete random variable.
• If the random variable takes continuous values, it is called a
continuous random variable.
• Let ‘X’ be a random variable and consider the probability of the
event X ≤ x, i.e., P[X ≤ x].
• The function
F_X(x) = P[X ≤ x]
is the cumulative distribution function (CDF).
• Where, 1) F_X(x) is bounded between zero and one.
• 2) F_X(x) is a monotone non-decreasing function of x,
i.e., F_X(x1) ≤ F_X(x2) if x1 < x2
• The derivative of the CDF is the probability density function (PDF):
f_X(x) = d F_X(x) / dx
(the sketch below plots both functions for the Gaussian case, in the
spirit of the Module 1 lab component).
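A minimal sketch of the CDF–PDF relationship, assuming Python with NumPy, SciPy, and Matplotlib (the lab only says "simulation tools"): it plots the Gaussian PDF f_X(x) and CDF F_X(x), and checks numerically that f_X(x) is the derivative of F_X(x).

import numpy as np
from scipy.stats import norm
import matplotlib.pyplot as plt

x = np.linspace(-4, 4, 401)
pdf = norm.pdf(x, loc=0, scale=1)      # f_X(x) for mean 0, standard deviation 1
cdf = norm.cdf(x, loc=0, scale=1)      # F_X(x)

# the numerical derivative of the CDF should match the PDF
print(np.max(np.abs(np.gradient(cdf, x) - pdf)))   # small residual

plt.plot(x, pdf, label="PDF f_X(x)")
plt.plot(x, cdf, label="CDF F_X(x)")
plt.legend(); plt.grid(True); plt.show()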
Several Random Variables
• A few experiments require several random variables for their
description.
• Let ‘X’ and ‘Y’ be two random variables. Their joint
distribution function is F_{X,Y}(x, y), where x and y are specified
values.
• The joint distribution function F_{X,Y}(x, y) is defined as the
probability that the random variable X is less than or equal to a
specified value x and that the random variable Y is less than or
equal to a specified value y.
• The outcome of the experiment is then a sample point lying
inside the corresponding region of the joint sample space,
i.e., F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y]
• Suppose that the joint distribution function F_{X,Y}(x, y) is
continuous everywhere and that its partial derivative
f_{X,Y}(x, y) = ∂²F_{X,Y}(x, y) / ∂x ∂y
exists and is continuous everywhere; f_{X,Y}(x, y) is called the
joint probability density function (joint PDF).
• The joint distribution function F_{X,Y}(x, y) is a monotone
non-decreasing function of both x and y.
• The total volume under the graph of the joint probability density
function must be unity, i.e.,
∫∫ f_{X,Y}(x, y) dx dy = 1
(with both integrals taken from −∞ to ∞; a numerical check is
sketched below).
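A minimal numerical check of the unit-volume property, assuming Python with NumPy and SciPy (an assumption, since no tool is named): it integrates an example joint Gaussian PDF over a grid and verifies that the result is close to 1.

import numpy as np
from scipy.stats import multivariate_normal
from scipy.integrate import trapezoid

x = np.linspace(-6, 6, 301)
y = np.linspace(-6, 6, 301)
X, Y = np.meshgrid(x, y)
# an example joint Gaussian PDF with correlated components
pdf = multivariate_normal(mean=[0, 0], cov=[[1, 0.5], [0.5, 1]]).pdf(np.dstack((X, Y)))
volume = trapezoid(trapezoid(pdf, y, axis=0), x)   # double integral over the grid
print(volume)                                      # ≈ 1.0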
Statistical Averages
• The mean of a random variable X is
µ_X = E[X] = ∫ x f_X(x) dx (integral taken from −∞ to ∞)
where E denotes the statistical expectation operator; the mean µ_X
gives the center of gravity of the area under the
probability density function.
Functions of a Random Variable
• Let ‘X’ denote a random variable and let g(X) denote a real-valued
function of X:
Y = g(X)
• Then the expected value of Y, whose PDF is f_Y(y), is
E[Y] = ∫ y f_Y(y) dy = ∫ g(x) f_X(x) dx (integrals taken from −∞ to ∞)
Question:
The random variable Y is a function of another random
variable ‘X’ such that Y = cos(X), where X is a random
variable uniformly distributed in the interval (−π, π). Find the
expected value of Y.
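A short worked solution, assuming the question asks for E[Y] (the statement is truncated in the notes): X uniform on (−π, π) has f_X(x) = 1/(2π) on that interval, so

\[
E[Y] = \int_{-\pi}^{\pi} \cos x \,\frac{1}{2\pi}\, dx
     = \frac{1}{2\pi}\bigl[\sin x\bigr]_{-\pi}^{\pi}
     = \frac{1}{2\pi}(0 - 0) = 0
\]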
• The variance of a random variable X is denoted
var(X) = σ_X² = E[(X − µ_X)²] = ∫ (x − µ_X)² f_X(x) dx
(integral taken from −∞ to ∞)
• The square root of the variance, σ_X, is the standard deviation of
the random variable X (a simulation-based estimate is sketched below).
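A minimal sketch, assuming Python with NumPy: it estimates the mean and variance of a Gaussian random variable from samples and compares them with the analytic values µ_X = 1.0 and σ_X² = 4.0 chosen for this example.

import numpy as np

rng = np.random.default_rng(seed=4)
x = rng.normal(loc=1.0, scale=2.0, size=500_000)   # samples of X

print("sample mean    :", x.mean())   # ≈ µ_X = 1.0
print("sample variance:", x.var())    # ≈ σ_X² = E[(X − µ_X)²] = 4.0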
Gaussian Process
• Property 1
If the input X(t) to a stable linear filter is a Gaussian process,
then the output Y(t) of the filter is also a Gaussian process
(a simulation of this property is sketched below).
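A minimal sketch of Property 1, assuming Python with NumPy: Gaussian white noise is passed through a simple linear filter (a moving average), and the normalized third and fourth moments of the output are checked against the Gaussian values (skewness ≈ 0, kurtosis ≈ 3).

import numpy as np

rng = np.random.default_rng(seed=5)
x = rng.normal(size=500_000)            # Gaussian input process X(t)
h = np.ones(8) / 8                      # impulse response of a linear filter
y = np.convolve(x, h, mode="valid")     # output process Y(t)

y0 = (y - y.mean()) / y.std()           # normalize the output samples
print("skewness:", np.mean(y0**3))      # ≈ 0 for a Gaussian process
print("kurtosis:", np.mean(y0**4))      # ≈ 3 for a Gaussian process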
• Property 2
Consider the set of random variables (samples) X(t1), X(t2),
X(t3), ..., X(tn), obtained by observing a random process
X(t) at times t1, t2, t3, ..., tn. If the process X(t) is Gaussian,
then this set of random variables is jointly Gaussian for any n,
with their n-fold joint probability density function being
completely determined by specifying the set of means and the set of
covariance functions of these random variables (the sketch below
draws such jointly Gaussian samples).
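A minimal sketch of Property 2, assuming Python with NumPy and an illustrative exponential covariance (both assumptions, not from the notes): samples of a Gaussian process at times t1..tn are drawn from the jointly Gaussian density specified only by a mean vector and a covariance matrix.

import numpy as np

rng = np.random.default_rng(seed=6)
t = np.array([0.0, 0.5, 1.0])                    # observation instants t1, t2, t3
mean = np.zeros(len(t))                          # set of means
cov = np.exp(-np.abs(t[:, None] - t[None, :]))   # an assumed covariance matrix
samples = rng.multivariate_normal(mean, cov, size=10_000)
print(np.cov(samples.T))                         # sample covariance ≈ cov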