Chapter-4 Combined
Chapter 4
Review of Random and Stochastic Processes
Review of Probability Theory
Probability theory is based on phenomena that can be modeled by an
experiment whose outcome is subject to chance.
Definition: A random experiment is repeated n times (n trials) and the
event A is observed m times (m occurrences). The probability of A is the
relative frequency of occurrence m/n as n grows large:
P(A) = \lim_{n \to \infty} \frac{m}{n}
Introduction to Probability and Random Variables
For example:
A 6-sided die has 6 outcomes.
3 of them are even,
Thus P(even) = 3/6
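The relative-frequency definition can be illustrated with a short simulation, a sketch in Python (the trial count and seed are arbitrary choices):

```python
import random

random.seed(0)

n = 100_000   # number of trials of the random experiment
m = sum(1 for _ in range(n) if random.randint(1, 6) % 2 == 0)

print(m / n)  # relative frequency m/n, close to P(even) = 3/6 = 0.5
```

As n grows, the relative frequency m/n converges to the probability 0.5.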
Axiomatic Definition of Probability
• A probability law (measure or function) assigns probabilities to events
such that:
  o P(A) ≥ 0
  o P(S) = 1
  o If A and B are disjoint (mutually exclusive) events,
    i.e. A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B)
If the sample space consists of n mutually exclusive events A_1, A_2, \ldots, A_n such that
S = A_1 \cup A_2 \cup \cdots \cup A_n, then P(A_1) + P(A_2) + \cdots + P(A_n) = P(S) = 1.
Joint and Marginal Probability
Joint probability:
is the likelihood of two events occurring together.
Joint probability is the probability of event A occurring at the same time
event B occurs.
It is P(A ∩ B) or P(AB).
Marginal probability:
is the probability of one event, ignoring any information about the other
event.
Thus P(A) and P(B) are the marginal probabilities of events A and B.
Conditional Probability
Let A and B be two events. The probability of event B given that event A
has occurred is called the conditional probability:
P(B \mid A) = \frac{P(A \cap B)}{P(A)}, \quad \text{provided } P(A) > 0.
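A sketch of joint, marginal, and conditional probability on the die example (the events A and B below are chosen purely for illustration):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}   # sample space: one roll of a fair die
A = {2, 4, 6}            # event A: the outcome is even
B = {4, 5, 6}            # event B: the outcome is greater than 3

def P(event):            # equally likely outcomes
    return Fraction(len(event), len(S))

P_AB = P(A & B)             # joint probability P(A ∩ B)  -> 1/3
P_B_given_A = P_AB / P(A)   # conditional P(B | A)        -> 2/3

print(P(A), P(B), P_AB, P_B_given_A)
```

Here the marginals P(A) and P(B) are each 1/2, while the joint probability P(A ∩ B) = 1/3 gives the conditional P(B | A) = 2/3.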
Important Random Variables
Uniform Random Variable. This is a model for continuous random variables whose
range [a, b] is known, but nothing else is known about the likelihood of the
values that the random variable can assume. Its density function is
f_X(x) = \frac{1}{b - a} \quad \text{for } a \le x \le b, \text{ and } 0 \text{ otherwise.}
For example, when the phase of a sinusoid is random it is usually modeled as a
uniform random variable between 0 and 2π.
Gaussian or Normal Random Variable. This is a continuous random variable
described by the density function
f_X(x) = \frac{1}{\sigma \sqrt{2\pi}} \, e^{-(x - \mu)^2 / (2\sigma^2)},
where \mu is the mean and \sigma^2 is the variance.
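As a quick numerical sanity check, the standard Gaussian density (assuming μ = 0, σ = 1 for illustration) integrates to one:

```python
import math

def gauss(x, mu=0.0, sigma=1.0):
    """Gaussian density exp(-(x-mu)^2 / (2 sigma^2)) / (sigma sqrt(2 pi))."""
    return math.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * math.sqrt(2 * math.pi))

# Riemann sum of the density over [-8, 8]; the tails beyond are negligible
dx = 1e-3
area = sum(gauss(-8.0 + i * dx) * dx for i in range(int(16 / dx)))
print(area)    # ≈ 1.0
```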
Joint cdf: F_{X,Y}(x, y) = \int_{-\infty}^{y} \int_{-\infty}^{x} f_{X,Y}(u, v)\, du\, dv

Joint pdf: f_{X,Y}(x, y) = \frac{\partial^2 F_{X,Y}(x, y)}{\partial x\, \partial y}, \qquad \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{X,Y}(u, v)\, du\, dv = 1

Marginal cdf: F_X(x) = \int_{-\infty}^{x} \int_{-\infty}^{\infty} f_{X,Y}(u, v)\, dv\, du, \qquad F_Y(y) = \int_{-\infty}^{y} \int_{-\infty}^{\infty} f_{X,Y}(u, v)\, du\, dv

Marginal pdf: f_X(x) = \int_{-\infty}^{\infty} f_{X,Y}(x, v)\, dv, \qquad f_Y(y) = \int_{-\infty}^{\infty} f_{X,Y}(u, y)\, du

Conditional pdf: f_Y(y \mid x) = \frac{f_{X,Y}(x, y)}{f_X(x)}
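The marginal-pdf relation can be checked numerically. A sketch in Python, assuming for illustration the joint density f(x, y) = e^(−x−y) for x, y ≥ 0 (two independent exponentials), whose marginal is f_X(x) = e^(−x):

```python
import numpy as np

# Assumed joint density for illustration: f(x, y) = exp(-x - y), x, y >= 0
v = np.linspace(0.0, 20.0, 4001)     # y-axis grid for the numerical integral
dv = v[1] - v[0]

x = 1.0
f_X = np.sum(np.exp(-x - v)) * dv    # f_X(x) = ∫ f(x, v) dv  (Riemann sum)

print(f_X, np.exp(-x))               # both close to e^(-1) ≈ 0.368
```

Integrating the second variable out of the joint density recovers the marginal density of the first.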
Statistical Averages
The expected value of a random variable is a measure of the average of the value
that the random variable takes in a large number of experiments.
Mean: \mu_X = E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\, dx

Mean-square value of X: E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\, dx

Central moments: E[(X - \mu_X)^n] = \int_{-\infty}^{\infty} (x - \mu_X)^n f_X(x)\, dx

Variance of X: \sigma_X^2 = E[(X - \mu_X)^2] = \int_{-\infty}^{\infty} (x - \mu_X)^2 f_X(x)\, dx = E[X^2] - \mu_X^2
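These moments can be estimated from samples. A sketch in Python for X uniform on (0, 1), where E[X] = 1/2, E[X²] = 1/3, and σ² = 1/12 (sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(0.0, 1.0, 1_000_000)    # samples of X ~ U(0, 1)

mean = x.mean()                          # E[X]            -> 1/2
mean_square = (x ** 2).mean()            # E[X^2]          -> 1/3
variance = mean_square - mean ** 2       # E[X^2] - mu^2   -> 1/12

print(mean, mean_square, variance)
```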
Joint Moments
Correlation: E[X^i Y^k] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x^i y^k f_{X,Y}(x, y)\, dx\, dy
- The expected value of the product
- Also seen as a weighted inner product
Covariance: \mathrm{cov}[X, Y] = E[(X - E[X])(Y - E[Y])] = E[XY] - \mu_X \mu_Y
- The correlation of the central moments
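The identity cov[X, Y] = E[XY] − μ_X μ_Y can be verified on simulated data (the linear model below is an arbitrary illustration):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
x = rng.normal(size=n)
y = 2.0 * x + rng.normal(size=n)     # Y correlated with X (illustrative model)

cov_direct = ((x - x.mean()) * (y - y.mean())).mean()   # E[(X-EX)(Y-EY)]
cov_moment = (x * y).mean() - x.mean() * y.mean()       # E[XY] - mu_X mu_Y

print(cov_direct, cov_moment)        # both ≈ 2.0 = cov[X, Y]
```

The two expressions are algebraically identical; for this model cov[X, Y] = 2 var(X) = 2.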
Stationary to second order:
R_X(t_1, t_2) = E[X(t_1) X(t_2)] depends only on the time difference, so
R_X(t_1, t_2) = R_X(t_1 - t_2).
Cont.
Mean-square value: R_X(0) = E[X^2(t)]
Symmetry: R_X(\tau) = R_X(-\tau), \qquad R_Y(u) = R_Y(-u)
Cross-correlation: R_{XY}(\tau) = R_{YX}(-\tau)
Power Spectral Density
Zero frequency of the PSD: S_X(0) = \int_{-\infty}^{\infty} R_X(\tau)\, d\tau
Mean-square value: E[X^2(t)] = \int_{-\infty}^{\infty} S_X(f)\, df
The PSD is non-negative: S_X(f) \ge 0
PSD of a real-valued RP is even: S_X(-f) = S_X(f)
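The mean-square property can be checked with a periodogram, the discrete analogue of the PSD (a sketch; sample size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 4096
x = rng.normal(size=n)                   # one realization of a noise process

psd = np.abs(np.fft.fft(x)) ** 2 / n     # periodogram: PSD estimate (fs = 1)

mean_square_time = np.mean(x ** 2)       # E[X^2(t)] from the waveform
mean_square_freq = psd.sum() / n         # area under the PSD (df = 1/n)

print(mean_square_time, mean_square_freq)   # equal, by Parseval's theorem
```

The periodogram is also non-negative at every frequency, matching the property above.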
Example
Mixing of a random process with a sinusoidal process:
Y(t) = X(t) \cos(2\pi f_c t + \Theta), \quad \Theta \text{ uniform on } [0, 2\pi)
◦ Autocorrelation:
R_Y(\tau) = E[Y(t)\, Y(t + \tau)] = \frac{1}{2} R_X(\tau) \cos(2\pi f_c \tau)
◦ PSD:
S_Y(f) = \frac{1}{4} \left[ S_X(f - f_c) + S_X(f + f_c) \right]
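A quick simulation of the mixing result: taking τ = 0, the autocorrelation formula says the mixer output has half the input power, R_Y(0) = ½ R_X(0) (the carrier frequency and sample count below are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000
fc = 0.25                                    # carrier frequency (fs = 1)
t = np.arange(n)

x = rng.normal(size=n)                       # input process X(t), R_X(0) = 1
theta = rng.uniform(0.0, 2.0 * np.pi)        # random phase of the carrier
y = x * np.cos(2.0 * np.pi * fc * t + theta)

print(np.mean(x ** 2), np.mean(y ** 2))      # ≈ 1.0 and ≈ 0.5
```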
Gaussian Process
The Gaussian probability density function for a single variable is
f_X(x) = \frac{1}{\sigma \sqrt{2\pi}} \, e^{-(x - \mu)^2 / (2\sigma^2)}
◦ The experiment can be repeated many times and the average taken over all these
functions. Such an average is called ensemble average.
◦ Take any one of these function as being representative of the ensemble and find
the average from a number of samples of this one function. This is called a time
average.
Ergodicity & Stationarity
If the time average and ensemble average of a random function are the same, it is
said to be ergodic.
A random function is said to be stationary if its statistics do not change as a
function of time.
◦ This is also called strict sense stationarity (vs. wide sense stationarity).
Any ergodic function is also stationary.
For a stationary signal the mean does not depend on time:
\overline{x(t)} = \bar{x} = \text{constant}
When x(t) is ergodic, its mean and autocorrelation are the time averages:

\bar{x} = \langle x(t) \rangle = \lim_{N \to \infty} \frac{1}{2N} \int_{-N}^{N} x(t)\, dt

r(\tau) = \langle x(t)\, x(t + \tau) \rangle = \lim_{N \to \infty} \frac{1}{2N} \int_{-N}^{N} x(t)\, x(t + \tau)\, dt
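A sketch contrasting the two kinds of average for a simple ergodic process (here white noise with mean 3; the ensemble and record sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
mu = 3.0                                       # true process mean
ensemble = mu + rng.normal(size=(500, 2000))   # 500 sample functions, 2000 samples each

time_avg = ensemble[0].mean()         # average along a single sample function
ensemble_avg = ensemble[:, 0].mean()  # average across the ensemble at one instant

print(time_avg, ensemble_avg)         # both ≈ 3.0: ergodic in the mean
```

For an ergodic process the time average of one realization agrees with the ensemble average, which is what lets a single recorded waveform stand in for the whole ensemble.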
Cross-correlation
The cross-correlation of two ergodic random functions is:

r_{xy}(\tau) = \langle x(t)\, y(t + \tau) \rangle = \lim_{N \to \infty} \frac{1}{2N} \int_{-N}^{N} x(t)\, y(t + \tau)\, dt
Power & Cross Spectral Density
Part 2
Review of Types of Noise and Calculation of Noise
Noise
Noise is a random signal that exists in communication systems.
Noise in electrical terms may be defined as any unwanted introduction of energy tending to interfere
with the proper reception and reproduction of transmitted signals.
Cont.
• Practically, we cannot avoid the existence of unwanted signal together with the
modulated signal transmitted by the transmitter.
• The existence of noise will degrade the level of quality of the received signal
at the receiver.
Noise effect
• Degrades system performance for both analog and digital system.
1. External Noise
a) Atmospheric Noise: Atmospheric noise, also known as static noise, is the
natural disturbance caused by lightning, discharges in thunderstorms, and
other natural electrical disturbances occurring in nature.
Cont.
b) Extraterrestrial Noise: Extraterrestrial noise is classified by its originating
source. It is subdivided into
i) Solar Noise
ii) Cosmic Noise
Solar noise is the noise that originates from the sun.
The sun radiates a broad spectrum of frequencies, including those which are used for
broadcasting.
c) Man-made (Industrial) Noise: Noise made by man easily outstrips any other source
between the frequencies of 1 to 600 MHz.
It includes such things as car and aircraft ignition, electric motors, switching equipment,
leakage from high-voltage lines, etc.
2. Internal Noise
This is the noise generated by any of the active or passive devices found in
the receiver.
This type of noise is random and difficult to treat on an individual basis but
can be described statistically.
Random noise power is proportional to the bandwidth over which it is
measured.
Types of internal noise
Internal noise is noise generated internally, within the communication system
or in the receiver.
Internal noises are classified as:
a) Shot Noise: This noise generally arises in active devices due to the random
behavior of charge particles or carriers. In an electron tube, shot noise is
produced by the random emission of electrons from the cathode.
b) Partition Noise: When the current in a circuit divides between two or more paths,
the noise generated is known as partition noise. It is caused by random
fluctuations in the division.
Cont.
c) Low-Frequency Noise: Also known as FLICKER NOISE, this type of noise is
generally observed at frequencies below a few kHz. Its power spectral density
increases as the frequency decreases, which is why it is called low-frequency
noise.
d) High-Frequency Noise: Also known as TRANSIT-TIME noise, it is observed in
semiconductor devices when the transit time of a charge carrier crossing a
junction is comparable to the period of the signal.
e) Thermal Noise: Thermal noise is random and often referred to as White Noise or
Johnson Noise. It is generally observed in resistors or in the sensitive resistive
components of a complex impedance, due to the random and rapid movement of
molecules, atoms, and electrons.
Cont.
• Electronic noise is generated by the thermal agitation of the charge carriers (the
electrons) inside an electrical conductor in equilibrium, which happens regardless of
any applied voltage.
• The movement of the electrons gives them kinetic energy that is related to the
temperature of the conductor.
• When the temperature increases, the movement of the free electrons increases, and
current flows through the conductor.
• This current due to the free electrons creates a noise voltage n(t).
• The noise voltage n(t) is influenced by temperature and is therefore called thermal
noise.
Proof:
Consider two resistors R_1 (at temperature T_1) and R_2 (at temperature T_2)
connected in parallel. Each resistor's noise source sees the other resistor as a
load, forming a voltage divider:

V_{o1} = V_{n1} \frac{R_2}{R_1 + R_2}, \qquad V_{o2} = V_{n2} \frac{R_1}{R_1 + R_2}

The two noise sources are independent, so their mean-square output voltages add:

\overline{V_n^2} = \overline{V_{o1}^2} + \overline{V_{o2}^2}

Substituting \overline{V_{n1}^2} = 4 k T_1 R_1 B and \overline{V_{n2}^2} = 4 k T_2 R_2 B:

\overline{V_n^2} = \frac{4 k B \left( T_1 R_1 R_2^2 + T_2 R_2 R_1^2 \right)}{(R_1 + R_2)^2} = \frac{4 k B\, R_1 R_2 \left( T_1 R_2 + T_2 R_1 \right)}{(R_1 + R_2)^2}

For T_1 = T_2 = T this reduces to

\overline{V_n^2} = 4 k T B \frac{R_1 R_2}{R_1 + R_2} = 4 k T B R_{par}
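A numerical sketch of the result: the rms thermal-noise voltage of two 1 kΩ resistors in parallel at room temperature over a 1 MHz bandwidth (the component values are arbitrary examples):

```python
import math

k = 1.380649e-23        # Boltzmann constant, J/K
T = 290.0               # room temperature, K
B = 1.0e6               # measurement bandwidth, Hz
R1, R2 = 1.0e3, 1.0e3   # resistor values, ohms (arbitrary example)

R_par = R1 * R2 / (R1 + R2)                 # parallel combination, 500 ohms
v_rms = math.sqrt(4.0 * k * T * B * R_par)  # sqrt of V_n^2 = 4kTBR_par

print(v_rms)            # ≈ 2.8e-6 V: thermal noise is a few microvolts
```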
Spectral densities of thermal noise
White noise
Noise in an idealized form is known as WHITE NOISE.
White noise contains all frequency components in equal amounts, just as white light
consists of all colors of light.
If the probability distribution of occurrence of a white noise is specified by a Gaussian
distribution function, then it is called white Gaussian noise.
Since the power spectral density of thermal and shot noise is independent of frequency,
they are referred to as white Gaussian noise.
The power spectral density of white noise is expressed as
PSD = POWER / BW, so POWER = PSD × BW
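A trivial numeric instance of POWER = PSD × BW (the values below are illustrative; the PSD is taken as kT at 290 K):

```python
k = 1.380649e-23    # Boltzmann constant, J/K
N0 = k * 290.0      # one-sided white-noise PSD ≈ 4.0e-21 W/Hz at 290 K
BW = 2.0e6          # receiver bandwidth, Hz

P = N0 * BW         # POWER = PSD x BW
print(P)            # ≈ 8.0e-15 W of noise power in the band
```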
Signal to noise ratio, noise figure, noise temperature,
calculation of noise figure.
Signal to noise ratio:
\mathrm{SNR} = \frac{\text{average signal power}}{\text{average noise power}}
Noise figure (the noise factor F, usually quoted in dB):
F = \frac{\mathrm{SNR}_{\text{in}}}{\mathrm{SNR}_{\text{out}}}, \qquad NF = 10 \log_{10} F
NOTE:
If several devices are cascaded, the total noise factor can be found with Friis' formula:
F = F_1 + \frac{F_2 - 1}{G_1} + \frac{F_3 - 1}{G_1 G_2} + \cdots + \frac{F_n - 1}{G_1 G_2 \cdots G_{n-1}}
where F_n is the noise factor for the n-th device, and G_n is the power gain (linear, not in dB) of the n-th
device.
The first amplifier in a chain usually has the most significant effect on the total noise figure, because the
noise contributions of the following stages are reduced by the preceding stage gains.
Friis formula for noise factor
Friis's formula is used to calculate the total noise factor of a cascade of stages, each with its own noise
factor and power gain (assuming that the impedances are matched at each stage).
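A sketch of Friis' formula, also showing why the low-noise stage should come first in the chain (the stage noise figures and gains below are arbitrary examples):

```python
import math

def friis_noise_factor(stages):
    """Total noise factor of a cascade; stages = [(F, G), ...], linear ratios."""
    total, gain = stages[0][0], stages[0][1]
    for F, G in stages[1:]:
        total += (F - 1.0) / gain   # each stage's excess noise, reduced by prior gain
        gain *= G
    return total

def db(x):
    return 10.0 * math.log10(x)

lna = (10 ** (1.0 / 10), 100.0)     # NF 1 dB, gain 20 dB
mixer = (10 ** (6.0 / 10), 10.0)    # NF 6 dB, gain 10 dB

print(db(friis_noise_factor([lna, mixer])))   # ≈ 1.10 dB: LNA first
print(db(friis_noise_factor([mixer, lna])))   # ≈ 6.03 dB: LNA last
```

Putting the high-gain, low-noise stage first keeps the cascade noise figure close to that of the first stage alone.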