
Adama Science & Technology University

School of Electrical Engineering and Computing


Department of Electronics and Communication Engineering

Introduction to Communication Systems


(ECE- 3202)

Chapter 4
Review On Random And Stochastic Process
Review of Probability Theory
Probability theory deals with phenomena that can be modeled by an
experiment whose outcome is subject to chance.
Definition: A random experiment is repeated n times (n trials) and the
event A is observed m times (m occurrences). The probability of A is the
relative frequency of occurrence m/n as n grows large.
Introduction to Probability and Random Variables

A deterministic signal can be described by an explicit mathematical expression.


A deterministic model (or system) will always produce the same output
from a given starting condition or initial state.

Stochastic (random) signals or processes:


•Counterpart to a deterministic process
•Described in a probabilistic way
•Given an initial condition, many realizations of the process exist
Cont.
Define the probability of an event A as:

P(A) = NA / N

where N is the number of possible outcomes of the random experiment and
NA is the number of outcomes favorable to the event A.

For example:
A 6-sided die has 6 outcomes.
3 of them are even,
Thus P(even) = 3/6
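
As an illustration (not part of the slides), the two definitions can be compared numerically. The minimal sketch below simulates a fair die and checks that the relative frequency m/n approaches the classical value NA/N = 3/6:

```python
# A minimal sketch (assumed fair 6-sided die): estimating P(even) by
# relative frequency and comparing with the classical ratio N_A / N.
import random

n_trials = 100_000
m = sum(1 for _ in range(n_trials) if random.randint(1, 6) % 2 == 0)

print("relative frequency m/n:", m / n_trials)  # approaches 0.5 as n grows
print("classical N_A/N:", 3 / 6)
```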
Axiomatic Definition of Probability
•A probability law (measure or function) that assigns probabilities to events
such that:
oP(A) ≥ 0
oP(S) =1
oIf A and B are disjoint events (mutually exclusive),
i.e. A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B)

That is, if A happens, B cannot occur.


Some Useful Properties

o P(∅) = 0 : probability of the impossible event

o P(Ā) = 1 − P(A), where Ā is the complement of A

o If A and B are two events, then P(A ∪ B) = P(A) + P(B) − P(A ∩ B)

o If the sample space consists of n mutually exclusive events A1, …, An such that
A1 ∪ A2 ∪ … ∪ An = S, then P(A1) + P(A2) + … + P(An) = 1
Joint and Marginal Probability
Joint probability:
is the likelihood of two events occurring together.
Joint probability is the probability of event A occurring at the same time
event B occurs.
It is P(A ∩ B) or P(AB).

Marginal probability:
is the probability of one event, ignoring any information about the other
event.
Thus P(A) and P(B) are the marginal probabilities of events A and B.
Conditional Probability
Let A and B be two events. The probability of event B given that event A
has occurred is called the conditional probability:

P(B|A) = P(A ∩ B) / P(A)

If the occurrence of B has no effect on A, we say A and B are independent events.
In this case P(B|A) = P(B).
Combining both, we get P(A ∩ B) = P(A) P(B) when A and B are independent.
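
A small sketch (not from the slides) can check these relations by counting outcomes. The events below, "first die is even" and "second die is even" for two fair dice, are illustrative assumptions:

```python
# Verify P(B|A) = P(A∩B)/P(A) and independence over a 36-outcome space.
from itertools import product

sample_space = list(product(range(1, 7), repeat=2))  # two fair dice
A = {s for s in sample_space if s[0] % 2 == 0}       # first die even
B = {s for s in sample_space if s[1] % 2 == 0}       # second die even

P = lambda E: len(E) / len(sample_space)
print("P(B|A)   =", P(A & B) / P(A))                 # 0.5
print("P(B)     =", P(B))                            # 0.5 -> independent
print("P(A)P(B) =", P(A) * P(B), "= P(A∩B) =", P(A & B))
```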
Random Variables
Definition: A random variable is a rule that assigns a numerical value to each
outcome of a random experiment.
X(s) denotes the numerical value assigned to the outcome s.
When the sample space is the real number line, x = s.
Cont.
Definition: The cumulative distribution function (cdf) assigns a probability
value for the occurrence of x within a specified range such that FX(x) = P[X
≤ x].
Properties:
◦ 0 ≤ FX(x) ≤ 1
◦ FX(x1) ≤ FX(x2), if x1 ≤ x2

For discrete random variables FX(x) is a staircase function.


Examples of CDFs for discrete, continuous, and mixed random variables
are shown in the figures.
Cont.
Definition: The probability density function (pdf) is an alternative
description of the probability of the random variable X: fX(x) = d/dx FX(x)

P[x1 ≤ X ≤ x2] = P[X ≤ x2] − P[X ≤ x1]
             = FX(x2) − FX(x1)
             = ∫ fX(x) dx over the interval [x1, x2]
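
A quick numeric check of this identity (an illustration, not from the slides) is shown below for a standard Gaussian; the use of SciPy is an assumption of this example:

```python
# P[x1 <= X <= x2] = F_X(x2) - F_X(x1) = integral of f_X over [x1, x2].
from scipy.stats import norm
from scipy.integrate import quad

x1, x2 = -1.0, 1.0
via_cdf = norm.cdf(x2) - norm.cdf(x1)
via_pdf, _ = quad(norm.pdf, x1, x2)   # numeric integral of the pdf
print(via_cdf, via_pdf)               # both ~0.6827
```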
Important Random Variables.
Bernoulli Random Variable. This is a discrete random variable taking the two
values one and zero with probabilities p and 1 − p, respectively.
It is a good model for a binary data generator.
A Bernoulli random variable can also be employed to model channel
errors.
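
A minimal sketch of the channel-error model follows; the bit-error probability p = 0.01 is an assumed illustrative value:

```python
# Each transmitted bit is flipped with probability p (Bernoulli error model).
import random

p = 0.01
bits = [random.randint(0, 1) for _ in range(10_000)]
received = [b ^ (random.random() < p) for b in bits]   # XOR flips with prob p

errors = sum(b != r for b, r in zip(bits, received))
print("empirical error rate:", errors / len(bits))     # close to p
```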
Important Random Variables.
Binomial Random Variable. This is a discrete random variable giving the
number of 1’s in a sequence of n independent Bernoulli trials.
The PMF is given by

P[X = k] = C(n, k) p^k (1 − p)^(n−k),  k = 0, 1, …, n

where C(n, k) = n!/(k!(n−k)!) is the binomial coefficient.
This random variable models, for example,


the total number of bits received in error when
a sequence of n bits is transmitted over a
channel with bit-error probability of p.
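
The PMF above can be tabulated directly; n = 8 bits and p = 0.1 below are assumed example values:

```python
# PMF of the number of bit errors in n transmitted bits (binomial).
from math import comb

n, p = 8, 0.1
for k in range(n + 1):
    pmf = comb(n, k) * p**k * (1 - p)**(n - k)
    print(f"P[{k} errors] = {pmf:.4f}")
# The probabilities sum to 1; P[0 errors] = 0.9**8 ≈ 0.4305.
```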
Important Random Variables.
Uniform Random Variable. This is a continuous random variable taking values
between a and b with equal probabilities over intervals of equal length.
The density function is given by

fX(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.

This is a model for continuous random variables whose range is known, but
nothing else is known about the likelihood of various values that the random
variable can assume.
For example, when the phase of a sinusoid is random it is usually modeled as a
uniform random variable between 0 and 2π.
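
A short sketch of the random-phase model (values and the sanity check are illustrative, not from the slides):

```python
# Draw a carrier phase uniformly over [0, 2π), i.e. f(θ) = 1/(2π).
import random, math

theta = random.uniform(0.0, 2.0 * math.pi)
print("one random phase (rad):", theta)

# Sanity check: the sample mean of many phases approaches (0 + 2π)/2 = π.
phases = [random.uniform(0.0, 2.0 * math.pi) for _ in range(100_000)]
print("sample mean:", sum(phases) / len(phases), "≈", math.pi)
```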
Important Random Variables.
Uniform Random Variable.
Important Random Variables.
Gaussian or Normal Random Variable. This is a continuous random variable
described by the density function

fX(x) = (1/(√(2π) σ)) exp(−(x − μ)² / (2σ²))

where μ is the mean and σ² is the variance.
The Gaussian random variable is the most important and frequently


encountered random variable in communications.
The reason is that thermal noise, which is the major source of noise in
communication systems, has a Gaussian distribution.
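
The density above is easy to evaluate directly; the sketch below uses assumed illustrative values of μ and σ:

```python
# Evaluate the Gaussian density for a given mean and standard deviation.
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    return math.exp(-(x - mu)**2 / (2 * sigma**2)) / (math.sqrt(2 * math.pi) * sigma)

print(gaussian_pdf(0.0))   # 1/sqrt(2π) ≈ 0.3989
print(gaussian_pdf(1.0))   # ≈ 0.2420
```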
Important Random Variables.
Gaussian or Normal Random Variable.
Several Random Variables
CDF: FX,Y(x, y) = P[X ≤ x, Y ≤ y]

Marginal cdf:

FX(x) = ∫_{−∞}^{x} ∫_{−∞}^{∞} fX,Y(u, v) dv du
FY(y) = ∫_{−∞}^{∞} ∫_{−∞}^{y} fX,Y(u, v) dv du

PDF: fX,Y(x, y) = ∂²FX,Y(x, y) / ∂x ∂y, with ∫_{−∞}^{∞} ∫_{−∞}^{∞} fX,Y(u, v) du dv = 1

Marginal pdf:

fX(x) = ∫_{−∞}^{∞} fX,Y(x, v) dv
fY(y) = ∫_{−∞}^{∞} fX,Y(u, y) du

Conditional pdf: fY(y | x) = fX,Y(x, y) / fX(x)
Statistical Averages
The expected value of a random variable is a measure of the average value
that the random variable takes in a large number of experiments:

μX = E[X] = ∫_{−∞}^{∞} x fX(x) dx

Function of a random variable: Y = g(X)

E[Y] = ∫_{−∞}^{∞} y fY(y) dy = E[g(X)] = ∫_{−∞}^{∞} g(x) fX(x) dx
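
The last identity can be checked by simulation. A minimal sketch, assuming X ~ N(0, 1) and g(x) = x², so that E[g(X)] should equal the variance, 1:

```python
# Monte Carlo check of E[g(X)] = ∫ g(x) f_X(x) dx for g(x) = x**2.
import random

N = 200_000
samples = (random.gauss(0.0, 1.0) for _ in range(N))
print(sum(x * x for x in samples) / N)   # ≈ 1.0
```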
Cont.
nth moments:

E[X^n] = ∫_{−∞}^{∞} x^n fX(x) dx

Mean-square value of X:

E[X²] = ∫_{−∞}^{∞} x² fX(x) dx

Central moments:

E[(X − μX)^n] = ∫_{−∞}^{∞} (x − μX)^n fX(x) dx

Variance of X:

E[(X − μX)²] = ∫_{−∞}^{∞} (x − μX)² fX(x) dx = σX²
Joint Moments
Correlation:

E[X^i Y^k] = ∫∫ x^i y^k fX,Y(x, y) dx dy

This is the expected value of the product, also seen as a weighted inner product.

Covariance:

cov[XY] = E[(X − E[X])(Y − E[Y])] = E[XY] − μX μY

This is the correlation of the central moments.

Correlation coefficient:

ρ = cov[XY] / (σX σY), with ρ = 0 for uncorrelated variables and |ρ| = 1 for
strongly correlated variables.
Random Processes
Definition: a random process is described as a time-varying random variable
A random process may be viewed as a collection of random variables, with
time t as a parameter running through all real numbers.
Cont.

Mean of the random process:

μX(t) = E[X(t)] = ∫_{−∞}^{∞} x fX(t)(x) dx

Definition: a random process is first-order stationary if its pdf is constant in time:

fX(t1)(x) = fX(t2)(x)  ⟹  μX(t) = μX, σX²(t) = σX²  (constant mean and variance)

Definition: the autocorrelation is the expected value of the product of two
random variables at different times:

RX(t1, t2) = E[X(t1) X(t2)]

Stationary to second order: RX(t1, t2) = RX(t1 − t2)
Cont.

Definition: the autocovariance of a stationary random process is

CX(t1, t2) = E[(X(t1) − μX)(X(t2) − μX)]
           = RX(t1 − t2) − μX²
Properties of Autocorrelation
Definition: the autocorrelation of a stationary process depends only on the
time difference:

RX(τ) = E[X(t + τ) X(t)]

Mean-square value: RX(0) = E[X²(t)]

Autocorrelation is an even function: RX(τ) = RX(−τ)

Autocorrelation has its maximum at zero: |RX(τ)| ≤ RX(0)


Example

Sinusoidal signal with random phase:

X(t) = A cos(2π fc t + Θ)

fΘ(θ) = 1/(2π) for −π ≤ θ ≤ π, and 0 otherwise

Autocorrelation:

RX(τ) = E[X(t + τ) X(t)] = (A²/2) cos(2π fc τ)

As X(t) is compared with itself at another time, we see that there is a
periodic behavior in the correlation.
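
This result can be verified by averaging over many random phases. A minimal sketch; A, fc, t, and τ below are assumed illustrative values:

```python
# Monte Carlo check of R_X(τ) = (A²/2) cos(2π f_c τ) for a random-phase sinusoid.
import random, math

A, fc, tau, t = 2.0, 10.0, 0.015, 0.3
N = 200_000
acc = 0.0
for _ in range(N):
    theta = random.uniform(-math.pi, math.pi)          # phase ~ U(-π, π)
    acc += (A * math.cos(2 * math.pi * fc * (t + tau) + theta)
            * A * math.cos(2 * math.pi * fc * t + theta))

print("estimate:", acc / N)
print("theory  :", A**2 / 2 * math.cos(2 * math.pi * fc * tau))  # ≈ 1.176
```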
Cross-correlation

Two random processes X(t) and Y(t) have the cross-correlation

RXY(t, u) = E[X(t) Y(u)]

For wide-sense stationary processes:

RX(t, t + τ) = RX(τ),  RY(u, u + τ) = RY(τ)

RXY(t, u) = RXY(τ), with τ = t − u
Power Spectral Density

Definition: the Fourier transform of the autocorrelation function is called the
power spectral density:

SX(f) = ∫_{−∞}^{∞} RX(τ) e^{−j2πfτ} dτ

RX(τ) = ∫_{−∞}^{∞} SX(f) e^{j2πfτ} df

Consider the units of X(t), volts or amperes: the autocorrelation is the
projection of X(t) onto itself, with resulting units of watts (normalized to
1 ohm), so SX(f) has units of watts per hertz.
Properties of PSD


Zero frequency of the PSD: SX(0) = ∫_{−∞}^{∞} RX(τ) dτ

Mean-square value: E[X²(t)] = ∫_{−∞}^{∞} SX(f) df

The PSD is non-negative: SX(f) ≥ 0

PSD of a real-valued random process: SX(−f) = SX(f)
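
The mean-square property can be illustrated numerically with an estimated PSD. A sketch, assuming unit-variance white Gaussian samples and using NumPy (an assumption of this example):

```python
# The integral of the one-sided PSD estimate equals the mean-square value.
import numpy as np

fs = 1000.0                             # sample rate in Hz (assumed)
x = np.random.normal(0.0, 1.0, 2**16)   # white Gaussian samples

X = np.fft.rfft(x)
psd = (np.abs(X)**2) / (len(x) * fs)    # one-sided periodogram, W/Hz
psd[1:-1] *= 2                          # fold in negative frequencies

print("integral of PSD:", np.sum(psd) * fs / len(x))  # ≈ 1.0
print("mean-square    :", np.mean(x**2))              # ≈ 1.0
```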
Example

Text Example
Mixing of a random process with a sinusoidal process:

Y(t) = X(t) cos(2π fc t + Θ)

where X(t) is wide-sense stationary (to make it easier) and Θ is uniformly
distributed but not time-varying.

◦ Autocorrelation:

RY(τ) = E[Y(t + τ) Y(t)] = (1/2) RX(τ) cos(2π fc τ)

◦ PSD:

SY(f) = (1/4) [SX(f − fc) + SX(f + fc)]
Gaussian Process
The Gaussian probability density function for a single variable is

fY(y) = (1/(√(2π) σY)) exp(−(y − μY)² / (2σY²))

When the distribution has zero mean and unit variance, the random variable Y
is said to be normally distributed as N(0, 1):

fY(y) = (1/√(2π)) exp(−y²/2)
Stochastic Processes
Cont.
Cont.
Weakly Stationary
Unless we explicitly say otherwise, when we say that a process is stationary
we mean that it is weakly stationary.
A process that is not stationary is called non-stationary.
It is clear that every strictly stationary process with finite variance is also weakly
stationary.
Ensemble & Time Average
Mean and autocorrelation can be determined in two ways:

◦ The experiment can be repeated many times and the average taken over all these
functions. Such an average is called ensemble average.

◦ Take any one of these functions as being representative of the ensemble and
find the average from a number of samples of this one function. This is called
a time average.

Ergodicity & Stationarity
If the time average and ensemble average of a random function are the same, it is
said to be ergodic.
A random function is said to be stationary if its statistics do not change as a
function of time.
◦ This is also called strict sense stationarity (vs. wide sense stationarity).
Any ergodic function is also stationary.

Ergodicity & Stationarity
For a stationary signal the mean is constant: ⟨x(t)⟩ = x̄

Stationarity is defined as: p(x1, x2, t1, t2) = p(x1, x2, τ)

◦ Where τ = t2 − t1

And the autocorrelation function is: r(τ) = Σ_{x1, x2} x1 x2 p(x1, x2, τ)
Ergodicity & Stationarity
When x(t) is ergodic, its mean and autocorrelation are:

x̄ = lim_{N→∞} (1/2N) Σ_{t=−N}^{N} x(t)

r(τ) = ⟨x(t) x(t + τ)⟩ = lim_{N→∞} (1/2N) Σ_{t=−N}^{N} x(t) x(t + τ)
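
These time averages are straightforward to compute from one long realization. A sketch, assuming a white Gaussian realization so the expected answers are known:

```python
# Time-average mean and autocorrelation from a single realization.
import random

N = 100_000
x = [random.gauss(0.0, 1.0) for _ in range(N)]

mean = sum(x) / N
def r(tau):
    return sum(x[t] * x[t + tau] for t in range(N - tau)) / (N - tau)

print(mean)          # ≈ 0
print(r(0), r(5))    # ≈ 1 (variance) and ≈ 0 (white noise is uncorrelated)
```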
Cross-correlation
The cross-correlation of two ergodic random functions is:

rxy(τ) = ⟨x(t) y(t + τ)⟩ = lim_{N→∞} (1/2N) Σ_{t=−N}^{N} x(t) y(t + τ)

◦ The subscript xy indicates a cross-correlation.
Power & Cross Spectral Density

The Fourier transform of r(τ), the autocorrelation function of an ergodic
random function, is called the power spectral density of x(t):

S(ω) = ∫_{−∞}^{∞} r(τ) e^{−jωτ} dτ

The cross-spectral density of two ergodic random functions is:

Sxy(ω) = ∫_{−∞}^{∞} rxy(τ) e^{−jωτ} dτ
Part -2
Review of types of Noise and calculation of Noise
Noise
 Noise is a random signal that exists in communication systems.
Noise, in electrical terms, may be defined as any unwanted introduction of energy
tending to interfere with the proper reception and reproduction of transmitted signals.
Cont.
• Practically, we cannot avoid the existence of unwanted signal together with the
modulated signal transmitted by the transmitter.

• This unwanted signal is called noise.

• Noise is a random signal that exists in communication systems.

• A random signal cannot be represented by a simple equation.

• The existence of noise will degrade the level of quality of the received signal
at the receiver.
Noise effect
• Degrades system performance for both analog and digital systems.

• The receiver cannot understand the sender.

• The receiver cannot function as it should.

• Reduces the efficiency of the communication system.


Types of noise
1. External noise
External noise is defined as noise that is generated outside the communication system.

External noise is analyzed qualitatively.

External noise may be classified as follows.

a) Atmospheric Noise : Atmospheric noise, also known as static noise, is the
natural source of disturbance caused by lightning, discharges in thunderstorms, and
other natural disturbances occurring in nature.
Cont.
b) Extraterrestrial Noise : Extraterrestrial noise is classified on the basis of its
originating source. It is subdivided into
i) Solar Noise
ii) Cosmic Noise
Solar noise is the noise that originates from the sun.

The sun radiates a broad spectrum of frequencies, including those used for
broadcasting.

The sun is an active star and is constantly changing.


Cont.
 Cosmic Noise
•Distant stars also radiate noise in much the same way as the sun.
•The noise received from them is called black-body noise.
•Noise also comes from distant galaxies in much the same way as it comes from the
Milky Way.
Cont.
c) Industrial Noise : Sources of industrial noise are automobiles, aircraft, ignition
of electric motors, and switchgear.
A main cause of industrial noise is high-voltage wires. This noise is generally
produced by the discharges present in these operations.

Man-made noise easily outstrips any other kind between the frequencies of 1 and 600
MHz.

This includes such things as car and aircraft ignition, electric motors, switching equipment,
leakage from high-voltage lines, etc.
2. Internal Noise
 This is the noise generated by any of the active or passive devices found in
the receiver.
 This type of noise is random and difficult to treat on an individual basis but
can be described statistically.
 Random noise power is proportional to the bandwidth over which it is
measured.
Types of internal noise
Internal noise is the type of noise generated internally, within the
communication system or the receiver.
Internal noises are classified as follows.
a) Shot Noise : This noise generally arises in active devices due to the
random behavior of charge carriers. In electron tubes, shot noise is produced
by the random emission of electrons from the cathode.
b) Partition Noise : When the current in a circuit divides between two or more
paths, the noise generated is known as partition noise. It is caused by random
fluctuations in the division.
Cont.
c) Low-Frequency Noise : This is also known as FLICKER NOISE. This type of
noise is generally observed at frequencies below a few kHz. Its power spectral density
increases as the frequency decreases, which is why it is called low-frequency noise.

d) High-Frequency Noise : This noise is also known as TRANSIT-TIME noise. It
is observed in semiconductor devices when the transit time of a charge carrier
crossing a junction is comparable to the period of the signal.

e) Thermal Noise : Thermal noise is random and often referred to as white noise or
Johnson noise. Thermal noise is generally observed in resistors or in the sensitive
resistive components of a complex impedance, due to the random and rapid movement of
molecules, atoms, or electrons.
Cont.
• Electronic noise is generated by the thermal agitation of the charge carriers (the
electrons) inside an electrical conductor in equilibrium, which happens regardless of
any applied voltage.
• The movement of electrons forms kinetic energy in the conductor related to the
temperature of the conductor.
• When the temperature increases, the movement of free electrons increases and
current flows through the conductor.
• Current flow due to the free electrons creates a noise voltage, n(t).
• The noise voltage n(t) is influenced by the temperature and is therefore called
thermal noise.

 This type of noise is generated by all resistances
(e.g. a resistor, a semiconductor, the resistance of a
resonant circuit, i.e. the real part of the
impedance, a cable, etc.).
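
The mean-square thermal noise voltage of a resistor is V̄n² = 4kTBR. A minimal sketch with assumed values (a 50 Ω resistor at room temperature over a 1 MHz bandwidth):

```python
# RMS thermal noise voltage v_n = sqrt(4 k T B R).
import math

k = 1.380649e-23      # Boltzmann constant, J/K
T = 290.0             # temperature, K (assumed)
B = 1e6               # bandwidth, Hz (assumed)
R = 50.0              # resistance, ohms (assumed)

v_n = math.sqrt(4 * k * T * B * R)
print(f"v_n = {v_n*1e6:.3f} µV rms")   # ≈ 0.895 µV
```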
Cont.
Voltage and current models of a noisy resistor
Current model of noisy resistor
Addition of noise due to several sources in series
Addition of noise due to several sources in parallel
Resistance in Parallel

Proof:

Vo1 = Vn1 · R2/(R1 + R2),   Vo2 = Vn2 · R1/(R1 + R2)

V̄n² = V̄o1² + V̄o2²

V̄n² = 4kB [T1 R1 R2² + T2 R2 R1²] / (R1 + R2)²
    = 4kB [R1 R2 / (R1 + R2)²] (T1 R2 + T2 R1)

For T1 = T2 = T:

V̄n² = 4kTB · R1 R2 / (R1 + R2) = 4kTB · Rpar
Spectral densities of thermal noise
White noise
Noise in an idealized form is known as WHITE NOISE.
WHITE NOISE contains all frequency components in equal amounts, just as white light
consists of all colors of light.
If the probability distribution of a white noise is specified by a Gaussian
distribution function, then it is called white Gaussian noise.
Since the power density spectrum of thermal and shot noise is independent of frequency,
they are referred to as white Gaussian noise.
The power spectral density of white noise is expressed as

SW(f) = N0/2

Here the factor 1/2 has been included to show that
half of the power is associated with the positive frequencies
and the remaining half with the negative frequencies, as shown in the figure below.
Calculation of thermal noise for single noise source
Part-1: Transfer function

H(ω) = vno(t) / vni(t)

Power gain:

|H(ω)|² = [v²no(t)/R] / [v²ni(t)/R]

so v²no(t) = |H(ω)|² v²ni(t)

Noise voltage output power: Po = v²no(t)
Noise voltage input power:  Pi = v²ni(t)

Po = |H(ω)|² Pi    (1)

Part-2: Power relation with power spectral density

Power P = PSD × BW, so Po = Sno(ω) × BW and Pi = Sni(ω) × BW.

From eqn (1): Sno(ω) × BW = |H(ω)|² Sni(ω) × BW, hence

Sno(ω) = |H(ω)|² Sni(ω)
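
The last relation is easy to evaluate for a concrete filter. A sketch, assuming white input noise of PSD N0/2 and a first-order RC low-pass H(f) = 1/(1 + j2πfRC); the component values are illustrative:

```python
# Output noise PSD S_no(f) = |H(f)|^2 * S_ni(f) for an RC low-pass filter.
import math

N0 = 4e-21            # input PSD N0/2 = 2e-21 W/Hz (assumed)
RC = 1e-6             # filter time constant, s (assumed)

def S_no(f):
    H2 = 1.0 / (1.0 + (2 * math.pi * f * RC)**2)   # |H(f)|^2
    return H2 * (N0 / 2)

for f in (0.0, 1e5, 1e6):
    print(f"S_no({f:8.0f} Hz) = {S_no(f):.3e} W/Hz")
```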
Signal to noise ratio, noise figure, noise temperature,
calculation of noise figure.
Signal to noise ratio

SNR = signal power / noise power = PS / PN

Noise figure

The noise factor F of a device is the ratio of the SNR at its input to the SNR
at its output:

F = SNRi / SNRo

Expressed in decibels, F is called the noise figure: NF = 10 log10 F.

NOTE:

Power gain, in decibels (dB), is defined as follows:

G(dB) = 10 log10 (Pout / Pin)
Noise temperature
Noise Factor of a Device (Friis' formula)
The noise factor of a device is related to its noise temperature Te by

F = 1 + Te/T0, where T0 = 290 K is the standard reference temperature.

If several devices are cascaded, the total noise factor can be found with Friis' formula:

F = F1 + (F2 − 1)/G1 + (F3 − 1)/(G1 G2) + … + (Fn − 1)/(G1 G2 … Gn−1)

where Fn is the noise factor of the n-th device, and Gn is the power gain (linear, not in dB) of the n-th
device.

The first amplifier in a chain usually has the most significant effect on the total noise figure, because the
noise contributions of the following stages are reduced by the preceding stage gains.
Friis formula for noise factor

Friis's formula is used to calculate the total noise factor of a cascade of stages, each with its own noise
factor and power gain (assuming that the impedances are matched at each stage).

The total noise factor is given as

F = F1 + (F2 − 1)/G1 + (F3 − 1)/(G1 G2) + …

Friis's formula can be equivalently expressed in terms of noise temperature:

Te = Te1 + Te2/G1 + Te3/(G1 G2) + …
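
A minimal sketch applying Friis' formula to a three-stage cascade; the stage noise figures and gains below are assumed example values. It also shows why the first stage dominates:

```python
# Total noise factor of a cascade: F = F1 + (F2-1)/G1 + (F3-1)/(G1*G2).
import math

stages = [                 # (noise figure in dB, gain in dB), assumed values
    (1.0, 20.0),           # low-noise amplifier placed first
    (6.0, 10.0),
    (10.0, 10.0),
]

F_total, G_running = 0.0, 1.0
for i, (nf_db, g_db) in enumerate(stages):
    F = 10 ** (nf_db / 10)             # linear noise factor
    if i == 0:
        F_total = F
    else:
        F_total += (F - 1) / G_running # later stages divided by preceding gain
    G_running *= 10 ** (g_db / 10)     # accumulate linear gain

print(f"total NF = {10 * math.log10(F_total):.2f} dB")  # ≈ 1.13 dB
```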


THANK YOU!!!
