ADC - Lecture 3 Representation of Information

The document discusses Fourier analysis and its applications in signal processing. It covers Fourier series, the Fourier transform, impulse response and frequency response, energy and power spectral density, autocorrelation functions, and signal transmission through linear systems. Key concepts are defined for Fourier series, the discrete Fourier transform and fast Fourier transform, spectral density functions, autocorrelation, and characteristics of ideal distortionless transmission.


ADC

Frequency Domain Analysis


Representation of Information
Today’s Goal
Fourier series
Fourier transform
Impulse response and frequency response
Energy and power spectral density
Autocorrelation function
Transmission through Linear Systems
Probability and Random Processes

Fourier Series
A periodic signal can be represented as a sum of appropriately weighted cosines or complex exponentials.
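To make this concrete, here is a minimal numerical sketch (not part of the lecture) that rebuilds an assumed unit-amplitude square wave of period T0 = 1 s from a partial sum of its odd sine harmonics, whose Fourier-series coefficients are 4/(nπ):

```python
import numpy as np

# Illustrative sketch: partial Fourier-series sum of a +/-1 square wave with
# assumed period T0 = 1 s. The odd harmonics have coefficients 4/(n*pi).
T0 = 1.0
f0 = 1.0 / T0
t = np.linspace(0.0, 2.0 * T0, 1000, endpoint=False)

def square_wave_partial_sum(t, n_terms):
    """Sum the first n_terms odd sine harmonics of the square wave."""
    x = np.zeros_like(t)
    for n in range(1, 2 * n_terms, 2):                 # n = 1, 3, 5, ...
        x += (4.0 / (n * np.pi)) * np.sin(2.0 * np.pi * n * f0 * t)
    return x

print("peak of  5-term sum:", square_wave_partial_sum(t, 5).max())
print("peak of 50-term sum:", square_wave_partial_sum(t, 50).max())
# Both overshoot 1 near the jumps (Gibbs phenomenon); away from the jumps the
# 50-term sum is much closer to the ideal +/-1 square wave.
```

The more harmonics are kept, the closer the weighted sum approximates the periodic waveform everywhere except in a shrinking neighbourhood of the discontinuities.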

CTFT Examples
DFT and FFT

The sequence of N complex numbers x0, ..., xN−1 is transformed into the sequence of N complex numbers X0, ..., XN−1 by the Discrete Fourier Transform (DFT) according to the formula:

Xk = Σ (n = 0 to N−1) xn e^(−j2πkn/N),   k = 0, 1, ..., N−1

The Fast Fourier Transform (FFT) is an efficient algorithm for computing the DFT and its inverse; the Cooley-Tukey algorithm is the most widely used FFT.
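As a quick sketch (not from the slides), the formula above can be evaluated directly and compared with NumPy's FFT implementation; the length N = 64 and the random test signal are arbitrary choices:

```python
import numpy as np

def dft(x):
    """Direct O(N^2) evaluation of X_k = sum_n x_n * exp(-j*2*pi*k*n/N)."""
    x = np.asarray(x, dtype=complex)
    N = len(x)
    n = np.arange(N)
    k = n.reshape((N, 1))
    return np.exp(-2j * np.pi * k * n / N) @ x

rng = np.random.default_rng(0)
x = rng.standard_normal(64) + 1j * rng.standard_normal(64)

X_direct = dft(x)                 # direct DFT, O(N^2)
X_fft = np.fft.fft(x)             # FFT, O(N log N)
print("max |difference|:", np.max(np.abs(X_direct - X_fft)))   # ~ machine precision
```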
Spectral Density

The spectral density of a signal characterizes the distribution of the signal's energy or power in the frequency domain.

This concept is particularly important when considering filtering in communication systems, where the signal and noise at the filter output must be evaluated.

The energy spectral density (ESD) or the power spectral density (PSD) is used in the evaluation.
Energy Spectral Density (ESD)
Energy spectral density describes the signal energy per unit bandwidth, measured in joules/hertz.

Represented as ψx(f), it is given by:

ψx(f) = |X(f)|²

The energy of x(t) is:

Ex = ∫_{−∞}^{∞} x²(t) dt = ∫_{−∞}^{∞} |X(f)|² df

Therefore:

Ex = ∫_{−∞}^{∞} ψx(f) df

The energy spectral density is symmetrical in frequency about the origin, so the total energy of the signal x(t) can be expressed as:

Ex = 2 ∫_{0}^{∞} ψx(f) df
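A small numerical sketch (assumed example, not from the slides): for a 100 ms rectangular pulse sampled at 1 kHz, the energy computed in the time domain matches the integral of the ESD, ψx(f) = |X(f)|², over frequency, as the relation above (Parseval's theorem) requires:

```python
import numpy as np

# Assumed example: 100 ms rectangular pulse sampled at fs = 1 kHz.
fs = 1000.0
dt = 1.0 / fs
t = np.arange(0.0, 1.0, dt)
x = np.where(t < 0.1, 1.0, 0.0)

E_time = np.sum(x**2) * dt                 # integral of x^2(t) dt

X = np.fft.fft(x) * dt                     # approximates the continuous X(f)
esd = np.abs(X)**2                         # psi_x(f) = |X(f)|^2, joules/hertz
df = fs / len(x)
E_freq = np.sum(esd) * df                  # integral of psi_x(f) df

print(E_time, E_freq)                      # both ~ 0.1 J
```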
Power Spectral Density (PSD)
The power spectral density (PSD) function Gx(f) of the periodic signal x(t) is a real, even, and nonnegative function of frequency that gives the distribution of the power of x(t) in the frequency domain.

The PSD is represented as:

Gx(f) = Σ_{n=−∞}^{∞} |Cn|² δ(f − nf0)

whereas the average power of a periodic signal x(t) is represented as:

Px = (1/T0) ∫_{−T0/2}^{T0/2} x²(t) dt = Σ_{n=−∞}^{∞} |Cn|²

Using the PSD, the average normalized power of a real-valued signal is represented as:

Px = ∫_{−∞}^{∞} Gx(f) df = 2 ∫_{0}^{∞} Gx(f) df
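As an illustrative sketch (parameters assumed, not from the slides), the PSD line weights |Cn|² of a periodic signal can be estimated from one period with the DFT and summed; the result matches the time average of x²(t):

```python
import numpy as np

# Assumed example: x(t) = 2*cos(2*pi*f0*t), so C_1 = C_-1 = 1 and Px should be 2 W.
f0 = 5.0
T0 = 1.0 / f0
N = 1000
t = np.arange(N) / N * T0                  # exactly one period

x = 2.0 * np.cos(2.0 * np.pi * f0 * t)
C = np.fft.fft(x) / N                      # Fourier-series coefficients C_n

P_from_psd = np.sum(np.abs(C)**2)          # sum of the PSD line weights |C_n|^2
P_time = np.mean(x**2)                     # (1/T0) * integral of x^2(t) dt
print(P_from_psd, P_time)                  # both ~ 2.0
```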
Autocorrelation

Correlation is a matching process; autocorrelation refers to the matching of a signal with a delayed version of itself.

The autocorrelation function of a real-valued energy signal x(t) is defined as:

Rx(τ) = ∫_{−∞}^{∞} x(t) x(t + τ) dt,   for −∞ < τ < ∞

The autocorrelation function Rx(τ) provides a measure of how closely the signal matches a copy of itself as the copy is shifted τ units in time.

Rx(τ) is not a function of time; it is only a function of the time difference τ between the waveform and its shifted copy.
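A minimal sketch (assumed pulse, not from the slides) estimating Rx(τ) for a sampled rectangular pulse with a discrete sum; it peaks at τ = 0 with a value equal to the pulse energy and is symmetric in τ:

```python
import numpy as np

# Assumed example: 200 ms rectangular pulse sampled every 1 ms.
dt = 0.001
t = np.arange(0.0, 1.0, dt)
x = np.where(t < 0.2, 1.0, 0.0)

R = np.correlate(x, x, mode="full") * dt        # R_x(tau) at lags -(N-1)..(N-1)
lags = np.arange(-(len(x) - 1), len(x)) * dt

print("R_x(0)        :", R[len(x) - 1])         # ~ 0.2, the pulse energy
print("energy of x(t):", np.sum(x**2) * dt)
print("peak at tau = :", lags[np.argmax(R)])    # 0.0
print("R_x(tau) = R_x(-tau)?", np.allclose(R, R[::-1]))
```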
Autocorrelation of an Energy Signal
The autocorrelation function of a real-valued energy signal has the following properties:

Rx(τ) = Rx(−τ)                       symmetrical about zero

Rx(τ) ≤ Rx(0) for all τ              maximum value occurs at the origin

Rx(τ) ↔ ψx(f)                        autocorrelation and ESD form a Fourier transform pair, as designated by the double-headed arrow

Rx(0) = ∫_{−∞}^{∞} x²(t) dt          value at the origin is equal to the energy of the signal
Autocorrelation of a Power Signal
The autocorrelation function of a real-valued power signal x(t) is defined as:

Rx(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt,   for −∞ < τ < ∞

When the power signal x(t) is periodic with period T0, the autocorrelation function can be expressed as:

Rx(τ) = (1/T0) ∫_{−T0/2}^{T0/2} x(t) x(t + τ) dt,   for −∞ < τ < ∞
Autocorrelation of a Power Signal

The autocorrelation function of a real-valued periodic signal has the following properties, similar to those of an energy signal:

Rx(τ) = Rx(−τ)                             symmetrical about zero

Rx(τ) ≤ Rx(0) for all τ                    maximum value occurs at the origin

Rx(τ) ↔ Gx(f)                              autocorrelation and PSD form a Fourier transform pair

Rx(0) = (1/T0) ∫_{−T0/2}^{T0/2} x²(t) dt   value at the origin is equal to the average power of the signal
Signal Transmission through Linear Systems

A system can be characterized equally well in the time domain or the frequency domain.

The systems under consideration are assumed to be LTI.

It is also generally assumed that there is no stored energy in the system at the time the input is applied, i.e., zero initial conditions.
Impulse Response
A linear time-invariant system or network is characterized in the time domain by its impulse response h(t), the response to a unit impulse input δ(t):

y(t) = h(t)   when   x(t) = δ(t)

The response of the network to an arbitrary input signal x(t) is found by the convolution of x(t) with h(t):

y(t) = x(t) * h(t) = ∫_{−∞}^{∞} x(τ) h(t − τ) dτ

The system is assumed to be causal, which means that there can be no output prior to the time t = 0 when the input is applied.

The above relation is called the convolution integral.
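As a sketch (the first-order impulse response and pulse width are assumed, not from the slides), the convolution integral can be approximated with a discrete convolution scaled by the sample spacing:

```python
import numpy as np

# Assumed example: causal first-order system h(t) = (1/tau)*exp(-t/tau), t >= 0,
# driven by a 10 ms rectangular input pulse.
dt = 1e-4
t = np.arange(0.0, 0.05, dt)
tau = 5e-3
h = (1.0 / tau) * np.exp(-t / tau)
x = np.where(t < 0.01, 1.0, 0.0)

# y(t) = integral x(lambda) h(t - lambda) dlambda ~ discrete convolution * dt
y = np.convolve(x, h)[: len(t)] * dt

print("peak of y(t):", y.max())                 # close to 1 - exp(-0.01/tau)
print("analytic    :", 1.0 - np.exp(-0.01 / tau))
```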
Frequency Transfer Function

The frequency-domain output signal Y(f) is obtained by taking the Fourier transform of the convolution:

Y(f) = X(f) H(f)

The frequency transfer function, or frequency response, is the Fourier transform of the impulse response and is defined as:

H(f) = Y(f) / X(f)

H(f) = |H(f)| e^{jθ(f)}

The phase response is defined as:

θ(f) = tan⁻¹ [ Im{H(f)} / Re{H(f)} ]
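A short sketch (assumed first-order system, not from the slides): H(f) obtained numerically as the Fourier transform of h(t), with the magnitude and phase read off as defined above. For h(t) = (1/τ)e^{−t/τ}, the analytic result is H(f) = 1/(1 + j2πfτ):

```python
import numpy as np

# Assumed example: h(t) = (1/tau)*exp(-t/tau) with tau = 1 ms.
dt = 1e-5
tau = 1e-3
t = np.arange(0.0, 0.1, dt)
h = (1.0 / tau) * np.exp(-t / tau)

H = np.fft.rfft(h) * dt                          # approximates H(f)
f = np.fft.rfftfreq(len(h), dt)

mag = np.abs(H)                                  # |H(f)|
theta = np.arctan2(H.imag, H.real)               # theta(f) = tan^-1(Im/Re)

k = np.argmin(np.abs(f - 1.0 / (2.0 * np.pi * tau)))    # bin nearest the 3 dB frequency
print("|H| there   :", mag[k])                   # ~ 0.707
print("phase there :", np.degrees(theta[k]))     # ~ -45 degrees
```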
Distortionless Transmission
The output signal from an ideal transmission line may have some time delay and a different amplitude than the input.

It must have no distortion; it must have the same shape as the input.

For ideal distortionless transmission:

Output signal in the time domain:
y(t) = K x(t − t0)

Output signal in the frequency domain:
Y(f) = K X(f) e^{−j2πf t0}

System transfer function:
H(f) = K e^{−j2πf t0}
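A sketch (K and t0 assumed, not from the slides) applying H(f) = K e^{−j2πf t0} to a two-tone signal in the frequency domain; the output is just a scaled, delayed copy of the input, i.e. no distortion:

```python
import numpy as np

# Assumed distortionless "channel": K = 2, t0 = 1 ms, fs = 8 kHz (so t0 = 8 samples).
fs = 8000.0
t = np.arange(1024) / fs
x = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 300 * t)

K, t0 = 2.0, 1e-3
f = np.fft.rfftfreq(len(x), 1.0 / fs)
H = K * np.exp(-1j * 2 * np.pi * f * t0)         # constant |H(f)|, linear phase

y = np.fft.irfft(np.fft.rfft(x) * H, n=len(x))

shift = int(round(t0 * fs))                      # 8 samples
# Apart from the circular wrap-around of the DFT, y(t) = K * x(t - t0):
print(np.allclose(y[shift:], K * x[: len(x) - shift]))   # True
```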
Distortionless Transmission

The overall system response must have a constant magnitude response.

The phase shift must be a linear function of frequency.

All of the signal's frequency components must also arrive with identical time delay in order to add up correctly.

The time delay t0 is related to the phase shift θ and the radian frequency ω = 2πf by:

t0 (seconds) = θ (radians) / 2πf (radians/second)

Another characteristic often used to measure delay distortion of a signal is the envelope delay, or group delay:

τ(f) = −(1/2π) dθ(f)/df
Ideal Filters
The transfer function of the ideal low-pass filter with bandwidth Wf = fu hertz can be written as:

H(f) = |H(f)| e^{−jθ(f)}

where

|H(f)| = 1   for |f| ≤ fu
|H(f)| = 0   for |f| > fu

e^{−jθ(f)} = e^{−j2πf t0}

(Figure: ideal low-pass filter)
Ideal Filters
The impulse response of the ideal low-pass filter:

h(t) = F⁻¹{H(f)}

     = ∫_{−∞}^{∞} H(f) e^{j2πft} df

     = ∫_{−fu}^{fu} e^{−j2πf t0} e^{j2πft} df

     = ∫_{−fu}^{fu} e^{j2πf(t − t0)} df

     = 2fu · sin[2πfu(t − t0)] / [2πfu(t − t0)]

     = 2fu sinc[2fu(t − t0)]
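A small sketch (cutoff and delay assumed, not from the slides) evaluating h(t) = 2fu sinc[2fu(t − t0)]; the peak value 2fu occurs at t = t0, and the nonzero tails for t < 0 show that the ideal filter is noncausal and therefore not realizable:

```python
import numpy as np

# Assumed example: f_u = 100 Hz cutoff, t0 = 10 ms delay.
fu = 100.0
t0 = 0.01
t = np.linspace(-0.05, 0.05, 2001)

# np.sinc(x) = sin(pi*x)/(pi*x), so this matches sin(2*pi*fu*(t-t0))/(2*pi*fu*(t-t0)).
h = 2.0 * fu * np.sinc(2.0 * fu * (t - t0))

print("peak value     :", h.max())                         # ~ 2*fu = 200
print("peak located at:", t[np.argmax(h)])                 # ~ t0 = 0.01 s
print("nonzero before t = 0?", np.any(np.abs(h[t < 0]) > 1e-3))   # True -> noncausal
```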
Ideal Filters

The transfer functions of the ideal band-pass filter and the ideal high-pass filter are defined analogously.

Figure 1.11 (a): Ideal band-pass filter.   Figure 1.11 (c): Ideal high-pass filter.
Bandwidth Of Digital Data
Baseband versus Bandpass
An easy way to translate the spectrum of a low-pass or baseband signal x(t) to a higher frequency is to multiply, or heterodyne, the baseband signal with a carrier wave cos 2πfct.

xc(t) is called a double-sideband (DSB) modulated signal:

xc(t) = x(t) cos 2πfct

From the frequency shifting theorem:

Xc(f) = ½ [X(f − fc) + X(f + fc)]

Generally the carrier wave frequency is much higher than the bandwidth of the baseband signal:

fc >> fm, and therefore WDSB = 2fm
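A sketch with assumed values (fm = 100 Hz tone, fc = 2 kHz carrier, not from the slides): multiplying by the carrier moves the baseband line to fc ± fm, so the DSB bandwidth is 2fm:

```python
import numpy as np

fs = 16000.0
N = 16000                                    # 1 s of data -> 1 Hz bin spacing
t = np.arange(N) / fs
fm, fc = 100.0, 2000.0

x = np.cos(2 * np.pi * fm * t)               # baseband signal
xc = x * np.cos(2 * np.pi * fc * t)          # DSB modulated signal

Xc = np.abs(np.fft.rfft(xc)) / N
f = np.fft.rfftfreq(N, 1.0 / fs)
print(f[Xc > 0.1])                           # [1900. 2100.] -> W_DSB = 2*fm = 200 Hz
```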
Probability Theory and Random Processes

Random Signals
Random Variables

All useful message signals appear random; that is, the receiver does not know, a priori, which of the possible waveforms has been sent.

Let a random variable X(A) represent the functional relationship between a random event A and a real number.

The (cumulative) distribution function FX(x) of the random variable X is given by:

FX(x) = P(X ≤ x)

Another useful function relating to the random variable X is the probability density function (pdf):

pX(x) = dFX(x)/dx
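A quick sketch (standard Gaussian X assumed, not from the slides) checking the two definitions numerically: the empirical FX(x) = P(X ≤ x) estimated from samples matches the analytic CDF, and differentiating the CDF recovers the pdf:

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
samples = rng.standard_normal(200_000)       # samples of an assumed Gaussian X

x = np.linspace(-3.0, 3.0, 61)
F_emp = np.array([np.mean(samples <= xi) for xi in x])     # empirical P(X <= x)
print("max CDF error:", np.max(np.abs(F_emp - norm.cdf(x))))        # small

p_from_F = np.gradient(norm.cdf(x), x)                      # p_X(x) = dF_X/dx
print("max pdf error:", np.max(np.abs(p_from_F - norm.pdf(x))))     # small
```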
Ensemble Averages

The first moment of the probability distribution of a random variable X is called the mean value mX, or expected value, of X:

mX = E{X} = ∫_{−∞}^{∞} x pX(x) dx

The second moment of the probability distribution is the mean-square value of X:

E{X²} = ∫_{−∞}^{∞} x² pX(x) dx

Central moments are the moments of the difference between X and mX; the second central moment is the variance of X:

var(X) = E{(X − mX)²} = ∫_{−∞}^{∞} (x − mX)² pX(x) dx

The variance is equal to the difference between the mean-square value and the square of the mean:

var(X) = E{X²} − E{X}²
Random Signals
Random Processes

A random process X(A, t) can be viewed as a function of two variables: an event A and time.
Statistical Averages of a Random Process

A random process whose distribution functions are continuous can be described statistically with a PDF.

In general, the form of the PDF of a random process may be different at different times.

A partial description consisting of the mean and the autocorrelation function is often adequate for the needs of communication systems.

Mean of the random process X(t):

E{X(tk)} = ∫_{−∞}^{∞} x pXk(x) dx = mX(tk)

Autocorrelation function of the random process X(t):

RX(t1, t2) = E{X(t1) X(t2)}
Central Limit Theorem
Under very general conditions, the probability distribution of the sum of statistically independent random variables approaches the Gaussian distribution as their number approaches infinity, no matter what the individual distributions may be.
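A small sketch of this statement (uniform summands are an arbitrary assumed choice, not from the slides): as the number of summed independent Uniform(0, 1) variables grows, the standardized sum behaves like a Gaussian, e.g. P(|Z| < 1) approaches 0.6827:

```python
import numpy as np

rng = np.random.default_rng(2)
trials = 100_000

for n in (1, 2, 30):
    s = rng.random((trials, n)).sum(axis=1)          # sum of n Uniform(0,1) variables
    z = (s - n * 0.5) / np.sqrt(n / 12.0)            # standardized sum
    print(n, round(np.mean(np.abs(z) < 1.0), 4))     # -> approaches 0.6827 as n grows
```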
Random Processes and Linear Systems

If a random process forms the input to a time-invariant linear system, the output will also be a random process.

The input power spectral density GX(f) and the output power spectral density GY(f) are related as:

GY(f) = GX(f) |H(f)|²
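A sketch (the one-pole filter and white input are assumed choices, not from the slides) checking GY(f) = GX(f)|H(f)|² with estimated spectra, using scipy.signal's lfilter, welch and freqz:

```python
import numpy as np
from scipy import signal

# Assumed example: white noise through the one-pole filter y[n] = 0.9 y[n-1] + 0.1 x[n].
rng = np.random.default_rng(3)
fs = 1000.0
x = rng.standard_normal(500_000)                   # input random process
b, a = [0.1], [1.0, -0.9]
y = signal.lfilter(b, a, x)                        # output random process

f, Gx = signal.welch(x, fs=fs, nperseg=4096)       # estimated input PSD
_, Gy = signal.welch(y, fs=fs, nperseg=4096)       # estimated output PSD
_, H = signal.freqz(b, a, worN=f, fs=fs)           # frequency response H(f)

err = np.abs(Gy[1:] / Gx[1:] - np.abs(H[1:])**2) / np.abs(H[1:])**2
print("max relative error:", err.max())            # small (estimation error only)
```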
Noise in Communication Systems

The term noise refers to unwanted electrical signals that are always present in electrical systems, e.g., spark-plug ignition noise, switching transients, and other radiating electromagnetic signals.

We can describe thermal noise as a zero-mean Gaussian random process.

A Gaussian process n(t) is a random function whose amplitude at any arbitrary time t is statistically characterized by the Gaussian probability density function:

p(n) = (1 / (σ√(2π))) exp[ −(1/2) (n/σ)² ]
Noise in Communication Systems
The normalized or standardized Gaussian density function of a zero-mean
process is obtained by assuming unit variance.

White Noise
The primary spectral characteristic of thermal noise is that its power spectral density is the same for all frequencies of interest in most communication systems.

Power spectral density:

Gn(f) = N0/2   watts/hertz

The autocorrelation function of white noise is:

Rn(τ) = F⁻¹{Gn(f)} = (N0/2) δ(τ)

This means that any two samples of white noise, no matter how close together in time they are taken, are uncorrelated.

The average power Pn of white noise is infinite:

Pn = ∫_{−∞}^{∞} (N0/2) df = ∞
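A sketch with assumed values (N0/2 = 0.5 W/Hz, fs = 1 kHz, not from the slides): sampled band-limited white Gaussian noise shows an essentially flat PSD estimate and a sample autocorrelation that is negligible away from zero lag:

```python
import numpy as np
from scipy import signal

fs = 1000.0
N0_over_2 = 0.5
rng = np.random.default_rng(4)
n = rng.normal(0.0, np.sqrt(N0_over_2 * fs), size=200_000)   # variance = (N0/2)*fs

f, Gn = signal.welch(n, fs=fs, nperseg=1024)
print("mean PSD estimate:", Gn.mean())      # ~ N0 = 1.0 (welch returns a one-sided PSD)

R = np.array([np.mean(n[: len(n) - k] * n[k:]) for k in range(5)])
print(np.round(R / R[0], 2))                # ~ [1, 0, 0, 0, 0]
```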
White Noise

The effect on the detection process of a channel with additive white Gaussian noise (AWGN) is that the noise affects each transmitted symbol independently.

Such a channel is called a memoryless channel.

The term "additive" means that the noise is simply superimposed or added to the signal.
Assignment
Do the following problems from Sklar's book:
◦ 1.2, 1.4, 1.5, 1.12, 1.14, 1.15
Date:
Submission Date:

