ADTT c1
• What is a Digital Transmission System (DTS)?
• Why “digital” instead of “analog”?
• What do we need to know before taking off toward
designing a DTS?
– Shannon Channel Capacity
– Classification of signals
– Random process
– Autocorrelation
– Power and energy spectral densities
– Noise in communication systems
– Signal transmission through filters; linear distortions
The communications era: telephones, radios, and televisions; computer terminals for Internet access; sensors for ships, aircraft, rockets, and satellites → the move toward mobile communications.
Data = encoded information represented by binary elements;
the fundamental information unit = binary digit (bit).
• Advantages of digital communications:
– Regenerator (decision + reshaping) receiver
[Figure: an original pulse, degraded over the propagation distance, is regenerated to its original shape]
Block diagram of a DTS:
Transmitter: Formatter → Source encoder → Channel encoder → Modulator
Channel: adds noise + other impairments
Receiver: Demodulator → Channel decoder → Source decoder → Formatter
• Important features of a DTS:
– Transmitter sends a waveform from a finite set
of possible waveforms during a limited time
– Channel distorts, attenuates the transmitted
signal and adds noise to it.
– Receiver decides which waveform was
transmitted from the noisy received signal
– Quality of Service (QoS): the probability of an erroneous decision is a key measure of system performance
Two major communication resources:
◦ Transmit power and channel bandwidth
◦ Bandwidth-limited systems:
save bandwidth at the expense of power (for example, by using
spectrally efficient modulation schemes)
Channel capacity: the maximum data rate at which error-free communication over the channel is possible (Claude Elwood Shannon, 1916–2001; 1948).
Channel capacity of the noisy (AWGN) channel (Shannon–Hartley capacity theorem):
C = W \log_2 \left( 1 + \frac{S}{N} \right) \quad \text{[bits/s]}
where
W [Hz]: bandwidth
S [W]: average received signal power
N = N_0 W [W]: average noise power
Reference: C. E. Shannon, “A Mathematical Theory of Communication”, The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July/Oct. 1948.
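As a quick numerical illustration of the Shannon–Hartley formula, here is a minimal Python sketch (the bandwidth and SNR values are assumed for the example, not taken from the lecture):

```python
import math

def shannon_capacity(W_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = W * log2(1 + S/N) in bits/s."""
    return W_hz * math.log2(1.0 + snr_linear)

# Assumed example: a 1 MHz channel at an SNR of 20 dB.
W = 1e6                      # bandwidth W [Hz]
snr_db = 20.0                # S/N in dB
snr = 10 ** (snr_db / 10.0)  # convert dB to a linear power ratio
print(f"C = {shannon_capacity(W, snr) / 1e6:.2f} Mbit/s")  # ~6.66 Mbit/s
```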
The Shannon theorem puts a limit on the transmission data rate, not on the error probability: for any rate R < C, communication with an arbitrarily small error probability is possible.
Spectral efficiency C/W [b/s/Hz] versus SNR [dB]
[Plot: the Shannon bound C/W = log₂(1 + SNR); the practical region lies below the bound]
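A short sketch of how the bound behind this plot can be tabulated (the SNR grid is an assumed example); it also prints the well-known ultimate Eb/N0 limit of about −1.59 dB that bounds the practical region:

```python
import numpy as np

snr_db = np.arange(-10, 31, 5)   # assumed SNR grid in dB
snr = 10 ** (snr_db / 10)        # linear SNR
eta = np.log2(1 + snr)           # Shannon bound on spectral efficiency C/W
for s_db, e in zip(snr_db, eta):
    print(f"SNR = {int(s_db):>3} dB -> C/W <= {e:5.2f} b/s/Hz")

# In the limit of vanishing spectral efficiency, the required Eb/N0
# approaches ln(2), i.e. about -1.59 dB:
print(f"Eb/N0 limit = {10 * np.log10(np.log(2)):.2f} dB")
```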
• Deterministic and random signals
– Deterministic signal: No uncertainty with respect
to the signal value at any time.
• Modeled by explicit mathematical expressions, such as x(t) = 5\cos(10t)
– Random signal: some degree of uncertainty about the signal values before the signal actually occurs.
• Can be described in terms of statistical averages
• Example: Thermal noise in electronic circuits due to
the random movement of electrons
• Ex.: Reflection of radio waves from different surfaces
• Periodic and non-periodic signals: x(t) is periodic if x(t) = x(t + T_0) for all t and some period T_0 > 0; otherwise it is non-periodic.
• Analog and discrete signals
[Figure: example waveforms over −T, 0, T, 2T]
• Power and energy of signals
– The instantaneous power of a complex signal x(t):
p(t) \triangleq |x(t)|^2
– The energy dissipated during an interval T:
E_x^T \triangleq \int_{-T/2}^{T/2} |x(t)|^2 \, dt
• Energy and power signals
– A signal is an energy signal if, and only if, it has nonzero but finite energy for all time (0 < E_x < ∞):
E_x \triangleq \lim_{T \to \infty} \int_{-T/2}^{T/2} |x(t)|^2 \, dt = \int_{-\infty}^{\infty} |x(t)|^2 \, dt
– A signal is a power signal if, and only if, it has nonzero but finite average power (0 < P_x < ∞):
P_x \triangleq \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} |x(t)|^2 \, dt
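A minimal numerical sketch of these definitions in Python (the test signal x(t) = e^{−|t|} is an assumed example; its exact energy is 1, so it is an energy signal):

```python
import numpy as np

dt = 1e-3
t = np.arange(-20.0, 20.0, dt)   # a window wide enough for the tails to vanish
x = np.exp(-np.abs(t))           # assumed test signal x(t) = exp(-|t|)

E = np.sum(np.abs(x) ** 2) * dt  # Riemann-sum approximation of the energy integral
print(f"E_x ~= {E:.4f}")         # -> ~1.0: nonzero and finite, an energy signal
```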
• Energy signals (Parseval’s theorem):
E_x = \int_{-\infty}^{\infty} |x(t)|^2 \, dt = \int_{-\infty}^{\infty} |X(f)|^2 \, df
where \Psi_x(f) = |X(f)|^2 is the energy spectral density (ESD).
• Random process:
– Power spectral density (PSD): G_x(f), describing how the power of the process is distributed over frequency.
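A sketch of checking Parseval's relation numerically with the FFT (the test signal and sampling parameters are assumed; the DFT scaled by dt approximates the continuous-time spectrum):

```python
import numpy as np

N, dt = 4096, 1e-3
t = np.arange(N) * dt
x = np.exp(-t) * np.cos(2 * np.pi * 50 * t)   # assumed test signal

X = np.fft.fft(x) * dt        # approximate spectrum X(f)
df = 1.0 / (N * dt)           # frequency-bin spacing
E_time = np.sum(np.abs(x) ** 2) * dt
E_freq = np.sum(np.abs(X) ** 2) * df
print(E_time, E_freq)         # the two energies agree (up to float rounding)
```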
• Autocorrelation of an energy signal (real-valued x(t)):
R_x(\tau) \triangleq \int_{-\infty}^{\infty} x(t)\, x(t+\tau)\, dt, \quad -\infty < \tau < \infty
• Autocorrelation of a real-valued power signal:
R_x(\tau) \triangleq \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\, x(t+\tau)\, dt, \quad -\infty < \tau < \infty
• For real-valued signals (and WSS random signals):
1. Autocorrelation is symmetric about zero: R_x(\tau) = R_x(-\tau)
2. Its maximum value occurs at the origin: |R_x(\tau)| \le R_x(0)
3. Autocorrelation and spectral density form a Fourier transform pair: \Psi_x(f) = \mathcal{F}\{R_x(\tau)\} or G_x(f) = \mathcal{F}\{R_x(\tau)\}
4. Its value at the origin equals the average energy or average power: R_x(0) = E_x or R_x(0) = P_x
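A small Python sketch illustrating properties 1, 2, and 4 with the time-average estimator on a discrete white sequence (the sequence and lags are assumed examples):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000
x = rng.normal(size=N)                 # assumed unit-variance white sequence

def autocorr(x, lag):
    """Time-average estimate of R_x(lag); symmetric in the lag for real x."""
    lag = abs(lag)
    return np.mean(x[:len(x) - lag] * x[lag:])

print(autocorr(x, 0))                   # ~1.0 = average power P_x (property 4)
print(autocorr(x, 5), autocorr(x, -5))  # equal (property 1) and ~0 for white noise,
                                        # so no lag exceeds R_x(0) (property 2)
```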
• High-entropy signals (useful information + noise) are random signals.
• A random process is a collection of time functions, or signals, corresponding to the various outcomes of a random experiment. For each outcome there exists a deterministic function, called a sample function or a realization.
[Figure: an ensemble of sample functions (realizations, each a deterministic function) versus time t; the values at a fixed time across the ensemble form a random variable taking real numbers]
• The distribution function F_X(x) of a random variable X:
F_X(x) \triangleq P(X \le x)
• The probability density function (pdf):
p_X(x) \triangleq \frac{dF_X(x)}{dx}, \quad \text{where } P(x_1 \le X \le x_2) = \int_{x_1}^{x_2} p_X(x)\, dx
• The mean value m_X of a random variable X:
m_X \triangleq E\{X\} = \int_{-\infty}^{\infty} x\, p_X(x)\, dx
• The variance \sigma_X^2 of a random variable X:
\sigma_X^2 \triangleq E\{(X - m_X)^2\} = \int_{-\infty}^{\infty} (x - m_X)^2\, p_X(x)\, dx = E\{X^2\} - m_X^2
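A quick sample-based check of these definitions, including the identity \sigma_X^2 = E\{X^2\} - m_X^2 (the Gaussian example with m_X = 2 and \sigma_X = 3 is assumed):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.normal(loc=2.0, scale=3.0, size=1_000_000)  # assumed: m_X = 2, sigma_X = 3

m = x.mean()                             # sample estimate of m_X
var_direct = np.mean((x - m) ** 2)       # E{(X - m_X)^2}
var_identity = np.mean(x ** 2) - m ** 2  # E{X^2} - m_X^2
print(m, var_direct, var_identity)       # ~2.0, ~9.0, ~9.0
```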
• Strictly stationary: none of the statistics of the random process are affected by a shift of the time origin.
• Wide-sense stationary (WSS): the mean is constant and the autocorrelation depends only on the time difference τ = t_2 − t_1.
• Autocorrelation of a random signal:
R_X(t_1, t_2) = E\{X(t_1)\, X(t_2)\}
– For a WSS process:
R_X(\tau) = E\{X(t)\, X(t+\tau)\}
Thermal noise is described by a zero-mean Gaussian random process, n(t). Its PSD is flat; hence it is called white noise (AWGN: Additive White Gaussian Noise).
Probability density function:
p(n) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left( -\frac{n^2}{2\sigma^2} \right)
Power spectral density:
G_n(f) = \frac{N_0}{2} \ \text{[W/Hz]}
Autocorrelation function:
R_n(\tau) = \frac{N_0}{2}\, \delta(\tau)
[Figure: Gaussian pdf with σ = 1; flat PSD at N₀/2; autocorrelation as a delta at τ = 0]
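A minimal sketch of generating sampled AWGN (N₀ and the sampling rate are assumed values; for sampled white noise of two-sided PSD N₀/2 and sample rate f_s, the per-sample variance is σ² = (N₀/2)·f_s):

```python
import numpy as np

rng = np.random.default_rng(42)
N0 = 2e-6   # assumed noise PSD parameter [W/Hz]
fs = 1e6    # assumed sampling rate [Hz]

sigma = np.sqrt(N0 / 2 * fs)                # per-sample standard deviation
n = rng.normal(0.0, sigma, size=1_000_000)  # zero-mean Gaussian samples

print(n.mean())              # ~0: zero-mean
print(n.var(), N0 / 2 * fs)  # sample variance matches (N0/2) * fs
```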
Any system introduces attenuation and propagation delay; any transmission channel acts as a filter.
The general representation of a circuit (two-port network):
Input s(t), with spectrum S(f) → h(t) / H(f) → Output r(t), with spectrum R(f)
so that r(t) = s(t) * h(t) and R(f) = S(f) H(f).
[Figure: band-pass and high-pass filter responses]
• Realizable filters, e.g. RC filters:
H(f) = \frac{1}{1 + j 2\pi f RC}
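A short sketch evaluating this RC response at a few frequencies (the component values are assumed for illustration; the −3 dB cutoff is f_c = 1/(2πRC)):

```python
import numpy as np

R, C = 1e3, 1e-6        # assumed: 1 kOhm, 1 uF -> f_c = 1/(2*pi*R*C) ~ 159 Hz
f = np.array([10.0, 159.15, 1e3, 1e4])   # test frequencies [Hz]

H = 1.0 / (1.0 + 1j * 2 * np.pi * f * R * C)  # H(f) of the RC low-pass filter
for fi, Hi in zip(f, H):
    print(f"f = {fi:8.1f} Hz  |H| = {abs(Hi):.3f} ({20 * np.log10(abs(Hi)):6.1f} dB)")
# At f = f_c the magnitude is 1/sqrt(2), i.e. the -3 dB point.
```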