ADTT c1

- The key features of a Data Transmission System (DTS) include a transmitter that sends a waveform from a finite set of possible waveforms during a limited time, a channel that distorts and attenuates the signal while adding noise, and a receiver that decides which waveform was transmitted from the noisy received signal.
- There are advantages to using digital signals over analog signals, such as regenerative receivers and all bits being treated identically regardless of their data content.
- Important concepts in digital communication systems include the Shannon capacity, which defines the maximum data rate possible over a noisy channel, channel bandwidth and transmit power, signal classification (deterministic vs. random, periodic vs. non-periodic, analog vs. discrete), and power/energy spectral densities.


• What are the features of a Data Transmission System (DTS)?
• Why “digital” instead of “analog”?
• What do we need to know before taking off toward
designing a DTS?
– Shannon Channel Capacity
– Classification of signals
– Random process
– Autocorrelation
– Power and energy spectral densities
– Noise in communication systems
– Signal transmission through filters; linear distortions

The communications era: telephones, radios and televisions; computer terminals for Internet access; the "senses" of ships, aircraft, rockets and satellites → now moving toward mobile communications.

General block scheme of a transmission (one-way) system:

Data = encoded information represented by binary elements;
the fundamental information unit = binary digit (bit).

Data (by source):

◦ Strings of characters (text files), printable or control (ASCII, EBCDIC, Unicode);
◦ Analog data = digitally converted voice, music, pictures, and movies;
◦ Pure binary data (e.g., compiled programs in machine language) → data files.

For processing and transmission, information is represented as an electrical or optical signal; the characteristics of the channel affect the design of the building blocks in the transmission system.

• Advantages of digital communications:
– Regenerator (decision + reshaping) receiver
[Figure: an original pulse and the regenerated pulse, shown as a function of propagation distance]

– Digital signals with different content/source (voice, data, media) are treated identically: "a bit is a bit!"

[Block diagram of a one-way transmission system:
Info. source (SOURCE) → Transmitter → Channel (noise + other impairments) → Receiver → User,
with the transmitted signal leaving the transmitter, the received signal entering the receiver, and the received info. delivered to the user.
Transmitter: Formatter → Source encoder → Channel encoder → Modulator.
Receiver: Demodulator → Channel decoder → Source decoder → Formatter.]
• Important features of a DTS:
– Transmitter sends a waveform from a finite set
of possible waveforms during a limited time
– Channel distorts, attenuates the transmitted
signal and adds noise to it.
– Receiver decides which waveform was
transmitted from the noisy received signal
– Quality of Service (QoS): the probability of an erroneous decision is an important measure of system performance

Two major communication resources:
◦ Transmit power and channel bandwidth

In many communication systems, one of these resources is more precious than the other. Hence, systems can be classified as:
◦ Power-limited systems:
save power at the expense of bandwidth (for example, by using coding
schemes)

◦ Bandwidth-limited systems:
save bandwidth at the expense of power (for example, by using
spectrally efficient modulation schemes)

Channel capacity: the maximum data rate at which error-free communication over the channel can be performed (Claude Elwood Shannon, 1916–2001, 1948).

Channel capacity of the noisy (AWGN) channel (Shannon–Hartley capacity theorem), from C. E. Shannon, "A Mathematical Theory of Communication", The Bell System Technical Journal, Vol. 27, pp. 379–423, 623–656, July/Oct. 1948:

  $C = W \log_2\left(1 + \frac{S}{N}\right) \ \ [\text{bits/s}]$

  $W$ [Hz]: bandwidth
  $S$ [W]: average received signal power
  $N = N_0 W$ [W]: average noise power
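As a quick numerical illustration of the Shannon–Hartley formula (a minimal sketch, not part of the lecture; the bandwidth and SNR values below are assumed for the example):

```python
import numpy as np

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = W * log2(1 + S/N) in bits/s."""
    return bandwidth_hz * np.log2(1.0 + snr_linear)

# Assumed example: a 3.1 kHz voice-band channel with SNR = 30 dB.
W = 3100.0                       # bandwidth in Hz
snr_db = 30.0
snr = 10.0 ** (snr_db / 10.0)    # dB -> linear power ratio S/N
print(f"C = {shannon_capacity(W, snr) / 1e3:.1f} kbit/s")   # ~30.9 kbit/s
```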

Shannon's theorem puts a limit on the transmission data rate, not on the error probability:

◦ Theoretically, it is possible to transmit information at any rate Rb [b/s], where Rb ≤ C, with an arbitrarily small error probability by using a sufficiently complicated coding scheme;

◦ For an information rate Rb > C, it is not possible to find a code that can achieve an arbitrarily small error probability.

Spectral efficiency = C/W [b/s/Hz]

[Figure: the Shannon limit on spectral efficiency C/W plotted versus SNR [dB]; the region above the curve is unattainable, the region below it is the practical region.]
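To make the boundary between the two regions concrete, a minimal sketch (my own, not from the slides) tabulates the Shannon bound on spectral efficiency, C/W = log2(1 + SNR), for a few SNR values:

```python
import numpy as np

# Shannon bound on spectral efficiency: C/W = log2(1 + SNR) [b/s/Hz].
for snr_db in (-10, 0, 10, 20, 30):
    snr = 10.0 ** (snr_db / 10.0)
    print(f"SNR = {snr_db:>4} dB -> max C/W = {np.log2(1.0 + snr):5.2f} b/s/Hz")

# Any scheme claiming a higher spectral efficiency at the same SNR lies in the
# unattainable region; practical systems operate below the bound.
```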
• Deterministic and random signals
– Deterministic signal: No uncertainty with respect
to the signal value at any time.
• Modeled by explicit mathematical expressions, such as $x(t) = 5\cos(10t)$
– Random signal: Some degree of uncertainty in the signal values before the signal actually occurs.
• Can be described in terms of statistical averages
• Example: Thermal noise in electronic circuits due to
the random movement of electrons
• Ex.: Reflection of radio waves from different surfaces
• Periodic and non-periodic signals

[Figure: an example of a periodic signal and of a non-periodic signal]

• Analog (continuous) and discrete signals

[Figure: an analog signal x(t) and a discrete signal x(kT) with samples at ..., −T, 0, T, 2T, ...]

• Power and energy of signals
– The instantaneous power of a (possibly complex) signal x(t):

  $p(t) = |x(t)|^2$

– The energy dissipated during an interval T:

  $E_x^T \triangleq \int_{-T/2}^{T/2} |x(t)|^2 \, dt$
• Energy and power signals
– A signal is an energy signal if, and only if, it has nonzero but finite energy for all time ($0 < E_x < \infty$):

  $E_x \triangleq \lim_{T \to \infty} \int_{-T/2}^{T/2} |x(t)|^2 \, dt = \int_{-\infty}^{\infty} |x(t)|^2 \, dt$

– A signal is a power signal if, and only if, it has finite but nonzero power for all time ($0 < P_x < \infty$):

  $P_x \triangleq \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} |x(t)|^2 \, dt$

– Notes: 1. Energy signals have zero average power;
  2. Periodic and random signals are power signals (they have infinite energy, since they exist for all time). Signals that are both deterministic and non-periodic are energy signals.
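A small numerical check of these definitions (my own sketch; the two test signals are assumed examples): a decaying pulse has finite energy and vanishing average power, while a sinusoid has finite average power and energy that grows with the observation window.

```python
import numpy as np

dt = 1e-3
T = 200.0                                    # observation window [s]
t = np.arange(-T / 2, T / 2, dt)

pulse = np.exp(-np.abs(t))                   # energy signal: x(t) = exp(-|t|)
sine = np.cos(2 * np.pi * 5 * t)             # power signal: periodic cosine

def energy(x):                               # E_x ≈ ∫ |x(t)|^2 dt
    return np.sum(np.abs(x) ** 2) * dt

def avg_power(x):                            # P_x ≈ (1/T) ∫ |x(t)|^2 dt
    return energy(x) / T

print("pulse: E ≈", round(energy(pulse), 3), " P ≈", round(avg_power(pulse), 5))
print("sine : E ≈", round(energy(sine), 1),  " P ≈", round(avg_power(sine), 3))
# pulse: E ≈ 1.0 (finite), P ≈ 0.005 -> 0 as T grows  => energy signal
# sine : E ≈ 100 (grows with T), P ≈ 0.5               => power signal
```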

• Energy signals (Parseval's theorem):

  $E_x = \int_{-\infty}^{\infty} |x(t)|^2 \, dt = \int_{-\infty}^{\infty} |X(f)|^2 \, df$

– Energy spectral density (ESD) in J/Hz:

  $\Psi_x(f) = |X(f)|^2$

• Power signals ($x(t) = x(t+T_0)$):

  $P_x = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} |x(t)|^2 \, dt = \sum_{n=-\infty}^{\infty} |c_n|^2$

– Power spectral density (PSD) in W/Hz:

  $G_x(f) = \sum_{n=-\infty}^{\infty} |c_n|^2 \, \delta(f - n f_0)$

• Random process:
– Power spectral density (PSD):

  $G_X(f) = \mathcal{F}\{R_X(\tau)\}$
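Parseval's relation for energy signals can be verified numerically with a DFT. The sketch below is my own (the Gaussian test pulse is an assumed example); it compares the time-domain energy with the integral of the ESD |X(f)|²:

```python
import numpy as np

dt = 1e-3
t = np.arange(-5.0, 5.0, dt)
x = np.exp(-np.pi * t ** 2)                  # assumed test pulse (energy signal)

# Time-domain energy: E = ∫ |x(t)|^2 dt
E_time = np.sum(np.abs(x) ** 2) * dt

# Frequency domain: X(f) approximated by a scaled DFT, ESD = |X(f)|^2
X = np.fft.fft(x) * dt
f = np.fft.fftfreq(len(x), d=dt)
df = f[1] - f[0]
E_freq = np.sum(np.abs(X) ** 2) * df         # E = ∫ |X(f)|^2 df  (Parseval)

print(E_time, E_freq)                        # both ≈ 0.707 for this pulse
```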

• Autocorrelation of an energy signal (real-valued x(t)):

  $R_x(\tau) \triangleq \int_{-\infty}^{\infty} x(t) \, x(t+\tau) \, dt, \quad -\infty < \tau < \infty$

• Autocorrelation of a real-valued power signal:

  $R_x(\tau) \triangleq \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t) \, x(t+\tau) \, dt, \quad -\infty < \tau < \infty$

– For a periodic signal (with period T0):

  $R_x(\tau) = \frac{1}{T_0} \int_{-T_0/2}^{T_0/2} x(t) \, x(t+\tau) \, dt, \quad -\infty < \tau < \infty$

• For real-valued signals (and WSS signals in the case of random processes):
1. Autocorrelation is symmetric around zero: $R_x(\tau) = R_x(-\tau)$
2. Its maximum value occurs at the origin: $R_x(\tau) \le R_x(0)$
3. Autocorrelation and spectral density form a Fourier transform pair: $\Psi_x(f) = \mathcal{F}\{R_x(\tau)\}$ or $G_x(f) = \mathcal{F}\{R_x(\tau)\}$
4. Its value at the origin equals the average energy or average power: $R_x(0) = E_x$ or $R_x(0) = P_x$
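Properties 1, 2, and 4 can be checked numerically; the sketch below is my own (the rectangular test pulse is an assumed example) and computes the autocorrelation of an energy signal by discrete correlation:

```python
import numpy as np

dt = 1e-3
t = np.arange(-2.0, 2.0, dt)
x = np.where(np.abs(t) <= 0.5, 1.0, 0.0)      # rectangular pulse of width 1

# R_x(tau) ≈ ∫ x(t) x(t+tau) dt  (discrete correlation scaled by dt)
R = np.correlate(x, x, mode="full") * dt
i0 = len(x) - 1                               # index of tau = 0

E_x = np.sum(x ** 2) * dt                     # signal energy (≈ 1 here)
print(np.allclose(R, R[::-1]))                # property 1: R_x(tau) = R_x(-tau)
print(np.argmax(R) == i0)                     # property 2: maximum at tau = 0
print(np.isclose(R[i0], E_x))                 # property 4: R_x(0) = E_x
# Property 3 (Wiener-Khinchin): the Fourier transform of R_x gives the ESD |X(f)|^2.
```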

• High-entropy signals (useful information + noise) are modeled as random signals.
• A random process is a collection of time functions, or signals,
corresponding to various outcomes of a random experiment.
For each outcome, there exists a deterministic function, which
is called a sample function or a realization.
[Figure: an ensemble of sample functions (realizations, each a deterministic function of time t); the values taken at a fixed time instant across the ensemble form a random variable (one real number per realization).]

• The distribution function FX(x) of a random variable X:

  $F_X(x) = P(X \le x)$

• The probability density function (pdf):

  $p_X(x) \triangleq \frac{dF_X(x)}{dx}, \quad \text{where } P(x_1 \le X \le x_2) = \int_{x_1}^{x_2} p_X(x) \, dx$

• The mean value mX of a random variable X:

  $m_X \triangleq E\{X\} = \int_{-\infty}^{\infty} x \, p_X(x) \, dx$

• The variance σX² of a random variable X:

  $\sigma_X^2 = E\{(X - m_X)^2\} = \int_{-\infty}^{\infty} (x - m_X)^2 \, p_X(x) \, dx = E\{X^2\} - m_X^2$

• These are ensemble averages, where E{·} denotes the expected value.
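These ensemble averages can be approximated by numerical integration of the pdf; a minimal sketch (mine, assuming a Gaussian pdf with mX = 1 and σX = 2 as the example):

```python
import numpy as np

# Assumed example: Gaussian pdf with mean 1 and standard deviation 2.
m, sigma = 1.0, 2.0
x = np.linspace(m - 10 * sigma, m + 10 * sigma, 200_001)
dx = x[1] - x[0]
pdf = np.exp(-((x - m) ** 2) / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

total = np.sum(pdf) * dx                    # ∫ p_X(x) dx          -> 1.0
m_X   = np.sum(x * pdf) * dx                # E{X}                 -> 1.0
var   = np.sum((x - m_X) ** 2 * pdf) * dx   # E{(X - m_X)^2}       -> 4.0
E_X2  = np.sum(x ** 2 * pdf) * dx           # E{X^2}               -> 5.0
print(total, m_X, var, E_X2 - m_X ** 2)     # variance also equals E{X^2} - m_X^2
```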

• Strictly stationary: none of the statistics of the random process are affected by a shift in the time origin.

• Wide-sense stationary (WSS): the mean and the autocorrelation function do not change with a shift in the time origin.

• Cyclostationary: the mean and the autocorrelation function are periodic in time.

• Ergodic process: a random process is ergodic in the mean (mX) and in the autocorrelation [RX(τ)] if the time averages equal the corresponding ensemble averages:

  $m_X = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t) \, dt$

  and

  $R_X(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t) \, X(t+\tau) \, dt,$

  respectively.
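For an ergodic process, a single long realization is enough to estimate the ensemble statistics. A minimal sketch (my own, assuming a discrete-time process with independent Gaussian samples of mean 1.5) compares time and ensemble averages:

```python
import numpy as np

rng = np.random.default_rng(0)
n_real, n_time = 1000, 5000                     # ensemble size and realization length (assumed)
X = 1.5 + rng.normal(size=(n_real, n_time))     # process with true mean m_X = 1.5

time_avg = X[0].mean()          # time average over ONE realization
ens_avg  = X[:, 0].mean()       # ensemble average at ONE time instant
print(time_avg, ens_avg)        # both ≈ 1.5: ergodic in the mean

# Autocorrelation at a lag of 1 sample: time-average vs ensemble-average estimate.
R1_time = np.mean(X[0, :-1] * X[0, 1:])
R1_ens  = np.mean(X[:, 0] * X[:, 1])
print(R1_time, R1_ens)          # both ≈ m_X^2 = 2.25 for this memoryless process
```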

• Autocorrelation of a random signal:

  $R_X(t_1, t_2) = E\{X(t_1) \, X(t_2)\}$

– For a WSS process:

  $R_X(\tau) = E\{X(t) \, X(t+\tau)\}$

Thermal noise is described by a zero-mean Gaussian random process, n(t). Its PSD is flat; hence it is called white noise (AWGN – Additive White Gaussian Noise).

Probability density function (plotted in the original figure for σ = 1):

  $p(n) = \frac{1}{\sigma \sqrt{2\pi}} \exp\left(-\frac{n^2}{2\sigma^2}\right)$

Power spectral density:

  $G_n(f) = \frac{N_0}{2} \ \ [\mathrm{W/Hz}]$

Autocorrelation function:

  $R_n(\tau) = \frac{N_0}{2} \, \delta(\tau)$
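A quick numerical illustration (my own sketch; the noise level N0 and sampling rate are assumed values): samples of band-limited white Gaussian noise have a flat estimated PSD near N0/2 and an autocorrelation concentrated at zero lag.

```python
import numpy as np

rng = np.random.default_rng(1)
N0 = 2.0                         # assumed noise PSD level [W/Hz]
fs = 1000.0                      # sampling rate [Hz]
n_samp = 200_000

# White noise with two-sided PSD N0/2, band-limited to |f| < fs/2, has variance (N0/2)*fs.
sigma = np.sqrt(N0 / 2 * fs)
n = rng.normal(0.0, sigma, n_samp)

# Averaged periodogram: the two-sided PSD estimate should be flat at N0/2 = 1 W/Hz.
Nfft = 4096
segs = n[: (n_samp // Nfft) * Nfft].reshape(-1, Nfft)
psd = np.mean(np.abs(np.fft.fft(segs, axis=1)) ** 2, axis=0) / (Nfft * fs)
print(psd.mean())                                # ≈ N0/2 = 1.0

# Sample autocorrelation: large at lag 0, near zero at other lags (≈ (N0/2) δ(τ)).
print(np.mean(n * n), np.mean(n[:-1] * n[1:]))   # ≈ sigma^2 = 1000  vs  ≈ 0
```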

Any system introduces attenuation and propagation delay. Any transmission channel acts as a filter.
The general representation of a circuit (two-port network):

[Figure: input s(t), S(f) → h(t), H(f) → output r(t), R(f)]

h(t) = impulse response; the output is given by the convolution integral:

  $r(t) = \int_{-\infty}^{\infty} s(\tau) \, h(t - \tau) \, d\tau$

H(f) = F{h(t)} = transfer function

- for deterministic signals: $R(f) = S(f) \, H(f)$
- for random signals: $G_R(f) = G_S(f) \, |H(f)|^2$
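These relations can be checked with a discrete convolution; the sketch below is my own (the rectangular input and the exponential impulse response are assumed examples) and verifies that the time-domain convolution matches the inverse DFT of S(f)H(f):

```python
import numpy as np

dt = 1e-3
t = np.arange(0.0, 4.0, dt)
s = np.where(t < 1.0, 1.0, 0.0)          # input: rectangular pulse (assumed)
h = np.exp(-t / 0.2) / 0.2               # impulse response of an assumed first-order filter

# Time domain: r(t) = ∫ s(tau) h(t - tau) d tau  ≈ discrete convolution scaled by dt
r = np.convolve(s, h)[: len(t)] * dt

# Frequency domain: R(f) = S(f) H(f), then back to the time domain
S = np.fft.fft(s, 2 * len(t)) * dt
H = np.fft.fft(h, 2 * len(t)) * dt
r_from_freq = np.fft.ifft(S * H)[: len(t)] / dt

print(np.allclose(r, r_from_freq.real))   # True: both routes give the same output
```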
Ideal distortionless transmission: all the frequency components of the signal not only arrive with an identical time delay, but are also amplified or attenuated equally.
Hence, the conditions for no distortion (deviations from them constitute linear distortions), in the time domain:

  $r(t) = k \, s(t - t_0), \quad k = \mathrm{const.}, \ t_0 = \mathrm{const.}$

In the frequency domain:

  $R(f) = k \, S(f) \, e^{-j2\pi f t_0} \ \Rightarrow \ H(f) = \frac{R(f)}{S(f)} = k \, e^{-j2\pi f t_0} \ \Rightarrow \ \begin{cases} |H(f)| = k \\ \varphi(f) = -2\pi f t_0 \end{cases}$

  with $t_0$ = phase delay $= -\dfrac{\varphi(f)}{2\pi f}$.

We have amplitude distortions if $|H(f)| \ne \mathrm{const.}$

We have phase (delay) distortions if $-\dfrac{d\varphi(f)}{df} \ne \mathrm{const.}$

To reduce distortions we use equalizers:

  o Amplitude equalizer;
  o Phase (group delay) equalizer.
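A numerical illustration of the distortionless conditions (my own sketch; the gain k and delay t0 are assumed values): a channel with H(f) = k·e^(−j2πf·t0) has constant magnitude and constant group delay.

```python
import numpy as np

k, t0 = 0.5, 1e-3                        # assumed channel gain and delay [s]
f = np.linspace(-5e3, 5e3, 2001)         # frequency grid [Hz]

H = k * np.exp(-1j * 2 * np.pi * f * t0)             # distortionless channel
mag = np.abs(H)                                       # |H(f)| = k
phase = np.unwrap(np.angle(H))                        # phi(f) = -2*pi*f*t0
group_delay = -np.gradient(phase, f) / (2 * np.pi)    # -(1/2pi) d phi/df

print(np.allclose(mag, k))               # True: no amplitude distortion
print(np.allclose(group_delay, t0))      # True: constant delay of 1 ms, no phase distortion
```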
• Ideal filters (non-causal!):

[Figure: ideal low-pass, band-pass and high-pass frequency responses]

• Realizable filters, e.g. RC filters:

  $H(f) = \frac{1}{1 + j 2\pi f RC}$
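Unlike the ideal distortionless channel, the RC low-pass introduces both amplitude and delay distortion. A minimal sketch (mine; the time constant RC = 1 ms is an assumed value) evaluates |H(f)| and the group delay:

```python
import numpy as np

RC = 1e-3                             # assumed time constant [s]; f_3dB = 1/(2*pi*RC) ≈ 159 Hz
f = np.linspace(1.0, 2000.0, 4000)    # frequency grid [Hz]

H = 1.0 / (1.0 + 1j * 2 * np.pi * f * RC)
mag_db = 20 * np.log10(np.abs(H))                       # amplitude response [dB]
phase = np.unwrap(np.angle(H))
group_delay = -np.gradient(phase, f) / (2 * np.pi)      # [s]

for fq in (10, 159, 1000):
    i = np.argmin(np.abs(f - fq))
    print(f"f = {f[i]:7.1f} Hz   |H| = {mag_db[i]:6.2f} dB   "
          f"group delay = {group_delay[i] * 1e3:.3f} ms")

# |H(f)| is not constant (≈ -3 dB already at ~159 Hz) and the group delay varies
# with frequency, i.e. the RC filter shows both amplitude and phase (delay) distortions.
```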

