Chapter 1: Overview of Communication Systems
Communication systems
Telephone network
Internet
Radio and TV broadcast
Mobile communications
Wi-Fi
Satellite and space communications
Smart power grid, …
Analog communications
AM, FM
Digital communications
Transfer of information in digits
Dominant technology today
Broadband, 3G, …
What is Communication?
Communication involves the transfer of information from
one point to another.
Three basic elements
Transmitter: converts message into a form suitable for
transmission
Channel: the physical medium; it introduces distortion, noise, and interference
Receiver: reconstructs a recognizable form of the message
Communication Channel
The channel is central to the operation of a communication system
Linear (e.g., mobile radio) or nonlinear (e.g., satellite)
Time invariant (e.g., fiber) or time varying (e.g., mobile radio)
The information-carrying capacity of a communication system is
proportional to the channel bandwidth
Pursuit of wider bandwidth
Copper wire: 1 MHz
Coaxial cable: 100 MHz
Microwave: GHz
Optical fiber: THz
• Uses light as the signal carrier
• Highest capacity among all practical transmission media
Noise in Communications
Unavoidable presence of noise in the channel
Noise refers to unwanted waves that disturb communications
Signal is contaminated by noise along the path.
External noise: interference from nearby channels, human-made noise, natural noise, ...
Internal noise: thermal noise, random emission, ... in electronic devices
Noise is one of the basic factors that set limits on
communications.
A widely used metric is the signal-to-noise (power) ratio
(SNR)
Transmitter and Receiver
The transmitter modifies the message signal into a form suitable
for transmission over the channel
This modification often involves modulation
Moving the signal to a high-frequency carrier (up-conversion) and
varying some parameter of the carrier wave
Analog: AM, FM, PM
Digital: ASK, FSK, PSK (SK: shift keying)
The receiver recreates the original message by demodulation
Recovery is not exact due to noise/distortion
The resulting degradation is influenced by the type of modulation
Design of analog communication is conceptually simple
Digital communication is more efficient and reliable; design is
more sophisticated
Objectives of System Design
Two primary resources in communications
Transmitted power (should be “green”, i.e., used sparingly)
Channel bandwidth (very expensive in the commercial market)
In certain scenarios, one resource may be more important than
the other
Power limited (e.g. deep-space communication)
Bandwidth limited (e.g. telephone circuit)
Objectives of a communication system design
The message is delivered both efficiently and reliably, subject to
certain design constraints: power, bandwidth, and cost.
Efficiency is usually measured by the amount of information delivered per unit power, unit time, and unit bandwidth.
Reliability is expressed in terms of SNR or probability of error.
Information Theory
In digital communications, is it possible to operate at zero
error rate even though the channel is noisy?
Shannon capacity:
The maximum rate at which reliable transmission is possible.
The famous Shannon capacity formula for a channel with bandwidth W (Hz):
C = W log2(1 + SNR) bps (bits per second)
(see the numeric check below)
Zero error rate is possible as long as the actual signaling rate is less than C.
Many concepts were fundamental and paved the way for future
developments in communication theory.
Provides a basis for tradeoff between SNR and bandwidth, and for
comparing different communication schemes.
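As a sanity check, a few lines of Python evaluate the capacity formula (the telephone-grade numbers W = 3.1 kHz and SNR = 30 dB are illustrative assumptions):

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon capacity C = W log2(1 + SNR) in bits per second."""
    snr_linear = 10 ** (snr_db / 10)          # convert dB to a linear power ratio
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative telephone-grade channel: W = 3.1 kHz, SNR = 30 dB
print(f"C = {shannon_capacity(3100, 30):.0f} bps")   # ~30.9 kbps
```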
Milestones in Communications
1837, Morse code used in telegraph
1864, Maxwell formulated the electromagnetic (EM) theory
1875, Bell invented the telephone
1887, Hertz demonstrated physical evidence of EM waves
1890s-1900s, Marconi & Popov, long-distance radio telegraph
– Across the Atlantic Ocean
– From Cornwall to Canada
1906, radio broadcast
1918, Armstrong invented the superheterodyne radio receiver (and FM in 1933)
1921, land-mobile communication
Milestones in Communications
1928, Nyquist proposed the sampling theorem
1947, microwave relay system
1948, Shannon founded information theory
1957, era of satellite communication began
1966, Charles Kuen Kao pioneered fiber-optic communications (2009 Nobel Prize in Physics)
1970’s, era of computer networks began
1981, analog cellular system
1988, digital cellular system debuted in Europe
2000, 3G network
Cellular Mobile Phone Network
A large area is partitioned into cells
Frequencies are reused across sufficiently separated cells to maximize capacity
Growth of Mobile Communications
1G: analog communications
AMPS
2G: digital communications
GSM
IS-95
3G: CDMA networks
WCDMA
CDMA2000
TD-SCDMA
4G: data rates up to 1 Gbps (gigabits per second)
Pre-4G technologies:
WiMAX, 3G LTE
Wi-Fi
Wi-Fi connects “local” computers (usually within a 100 m range)
Satellite/Space Communication
Satellite communication
Covers very large areas
Optimized for one-way transmission
Radio (DAB) and SatTV broadcasting
Two-way systems
The only choice for remote-area and maritime
communications
Propagation delay (about 0.25 s over a geostationary link) is uncomfortable in voice communications
Space communication
Missions to Moon, Mars, …
Long distance, weak signals
High-gain antennas
Powerful error-control coding
Future Wireless Networks
Ubiquitous Communication Among People and Devices
Communication Networks
Today’s communication networks are complicated systems
A large number of users sharing the medium
Hosts: devices that communicate with each other
Routers: route data through the network
Concept of Layering
The network functionality is partitioned into layers, each doing a relatively simple task (see the sketch below)
Protocol stack
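A toy sketch of the idea (the layer names and header fields are invented for illustration): each layer treats whatever is handed down from above as an opaque payload and prepends its own header; the receiving stack strips the headers in reverse order.

```python
def send(message: str) -> str:
    """Each layer prepends its own header to the payload from the layer above."""
    segment = f"[TCP port=80]{message}"         # transport layer
    packet = f"[IP dst=10.0.0.2]{segment}"      # network layer
    frame = f"[ETH mac=aa:bb]{packet}"          # link layer
    return frame                                # bits onto the channel

def receive(frame: str) -> str:
    """Each layer strips its own header, in reverse order."""
    packet = frame.split("]", 1)[1]             # link layer
    segment = packet.split("]", 1)[1]           # network layer
    return segment.split("]", 1)[1]             # transport layer

assert receive(send("hello")) == "hello"
```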
Why Probability/Random Process?
Probability is the core mathematical tool for communication theory.
The stochastic model is widely used in the study of communication
systems.
Consider a radio communication system, where the received signal is inherently a random process:
Message is random. No randomness, no information.
Interference is random.
Noise is a random process.
And many more (delay, phase, fading, ...)
Other real-world applications of probability and random processes
include
Stock market modeling, gambling …
Probabilistic Concepts
What is a random variable (RV)?
It is a variable whose value is determined by the outcome of a random experiment.
What is a random experiment?
It is an experiment the outcome of which cannot be
predicted precisely.
All possible identifiable outcomes of a random
experiment constitute its sample space S.
An event is a collection of possible outcomes of the
random experiment.
Example
For tossing a coin, S = { H, T }
For rolling a die, S = { 1, 2, …, 6 }
Probability Properties
PX(xi): the probability of the random variable X taking
on the value xi
The probability of an event is a non-negative number, with the following properties:
The probability of the event that includes all possible
outcomes of the experiment is 1.
The probability of the union of two events that have no common outcome is the sum of their individual probabilities.
Example
Roll a die: PX(X = k) = 1/6 for k = 1, 2, …, 6
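A quick Python simulation of the die experiment shows each relative frequency approaching 1/6 ≈ 0.167:

```python
import numpy as np

rng = np.random.default_rng(0)
rolls = rng.integers(1, 7, size=100_000)   # fair die: outcomes 1..6
for k in range(1, 7):
    print(k, np.mean(rolls == k))          # each relative frequency ~ 1/6
```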
Overview
Introduction
Probability and random processes
Probability
Introduction
cdf and pdf
Mean and variance
Joint distribution
Central limit theorem
Random processes
Definition
Stationary random processes
Power spectral density
Noise
CDF and PDF
The (cumulative) distribution function (cdf) of a random variable X is defined as the probability of X taking a value less than or equal to the argument x:
FX(x) = P(X ≤ x)
Properties:
0 ≤ FX(x) ≤ 1; FX(x) is non-decreasing; FX(−∞) = 0 and FX(+∞) = 1
The probability density function (pdf) is the derivative of the cdf:
fX(x) = dFX(x)/dx, with fX(x) ≥ 0 and ∫ fX(x) dx = 1
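A quick empirical check in Python (assuming a standard Gaussian RV for concreteness): the empirical cdf of a large sample tracks the analytic FX(x).

```python
import math
import numpy as np

rng = np.random.default_rng(1)
samples = np.sort(rng.standard_normal(10_000))
ecdf = np.arange(1, samples.size + 1) / samples.size   # empirical FX at each sample

def normal_cdf(x: float) -> float:
    """Analytic cdf of a standard Gaussian, via the error function."""
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

for x in (-2.0, -1.0, 0.0, 1.0, 2.0):
    emp = ecdf[np.searchsorted(samples, x)]
    print(f"FX({x:+.0f}): empirical {emp:.3f}, analytic {normal_cdf(x):.3f}")
```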
Mean and Variance
Mean (or expected value): E[X] = ∫ x fX(x) dx
Variance: Var(X) = E[(X − E[X])²] = E[X²] − (E[X])²
Normal (Gaussian) Distribution
A Gaussian RV with mean μ and variance σ² has pdf
fX(x) = (1/√(2πσ²)) exp(−(x − μ)²/(2σ²))
Uniform Distribution
A uniform RV on [a, b] has pdf fX(x) = 1/(b − a) for a ≤ x ≤ b (0 otherwise),
mean (a + b)/2, and variance (b − a)²/12
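A simulation check of these closed forms, folding in the mean and variance definitions above (the interval [2, 5] is an arbitrary illustrative choice):

```python
import numpy as np

a, b = 2.0, 5.0                          # illustrative interval
rng = np.random.default_rng(2)
x = rng.uniform(a, b, size=1_000_000)

print(x.mean(), (a + b) / 2)             # sample mean vs. (a+b)/2 = 3.5
print(x.var(), (b - a) ** 2 / 12)        # sample variance vs. (b-a)^2/12 = 0.75
```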
Joint Distribution
Joint distribution function for two random variables X and Y:
FXY(x, y) = P(X ≤ x, Y ≤ y)
Properties:
FXY(−∞, −∞) = 0; FXY(+∞, +∞) = 1; FXY(x, +∞) = FX(x) (and similarly for Y)
Joint pdf: fXY(x, y) = ∂²FXY(x, y)/∂x∂y
Independent vs. Uncorrelated
Independent implies Uncorrelated
Uncorrelated does not imply Independence
For normal RVs (jointly Gaussian), Uncorrelated implies Independent (this is the only exceptional case!)
An example of uncorrelated but dependent RVs: X uniform on [−1, 1] and Y = X²; then Cov(X, Y) = E[X³] = 0, yet Y is completely determined by X
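The same example in a few lines of Python: the covariance estimate is near zero, yet conditioning on X drastically changes the distribution of Y.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.uniform(-1, 1, size=1_000_000)
y = x ** 2                                   # fully determined by x, hence dependent

cov = np.mean(x * y) - np.mean(x) * np.mean(y)
print(f"covariance ~ {cov:+.4f}")            # ~0: uncorrelated

# y < 0.01 exactly when |x| < 0.1, so conditioning on x changes P(y < 0.01)
print(np.mean(y < 0.01))                     # ~0.1 unconditionally
print(np.mean(y[np.abs(x) < 0.1] < 0.01))    # 1.0 given |x| < 0.1
```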
Joint Distribution of n RVs
Joint cdf: F(x1, ..., xn) = P(X1 ≤ x1, ..., Xn ≤ xn)
Joint pdf: f(x1, ..., xn) = ∂ⁿF(x1, ..., xn) / ∂x1 ··· ∂xn
Independent: the joint pdf factors into the product of the marginals,
f(x1, ..., xn) = f(x1) f(x2) ··· f(xn)
Central Limit Theorem
For i.i.d. random variables, the sum
z = x1 + x2 + · · · + xn
(suitably shifted and scaled) tends to Gaussian as n goes to infinity.
Extremely useful in communications.
That's why noise is usually Gaussian. We often say “Gaussian noise” or “Gaussian channel” in communications.
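A quick demonstration (summing i.i.d. uniform variables; n = 30 and the 2.0 threshold are arbitrary illustrative choices): after standardizing the sum, the upper-tail probability matches the standard Gaussian value Q(2) ≈ 0.0228.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 30
z = rng.uniform(0, 1, size=(100_000, n)).sum(axis=1)   # sums of n i.i.d. uniforms
z = (z - n * 0.5) / np.sqrt(n / 12)                    # standardize: mean n/2, var n/12
print(np.mean(z > 2.0))                                # ~0.0228, the Gaussian Q(2)
```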
What is a Random Process?
A random process is a time-varying function that assigns
the outcome of a random experiment to each time instant:
X(t).
For a fixed outcome (sample path): a random process is a time-varying function, e.g., a signal.
For fixed t: a random process is a random variable.
If one scans all possible outcomes of the underlying
random experiment, we shall get an ensemble of signals.
Noise can often be modeled as a Gaussian random
process.
An Ensemble of Signals
Statistics of a Random Process
For fixed t: the random process becomes a random variable, with mean
μX(t) = E[X(t)]
and autocorrelation function
RX(t1, t2) = E[X(t1) X(t2)]
Stationary Random Processes
A random process is (wide-sense) stationary if
Its mean does not depend on t: μX(t) = μX
Its autocorrelation depends only on the time difference τ: RX(t + τ, t) = RX(τ)
Example
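A classic instance is the random-phase sinusoid X(t) = A cos(2πft + Θ) with Θ uniform on [0, 2π): its ensemble mean is 0 for every t and RX(τ) = (A²/2) cos(2πfτ). A Monte Carlo check (the parameter values are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(5)
A, f = 1.0, 5.0
theta = rng.uniform(0, 2 * np.pi, size=200_000)    # one random phase per realization

def X(t):
    """Ensemble of realizations of the random-phase sinusoid, sampled at time t."""
    return A * np.cos(2 * np.pi * f * t + theta)

tau = 0.02
for t in (0.0, 0.13, 0.4):
    mean = X(t).mean()                             # ~0, independent of t
    R = np.mean(X(t) * X(t + tau))                 # ~(A^2/2)cos(2*pi*f*tau) ~ 0.405
    print(f"t={t}: mean {mean:+.4f}, R(t+tau, t) {R:+.4f}")
```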
Power Spectral Density
Power spectral density (PSD) is a function that measures the distribution of the power of a random process over frequency.
PSD is only defined for stationary processes.
Wiener-Khinchine relation: the PSD is equal to the Fourier transform of the autocorrelation function:
SX(f) = ∫ RX(τ) e^(−j2πfτ) dτ
Passing Through a Linear System
For a stationary input X(t) to a linear time-invariant system with frequency response H(f), the output PSD is
SY(f) = |H(f)|² SX(f)
Noise
Noise consists of unwanted waves, beyond our control, that disturb the transmission of signals.
External sources: e.g., atmospheric, galactic noise, interference;
Internal sources: generated by communication devices
themselves.
This type of noise represents a basic limitation on the
performance of electronic communication systems.
Shot noise: electrons are discrete and do not flow in a continuous steady stream, so the current fluctuates randomly.
Thermal noise: caused by the rapid and random motion of
electrons within a conductor due to thermal agitation.
Both are often stationary and have a zero-mean Gaussian
distribution (following from the central limit theorem).
White Noise
The additive noise channel: r(t) = s(t) + n(t)
n(t) models all types of noise
zero mean
White noise
Its power spectral density (PSD) is constant over all frequencies, i.e.,
Sn(f) = N0/2 for all f (the factor 1/2 reflects the two-sided PSD convention)
Ideal Low-Pass White Noise
Suppose white noise is applied to an ideal low-pass filter of bandwidth B, so that
SN(f) = N0/2 for |f| ≤ B, and 0 otherwise
Then the noise power is P = N0B, and the autocorrelation function is
RN(τ) = N0B sinc(2Bτ)
where sinc(x) = sin(πx)/(πx)
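A numerical sanity check (the sample rate, N0, and B below are illustrative assumptions, and an FFT brickwall filter stands in for the ideal LPF): the measured output power comes out close to N0B.

```python
import numpy as np

rng = np.random.default_rng(6)
fs, N0, B = 10_000.0, 2e-3, 500.0        # sample rate (Hz), PSD level, bandwidth
# Discrete-time white noise with two-sided PSD N0/2 has variance (N0/2) * fs
n = rng.standard_normal(2**18) * np.sqrt(N0 / 2 * fs)

# Ideal (brickwall) low-pass filter applied in the frequency domain
spectrum = np.fft.rfft(n)
spectrum[np.fft.rfftfreq(n.size, d=1 / fs) > B] = 0.0
filtered = np.fft.irfft(spectrum, n.size)

print(filtered.var(), N0 * B)            # both ~1.0: output power = N0*B
```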
Bandpass Noise
Any communication system that uses carrier modulation
will typically have a bandpass filter of bandwidth B at the
front-end of the receiver.
Example
If white noise with PSD of N0/2 is passed through an ideal bandpass filter with passband │f ∓ fc│ ≤ B, then the PSD of the noise that enters the receiver is given by:
SN(f) = N0/2 for │f ∓ fc│ ≤ B, and 0 otherwise
Autocorrelation function:
RN(τ) = 2N0B sinc(2Bτ) cos(2πfcτ)
Representation of Bandpass Noise
Consider bandpass noise within │f − fc│ ≤ B with any PSD (i.e., not necessarily white as in the previous example)
Consider a frequency slice Δf at frequencies fk and −fk.
For Δf small, this slice contributes a sinusoid
nk(t) = ak cos(2πfk t + θk)
with random amplitude ak and random phase θk
Representation of Bandpass Noise
The complete bandpass noise waveform n(t) can be constructed by summing up such sinusoids over the entire band, i.e.,
n(t) = Σk ak cos(2πfk t + θk)
Expanding each term about the center frequency fc yields the in-phase/quadrature representation
n(t) = nc(t) cos(2πfc t) − ns(t) sin(2πfc t)
where nc(t) is the in-phase component and ns(t) is the quadrature component
Extraction and Generation
nc(t) and ns(t) are fully representative of bandpass noise.
Given bandpass noise, one may extract its in-phase and quadrature components (using LPFs of bandwidth B). This is extremely useful in analysis of noise in communication receivers.
Given the two components, one may generate
bandpass noise. This is useful in computer simulation.
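A minimal simulation sketch of both directions (the sample rate, carrier, and bandwidth are illustrative assumptions, and an FFT brickwall filter stands in for the ideal LPF): generate bandpass noise from nc(t) and ns(t), then recover them by mixing down and low-pass filtering.

```python
import numpy as np

rng = np.random.default_rng(7)
fs, fc, B = 65536.0, 8000.0, 1000.0          # sample rate, carrier, bandwidth (Hz)
t = np.arange(65536) / fs                    # one-second observation window

def lowpass(x, cutoff):
    """Brickwall low-pass filter implemented in the frequency domain."""
    X = np.fft.rfft(x)
    X[np.fft.rfftfreq(x.size, d=1 / fs) > cutoff] = 0.0
    return np.fft.irfft(X, x.size)

# Generation: build bandpass noise from two independent low-pass processes
nc = lowpass(rng.standard_normal(t.size), B)   # in-phase component
ns = lowpass(rng.standard_normal(t.size), B)   # quadrature component
n = nc * np.cos(2 * np.pi * fc * t) - ns * np.sin(2 * np.pi * fc * t)

# Extraction: mix down to baseband, then low-pass filter with bandwidth B
nc_hat = lowpass(2 * n * np.cos(2 * np.pi * fc * t), B)
ns_hat = lowpass(-2 * n * np.sin(2 * np.pi * fc * t), B)

print(np.allclose(nc_hat, nc), np.allclose(ns_hat, ns))   # True True
print(n.var(), nc.var(), ns.var())           # the three variances agree
```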
Properties
If the noise n(t) has zero mean, then nc(t) and ns(t) have
zero mean.
If the noise n(t) is Gaussian, then nc(t) and ns(t) are
Gaussian.
If the noise n(t) is stationary, then nc(t) and ns(t) are
stationary.
If the noise n(t) is Gaussian and its power spectral density S(f) is symmetric with respect to the central frequency fc, then nc(t) and ns(t) are statistically independent.
The components nc(t) and ns(t) have the same variance
(= power) as n(t).
PSD
Further, each baseband noise waveform will have the same PSD:
Sc(f) = Ss(f) = S(f − fc) + S(f + fc) for |f| ≤ B, and 0 otherwise
Phasor Representation
We may write bandpass noise in the alternative form:
n(t) = r(t) cos(2πfc t + φ(t))
where r(t) = √(nc(t)² + ns(t)²) is the envelope and φ(t) = tan⁻¹(ns(t)/nc(t)) is the phase
Summary
White noise: PSD is constant over an infinite bandwidth.
Gaussian noise: PDF is Gaussian.
Bandpass noise
In-phase and quadrature components nc(t) and ns(t) are low-pass random processes.
nc(t) and ns(t) have the same PSD.
nc(t) and ns(t) have the same variance as the bandpass noise n(t).
Such properties will be essential to the performance
analysis of bandpass communication systems.
The in-phase/quadrature representation and phasor
representation are not only basic to the characterization of
bandpass noise itself, but also to the analysis of bandpass
communication systems.