
Introductory Concept

Presented by
Dr. Md. Fazlul Kader
Associate Professor
EEE, University of Chittagong, Bangladesh
Channel Capacity

◼ Channel capacity is the tight upper bound on the rate at which
information can be reliably transmitted over a communications channel.
◼ By the noisy-channel coding theorem, the channel capacity of a
given channel is the limiting information rate (in units of information per
unit time) that can be achieved with arbitrarily small error probability.
◼ The Shannon–Hartley theorem gives the maximum rate at which
information can be transmitted over a communications channel of a
specified bandwidth in the presence of noise.
Channel Capacity

[Photos: Claude Shannon (1916–2001), Bell Labs and MIT; Ralph Hartley (1888–1970), Bell Labs. Images from Wikipedia.]
Shannon Capacity (C)

◼ C = W·log2(1 + SNR) bps
◼ W = channel bandwidth (Hz)
◼ SNR = channel signal-to-noise ratio (as a linear power ratio, not in dB)
◼ C is the maximum bit rate that can be reliably achieved over a connection.
◼ Example: analog modem (30 dB SNR, i.e., linear SNR = 1000):
C = 3500·log2(1 + 1000) ≈ 34,885 bps
◼ Example: 6 MHz TV RF channel (42 dB SNR, i.e., linear SNR ≈ 15,849):
C = 6,000,000·log2(1 + 15,849) ≈ 83.71 Mbps

◼ SNR, linear to decibel: SNRdB = 10·log10(SNRlinear)
◼ SNR, decibel to linear: SNRlinear = 10^(SNRdB/10)
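The two worked examples above can be checked with a short script. This is a Python sketch (the deck's own scripts use MATLAB/Octave); the helper names are illustrative, not from the slides:

```python
import math

def db_to_linear(snr_db):
    """Convert an SNR in decibels to a linear power ratio."""
    return 10 ** (snr_db / 10)

def shannon_capacity(bandwidth_hz, snr_linear):
    """Shannon-Hartley capacity C = W*log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Analog modem: 3500 Hz bandwidth, 30 dB SNR (linear SNR = 1000)
c_modem = shannon_capacity(3500, db_to_linear(30))

# 6 MHz TV RF channel, 42 dB SNR (linear SNR ~ 15,849)
c_tv = shannon_capacity(6_000_000, db_to_linear(42))

print(round(c_modem))   # ~34,885 bps, as on the slide
print(c_tv / 1e6)       # ~83.71 Mbps, as on the slide
```

Note that the formula takes the linear SNR; forgetting to convert from dB is a common mistake, which is why the conversion is wrapped in its own function here.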
Decibel (dB)

◼ The decibel (dB) is a logarithmic unit used to express the ratio
between two values of a physical quantity, often power or intensity.
Shannon Capacity (C)

◼ C = B·log2(1 + Ps/(N0·B)), where
◼ C is the capacity in bits per second,
◼ B is the bandwidth in hertz,
◼ Ps is the signal power, and
◼ N0 is the noise power spectral density (so N0·B is the total noise power in the band).
Capacity vs Throughput

◼ Channel capacity: the physical data rate the channel can support.
◼ Throughput (or network throughput) is the rate of successful message
delivery over a communication channel.
◼ Throughput is usually measured in bits per second (bit/s or bps), and
sometimes in data packets per second (p/s or pps) or data packets
per time slot.
◼ For example, if the throughput is 70 Mbit/s on a 100 Mbit/s Ethernet
connection, the channel efficiency is 70%; effectively 70 Mbit of data
are delivered every second.
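The efficiency arithmetic in the example above can be expressed as a tiny Python helper (the function name is illustrative, not from the slides):

```python
def channel_efficiency(throughput_bps, capacity_bps):
    """Fraction of the channel's physical data rate actually delivered."""
    return throughput_bps / capacity_bps

# 70 Mbit/s of successful delivery over a 100 Mbit/s Ethernet link
eff = channel_efficiency(70e6, 100e6)
print(f"{eff:.0%}")  # 70%
```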
Capacity with increasing signal power

%Matlab/Octave script for plotting
%capacity vs power
B = 1;
N0 = 1;
P = 0:10^4;
C = B.*log2(1 + P./(N0*B));
plot(P,C);
xlabel('power, P');
ylabel('capacity, C bit/sec');
title('Capacity vs Power')
Capacity with increasing bandwidth

◼ 1. More bandwidth means we can have more transmissions per second, hence
higher capacity.
◼ 2. However, more bandwidth also means more noise power (N0·B) at the
receiver, so for a fixed power P the capacity does not grow without bound:
as B → ∞, C approaches the wideband limit P/(N0·ln 2) ≈ 1.44·P/N0.

%Matlab/Octave script for plotting capacity
%vs bandwidth
P = 1;
N0 = 1;
B = 1:10^3;
C = B.*log2(1 + P./(N0*B));
plot(B,C)
xlabel('bandwidth, B Hz');
ylabel('capacity, C bit/sec');
title('Capacity vs Bandwidth')
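The saturation behavior that the plot shows can also be checked numerically. This Python sketch mirrors the Octave script above with the same normalization (P = N0 = 1):

```python
import math

P = 1.0   # signal power, normalized as in the slide's script
N0 = 1.0  # noise power spectral density

def capacity(B):
    """Shannon capacity C = B*log2(1 + P/(N0*B)) for bandwidth B."""
    return B * math.log2(1 + P / (N0 * B))

# Capacity grows quickly at first, then flattens out...
for B in (1, 10, 100, 1000):
    print(B, capacity(B))

# ...approaching the wideband limit P/(N0*ln 2) ~ 1.4427 bit/sec
print(P / (N0 * math.log(2)))
```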
Noise

◼ Noise is unwanted electrical or electromagnetic energy that degrades
the quality of signals and data.
◼ Noise occurs in digital and analog systems, and can affect files and
communications of all types, including
➢ text,
➢ programs,
➢ images,
➢ audio, and
➢ telemetry.
◼ Noise can come from a variety of sources, including
➢ radio waves,
➢ nearby electrical wires,
➢ lightning, and
➢ bad connections.
Additive white Gaussian noise (AWGN)

◼ Additive white Gaussian noise (AWGN) is a basic noise model used
in information theory to mimic the effect of many random processes
that occur in nature.
◼ The modifiers denote specific characteristics:
➢ 'Additive' because it is added to any noise that might be intrinsic to
the information system.
➢ 'White' refers to the idea that it has uniform power across the frequency
band of the information system. It is an analogy to the color white,
which has uniform emissions at all frequencies in the visible spectrum.
➢ 'Gaussian' because it has a normal distribution in the time domain
with an average time-domain value of zero.
Additive white Gaussian noise (AWGN)

%The commands below add white Gaussian noise to a
%sawtooth signal, then plot the original and noisy signals.
t = 0:.1:10;
x = sawtooth(t); % Create sawtooth signal.
y = awgn(x,10,'measured'); % Add white Gaussian noise at 10 dB SNR.
plot(t,x,t,y) % Plot both signals.
legend('Original signal','Signal with AWGN');

[Figure: the original sawtooth and the noisy signal, amplitude roughly -1.5 to 1.5 over t = 0 to 10]
Fading

◼ In wireless communications, fading is the variation or
attenuation of a signal with various variables. These
variables include time, geographical position, and radio
frequency.
◼ Fading is often modeled as a random process.
◼ A fading channel is a communication channel that experiences
fading.
◼ In wireless systems, fading may be due to multipath
propagation, referred to as multipath-induced fading, weather
(particularly rain), or shadowing from obstacles affecting
the wave propagation, sometimes referred to as shadow
fading.
◼ Reliably achievable data rate over a fading channel with channel gain h:

C = B.*log2(1 + (abs(h).^2).*P./(N0*B));
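Because h is random, the instantaneous capacity is random too, and a common summary is its average over the fading distribution. The Python sketch below illustrates this under a Rayleigh-fading assumption (so |h|² is exponentially distributed with mean 1); that distribution, and all the parameter values, are illustrative choices not stated in the slides:

```python
import math
import random

B = 1.0    # bandwidth (Hz), normalized as in the earlier scripts
P = 10.0   # signal power (illustrative value)
N0 = 1.0   # noise power spectral density

def faded_capacity(h_mag_sq):
    """Instantaneous capacity B*log2(1 + |h|^2 * P/(N0*B)) for channel power gain |h|^2."""
    return B * math.log2(1 + h_mag_sq * P / (N0 * B))

# Assumed Rayleigh fading: |h|^2 ~ Exponential(1), mean power gain 1.
random.seed(0)
samples = [faded_capacity(random.expovariate(1.0)) for _ in range(100_000)]
ergodic = sum(samples) / len(samples)

print(faded_capacity(1.0))  # capacity of the non-faded channel (unit gain)
print(ergodic)              # average (ergodic) capacity under fading
```

By Jensen's inequality the average capacity under fading is below the capacity of a non-faded channel with the same average power gain, which the two printed values illustrate.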
Digital Communication

◼ Digital communications include:
◼ Digital radio: systems where relatively high-frequency
analog carriers are modulated by relatively low-frequency
digital information signals, and
◼ Digital transmission: systems involving the transmission
of digital pulses.
Digital Communication

◼ Digital transmission systems transport information in
digital form and, therefore, require a physical facility
between the transmitter and receiver, such as a metallic
wire pair, a coaxial cable, or an optical fiber cable.
◼ In digital radio systems, the carrier facility could be a
physical cable, or it could be free space.
Digital Communication

◼ The property that distinguishes digital radio systems from
conventional analog modulation communications systems is the
nature of the modulating signal.
◼ Both analog and digital modulation systems use analog carriers
to transport the information through the system.
◼ However, with analog modulation systems, the information signal is
also analog, whereas with digital modulation, the information
signal is digital, which could be computer-generated data or digitally
encoded analog signals.
