
MUSHTAQ AHMAD DAR 2019PEE0032

Experiment No. 6: Channel Capacity
AIM: Experimental verification of the following using MATLAB.
➢ Shannon capacity theorem
➢ Channel capacity calculation for a binary symmetric channel (BSC) and for cascaded BSCs

REQUIREMENTS: Computer with MATLAB software.


THEORY:
Channel Capacity:
Suppose a source sends r messages per second and the entropy of each message is H bits per
message. The information rate is then R = r*H bits/second. One can intuitively reason that, for a
given communication system, as the information rate increases the number of errors per
second will also increase.
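
As a small illustration of the information-rate formula (the values below are assumed, not taken from the experiment), the entropy H can be computed from a hypothetical symbol probability distribution and multiplied by an assumed message rate r:

% Illustrative sketch with assumed values: information rate R = r*H
p = [0.5 0.25 0.125 0.125];     % hypothetical symbol probabilities
H = -sum(p .* log2(p));         % source entropy in bits/message (= 1.75 here)
r = 1000;                       % assumed message rate in messages/second
R = r * H;                      % information rate in bits/second (= 1750 here)
fprintf('H = %.2f bits/message, R = %.0f bits/second\n', H, R);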
Shannon’s theorem:
The Shannon-Hartley equation relates the maximum capacity (transmission bit rate) that can be achieved
over a given channel with certain noise characteristics and bandwidth. For an additive white Gaussian
noise (AWGN) channel, the maximum capacity is given by

C = B*log2(1 + S/N)

Here C is the maximum capacity of the channel in bits/second, also called Shannon’s capacity
limit for the given channel; B is the bandwidth of the channel in Hertz; S is the signal power in Watts;
and N is the noise power, also in Watts. The ratio S/N is called the signal-to-noise ratio (SNR). The
maximum rate at which information can be transmitted without error is therefore limited by the
bandwidth, the signal level, and the noise level: the equation tells how many bits can be transmitted
per second without error over a channel of bandwidth B Hz, when the signal power is limited to S
Watts and the channel is exposed to additive white Gaussian noise of power N Watts.
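
As an illustrative example (not taken from the original report): a channel with B = 3000 Hz and S/N = 1000 (30 dB) supports at most C = 3000*log2(1 + 1000) ≈ 29,900 bits/second.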

Source code :-

% Verification of Shannon's capacity theorem: C = B*log2(1 + S/N)
clc;
clear;
close all;

% Capacity vs. bandwidth with fixed signal power S and fixed noise
% power spectral density N0 (total noise power N = N0*B grows with B)
B = 1:0.001:100;           % bandwidth in Hz
S = 1;                     % signal power in Watts
N0 = 1;                    % noise power spectral density in Watts/Hz
N = N0*B;                  % total noise power in Watts
C = B.*log2(1 + S./N);     % Shannon capacity in bits/second
subplot(2,1,1);
plot(B,C);
title('Capacity vs. bandwidth (fixed S and N0)');
xlabel('Bandwidth (Hz)');
ylabel('Capacity (bits/s)');

% Capacity vs. SNR with fixed bandwidth
B = 10;                    % bandwidth in Hz
S = 1:0.01:500;            % signal power in Watts
N0 = 1;                    % noise power spectral density in Watts/Hz
N = N0*B;                  % total noise power in Watts
snr = S/N;                 % signal-to-noise ratio
C = B.*log2(1 + snr);      % Shannon capacity in bits/second
subplot(2,1,2);
plot(snr,C);
title('Capacity vs. SNR (fixed bandwidth)');
xlabel('SNR');
ylabel('Capacity (bits/s)');
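
One point worth noting from the first plot (my observation, not stated in the original): with S and N0 fixed, C = B*log2(1 + S/(N0*B)) does not grow without bound as B increases; it saturates at the Shannon limit S/(N0*ln 2) ≈ 1.44*S/N0 bits/second. A quick check under the same assumed values S = N0 = 1:

% Capacity at large B should approach 1/log(2) ~ 1.4427 bits/second
fprintf('C at B = 100: %.4f, limit S/(N0*log(2)): %.4f\n', ...
        100*log2(1 + 1/100), 1/log(2));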

Results: -
[Figure: capacity vs. bandwidth (top subplot) and capacity vs. SNR (bottom subplot).]

Shannon-Fano Coding Algorithm:


The procedure evaluates each symbol's probability and assigns a code word with a corresponding
code length. Compared to other methods, Shannon-Fano coding is easy to implement, although in
practice it is of limited importance, since it does not in general produce optimal codes.
Shannon-Fano coding mainly makes sense when a simple algorithm with good performance and
minimal programming effort is required.

Source Code: -

clc;
clear;
close all;

% Shannon-Fano coding: sort the probabilities in descending order,
% compute the cumulative probability of all preceding symbols, and take
% the first ceil(-log2(p(i))) bits of the binary expansion of each
% cumulative sum as the code word.

num = input('Enter the no. of prob : ');

prob = zeros(1,num);
disp('Enter the probabilities ');
for i = 1:num
    fprintf('probability %d\n', i);
    prob(i) = input('');
end

p = sort(prob, 'descend');     % probabilities in descending order
display(p);

% Cumulative probability of all preceding symbols
a(1) = 0;
for j = 2:length(p)
    a(j) = a(j-1) + p(j-1);
end
fprintf(' A Matrix');
display(a);

% Code length for each symbol: n(i) = ceil(-log2(p(i)))
for i = 1:length(p)
    n(i) = ceil(-log2(p(i)));
end
fprintf(' Code length matrix');
display(n);

% Code word for symbol i: first n(i) bits of the binary expansion of a(i)
for i = 1:length(p)
    z = [];
    b = a(i);
    for j = 1:n(i)
        f = b*2;
        c = floor(f);          % next binary digit
        f = f - c;
        z = [z c];
        b = f;
    end
    fprintf('Codeword %d', i);
    display(z);
end

Output:

Enter the no. of prob : 5
Enter the probabilities
probability 1
0.15
probability 2
0.05
probability 3
0.25
probability 4
0.15
probability 5
0.4

p = 0.4000 0.2500 0.1500 0.1500 0.0500

A Matrix
a = 0 0.4000 0.6500 0.8000 0.9500

Code length matrix
n = 2 2 3 3 5

Codeword 1
z = 0 0
Codeword 2
z = 0 1
Codeword 3
z = 1 0 1
Codeword 4
z = 1 1 0
Codeword 5
z = 1 1 1 1 0
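
As a sanity check on this output (added here, not part of the original report), the average code length can be compared against the source entropy; with lengths n(i) = ceil(-log2(p(i))) the average length L is guaranteed to satisfy H <= L < H + 1 bits per symbol. For the probabilities above:

% Compare average code length against source entropy
p = [0.4 0.25 0.15 0.15 0.05];
n = [2 2 3 3 5];
H = -sum(p .* log2(p));    % entropy, approx. 2.066 bits/symbol
L = sum(p .* n);           % average code length, 2.45 bits/symbol
fprintf('H = %.3f bits, L = %.3f bits\n', H, L);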

clc;
clear;
close all;

% Entropy of a binary source (coin) as a function of probability p
p = 0.01:0.01:0.99;                   % avoid p = 0 and 1 (0*log2(0) gives NaN)
h = -p.*log2(p) - (1-p).*log2(1-p);   % binary entropy function H(p)
subplot(221);
plot(p,h);
title('entropy coin');
xlabel('probability');
ylabel('H(p)');

% Capacity of a BSC with crossover probability p: C = 1 - H(p)
m = 1 - h;
subplot(223);
plot(p,m);
title('BSC');
xlabel('probability');
ylabel('Capacity');

Result:
[Figure: binary entropy H(p) (top) and BSC capacity 1 - H(p) (bottom) versus crossover probability.]
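
The AIM also mentions cascading of BSCs, which the listing above does not cover. A minimal sketch, assuming two identical BSCs with crossover probability p connected in cascade: a bit is flipped end-to-end only when exactly one of the two stages flips it, so the effective crossover probability is p2 = 2*p*(1-p) and the cascade capacity is C = 1 - H(p2). The cascade is therefore never better than a single BSC.

% Capacity of two cascaded BSCs (sketch; assumes identical crossover prob p)
p = 0.01:0.01:0.99;
p2 = 2*p.*(1-p);                            % effective crossover of the cascade
Hp2 = -p2.*log2(p2) - (1-p2).*log2(1-p2);   % binary entropy of p2
C2 = 1 - Hp2;                               % capacity of the cascaded channel
figure;
plot(p, C2);
title('Cascaded BSC capacity');
xlabel('crossover probability p');
ylabel('Capacity');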
