Channel Coding I

This document provides an introduction to channel coding and information theory. It discusses channel coding preliminaries and different channel models including binary symmetric channels, discrete memoryless channels, and additive white Gaussian noise channels. It then covers channel capacity calculations and limits based on Shannon's noisy channel coding theorem. Examples are provided to illustrate key concepts around channel capacity and repetition coding intuition. MATLAB simulations are introduced to analyze digital communication systems and channel coding.

ECNG 6703 - Principles of Communications

Introduction to Information Theory - Channel Coding: Part I

Sean Rocke

September 30th, 2013


Outline

Channel Coding Preliminaries
Channel Models
Channel Capacity
Conclusion


Channel Coding Preliminaries

Channel Coding in Context


Consider the following digital comms examples:
- Taking & posting a narcissistic picture of you ziplining in Chaguaramas on your Facebook profile
- A GSM phone conversation
- Sending instrumentation data to a control system for a manufacturing plant
- Downloading a legal copy of an ebook from amazon.com
- Live transmission of a Machel Montano concert over the Internet

Last lecture... Source Coding: representing the information to be transmitted with as few bits as possible (data compression).

This lecture... Channel Coding: how do we ensure that the compressed data is transmitted reliably over a possibly unreliable channel? (Error detection & correction)

Recall: Elements of a Digital Communications System


- Information source and input transducer
- Source encoder
- Channel encoder
- Digital modulator
- Channel
- Digital demodulator
- Channel decoder
- Source decoder
- Output transducer

Elements not specifically included in the illustration: carrier and symbol synchronization, A/D interface, channel interfaces (e.g., RF front end (RFFE), fiber optic front end (FOFE), BAN front end (BANFE), ...)


Channel Models

Channel Coding Defined


Channel encoding: introduce, in a controlled manner, some redundancy in the binary information sequence, which can be used at the receiver to overcome the effects of noise & interference encountered during signal transmission through the channel.

Channel coding challenge: how can the source output be transmitted across an unreliable channel and reliably received, with as little power, bandwidth, and implementation complexity as possible?

Key performance metrics: coding rate, bit error rate, power efficiency, bandwidth efficiency, implementation complexity.

To answer the above, it is essential to model the channel...

Channel Models
A general communication channel is described in terms of:
1. Input alphabet: set of possible inputs, X = {x1, ..., xm}
2. Output alphabet: set of possible outputs, Y = {y1, ..., yn}
3. Transition probabilities: conditional probability for each possible input-to-output mapping, P(Y = yj | X = xi)

Note: for hard decoding |X| = |Y|; for soft decoding |X| ≠ |Y| (the detector output is quantized more finely than the input alphabet).

Memoryless channel: for a length-n input sequence x = (x[1], ..., x[n]) and length-n output sequence y = (y[1], ..., y[n]), the output at time i depends only on the input at time i, i.e., P(y|x) = ∏_{i=1}^{n} P(y[i] | x[i]) for all n.
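A quick numerical illustration of the memoryless factorization in MATLAB (the transition matrix and the two sequences below are made up for this example, not from the lecture):

% Memoryless channel: P(y|x) factors into per-symbol transition probabilities.
P = [0.9 0.1; 0.2 0.8];           % P(i,j) = P(Y = j-1 | X = i-1), binary alphabets
x = [0 1 1 0 1];                  % example input sequence
y = [0 1 0 0 1];                  % example output sequence
probs = zeros(size(x));
for i = 1:length(x)
    probs(i) = P(x(i)+1, y(i)+1); % per-symbol probability P(y[i] | x[i])
end
P_seq = prod(probs);              % P(y|x) = product of per-symbol probabilities
fprintf('P(y|x) = %.6f\n', P_seq);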

Binary Symmetric Channel (BSC)


Binary Symmetric Channel (BSC):
1. Input alphabet: X = {0, 1}
2. Output alphabet: Y = {0, 1}
3. Transition probability matrix:
   P(Y|X) = [ P(Y=0|X=0)  P(Y=1|X=0) ; P(Y=0|X=1)  P(Y=1|X=1) ] = [ 1-p  p ; p  1-p ]

p is the average probability of bit error in the transmitted sequence (i.e., due to channel noise and other disturbances). The channel is obviously memoryless!
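A minimal MATLAB sketch of a BSC (the values of p and N are illustrative): transmit random bits, flip each independently with probability p, and compare the empirical error rate against p.

p = 0.1;                          % crossover probability
N = 1e6;                          % number of transmitted bits
tx = randi([0 1], 1, N);          % random information bits
flips = rand(1, N) < p;           % error pattern: 1 where the channel flips a bit
rx = xor(tx, flips);              % BSC output
ber = mean(rx ~= tx);             % empirical bit error rate
fprintf('Empirical BER = %.4f (expected %.4f)\n', ber, p);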


Discrete Memoryless Channel (DMC)


Discrete Memoryless Channel (DMC):
1. Input alphabet: X = {x0, ..., xm} (discrete alphabet)
2. Output alphabet: Y = {y0, ..., yn} (discrete alphabet)
3. Transition probability matrix:
   P(Y|X) = [ P(Y=y0|X=x0)  ...  P(Y=yn|X=x0) ; ... ; P(Y=y0|X=xm)  ...  P(Y=yn|X=xm) ]
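A minimal sketch of passing symbols through a DMC specified by its transition probability matrix (the 3x3 matrix and the uniform input below are illustrative, not from the lecture):

P = [0.8 0.1 0.1;                 % P(i,j) = P(Y = y_j | X = x_i)
     0.1 0.8 0.1;
     0.1 0.1 0.8];
N = 1e5;
x = randi([1 3], 1, N);           % input symbol indices (uniform for this example)
y = zeros(1, N);
for k = 1:N
    cdf = cumsum(P(x(k), :));     % conditional CDF of Y given X = x(k)
    y(k) = find(rand <= cdf, 1);  % draw the output symbol from row x(k)
end
% The empirical transition matrix should approach P for large N
Phat = zeros(3, 3);
for i = 1:3
    Phat(i, :) = histcounts(y(x == i), 0.5:1:3.5) / sum(x == i);
end
disp(Phat);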


Channel Examples
Example: Sketch the channels and discuss the relationships between the input and output for the following channels:
1. Lossless channel, [P(Y|X)] = [ 3/4 1/4 0 0 0 ; 0 0 2/3 1/3 0 ; 0 0 0 0 1 ]
2. Deterministic channel, [P(Y|X)] = [ 1 0 0 ; 1 0 0 ; 0 1 0 ; 0 1 0 ; 0 0 1 ]
3. Noiseless channel, [P(Y|X)] = [ 1 0 0 ; 0 1 0 ; 0 0 1 ]
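As a numerical sanity check on these definitions (a sketch assuming a uniform input PMF, which the example does not specify): a lossless channel gives H(X|Y) = 0, while deterministic and noiseless channels give H(Y|X) = 0.

channels = { [3/4 1/4 0 0 0; 0 0 2/3 1/3 0; 0 0 0 0 1], ...  % lossless
             [1 0 0; 1 0 0; 0 1 0; 0 1 0; 0 0 1], ...        % deterministic
             eye(3) };                                        % noiseless
names = {'Lossless', 'Deterministic', 'Noiseless'};
for c = 1:3
    P    = channels{c};                    % P(i,j) = P(Y = y_j | X = x_i)
    m    = size(P, 1);
    px   = ones(m, 1) / m;                 % uniform input PMF (assumption)
    Pxy  = diag(px) * P;                   % joint PMF P(x, y)
    py   = sum(Pxy, 1);                    % output PMF
    PxgY = Pxy ./ repmat(py + (py == 0), m, 1);   % P(x|y), guarding py = 0
    k    = Pxy > 0;                        % sum only over nonzero joint entries
    HYgX = -sum(Pxy(k) .* log2(P(k)));     % H(Y|X): 0 for deterministic/noiseless
    HXgY = -sum(Pxy(k) .* log2(PxgY(k)));  % H(X|Y): 0 for lossless/noiseless
    fprintf('%-13s H(Y|X) = %.4f, H(X|Y) = %.4f\n', names{c}, HYgX, HXgY);
end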

Discrete-Input, Continuous-Output Channel

Discrete-Input, Continuous-Output Channel:
1. Input: X = {x0, ..., xm} (discrete alphabet)
2. Output: Y = R (unquantized/continuous detector output)
3. Transition probabilities: P(Y = y | X = xi) = f_{Y|X}(y|xi), xi ∈ X, y ∈ R

Example: AWGN channel, Y = X + N, where N is a zero-mean Gaussian RV with variance σ² (i.e., N ~ N(0, σ²)), and

f_{Y|X}(y|xi) = (1 / sqrt(2π σ²)) · exp( -(y - xi)² / (2σ²) )
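A minimal sketch of the AWGN example (the input level x_i = +1 and the noise variance are made-up values): generate samples of Y = X + N conditioned on X = x_i and compare their empirical density against the Gaussian conditional pdf above.

sigma2 = 0.25;                          % noise variance (assumption)
x_i    = 1;                             % condition on the input symbol X = +1
N      = 1e5;
y      = x_i + sqrt(sigma2) * randn(1, N);       % channel output samples
% Theoretical conditional pdf f_{Y|X}(y | x_i) for comparison
yy = linspace(x_i - 4*sqrt(sigma2), x_i + 4*sqrt(sigma2), 200);
f  = exp(-(yy - x_i).^2 / (2*sigma2)) / sqrt(2*pi*sigma2);
histogram(y, 'Normalization', 'pdf'); hold on;   % empirical density
plot(yy, f, 'LineWidth', 1.5);                   % Gaussian conditional pdf
xlabel('y'); ylabel('f_{Y|X}(y | x_i = +1)');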


Other Channel Models

Discrete-Time AWGN Channel:
1. Input: X = R (unquantized/continuous-valued input)
2. Output: Y = R (unquantized/continuous-valued detector output)
3. At time instant i, y[i] = x[i] + n[i], where n[i] ~ N(0, σ²)
4. Transition probabilities: P(Y = y[i] | X = x[i]) = f_{Y|X}(y[i]|x[i]), x[i], y[i] ∈ R


Other Channel Models


AWGN Waveform Channel:
1. Input: x(t) (continuous-time, unquantized/continuous-valued input)
2. Output: y(t) (continuous-time, unquantized/continuous-valued detector output)
3. Continuous-time interpretation: at time t, y(t) = x(t) + n(t), where n(t) is a sample function of the AWGN process with power spectral density N0/2
4. Transition probabilities: P(Y = y[i] | X = x[i]) = f_{Y|X}(y[i]|x[i]), x[i], y[i] ∈ R

Having looked at various channel models, for any given channel model, how much information can the channel reliably convey?

Channel Capacity

Channel Capacity Calculation


Channel capacity: C = max_p I(X;Y) bits/transmission, where the maximization is performed over all PMFs of the form p = (p1, p2, ..., p_|X|) on the input alphabet X.

Questions: Sketch the channels and determine if equiprobable input symbols maximize the information rate through the channel.
1. [P(Y|X)] = [ 0.6 0.3 0.1 ; 0.1 0.3 0.6 ]
2. [P(Y|X)] = [ 0.6 0.3 0.1 ; 0.3 0.1 0.6 ]
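A minimal brute-force sketch for these two channels: sweep the binary input PMF p = (q, 1 - q), evaluate I(X;Y) for each transition matrix, and see where the maximum falls (a numerical search, not a closed-form capacity derivation).

P1 = [0.6 0.3 0.1; 0.1 0.3 0.6];       % channel 1
P2 = [0.6 0.3 0.1; 0.3 0.1 0.6];       % channel 2
q  = 0.001:0.001:0.999;                % candidate values of P(X = x1)
for P = {P1, P2}
    Pyx = P{1};
    I = zeros(size(q));
    for k = 1:length(q)
        px  = [q(k); 1 - q(k)];
        Pxy = diag(px) * Pyx;          % joint PMF
        py  = sum(Pxy, 1);             % output PMF
        ref = px * py;                 % product distribution p(x) p(y)
        nz  = Pxy > 0;
        % I(X;Y) = sum over (x,y) of P(x,y) log2( P(x,y) / (p(x) p(y)) )
        I(k) = sum(Pxy(nz) .* log2(Pxy(nz) ./ ref(nz)));
    end
    [C, idx] = max(I);
    fprintf('max I = %.4f bits at P(x1) = %.3f\n', C, q(idx));
end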


Channel Capacity Limits


Shannon's 2nd Theorem (the Noisy Channel Coding Theorem): reliable communication over a DMC is possible if the communication rate R satisfies R < C, where C is the channel capacity. At rates higher than capacity, reliable communication is impossible.
- Shannon's noisy channel coding theorem indicates the maximum achievable rate for reliable communication over a DMC
- This is a yardstick to measure the performance of communications systems
- However, the theorem does not indicate how to achieve this rate
- For reliable communication we must have R < C, where C = W log2(1 + SNR) (Shannon-Hartley limit)
- For a band-limited AWGN channel, R < W log2(1 + P / (N0 W))
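A short numerical check of the Shannon-Hartley formula in MATLAB (the values of W, P, and N0 below are made-up examples, not from the lecture):

W   = 3e3;                 % bandwidth in Hz (roughly a telephone-grade channel)
N0  = 1e-8;                % one-sided noise power spectral density, W/Hz
P   = 1e-3;                % average received signal power, W
SNR = P / (N0 * W);        % signal-to-noise ratio in the band
C   = W * log2(1 + SNR);   % capacity in bits/second
fprintf('SNR = %.1f dB, C = %.0f bits/s\n', 10*log10(SNR), C);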


Channel Capacity
Example: Sketch the channel defined by:
[P(Y|X)] = [ 0.5 0.5 0 0 ; 0 0.5 0.5 0 ; 0 0 0.5 0.5 ; 0.5 0 0 0.5 ]
1. What type of channel is this?
2. Is the channel capacity achieved with a uniform input PMF (i.e., pX(0) = pX(1) = pX(2) = pX(3))? What happens if the input PMF is changed to pX(0) = pX(2) = 0.5, pX(1) = pX(3) = 0?
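A minimal sketch for checking the second question numerically: evaluate I(X;Y) for this transition matrix under the uniform input PMF and under pX(0) = pX(2) = 0.5 (a brute-force evaluation of the mutual information, not a capacity proof).

Pyx = [0.5 0.5 0   0;
       0   0.5 0.5 0;
       0   0   0.5 0.5;
       0.5 0   0   0.5];                  % P(i,j) = P(Y = j-1 | X = i-1)
pmfs = { [0.25 0.25 0.25 0.25], [0.5 0 0.5 0] };
for c = 1:2
    px  = pmfs{c}(:);
    Pxy = diag(px) * Pyx;                 % joint PMF
    py  = sum(Pxy, 1);                    % output PMF
    ref = px * py;                        % product distribution p(x) p(y)
    nz  = Pxy > 0;
    I   = sum(Pxy(nz) .* log2(Pxy(nz) ./ ref(nz)));
    fprintf('Input PMF %d: I(X;Y) = %.4f bits/use\n', c, I);
end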

Channel Capacity
Questions: For a band-limited AWGN channel, R < W log2(1 + P / (N0 W)).
1. Can the channel capacity be increased indefinitely by increasing the transmit power (i.e., as P → ∞)?
2. Can the channel capacity be increased indefinitely by increasing the bandwidth (i.e., as W → ∞)?
3. What is the fundamental relation between bandwidth and power efficiency of a communications system?

To answer the above, note the following:
1. Energy per bit: Eb = P·Ts / log2(M) = P / R
2. Solve for Eb/N0 in terms of the spectral efficiency r = R/W
3. Observe what happens when r → 0
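Working through those steps numerically (a sketch, assuming operation at capacity, R = C, so that r = R/W = log2(1 + r·Eb/N0), which rearranges to Eb/N0 = (2^r - 1)/r):

r    = [4 2 1 0.5 0.1 0.01 0.001];            % spectral efficiency r = R/W, bits/s/Hz
ebno = (2.^r - 1) ./ r;                        % minimum Eb/N0 for reliable communication
disp([r.' 10*log10(ebno.')]);                  % r alongside the required Eb/N0 in dB
fprintf('Limit as r -> 0: 10*log10(ln 2) = %.2f dB\n', 10*log10(log(2)));

The value -1.59 dB is the Shannon limit: below it, reliable communication is impossible no matter how much bandwidth is used.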

Bandwidth vs Power Efficiency
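The original slide shows a figure, presumably the bandwidth-efficiency plane. A minimal MATLAB sketch that reproduces the Shannon-limit boundary (spectral efficiency R/W versus the minimum Eb/N0; the axis ranges are arbitrary) is:

r    = logspace(-2, log10(16), 400);          % spectral efficiency R/W, bits/s/Hz
ebno = 10*log10((2.^r - 1) ./ r);             % required Eb/N0 on the capacity boundary, dB
semilogy(ebno, r, 'LineWidth', 1.5); hold on; grid on;
plot(10*log10(log(2)) * [1 1], [1e-2 16], '--');   % -1.59 dB Shannon-limit asymptote
xlabel('E_b/N_0 (dB)'); ylabel('R/W (bits/s/Hz)');
title('Bandwidth vs power efficiency: Shannon limit');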


Channel Coding Intuition


Repetition Coding Example: Consider a BSC with p = 0.1.
1. Determine the probability of a bit error.
2. Assume a repetition code is used, where the bit is repeated 3 times and a majority vote is used at the receiver to decide what was transmitted. What is the error probability in this case? What happens for a repetition code where each bit is repeated n times? What is the impact on the rate due to repetition coding use?
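A minimal MATLAB sketch for these questions (majority-vote decoding of an n-fold repetition code over a BSC with p = 0.1; the decoded bit is wrong when more than half of the n copies are flipped):

p  = 0.1;                                     % BSC crossover probability
ns = 1:2:11;                                  % odd repetition lengths n
Pe = zeros(size(ns));
for k = 1:length(ns)
    n = ns(k);
    for t = (n+1)/2 : n                       % error if a majority of copies are flipped
        Pe(k) = Pe(k) + nchoosek(n, t) * p^t * (1-p)^(n-t);
    end
end
disp([ns.' Pe.' (1./ns).']);                  % n, error probability, and code rate 1/n

For n = 3 this evaluates to 3p²(1-p) + p³ = 0.028, compared with 0.1 uncoded, but the code rate falls to 1/n, which is the trade-off the last question points to.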


Error Control Mechanisms

Error Control
- ARQ
  - Stop & Wait ARQ
  - Continuous ARQ: Go-Back-N, Selective Repeat
- FECC
  - Block codes
    - Non-linear
    - Group (Linear)
      - Non-cyclic
      - Polynomially generated (cyclic)
        - Golay
        - BCH
          - Binary BCH: Hamming (e = 1), e > 1
          - Reed-Solomon
  - Convolutional codes
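As a small concrete instance from the block-code branch above (a sketch of the (7,4) Hamming code, which corrects e = 1 error; the particular systematic generator and parity-check matrices below are one common choice, not taken from the lecture):

% (7,4) Hamming code in systematic form: G = [I P], H = [P' I]
P = [1 1 0; 0 1 1; 1 1 1; 1 0 1];            % parity submatrix (one common choice)
G = [eye(4) P];                               % generator matrix, 4x7
H = [P' eye(3)];                              % parity-check matrix, 3x7
m = [1 0 1 1];                                % example message
c = mod(m * G, 2);                            % codeword
r = c; r(3) = 1 - r(3);                       % introduce a single bit error in position 3
s = mod(r * H', 2);                           % syndrome (nonzero means an error was detected)
% The syndrome equals the column of H at the error position, which locates the error
[~, pos] = ismember(s, H', 'rows');
r(pos) = 1 - r(pos);                          % correct the single error
fprintf('Error at position %d corrected: %d\n', pos, isequal(r, c));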


MATLAB Nuggets

Let's take a look at Simulink...
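Before opening Simulink, the same kind of link-level analysis can be scripted directly in MATLAB. A minimal uncoded BPSK-over-AWGN BER sweep (all parameter values are illustrative) might look like this:

EbN0dB = 0:2:10;                              % Eb/N0 values to sweep, dB
N      = 1e6;                                 % bits simulated per point
ber    = zeros(size(EbN0dB));
for k = 1:length(EbN0dB)
    EbN0 = 10^(EbN0dB(k)/10);
    bits = randi([0 1], 1, N);
    s    = 1 - 2*bits;                        % BPSK mapping: 0 -> +1, 1 -> -1
    n    = sqrt(1/(2*EbN0)) * randn(1, N);    % AWGN with variance N0/2 (Eb = 1)
    rx   = s + n;
    bhat = rx < 0;                            % hard-decision detection
    ber(k) = mean(bhat ~= bits);
end
theory = 0.5 * erfc(sqrt(10.^(EbN0dB/10)));   % BPSK theory: Q(sqrt(2 Eb/N0))
semilogy(EbN0dB, ber, 'o-', EbN0dB, theory, '--'); grid on;
xlabel('E_b/N_0 (dB)'); ylabel('BER'); legend('Simulated', 'Theory');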


Conclusion

We covered:
- Channel models
- Channel capacity
- Shannon's noisy channel coding theorem
- Channel coding intuition
- Introducing Simulink

Your goals for next class:
- Continue ramping up your MATLAB & Simulink skills
- Review the channel coding handout on the course site
- Complete HW 3 for submission next week
- Complete the At-Home Exercise for submission next week


Q&A

Thank You

Questions?
