Additive white Gaussian noise (AWGN) is a channel model in which the only impairment to communication is a linear addition of wideband or white noise with a constant spectral density (expressed as watts per hertz of bandwidth) and a Gaussian distribution of amplitude. The model does not account for fading, frequency selectivity, interference, nonlinearity or dispersion. However, it produces simple and tractable mathematical models which are useful for gaining insight into the underlying behavior of a system before these other phenomena are considered.

Wideband Gaussian noise comes from many natural sources, such as the thermal vibrations of atoms in conductors (referred to as thermal noise or Johnson-Nyquist noise), shot noise, black body radiation from the earth and other warm objects, and from celestial sources such as the Sun.

The AWGN channel is a good model for many satellite and deep-space communication links. It is not a good model for most terrestrial links because of multipath, terrain blocking, interference, etc. However, for terrestrial path modeling, AWGN is commonly used to simulate the background noise of the channel under study, in addition to the multipath, terrain blocking, interference, ground clutter and self-interference that modern radio systems encounter in terrestrial operation.
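The channel model itself amounts to adding independent zero-mean Gaussian samples to the transmitted signal. The sketch below simulates this; the BPSK-style signal, noise variance, and seed are illustrative choices, not part of the original description:

```python
import numpy as np

rng = np.random.default_rng(0)

def awgn_channel(x, noise_power):
    """Pass signal x through an AWGN channel with the given noise variance."""
    noise = rng.normal(0.0, np.sqrt(noise_power), size=len(x))
    return x + noise

# unit-power antipodal symbols, noise variance 0.1 -> linear SNR of 10 (10 dB)
x = np.sign(rng.standard_normal(100_000))
y = awgn_channel(x, noise_power=0.1)

snr_est = np.mean(x**2) / np.mean((y - x)**2)
print(f"estimated SNR: {snr_est:.2f}")   # close to 10
```

Because the noise is the only impairment, the empirical SNR recovered from the output matches the ratio of signal power to noise variance.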

Channel capacity

The AWGN channel is represented by a series of outputs $Y_i$ at discrete time event index $i$. $Y_i$ is the sum of the input $X_i$ and noise, $Z_i$, where $Z_i$ is independent and identically distributed and drawn from a zero-mean normal distribution with variance $n$ (the noise). The $Z_i$ are further assumed not to be correlated with the $X_i$.

$$Z_i \sim \mathcal{N}(0, n)$$
$$Y_i = X_i + Z_i \sim \mathcal{N}(X_i, n)$$


The capacity of the channel is infinite unless the noise $n$ is nonzero and the $X_i$ are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword $(x_1, x_2, \dots, x_k)$ transmitted through the channel, we have:

$$\frac{1}{k}\sum_{i=1}^k x_i^2 \leq P,$$

where $P$ represents the maximum channel power.

Therefore, the channel capacity for the power-constrained channel is given by:

$$C = \max_{f(x) \text{ s.t. } E\left(X^2\right) \leq P} I(X;Y)$$


where $f(x)$ is the distribution of $X$. Expand $I(X;Y)$, writing it in terms of the differential entropy:

$$I(X;Y) = h(Y) - h(Y|X) = h(Y) - h(X+Z|X) = h(Y) - h(Z|X)$$


But $X$ and $Z$ are independent, therefore:

$$I(X;Y) = h(Y) - h(Z)$$

Evaluating the differential entropy of a Gaussian gives:

$$h(Z) = \frac{1}{2} \log(2 \pi e n)$$
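This formula can be checked numerically: the differential entropy equals $-E[\log f(Z)]$, so a Monte Carlo average of the negative log-density over samples should match it. The sketch below uses the natural logarithm (so the result is in nats) and an arbitrary variance of 2:

```python
import numpy as np

rng = np.random.default_rng(1)
n_var = 2.0                      # noise variance (the "n" of the text); arbitrary
z = rng.normal(0.0, np.sqrt(n_var), 500_000)

# Monte Carlo estimate of h(Z) = -E[log f(Z)] using the Gaussian density
log_pdf = -0.5 * np.log(2 * np.pi * n_var) - z**2 / (2 * n_var)
h_mc = -float(np.mean(log_pdf))

# closed form: (1/2) log(2*pi*e*n)
h_formula = 0.5 * np.log(2 * np.pi * np.e * n_var)
print(h_mc, h_formula)   # both ≈ 1.77 nats
```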


Because $X$ and $Z$ are independent and their sum gives $Y$:

$$E(Y^2) = E((X+Z)^2) = E(X^2) + 2E(X)E(Z) + E(Z^2) = P + n$$

From this bound, we infer from a property of the differential entropy (the Gaussian maximizes entropy for a given variance) that

$$h(Y) \leq \frac{1}{2} \log(2 \pi e (P+n))$$


Therefore the channel capacity is given by the highest achievable bound on the mutual information:

$$I(X;Y) \leq \frac{1}{2}\log(2 \pi e (P+n)) - \frac{1}{2}\log(2 \pi e n),$$

where $I(X;Y)$ is maximized when:

$$X \sim \mathcal{N}(0, P)$$

Thus the channel capacity $C$ for the AWGN channel is given by:

$$C = \frac{1}{2} \log\left(1 + \frac{P}{n}\right)$$
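With the logarithm taken base 2, the capacity is expressed in bits per channel use. A minimal numeric illustration (the SNR value of 15 is an arbitrary example):

```python
import numpy as np

def awgn_capacity(P, n):
    """AWGN channel capacity in bits per channel use,
    for signal power P and noise variance n (log base 2)."""
    return 0.5 * np.log2(1.0 + P / n)

# a signal-to-noise ratio of P/n = 15 gives exactly 2 bits per channel use
print(awgn_capacity(15.0, 1.0))   # 2.0
```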


Channel capacity and sphere packing

Suppose that we are sending messages through the channel with index ranging from $1$ to $M$, the number of distinct possible messages. If we encode the $M$ messages to $n$ bits, then we define the rate $R$ as:

$$R = \frac{\log M}{n}$$


A rate is said to be achievable if there is a sequence of codes so that the maximum probability of error tends to zero as $n$ approaches infinity. The capacity $C$ is the highest achievable rate.

Consider a codeword of length $n$ sent through the AWGN channel with noise level $N$. When received, the codeword vector variance is now $N$, and its mean is the codeword sent. The vector is very likely to be contained in a sphere of radius $\sqrt{n(N+\epsilon)}$ around the codeword sent. If we decode by mapping every received message onto the codeword at the center of this sphere, then an error occurs only when the received vector is outside of this sphere, which is very unlikely.

Each codeword vector has an associated sphere of received codeword vectors which are decoded to it, and each such sphere must map uniquely onto a codeword. Because these spheres therefore must not intersect, we are faced with the problem of sphere packing. How many distinct codewords can we pack into our $n$-bit codeword vector? The received vectors have a maximum energy of $n(P+N)$ and therefore must occupy a sphere of radius $\sqrt{n(P+N)}$. Each codeword sphere has radius $\sqrt{nN}$. The volume of an $n$-dimensional sphere is directly proportional to $r^n$, so the maximum number of uniquely decodable spheres that can be packed into our sphere with transmission power $P$ is:

$$\frac{(n(P+N))^{\frac{n}{2}}}{(nN)^{\frac{n}{2}}} = 2^{\frac{n}{2}\log\left(1+\frac{P}{N}\right)}$$

By this argument, the rate $R$ can be no more than $\frac{1}{2}\log\left(1+\frac{P}{N}\right)$.
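The identity behind the sphere count is easy to verify numerically in the log domain (base 2, so the exponent counts bits); the block length and powers below are illustrative:

```python
import math

n, P, N = 100, 3.0, 1.0   # block length, signal power, noise level (illustrative)

# log2 of the volume ratio, computed in the log domain to avoid overflow
lhs = (n / 2) * (math.log2(n * (P + N)) - math.log2(n * N))
rhs = (n / 2) * math.log2(1 + P / N)
print(lhs, rhs)   # both ≈ 100, i.e. about 2^100 decodable spheres
```

Dividing either side by $n$ recovers the rate bound $\frac{1}{2}\log(1+P/N)$, here 1 bit per channel use.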

Achievability

In this section, we show achievability of the upper bound on the rate from the last section.

A codebook, known to both encoder and decoder, is generated by selecting codewords of length $n$, i.i.d. Gaussian with variance $P - \epsilon$ and mean zero. For large $n$, the empirical variance of the codebook will be very close to the variance of its distribution, thereby avoiding violation of the power constraint probabilistically.

Received messages are decoded to a message in the codebook which is uniquely jointly typical. If there is no such message or if the power constraint is violated, a decoding error is declared.

Let $X^n(i)$ denote the codeword for message $i$, while $Y^n$ is, as before, the received vector. Define the following three events:

  1. Event $U$: the power of the received message is larger than $P$.
  2. Event $V$: the transmitted and received codewords are not jointly typical.
  3. Event $E_j$: $(X^n(j), Y^n)$ is in $A_\epsilon^{(n)}$, the typical set, where $i \neq j$, which is to say that the incorrect codeword is jointly typical with the received vector.

An error therefore occurs if $U$, $V$ or any of the $E_j$ occur. By the law of large numbers, $P(U)$ goes to zero as $n$ approaches infinity, and by the joint asymptotic equipartition property the same applies to $P(V)$. Therefore, for a sufficiently large $n$, both $P(U)$ and $P(V)$ are each less than $\epsilon$. Since $X^n(i)$ and $X^n(j)$ are independent for $i \neq j$, we have that $X^n(j)$ and $Y^n$ are also independent. Therefore, by the joint AEP, $P(E_j) \leq 2^{-n(I(X;Y)-3\epsilon)}$. This allows us to calculate $P^{(n)}_e$, the probability of error, as follows:

$$\begin{align} P^{(n)}_e & \leq P(U) + P(V) + \sum_{j \neq i} P(E_j) \\ & \leq \epsilon + \epsilon + \sum_{j \neq i} 2^{-n(I(X;Y)-3\epsilon)} \\ & \leq 2\epsilon + (2^{nR}-1) 2^{-n(I(X;Y)-3\epsilon)} \\ & \leq 2\epsilon + 2^{3n\epsilon} 2^{-n(I(X;Y)-R)} \\ & \leq 3\epsilon \end{align}$$

Therefore, as $n$ approaches infinity, $P^{(n)}_e$ goes to zero, provided $R < I(X;Y) - 3\epsilon$. Thus there is a code of rate $R$ arbitrarily close to the capacity derived earlier.
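A small simulation illustrates the flavor of this argument: draw a random Gaussian codebook, transmit through the AWGN channel, and decode. The sketch below uses minimum-distance decoding as a simple stand-in for the joint-typicality decoder of the proof, and its block length, message count, and powers are arbitrary small values chosen so the rate sits well below capacity:

```python
import numpy as np

rng = np.random.default_rng(2)

P, N = 1.0, 0.25          # power constraint and noise variance
n, M = 64, 16             # block length and number of messages
# rate R = log2(M)/n ≈ 0.0625, well below C = 0.5*log2(1 + P/N) ≈ 1.16 bits

# random Gaussian codebook with variance slightly below P (the "P - epsilon" above)
codebook = rng.normal(0.0, np.sqrt(0.9 * P), size=(M, n))

trials, errors = 500, 0
for _ in range(trials):
    msg = rng.integers(M)
    y = codebook[msg] + rng.normal(0.0, np.sqrt(N), n)   # AWGN channel
    # minimum-distance decoding, a practical stand-in for joint typicality
    decoded = int(np.argmin(np.sum((codebook - y) ** 2, axis=1)))
    errors += decoded != msg

print(f"block error rate at rate {np.log2(M)/n:.3f}: {errors/trials:.3f}")
```

At this rate the observed block error rate is essentially zero, consistent with the claim that rates below capacity are achievable.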

Coding theorem converse

Here we show that rates above the capacity $C = \frac{1}{2} \log\left(1+\frac{P}{N}\right)$ are not achievable.

Suppose that the power constraint is satisfied for a codebook, and further suppose that the messages follow a uniform distribution. Let $W$ be the input messages and $\hat{W}$ the output messages. Thus the information flows as:

$$W \longrightarrow X^{(n)}(W) \longrightarrow Y^{(n)} \longrightarrow \hat{W}$$

Making use of Fano's inequality gives:

$$H(W|\hat{W}) \leq 1 + nRP^{(n)}_e = n \epsilon_n,$$

where $\epsilon_n \rightarrow 0$ as $P^{(n)}_e \rightarrow 0$.


Let $X_i$ be the encoded message of codeword index $i$. Then:

$$\begin{align} nR & = H(W) \\ & = I(W;\hat{W}) + H(W|\hat{W}) \\ & \leq I(W;\hat{W}) + n\epsilon_n \\ & \leq I(X^{(n)}; Y^{(n)}) + n\epsilon_n \\ & = h(Y^{(n)}) - h(Y^{(n)}|X^{(n)}) + n\epsilon_n \\ & = h(Y^{(n)}) - h(Z^{(n)}) + n\epsilon_n \\ & \leq \sum_{i=1}^{n} h(Y_i) - h(Z^{(n)}) + n\epsilon_n \\ & \leq \sum_{i=1}^{n} I(X_i; Y_i) + n\epsilon_n \end{align}$$


Let $P_i$ be the average power of the codeword of index $i$:

$$P_i = \frac{1}{2^{nR}}\sum_{w} x^2_i(w),$$

where the sum is over all input messages $w$. $X_i$ and $Z_i$ are independent, thus the expectation of the power of $Y_i$ is, for noise level $N$:

$$E(Y_i^2) = P_i + N$$


And, if $Y_i$ is normally distributed, we have that

$$h(Y_i) \leq \frac{1}{2}\log\left(2 \pi e (P_i + N)\right)$$

Therefore,

$$\begin{align} nR & \leq \sum (h(Y_i) - h(Z_i)) + n \epsilon_n \\ & \leq \sum \left( \frac{1}{2} \log(2 \pi e (P_i + N)) - \frac{1}{2}\log(2 \pi e N)\right) + n \epsilon_n \\ & = \sum \frac{1}{2} \log\left(1 + \frac{P_i}{N}\right) + n \epsilon_n \end{align}$$


We may apply Jensen's inequality to $\log(1+x)$, a concave (downward) function of $x$, to get:

$$\frac{1}{n} \sum_{i=1}^{n} \frac{1}{2}\log\left(1+\frac{P_i}{N}\right) \leq \frac{1}{2}\log\left(1+\frac{1}{n}\sum_{i=1}^{n}\frac{P_i}{N}\right)$$
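Jensen's inequality for a concave function says the average of the function values never exceeds the function of the average. A quick numeric check with arbitrary per-symbol powers:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 1.0
P_i = rng.uniform(0.5, 4.0, size=50)   # arbitrary per-symbol powers

lhs = float(np.mean(0.5 * np.log2(1 + P_i / N)))   # average of the logs
rhs = float(0.5 * np.log2(1 + np.mean(P_i / N)))   # log of the average
print(lhs <= rhs)   # True: the concave log favors the averaged argument
```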


Because each codeword individually satisfies the power constraint, the average also satisfies the power constraint. Therefore

$$\frac{1}{n}\sum_{i=1}^{n} \frac{P_i}{N} \leq \frac{P}{N},$$

which we may apply to simplify the inequality above and get:

$$\frac{1}{2}\log\left(1+\frac{1}{n}\sum_{i=1}^{n}\frac{P_i}{N}\right) \leq \frac{1}{2}\log\left(1+\frac{P}{N}\right)$$

Therefore, it must be that $R \leq \frac{1}{2}\log\left(1+\frac{P}{N}\right) + \epsilon_n$. Thus $R$ must be less than a value arbitrarily close to the capacity derived earlier, as $\epsilon_n \rightarrow 0$.

Effects in time domain

[Figure: Zero-crossings of a noisy cosine]

In serial data communications, the AWGN mathematical model is used to model the timing error caused by random jitter (RJ).

The graph to the right shows an example of timing errors associated with AWGN. The variable Δt represents the uncertainty in the zero crossing. As the amplitude of the AWGN is increased, the signal-to-noise ratio decreases. This results in increased uncertainty Δt.[1]

When affected by AWGN, the average number of either positive-going or negative-going zero crossings per second at the output of a narrow bandpass filter, when the input is a sine wave, is:

$$\frac{\mathrm{positive\ zero\ crossings}}{\mathrm{second}} = \frac{\mathrm{negative\ zero\ crossings}}{\mathrm{second}} = f_0 \sqrt{\frac{\mathrm{SNR} + 1 + \frac{B^2}{12 f_0^2}}{\mathrm{SNR} + 1}},$$

where

  • $f_0$ = the center frequency of the filter
  • $B$ = the filter bandwidth
  • SNR = the signal-to-noise power ratio in linear terms
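The formula above is straightforward to evaluate; the filter and SNR values below are illustrative, and as the SNR grows the rate approaches the carrier frequency $f_0$:

```python
import math

def zero_crossing_rate(f0, B, snr):
    """Average positive-going (or negative-going) zero crossings per second
    at the output of a narrow bandpass filter driven by a sine wave plus AWGN."""
    return f0 * math.sqrt((snr + 1 + B**2 / (12 * f0**2)) / (snr + 1))

# illustrative values: 10 kHz center frequency, 2 kHz bandwidth, SNR = 10 (linear)
rate = zero_crossing_rate(10e3, 2e3, 10.0)
print(f"{rate:.1f} crossings per second")   # slightly above f0 = 10000
```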

Effects in phasor domain

[Figure: AWGN contributions in the phasor domain]

In modern communication systems, bandlimited AWGN cannot be ignored. When modeling bandlimited AWGN in the phasor domain, statistical analysis reveals that the amplitudes of the real and imaginary contributions are independent variables which follow the Gaussian distribution model. When combined, the resultant phasor's magnitude is a Rayleigh distributed random variable while the phase is uniformly distributed from 0 to 2π.

The graph to the right shows an example of how bandlimited AWGN can affect a coherent carrier signal. The instantaneous response of the noise vector cannot be precisely predicted; however, its time-averaged response can be statistically predicted. As shown in the graph, the noise phasor resides inside the 1σ circle about 38% of the time, inside the 2σ circle about 86% of the time, and inside the 3σ circle about 98% of the time.[1]
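These containment probabilities follow from the Rayleigh distribution, whose CDF gives $1 - e^{-k^2/2}$ for the $k\sigma$ circle (about 39%, 86% and 99%, consistent with the rounded figures above). A simulation sketch, with sample count and seed chosen arbitrarily:

```python
import numpy as np

rng = np.random.default_rng(4)
sigma = 1.0

# real and imaginary noise contributions: independent zero-mean Gaussians
re = rng.normal(0.0, sigma, 1_000_000)
im = rng.normal(0.0, sigma, 1_000_000)
mag = np.hypot(re, im)   # resultant phasor magnitude is Rayleigh distributed

fracs = [float(np.mean(mag < k * sigma)) for k in (1, 2, 3)]
for k, frac in zip((1, 2, 3), fracs):
    print(f"inside the {k}-sigma circle: {frac:.3f}")   # ≈ 0.39, 0.86, 0.99
```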

References

  1. McClaning, Kevin. Radio Receiver Design. Noble Publishing Corporation.
