Razavi RFIC
$P_{av} = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{+T/2} n^2(t)\,dt,$  (2.57)

which is also called the "mean square" power (with respect to a 1-Ω resistor) if n(t) is a voltage quantity. The second-order ensemble average is

$\overline{n^2(t)} = \int_{-\infty}^{+\infty} n^2 P_n(n)\,dn.$  (2.58)

For our purposes, $\langle n^2(t) \rangle = \overline{n^2(t)}$; that is, the time average and the ensemble average are assumed to be equal.

Probability Density Function

When considering a random signal in the time domain, we usually need to know how often its amplitude lies between certain limits. For example, if a binary data sequence is corrupted by additive noise (Fig. 2.20), it is important to find the probability that a logical ONE is interpreted as a ZERO and vice versa, that is, how often the noise amplitude exceeds half of the signal amplitude. The amplitude statistics of a random signal x(t) are characterized by the probability density function, $P_x(x)$, defined as

$P_x(x)\,dx$ = probability of $x < X < x + dx$,  (2.59)

where X is the measured value of x(t) at some point in time. To estimate the PDF, we sample x(t) at many points (for many functions in the ensemble), construct bins of small width, choose the bin height equal to the number of samples whose value falls between the two edges of the bin, and normalize the bin heights to the total number of samples. Note that the PDF provides no information as to how fast the random signal varies in the time domain.

Figure 2.20 Binary signal corrupted by noise.

An important example of PDFs is the Gaussian (or normal) distribution. The central limit theorem states that if many independent random processes with arbitrary PDFs are added, the PDF of the sum approaches a Gaussian distribution. It is therefore not surprising that many natural phenomena exhibit Gaussian statistics. For example, since the noise of a resistor results from the random "walk" of a very large number of electrons, each having relatively independent statistics, the overall amplitude follows a Gaussian PDF.

The Gaussian PDF is defined as

$P_x(x) = \frac{1}{\sigma\sqrt{2\pi}} \exp\!\left[\frac{-(x - m)^2}{2\sigma^2}\right],$  (2.60)

where $\sigma$ and m are the standard deviation and mean of the distribution, respectively. From the PDF of the amplitude of a random signal, we can also answer the following question: if a large number of samples are taken, what percentage will fall between $x_1$ and $x_2$? This is given by the area under $P_x(x)$ from $x_1$ to $x_2$, and for a Gaussian PDF,

$P(x_1 < X < x_2) = \frac{1}{\sigma\sqrt{2\pi}} \int_{x_1}^{x_2} \exp\!\left[\frac{-(x - m)^2}{2\sigma^2}\right] dx,$  (2.61)

which reduces to an error function.

Spectrum of Random Processes

The PDF of a (stationary) process, which defines averages of the amplitude, does not provide any information about the frequency content of the signal. Noise and random signals are therefore also characterized by their power spectral density. Note that a random signal x(t) cannot be Fourier transformed directly: its waveform varies randomly with time and is unknown beyond the observation interval, and, as explained below, the transform is not defined for signals of infinite energy.

Power Spectral Density

The frequency-domain behavior of signals is generally studied by means of the well-known Fourier transform. Strictly speaking, however, the Fourier transform of a signal x(t) is defined only if the signal energy,

$E = \int_{-\infty}^{+\infty} |x(t)|^2\,dt < \infty,$  (2.64)

is finite.³ As shown in Fig. 2.21, this condition is violated by two classes of signals: periodic waveforms and random signals. In most cases, however, these waveforms have a finite power:

$P = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{+T/2} |x(t)|^2\,dt < \infty.$  (2.65)

For periodic signals with $P < \infty$, the Fourier transform can still be defined by representing each component of the Fourier series with an impulse in the frequency domain. For random signals, on the other hand, this is generally not possible because a frequency impulse indicates the existence of a deterministic sinusoidal component. Another practical problem is that even if we somehow define a Fourier transform for a random (stationary or nonstationary) process, the result itself is also a random process [5].

Figure 2.21 Waveforms with infinite energy: a periodic signal and a random signal.

³ The definition of energy can be visualized if x(t) is a voltage applied across a 1-Ω resistor.

From the above discussion we infer that the frequency-domain characteristics of random signals are embodied in a function different from a direct Fourier transform. The power spectral density (PSD), also called the "spectral density" or simply the "spectrum," is such a function.
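As a quick numerical illustration of the binning procedure and of Eq. (2.61), the following minimal Python sketch estimates the PDF of Gaussian samples and compares the histogram area between two limits with the error-function result; the sample count, bin width, and the limits $x_1$, $x_2$ used here are arbitrary choices, not values from the text.

```python
import numpy as np
from math import erf, sqrt

# Draw many samples of a Gaussian random variable with mean m and standard
# deviation sigma, as in Eq. (2.60).
rng = np.random.default_rng(0)
m, sigma = 0.0, 1.0
samples = rng.normal(m, sigma, 100_000)

# Estimate the PDF as described in the text: bins of small width, heights
# normalized to the total number of samples (and to the bin width, so that
# the total area is unity).
bin_width = 0.1
edges = np.arange(-5.0, 5.0 + bin_width, bin_width)
counts, _ = np.histogram(samples, bins=edges)
pdf_est = counts / (len(samples) * bin_width)

# Probability of falling between x1 and x2: area under the estimated PDF.
x1, x2 = 0.5, 1.5
centers = edges[:-1] + bin_width / 2
in_range = (centers >= x1) & (centers <= x2)
p_est = np.sum(pdf_est[in_range]) * bin_width

# Eq. (2.61) evaluated in closed form: it reduces to a difference of error functions.
p_exact = 0.5 * (erf((x2 - m) / (sigma * sqrt(2))) - erf((x1 - m) / (sigma * sqrt(2))))

print(f"histogram estimate: {p_est:.4f}   error-function value: {p_exact:.4f}")
```

With enough samples the two numbers agree to a few digits, which is the sense in which the normalized histogram estimates $P_x(x)$.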
Before giving a formal definition of the PSD, we describe its meaning from an intuitive point of view [6]. The spectral density, $S_x(f)$, of a random signal x(t) shows how much power the signal carries in a unit bandwidth around frequency f. As illustrated in Fig. 2.22, if we apply the signal to a bandpass filter with a 1-Hz bandwidth centered at f and measure the average output power over a sufficiently long time (on the order of 1 s), we obtain an estimate of $S_x(f)$. If this measurement is performed for each value of f, the overall spectrum of the signal is obtained. This is in fact the principle of operation of spectrum analyzers.⁴

Figure 2.22 Measurement of spectrum (bandpass filters followed by power meters).

The formal definition of the PSD is as follows [3]:

$S_x(f) = \lim_{T \to \infty} \frac{\overline{|X_T(f)|^2}}{T},$  (2.66)

where

$X_T(f) = \int_0^T x(t) \exp(-j2\pi f t)\,dt.$  (2.67)

⁴ Building a low-loss BPF with 1-Hz bandwidth and a center frequency of, say, 1 GHz is impractical. Thus, actual spectrum analyzers both translate the spectrum to a lower center frequency and measure the power in a band wider than 1 Hz.

The definition can be understood with the aid of a corresponding computational algorithm (Fig. 2.23): (1) truncate x(t) to a relatively long interval [0, T], (2) calculate the Fourier transform of the result and hence $|X_T(f)|^2$, (3) repeat steps 1 and 2 for many sample functions of x(t) (e.g., for many noise voltage waveforms measured across a resistor), and (4) take the average of all $|X_T(f)|^2$ functions to arrive at $\overline{|X_T(f)|^2}$ and normalize the result to T. This algorithm proves useful in time-domain simulations incorporating random noise waveforms.

Figure 2.23 Algorithm for PSD estimation.

Since $S_x(f)$ is an even function of f for real x(t) [3], as depicted in Fig. 2.24(a), the total power carried by x(t) in the frequency range $[f_1, f_2]$ is equal to

$\int_{-f_2}^{-f_1} S_x(f)\,df + \int_{f_1}^{f_2} S_x(f)\,df = \int_{f_1}^{f_2} 2 S_x(f)\,df.$  (2.68)

In fact, the right-hand-side integral is the quantity measured by a spectrum analyzer; i.e., the negative-frequency part of the spectrum is folded around the vertical axis and added to the positive-frequency part [Fig. 2.24(b)]. We call the representation of Fig. 2.24(a) the "two-sided" spectrum and that of Fig. 2.24(b) the "one-sided" spectrum.

Figure 2.24 (a) Two-sided and (b) one-sided spectra.

In graphical analysis of frequency-domain operations, it is generally simpler to use a two-sided spectrum, whereas actual noise calculations are more easily carried out with a one-sided spectrum. Nevertheless, these two representations bear no fundamental difference, though they can cause confusion.

As an example of $S_x(f)$, we consider the thermal noise voltage across a resistor of value R. The two-sided PSD is

$S_v(f) = 2kTR,$  (2.69)

where k is the Boltzmann constant, equal to $1.38 \times 10^{-23}$ J/K, and T is the absolute temperature. Such a flat spectrum is called "white" because it contains the same level of power at all frequencies. Equation (2.69) raises two interesting questions. First, is the total noise power of a resistor [the area under $S_v(f)$] infinite? In reality, $S_v(f)$ is flat only for $|f| < 100$ GHz or so, dropping beyond this frequency such that the total power remains finite [3]. Second, is the dimension of 2kTR power per unit bandwidth (W/Hz)? No, the actual dimension is mean square voltage per unit bandwidth.
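The four-step algorithm of Fig. 2.23 can be sketched numerically. The Python fragment below assumes discrete-time white Gaussian noise as the sample functions and approximates the integral of Eq. (2.67) by an FFT scaled by the sampling interval; the record length, sample rate, and ensemble size are arbitrary illustrative values.

```python
import numpy as np

# Sketch of the PSD algorithm of Fig. 2.23 for discrete-time white Gaussian noise:
# (1) truncate x(t) to [0, T], (2) compute |X_T(f)|^2, (3) repeat for many sample
# functions, (4) average the results and normalize to T, per Eq. (2.66).
rng = np.random.default_rng(1)
fs = 1.0e4           # sampling rate in Hz (illustrative)
T = 1.0              # record length in seconds
n = int(fs * T)      # samples per record
n_runs = 200         # number of sample functions in the ensemble
sigma = 1.0          # rms value of the noise samples

acc = np.zeros(n)
for _ in range(n_runs):
    x = rng.normal(0.0, sigma, n)        # one truncated sample function of x(t)
    X_T = np.fft.fft(x) / fs             # discrete approximation of Eq. (2.67)
    acc += np.abs(X_T) ** 2              # |X_T(f)|^2 for this sample function
S_x = acc / n_runs / T                   # ensemble average normalized to T

# For white noise of variance sigma^2 sampled at fs, the two-sided PSD is flat
# at sigma^2 / fs; compare the estimate with this value.
print(f"estimated S_x near f = 0: {S_x[1]:.3e}   expected: {sigma**2 / fs:.3e}")
```

The more sample functions are averaged in step (3), the smaller the fluctuation of the estimate around the flat spectrum, mirroring the long measurement time required in the bandpass-filter picture of Fig. 2.22.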
We tacitly assume that this voltage is applied across a 1-Ω resistor to generate a power of 2kTR in a 1-Hz bandwidth. In circuit noise calculations, we often write

$\overline{V_n^2} = 4kTR \cdot \Delta f,$  (2.70)

where $\overline{V_n^2}$ is the mean square noise voltage generated by resistor R in a bandwidth $\Delta f$. Called the "spot noise" for $\Delta f = 1$ Hz, $\overline{V_n^2}$ is measured in $\mathrm{V^2/Hz}$.

To summarize the concepts of PDF and PSD, we note that the former is a statistical indication of how often the amplitude of a random process falls in a given range of values, while the latter shows how much power the signal is expected to contain in a small frequency interval. In general, the PDF and PSD bear no relationship: thermal noise has a Gaussian PDF and a white PSD, whereas flicker (1/f) noise has the same type of PDF but a PSD proportional to 1/f.

Random Signals in Linear Systems

The principal reason for defining the power spectral density function is that it allows many of the frequency-domain operations used with deterministic signals to be applied to random processes as well. It can be shown that if a signal with spectral density $S_x(f)$ is applied to a linear time-invariant system with transfer function H(s) (Fig. 2.25), then the output spectrum is

$S_y(f) = S_x(f)\,|H(f)|^2,$  (2.71)

where $H(f) = H(s = j2\pi f)$ [3]. This agrees with our intuition that the spectrum of the signal is shaped by the transfer function of the system. It can also be shown that if x(t) is Gaussian, so is y(t) [3].
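As a small worked example of Eq. (2.71), suppose the one-sided resistor noise density 4kTR of Eq. (2.70) drives a first-order RC low-pass filter with $H(s) = 1/(1 + sRC)$; this filter and the component values below are arbitrary illustrative choices, not taken from the text. Integrating the shaped output spectrum should recover the well-known total noise power kT/C.

```python
import numpy as np

# Eq. (2.71) applied to resistor noise driving an RC low-pass filter,
# H(s) = 1 / (1 + sRC). Component values are arbitrary illustrative choices.
k = 1.38e-23         # Boltzmann constant, J/K
T = 300.0            # absolute temperature, K
R = 1.0e3            # resistance, ohms
C = 1.0e-12          # capacitance, F

f = np.linspace(0.0, 100.0e9, 2_000_001)            # frequency grid, Hz
S_x = 4 * k * T * R * np.ones_like(f)               # one-sided input density, Eq. (2.70)
H2 = 1.0 / (1.0 + (2 * np.pi * f * R * C) ** 2)     # |H(f)|^2 of the RC filter
S_y = S_x * H2                                      # output spectrum, Eq. (2.71)

# Total output noise power: trapezoidal integration of the one-sided spectrum.
P_out = np.sum(0.5 * (S_y[1:] + S_y[:-1]) * np.diff(f))
print(f"integrated output noise: {P_out:.3e} V^2   kT/C: {k * T / C:.3e} V^2")
```

The integral comes out close to kT/C (about $4.1 \times 10^{-9}$ V² at 300 K for C = 1 pF), illustrating how the transfer function shapes and bounds an otherwise white input spectrum.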