Chapter 5
Random Processes

5.1 Random Processes
    5.1.1 Definition and Examples
    5.1.2 Ensemble Averages and Stationarity
    5.1.3 Time Averages and Ergodicity
5.2 Autocorrelation and Power Spectra
    5.2.1 Autocorrelation and Autocovariance
    5.2.2 Power Spectral Density
    5.2.3 Deterministic Power and Energy Signals
    5.2.4 Examples of Autocorrelation Functions and Power Spectral Densities
5.3 Excitation of LTI Systems with Stationary Random Processes
    5.3.1 Expected Value of the Output Random Process
    5.3.2 Autocorrelation Function of the Output Random Process
    5.3.3 Power Spectral Density of the Output Random Process
    5.3.4 Cross-Correlation between Input and Output Random Process
5.1 Random Processes

5.1.1 Definition and Examples
[Figure: a random experiment with sample space $S$ maps each outcome $\zeta_i$ onto a sample function $X(t, \zeta_i)$; evaluating all sample functions at a fixed time $t_1$ (or $t_2$) yields a random variable $X(t_1)$.]
Random Process: A random process $X(t)$ describes the mapping of a random experiment with sample space $S$ onto an ensemble of sample functions $X(t, \zeta_i)$. For each point in time $t_1$, $X(t_1)$ describes a random variable.

Example: Rolling a Die
Random variable: $X = i$ if number $i$ is on top of the die.
Random process: $Y(t) = X \cos(\omega_0 t)$.

Example: Tossing a Coin $N$ Times
Random variable: $X_n = 0$ if the $n$th result is heads, $X_n = 1$ if the $n$th result is tails.
Random process: $Y(t) = \sum_{n=1}^{N} X_n\, \mathrm{rect}(t - n + 0.5)$.
[Figure: sample functions $Y(t, \zeta_1), \ldots, Y(t, \zeta_{2^N})$ of the coin-tossing random process.]
5.1.2 Ensemble Averages and Stationarity
For each time instance of a random process, the average value, variance, etc. can be calculated from all sample functions $X(t, \zeta_i)$.

Expected Value $E\{X(t)\}$: For a random process $X(t)$ with probability density function $f_{X(t)}(x)$, the expected value $E\{X(t)\} = m_X(t)$ is given by:
$$E\{X(t)\} = \int_{-\infty}^{\infty} x\, f_{X(t)}(x)\, dx = m_X(t)$$

Variance $\sigma_X^2(t)$:
$$\sigma_X^2(t) = E\{|X(t) - m_X(t)|^2\} = \int_{-\infty}^{\infty} |x - m_X(t)|^2\, f_{X(t)}(x)\, dx$$
Stationary Random Process: For a stationary random process the probability density function is independent of time $t$; thus the expected value and the variance are also constant over time:
$$m_X(t) = m_X(t + t_0) = m_X, \qquad \sigma_X^2(t) = \sigma_X^2(t + t_0) = \sigma_X^2 \quad \text{for all } t, t_0$$
5.1.3 Time Averages and Ergodicity

So far, the average value and the variance of a random process $X(t)$ were calculated based on the probability density function $f_{X(t)}$. However, in practical experiments the probability density function of a random process is often unknown. Also, in many cases there is only one sample function $X(t, \zeta_i)$ available. Therefore, it is favorable to average over time instead of taking the ensemble average.

Average Value $m_X(\zeta_i)$:
$$m_X(\zeta_i) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X(t, \zeta_i)\, dt$$

Variance $\sigma_X^2(\zeta_i)$:
$$\sigma_X^2(\zeta_i) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} \left(X(t, \zeta_i) - m_X(\zeta_i)\right)^2 dt$$
[Figure: ensemble average (taken across all sample functions at a fixed time $t_1$ or $t_2$) versus time average (taken along a single sample function $X(t, \zeta_i)$).]
Ergodicity: A stationary random process $X(t)$ is called ergodic if the time averages of each sample function $X(t, \zeta_i)$ converge towards the corresponding ensemble averages with probability one. In practical applications ergodicity is often assumed, since just one sample function is available and therefore the ensemble averages cannot be taken.

Example 1:
Random process: $X(t) = A \cos(\omega_0 t)$
$A$: discrete random variable with $P(A = 1) = P(A = 2) = 0.5$
$\omega_0$: constant
Ensemble average: $m_X(t) = E\{A\} \cos(\omega_0 t) = 1.5 \cos(\omega_0 t)$
The ensemble average depends on $t$; the process is therefore not stationary and hence not ergodic.
Example 2:
Random process: $X(t) = A$
$A$: discrete random variable with $P(A = 1) = P(A = 2) = 0.5$
Ensemble average: $m_X = E\{A\} = 1 \cdot 0.5 + 2 \cdot 0.5 = 1.5$
Time averages:
$$m_X(\zeta_1) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} 1\, dt = 1, \qquad m_X(\zeta_2) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} 2\, dt = 2$$
The time averages taken for different sample functions are not identical to the ensemble average; the random process is thus not ergodic.
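The distinction can be checked numerically. The following sketch (hypothetical parameter choices, using NumPy) simulates Example 2: the ensemble average over many sample functions converges to 1.5, while the time average of any single sample function is either 1 or 2.

```python
import numpy as np

rng = np.random.default_rng(0)

n_samples = 10_000   # number of sample functions in the ensemble
n_time = 500         # time instants per sample function

# Example 2: X(t) = A with P(A=1) = P(A=2) = 0.5.
# Each sample function is constant over time.
A = rng.choice([1.0, 2.0], size=n_samples)
X = np.tile(A[:, None], (1, n_time))   # rows: sample functions, cols: time

ensemble_avg = X.mean(axis=0)  # average across the ensemble at each time t
time_avg = X.mean(axis=1)      # average over time for each sample function

print("ensemble average (any t):", ensemble_avg[0])   # ~1.5
print("time averages (first 5):", time_avg[:5])       # each is 1.0 or 2.0
# Time averages do not converge to the ensemble average -> not ergodic.
```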
Example 3: Tossing a Coin $N$ Times
Random variable: $X_n = 0$ if the $n$th result is heads, $X_n = 1$ if the $n$th result is tails.
Random process: $Y(t) = \sum_{n=1}^{N} X_n\, \mathrm{rect}(t - n + 0.5)$

[Figure: sample functions $Y(t, \zeta_i)$ of the coin-tossing random process.]

Ensemble average: $m_Y(t) = E\{X_n\} = 0.5$ for $0 < t < N$.
Time average of a sample function containing $n_1$ ones and $n_2$ zeros: $m_Y(\zeta_i) = n_1/N$.
For $N \to \infty$, $n_1$ and $n_2$ converge towards $N/2$, and $m_Y(\zeta_i) = m_Y(t) = m_Y$. The random process is thus ergodic.
5.2 Autocorrelation and Power Spectra

5.2.1 Autocorrelation and Autocovariance
We are interested in how the value of a random process $X(t)$ evaluated at time $t_2$ depends on its value at time $t_1$. At $t_1$ and $t_2$ the random process is characterized by the random variables $X_1$ and $X_2$, respectively. The relationship between $X_1$ and $X_2$ is given by the joint probability density function $f_{X(t_1)X(t_2)}(x_1, x_2)$.
[Figure: sample functions $X(t, \zeta_i)$ with the two time instants $t_1$ and $t_2$ marked.]
Autocorrelation Function:
$$R_{XX}(t_1, t_2) = E\{X(t_1)\, X(t_2)\} = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} x_1 x_2\, f_{X(t_1)X(t_2)}(x_1, x_2)\, dx_1 dx_2$$

Autocovariance Function:
$$C_{XX}(t_1, t_2) = E\{(X(t_1) - m_X(t_1))\,(X(t_2) - m_X(t_2))\}$$
$$= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} (x_1 - m_X(t_1))(x_2 - m_X(t_2))\, f_{X(t_1)X(t_2)}(x_1, x_2)\, dx_1 dx_2 = R_{XX}(t_1, t_2) - m_X(t_1)\, m_X(t_2)$$
Autocorrelation and Autocovariance Function of a Stationary Random Process: The joint probability density function of a stationary process does not change if a constant value $t$ is added to both $t_1$ and $t_2$:
$$f_{X(t_1)X(t_2)}(x_1, x_2) = f_{X(t_1 + t)X(t_2 + t)}(x_1, x_2)$$
The autocorrelation function therefore depends only on the time difference $\tau = t_2 - t_1$:
$$R_{XX}(t_1, t_2) = R_{XX}(\tau) = E\{X(t)\, X(t + \tau)\}$$
Since the average value is a constant, the autocovariance function is given by:
$$C_{XX}(t_1, t_2) = E\{(X(t_1) - m_X)(X(t_2) - m_X)\} = \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} (x_1 - m_X)(x_2 - m_X)\, f_{X(t_1)X(t_2)}(x_1, x_2)\, dx_1 dx_2 = C_{XX}(\tau)$$
Properties of the autocorrelation function of a stationary random process:
- Symmetry: $R_{XX}(\tau) = R_{XX}(-\tau)$
- Mean square value: $R_{XX}(0) = E\{X(t)^2\} \geq 0$
- Maximum: $R_{XX}(0) \geq |R_{XX}(\tau)|$
- Periodicity: if $R_{XX}(0) = R_{XX}(t_0)$, then $R_{XX}(\tau)$ is periodic with period $t_0$.
Wide Sense Stationary (WSS) Random Process: A random process $X(t)$ is called WSS if the following three properties are satisfied:
- The average value of the random process is a constant: $m_X(t) = m_X$
- The autocorrelation and autocovariance functions depend only on the time difference $\tau = t_2 - t_1$: $R_{XX}(t_1, t_2) = R_{XX}(t_2 - t_1) = R_{XX}(\tau)$ and $C_{XX}(t_1, t_2) = C_{XX}(t_2 - t_1) = C_{XX}(\tau)$
- The variance is constant and finite: $\sigma_X^2 = C_{XX}(0) = R_{XX}(0) - m_X^2 < \infty$
Autocorrelation and Autocovariance Function of an Ergodic Random Process:
$$R_{XX}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} X_T(t, \zeta_i)\, X_T(t + \tau, \zeta_i)\, dt$$
$$C_{XX}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} \left(X_T(t, \zeta_i) - m_X\right)\left(X_T(t + \tau, \zeta_i) - m_X\right) dt$$
$X_T(t, \zeta_i)$: sample function of the random process $X(t)$, windowed to be of length $T$ (starting at $-T/2$, ending at $T/2$).
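For a sampled sample function, the time-average autocorrelation can be estimated directly. A minimal sketch (assuming a unit sampling interval and a biased estimator that divides by the full record length, mirroring the $1/T$ normalization):

```python
import numpy as np

def autocorr_time_avg(x: np.ndarray, max_lag: int) -> np.ndarray:
    """Biased time-average estimate of R_xx(tau) for tau = 0..max_lag.

    Divides by the full record length N (the discrete analogue of the
    1/T normalization above).
    """
    n = len(x)
    return np.array([np.dot(x[: n - k], x[k:]) / n for k in range(max_lag + 1)])

# Quick check with a zero-mean white sequence: R_xx(0) ~ variance, R_xx(k) ~ 0.
rng = np.random.default_rng(1)
x = rng.normal(0.0, 2.0, size=100_000)
print(autocorr_time_avg(x, max_lag=5))   # ~[4.0, 0, 0, 0, 0, 0]
```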
5.2.2 Power Spectral Density

Motivation:
- Description of random processes in the frequency domain.
- Calculation of the Fourier transform of a sample function is not useful: a sample function of a stationary process is a power signal, so its Fourier transform generally does not exist, and it would differ from one sample function to the next anyway.
- We assume in the following that the random process considered is at least WSS, if not stationary.
The power spectral density (psd) of a WSS random process $X(t)$ is defined as the Fourier transform of the autocorrelation function $R_{XX}(\tau)$:
$$S_{XX}(f) = \mathcal{F}\{R_{XX}(\tau)\} = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-j2\pi f\tau}\, d\tau$$
Inverse transform:
$$R_{XX}(\tau) = \int_{-\infty}^{\infty} S_{XX}(f)\, e^{j2\pi f\tau}\, df$$
Properties:
$$S_{XX}(f) = S_{XX}(-f), \qquad S_{XX}(f) \geq 0, \qquad \operatorname{Im}\{S_{XX}(f)\} = 0$$
Ergodic Random Process $x(t)$:
Autocorrelation function:
$$R_{xx}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x_T(t, \zeta_i)\, x_T(t + \tau, \zeta_i)\, dt = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x_T(t)\, x_T(t + \tau)\, dt$$
Power spectral density:
$$S_{xx}(f) = \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j2\pi f\tau}\, d\tau = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x_T(t) \underbrace{\int_{-\infty}^{\infty} x_T(t + \tau)\, e^{-j2\pi f\tau}\, d\tau}_{X_T(f)\, e^{j2\pi f t}}\, dt$$
$$= \lim_{T \to \infty} \frac{1}{T}\, X_T(f) \int_{-T/2}^{T/2} x_T(t)\, e^{j2\pi f t}\, dt = \lim_{T \to \infty} \frac{1}{T}\, X_T(f)\, X_T^*(f) = \lim_{T \to \infty} \frac{|X_T(f)|^2}{T}$$
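This limit suggests estimating the psd as $|X_T(f)|^2/T$ (the periodogram). A minimal numerical sketch (assumed parameters; averaging periodograms over segments to reduce the variance of the raw estimate):

```python
import numpy as np

rng = np.random.default_rng(2)
seg_len = 1024    # segment length T in samples
n_seg = 200       # number of segments to average

# White sequence with variance 0.25 -> flat two-sided psd of 0.25
# (in units of power per normalized frequency).
x = rng.normal(0.0, 0.5, size=seg_len * n_seg)

segs = x.reshape(n_seg, seg_len)
X = np.fft.fft(segs, axis=1)                   # X_T(f) per segment
periodograms = (np.abs(X) ** 2) / seg_len      # |X_T(f)|^2 / T
S_est = periodograms.mean(axis=0)              # average over the segments

print("estimated psd (mean over f):", S_est.mean())   # ~0.25
```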
5.2.3 Deterministic Power and Energy Signals
Power Signal: The autocorrelation function and the power spectral density can also be calculated for a deterministic power signal $x(t)$. In this case, the signal simply replaces the random process. With $x_T(t) = x(t)\, \mathrm{rect}(t/T)$ we obtain for the autocorrelation function:
$$R_{xx}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x_T(t)\, x_T(t + \tau)\, dt$$
and for the power spectral density:
$$S_{xx}(f) = \int_{-\infty}^{\infty} R_{xx}(\tau)\, e^{-j2\pi f\tau}\, d\tau = \lim_{T \to \infty} \frac{X_T(f)\, X_T^*(f)}{T} = \lim_{T \to \infty} \frac{|X_T(f)|^2}{T}$$
The power of the signal is:
$$P = R_{xx}(0) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x_T^2(t)\, dt = \int_{-\infty}^{\infty} S_{xx}(f)\, df$$
Energy Signals: For an energy signal $x(t)$, an energy autocorrelation function can be defined as
$$R_{xx}^E(\tau) = \int_{-\infty}^{\infty} x(t)\, x(t + \tau)\, dt = x(\tau) * x(-\tau)$$
Applying the Fourier transform to the energy autocorrelation function, we obtain the energy spectral density:
$$S_{xx}^E(f) = \int_{-\infty}^{\infty} R_{xx}^E(\tau)\, e^{-j2\pi f\tau}\, d\tau = X(f)\, X^*(f) = |X(f)|^2$$
The energy of the signal is:
$$E = R_{xx}^E(0) = \int_{-\infty}^{\infty} x^2(t)\, dt = \int_{-\infty}^{\infty} S_{xx}^E(f)\, df = \int_{-\infty}^{\infty} |X(f)|^2\, df$$
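For sampled signals the energy autocorrelation is a plain correlation sum. A small sketch (hypothetical rectangular pulse of width 1, sampling interval 0.01) checking $R^E_{xx}(\tau) = x(\tau) * x(-\tau)$ and the energy relation $E = \int |X(f)|^2\, df$, both of which come out as $\approx 1$ up to discretization:

```python
import numpy as np

dt = 0.01                                   # sampling interval (assumption)
t = np.arange(-1.0, 1.0, dt)
x = np.where(np.abs(t) <= 0.5, 1.0, 0.0)    # rect(t), width 1

# Energy autocorrelation: R^E_xx(tau) = integral of x(t) x(t+tau) dt
R_E = np.correlate(x, x, mode="full") * dt  # triangle, peak ~1 at tau = 0

energy_time = np.sum(x**2) * dt             # = R^E_xx(0)
X = np.fft.fft(x) * dt                      # approximate X(f)
df = 1.0 / (len(x) * dt)
energy_freq = np.sum(np.abs(X) ** 2) * df   # Parseval: same value

print(R_E.max(), energy_time, energy_freq)  # all ~1.0
```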
5.2.4 Examples of Autocorrelation Functions and Power Spectral Densities

Example 1: Sinusoid with Random Phase
$$x(t) = A \sin(\omega_0 t + \Theta)$$
$A$, $\omega_0$: constant values; $\Theta$: random variable with probability density function
$$f_\Theta(x) = \begin{cases} 1/(2\pi) & \text{for } -\pi \leq x < \pi \\ 0 & \text{otherwise} \end{cases}$$

Average value:
$$m_x(t) = E\{x(t)\} = \int_{-\infty}^{\infty} A \sin(\omega_0 t + x)\, f_\Theta(x)\, dx = \frac{1}{2\pi} \int_{-\pi}^{\pi} A \sin(\omega_0 t + x)\, dx = 0$$
The average value is a constant and independent of $t$. Since $m_x = 0$, the autocorrelation and autocovariance functions are identical.
Autocorrelation function:
$$R_{xx}(t_1, t_2) = E\{x(t_1)\, x(t_2)\} = E\{0.5A^2 \cos(\omega_0(t_2 - t_1))\} - E\{0.5A^2 \cos(\omega_0(t_2 + t_1) + 2\Theta)\}$$
$$= 0.5A^2 \cos(\omega_0(t_2 - t_1)) - 0 = R_{xx}(\tau)$$
The autocorrelation function only depends on $\tau = t_2 - t_1$ but not on the absolute values of $t_1$ and $t_2$.
Power spectral density (with $\omega_0 = 2\pi f_0$):
$$S_{xx}(f) = \mathcal{F}\left\{\frac{A^2}{2} \cos(2\pi f_0 \tau)\right\} = \frac{A^2}{4}\left[\delta(f - f_0) + \delta(f + f_0)\right]$$
Time averages, taken from a single sample function $x(t, \zeta_i) = A \sin(\omega_0 t + \theta_i)$:

Average value (time average):
$$m_x(\zeta_i) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} A \sin(\omega_0 t + \theta_i)\, dt = 0$$

Autocorrelation function (time average):
$$R_{x(\zeta_i)x(\zeta_i)}(\tau) = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t, \zeta_i)\, x(t + \tau, \zeta_i)\, dt = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} A^2 \sin(\omega_0 t + \theta_i)\, \sin(\omega_0(t + \tau) + \theta_i)\, dt$$
$$= \frac{A^2}{2} \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} \left[\cos(\omega_0 \tau) - \cos(\omega_0(2t + \tau) + 2\theta_i)\right] dt = \frac{A^2}{2} \cos(\omega_0 \tau) = R_{xx}(\tau)$$
Time averages of one sample function and ensemble averages are identical; the random process is therefore ergodic.
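A quick numerical check of this ergodicity claim (assumed values $A = 1$, $f_0 = 1$ Hz, sampling rate 100 Hz): the time-average autocorrelation of one sample function matches $\frac{A^2}{2}\cos(\omega_0 \tau)$, regardless of the drawn phase.

```python
import numpy as np

rng = np.random.default_rng(3)
A, f0 = 1.0, 1.0                 # amplitude and frequency (assumptions)
fs = 100.0                       # sampling rate
t = np.arange(0, 200.0, 1 / fs)  # long record approximates T -> infinity

theta = rng.uniform(-np.pi, np.pi)          # one realization of the phase
x = A * np.sin(2 * np.pi * f0 * t + theta)  # one sample function

# Time-average autocorrelation at a few lags tau = k/fs
n = len(x)
for k in [0, 10, 25, 50]:
    R_hat = np.mean(x[: n - k] * x[k:])
    R_theory = 0.5 * A**2 * np.cos(2 * np.pi * f0 * k / fs)
    print(f"tau={k/fs:5.2f}s  estimate={R_hat:+.4f}  theory={R_theory:+.4f}")
```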
[Figure: for $A = 1$, $\omega_0 = 2\pi$: five sample functions $x(t, \zeta_1), \ldots, x(t, \zeta_5)$ over $0 \leq t \leq 5$ s, the common autocorrelation function $R_{xx}(\tau) = 0.5\cos(2\pi\tau)$, and the power spectral density $S_{xx}(f) = 0.25[\delta(f - 1) + \delta(f + 1)]$.]
Example 2: Binary Data Transmission
A binary sequence is transmitted by rectangular pulses of width $T_b$. The amplitude of the pulse is determined by each bit, i.e., it is one if the bit is one and zero if the bit is zero. We assume that ones and zeros are equally likely and that each bit is statistically independent of all others. Using ergodicity, we obtain the following results from a sample function $x(t, \zeta_1)$:

Average value (sample function of length $N$ bits containing $n_1$ ones):
$$m_x = \lim_{T \to \infty} \frac{1}{T} \int_0^T x(t, \zeta_1)\, dt = \lim_{N \to \infty} \frac{1}{N T_b} \int_0^{N T_b} x(t, \zeta_1)\, dt = \lim_{N \to \infty} \frac{n_1 T_b}{N T_b} = \frac{1}{2}$$
Autocorrelation function (time average of the sample function multiplied with shifted copies $x(t + \tau, \zeta_1)$): for $|\tau| \geq T_b$ the overlapping bits are statistically independent, so the average of the product is $m_x^2 = 0.25$; for $|\tau| < T_b$ the overlap within one bit adds a triangular contribution:
$$R_{xx}(\tau) = \begin{cases} 0.25 + 0.25\,(1 - |\tau|/T_b) & |\tau| < T_b \\ 0.25 & |\tau| \geq T_b \end{cases}$$
Power spectral density:
$$S_{xx}(f) = 0.25\,\delta(f) + 0.25\, T_b\, \mathrm{sinc}^2(f T_b)$$

[Figure: the sample function $x(t, \zeta_1)$ and shifted copies $x(t + 0.3T_b, \zeta_1)$, $x(t + T_b, \zeta_1)$, $x(t + 1.5T_b, \zeta_1)$; the autocorrelation function $R_{xx}(\tau)$; and the power spectral density plotted over $f T_b$.]
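This result can be checked by simulation. A sketch with assumed values $T_b = 1$ s and 16 samples per bit: subtracting the mean $m_x = 0.5$ removes the dc line $0.25\,\delta(f)$, so the averaged periodogram should approach the continuous part $0.25\, T_b\, \mathrm{sinc}^2(f T_b)$ (approximately; sampling causes mild aliasing at higher frequencies).

```python
import numpy as np

rng = np.random.default_rng(4)
Tb = 1.0            # bit duration in seconds (assumption)
spb = 16            # samples per bit
fs = spb / Tb       # sampling rate
n_bits, n_seg = 256, 400

S_sum = None
for _ in range(n_seg):
    bits = rng.integers(0, 2, size=n_bits).astype(float)
    x = np.repeat(bits, spb) - 0.5          # remove mean -> drops the dc delta
    X = np.fft.fft(x)
    P = (np.abs(X) ** 2) / (len(x) * fs)    # periodogram, power per Hz
    S_sum = P if S_sum is None else S_sum + P
S_est = S_sum / n_seg

f = np.fft.fftfreq(n_bits * spb, d=1 / fs)
S_theory = 0.25 * Tb * np.sinc(f * Tb) ** 2  # np.sinc(x) = sin(pi x)/(pi x)

for k in [1, 4, 8, 16]:                      # a few low-frequency bins
    print(f"f={f[k]:.3f} Hz  est={S_est[k]:.4f}  theory={S_theory[k]:.4f}")
```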
Example 3: Rectangular Pulse (Energy Signal)
For $f(t) = \mathrm{rect}(t)$, the energy autocorrelation function is the triangle function:
$$R_{ff}^E(\tau) = \int_{-\infty}^{\infty} f(t)\, f(t + \tau)\, dt = \Lambda(\tau)$$
and the energy spectral density is:
$$S_{ff}^E(f) = \mathcal{F}\{R_{ff}^E(\tau)\} = \mathrm{sinc}^2(f)$$

[Figure: the triangular autocorrelation function $\Lambda(\tau)$ and the energy spectral density $\mathrm{sinc}^2(f)$.]
Example 4: White Noise
A random process $n(t)$ is called white noise if it has a constant power spectral density of $N_0/2$ watts per Hz, measured over positive frequencies. If in addition the random process has zero mean ($m_n = 0$), the power spectral density is given by:
$$S_{nn}(f) = N_0/2 \quad \text{for all } f$$
Autocorrelation function:
$$R_{nn}(\tau) = \mathcal{F}^{-1}\{S_{nn}(f)\} = \frac{N_0}{2}\, \delta(\tau)$$
Since only the first and second moments of the process are known, the probability density function cannot be uniquely determined. In the case of a Gaussian probability density function, the process is called white Gaussian noise. If white Gaussian noise is added to a signal, we denote it as additive white Gaussian noise (AWGN).
[Figure: a sample function $n(t)$ of a white noise process and its constant power spectral density $S_{nn}(f) = N_0/2$.]
5.3 Excitation of LTI Systems with Stationary Random Processes

Excitation of an LTI system with impulse response $h(t)$ by a sample function $x(t, \zeta_i)$ of a stationary random process $x(t)$:

[Figure: block diagram — $x(t, \zeta_i)$ enters the LTI system $h(t)$; the output is $y(t, \zeta_i) = x(t, \zeta_i) * h(t)$.]

5.3.1 Expected Value of the Output Random Process

$$m_y = E\{y(t)\} = E\left\{\int_{-\infty}^{\infty} h(\tau)\, x(t - \tau)\, d\tau\right\} = \int_{-\infty}^{\infty} E\{x(t - \tau)\}\, h(\tau)\, d\tau = m_x \int_{-\infty}^{\infty} h(\tau)\, \underbrace{e^{-j2\pi \cdot 0 \cdot \tau}}_{=\,1}\, d\tau = m_x\, H(0)$$

The expected value of the output process is thus the expected value of the input process weighted by the frequency response of the system at $f = 0$.
5.3.2 Autocorrelation Function of the Output Random Process

$$R_{yy}(\tau) = E\{y(t)\, y(t + \tau)\} = E\{(x(t) * h(t))\,(x(t + \tau) * h(t + \tau))\}$$
$$= E\left\{\int_{-\infty}^{\infty} h(\nu)\, x(t - \nu)\, d\nu \int_{-\infty}^{\infty} h(\mu)\, x(t + \tau - \mu)\, d\mu\right\}$$
$$= \int_{-\infty}^{\infty}\!\int_{-\infty}^{\infty} h(\nu)\, h(\mu)\, R_{xx}(\tau + \nu - \mu)\, d\nu\, d\mu$$
$$= \int_{-\infty}^{\infty} \underbrace{\int_{-\infty}^{\infty} h(\nu)\, h(\nu + \lambda)\, d\nu}_{R_{hh}^E(\lambda)}\; R_{xx}(\tau - \lambda)\, d\lambda = R_{hh}^E(\tau) * R_{xx}(\tau)$$
5.3.3 Power Spectral Density of the Output Random Process

With the Fourier transform pairs
$$h(\tau) \leftrightarrow H(f), \qquad h(-\tau) \leftrightarrow H(-f) = H^*(f)$$
the energy autocorrelation function of the (real) impulse response transforms as
$$R_{hh}^E(\tau) = h(\tau) * h(-\tau) \leftrightarrow H(f)\, H^*(f) = |H(f)|^2$$
and thus
$$S_{yy}(f) = S_{xx}(f)\, |H(f)|^2$$
[Figure: summary — the input $x(t, \zeta_i)$ with $R_{xx}(\tau)$ and $S_{xx}(f)$ passes through the LTI system $h(t)$; the output $y(t, \zeta_i)$ has $R_{yy}(\tau) = R_{xx}(\tau) * R_{hh}^E(\tau)$ and $S_{yy}(f) = S_{xx}(f)\, |H(f)|^2$.]
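A discrete-time check of this relation (a sketch with an arbitrary short FIR filter and white input, so that $R_{xx}(\tau) = \sigma_x^2\,\delta(\tau)$ and hence $R_{yy}(\tau) = \sigma_x^2\, R_{hh}^E(\tau)$):

```python
import numpy as np

rng = np.random.default_rng(5)
h = np.array([1.0, 0.5, -0.25, 0.1])   # arbitrary FIR impulse response
sigma2 = 2.0                            # input variance

x = rng.normal(0.0, np.sqrt(sigma2), size=500_000)  # white input
y = np.convolve(x, h, mode="full")[: len(x)]        # output sample function

# Estimated output ACF at lags 0..4 vs sigma2 * energy ACF of h
R_hh = np.correlate(h, h, mode="full")  # energy ACF of h, lags -3..3
mid = len(h) - 1                        # index of lag 0
n = len(y)
for k in range(5):
    R_yy = np.dot(y[: n - k], y[k:]) / n
    theory = sigma2 * (R_hh[mid + k] if k < len(h) else 0.0)
    print(f"lag {k}: est={R_yy:+.4f}  theory={theory:+.4f}")
```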
Example: Ideal Lowpass Filtering of White Noise
Input random process $n_i(t)$: white noise with $S_{n_i n_i}(f) = N_0/2$.
Ideal lowpass filter with cutoff frequency $f_c$:
$$H_{LP}(f) = \begin{cases} 1 & |f| \leq f_c \\ 0 & \text{otherwise} \end{cases}$$
Output power spectral density:
$$S_{n_o n_o}(f) = S_{n_i n_i}(f)\, |H_{LP}(f)|^2 = \frac{N_0}{2}\, \mathrm{rect}\!\left(\frac{f}{2 f_c}\right)$$
Output power:
$$P_{n_o} = \int_{-\infty}^{\infty} S_{n_o n_o}(f)\, df = \int_{-f_c}^{f_c} \frac{N_0}{2}\, df = N_0 f_c$$

[Figure: the input psd $S_{n_i n_i}(f) = N_0/2$, the ideal lowpass $H_{LP}(f)$ with cutoff $f_c$, and the output psd $S_{n_o n_o}(f)$, equal to $N_0/2$ for $|f| \leq f_c$ and zero elsewhere.]
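A numerical sketch of this example (assumed $N_0 = 2$, $f_c = 50$ Hz, sampling rate 1 kHz; the ideal lowpass is applied by masking FFT bins), checking that the output power comes out as $N_0 f_c$:

```python
import numpy as np

rng = np.random.default_rng(6)
fs = 1000.0   # sampling rate in Hz (assumption)
N0 = 2.0      # two-sided psd level N0/2 = 1 W/Hz
fc = 50.0     # lowpass cutoff in Hz
n = 2**18

# Discrete white noise with two-sided psd N0/2 over |f| < fs/2:
# its variance must be (N0/2) * fs.
ni = rng.normal(0.0, np.sqrt(0.5 * N0 * fs), size=n)

# Ideal lowpass: zero out all FFT bins with |f| > fc
f = np.fft.fftfreq(n, d=1 / fs)
no = np.fft.ifft(np.fft.fft(ni) * (np.abs(f) <= fc)).real

print("output power:", np.mean(no**2))   # ~ N0 * fc = 100
```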
5.3.4 Cross-Correlation between Input and Output Random Process

The autocorrelation function describes the statistical properties of two random variables $X_1$ and $X_2$ taken from the same random process at times $t_1$ and $t_2$, respectively: $X_1 = X(t_1)$ and $X_2 = X(t_2)$. The cross-correlation function describes the statistical properties of two random variables $X_1$ and $Y_2$ taken from two different random processes $X(t)$ and $Y(t)$ (here: input and output of an LTI system) at times $t_1$ and $t_2$, respectively, such that $X_1 = X(t_1)$ and $Y_2 = Y(t_2)$. For jointly stationary processes it is defined as:
$$R_{XY}(\tau) = E\{X(t)\, Y(t + \tau)\}$$
Two random processes $X(t)$ and $Y(t)$ are called uncorrelated if their cross-covariance vanishes for all $\tau$, i.e., if $R_{XY}(\tau) = m_X\, m_Y$.
Here:
$$R_{xy}(\tau) = E\{x(t)\, y(t + \tau)\} = E\{x(t)\,(x(t + \tau) * h(t + \tau))\}$$
$$= E\left\{x(t) \int_{-\infty}^{\infty} h(\mu)\, x(t + \tau - \mu)\, d\mu\right\} = \int_{-\infty}^{\infty} h(\mu)\, E\{x(t)\, x(t + \tau - \mu)\}\, d\mu$$
$$= \int_{-\infty}^{\infty} h(\mu)\, R_{xx}(\tau - \mu)\, d\mu = h(\tau) * R_{xx}(\tau)$$
Example: System Identification
An LTI system with unknown impulse response is excited with a white noise random process $n_i(t)$ with power spectral density $S_{n_i n_i}(f) = N_0/2$. The output noise process is $n_o(t)$. The cross-correlation between input and output noise process is given by:
$$R_{n_i n_o}(\tau) = h(\tau) * R_{n_i n_i}(\tau) = h(\tau) * \frac{N_0}{2}\,\delta(\tau) = \frac{N_0}{2}\, h(\tau)$$
The measured cross-correlation is thus, up to the factor $N_0/2$, the impulse response of the unknown system.
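This identity suggests a simple measurement procedure. A sketch (hypothetical "unknown" FIR system, discrete time, unit-variance white input, so the input power $\sigma^2$ plays the role of $N_0/2$):

```python
import numpy as np

rng = np.random.default_rng(7)
h_true = np.array([0.8, -0.4, 0.2, 0.1, -0.05])  # "unknown" system (assumption)

sigma2 = 1.0                                      # input power ~ N0/2
ni = rng.normal(0.0, np.sqrt(sigma2), size=1_000_000)
no = np.convolve(ni, h_true, mode="full")[: len(ni)]

# Cross-correlation estimate: R_ni_no(k) = E{ni(t) no(t+k)} = sigma2 * h(k)
n = len(ni)
h_est = np.array([np.dot(ni[: n - k], no[k:]) / n for k in range(len(h_true))])
h_est /= sigma2                                   # undo the known scale factor

print("true:", h_true)
print("est :", np.round(h_est, 3))
```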