6. Gaussian Random Processes
First, let us recall a few facts about Gaussian random vectors. As we saw before, random variables $X_1, X_2, \dots, X_n$ are said to be jointly normal if, for all $a_1, a_2, \dots, a_n \in \mathbb{R}$, the random variable
$$a_1 X_1 + a_2 X_2 + \cdots + a_n X_n$$
is a normal random variable. Also, a random vector
$$\mathbf{X} = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}$$
is said to be normal or Gaussian if the random variables $X_1, X_2, \dots, X_n$ are jointly normal. An important property of jointly normal random variables is that their joint PDF is completely determined by their mean vector and covariance matrix. More specifically, for a normal random vector $\mathbf{X}$ with mean $\mathbf{m}$ and covariance matrix $\mathbf{C}$, the PDF is given by
$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2}\sqrt{\det \mathbf{C}}} \exp\left\{ -\frac{1}{2}(\mathbf{x}-\mathbf{m})^T \mathbf{C}^{-1} (\mathbf{x}-\mathbf{m}) \right\}.$$
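As a quick numerical sketch of this formula (the function name `normal_pdf` and the example covariance matrix are mine, chosen only for illustration), we can evaluate the density directly from its definition:

```python
import numpy as np

def normal_pdf(x, m, C):
    """Jointly normal PDF:
    f_X(x) = exp(-(x-m)^T C^{-1} (x-m) / 2) / ((2*pi)^(n/2) * sqrt(det C))."""
    x = np.asarray(x, dtype=float)
    m = np.asarray(m, dtype=float)
    C = np.asarray(C, dtype=float)
    n = len(m)
    d = x - m
    quad = d @ np.linalg.solve(C, d)          # (x-m)^T C^{-1} (x-m)
    norm = (2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C))
    return np.exp(-0.5 * quad) / norm

# Example (assumed values): a 2-dimensional normal vector.
C = np.array([[1.0, 0.5], [0.5, 2.0]])
m = np.zeros(2)
# At x = m the quadratic form vanishes, so the density peaks at
# 1 / ((2*pi)^(n/2) * sqrt(det C)).
peak = normal_pdf(m, m, C)
```

The peak value at the mean follows because the exponent is zero there, leaving only the normalizing constant.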
We can now define Gaussian random processes. A random process $\{X(t), t \in J\}$ is said to be a Gaussian (normal) random process if, for all
$$t_1, t_2, \dots, t_n \in J,$$
the random variables $X(t_1), X(t_2), \dots, X(t_n)$ are jointly normal.
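To see this definition concretely, here is a minimal simulation sketch: any finite collection $X(t_1), \dots, X(t_n)$ of a Gaussian process is one jointly normal vector, so it can be drawn from a multivariate normal whose covariance comes from the autocorrelation. The time grid and the autocorrelation $R_X(\tau) = e^{-\tau^2}$ below are assumptions made for illustration only.

```python
import numpy as np

def R_X(tau):
    # Assumed autocorrelation of a zero-mean process (illustrative choice).
    return np.exp(-tau ** 2)

t = np.array([0.0, 0.5, 1.0, 2.0])       # time instants t1, ..., tn
# Zero mean => Cov(X(ti), X(tj)) = R_X(ti - tj).
C = R_X(t[:, None] - t[None, :])

rng = np.random.default_rng(0)
samples = rng.multivariate_normal(np.zeros(len(t)), C, size=50000)

# The empirical covariance of the samples should approximate C.
emp_cov = np.cov(samples, rowvar=False)
```

Each row of `samples` is one draw of $(X(t_1), \dots, X(t_n))$; the check on `emp_cov` confirms the samples carry the intended covariance structure.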
Example 10.12
Let $\{X(t), t \in \mathbb{R}\}$ be a zero-mean Gaussian random process with $R_X(\tau) = e^{-\tau^2}$, for all $\tau \in \mathbb{R}$.
1. Find $P(X(1) < 1)$.
2. If $Y = X(1) + X(2)$, find $P(Y < 1)$.
Solution
1. $X(1)$ is a normal random variable with mean $E[X(1)] = 0$ and variance
$$\mathrm{Var}(X(1)) = E[X(1)^2] = R_X(0) = 1.$$
Thus,
$$P(X(1) < 1) = \Phi\left(\frac{1-0}{1}\right) = \Phi(1) \approx 0.84.$$
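The value $\Phi(1)$ in part 1 can be checked numerically using the identity $\Phi(z) = \tfrac{1}{2}\bigl(1 + \mathrm{erf}(z/\sqrt{2})\bigr)$ (a standard relation; the helper name `Phi` is mine):

```python
from math import erf, sqrt

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Part 1: X(1) ~ N(0, 1), so P(X(1) < 1) = Phi((1 - 0)/1) = Phi(1).
p1 = Phi(1.0)   # approximately 0.8413
```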
2. $Y = X(1) + X(2)$ is a normal random variable, since $X(1)$ and $X(2)$ are jointly normal. Its mean is
$$EY = E[X(1)] + E[X(2)] = 0,$$
and its variance is
$$\mathrm{Var}(Y) = \mathrm{Var}(X(1)) + \mathrm{Var}(X(2)) + 2\,\mathrm{Cov}(X(1), X(2)).$$
Note that
$$\mathrm{Var}(X(1)) = E[X(1)^2] - E[X(1)]^2 = R_X(0) - \mu_X^2 = 1 - 0 = 1 = \mathrm{Var}(X(2));$$
$$\mathrm{Cov}(X(1), X(2)) = E[X(1)X(2)] - E[X(1)]E[X(2)] = R_X(-1) - \mu_X^2 = e^{-1} - 0 = \frac{1}{e}.$$
Therefore,
$$\mathrm{Var}(Y) = 2 + \frac{2}{e}.$$
We conclude $Y \sim N\!\left(0,\, 2 + \frac{2}{e}\right)$. Thus,
$$P(Y < 1) = \Phi\left(\frac{1-0}{\sqrt{2 + \frac{2}{e}}}\right) = \Phi(0.6046) \approx 0.73.$$
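The arithmetic in part 2 can be verified the same way (again using the standard relation between $\Phi$ and the error function; variable names are mine):

```python
from math import erf, exp, sqrt

def Phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Var(Y) = Var(X(1)) + Var(X(2)) + 2*Cov(X(1), X(2)) = 1 + 1 + 2/e.
var_y = 1.0 + 1.0 + 2.0 * exp(-1.0)

# Standardize: Y ~ N(0, 2 + 2/e), so P(Y < 1) = Phi((1 - 0)/sqrt(2 + 2/e)).
z = (1.0 - 0.0) / sqrt(var_y)   # approximately 0.6046
p2 = Phi(z)                      # approximately 0.73
```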
An important property of normal random processes is that wide-sense stationarity and strict-sense stationarity are equivalent for these processes. More specifically, we can state the following theorem.
Theorem 10.1 Consider the Gaussian random process $\{X(t), t \in \mathbb{R}\}$. If $X(t)$ is WSS, then $X(t)$ is a stationary process.
Proof
We need to show that, for all $t_1, t_2, \dots, t_r \in \mathbb{R}$ and all $\Delta \in \mathbb{R}$, the joint CDF of
$$X(t_1), X(t_2), \dots, X(t_r)$$
and
$$X(t_1 + \Delta), X(t_2 + \Delta), \dots, X(t_r + \Delta)$$
are the same. Since both collections are jointly normal, their joint CDFs are completely determined by their mean vectors and covariance matrices. Because $X(t)$ is WSS, we have $E[X(t_i)] = E[X(t_i + \Delta)] = \mu_X$ for all $i$, and
$$\mathrm{Cov}(X(t_i), X(t_j)) = R_X(t_i - t_j) - \mu_X^2 = \mathrm{Cov}(X(t_i + \Delta), X(t_j + \Delta)).$$
From the above, we conclude that the mean vector and the covariance matrix of the two collections are the same; therefore, the two joint CDFs are equal, and $X(t)$ is a stationary process.
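The key step of the proof, that shifting all time instants by $\Delta$ leaves the mean vector and covariance matrix unchanged, can be illustrated numerically. The autocorrelation $R_X(\tau) = e^{-|\tau|}$, the time instants, and the shift below are arbitrary assumed values:

```python
import numpy as np

def R_X(tau):
    # Assumed WSS autocorrelation (illustrative choice).
    return np.exp(-np.abs(tau))

mu_X = 0.0                                   # WSS: constant mean
t = np.array([0.3, 1.0, 2.5])                # arbitrary time instants t1, ..., tr
delta = 4.2                                  # arbitrary shift

def cov_matrix(times):
    diffs = times[:, None] - times[None, :]  # pairwise ti - tj
    return R_X(diffs) - mu_X ** 2            # Cov(X(ti), X(tj)) = R_X(ti - tj) - mu_X^2

C0 = cov_matrix(t)
C_shift = cov_matrix(t + delta)              # differences ti - tj are unchanged
```

Since the covariance depends only on the differences $t_i - t_j$, the two matrices coincide exactly, which is what makes the joint Gaussian CDFs of the original and shifted collections identical.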
Two random processes $\{X(t), t \in J\}$ and $\{Y(t), t \in J\}$ are said to be jointly Gaussian (normal) if, for all
$$t_1, t_2, \dots, t_m \in J$$
and
$$t'_1, t'_2, \dots, t'_n \in J,$$