
10.1.5 Gaussian Random Processes


Here, we will briefly introduce normal (Gaussian) random processes. We will discuss some examples of Gaussian
processes in more detail later on. Many important practical random processes are subclasses of normal random
processes.

First, let us recall a few facts about Gaussian random vectors. As we saw before, random variables $X_1, X_2, \dots, X_n$ are said to be jointly normal if, for all $a_1, a_2, \dots, a_n \in \mathbb{R}$, the random variable

$$a_1 X_1 + a_2 X_2 + \dots + a_n X_n$$
is a normal random variable. Also, a random vector

$$\mathbf{X} = \begin{bmatrix} X_1 \\ X_2 \\ \vdots \\ X_n \end{bmatrix}$$

is said to be normal or Gaussian if the random variables $X_1, X_2, \dots, X_n$ are jointly normal. An important property of jointly normal random variables is that their joint PDF is completely determined by their mean vector and covariance matrix. More specifically, for a normal random vector $\mathbf{X}$ with mean $\mathbf{m}$ and covariance matrix $C$, the PDF is given by

$$f_{\mathbf{X}}(\mathbf{x}) = \frac{1}{(2\pi)^{n/2} \sqrt{\det C}} \exp\left\{ -\frac{1}{2} (\mathbf{x}-\mathbf{m})^T C^{-1} (\mathbf{x}-\mathbf{m}) \right\}.$$
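As a quick numerical sanity check, here is a minimal sketch, assuming NumPy and SciPy are available; the mean vector and covariance matrix are made-up values chosen only for illustration, and the hand computation is compared against scipy.stats.multivariate_normal:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Illustrative (made-up) parameters: any positive definite C works.
m = np.array([0.0, 1.0])            # mean vector
C = np.array([[2.0, 0.5],
              [0.5, 1.0]])          # covariance matrix
x = np.array([0.5, 0.5])            # point at which to evaluate the PDF

# Direct evaluation of the formula above.
n = len(m)
quad = (x - m) @ np.linalg.inv(C) @ (x - m)
pdf_manual = np.exp(-0.5 * quad) / ((2 * np.pi) ** (n / 2) * np.sqrt(np.linalg.det(C)))

# Cross-check against SciPy's implementation.
pdf_scipy = multivariate_normal(mean=m, cov=C).pdf(x)

print(pdf_manual, pdf_scipy)        # the two values agree
```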

Now, let us define Gaussian random processes.


A random process $\{X(t), t \in J\}$ is said to be a Gaussian (normal) random process if, for all

$$t_1, t_2, \dots, t_n \in J,$$

the random variables $X(t_1), X(t_2), \dots, X(t_n)$ are jointly normal.
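This definition has a direct computational reading: sampling a Gaussian process at finitely many times is just drawing one multivariate normal vector. The sketch below is a hypothetical helper, assuming NumPy, using a zero mean and the covariance function $C_X(s, t) = e^{-(s-t)^2}$ (the one appearing in Example 10.12 below) purely for concreteness:

```python
import numpy as np

def sample_gaussian_process(times, cov_fn, rng):
    """Draw X(t_1), ..., X(t_n) as a single jointly normal vector."""
    t = np.asarray(times, dtype=float)
    C = cov_fn(t[:, None], t[None, :])   # C_ij = C_X(t_i, t_j)
    mean = np.zeros(len(t))              # zero-mean process assumed
    return rng.multivariate_normal(mean, C)

rng = np.random.default_rng(0)
cov_fn = lambda s, t: np.exp(-(s - t) ** 2)   # illustrative covariance function
x = sample_gaussian_process([0.0, 0.5, 1.0, 2.0], cov_fn, rng)
print(x)   # one realization of (X(0), X(0.5), X(1), X(2))
```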

Example 10.12

Let $X(t)$ be a zero-mean WSS Gaussian process with $R_X(\tau) = e^{-\tau^2}$, for all $\tau \in \mathbb{R}$.

1. Find $P(X(1) < 1)$.

2. Find $P(X(1) + X(2) < 1)$.

Solution

1. $X(1)$ is a normal random variable with mean $E[X(1)] = 0$ and variance

$$\textrm{Var}(X(1)) = E[X(1)^2] = R_X(0) = 1.$$

Thus,

$$P(X(1) < 1) = \Phi\left(\frac{1-0}{1}\right) = \Phi(1) \approx 0.84.$$

2. Let $Y = X(1) + X(2)$. Then, $Y$ is a normal random variable. We have

$$EY = E[X(1)] + E[X(2)] = 0;$$

$$\textrm{Var}(Y) = \textrm{Var}(X(1)) + \textrm{Var}(X(2)) + 2 \textrm{Cov}(X(1), X(2)).$$

Note that

$$\textrm{Var}(X(1)) = E[X(1)^2] - E[X(1)]^2 = R_X(0) - \mu_X^2 = 1 - 0 = 1 = \textrm{Var}(X(2));$$

$$\textrm{Cov}(X(1), X(2)) = E[X(1)X(2)] - E[X(1)]E[X(2)] = R_X(-1) - \mu_X^2 = e^{-1} - 0 = \frac{1}{e}.$$

Therefore,

$$\textrm{Var}(Y) = 2 + \frac{2}{e}.$$

We conclude $Y \sim N\left(0,\, 2 + \frac{2}{e}\right)$. Thus,

$$P(Y < 1) = \Phi\left(\frac{1-0}{\sqrt{2 + \frac{2}{e}}}\right) = \Phi(0.6046) \approx 0.73.$$
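Both answers are easy to double-check numerically. Here is a minimal sketch, assuming NumPy and SciPy, that evaluates the two normal CDF expressions and cross-checks the second one with a Monte Carlo draw from the bivariate normal distribution of $(X(1), X(2))$:

```python
import numpy as np
from scipy.stats import norm

# Part 1: P(X(1) < 1) = Phi(1)
print(norm.cdf(1.0))                        # ~0.8413

# Part 2: P(X(1) + X(2) < 1) = Phi(1 / sqrt(2 + 2/e))
var_y = 2 + 2 / np.e
print(norm.cdf(1.0 / np.sqrt(var_y)))       # ~0.7273

# Monte Carlo cross-check: (X(1), X(2)) is bivariate normal with unit
# variances and Cov(X(1), X(2)) = R_X(-1) = 1/e.
rng = np.random.default_rng(1)
C = np.array([[1.0, np.exp(-1.0)],
              [np.exp(-1.0), 1.0]])
samples = rng.multivariate_normal([0.0, 0.0], C, size=1_000_000)
print(np.mean(samples.sum(axis=1) < 1.0))   # close to 0.7273
```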

An important property of normal random processes is that wide-sense stationarity and strict-sense stationarity are
equivalent for these processes. More specifically, we can state the following theorem.

Theorem 10.1 Consider the Gaussian random process $\{X(t), t \in \mathbb{R}\}$. If $X(t)$ is WSS, then $X(t)$ is a stationary process.

Proof
We need to show that, for all $t_1, t_2, \dots, t_r \in \mathbb{R}$ and all $\Delta \in \mathbb{R}$, the joint CDF of

$$X(t_1), X(t_2), \dots, X(t_r)$$

is the same as the joint CDF of

$$X(t_1 + \Delta), X(t_2 + \Delta), \dots, X(t_r + \Delta).$$


Since these random variables are jointly Gaussian, it suffices to show that the mean vectors and the covariance matrices are the same. To see this, note that $X(t)$ is a WSS process, so

$$\mu_X(t_i) = \mu_X(t_j) = \mu_X, \quad \textrm{for all } i, j,$$

and

$$C_X(t_i + \Delta, t_j + \Delta) = C_X(t_i, t_j) = C_X(t_i - t_j), \quad \textrm{for all } i, j.$$

From the above, we conclude that the mean vector and the covariance matrix of

$$X(t_1), X(t_2), \dots, X(t_r)$$

are the same as the mean vector and the covariance matrix of

$$X(t_1 + \Delta), X(t_2 + \Delta), \dots, X(t_r + \Delta).$$
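The key step of the proof is easy to see numerically: shifting every sample time by $\Delta$ leaves every pairwise time difference, and hence the covariance matrix, unchanged. A minimal sketch, assuming NumPy and reusing the covariance $R_X(\tau) = e^{-\tau^2}$ from Example 10.12:

```python
import numpy as np

R = lambda tau: np.exp(-tau ** 2)     # illustrative WSS autocovariance (zero mean)

def cov_matrix(times):
    # For a zero-mean WSS process, C_X(t_i, t_j) = R_X(t_i - t_j).
    t = np.asarray(times, dtype=float)
    return R(t[:, None] - t[None, :])

t = np.array([0.3, 1.0, 2.5])
delta = 4.2

# Shifting all times by delta preserves every difference t_i - t_j,
# so the covariance matrix (and hence the joint CDF) is identical.
print(np.allclose(cov_matrix(t), cov_matrix(t + delta)))   # True
```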

Similarly, we can define jointly Gaussian random processes.


Two random processes $\{X(t), t \in J\}$ and $\{Y(t), t \in J'\}$ are said to be jointly Gaussian (normal) if, for all

$$t_1, t_2, \dots, t_m \in J$$

and

$$t'_1, t'_2, \dots, t'_n \in J',$$

the random variables

$$X(t_1), X(t_2), \dots, X(t_m), Y(t'_1), Y(t'_2), \dots, Y(t'_n)$$

are jointly normal.


Note that from the properties of jointly normal random variables, we can conclude that if two jointly Gaussian random processes $X(t)$ and $Y(t)$ are uncorrelated, i.e.,

$$C_{XY}(t_1, t_2) = 0, \quad \textrm{for all } t_1, t_2,$$

then $X(t)$ and $Y(t)$ are two independent random processes.
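To see why, note that zero cross-covariance makes the covariance matrix of any finite collection drawn from the two processes block diagonal; the determinant and the quadratic form in the joint PDF above then split, so the joint PDF factors:

$$C = \begin{bmatrix} C_X & 0 \\ 0 & C_Y \end{bmatrix} \quad \Longrightarrow \quad f_{\mathbf{X},\mathbf{Y}}(\mathbf{x}, \mathbf{y}) = f_{\mathbf{X}}(\mathbf{x}) \, f_{\mathbf{Y}}(\mathbf{y}),$$

which is exactly independence of the two collections.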

