
EE 179, Lecture 23, Handout #39

Random Signals
A random signal is one of an ensemble of possible signals, either discrete time (a time series) or continuous time, such as white noise.

A random process (or stochastic process) is an infinite indexed collection of random variables {X(t) : t ∈ T}, defined over a common probability space.

The index parameter t is typically time, but can also be a spatial dimension.

Random processes are used to model random experiments that evolve in time:
- Received sequence/waveform at the output of a communication channel
- Packet arrival times at a node in a communication network
- Thermal noise in a resistor
- Scores of an NBA team in consecutive games
- Daily price of a stock
- Winnings or losses of a gambler


Two Ways to View a Random Process


A random process can be viewed as a function X(t, ω) of two variables, time t ∈ T and the outcome ω ∈ Ω of the underlying random experiment.

For fixed t, X(t, ω) is a random variable over Ω.

For fixed ω, X(t, ω) is a deterministic function of t, called a sample function.


[Figure: three sample functions X(t, ω1), X(t, ω2), X(t, ω3) plotted versus t; at fixed times t1 and t2 the vertical slices X(t1, ω) and X(t2, ω) are random variables.]
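To make the two views concrete, here is a minimal NumPy sketch (the process X(t, ω) = ω·t and all parameter values are illustrative assumptions, not from the slides): fixing t gives a random variable across outcomes, while fixing ω gives a deterministic sample function of t.

import numpy as np

rng = np.random.default_rng(0)

# Illustrative process: X(t, omega) = omega * t, with omega ~ U[0, 1].
t = np.linspace(0.0, 1.0, 101)           # time grid
omegas = rng.uniform(0.0, 1.0, size=5)   # five outcomes of the experiment

X = np.outer(omegas, t)                  # X[i, k] = X(t[k], omega[i])

# View 1: fix t (a column) -> a random variable over the outcomes.
print("X(t=0.5, .) across outcomes:", X[:, 50])

# View 2: fix omega (a row) -> a deterministic sample function of t.
print("sample function for omega[0]:", X[0, :5], "...")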

Discrete-Time Random Process Example


Let Z ~ U[0, 1], and define the discrete-time process Xn = Z^n for n ≥ 1.
Sample paths:
[Figure: sample paths xn versus n. For Z = 1/2: xn = 1/2, 1/4, 1/8, 1/16, ...; for Z = 1/4: xn = 1/4, 1/16, 1/64, ...; for Z = 0: xn = 0 for all n.]
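A short simulation sketch (assuming NumPy; not part of the original handout) that draws a few values of Z and prints the corresponding sample paths Xn = Z^n:

import numpy as np

rng = np.random.default_rng(1)
n = np.arange(1, 9)                       # n = 1, ..., 8

for Z in rng.uniform(0.0, 1.0, size=3):   # three realizations of Z ~ U[0, 1]
    x = Z ** n                            # the entire sample path is fixed once Z is drawn
    print(f"Z = {Z:.3f}: x_n =", np.round(x, 4))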

Continuous-Time Random Process Example


Sinusoidal signal with random phase:

X(t) = A cos(ωc t + θ),   t ≥ 0

where θ ~ U[0, 2π] and A and ωc are constants.


Sample functions:

[Figure: sample functions x(t) for three different values of the random phase θ.]

Characterization of Random Process


Some random processes can be described analytically. E.g.,

x(t) = A cos(ωc t + θ)

where θ is uniformly distributed in the range [0, 2π). Sample functions are sinusoids with random phase.

In general, a random process is described by the joint CDF of n random variables of the process, for all n:

F_{X(t1) X(t2) ··· X(tn)}(x1, x2, ..., xn) = P{X(t1) ≤ x1, X(t2) ≤ x2, ..., X(tn) ≤ xn}

Kolmogorov showed that if these CDFs are consistent for all n, then the random process is well defined.
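As an illustrative sketch (assuming NumPy; the amplitude, frequency, and observation time are my own choices), the first-order CDF of the random-phase cosine at a fixed time can be estimated from an ensemble of realizations:

import numpy as np

rng = np.random.default_rng(2)
A, wc, t1 = 1.0, 2 * np.pi, 0.3                      # assumed amplitude, frequency, observation time

theta = rng.uniform(0.0, 2 * np.pi, size=100_000)    # ensemble of random phases
samples = A * np.cos(wc * t1 + theta)                # X(t1) across the ensemble

# Empirical first-order CDF P{X(t1) <= x} at a few points.
# For this process the exact CDF is 1 - arccos(x/A)/pi for |x| <= A.
for x in (-0.5, 0.0, 0.5):
    print(f"P{{X(t1) <= {x:+.1f}}} ≈ {np.mean(samples <= x):.3f}")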


Ensemble with Finite Number of Sample Functions


Shown below are sample functions of a binary polar random process. Later we will calculate the frequency content of this process.

[Figure: sample functions of a binary polar random process.]


Mean and Autocorrelation


The mean of a random process is determined by the first-order PDF:

E(X(t)) = ∫ x p_X(x; t) dx

The autocorrelation is determined by the second-order PDF:

R_X(t1, t2) = E(X(t1) X(t2)) = ∫∫ x1 x2 p_X(x1, x2; t1, t2) dx1 dx2

The autocorrelation function gives information about the frequency content of the random process.
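A short numerical sketch (assuming NumPy) of these ensemble averages for the earlier discrete-time example Xn = Z^n with Z ~ U[0, 1], where the exact values E(Xn) = 1/(n+1) and R_X(n, m) = 1/(n+m+1) follow from E(Z^k) = 1/(k+1):

import numpy as np

rng = np.random.default_rng(3)

# Ensemble estimates for X_n = Z^n, Z ~ U[0, 1].
Z = rng.uniform(0.0, 1.0, size=200_000)
n, m = 2, 5

mean_est = np.mean(Z ** n)                # estimate of E(X_n) = 1/(n + 1)
R_est = np.mean((Z ** n) * (Z ** m))      # estimate of R_X(n, m) = E(Z^(n+m)) = 1/(n + m + 1)

print("E(X_2)   estimate:", round(mean_est, 4), " exact:", 1 / (n + 1))
print("R_X(2,5) estimate:", round(R_est, 4), " exact:", 1 / (n + m + 1))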


Autocorrelation Examples


Strong Sense Stationary


A random process is strictly stationary (strong-sense stationary) if time shifts do not change probabilities. For all n, τ, x1, ..., xn,

P{X(t1) ≤ x1, ..., X(tn) ≤ xn} = P{X(t1 + τ) ≤ x1, ..., X(tn + τ) ≤ xn}

In particular, the first-order PDF is the same for every t:

E(X(t1)) = ∫ x p_X(x; t1) dx = ∫ x p_X(x; t2) dx = E(X(t2))

The autocorrelation function of an SSS random process depends only on the difference t2 − t1:

R_X(t1, t2) = E(X(t1) X(t2)) = E(X(t1 + τ) X(t2 + τ))

We write the autocorrelation as a function of the delay:

R_X(τ) = R_X(t2 − t1)

Wide-Sense (Weakly) Stationary


A random process is wide-sense stationary (WSS) if its mean and autocorrelation are time invariant:

E(X(t)) = constant
R_X(t1, t2) = R_X(t2 − t1) = E(X(t1) X(t2))

The power of a WSS random process is also time invariant:

E(X(t)^2) = R_X(0)

Important facts about autocorrelation:
- The maximum value of |R_X(τ)| occurs at τ = 0.
- If R_X(τ0) = R_X(0) for some τ0 ≠ 0, then X(t) is periodic (and conversely).
- The PSD of a WSS random process is S_X(f) = F{R_X(τ)}.
- The total power of a WSS random process is ∫ S_X(f) df over (−∞, ∞) = 2 ∫ S_X(f) df over (0, ∞).
- For complex-valued random processes, the autocorrelation is defined as R_X(t1, t2) = E(X(t1) X*(t2)).
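A small numerical sketch (assuming NumPy; the example autocorrelation R_X(τ) = exp(−|τ|) is my own choice, not from the slides) illustrating that the PSD is the Fourier transform of the autocorrelation and that the total power equals R_X(0):

import numpy as np

# Example WSS autocorrelation (assumed for illustration): R_X(tau) = exp(-|tau|).
dt = 0.001
tau = np.arange(-50.0, 50.0, dt)
R = np.exp(-np.abs(tau))

# PSD as the Fourier transform of R_X (Riemann-sum approximation of the integral).
f = np.fft.fftfreq(tau.size, d=dt)
S = np.real(np.fft.fft(np.fft.ifftshift(R))) * dt   # approximates S_X(f) = 2 / (1 + (2*pi*f)^2)

total_power = np.sum(S) * (f[1] - f[0])             # integral of S_X(f) df
print("total power from PSD:", round(total_power, 4))
print("R_X(0):", R[tau.size // 2])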



PSD of Low-Pass White Noise


White noise with PSD N0/2 is low-pass filtered to bandwidth B:

S_X(f) = (N0/2) Π(f / 2B)   ⟺   R_X(τ) = N0 B sinc(2Bτ)
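A simulation sketch (assuming NumPy; the values of N0, B, the sampling rate, and the brick-wall FFT filter are illustrative choices) that generates approximately band-limited white noise and compares its sample power with the theoretical R_X(0) = N0·B:

import numpy as np

rng = np.random.default_rng(4)

N0, B = 2.0, 5.0            # assumed PSD level N0/2 = 1 and bandwidth B = 5 Hz
fs = 1000.0                 # sampling rate, Hz
N = 1_000_000               # number of samples

# White Gaussian noise with two-sided PSD N0/2 up to fs/2: variance = (N0/2) * fs.
w = rng.normal(0.0, np.sqrt(N0 / 2 * fs), size=N)

# Ideal low-pass filtering to |f| <= B via the FFT (brick-wall, for illustration only).
f = np.fft.fftfreq(N, d=1 / fs)
W = np.fft.fft(w)
W[np.abs(f) > B] = 0.0
x = np.real(np.fft.ifft(W))

print("sample power :", round(np.mean(x ** 2), 3))
print("theory N0*B  :", N0 * B)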


Sample Functions of Low-Pass White Noise


[Figure: four sample functions of low-pass white noise plotted versus time.]

Random Phase Cosine


Let X(t) = A cos(ωc t + θ) where θ is random from [0, 2π).
Once θ is chosen, the signal realization is known.

The random phase process is wide-sense stationary.



PSD of Random Phase Cosine


The random phase cosine process is WSS.

E(X(t)) = E[A cos(ωc t + θ)] = (1/(2π)) ∫ over [0, 2π) A cos(ωc t + θ) dθ = 0

R_X(t1, t2) = E[A cos(ωc t1 + θ) · A cos(ωc t2 + θ)]
            = (1/2) A^2 [cos(ωc(t2 − t1)) + E(cos(ωc(t2 + t1) + 2θ))]
            = (1/2) A^2 cos(ωc(t2 − t1))

The mean is constant, and the autocorrelation depends only on t2 − t1. Therefore the process is WSS.

The random phase cosine process is in fact SSS. (Exercise for the reader.)
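A Monte Carlo sketch (assuming NumPy; the amplitude and carrier frequency are illustrative) that checks this result: the ensemble-averaged autocorrelation should match (1/2) A^2 cos(ωc (t2 − t1)) regardless of where the pair (t1, t2) sits on the time axis:

import numpy as np

rng = np.random.default_rng(5)
A, wc = 2.0, 2 * np.pi * 3.0               # assumed amplitude and carrier (3 Hz)
theta = rng.uniform(0.0, 2 * np.pi, size=500_000)

def R_est(t1, t2):
    # Ensemble-average estimate of R_X(t1, t2).
    return np.mean(A * np.cos(wc * t1 + theta) * A * np.cos(wc * t2 + theta))

# Same lag t2 - t1 = 0.05 s at two different absolute positions on the time axis.
print("R_est(0.00, 0.05):", round(R_est(0.00, 0.05), 3))
print("R_est(1.30, 1.35):", round(R_est(1.30, 1.35), 3))
print("theory 0.5*A^2*cos(wc*0.05):", round(0.5 * A**2 * np.cos(wc * 0.05), 3))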

Random Binary Process


A discrete-time random process is not stationary because the signals change at specific times, multiples of Tb.

A standard trick to make the process stationary is to shift by a random phase. In other words, let time t = 0 be random.

The random waveforms can be written in terms of the phase shift Δ:

X(t) = Σ over n of a_n p(t − nTb − Δ),   Δ uniform on [0, Tb]

We can use this formula to find the autocorrelation.



Random Binary Process (cont.)


If t2 > t1 + Tb, then X(t1) and X(t2) are independent:

R_X(t1, t2) = E(X(t1) X(t2)) = E(X(t1)) E(X(t2)) = 0 · 0 = 0

If |τ| = |t2 − t1| < Tb, then the pulses overlap, and the overlap decreases as |τ| → Tb. As shown in the figure,

R_X(τ) = 1 − |τ|/Tb for |τ| ≤ Tb (and 0 otherwise)   ⟺   S_X(f) = Tb sinc^2(Tb f)

As expected, most of the power of the binary process is contained within 1/Tb Hz.
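A simulation sketch (assuming NumPy, full-width rectangular pulses, and polar amplitudes a_n = ±1, which is how I read the slides) that estimates the autocorrelation of the randomly delayed binary waveform and compares it with the triangle 1 − |τ|/Tb:

import numpy as np

rng = np.random.default_rng(6)
Tb = 1.0                                      # bit duration (assumed)
M = 200_000                                   # ensemble size
t1 = 2.5                                      # reference time

delta = rng.uniform(0.0, Tb, size=M)          # random delay, one per realization
bits = rng.choice([-1.0, 1.0], size=(M, 8))   # polar bits a_n = +/-1

def X(t):
    # X(t) = a_n with n = floor((t - delta)/Tb), evaluated per realization.
    n = np.floor((t - delta) / Tb).astype(int)
    return bits[np.arange(M), n]

for tau in (0.0, 0.25, 0.5, 0.75, 1.0, 1.5):
    R_hat = np.mean(X(t1) * X(t1 + tau))
    R_theory = max(0.0, 1.0 - tau / Tb)       # triangle, zero beyond Tb
    print(f"tau = {tau:4.2f}: estimate {R_hat:+.3f}, theory {R_theory:+.3f}")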