6 Random Signal Analysis

• Random Variables and Random Processes
• Signal Transmission through a Linear System

Discrete Random Variables

• A discrete random variable takes on a countable number of possible values.
  Suppose that a discrete random variable X takes on one of the values
  $x_1, \ldots, x_n$.

• Distribution functions:

  – Probability Mass Function: $p(x_i) = \Pr\{X = x_i\}$, with $\sum_{i=1}^{n} p(x_i) = 1$

  – Cumulative Distribution Function: $F(a) = \Pr\{X \le a\} = \sum_{x_i \le a} p(x_i)$

• Moments:

  – Expected Value, or Mean: $\mu_X = E[X] = \sum_{i=1}^{n} x_i\, p(x_i)$

  – The m-th Moment: $E[X^m] = \sum_{i=1}^{n} x_i^m\, p(x_i)$, $m = 1, 2, \ldots$
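As a quick sketch, the two sums above can be evaluated directly in code; the pmf values here are hypothetical, chosen only to illustrate the formulas:

```python
# Moments of a discrete random variable from its pmf.
# The pmf below is a hypothetical example: Pr{X = x_i} for x_i in {1, 2, 3}.
pmf = {1: 0.2, 2: 0.5, 3: 0.3}

# The probabilities must sum to 1: sum_i p(x_i) = 1.
assert abs(sum(pmf.values()) - 1.0) < 1e-9

def moment(pmf, m):
    """m-th moment: E[X^m] = sum_i x_i^m * p(x_i)."""
    return sum((x ** m) * p for x, p in pmf.items())

mean = moment(pmf, 1)    # E[X]   = 1*0.2 + 2*0.5 + 3*0.3 = 2.1
second = moment(pmf, 2)  # E[X^2] = 1*0.2 + 4*0.5 + 9*0.3 = 4.9
```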

Continuous Random Variables

• A continuous random variable has an uncountable set of possible values.
  X is a continuous random variable if there exists a nonnegative function f,
  defined for all real $x \in (-\infty, \infty)$, having the property that for
  any set B of real numbers, $\Pr\{X \in B\} = \int_B f(x)\,dx$.

• f is called the probability density function (pdf) of X, denoted $f_X(x)$, with
  $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$

• Cumulative Distribution Function (cdf): $F_X(a) = \int_{-\infty}^{a} f_X(x)\,dx$

• Expected Value, or Mean: $\mu_X = E[X] = \int_{-\infty}^{\infty} x\, f_X(x)\,dx$

• The m-th Moment: $E[X^m] = \int_{-\infty}^{\infty} x^m f_X(x)\,dx$, $m = 1, 2, \ldots$

Variance

• The variance of a random variable X is defined as:

  $\mathrm{Var}[X] = E\big[(X - E[X])^2\big]$

• Var[X] describes how far X is from its mean on average.

• Var[X] can also be obtained as: $\mathrm{Var}[X] = E[X^2] - (E[X])^2$

• Var[X] is usually denoted as $\sigma_X^2$.

• The square root of Var[X], $\sigma_X$, is called the standard deviation of X.
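The two expressions for the variance can be checked numerically; the pmf of a fair six-sided die is used here purely as an illustration:

```python
# Check Var[X] = E[(X - E[X])^2] = E[X^2] - (E[X])^2 for a fair six-sided die.
values = [1, 2, 3, 4, 5, 6]
p = 1 / 6  # uniform pmf over the six faces

mean = sum(x * p for x in values)                        # E[X] = 3.5
var_def = sum(((x - mean) ** 2) * p for x in values)     # E[(X - E[X])^2]
var_alt = sum((x ** 2) * p for x in values) - mean ** 2  # E[X^2] - (E[X])^2

# Both definitions agree; for the die the common value is 35/12.
assert abs(var_def - var_alt) < 1e-9
```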

Example 1. Uniform Distribution

• X is a uniform random variable on the interval $(\alpha, \beta)$ if its pdf is given by

  $f_X(x) = \begin{cases} \dfrac{1}{\beta - \alpha}, & \alpha \le x \le \beta \\ 0, & \text{otherwise} \end{cases}$

  [Figure: rectangular pdf $f_X(x)$ of height $1/(\beta-\alpha)$ on $(\alpha, \beta)$.]

• cdf: $F_X(a) = \int_{-\infty}^{a} f_X(x)\,dx = \begin{cases} 0, & a \le \alpha \\ \dfrac{a - \alpha}{\beta - \alpha}, & \alpha < a < \beta \\ 1, & a \ge \beta \end{cases}$

• Mean: $\mu_X = \int_{-\infty}^{\infty} x\, f_X(x)\,dx = \dfrac{\alpha + \beta}{2}$

• The second moment: $E[X^2] = \int_{-\infty}^{\infty} x^2 f_X(x)\,dx = \dfrac{\alpha^2 + \alpha\beta + \beta^2}{3}$

• Variance: $\sigma_X^2 = E[X^2] - \mu_X^2 = \dfrac{(\beta - \alpha)^2}{12}$
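A short simulation can sanity-check the closed-form mean and variance; the endpoints alpha = 2 and beta = 5 are arbitrary illustrative values:

```python
import random
import statistics

# Compare the uniform(alpha, beta) mean and variance formulas against
# sample estimates from simulated draws.
alpha, beta = 2.0, 5.0
random.seed(0)
samples = [random.uniform(alpha, beta) for _ in range(200_000)]

mean_formula = (alpha + beta) / 2       # (2 + 5) / 2 = 3.5
var_formula = (beta - alpha) ** 2 / 12  # 9 / 12 = 0.75

# Sample statistics converge to the formulas as the sample size grows.
assert abs(statistics.fmean(samples) - mean_formula) < 0.02
assert abs(statistics.pvariance(samples) - var_formula) < 0.02
```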

Example 2. Gaussian (Normal) Distribution

• X is a Gaussian random variable with parameters $\mu_0$ and $\sigma_0^2$ if its pdf is given by

  $f_X(x) = \dfrac{1}{\sqrt{2\pi}\,\sigma_0} \exp\!\left(-\dfrac{(x - \mu_0)^2}{2\sigma_0^2}\right)$

  [Figure: bell-shaped pdf $f_X(x)$ centered at $\mu_0$.]

• X is denoted as $X \sim \mathcal{N}(\mu_0, \sigma_0^2)$.

• Mean: $\mu_X = \mu_0$    • Variance: $\sigma_X^2 = \sigma_0^2$

• cdf: $F_X(a) = \int_{-\infty}^{a} f_X(x)\,dx = 1 - \int_a^{\infty} \dfrac{1}{\sqrt{2\pi}\,\sigma_0} \exp\!\left(-\dfrac{(x - \mu_0)^2}{2\sigma_0^2}\right)dx$

  Substituting $z = \dfrac{x - \mu_0}{\sigma_0}$,

  $F_X(a) = 1 - \int_{(a - \mu_0)/\sigma_0}^{\infty} \dfrac{1}{\sqrt{2\pi}} \exp\!\left(-\dfrac{z^2}{2}\right)dz = 1 - Q\!\left(\dfrac{a - \mu_0}{\sigma_0}\right)$

  where $Q(\alpha) = \int_{\alpha}^{\infty} \dfrac{1}{\sqrt{2\pi}} \exp\!\left(-\dfrac{x^2}{2}\right)dx$

More about Q Function

$Q(\alpha) = \int_{\alpha}^{\infty} \dfrac{1}{\sqrt{2\pi}} \exp\!\left(-\dfrac{x^2}{2}\right)dx$

• $Q(\alpha)$ is a decreasing function of $\alpha$.

• For $X \sim \mathcal{N}(\mu_0, \sigma_0^2)$,

  $\Pr\{X > a\} = \int_a^{\infty} f_X(x)\,dx = \int_a^{\infty} \dfrac{1}{\sqrt{2\pi}\,\sigma_0} \exp\!\left(-\dfrac{(x - \mu_0)^2}{2\sigma_0^2}\right)dx = Q\!\left(\dfrac{a - \mu_0}{\sigma_0}\right)$

  [Figure: Gaussian pdf $f_X(x)$ with the shaded tail to the right of $a$ equal to $\Pr\{X > a\}$.]

Read the textbook for more discussion of the Q function.
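Q has no closed form, but it can be sketched with the standard library's complementary error function, since Q(a) = erfc(a / sqrt(2)) / 2:

```python
import math

# Gaussian Q function via the complementary error function:
# Q(a) = 0.5 * erfc(a / sqrt(2)).
def Q(a):
    return 0.5 * math.erfc(a / math.sqrt(2))

# Q(0) = 0.5: half of a zero-mean Gaussian's probability mass lies above 0.
assert abs(Q(0.0) - 0.5) < 1e-12

# Q is decreasing in its argument.
assert Q(1.0) > Q(2.0) > Q(3.0)

# Tail probability of X ~ N(mu0, sigma0^2): Pr{X > a} = Q((a - mu0) / sigma0);
# the parameter values are arbitrary illustrative choices.
mu0, sigma0, a = 1.0, 2.0, 3.0
tail = Q((a - mu0) / sigma0)  # Q(1) ≈ 0.1587
```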

Random Processes

• Sample values of a random process at times t1, t2, … are a collection of
  random variables {X(t1), X(t2), …}.

  – Continuous-time random process: $t \in \mathbb{R}$ (set of real numbers)

  – Discrete-time random process: $t \in \mathbb{Z}$ (set of integers)

• Statistical description of random process X(t)

  – A complete statistical description of a random process X(t) is known if,
    for any integer n and any choice of $(t_1, \ldots, t_n) \in \mathbb{R}^n$,
    the joint pdf of $(X(t_1), \ldots, X(t_n))$ is given.

  – Such a complete description is difficult to obtain!

Statistical Averages

• The mean of the random process X(t):

  $\mu_X(t_k) = E[X(t_k)] = \int_{-\infty}^{\infty} x\, f_{X(t_k)}(x)\,dx$

  X(t_k) is the random variable obtained by observing the random process X(t)
  at time t_k, with the pdf $f_{X(t_k)}(x)$.

• The autocorrelation function of the random process X(t):

  $R_X(t_1, t_2) = E[X(t_1)X(t_2)] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1 x_2\, f_{X(t_1),X(t_2)}(x_1, x_2)\,dx_1\,dx_2$

  $f_{X(t_1),X(t_2)}(x_1, x_2)$ is the joint pdf of X(t_1) and X(t_2).
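Both ensemble averages can be estimated by Monte Carlo over many sample paths. The process below is a hypothetical example, X(t) = A·cos(2πt) with A uniform on (0, 1), for which the exact answers are μ_X(t) = E[A]·cos(2πt) and R_X(t1, t2) = E[A²]·cos(2πt1)·cos(2πt2):

```python
import math
import random

# Monte Carlo estimate of the ensemble mean and autocorrelation of the
# (hypothetical) process X(t) = A * cos(2*pi*t), A ~ uniform(0, 1).
random.seed(1)
t1, t2 = 0.1, 0.3
n = 100_000

mean_est = 0.0
autocorr_est = 0.0
for _ in range(n):
    a = random.random()                  # one realization of the amplitude A
    x1 = a * math.cos(2 * math.pi * t1)  # X(t1) on this sample path
    x2 = a * math.cos(2 * math.pi * t2)  # X(t2) on this sample path
    mean_est += x1
    autocorr_est += x1 * x2
mean_est /= n        # estimates mu_X(t1) = E[X(t1)]
autocorr_est /= n    # estimates R_X(t1, t2) = E[X(t1) X(t2)]

# Exact values: E[A] = 1/2 and E[A^2] = 1/3 for A ~ uniform(0, 1).
mean_exact = 0.5 * math.cos(2 * math.pi * t1)
autocorr_exact = (1 / 3) * math.cos(2 * math.pi * t1) * math.cos(2 * math.pi * t2)
assert abs(mean_est - mean_exact) < 0.01
assert abs(autocorr_est - autocorr_exact) < 0.01
```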

Power and Power Spectrum of Random Signal


Time Domain

• Deterministic signal s(t): $P_s = \lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} |s(t)|^2\,dt$

• Random signal (described as a random process X(t)):

  $P_X = E\!\left[\lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} X^2(t)\,dt\right] = \lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} R_X(t, t)\,dt$

Frequency Domain

• Deterministic signal: $P_s = \int_{-\infty}^{\infty} G_s(f)\,df$, where the power spectrum is

  $G_s(f) = \lim_{T\to\infty} \dfrac{1}{T}\,|S_T(f)|^2 = \mathcal{F}\!\left\{\lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} s(t+\tau)\,s^*(t)\,dt\right\}$

• Random signal: $P_X = \int_{-\infty}^{\infty} G_X(f)\,df$, where

  $G_X(f) = \lim_{T\to\infty} \dfrac{1}{T}\,E\!\left[|X_T(f)|^2\right] = \mathcal{F}\!\left\{\lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} R_X(t+\tau, t)\,dt\right\}$

Example 1. Wide-Sense Stationary (WSS) Processes

• A random process X(t) is wide-sense stationary (WSS) if the following
  conditions are satisfied:

  – $\mu_X(t) = E[X(t)]$ is independent of t;

  – $R_X(t_1, t_2)$ depends only on the time difference $\tau = t_1 - t_2$, and
    not on t_1 and t_2 individually.

• Power: $P_X = \lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} R_X(t, t)\,dt = R_X(0)$

• Power spectrum: $G_X(f) = \mathcal{F}\!\left\{\lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} R_X(t+\tau, t)\,dt\right\} = \mathcal{F}\{R_X(\tau)\}$

Example 2. Cyclostationary Processes

• A random process X(t) with mean $\mu_X(t)$ and autocorrelation function
  $R_X(t+\tau, t)$ is called cyclostationary if both the mean and the
  autocorrelation are periodic in t with some period $T_0$, i.e., if

  – $\mu_X(t + T_0) = \mu_X(t)$

  – $R_X(t + \tau + T_0,\, t + T_0) = R_X(t + \tau,\, t)$

• Power: $P_X = \lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} R_X(t, t)\,dt = \dfrac{1}{T_0}\int_{-T_0/2}^{T_0/2} R_X(t, t)\,dt$

• Power spectrum: $G_X(f) = \mathcal{F}\!\left\{\lim_{T\to\infty} \dfrac{1}{T}\int_{-T/2}^{T/2} R_X(t+\tau, t)\,dt\right\} = \mathcal{F}\!\left\{\dfrac{1}{T_0}\int_{-T_0/2}^{T_0/2} R_X(t+\tau, t)\,dt\right\}$

Signal Transmission through a Linear System

Linear Time Invariant (LTI) System

Random signal X(t) → [Impulse response h(t)] → $Y(t) = X(t) * h(t) = \int_{-\infty}^{\infty} X(\tau)\,h(t - \tau)\,d\tau$

• If a WSS random process X(t) passes through an LTI system with impulse
  response h(t), the output process Y(t) will also be WSS, with

  mean: $\mu_Y = \mu_X \int_{-\infty}^{\infty} h(t)\,dt = \mu_X H(0)$

  autocorrelation: $R_Y(\tau) = R_X(\tau) * h(\tau) * h(-\tau)$

  and power spectrum: $G_Y(f) = G_X(f)\,|H(f)|^2$
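A discrete-time sketch of the mean relation μ_Y = μ_X·H(0): for an FIR filter the DC gain H(0) is the sum of the taps, so a WSS input with mean μ_X yields an output with mean μ_X·Σh[k]. The taps and input statistics here are hypothetical illustrative choices:

```python
import random

# Mean of a WSS input propagating through an FIR (LTI) filter:
# mu_Y = mu_X * H(0), where H(0) = sum of the filter taps.
random.seed(2)
h = [0.25, 0.5, 0.25]  # example impulse response; H(0) = 1.0
mu_x = 3.0
n = 200_000
# WSS input: constant mean mu_x plus zero-mean uniform noise.
x = [mu_x + random.uniform(-1, 1) for _ in range(n)]

# y[i] = sum_k h[k] * x[i-k]  (convolution, skipping the initial edge samples)
y = [sum(h[k] * x[i - k] for k in range(len(h))) for i in range(len(h) - 1, n)]

mu_y_est = sum(y) / len(y)
mu_y_theory = mu_x * sum(h)  # 3.0 * 1.0 = 3.0
assert abs(mu_y_est - mu_y_theory) < 0.01
```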

Example 3. Gaussian Processes

• A random process X(t) is a Gaussian process if for all n and all
  $(t_1, \ldots, t_n)$, the random variables $\{X(t_i)\}_{i=1}^{n}$ have a
  jointly Gaussian pdf.

• For Gaussian processes, knowledge of the mean and autocorrelation gives a
  complete statistical description of the process.

• If a Gaussian process X(t) is passed through an LTI system, the output
  process Y(t) will also be a Gaussian process.

Example 4. White Processes

• A random process X(t) is called a white process if it has a flat spectral
  density, i.e., if $G_X(f)$ is a constant for all f:

  $G_X(f) = \dfrac{N_0}{2}$, where $N_0/2$ is the two-sided power spectral density.

• Power: $P_X = \int_{-\infty}^{\infty} G_X(f)\,df = \infty$

• Autocorrelation: $R_X(\tau) = \dfrac{N_0}{2}\,\delta(\tau)$

• If a white process X(t) passes through an LTI system with impulse response
  h(t), the output process Y(t) will no longer be white.

  Power spectrum of Y(t): $G_Y(f) = \dfrac{N_0}{2}\,|H(f)|^2$

  Power of Y(t): $P_Y = \dfrac{N_0}{2}\int_{-\infty}^{\infty} |H(f)|^2\,df = \dfrac{N_0}{2}\int_{-\infty}^{\infty} h^2(t)\,dt$
