
Discrete Time Random Process
Signals: Deterministic vs. Stochastic

Deterministic Signals:
Each value of these signals is fixed and can be determined by a
mathematical expression. The future values of any deterministic
signal can be calculated from past values.

Random signals:
These signals cannot be characterized by a simple, well-defined
mathematical equation, and their future values cannot be predicted.
Instead, we must use probability and statistics to analyze their
behavior.
Random signal theory
It is important for:
 Analysis of signals;
 Inference of underlying system parameters from noisy observed data;
 Design of optimal systems (digital and analogue signal recovery, signal classification, estimation, ...);
 Predicting system performance (error rates, signal-to-noise ratios, ...).
Examples
Example 1: Speech signals
Use probability theory to characterize that some sequences of
vowels and consonants are more likely than others, and that some
waveforms are more likely than others for a given vowel or
consonant.

Use this to achieve: speech recognition, speech coding, speech
enhancement, ...

Example 2: Digital communications
Characterize the properties of the digital data source (mobile
phone, digital television transmitter, ...) and the
noise/distortions present in the transmission channel.

Use this to achieve: accurate regeneration of the digital signal at
the receiver, analysis of the channel characteristics, ...
Probability theory
It is used to give a mathematical description
of the behavior of real-world systems which
involve elements of randomness.
 Coin-flipping experiment, in which we are
interested in whether 'Heads' or 'Tails' is the
outcome,
 Study of random errors in a coded digital data
stream (e.g. a CD recording or a digital
mobile phone).
Random Process
Definition:
A family or ensemble of signals that correspond
to every possible outcome of a certain signal
measurement. Each signal in this collection is
referred to as a realization or sample function of
the process.

 An indexed sequence of random variables


 Mapping from the sample space into an
ensemble of discrete time signals x(n)
 Collection of discrete time signals
Random Processes-Example
(1) Rolling a fair die: x(n) = A cos(nω0),
where A is a random variable taking values in {1, 2, ..., 6}:
an ensemble of 6 different, equally probable DT signals
(2) Flipping a fair coin:
 set the value of x(n) to +1 or -1
 x(n) is then a DT random process: a random sequence of +1's and -1's
 a "Bernoulli Process"
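The Bernoulli process above can be sketched numerically. A minimal simulation, assuming NumPy is available; the sample size and seed are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)

def bernoulli_process(n_samples, rng):
    """One realization of the Bernoulli process: each sample is an
    independent fair coin flip mapped to +1 or -1."""
    return rng.choice([-1, 1], size=n_samples)

x = bernoulli_process(1000, rng)

# The ensemble mean of the process is 0; the sample mean of a long
# realization should be close to it.
sample_mean = x.mean()
```

Each call with a fresh generator yields a different realization (sample function) from the ensemble.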
Alternate View of RP
 For a particular value of n, say n = n0, x(n0) is a random variable.

[Figure: ensemble of realizations x1(n), x2(n), ..., xi(n), one per outcome w1, w2, ..., wi, each sampled at n = n0]
Alternate View of RP
 For each outcome wi, there is a corresponding value of x(n0).
Therefore a random process can be viewed as an indexed sequence of
random variables:
. . . , x(-2), x(-1), x(0), x(1), x(2), . . .

[Figure: realizations x1(n), x2(n), ..., xi(n) plotted against n, with the values at n = n0 forming the random variable x(n0)]
Statistics of RP
 Probability distribution function of each random variable in the sequence:
Fx(n)(α) = Pr{x(n) ≤ α}
 Probability density function:
fx(n)(α) = d/dα {Fx(n)(α)}
 Joint probability distribution function:
Fx(n1)x(n2)...x(nk)(α1, α2, ..., αk)
= Pr{x(n1) ≤ α1, x(n2) ≤ α2, ..., x(nk) ≤ αk}
Ensemble Averages
First Order Statistics
 Mean:
mx(n)=E{x(n)}

 Variance:
σ2x(n)=E{|x(n)-mx(n)|2}

The first order statistics depend upon ‘n’
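The dependence of the first-order statistics on n can be checked by averaging across an ensemble of realizations at each fixed time index. A sketch using the die-rolling process x(n) = A cos(nω0), assuming NumPy; the ensemble size, ω0 value, and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of realizations of x(n) = A cos(n*w0), A = die roll in {1..6}.
# Rows are realizations, columns are time indices n.
w0 = 0.1
n = np.arange(100)
A = rng.integers(1, 7, size=(5000, 1))
ensemble = A * np.cos(w0 * n)

# Ensemble averages at each n: mean and variance across realizations.
m_x = ensemble.mean(axis=0)      # mx(n) = E{x(n)} = 3.5 cos(n*w0)
var_x = ensemble.var(axis=0)     # sigma_x^2(n) = Var(A) cos^2(n*w0)
```

Both statistics vary with n, so this process is not first-order stationary.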


Ensemble Averages
Second Order Statistics
 Autocovariance:
Cx(k,l) = E{[x(k)-mx(k)] [x(l)-mx(l)]*}
 Autocorrelation:
rx(k,l) = E{x(k)x(l)*}
relates the random variables x(k) and x(l)

NOTE:
(1) If k = l, then Cx(k,k) = σ2x(k) (the variance)
(2) Cx(k,l) = rx(k,l) - mx(k)mx(l)*
For a zero mean process, Cx(k,l) = rx(k,l)
(3) If Cx(k,l) = 0 for k ≠ l, then the random variables
x(k) and x(l) are uncorrelated
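Note (2) above can be checked numerically with ensemble estimates. A sketch assuming NumPy; the process (unit-variance Gaussian samples plus a constant mean) and the indices k, l are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ensemble of realizations (rows) of a process with nonzero mean mx(n) = 1.
X = 1.0 + rng.standard_normal((20000, 8))

k, l = 2, 5
r_kl = np.mean(X[:, k] * np.conj(X[:, l]))                 # rx(k,l) = E{x(k)x(l)*}
m_k, m_l = X[:, k].mean(), X[:, l].mean()                  # mx(k), mx(l)
c_kl = np.mean((X[:, k] - m_k) * np.conj(X[:, l] - m_l))   # Cx(k,l)

# Note (2): Cx(k,l) = rx(k,l) - mx(k) mx(l)*
```

For these sample estimators the identity holds exactly, up to floating-point error.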
Ensemble Averages
Second Order Statistics
 Cross covariance:
Cxy(k,l) = E{[x(k)-mx(k)] [y(l)-my(l)]*}
 Cross correlation:
rxy(k,l) = E{x(k)y(l)*}
relates the random variables x(k) and y(l) from two different processes

NOTE:
(1) Cxy(k,l) = rxy(k,l) - mx(k)my(l)*
For zero mean processes, Cxy(k,l) = rxy(k,l)
(2) If Cxy(k,l) = 0 for all k and l, then the random processes
x(n) and y(n) are uncorrelated; in that case rxy(k,l) = mx(k)my(l)*
(3) The random processes x(n) and y(n) are said to be orthogonal if
rxy(k,l) = 0
 Zero mean processes that are uncorrelated are orthogonal
Additive Property of Uncorrelated Random Processes
If the two random processes x(n) and y(n) are uncorrelated and zero
mean (so that rxy(k,l) = 0), then the autocorrelation of the sum
z(n) = x(n) + y(n)
is
rz(k,l) = rx(k,l) + ry(k,l)
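This additive property can be verified by simulation. A sketch assuming NumPy, with two independent zero-mean processes (Gaussian and uniform); sizes, indices, and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(3)

N = 50000
x = rng.standard_normal((N, 6))          # zero-mean Gaussian process
y = rng.uniform(-1, 1, size=(N, 6))      # independent zero-mean uniform process
z = x + y

def r(a, b, k, l):
    """Ensemble estimate of E{a(k) b(l)*}."""
    return np.mean(a[:, k] * np.conj(b[:, l]))

k, l = 1, 4
rz = r(z, z, k, l)
rx_plus_ry = r(x, x, k, l) + r(y, y, k, l)
# For uncorrelated zero-mean x and y: rz(k,l) = rx(k,l) + ry(k,l)
```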
Types of Random Process
(1) Continuous RP
(2) Discrete RP
(3) Stationary RP
(4) Non-stationary RP

Stationary RP:
 Strict sense stationary
 Wide sense stationary
Stationary Processes
 The statistics or ensemble averages of a RP are independent of time
 Statistical time invariance
First order stationary:
fx(n)(α) = fx(n+k)(α) for all k
The first order statistics are then independent of time:
Mean: mx(n) = mx
Variance: σx2(n) = σx2

Stationary Processes
Second order stationary:
fx(n1)x(n2)(α1, α2) = fx(n1+k)x(n2+k)(α1, α2) for all k
Also rx(k,l) = rx(k+n, l+n)
- invariant to a time shift
rx(k,l) = rx(k-l, 0)
rx(k,l) = rx(k-l)
Strict Sense Stationary
 A RP is said to be stationary of order L if the random processes
x(n) and x(n+k) have the same Lth order joint density functions.
 A process that is stationary for all L > 0 is said to be
stationary in the strict sense.
Wide Sense Stationary
 A random process x(n) is said to be wide sense stationary if the
following conditions are satisfied:
1. Mean: mx(n) = mx, a constant
2. ACF: rx(k,l) depends only on the difference k-l, i.e., rx(k,l) = rx(k-l)
3. Variance: Cx(0) < ∞ (finite)
Examples for WSS
 Gaussian random process
 Bernoulli process
 Harmonic process
 x(n) = A cos(nω0) with A ∈ {1,2,3,4,5,6} is NOT a WSS RP
Jointly WSS:
rxy(k,l) = rxy(k+n, l+n)
rxy(k,l) = rxy(k-l), where 'k-l' is called the lag
Properties of WSS Process
(1) Symmetry: rx(k) = rx*(-k)
For a real process, rx(k) is symmetric: rx(k) = rx(-k)
(2) Mean square value: rx(0) = E{|x(n)|2} ≥ 0
(3) Maximum value: rx(0) ≥ |rx(k)|
(4) Periodicity: If rx(k0) = rx(0) for some k0, then
rx(k) is periodic with period k0
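Properties (2) and (3) can be observed on a sample (biased) autocorrelation estimate, which is guaranteed to satisfy them. A sketch assuming NumPy; the moving-average coloring is an arbitrary way to make the process correlated:

```python
import numpy as np

rng = np.random.default_rng(4)

# A real WSS process: white noise mildly colored by a moving average.
x = np.convolve(rng.standard_normal(4096), np.ones(4) / 4, mode="same")

def sample_acf(x, max_lag):
    """Biased sample autocorrelation rx_hat(k) for k = 0..max_lag."""
    N = len(x)
    return np.array([np.dot(x[:N - k], x[k:]) / N for k in range(max_lag + 1)])

r = sample_acf(x, 10)
# Property (2): rx(0) = E{|x(n)|^2} >= 0
# Property (3): rx(0) >= |rx(k)| for every lag k
```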
Autocorrelation matrix
 Let x = [x(0), x(1), ..., x(p)]T
 The outer product is

         [ x(0) ]
         [ x(1) ]
  xxH =  [  .   ] [ x*(0), x*(1), . . . , x*(p) ]
         [  .   ]
         [ x(p) ]
Autocorrelation matrix

         [ x(0)x*(0)   x(0)x*(1)  . .  x(0)x*(p) ]
         [ x(1)x*(0)   x(1)x*(1)  . .  x(1)x*(p) ]
  xxH =  [     .           .       .       .     ]
         [     .           .       .       .     ]
         [ x(p)x*(0)   x(p)x*(1)  . .  x(p)x*(p) ]

If x(n) is a WSS RP, then rx(k) = rx*(-k)
Autocorrelation matrix

                  [ rx(0)    rx*(1)    rx*(2)    . .  rx*(p)   ]
                  [ rx(1)    rx(0)     rx*(1)    . .  rx*(p-1) ]
  Rx = E{xxH} =   [ rx(2)    rx(1)     rx(0)     . .  rx*(p-2) ]
                  [   .        .         .        .      .     ]
                  [ rx(p)    rx(p-1)   rx(p-2)   . .  rx(0)    ]
Autocovariance matrix

ACV matrix: Cx = E{(x-mx)(x-mx)H}
size: (p+1) x (p+1)

Relation: Cx = Rx - mxmxH
where mx = [mx, mx, ..., mx]T

For a zero mean process, Cx = Rx
Properties of Autocorrelation matrix
(1) Hermitian, Toeplitz matrix
(2) Nonnegative definite: Rx ≥ 0
(3) The eigenvalues of Rx are real valued and nonnegative

Hermitian: A = (A*)T
Toeplitz: the entries along each diagonal are equal,
aij = ai+1,j+1
White Noise
 A fundamental discrete RP
 A WSS process v(n) is said to be white if
Cv(k) = σv2 δ(k)
where δ(k) is the unit sample (Kronecker delta)
 The autocovariance is zero for all k ≠ 0
 A sequence of uncorrelated random variables
 Examples:
(1) White Gaussian noise
(2) Bernoulli process
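The defining autocovariance Cv(k) = σv2 δ(k) can be observed on a long white Gaussian noise realization. A sketch assuming NumPy; σv2 = 2, the sample size, and the seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(5)

# White Gaussian noise with variance sigma_v^2 = 2.
sigma2 = 2.0
v = np.sqrt(sigma2) * rng.standard_normal(100000)

def sample_acov(v, max_lag):
    """Biased sample autocovariance Cv_hat(k) for k = 0..max_lag."""
    N = len(v)
    v = v - v.mean()
    return np.array([np.dot(v[:N - k], v[k:]) / N for k in range(max_lag + 1)])

c = sample_acov(v, 5)
# Cv(k) = sigma_v^2 * delta(k): sigma_v^2 at lag 0, (near) zero elsewhere.
```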
Power Spectrum
 Frequency domain representation of the random process
 Fourier transform of an ensemble average (the autocorrelation)
 Power spectral density:

  Px(e^jω) = Σ (k = -∞ to ∞) rx(k) e^(-jωk)

  Px(z) = Σ (k = -∞ to ∞) rx(k) z^(-k)

 Given the power spectrum, the autocorrelation is recovered from

  rx(k) = (1/2π) ∫ (-π to π) Px(e^jω) e^(jωk) dω
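These two relations can be evaluated numerically for a finite-length autocorrelation sequence. A sketch assuming NumPy, with the hypothetical ACF rx(0) = 2, rx(±1) = 1, rx(k) = 0 otherwise, for which Px(e^jω) = 2 + 2cos(ω):

```python
import numpy as np

def psd(r_nonneg, w):
    """Px(e^jw) = sum_k rx(k) e^{-jwk}, using rx(-k) = rx*(k);
    r_nonneg holds rx(0), rx(1), ..., rx(p)."""
    Px = np.full_like(w, r_nonneg[0], dtype=complex)
    for k in range(1, len(r_nonneg)):
        Px += r_nonneg[k] * np.exp(-1j * k * w)
        Px += np.conj(r_nonneg[k]) * np.exp(1j * k * w)
    return Px.real  # the PSD of a WSS process is real

w = np.linspace(-np.pi, np.pi, 1001)
Px = psd(np.array([2.0, 1.0]), w)   # Px(e^jw) = 2 + 2 cos(w)

# Inverse relation at k = 0: rx(0) = (1/2pi) * integral of Px over [-pi, pi],
# approximated on the uniform grid by the mean over one period.
r0 = Px[:-1].mean()
```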
Properties of Power Spectrum
 Symmetry: Px(e^jω) = Px*(e^jω), i.e., the power spectrum is real;
  Px(z) = Px*(1/z*)
Also, if x(n) is real, then Px(e^jω) = Px(e^(-jω)) and Px(z) = Px*(z*)
 Positivity: Px(e^jω) ≥ 0
 Total power:

  E{|x(n)|^2} = (1/2π) ∫ (-π to π) Px(e^jω) dω

 Eigenvalue property: the eigenvalues λi of the (p+1)x(p+1)
autocorrelation matrix Rx are bounded by the power spectrum,

  min Px(e^jω) ≤ λi ≤ max Px(e^jω)
Wiener-Khintchine Theorem
The power spectral density of a WSS process is the discrete-time
Fourier transform of its autocorrelation sequence.
Filtering the Random Process

  x(n) --> [ h(n) ] --> y(n)
  WSS RP input; linear shift invariant filter; output process

  y(n) = x(n) * h(n) = Σk h(k) x(n-k)

Mean of y(n):
  E{y(n)} = mx H(e^j0)
Filtering the Random Process
Cross correlation:
  rxy(k) = rx(k) ⊛ h*(-k)
Autocorrelation of y(n):
  ry(k) = rxy(k) ⊛ h(k) = rx(k) ⊛ h(k) ⊛ h*(-k)
where ⊛ denotes convolution.
Power spectrum of y(n):
  Py(e^jω) = Px(e^jω) |H(e^jω)|^2
  Py(z) = Px(z) H(z) H*(1/z*)
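These filtering relations can be checked by passing white noise (rx(k) = σ²δ(k)) through a short FIR filter, for which ry(k) reduces to σ² Σn h(n+k)h*(n). A simulation sketch assuming NumPy; the filter taps, sample size, and seed are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(7)

# Unit-variance white noise through an LSI (FIR) filter h(n).
x = rng.standard_normal(200000)
h = np.array([1.0, 0.5, 0.25])
y = np.convolve(x, h)[: len(x)]

def sample_acf(y, max_lag):
    """Biased sample autocorrelation ry_hat(k) for k = 0..max_lag."""
    N = len(y)
    return np.array([np.dot(y[:N - k], y[k:]) / N for k in range(max_lag + 1)])

ry = sample_acf(y, 2)

# ry(k) = rx(k) (conv) h(k) (conv) h*(-k); for white x with sigma^2 = 1
# this is sum_n h(n+k) h(n):
ry_theory = np.array([h @ h, h[0] * h[1] + h[1] * h[2], h[0] * h[2]])
```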
Topics to Follow
 Spectral factorization
 Bias and Consistency
