
Signal modeling I

IN5340/9340

Andreas Austeng
March 2022
Department of Informatics, University of Oslo

Outline

Introduction

Filtering of random processes

Spectral factorization
Motivation
Rational power spectra
Filter parameters ↔ Auto-correlation Sequence

What do we learn

• Filtering of stochastic processes


S. M. Alessio
Digital Signal Processing and Spectral Analysis for Scientists: Concepts and Applications
Springer, 2016 Ch. 9.9.
URL https://link.springer.com/book/10.1007/978-3-319-25468-5
• MA, AR and ARMA models
S.M. Alessio
Ch. 11.1 - 11.7 (except 11.3.5, 11.3.6, 11.6.1, 11.6.2, 11.6.3.2)
• Padé and Prony's method
C. Sidney Burrus
Prony, Padé, and Linear Prediction for Interpolation and Approximation in the Time and
Frequency Domain Design of IIR Digital Filters and in Parameter Identification
URL https://arxiv.org/pdf/1909.05298.pdf

Power spectrum, definition

• Defined as the Fourier transform of the auto-correlation (AC) sequence, $\gamma_{xx}[k]$ (sometimes
called the auto-covariance sequence), of a WSS process:

• Px (e ω ) = ∞ γ [k]e −  k ω , and
P
k =−∞ xx
• γxx [k] = E {x[n]x ∗ [n − k]} = 21π −π

Px (e ω )e  k ω d ω,

• where
  WSS: a random process x[n] for which
  1. the mean is a constant, $m_x[n] = m_x$,
  2. the AC $\gamma_{xx}[k, l]$ depends only on the difference $k - l$, and
  3. the variance is finite, $c_x[0] < \infty$
  (and $\gamma_{xx}[k]$ is then a deterministic function of the delay k).
• Equivalently, $\gamma_{xx}[k] = E\{x^*[n]\, x[n+k]\}$.
• Often also written as $r_x[k]$ (a small numerical sketch of these definitions follows below).
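A minimal numerical sketch of the two definitions above (not part of the original slides, assuming NumPy/SciPy): generate a WSS process by filtering white noise, estimate $\gamma_{xx}[k]$ from samples, and compare the Fourier transform of the AC estimate with the known spectrum of the generating filter. The AR(1) coefficient 0.8, the sample size, and the lag truncation are arbitrary illustrative choices.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)
N = 200_000
rho = 0.8
# WSS realization: white noise filtered by 1/(1 - rho z^-1)
x = signal.lfilter([1.0], [1.0, -rho], rng.standard_normal(N))

# Sample auto-correlation gamma_xx[k] for lags 0..L
L = 50
gamma = np.array([np.mean(x[k:] * x[:N - k]) for k in range(L + 1)])

# P_x(e^{jw}) = sum_k gamma[k] e^{-jkw}, using gamma[-k] = gamma[k] for a real signal
w = np.linspace(0, np.pi, 512)
Px = gamma[0] + 2 * np.sum(gamma[1:, None] * np.cos(np.outer(np.arange(1, L + 1), w)), axis=0)

# Known spectrum of the generating filter, sigma_e^2 = 1
Px_theory = 1.0 / np.abs(1.0 - rho * np.exp(-1j * w)) ** 2
print(np.max(np.abs(Px - Px_theory) / Px_theory))   # relative error should be small (a few percent)
```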

Why Power spectrum ...

• Fourier analysis is a powerful tool in signal processing:

[Figure illustrating Fourier analysis, from dsbguide.com]

• But the Fourier transform of WSS random signals does not exist (their realizations have
  infinite energy, so the transform does not converge) ...
Why Power spectrum ...

• What then, if
• x[n] is a WSS process, and

x[n] −→ h[n] −→ ?

• What can we expect?

Filtering & statistics
• How does LTI-filtering of random processes affect the statistics?
• Goal:
1. For a given system, determine the relationship between mean and AC of the input process to
the mean and AC of the output process.
2. For a given system, determine the relationship between the power spectrum of the input
and output process.
• Assumptions:
  • x[n] is a WSS process with mean $m_x$ and AC $r_x[k]$.
  • x[n] is filtered by a stable LTI filter described by h[n].
  • The output from the system, y[n], will then be a stochastic process described by

$y[n] = x[n] * h[n] = \sum_{k=-\infty}^{\infty} h[k]\, x[n-k].$

Effect of an LTI System on a Random Signal (1)

• $m_y$, the mean of y[n]:

  $m_y[n] = E\{y[n]\} = E\Big\{\sum_{k=-\infty}^{\infty} h[k]\, x[n-k]\Big\} = \sum_{k=-\infty}^{\infty} h[k]\, E\{x[n-k]\} = m_x \sum_{k=-\infty}^{\infty} h[k] = m_x H(e^{j0}),$

  i.e., a constant. Moreover, $m_y = 0$ for a centered input with $m_x = 0$.


• Cross-covariance $\gamma_{yx}$ between y[n] and x[n]:

  $\gamma_{yx}[l] = E\{y[n]\, x^*[n-l]\} = E\Big\{\Big(\sum_{k=-\infty}^{\infty} h[k]\, x[n-k]\Big)\, x^*[n-l]\Big\} = \sum_{k=-\infty}^{\infty} h[k]\, E\{x[n-k]\, x^*[n-l]\}.$

  Letting $m = n - k$, i.e. $n - l = m - (l - k)$, we get

  $\gamma_{yx}[l] = \sum_{k=-\infty}^{\infty} h[k]\, E\{x[m]\, x^*[m-(l-k)]\} = \sum_{k=-\infty}^{\infty} h[k]\, \gamma_{xx}[l-k] = h[l] * \gamma_{xx}[l]$

Effect of an LTI System on a Random Signal (2)

• γyx [l] = h[l] ∗ γxx [l]:


the output-input cross-covariance is equal to the convolution of the impulse response with
the input AC sequence.
• For zero-mean WSS processes, $\gamma_{xx}[l] = r_{xx}[l]$ and $P_{yx}(e^{j\omega}) = H(e^{j\omega})\, P_{xx}(e^{j\omega})$,
i.e. the cross-spectrum between output and input is the product of the input signal's
power spectrum with the filter's frequency response.
• Moreover: the input-output cross-covariance is

  $\gamma_{xy}[l] = E\{x[n]\, y^*[n-l]\} = E\{x[m+l]\, y^*[m]\} = \gamma^*_{yx}[-l] = h^*[-l] * \gamma^*_{xx}[-l] = h^*[-l] * \gamma_{xx}[l],$

  i.e. the input-output cross-covariance is equal to the convolution of the folded complex
  conjugate of the impulse response with the input AC sequence.
• For zero-mean WSS processes, $P_{xy}(e^{j\omega}) = H^*(e^{j\omega})\, P_{xx}(e^{j\omega})$,
  i.e., the cross-spectrum between input and output is the product of the input signal's
  power spectrum with the complex conjugate of the filter's frequency response (a quick
  numerical check follows below).
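A quick numerical illustration (not from the slides, assuming NumPy/SciPy): for a white input, $\gamma_{xx}[l] = \sigma^2 \delta[l]$, so the relation $\gamma_{yx}[l] = h[l] * \gamma_{xx}[l]$ predicts that the sample output-input cross-correlation reproduces the impulse response scaled by $\sigma^2$. The FIR taps below are arbitrary.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(1)
N = 500_000
h = np.array([1.0, -0.5, 0.25, 0.1])           # arbitrary FIR impulse response
x = rng.standard_normal(N)                     # zero-mean white WSS input, sigma^2 = 1
y = signal.lfilter(h, [1.0], x)

# Sample cross-correlation gamma_yx[l] = E{ y[n] x[n-l] } for l = 0..5
lags = np.arange(6)
gamma_yx = np.array([np.mean(y[l:] * x[:N - l]) for l in lags])
print(np.round(gamma_yx, 3))   # approximately [1.0, -0.5, 0.25, 0.1, 0.0, 0.0]
```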

Effect of an LTI System on a Random Signal (3)

• The auto-correlation of the (centered) output is:

  $\gamma_{yy}[l] = E\{y[n]\, y^*[n-l]\} = E\Big\{\Big(\sum_{k=-\infty}^{\infty} h[n-k]\, x[k]\Big)\, y^*[n-l]\Big\}$
  $= \sum_{k=-\infty}^{\infty} h[n-k]\, E\{x[k]\, y^*[n-l]\} = \sum_{k=-\infty}^{\infty} h[n-k]\, \gamma_{xy}[k-n+l]$
  $= \sum_{m=-\infty}^{\infty} h[m]\, \gamma_{xy}[l-m] = h[l] * \gamma_{xy}[l] = h[l] * h^*[-l] * \gamma_{xx}[l] = h^*[-l] * \gamma_{yx}[l]$

• This gives

  $P_{yy}(e^{j\omega}) = H(e^{j\omega})\, H^*(e^{j\omega})\, P_{xx}(e^{j\omega}) = |H(e^{j\omega})|^2\, P_{xx}(e^{j\omega}),$

  i.e. the power spectrum of the output signal is equal to the spectrum of the input signal
  multiplied by the squared magnitude of the filter's frequency response (see the numerical
  check below).
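A small numerical check of the last relation (illustrative only, assuming NumPy/SciPy; the IIR coefficients are arbitrary): filter white noise and compare a Welch estimate of $P_{yy}$ with $|H(e^{j\omega})|^2$ times the one-sided white-noise density.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(2)
b, a = [1.0, 0.5], [1.0, -0.7]                 # arbitrary stable IIR filter H(z)
x = rng.standard_normal(400_000)               # white WSS input, sigma^2 = 1
y = signal.lfilter(b, a, x)

f, Pyy = signal.welch(y, fs=1.0, nperseg=1024)          # one-sided PSD estimate of the output
_, H = signal.freqz(b, a, worN=2 * np.pi * f)           # H(e^{jw}) on the same frequency grid
Pyy_theory = 2.0 * np.abs(H) ** 2                        # one-sided white-noise density is 2 for sigma^2 = 1, fs = 1
rel_err = np.abs(Pyy[1:-1] - Pyy_theory[1:-1]) / Pyy_theory[1:-1]   # skip DC/Nyquist (no doubling there)
print(rel_err.max())                                     # should be small (a few percent)
```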

Effect of an LTI System on a Random Signal (4)

[Figure from Hayes]
Effect of an LTI System on a Random Signal (5)

• Power spectrum of y[n]:

  $P_y(e^{j\omega}) = P_x(e^{j\omega})\, |H(e^{j\omega})|^2.$

• In terms of the z-transform:

  $P_y(z) = P_x(z)\, H(z)\, H^*(1/z^*).$

• If h[n] is real, then $H(z) = H^*(z^*)$ and

  $P_y(z) = P_x(z)\, H(z)\, H(1/z).$

Linear prediction
• Linear prediction is a mathematical operation where future values of a discrete-time signal
are estimated as a linear function of previous samples.
• These techniques use known information about the system to determine the model.
• Applications include speech and music synthesis, data compression, high-resolution
spectral estimation, communications, manufacturing, and simulation.

The core of linear prediction with an AR model ...

• Assume that a time series can be modeled by a p-order model:


• x[n] is a function of the p previous values plus an error term, e[n]:
  $x[n] = -a[1]\, x[n-1] - a[2]\, x[n-2] - \dots - a[p]\, x[n-p] + e[n]$, or
  $x[n] = -\sum_{k=1}^{p} a[k]\, x[n-k] + e[n]$.
• Then
  $X(z) = -\sum_{k=1}^{p} a[k]\, z^{-k}\, X(z) + E(z)$, or
  $H(z) = \dfrac{X(z)}{E(z)} = \dfrac{1}{1 + \sum_{k=1}^{p} a[k]\, z^{-k}}$.
• H(z) interpretable as z-transform of an all-pole IIR filter with coefficients a[k].
=⇒ Called an auto-regressive (AR) filter.
• x[n] regarded as output of this filter caused by the random input e[n].
• e[n] often assumed white i.e. x[n] generated by the AR filter from a white noise source.
• e[n] represents the error between the predicted value and the “true” value.
• With e[n] as input to the filter h[n] producing the output x[n], we may calculate the power
  spectrum

  $P_{xx}(e^{j\omega}) = |H(e^{j\omega})|^2\, P_{ee}(e^{j\omega}) = \sigma_e^2\, |H(e^{j\omega})|^2,$

  i.e. by estimating $\sigma_e^2$ and $H(e^{j\omega})$, the power spectrum can be estimated (see the sketch below).
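A minimal sketch of this AR/prediction view (not from the slides, assuming NumPy/SciPy; the AR(2) coefficients are arbitrary): generate x[n] by all-pole filtering of white noise, use the model coefficients as a one-step predictor, check that the prediction error behaves like the white innovation e[n], and form the model-based spectrum $\hat{\sigma}_e^2 |H(e^{j\omega})|^2$.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(3)
a = np.array([1.0, 0.5, -0.3])                 # A(z) = 1 + 0.5 z^-1 - 0.3 z^-2 (stable, arbitrary)
e = rng.standard_normal(100_000)               # white innovation, sigma_e^2 = 1
x = signal.lfilter([1.0], a, e)                # AR(2): x[n] = -0.5 x[n-1] + 0.3 x[n-2] + e[n]

# One-step linear prediction xhat[n] = -a[1] x[n-1] - a[2] x[n-2]
xhat = -a[1] * np.roll(x, 1) - a[2] * np.roll(x, 2)
err = (x - xhat)[2:]                           # drop the first samples corrupted by the roll wrap-around
print(np.var(err), np.corrcoef(err[1:], err[:-1])[0, 1])   # variance ~1, lag-1 correlation ~0

# Model-based power spectrum estimate: sigma_e^2 |H(e^{jw})|^2
w, H = signal.freqz([1.0], a, worN=512)
Pxx_model = np.var(err) * np.abs(H) ** 2
```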

Assumptions (. . . that all can be proven right)

• The spectrum Pxx of a WSS process is a real-valued, positive and periodic function of ω.
• The spectra of a very wide class of causal WSS random signals can be approximated by
rational functions.
• A rational spectrum is a rational function of $e^{j\omega}$:

  $P_{xx}(e^{j\omega}) \equiv P_{xx}(\omega) = \dfrac{\sum_{k=-q}^{q} \gamma_1[k]\, e^{-j\omega k}}{\sum_{l=-p}^{p} \gamma_2[l]\, e^{-j\omega l}}$

  where $\gamma_1[k]$ and $\gamma_2[l]$ are two sequences having the even symmetry that characterizes the AC
  sequence of a real signal, and q and p are the summation limits.
• This corresponds to seeing the signal as the result of filtering white noise with a filter with
proper order and coefficients.
• This again means viewing the generation of the measured series as a stochastic process,
which can be modelled by a general ARMA(p,q) model.

Assumptions (. . . that all can be proven right) (2)

• This means that

  $P_{xx}(\omega) = \sigma_e^2\, \dfrac{|B(e^{j\omega})|^2}{|A(e^{j\omega})|^2} = \sigma_e^2\, |H(e^{j\omega})|^2,$

  where

  $A(e^{j\omega}) = 1 + a_1 e^{-j\omega} + \dots + a_p e^{-j\omega p}$
  $B(e^{j\omega}) = 1 + b_1 e^{-j\omega} + \dots + b_q e^{-j\omega q}$

  are the denominator and numerator polynomials of the filter's frequency response

  $H(e^{j\omega}) = \dfrac{B(e^{j\omega})}{A(e^{j\omega})}.$

• This result is referred to as the spectral factorization theorem.
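As a small illustration of such a rational spectrum (not from the slides; the ARMA(2,1) coefficients and $\sigma_e^2$ are arbitrary assumptions), the factorized form can be evaluated directly on a frequency grid with scipy.signal.freqz:

```python
import numpy as np
from scipy import signal

sigma_e2 = 1.5
b = [1.0, 0.4]            # B(e^{jw}) = 1 + 0.4 e^{-jw}
a = [1.0, -0.9, 0.5]      # A(e^{jw}) = 1 - 0.9 e^{-jw} + 0.5 e^{-j2w}  (stable)
w, H = signal.freqz(b, a, worN=1024)
Pxx = sigma_e2 * np.abs(H) ** 2     # P_xx(w) on a grid of 1024 frequencies in [0, pi)
```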

The white noise is random(!!)

Spectral factorization

• Any process that can be factorized as

  $P_x(z) = \sigma_0^2\, H(z)\, H^*(1/z^*)$

  is called a regular process. Regular processes have the following properties:

  1. Any regular process may be realized as the output of a causal and stable filter that is driven
     by white noise having a variance of $\sigma_0^2$. This is known as the innovations representation of
     the process.
  2. The inverse filter 1/H(z) is a whitening filter. That is, if x[n] is filtered with 1/H(z), then
     the output is white noise with variance $\sigma_0^2$. This white noise process is called the
     innovations process (see the sketch below).
  3. Since the two processes are related by an invertible transformation, either process may be
     derived from the other. Therefore, they both contain the same information.
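A brief numerical sketch of property 2 (illustrative only, assuming NumPy/SciPy; the minimum-phase model below is an arbitrary choice): filtering the regular process with the inverse filter 1/H(z), i.e. swapping the numerator and denominator coefficients in lfilter, recovers an approximately white sequence.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
b, a = [1.0, 0.3], [1.0, -0.8, 0.2]                      # minimum-phase H(z) = B(z)/A(z)
x = signal.lfilter(b, a, rng.standard_normal(200_000))   # regular process, sigma_0^2 = 1

e_hat = signal.lfilter(a, b, x)                          # whitening: filter x with 1/H(z) = A(z)/B(z)
acf = np.array([np.mean(e_hat[k:] * e_hat[:len(e_hat) - k]) for k in range(4)])
print(np.round(acf, 3))   # approximately [1, 0, 0, 0]: white, with variance sigma_0^2
```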

Rational power spectra

• Restrict to rational power spectra (and real signals). Then

  $P_x(z) = \sigma_x^2\, \dfrac{B_q(z)\, B_q(z^{-1})}{A_p(z)\, A_p(z^{-1})}, \quad r_1 < |z| < r_2,$

  where $B_q(z)$ and $A_p(z)$ have roots inside the unit circle.
• Then H(z) (which generates the sequence {x[n]} from the noise sequence {e[n]}) is rational:

  $H(z) = \dfrac{B_q(z)}{A_p(z)} = \dfrac{\sum_{k=0}^{q} b_q[k]\, z^{-k}}{1 + \sum_{k=1}^{p} a_p[k]\, z^{-k}}, \quad |z| > r_1$

• H(z) and 1/H(z) are causal, stable and minimum phase, i.e. {x[n]} uniquely represents
  the statistical properties of the innovation process {e[n]}, and vice versa.
• Alternative representation, as a difference equation:

  $x[n] + \sum_{k=1}^{p} a_p[k]\, x[n-k] = \sum_{k=0}^{q} b_q[k]\, e[n-k]$

Special types of random processes

From $x[n] + \sum_{k=1}^{p} a_p[k]\, x[n-k] = \sum_{k=0}^{q} b_q[k]\, e[n-k]$

• Three special cases:

  • Auto-regressive (AR) process:
    $b_q[0] = 1$, $b_q[k] = 0$ for $k > 0$, i.e.
    $H(z) = 1/A_p(z)$ is all-pole and $x[n] + \sum_{k=1}^{p} a_p[k]\, x[n-k] = e[n]$.
  • Moving average (MA) process:
    $a_p[k] = 0$ for $k \ge 1$, i.e.
    $H(z) = B_q(z)$ is all-zero and $x[n] = \sum_{k=0}^{q} b_q[k]\, e[n-k]$.
  • Auto-regressive, moving average (ARMA) process:
    $H(z) = B_q(z)/A_p(z)$; both finite poles and zeros.

Filter parameters ↔ Auto-correlation Sequence

• From $x[n] = -\sum_{k=1}^{p} a_p[k]\, x[n-k] + \sum_{k=0}^{q} b_q[k]\, e[n-k]$

• Multiply by $x^*[n-l]$ (from the right) and take the expectation value of both sides:

  $E\{x[n]\, x^*[n-l]\} = -\sum_{i=1}^{p} a_p[i]\, E\{x[n-i]\, x^*[n-l]\} + \sum_{i=0}^{q} b_q[i]\, E\{e[n-i]\, x^*[n-l]\},$

  i.e. $\gamma_{xx}[l] = -\sum_{i=1}^{p} a_p[i]\, \gamma_{xx}[l-i] + \sum_{i=0}^{q} b_q[i]\, \gamma_{ex}[l-i].$

• Using that $x[n] = e[n] * h[n]$ and that $E\{e^*[n]\, e[m]\} = \sigma_e^2\, \delta[n-m]$, we have

  $\gamma_{ex}[l-i] = E\{e^*[n-i]\, x[n-l]\} = E\Big\{\sum_{m=-\infty}^{\infty} e^*[n-i]\, e[m]\, h[n-l-m]\Big\} = \sum_{m=-\infty}^{\infty} E\{e^*[n-i]\, e[m]\}\, h[n-l-m] = \sigma_e^2\, h[i-l].$

  $\Longrightarrow \quad \gamma_{xx}[l] = -\sum_{i=1}^{p} a_p[i]\, \gamma_{xx}[l-i] + \sigma_e^2 \sum_{i=0}^{q} b_q[i]\, h[i-l].$

Filter parameters ↔ Auto-correlation Sequence (2)

• Defining (for causal filters h[i], i.e. $h[i] = 0\ \forall\, i < 0$):

  $c[l] = \sum_{i=0}^{q} b_q[i]\, h[i-l] = \begin{cases} \sum_{i=0}^{q-l} b_q[i+l]\, h[i] & \text{for } l \le q \\ 0 & \text{for } l > q. \end{cases}$

• When the power spectral density function of the stationary random process is a rational
  function, we have

  $\gamma_{xx}[l] = \begin{cases} -\sum_{i=1}^{p} a_p[i]\, \gamma_{xx}[l-i] & \text{for } l > q \\ -\sum_{i=1}^{p} a_p[i]\, \gamma_{xx}[l-i] + \sigma_e^2\, c[l] & \text{for } 0 \le l \le q \\ \gamma^*_{xx}[-l] & \text{for } l < 0. \end{cases}$

• Known as the Yule-Walker equations.

• If ARMA: nonlinear relationship due to the product $b_q[i+l] \times h[i]$ in $c[l]$ (see the sketch below).
• If AR: linear relationship between the parameters.
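To make the nonlinearity concrete, here is a small sketch (not from the slides; the ARMA coefficients are arbitrary assumptions) computing $c[l] = \sum_i b_q[i+l]\, h[i]$ from a truncated impulse response obtained with lfilter. Since c[l] multiplies the unknown $b_q$ by h (which itself depends on $a_p$ and $b_q$), solving for the parameters from $\gamma_{xx}$ is nonlinear in the ARMA case.

```python
import numpy as np
from scipy import signal

b = np.array([1.0, 0.4, 0.2])                   # b_q[0..q], q = 2 (arbitrary)
a = np.array([1.0, -0.6])                       # a_p[0..p], p = 1 (arbitrary, stable)
q = len(b) - 1

# Truncated impulse response h[0..N-1] of H(z) = B(z)/A(z)
N = 64
h = signal.lfilter(b, a, np.r_[1.0, np.zeros(N - 1)])

c = np.array([np.sum(b[l:] * h[:q + 1 - l]) for l in range(q + 1)])   # c[0..q]
print(c)   # for l > q, c[l] = 0 since h is causal
```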

Filter parameters ↔ Auto-correlation Sequence (3)

• AR process: x[n] is generated by filtering white noise through an all-pole filter h[n]:

  z-transform: $H(z) = \dfrac{1}{1 + \sum_{i=1}^{p} a_i z^{-i}} = \dfrac{1}{A(z)}$

  frequency transform: $H(e^{j\omega}) = \dfrac{1}{1 + \sum_{i=1}^{p} a_i e^{-j\omega i}} = \dfrac{1}{A(e^{j\omega})}$

  power spectrum: $P_{xx}(\omega) = \sigma_e^2\, \dfrac{1}{|A(e^{j\omega})|^2}$ or $P_{xx}(z) = P_{ee}(z)\, \dfrac{1}{A(z)\, A^*(1/z^*)}$

  and difference equation: $x[n] = -\sum_{i=1}^{p} a_i x[n-i] + e[n]$

• with auto-correlation sequence

  $\gamma_{xx}[l] = \begin{cases} -\sum_{i=1}^{p} a_p[i]\, \gamma_{xx}[l-i] + \sigma_e^2\, \delta[l] & \text{for } l \ge 0 \\ \gamma^*_{xx}[-l] & \text{for } l < 0. \end{cases}$

Filter parameters ↔ Auto-correlation Sequence (4)

• An AR process has the following Yule-Walker equations for $l = 1, \dots, p$:

  $\begin{bmatrix}
     \gamma_{xx}[0]   & \gamma_{xx}[-1]  & \cdots & \gamma_{xx}[-(p-1)] \\
     \gamma_{xx}[1]   & \gamma_{xx}[0]   & \cdots & \gamma_{xx}[-(p-2)] \\
     \vdots           & \vdots           & \ddots & \vdots              \\
     \gamma_{xx}[p-1] & \gamma_{xx}[p-2] & \cdots & \gamma_{xx}[0]
   \end{bmatrix}
   \begin{bmatrix} a_p[1] \\ a_p[2] \\ \vdots \\ a_p[p] \end{bmatrix}
   = -\begin{bmatrix} \gamma_{xx}[1] \\ \gamma_{xx}[2] \\ \vdots \\ \gamma_{xx}[p] \end{bmatrix}$

• Denoting the matrix on the left by $\Gamma_p$ and defining

  $\mathbf{a}_p = [a_1\ a_2\ \dots\ a_p]^T$
  $\boldsymbol{\gamma}_p = [\gamma[1]\ \gamma[2]\ \dots\ \gamma[p]]^T$

  we may write this as

  $\Gamma_p\, \mathbf{a}_p + \boldsymbol{\gamma}_p = \mathbf{0}.$

• The matrix $\Gamma_p$ is Toeplitz.

• The inverse can be found using the Levinson-Durbin algorithm ($O(p^2)$); a small sketch using a
  Toeplitz solver follows below.
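A minimal sketch of this estimation step (not from the slides, assuming NumPy/SciPy; the AR(2) model and sample size are arbitrary): estimate the first p+1 AC lags from data and solve the Toeplitz Yule-Walker system with scipy.linalg.solve_toeplitz, which uses a Levinson-type recursion.

```python
import numpy as np
from scipy import signal, linalg

rng = np.random.default_rng(5)
a_true = np.array([1.0, -0.75, 0.5])           # A(z) = 1 - 0.75 z^-1 + 0.5 z^-2 (stable, arbitrary)
x = signal.lfilter([1.0], a_true, rng.standard_normal(200_000))

p, N = 2, 200_000
gamma = np.array([np.mean(x[k:] * x[:N - k]) for k in range(p + 1)])  # sample gamma_xx[0..p]

# Gamma_p is Toeplitz with first column [gamma[0], ..., gamma[p-1]]; solve Gamma_p a_p = -gamma_p
a_hat = linalg.solve_toeplitz(gamma[:p], -gamma[1:])
print(a_hat)                                   # approximately [-0.75, 0.5]
```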

Filter parameters ↔ Auto-correlation Sequence (5)

• Including also l = 0, we may move the right-hand-side term to the left and add an extra row
  at the top of $\Gamma_p$, forming the augmented Yule-Walker equations

  $\begin{bmatrix}
     \gamma_{xx}[0] & \gamma_{xx}[-1]  & \cdots & \gamma_{xx}[-p]   \\
     \gamma_{xx}[1] & \gamma_{xx}[0]   & \cdots & \gamma_{xx}[-p+1] \\
     \vdots         & \vdots           & \ddots & \vdots            \\
     \gamma_{xx}[p] & \gamma_{xx}[p-1] & \cdots & \gamma_{xx}[0]
   \end{bmatrix}
   \begin{bmatrix} 1 \\ a_p[1] \\ \vdots \\ a_p[p] \end{bmatrix}
   = \begin{bmatrix} \sigma_e^2 \\ 0 \\ \vdots \\ 0 \end{bmatrix}$

• This can be written as

  $\Gamma_{p+1} \begin{bmatrix} 1 \\ \mathbf{a}_p \end{bmatrix} = \begin{bmatrix} \sigma_e^2 \\ \mathbf{0}_p \end{bmatrix}$

  where $\mathbf{0}_p$ is a column vector of zeros of size $p \times 1$.
• If the first p + 1 samples of the AC sequence have been estimated, the coefficients $a_i$ can
  be estimated as

  $\mathbf{a}_p = -\Gamma_p^{-1}\, \boldsymbol{\gamma}_p.$

• $\sigma_e^2$ can then be found from the augmented equations: $\sigma_e^2 = \gamma[0] + \sum_{i=1}^{p} a_i\, \gamma[i]$
  (a tiny numerical illustration follows below).
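A tiny arithmetic illustration of the last relation (hypothetical numbers, not from the slides):

```python
import numpy as np

gamma = np.array([2.0, 1.2, 0.5])              # hypothetical AC lags gamma[0], gamma[1], gamma[2]
a = np.array([-0.75, 0.5])                     # hypothetical AR(2) coefficients a[1], a[2]
sigma_e2 = gamma[0] + np.dot(a, gamma[1:])     # sigma_e^2 = gamma[0] + sum_i a[i] gamma[i]
print(sigma_e2)                                # 1.35 for these numbers
```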

Filter parameters ↔ Auto-correlation Sequence (6)

• MA process: x[n] is generated by filtering white noise through an all-zero filter h[n]:

  z-transform: $H(z) = B(z) = 1 + \sum_{i=1}^{q} b_i z^{-i}$

  frequency transform: $H(e^{j\omega}) = B(e^{j\omega}) = 1 + \sum_{i=1}^{q} b_i e^{-j\omega i}$

  power spectrum: $P_{xx}(\omega) = \sigma_e^2\, |B(e^{j\omega})|^2$ or $P_{xx}(z) = P_{ee}(z)\, B(z)\, B^*(1/z^*)$

  and difference equation: $x[n] = e[n] + \sum_{i=1}^{q} b_i e[n-i]$

• It has auto-correlation sequence (with $b_q[0] = 1$)

  $\gamma_{xx}[l] = \begin{cases} \sigma_e^2 \sum_{i=0}^{q-|l|} b_q[i+|l|]\, b_q[i] & \text{for } |l| \le q \\ 0 & \text{for } |l| > q. \end{cases}$

• Problem: estimating the q parameters $b_q[i]$ and $\sigma_e^2$ is a nonlinear estimation problem!
  (A small sketch of the forward mapping, from coefficients to AC sequence, follows below.)
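As a small illustration (not from the slides; the MA(2) coefficients are arbitrary assumptions), the forward mapping from $(b, \sigma_e^2)$ to the AC sequence is easy to compute; it is the inverse mapping, recovering b and $\sigma_e^2$ from $\gamma_{xx}$, that is nonlinear.

```python
import numpy as np

sigma_e2 = 1.0
b = np.array([1.0, 0.6, -0.3])                  # b_q[0..q], q = 2, with b_q[0] = 1
q = len(b) - 1

# gamma_xx[l] = sigma_e^2 * sum_{i=0}^{q-|l|} b[i+|l|] b[i], and 0 for |l| > q
gamma = np.array([sigma_e2 * np.sum(b[l:] * b[:q + 1 - l]) for l in range(q + 1)])
print(gamma)    # gamma[0..q]
```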
