Adaptive Signal Processing Lecture Note
Requirements:
* Project work: exercises and programs for algorithm implementation
* Final examination
Lecture 1
Topics:
* Least-Mean-Square Adaptive Filters
* Normalized Least-Mean-Square Adaptive Filters
* Frequency-Domain Adaptive Filters
* Method of Least Squares
* Recursive Least-Squares Algorithm
* Kalman Filters
* Square-Root Adaptive Filters
* Order-Recursive Adaptive Filters
* Finite-Precision Effects
* Tracking of Time-Varying Systems
* Adaptive Filters Using Infinite-Duration Impulse Response Structures
* Blind Deconvolution
* Back-Propagation Learning
* Epilogue
* The reference noise v1(n) and the noise v0(n) are correlated, with unknown cross-correlation p(k):

  E{v0(n) v1(n−k)} = p(k)   for all k
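As a numerical illustration, the cross-correlation p(k) can be estimated from simulated data. This is a minimal sketch, assuming v1 is white with unit variance and v0 is produced from it by a hypothetical unknown FIR channel h (the channel taps and seed are illustrative, not part of the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

h = np.array([0.8, -0.4, 0.2])             # hypothetical unknown channel
v1 = rng.standard_normal(N)                # white reference noise v1(n)
v0 = np.convolve(v1, h, mode="full")[:N]   # correlated noise v0(n)

# Sample estimate of p(k) = E{v0(n) v1(n-k)}; for white unit-variance
# v1 this approaches h[k] at rate O(1/sqrt(N)).
for k in range(len(h)):
    p_k = np.mean(v0[k:] * v1[:N - k])
    print(k, round(p_k, 3))
```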
* The adaptive filter output is computed as

  y(n) = Σ_{k=0}^{M−1} w_k(n) v1(n−k)
* The error signal is computed as e(n) = d(n) − y(n).
* The parameters of the filters are modified in an adaptive manner, for example using the LMS algorithm (the simplest adaptive algorithm):

  w_k(n+1) = w_k(n) + μ v1(n−k) e(n)   (LMS)

  where μ is the adaptation constant.
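Putting the pieces together, a minimal sketch of the resulting adaptive noise canceller in Python (the desired signal s, channel h, step size mu, filter length M, and seed are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
N, M, mu = 20_000, 4, 0.01            # samples, filter length, adaptation constant

s = np.sin(0.05 * np.arange(N))       # desired signal (assumed sinusoid)
v1 = rng.standard_normal(N)           # reference noise v1(n)
h = np.array([0.7, -0.3, 0.1, 0.05])  # hypothetical unknown channel
v0 = np.convolve(v1, h)[:N]           # correlated noise v0(n)
d = s + v0                            # primary sensor: signal + noise

w = np.zeros(M)
e = np.zeros(N)
for n in range(M, N):
    x = v1[n - np.arange(M)]          # [v1(n), v1(n-1), ..., v1(n-M+1)]
    y = w @ x                         # filter output y(n)
    e[n] = d[n] - y                   # error e(n) = d(n) - y(n)
    w = w + mu * x * e[n]             # LMS update w_k(n+1) = w_k(n) + mu v1(n-k) e(n)

print(np.round(w, 2))
```

After convergence the weights approximate the unknown channel and e(n) approximates the desired signal s(n), which is the goal of noise cancellation.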
* The squared error as a function of the weights,

  e²(n) = F(w0, …, w_{M−1}),

  is a paraboloid.

[Figure: paraboloid error surface plotted over the weights w0 and w1]

* Its gradient with respect to each weight is

  ∂e²(n)/∂w_k = −2 e(n) v1(n−k)
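The gradient expression can be sanity-checked against a central finite difference; a small sketch with arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(2)
M = 4
w = rng.standard_normal(M)     # current weights w_k
v1 = rng.standard_normal(M)    # v1(n), v1(n-1), ..., v1(n-M+1)
d = 1.3                        # d(n), arbitrary

def sq_err(w):
    # e^2(n) for e(n) = d(n) - sum_k w_k v1(n-k)
    return (d - w @ v1) ** 2

e = d - w @ v1
analytic = -2 * e * v1         # d e^2(n) / d w_k = -2 e(n) v1(n-k)

eps = 1e-6
numeric = np.array([
    (sq_err(w + eps * np.eye(M)[k]) - sq_err(w - eps * np.eye(M)[k])) / (2 * eps)
    for k in range(M)
])
print(np.max(np.abs(analytic - numeric)))  # near zero (roundoff only)
```

Because e²(n) is exactly quadratic in the weights, the central difference agrees with the analytic gradient up to floating-point roundoff.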
* Substituting the LMS update into the filter output, the error recomputed with the updated weights (the a posteriori error) becomes

  d(n) − Σ_{k=0}^{M−1} w_k(n+1) v1(n−k)
    = d(n) − Σ_{k=0}^{M−1} w_k(n) v1(n−k) − μ e(n) Σ_{k=0}^{M−1} v1²(n−k)
    = e(n) − μ e(n) Σ_{k=0}^{M−1} v1²(n−k)
    = e(n) (1 − μ Σ_{k=0}^{M−1} v1²(n−k))
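The final identity can be checked numerically; a small sketch with illustrative values (note that choosing μ = 1/Σ v1²(n−k) would drive the a posteriori error to zero, which is the idea behind the normalized LMS step size):

```python
import numpy as np

rng = np.random.default_rng(3)
M, mu = 8, 0.05
w = rng.standard_normal(M)     # weights w_k(n)
x = rng.standard_normal(M)     # x[k] = v1(n-k)
d = rng.standard_normal()      # desired sample d(n)

e = d - w @ x                  # a priori error e(n)
w_new = w + mu * x * e         # LMS update
e_post = d - w_new @ x         # error recomputed with updated weights

# Identity: e_post = e(n) * (1 - mu * sum_k v1^2(n-k))
print(np.isclose(e_post, e * (1 - mu * np.sum(x ** 2))))  # True
```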