
Adaptive Filtering with Averaging in Noise Cancellation for Voice and Speech Recognition

Georgi Iliev and Nikola Kasabov


Department of Information Science, University of Otago
P.O. Box 56, Dunedin, New Zealand
[email protected], [email protected]

Abstract – In many applications of noise cancellation the changes in signal characteristics can be quite fast. This requires the use of adaptive algorithms that converge rapidly. From this point of view the best choice is the recursive least squares (RLS) algorithm. Unfortunately, this algorithm has high computational complexity and stability problems. In this contribution we present an algorithm based on adaptive filtering with averaging (AFA) for noise cancellation. The main advantages of the AFA algorithm can be summarized as follows: it has a high convergence rate, comparable to that of the RLS algorithm, and at the same time low computational complexity and possible robustness in fixed-point implementations. The algorithm is illustrated on car and office noise added to speech data.


I. INTRODUCTION

The purpose of this contribution is to study the application of a new algorithm based on adaptive filtering with averaging to the noise cancellation problem. It is well known that two of the most frequently applied algorithms for noise cancellation [1] are the normalized least mean squares (NLMS) [2], [3], [4] and recursive least squares (RLS) [5], [6] algorithms. Comparing the two, the NLMS algorithm has the advantage of low computational complexity. In contrast, high computational complexity is the weakest point of the RLS algorithm, although it provides a fast adaptation rate. Thus, the choice of adaptive algorithm is always a tradeoff between computational complexity and fast convergence.

In the present work we propose a new adaptive algorithm with averaging applied to noise cancellation. Extensive experiments with different types of noise reveal its robustness, maintaining fast convergence while keeping the computational complexity at a low level.


II. ADAPTIVE NOISE CANCELLATION

Fig. 1 shows the classical scheme for adaptive noise cancellation using a digital filter with finite impulse response (FIR). The primary input consists of speech s(n) and noise n2(n), while the reference input consists of noise n1(n) alone. The two noises n1(n) and n2(n) are correlated, and hi(n) is the impulse response of the noise path. The system tries to reduce the impact of the noise in the primary input by exploiting the correlation between the two noise signals. This is equivalent to the minimization of the mean-square error E[e2(n)], where

    e(n) = s(n) + n2(n) − n3(n).                                   (1)

Having in mind that, by assumption, s(n) is correlated neither with n1(n) nor with n2(n), we have

    E[e2(n)] = E[s2(n)] + E[n2(n) − n3(n)]2.                       (2)

In other words, the minimization of E[e2(n)] is equivalent to the minimization of the difference between n2(n) and n3(n). Obviously, E[e2(n)] will be minimal when n3(n) ≈ n2(n), i.e. when the impulse response of the adaptive filter closely mimics the impulse response of the noise path.

The minimization of E[e2(n)] can be achieved by updating the filter taps wi(n). Most often the NLMS and RLS algorithms are used. Tables 1 and 2 summarize the steps of the adaptive noise cancellation scheme depicted in Fig. 1.

[Fig. 1. Adaptive noise cancellation scheme: the primary input s(n)+n2(n) and the reference input n1(n); the noise path hi(n) produces n2(n), the adaptive filter wi(n) produces the noise estimate n3(n), and the output is the error e(n).]

Table 1. NLMS algorithm.

Noise estimation:
    n3(n) = Σ_{i=0}^{N} wi(n) n1(n−i),    N – filter order
Error estimation:
    e(n) = s(n) + n2(n) − n3(n)
Coefficients update:
    wi(n+1) = wi(n) + µ e(n) n1(n−i) / Σ_{i=0}^{N} n1²(n−i),    for 0 ≤ i ≤ N
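The NLMS steps above can be sketched as follows. This is an illustrative NumPy implementation (our own; the paper's published package is in MatLab), and the small regularization constant `eps` in the normalization denominator is a common practical safeguard that is not part of Table 1:

```python
import numpy as np

def nlms_cancel(primary, reference, order=8, mu=0.5, eps=1e-8):
    """Adaptive noise cancellation with NLMS (per Table 1).

    primary   -- s(n) + n2(n): speech plus noise from the primary microphone
    reference -- n1(n): correlated noise from the reference microphone
    Returns the error signal e(n), which approximates the clean speech s(n).
    """
    w = np.zeros(order + 1)                  # filter taps w_i(n), 0 <= i <= N
    e = np.zeros(len(primary))
    for n in range(len(primary)):
        # input vector N1(n) = [n1(n), n1(n-1), ..., n1(n-N)]
        x = np.array([reference[n - i] if n - i >= 0 else 0.0
                      for i in range(order + 1)])
        n3 = w @ x                           # noise estimate n3(n)
        e[n] = primary[n] - n3               # e(n) = s(n) + n2(n) - n3(n)
        w += mu * e[n] * x / (x @ x + eps)   # normalized LMS tap update
    return e
```

With a reference noise that is a filtered copy of the corrupting noise, the taps converge towards the noise-path response and `e` converges towards the clean speech.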
Table 2. RLS algorithm.

Noise estimation:
    n3(n) = Σ_{i=0}^{N} wi(n) n1(n−i),    N – filter order
Error estimation:
    e(n) = s(n) + n2(n) − n3(n)
Covariance update:
    P(n) = P(n−1) − P(n−1) N1(n) N1ᵀ(n) P(n−1) / (1/δ + N1ᵀ(n) P(n−1) N1(n))
Gain update:
    K(n) = P(n−1) N1(n) / (1/δ + N1ᵀ(n) P(n−1) N1(n))
Coefficients update:
    wi(n) = wi(n−1) + ki(n) e(n),    for 0 ≤ i ≤ N
    0 < δ < 1,    P(0) = αI, α large


III. NOISE CANCELLATION WITH AVERAGING

As mentioned in the introduction, for applications where a fast convergence rate is vital, the NLMS algorithm is not applicable. The more complex RLS algorithm maintains a good rate of adaptation, but the price to be paid is an order-of-magnitude increase in complexity. Moreover, the RLS algorithm is known to have stability issues [7] due to the recursive covariance update formula (see Table 2). In this section we introduce a new adaptive algorithm for noise cancellation based on adaptive filtering with averaging.

We start by defining the problem in the following manner. To recursively adjust the filter coefficients so that the mean-square error is minimized, a standard algorithm for approximating the vector of filter coefficients can be written as

    W(n+1) = W(n) − a(n) N1(n) e(n),                               (3)

where W(n) = [w0(n), w1(n), …, wN(n)]ᵀ is the coefficients vector, N1(n) = [n1(n), n1(n−1), …, n1(n−N)]ᵀ is the input vector, and a(n) is a sequence of positive scalars with a(n) → 0 as n → ∞. In (3) the estimation error is given by

    e(n) = s(n) + n2(n) − N1ᵀ(n) W(n).                             (4)

Equation (3) can be transformed by taking averages of W:

    W(n+1) = W(n) + (1/n^γ) N1(n) e(n),
    W̄(n) = (1/n) Σ_{k=1}^{n} W(k),    1/2 < γ < 1.                (5)

The analysis presented in [8] shows that such an algorithm can be unstable in the initial period. To improve stability we take a second step, namely to average not only over the approximation sequence but also over the observed signals N1 and e. This leads to the adaptive filtering with averaging (AFA) algorithm:

    W(n+1) = W̄(n) + (1/n^γ) Σ_{k=1}^{n} N1(k) e(k),
    W̄(n) = (1/n) Σ_{k=1}^{n} W(k),    1/2 < γ < 1.                (6)

The required steps for applying the AFA algorithm to the noise cancellation problem are presented in Table 3.

Considering Table 3, two conclusions can be drawn. First, the averaging does not create an additional burden, since the terms w̄i(n) and n̄1e_i(n) can be computed recursively from their past values. Second, the algorithm does not use the covariance matrix, so there is no need for a covariance estimate. This implies low computational complexity and avoids the stability issues related to P(n).

Table 3. AFA algorithm.

Noise estimation:
    n3(n) = Σ_{i=0}^{N} wi(n) n1(n−i),    N – filter order
Error estimation:
    e(n) = s(n) + n2(n) − n3(n)
Coefficients update:
    w̄i(n) = (1/n) Σ_{k=1}^{n} wi(k)
    n̄1e_i(n) = Σ_{k=1}^{n} n1(k−i) e(k)
    wi(n+1) = w̄i(n) + (1/n^γ) n̄1e_i(n),    for 0 ≤ i ≤ N and 1/2 < γ < 1


IV. EXPERIMENTAL RESULTS

In this section we assess the performance of the proposed AFA algorithm for noise cancellation. The NLMS, RLS and AFA algorithms are implemented according to the steps presented in Tables 1–3, with µ = 0.02 for the NLMS algorithm, δ = 0.98 for the RLS algorithm, and γ = 0.5 for the AFA algorithm.
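The AFA updates of Table 3 can be sketched in the same style. This is our own illustrative reading of the table, not the authors' code: the averaged taps w̄i(n) and the accumulated products n̄1e_i(n) are maintained recursively as noted above, and the all-zero initialization of the taps and accumulators is an assumption, since the paper does not specify it:

```python
import numpy as np

def afa_cancel(primary, reference, order=8, gamma=0.5):
    """Adaptive filtering with averaging (AFA) noise cancellation (per Table 3).

    w_bar -- running average of the taps, (1/n) * sum_k w_i(k)
    p_sum -- accumulated input-error products, sum_k n1(k-i) e(k)
    Both are updated recursively, so the averaging adds no extra cost.
    """
    N = order
    w = np.zeros(N + 1)                       # current taps w_i(n)
    w_bar = np.zeros(N + 1)                   # averaged taps
    p_sum = np.zeros(N + 1)                   # accumulated n1(k-i) e(k)
    e = np.zeros(len(primary))
    for idx in range(len(primary)):
        n = idx + 1                           # 1-based time index
        x = np.array([reference[idx - i] if idx - i >= 0 else 0.0
                      for i in range(N + 1)])
        e[idx] = primary[idx] - w @ x         # e(n) = s(n) + n2(n) - n3(n)
        w_bar += (w - w_bar) / n              # recursive running mean of taps
        p_sum += x * e[idx]                   # recursive sum of n1(k-i) e(k)
        w = w_bar + p_sum / n ** gamma        # w_i(n+1) = w̄_i(n) + n̄1e_i(n)/n^γ
    return e
```

Note that, unlike RLS, no covariance matrix P(n) appears anywhere in the loop: the per-sample cost is O(N), as the text above argues.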
First, the original speech (the word "home") is corrupted with office noise (SNR = 6 dB) and the results after noise cancellation are shown in Fig. 2. Second, an experiment with car noise (SNR = 0 dB) is conducted and the results for the different algorithms are presented in Fig. 3 (here the original speech is the word "return").

[Fig. 2a. The signals for the experiment with office noise (speech and noise amplitude vs. number of iterations).]

[Fig. 2b. The NLMS algorithm – office noise (filter taps and ANC output vs. number of iterations).]

[Fig. 2c. The RLS algorithm – office noise (filter taps and ANC output vs. number of iterations).]

[Fig. 2d. The AFA algorithm – office noise (filter taps and ANC output vs. number of iterations).]

[Fig. 3a. The signals for the experiment with car noise (speech and noise amplitude vs. number of iterations).]

[Fig. 3b. The NLMS algorithm – car noise (filter taps and ANC output vs. number of iterations).]
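The figures compare the algorithms visually. As a quantitative complement (our addition; the paper does not report this metric), the benefit of a canceller can be summarized as the SNR improvement of the output over the primary input:

```python
import numpy as np

def snr_db(clean, signal):
    """SNR of `signal` viewed as an estimate of `clean`, in dB."""
    noise = signal - clean
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

def snr_improvement_db(clean, noisy, output):
    """SNR gain (dB) of the canceller output over the primary input."""
    return snr_db(clean, output) - snr_db(clean, noisy)
```

For speech experiments like those above, `clean` would be the original utterance, `noisy` the primary-microphone signal, and `output` the ANC error signal e(n).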


[Fig. 3c. The RLS algorithm – car noise (filter taps and ANC output vs. number of iterations).]

[Fig. 3d. The AFA algorithm – car noise (filter taps and ANC output vs. number of iterations).]

Comparing the results of the different algorithms, it is clear that RLS and AFA outperform the NLMS algorithm. The latter shows a high deviation in its coefficients, which results in poorer performance.

A MatLab package with a graphical user interface (GUI) (see Fig. 4) is available on the WWW from http://divcom.otago.ac.nz/infosci/KEL/CBIIS.html. The program can be used in off-line applications. The signals from the primary and reference microphones have to be recorded beforehand in .wav files. The order of the adaptive filter, the step size and the initial values of the filter taps are controlled via the interface. At the output, the user may view the plotted filter taps and the speech after noise reduction, listen to the different signals used in the process of adaptive noise cancellation, and save the noise-free speech in a .wav file.

[Fig. 4. Graphical user interface for adaptive noise cancellation.]


V. CONCLUSIONS

The main goal of this paper is to investigate the application of an algorithm based on adaptive filtering with averaging to the noise cancellation problem. The main concern is to achieve a high convergence rate in order to meet the requirements imposed by applications where the changes in signal characteristics can be quite rapid. In this respect the obtained results show that the AFA algorithm is very promising. Its main advantages can be summarized as follows:

• high adaptation rate, comparable to that of the RLS algorithm;
• low computational complexity and possible robustness in fixed-point implementations.

The method is applicable to real-world applications of Automated Speech Recognition Systems (ASRS):

• noise suppression in ASR in a car environment;
• noise suppression in ASR in an office environment;
• noise suppression in ASR in a plane.


REFERENCES

[1] W. Harrison, J. Lim, E. Singer, "A new application of adaptive noise cancellation," IEEE Trans. Acoust., Speech, Signal Processing, vol. 34, pp. 21-27, Jan. 1986.
[2] B. Widrow, S. Stearns, Adaptive Signal Processing. Englewood Cliffs, NJ: Prentice-Hall, 1985.
[3] G. Goodwin, K. Sin, Adaptive Filtering, Prediction and Control. Englewood Cliffs, NJ: Prentice-Hall, 1985.
[4] S. Ikeda, A. Sugiyama, "An adaptive noise canceller with low signal distortion for speech codecs," IEEE Trans. Signal Processing, vol. 47, pp. 665-674, Mar. 1999.
[5] S. Haykin, Adaptive Filter Theory. Englewood Cliffs, NJ: Prentice-Hall, 1996.
[6] M. Honig, D. Messerschmitt, Adaptive Filters: Structures, Algorithms, and Applications. Boston: Kluwer Academic Publishers, 1984.
[7] F. Hsu, "Square root Kalman filtering for high-speed data received over fading dispersive HF channels," IEEE Trans. Inform. Theory, vol. 28, pp. 753-763, Sept. 1982.
[8] K. Astrom, G. Goodwin, P. Kumar, Adaptive Control, Filtering, and Signal Processing. New York: Springer-Verlag, 1995.
