Frequently Asked Questions On Wavelets: Q1. What Is The Wavelet Transform?
Naoki Saito
Department of Mathematics
University of California
Davis, CA 95616 USA
email:[email protected]
January 6, 2018
frequency subband (subband containing the DC component) is spanned by a set of translated versions of
another single elementary waveform called “father wavelet” (or “scaling function”).
The frequency characteristics of the father wavelet essentially determine the quality of the wavelet
transform. If we were to partition the frequency axis sharply using characteristic (or boxcar) functions,
we would end up with the so-called Shannon (or Littlewood-Paley) wavelets: the mother wavelet is the
difference of two sinc functions, and the corresponding father wavelet is simply the sinc function. Clearly,
however, we cannot have a finite-length filter in the time domain in this case. The other extreme is the Haar
wavelet, the simplest and oldest wavelet, discovered in 1910. In this case, the father wavelet is just
a boxcar function, and the mother wavelet is defined as
g(x) =  1  for 0 ≤ x < 1/2,
       −1  for 1/2 ≤ x < 1,
        0  otherwise.
This is simply the difference of two boxcar functions in time. Although the Haar wavelet gives rise to the
shortest filter length in the time domain, it partitions the frequency axis quite badly.
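As a concrete illustration (my own sketch, not part of the original FAQ), the Haar mother wavelet defined above can be written as a short function:

```python
def haar_mother(x):
    """Haar mother wavelet g(x): +1 on [0, 1/2), -1 on [1/2, 1), 0 elsewhere."""
    if 0 <= x < 0.5:
        return 1.0
    elif 0.5 <= x < 1.0:
        return -1.0
    return 0.0

# g integrates to zero over [0, 1): the +1 and -1 halves cancel,
# which is the basic "oscillation" property of a mother wavelet.
print(haar_mother(0.25), haar_mother(0.75), haar_mother(2.0))  # 1.0 -1.0 0.0
```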
The reconstruction (or synthesis) process is also very simple: starting from the lowest frequency components
(or coarsest scale coefficients) and the second lowest frequency components, the adjoint operators H* and
G* are applied respectively, and the results are added to obtain the next finer scale coefficients. This process
is iterated to reconstruct the original signal. The computational complexity of both the decomposition and
the reconstruction is O(N), where N is the number of time samples.
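To make the analysis/synthesis iteration concrete, here is a minimal sketch (my own illustration, using the orthonormal Haar filters for H and G) of one decomposition level and its adjoint reconstruction. Each level costs O(N), and because the coefficient arrays halve in length at every level, the whole cascade stays O(N):

```python
import math

def haar_analysis(x):
    """One decomposition level: pairwise averages (H) and differences (G)."""
    s = math.sqrt(2.0)
    low  = [(x[2*i] + x[2*i+1]) / s for i in range(len(x) // 2)]  # coarse scale
    high = [(x[2*i] - x[2*i+1]) / s for i in range(len(x) // 2)]  # detail
    return low, high

def haar_synthesis(low, high):
    """Apply the adjoints H* and G* and sum: undoes haar_analysis."""
    s = math.sqrt(2.0)
    x = []
    for l, h in zip(low, high):
        x.append((l + h) / s)  # even-indexed sample
        x.append((l - h) / s)  # odd-indexed sample
    return x

x = [4.0, 2.0, 5.0, 5.0]
low, high = haar_analysis(x)
xr = haar_synthesis(low, high)
assert all(abs(a - b) < 1e-12 for a, b in zip(x, xr))  # perfect reconstruction
```

Iterating haar_analysis on the low-pass output yields the full wavelet transform; running haar_synthesis from the coarsest level back up reconstructs the signal, exactly as described above.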
Q4. What are the applications of wavelets?
Wavelets and their relatives have generated a great deal of interest in diverse fields ranging from astronomy
to geology to biology, as well as statistics and computer science. In each of these fields, wavelets are applied for
• Data Compression
• Noise Removal
One of the most successful applications so far, however, is data compression. In fact, the new image com-
pression standard called JPEG2000 is fully based on wavelets. See the JPEG (Joint Photographic Experts
Group) official website listed below for details. I would strongly encourage the reader to read the excellent
and somewhat advanced review article by Donoho, Vetterli, DeVore, and Daubechies [5], which explains the
deep relationship between data compression and harmonic analysis.
It is beyond the scope of this FAQ to describe each application. Instead, I would like to refer the reader
to the SIAM News article written by Barry Cipra, attached in the Appendix.
* JPEG Official Website:
http://www.jpeg.org
This page provides a wealth of information about both the JPEG standard (based on DCT-II) and the
JPEG2000 standard (based on wavelets) in detail.
From Journals
There are two journals dedicated to wavelet-related research:
One often encounters wavelet-related work in the following journals (just a subset):
* SIAM Journ. on Numerical Analysis,
* SIAM Journ. on Scientific Computing,
* Signal Processing
References
[1] J. M. Ash, ed., Studies in Harmonic Analysis, vol. 13 of MAA Studies in Mathematics, Math. Assoc.
Amer., 1976.
[2] R. N. Bracewell, The Fourier Transform and Its Applications, McGraw-Hill, Inc., second, revised ed.,
1986.
[3] W. L. Briggs and V. E. Henson, The DFT: An Owner's Manual for the Discrete Fourier Transform,
SIAM, Philadelphia, PA, 1995.
[4] I. Daubechies, Ten Lectures on Wavelets, vol. 61 of CBMS-NSF Regional Conference Series in
Applied Mathematics, SIAM, Philadelphia, PA, 1992.
[5] D. L. Donoho, M. Vetterli, R. A. DeVore, and I. Daubechies, Data compression and harmonic
analysis, IEEE Trans. Inform. Theory, 44 (1998), pp. 2435–2476. Invited paper.
[6] H. Dym and H. P. McKean, Fourier Series and Integrals, Academic Press, 1972.
[7] G. B. Folland, Fourier Analysis and Its Applications, Amer. Math. Soc., Providence, RI, 1992.
Republished by AMS, 2009.
[8] M. Holschneider, Wavelets: An Analysis Tool, Oxford Univ. Press, 1995.
[9] S. Jaffard, Y. Meyer, and R. D. Ryan, Wavelets: Tools for Science & Technology, SIAM,
Philadelphia, PA, 2001.
[11] S. Mallat, Multifrequency channel decompositions of images and wavelet models, IEEE Trans.
Acoust., Speech, Signal Process., 37 (1989), pp. 2091–2110.
[12] S. Mallat, A Wavelet Tour of Signal Processing, Academic Press, Burlington, MA, third ed., 2009.
[13] Y. Meyer, Book review of "An Introduction to Wavelets" by C. K. Chui, Academic Press, NY, 1992,
and "Ten Lectures on Wavelets" by I. Daubechies, SIAM, 1992, Bull. Amer. Math. Soc., 28 (1993),
pp. 350–360.
[14] Y. Meyer, Wavelets and Operators, vol. 37 of Cambridge Studies in Advanced Mathematics,
Cambridge Univ. Press, New York, 1993. Translated by D. H. Salinger.
[15] Y. Meyer, Wavelets: Their past and their future, in Progress in Wavelet Analysis and Applications,
Y. Meyer and S. Roques, eds., Editions Frontieres, B.P. 33, 91192 Gif-sur-Yvette Cedex, France, 1993,
pp. 9–18.
[16] Y. Meyer, Oscillating Patterns in Image Processing and Nonlinear Evolution Equations, vol. 22 of
University Lecture Series, Amer. Math. Soc., Providence, RI, 2001.
[17] Y. Meyer and R. Coifman, Wavelets: Calderón-Zygmund and multilinear operators, vol. 48 of
Cambridge Studies in Advanced Mathematics, Cambridge Univ. Press, New York, 1997.
[18] M. A. Pinsky, Introduction to Fourier Analysis and Wavelets, Amer. Math. Soc., Providence, RI,
2002. Republished by AMS, 2009.
[19] O. Rioul and M. Vetterli, Wavelets and signal processing, IEEE SP Magazine, 8 (1991), pp. 14–38.
[20] E. M. Stein and G. Weiss, Introduction to Fourier Analysis on Euclidean Spaces, Princeton Univ.
Press, 1971.
[21] M. V. Wickerhauser, Adapted Wavelet Analysis from Theory to Software, A K Peters, Ltd.,
Wellesley, MA, 1994.
[23] A. Zygmund, Trigonometric Series, Cambridge Mathematical Library, Cambridge Univ. Press,
third ed., 2003. Volumes I & II combined.
Appendix: Barry Cipra’s Article
==========================================
Reprinted from SIAM NEWS
Volume 26-7, November 1993
(C) 1993 by Society for Industrial and Applied Mathematics
All rights reserved.
==========================================
Daubechies likens a wavelet transform to a musical score, which tells the musician which note to play at
what time. One of the attractive features of wavelets is their “zoom in” property: They are designed to deal
with fine details that affect only part of an image or signal–something that always leaves Fourier analysis
with a case of the jitters. That is in part why wavelets seem to be so good at data compression for things like
fingerprint images.
Megabytes of Ink
One problem with fingerprints is simply that there are so many of them. The FBI has approximately 200
million fingerprint cards. Some come from employment and security checks, but 114.5 million cards belong
to some 29 million criminals (bad guys tend to get fingerprinted more than once). According to Peter
Higgins, deputy assistant director of the FBI’s Criminal Justice Information Services division, the files
occupy an acre of office space.
That’s a lot of black ink.
By digitizing the files and storing them electronically, "we hope to put [them] in something that would
fit in a 20 x 20-foot room," Higgins says. Putting things in electronic form should speed up the submission
process. (Currently, the FBI receives anywhere from 30,000 to 40,000 fingerprint identification requests
every day, mostly through the mail. Approximately half pertain to criminal arrests and half to employment
checks.) It’s also hard to imagine doing automated fingerprint identification any other way.
But the digitized images have to be of high quality. Faxed fingerprints may be OK for post office
reproductions, but not for permanent records. And that’s another problem: At a resolution of 500 pixels per
inch with 256 levels of gray-scale, a single inch-and-a-half-square fingerprint block takes up approximately
600 kilobytes, and an entire card weighs in at a hefty 10 megabytes of data. Multiplied by 200 million cards,
that's–well, quite a bit. Moreover, 10 megabytes is a nontrivial amount of data to transmit. At a standard
modem rate of 9600 bits per second with 20% overhead, a single card would tie up a phone line for nearly three hours.
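The arithmetic behind that estimate is easy to check (my own back-of-the-envelope sketch; the 20% protocol overhead figure is an assumption on my part):

```python
card_bytes = 10 * 10**6   # roughly 10 megabytes per fingerprint card
modem_bps = 9600          # the modem rate cited in the article

raw_hours = card_bytes * 8 / modem_bps / 3600
print(f"{raw_hours:.1f} hours")   # 2.3 hours with no protocol overhead

# Assuming ~20% protocol overhead (my assumption, not a figure from the
# article), the transmission time approaches the "nearly three hours" quoted:
print(f"{raw_hours * 1.2:.1f} hours")
```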
That’s where the wavelet compression standard comes in. Developed by Tom Hopper at the FBI and
Jonathan Bradley and Christopher Brislawn at Los Alamos National Laboratory, the standard is based on
scalar quantization of a 64-subband discrete wavelet transform. Compression takes place in the quantization
step, where the coefficients of the transform within each subband are, in effect, assigned to integer-valued
“bins.” (The information is further compressed by Huffman coding, which uses strings of variable length to
represent data.)
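As an illustration of the quantization step (a generic sketch of uniform scalar quantization, not the exact quantizer specified by the FBI standard), coefficients within a subband are mapped to integer bin indices; dequantization recovers only the bin centers, and that loss of within-bin detail is where the compression comes from:

```python
def quantize(coeffs, bin_width):
    """Uniform scalar quantization: map each coefficient to an integer bin index."""
    return [round(c / bin_width) for c in coeffs]

def dequantize(indices, bin_width):
    """Reconstruct the bin centers; detail finer than bin_width is lost."""
    return [i * bin_width for i in indices]

coeffs = [0.12, -3.4, 7.9, 0.04]
idx = quantize(coeffs, bin_width=1.0)   # small coefficients collapse to bin 0
print(idx)                              # [0, -3, 8, 0]
print(dequantize(idx, 1.0))             # [0.0, -3.0, 8.0, 0.0]
```

The integer indices are then entropy-coded (e.g., by Huffman coding, as the article notes), which is cheap precisely because many subband coefficients land in the same few bins.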
The wavelet/scalar quantization standard is not locked in to a particular wavelet basis. A digitized
fingerprint file will contain not only the compressed image, but also tables specifying the wavelet transform,
scalar quantizer, and Huffman code. “We allow for a number of different encoders,” Brislawn explains. So
far, however, only one system, using a basis of bi-orthogonal wavelets constructed by Cohen, Daubechies,
and Feauveau and reported in a 1990 paper, has been approved by the FBI. The system gives reconstructions
that are hard to distinguish from originals at compression ratios of about 20:1.
Not all the applications under consideration have immediate market potential. At Aware, for example,
John Weiss and Sam Qian have been investigating the use of wavelets in the numerical solution of partial
differential equations. “One of the things that they seem to be very useful for are situations where you have
strong gradients,” Weiss explains. That includes problems involving shock waves and turbulence, he adds.
“What we’re trying to do is see if you can make the wavelet method as universal as, say, the finite
element method but obtain a better rate of convergence.” Weiss, who co-organized a minisymposium on
wavelet solutions to PDEs at the SIAM annual meeting last July, envisions the incorporation of wavelet-
based solvers into a CAD/CAM package for design engineers who don’t have access to supercomputers: “If
you can offer something which is reliable and fairly simple to use, I think there is a market for that.”
At Los Alamos, Bradley and Brislawn are also applying wavelet technology to the solutions of partial
differential equations, but in a completely different way: They are using wavelets not to solve PDEs, but to
help manage the volumes of data that supercomputers spew out when they run through a global climate or
ocean simulation. A typical simulation generates on the order of a terabyte of uncompressed data. Much
like their wavelet/scalar quantization approach to fingerprint image compression, Bradley and Brislawn have
developed a wavelet/vector quantization method for the multidimensional data sets of climate and ocean
models. In this case the purpose is to give researchers a rough and ready look at what the supercomputer is
trying to tell them.
In one of the more pleasing applications of the new mathematics, Ronald Coifman and colleagues at
Yale University used wavelets to clean up an old recording of Brahms playing his First Hungarian Dance on
the piano. The “original” recording–actually a re-recording of a radio broadcast (complete with static) of a
78 record copied from a partially melted wax cylinder–was unrecognizable as music. But using a technique
he calls adapted waveform analysis, Coifman, who is a co-founder of Fast Mathematical Algorithms &
Hardware, managed to strip out much of the noise. The remaining sound is good enough to give a sense of
Brahms’s style of play.
Cleaning Up Statistics
Music is not the only domain in which noise is a problem. Statisticians have long grappled with the problem
of noisy data. It appears that wavelets may hold some of the answers. At least that’s the view expressed by
David Donoho, a statistician at Stanford University who has led the way in applying wavelet techniques in
the theory of statistics. He and colleague Iain Johnstone have developed a “wavelet shrinkage” technique
that works wonders on a variety of data sets.
The technique starts with the application of a wavelet transform to the noisy signal or data set. It
then "shrinks" each of the wavelet coefficients toward zero, using a soft-threshold nonlinearity, so that
suitably small coefficients are set precisely to zero. Finally, the altered coefficients are inverted to produce
a "denoised" signal. Donoho and Johnstone, with co-authors Gérard Kerkyacharian and Dominique Picard
of the Université de Paris VII, have shown that denoising by wavelet shrinkage is either optimal or nearly
so for a number of technical criteria. For example, Donoho has proved that the reconstructed signal is, with
high probability, at least as smooth as the original (true) signal, for a wide variety of smoothness measures.
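The soft-threshold nonlinearity at the heart of wavelet shrinkage has a one-line form (my own sketch; the threshold value used here is arbitrary, whereas Donoho and Johnstone derive principled choices such as the universal threshold, sqrt(2 log N) times the noise level):

```python
import math

def soft_threshold(c, t):
    """Shrink c toward zero by t; anything with |c| <= t becomes exactly zero."""
    return math.copysign(max(abs(c) - t, 0.0), c)

coeffs = [5.0, -0.5, 0.75, -4.0]            # hypothetical wavelet coefficients
shrunk = [soft_threshold(c, 1.0) for c in coeffs]
print(shrunk)   # [4.0, -0.0, 0.0, -3.0]: small coefficients are zeroed
```

Applying this to every wavelet coefficient and then inverting the transform yields the denoised signal; because noise spreads its energy thinly across many small coefficients while the true signal concentrates in a few large ones, the thresholding removes mostly noise.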
Whether wavelets will have an impact in specific areas of applied statistics–say clinical trials in medical
research–remains to be seen. But there’s no question they’ve changed the landscape of theoretical statistics,
says Donoho: “As soon as we were exposed to wavelets, we made the equivalent of about ten years’ progress
in months.” One change is likely to be in the questions that are asked about such problems as smoothness.
“Part of the reason we were looking at these problems was that we didn’t know how to do them,” Donoho
says. “Now that we know how to do them, I think we’re in a better position to say what are the right
questions for statistical theory to focus on.”
(Barry A. Cipra is a mathematician and writer based in Northfield, Minnesota.)