
Least-squares spectral analysis

Least-squares spectral analysis (LSSA) is a method of estimating a frequency spectrum based on a least-squares fit of sinusoids to data samples, similar to Fourier analysis.[1][2] Fourier analysis, the most widely used spectral method in science, generally boosts long-periodic noise in long, gapped records; LSSA mitigates such problems.[3] Unlike in Fourier analysis, data need not be equally spaced to use LSSA.

Developed in 1969[4] and 1971,[5] LSSA is also known as the Vaníček method[6] and the Gauss–Vaníček method[7]
after Petr Vaníček, and as the Lomb method[3] or the
Lomb–Scargle periodogram,[2][8] based on the
simplifications first by Nicholas R. Lomb[9] and then by
Jeffrey D. Scargle.[10]

Historical background

[Figure: The result of fitting a set of data points with a quadratic function]

The close connections between Fourier analysis, the
periodogram, and the least-squares fitting of sinusoids have
been known for a long time.[11] However, most developments are restricted to complete data sets of equally
spaced samples. In 1963, Freek J. M. Barning of Mathematisch Centrum, Amsterdam, handled unequally
spaced data by similar techniques,[12] including both a periodogram analysis equivalent to what nowadays
is called the Lomb method and least-squares fitting of selected frequencies of sinusoids determined from
such periodograms, connected by a procedure known today as the matching pursuit with post-backfitting[13] or the orthogonal matching pursuit.[14]

Petr Vaníček, a Canadian geophysicist and geodesist at the University of New Brunswick, also proposed in 1969 the matching-pursuit approach for equally and unequally spaced data, which he called "successive spectral analysis" and the result a "least-squares periodogram".[4] In 1971 he generalized the method to account for any systematic components beyond a simple mean, such as a "predicted linear (quadratic, exponential, ...) secular trend of unknown magnitude", and applied it to a variety of samples.[5]

Vaníček's strictly least-squares method was then simplified in 1976 by Nicholas R. Lomb of the University
of Sydney, who pointed out its close connection to periodogram analysis.[9] Subsequently, the definition of
a periodogram of unequally spaced data was modified and analyzed by Jeffrey D. Scargle of NASA Ames
Research Center,[10] who showed that, with minor changes, it becomes identical to Lomb's least-squares
formula for fitting individual sinusoid frequencies.

Scargle states that his paper "does not introduce a new detection technique, but instead studies the reliability
and efficiency of detection with the most commonly used technique, the periodogram, in the case where the
observation times are unevenly spaced," and further points out regarding least-squares fitting of sinusoids
compared to periodogram analysis, that his paper "establishes, apparently for the first time, that (with the
proposed modifications) these two methods are exactly equivalent."[10]
Press[3] summarizes the development this way:

A completely different method of spectral analysis for unevenly sampled data, one that
mitigates these difficulties and has some other very desirable properties, was developed by
Lomb, based in part on earlier work by Barning and Vanicek, and additionally elaborated by
Scargle.

In 1989, Michael J. Korenberg of Queen's University in Kingston, Ontario, developed the "fast orthogonal
search" method of more quickly finding a near-optimal decomposition of spectra or other problems,[15]
similar to the technique that later became known as the orthogonal matching pursuit.

Development of LSSA and variants

The Vaníček method

In the Vaníček method, a discrete data set is approximated by a weighted sum of sinusoids of progressively determined
frequencies using a standard linear regression or least-squares
fit.[16] The frequencies are chosen using a method similar to
Barning's, but going further in optimizing the choice of each
successive new frequency by picking the frequency that
minimizes the residual after least-squares fitting (equivalent to
the fitting technique now known as matching pursuit with pre-
backfitting[13]). The number of sinusoids must be less than or
equal to the number of data samples (counting sines and cosines
of the same frequency as separate sinusoids).

A data vector Φ is represented as a weighted sum of sinusoidal basis functions, tabulated in a matrix A by evaluating each function at the sample times, with weight vector x:

    Φ ≈ Ax,

where the weights vector x is chosen to minimize the sum of squared errors in approximating Φ. The solution for x is closed-form, using standard linear regression:[17]

    x = (AᵀA)⁻¹AᵀΦ.

[Figure: In linear regression, the observations (red) are assumed to be the result of random deviations (green) from an underlying relationship (blue) between a dependent variable (y) and an independent variable (x). In a normed fitting, such as by the criterion of least squares, the data points (red) are represented by the line of normatively best fit (blue), from which there always remain "residuals" (green).]

Here the matrix A can be based on any set of functions mutually independent (not necessarily orthogonal) when evaluated at the sample times; functions used for spectral
analysis are typically sines and cosines evenly distributed over the frequency range of interest. If we choose
too many frequencies in a too-narrow frequency range, the functions will be insufficiently independent, the
matrix ill-conditioned, and the resulting spectrum meaningless.[17]

When the basis functions in A are orthogonal (that is, not correlated, meaning the columns have zero pair-
wise dot products), the matrix AᵀA is diagonal; when the columns all have the same power (sum of
squares of elements), then that matrix is an identity matrix times a constant, so the inversion is trivial. The
latter is the case when the sample times are equally spaced and sinusoids chosen as sines and cosines
equally spaced in pairs on the frequency interval 0 to a half cycle per sample (spaced by 1/N cycles per
sample, omitting the sine phases at 0 and maximum frequency where they are identically zero). This case is
known as the discrete Fourier transform, slightly rewritten in terms of measurements and coefficients.[17]

    x = AᵀΦ — DFT case for N equally spaced samples and frequencies, within a scalar factor.
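
As an illustration of the closed-form solution above, the following NumPy sketch evaluates a sine/cosine pair at each trial frequency and measures how much of the data variance the fitted pair explains. It is a minimal, per-frequency (out-of-context) version of the procedure; the function and variable names are illustrative rather than taken from any published implementation.

```python
import numpy as np

def least_squares_spectrum(t, phi, freqs):
    """Minimal sketch: at each trial frequency, fit a sine/cosine pair to the
    samples by ordinary least squares and report the fraction of the data
    variance explained by that pair."""
    phi = phi - phi.mean()                      # remove the simple mean component
    power = np.empty(len(freqs))
    for i, f in enumerate(freqs):
        # Design matrix A: cosine and sine columns evaluated at the sample times
        A = np.column_stack([np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t)])
        # Closed-form least-squares solution x = (A'A)^(-1) A' phi
        x, *_ = np.linalg.lstsq(A, phi, rcond=None)
        fit = A @ x
        power[i] = (fit @ fit) / (phi @ phi)    # contribution to the variance
    return power

# Unevenly spaced example: a 0.1-cycle-per-unit sinusoid plus noise
rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 200))
phi = np.sin(2 * np.pi * 0.1 * t) + 0.5 * rng.standard_normal(t.size)
freqs = np.linspace(0.01, 0.5, 500)
spectrum = least_squares_spectrum(t, phi, freqs)
print(freqs[np.argmax(spectrum)])               # peak should lie near 0.1
```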

The Lomb method

Seeking to lower the computational burden of the Vaníček method (no longer an issue today), Lomb proposed in 1976[9] using the above simplification in general, except for pair-wise correlations between sine and cosine bases of the same frequency, since the correlations between pairs of sinusoids are often small, at least when they are not tightly spaced. This formulation is essentially that of the traditional periodogram, but adapted for use with unevenly spaced samples. The vector x is a reasonably good estimate of an underlying spectrum, but since we ignore any correlations, Ax is no longer a good approximation to the signal, and the method is no longer a least-squares method; yet in the literature it continues to be referred to as such.

[Figure: A power spectrum (magnitude-squared) of two sinusoidal basis functions, calculated by the periodogram method]

Rather than just taking dot products of the data with sine and cosine waveforms directly, Scargle modified the standard periodogram formula so as first to find a time delay τ such that this pair of sinusoids would be mutually orthogonal at the sample times, and also adjusted for the potentially unequal powers of these two basis functions, to obtain a better estimate of the power at a frequency.[3][10] This procedure made his modified periodogram method exactly equivalent to Lomb's method. The time delay τ is defined by

    tan(2ωτ) = ( Σⱼ sin 2ωtⱼ ) / ( Σⱼ cos 2ωtⱼ ).

The periodogram at frequency ω is then estimated as:

    Pₓ(ω) = (1/2) { [ Σⱼ Xⱼ cos ω(tⱼ − τ) ]² / Σⱼ cos² ω(tⱼ − τ) + [ Σⱼ Xⱼ sin ω(tⱼ − τ) ]² / Σⱼ sin² ω(tⱼ − τ) },

which, as Scargle reports, has the same statistical distribution as the periodogram in the evenly sampled case.[10]
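
A direct transcription of the two formulas above into NumPy might look as follows; this is a bare-bones sketch (no normalization by the data variance and no oversampling logic), and the function name is only illustrative.

```python
import numpy as np

def lomb_scargle(t, x, omegas):
    """Sketch of the Lomb/Scargle periodogram: for each angular frequency w,
    compute the time delay tau that makes the shifted sine and cosine
    orthogonal at the sample times, then combine the two projections."""
    x = x - x.mean()                            # the standard form assumes zero mean
    P = np.empty(len(omegas))
    for i, w in enumerate(omegas):
        # tan(2*w*tau) = sum(sin 2*w*t) / sum(cos 2*w*t)
        tau = np.arctan2(np.sum(np.sin(2 * w * t)),
                         np.sum(np.cos(2 * w * t))) / (2 * w)
        c = np.cos(w * (t - tau))
        s = np.sin(w * (t - tau))
        P[i] = 0.5 * ((x @ c) ** 2 / (c @ c) + (x @ s) ** 2 / (s @ s))
    return P
```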

At any individual frequency ω, this method gives the same power as does a least-squares fit to sinusoids of that frequency, of the form:

    φ(t) = A sin ωt + B cos ωt.[18]
In practice, it is always difficult to judge whether a given Lomb peak is significant or not, especially when the nature of the noise is unknown; for example, a false-alarm spectral peak in the Lomb periodogram analysis of a noisy periodic signal may result from noise in turbulence data.[19] Fourier methods can also report false spectral peaks when analyzing patched-up or otherwise edited data.[7]

The generalized Lomb–Scargle periodogram

The standard Lomb–Scargle periodogram is valid only for a model with a zero mean. Commonly, this is approximated by subtracting the mean of the data before calculating the periodogram. However, this is an inaccurate assumption when the mean of the model (the fitted sinusoids) is non-zero. The generalized Lomb–Scargle periodogram removes this assumption and explicitly solves for the mean. In this case, the function fitted is:

    φ(t) = A sin ωt + B cos ωt + C.[20]

The generalized Lomb–Scargle periodogram has also been referred to in the literature as a floating mean
periodogram.[21]
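
One way to realize the floating-mean idea, sketched here under the assumption that a plain least-squares fit per frequency is an acceptable stand-in for the closed-form expressions of Zechmeister and Kürster, is to include a constant column in the design matrix so that the offset C is solved for along with the sinusoid amplitudes.

```python
import numpy as np

def generalized_ls_periodogram(t, y, omegas):
    """Sketch: at each frequency fit A*sin(w t) + B*cos(w t) + C by least
    squares; the constant C (the "floating mean") is estimated rather than
    removed by pre-subtracting the data mean."""
    total_var = np.sum((y - y.mean()) ** 2)
    P = np.empty(len(omegas))
    for i, w in enumerate(omegas):
        M = np.column_stack([np.sin(w * t), np.cos(w * t), np.ones_like(t)])
        coef, *_ = np.linalg.lstsq(M, y, rcond=None)
        resid = y - M @ coef
        P[i] = 1.0 - np.sum(resid ** 2) / total_var   # fraction of variance explained
    return P
```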

Korenberg's "fast orthogonal search" method

Michael Korenberg of Queen's University in Kingston, Ontario, developed a method for choosing a sparse
set of components from an over-complete set — such as sinusoidal components for spectral analysis —
called the fast orthogonal search (FOS). Mathematically, FOS uses a slightly modified Cholesky
decomposition in a mean-square error reduction (MSER) process, implemented as a sparse matrix
inversion.[15][22] As with the other LSSA methods, FOS avoids the major shortcoming of discrete Fourier analysis, so it can accurately identify embedded periodicities and excels with unequally spaced data. The fast orthogonal search method has also been applied to other problems, such as nonlinear system identification.

Palmer's Chi-squared method

Palmer has developed a method for finding the best-fit function to any chosen number of harmonics,
allowing more freedom to find non-sinusoidal harmonic functions.[23] His is a fast (FFT-based) technique
for weighted least-squares analysis on arbitrarily spaced data with non-uniform standard errors. Source
code that implements this technique is available.[24] Because data are often not sampled at uniformly
spaced discrete times, this method "grids" the data by sparsely filling a time series array at the sample times.
All intervening grid points receive zero statistical weight, equivalent to having infinite error bars at times
between samples.
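
The gridding step described above can be pictured with a short sketch; the helper below is hypothetical (not Palmer's published code) and simply places each sample, with weight 1/σ², into a uniform time array, leaving zero weight everywhere else.

```python
import numpy as np

def grid_samples(t, y, sigma, dt):
    """Hypothetical sketch of the gridding step: fill a uniform time-series
    array sparsely at the (rounded) sample times; grid points with no sample
    keep zero weight, i.e. an effectively infinite error bar between samples."""
    n = int(np.ceil((t.max() - t.min()) / dt)) + 1
    data = np.zeros(n)
    weight = np.zeros(n)
    idx = np.round((t - t.min()) / dt).astype(int)
    data[idx] = y                       # samples mapping to the same cell overwrite (simplification)
    weight[idx] = 1.0 / sigma ** 2      # statistical weight from the per-sample standard errors
    return data, weight
```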

Applications
The most useful feature of LSSA is that it enables incomplete records to be spectrally analyzed without the need to manipulate the record or to invent otherwise non-existent data.

Magnitudes in the LSSA spectrum depict the contribution of a frequency or period to the variance of the time series.[4] Spectral magnitudes defined in this way enable a straightforward significance-level regime for the output.[25] Alternatively, spectral magnitudes in the Vaníček spectrum can also be expressed in dB.[26] Note that spectral magnitudes in the Vaníček spectrum follow the β-distribution.[27]
Inverse transformation of Vaníček's LSSA is possible, as is most
easily seen by writing the forward transform as a matrix; the
matrix inverse (when the matrix is not singular) or pseudo-
inverse will then be an inverse transformation; the inverse will
exactly match the original data if the chosen sinusoids are
mutually independent at the sample points and their number is
equal to the number of data points.[17] No such inverse
procedure is known for the periodogram method.
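
As a sketch of this inverse transformation (assuming the forward transform is expressed with the matrix A of the previous section), reconstruction amounts to multiplying the recovered weights back by A:

```python
import numpy as np

def lssa_round_trip(A, phi):
    """Sketch: forward transform via the pseudo-inverse, inverse transform by
    multiplying back with A. When the chosen sinusoids are mutually independent
    at the sample points and their number equals the number of data points,
    A is square and invertible and phi is recovered exactly (up to round-off)."""
    x = np.linalg.pinv(A) @ phi     # forward: least-squares weights
    phi_hat = A @ x                 # inverse: back to the data domain
    return x, phi_hat
```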

Implementation

The LSSA can be implemented in less than a page of MATLAB code.[28] In essence:[16]

"to compute the least-squares spectrum we must compute m spectral values ... which involves performing the least-squares approximation m times, each time to get [the spectral power] for a different frequency"

[Figure: Beta distribution for different values of its parameters]

That is, for each frequency in a desired set of frequencies, sine and cosine functions are evaluated at the times corresponding to the data samples, and dot products of the data vector with the sinusoid vectors are taken and appropriately normalized; following the Lomb/Scargle periodogram method, a time shift is calculated for each frequency to orthogonalize the sine and cosine components before the dot product;[17] finally, a power is computed from those two amplitude components. This same process implements a discrete Fourier transform when the data are uniformly spaced in time and the frequencies chosen correspond to integer numbers of cycles over the finite data record.

This method treats each sinusoidal component independently, or out of context, even though they may not be orthogonal at the data points; it is Vaníček's original method. In addition, it is possible to perform a full
simultaneous or in-context least-squares fit by solving a matrix equation and partitioning the total data
variance between the specified sinusoid frequencies.[17] Such a matrix least-squares solution is natively
available in MATLAB as the backslash operator.[29]
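
A NumPy analogue of that single matrix solve (a minimal sketch; np.linalg.lstsq plays the role of the MATLAB backslash here) assembles all sine and cosine columns into one design matrix and fits them simultaneously:

```python
import numpy as np

def simultaneous_lssa(t, phi, freqs):
    """Sketch of the in-context fit: one design matrix holding the sine and
    cosine columns for every chosen frequency, solved in a single
    least-squares step."""
    cols = []
    for f in freqs:
        cols.append(np.cos(2 * np.pi * f * t))
        cols.append(np.sin(2 * np.pi * f * t))
    A = np.column_stack(cols)                    # 2*len(freqs) must not exceed len(t)
    x, *_ = np.linalg.lstsq(A, phi, rcond=None)  # simultaneous solution for all weights
    return x.reshape(-1, 2)                      # (cosine, sine) amplitudes per frequency
```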

Furthermore, the simultaneous or in-context method, as opposed to the independent or out-of-context version (as well as the periodogram version due to Lomb), cannot fit more components (sines and cosines)
than there are data samples, so that:[17]

"...serious repercussions can also arise if the selected frequencies result in some of the Fourier
components (trig functions) becoming nearly linearly dependent with each other, thereby
producing an ill-conditioned or near singular N. To avoid such ill conditioning it becomes
necessary to either select a different set of frequencies to be estimated (e.g., equally spaced
frequencies) or simply neglect the correlations in N (i.e., the off-diagonal blocks) and estimate
the inverse least squares transform separately for the individual frequencies..."

Lomb's periodogram method, on the other hand, can use an arbitrarily high number of, or density of,
frequency components, as in a standard periodogram; that is, the frequency domain can be over-sampled by
an arbitrary factor.[3] However, as mentioned above, Lomb's simplification, by departing from the least-squares criterion, exposes the technique to serious sources of error, even producing false spectral peaks.[19]

In Fourier analysis, such as the Fourier transform and discrete Fourier transform, the sinusoids fitted to data
are all mutually orthogonal, so there is no distinction between the simple out-of-context dot-product-based
projection onto basis functions versus an in-context simultaneous least-squares fit; that is, no matrix
inversion is required to least-squares partition the variance between orthogonal sinusoids of different
frequencies.[30] In the past, Fourier analysis was for many the method of choice, thanks to its processing-efficient fast Fourier transform implementation, when complete data records with equally spaced samples were available; the Fourier family of techniques was also used to analyze gapped records, which, however, required manipulating and even inventing non-existent data just to be able to run a Fourier-based algorithm.

See also
Non-uniform discrete Fourier transform
Orthogonal functions
SigSpec
Sinusoidal model
Spectral density
Spectral density estimation, for competing alternatives

References
1. Cafer Ibanoglu (2000). Variable Stars As Essential Astrophysical Tools (https://books.googl
e.com/books?id=QzGbOiZ3OnkC&q=vanicek+spectral+sinusoids&pg=PA269). Springer.
ISBN 0-7923-6084-2.
2. D. Scott Birney; David Oesper; Guillermo Gonzalez (2006). Observational Astronomy (http
s://books.google.com/books?id=cc9L8QWcZWsC&q=Lomb-Scargle-periodogram&pg=RA3-
PA263). Cambridge University Press. ISBN 0-521-85370-2.
3. Press (2007). Numerical Recipes (https://books.google.com/books?id=9GhDHTLzFDEC&q
=%22spectral+analysis%22+%22vanicek%22+inauthor:press&pg=PA685) (3rd ed.).
Cambridge University Press. ISBN 978-0-521-88068-8.
4. P. Vaníček (1 August 1969). "Approximate Spectral Analysis by Least-squares Fit" (https://art
icles.adsabs.harvard.edu/pdf/1969Ap%26SS...4..387V.pdf) (PDF). Astrophysics and Space
Science. 4 (4): 387–391. Bibcode:1969Ap&SS...4..387V (https://ui.adsabs.harvard.edu/abs/
1969Ap&SS...4..387V). doi:10.1007/BF00651344 (https://doi.org/10.1007%2FBF00651344).
OCLC 5654872875 (https://www.worldcat.org/oclc/5654872875). S2CID 124921449 (https://
api.semanticscholar.org/CorpusID:124921449).
5. P. Vaníček (1 July 1971). "Further development and properties of the spectral analysis by
least-squares fit" (https://articles.adsabs.harvard.edu/pdf/1971Ap%26SS..12...10V.pdf)
(PDF). Astrophysics and Space Science. 12 (1): 10–33. Bibcode:1971Ap&SS..12...10V (http
s://ui.adsabs.harvard.edu/abs/1971Ap&SS..12...10V). doi:10.1007/BF00656134 (https://doi.
org/10.1007%2FBF00656134). S2CID 109404359 (https://api.semanticscholar.org/CorpusI
D:109404359).
6. J. Taylor; S. Hamilton (20 March 1972). "Some tests of the Vaníček Method of spectral
analysis". Astrophysics and Space Science. 17 (2): 357–367.
Bibcode:1972Ap&SS..17..357T (https://ui.adsabs.harvard.edu/abs/1972Ap&SS..17..357T).
doi:10.1007/BF00642907 (https://doi.org/10.1007%2FBF00642907). S2CID 123569059 (htt
ps://api.semanticscholar.org/CorpusID:123569059).
7. M. Omerbashich (26 June 2006). "Gauss-Vanicek spectral analysis of the Sepkoski
compendium: no new life cycles". Computing in Science & Engineering. 8 (4): 26–30.
arXiv:math-ph/0608014 (https://arxiv.org/abs/math-ph/0608014).
Bibcode:2006CSE.....8d..26O (https://ui.adsabs.harvard.edu/abs/2006CSE.....8d..26O).
doi:10.1109/MCSE.2006.68 (https://doi.org/10.1109%2FMCSE.2006.68).
8. Hans P. A. Van Dongen (1999). "Searching for Biological Rhythms: Peak Detection in the
Periodogram of Unequally Spaced Data". Journal of Biological Rhythms. 14 (6): 617–620.
doi:10.1177/074873099129000984 (https://doi.org/10.1177%2F074873099129000984).
PMID 10643760 (https://pubmed.ncbi.nlm.nih.gov/10643760). S2CID 14886901 (https://api.s
emanticscholar.org/CorpusID:14886901).
9. Lomb, N. R. (1976). "Least-squares frequency analysis of unequally spaced data".
Astrophysics and Space Science. 39 (2): 447–462. Bibcode:1976Ap&SS..39..447L (https://u
i.adsabs.harvard.edu/abs/1976Ap&SS..39..447L). doi:10.1007/BF00648343 (https://doi.org/
10.1007%2FBF00648343). S2CID 2671466 (https://api.semanticscholar.org/CorpusID:2671
466).
10. Scargle, J. D. (1982). "Studies in astronomical time series analysis. II - Statistical aspects of
spectral analysis of unevenly spaced data". Astrophysical Journal. 263: 835.
Bibcode:1982ApJ...263..835S (https://ui.adsabs.harvard.edu/abs/1982ApJ...263..835S).
doi:10.1086/160554 (https://doi.org/10.1086%2F160554).
11. David Brunt (1931). The Combination of Observations (2nd ed.). Cambridge University
Press.
12. Barning, F. J. M. (1963). "The numerical analysis of the light-curve of 12 Lacertae". Bulletin
of the Astronomical Institutes of the Netherlands. 17: 22. Bibcode:1963BAN....17...22B (http
s://ui.adsabs.harvard.edu/abs/1963BAN....17...22B).
13. Pascal Vincent; Yoshua Bengio (2002). "Kernel Matching Pursuit" (http://www.iro.umontreal.
ca/~vincentp/Publications/kmp_mlj.pdf) (PDF). Machine Learning. 48: 165–187.
doi:10.1023/A:1013955821559 (https://doi.org/10.1023%2FA%3A1013955821559).
14. Y. C. Pati, R. Rezaiifar, and P. S. Krishnaprasad, "Orthogonal matching pursuit: Recursive
function approximation with applications to wavelet decomposition," in Proc. 27th Asilomar
Conference on Signals, Systems and Computers, A. Singh, ed., Los Alamitos, CA, USA,
IEEE Computer Society Press, 1993
15. Korenberg, M. J. (1989). "A robust orthogonal algorithm for system identification and time-
series analysis". Biological Cybernetics. 60 (4): 267–276. doi:10.1007/BF00204124 (https://
doi.org/10.1007%2FBF00204124). PMID 2706281 (https://pubmed.ncbi.nlm.nih.gov/270628
1). S2CID 11712196 (https://api.semanticscholar.org/CorpusID:11712196).
16. Wells, D.E., P. Vaníček, S. Pagiatakis, 1985. Least-squares spectral analysis revisited.
Department of Surveying Engineering Technical Report 84, University of New Brunswick,
Fredericton, 68 pages, Available at [1] (http://www2.unb.ca/gge/Pubs/TR84.pdf).
17. Craymer, M.R., The Least Squares Spectrum, Its Inverse Transform and Autocorrelation
Function: Theory and Some Applications in Geodesy (https://tspace.library.utoronto.ca/handl
e/1807/12263), Ph.D. Dissertation, University of Toronto, Canada (1998).
18. William J. Emery; Richard E. Thomson (2001). Data Analysis Methods in Physical
Oceanography (https://books.google.com/books?id=gYc4fp_ixmwC&q=vanicek+least-squar
es+spectral-analysis+lomb&pg=PA458). Elsevier. ISBN 0-444-50756-6.
19. Zhou, W.-X.; Sornette, D. (October 2001). "Statistical significance of periodicity and log-
periodicity with heavy-tailed correlated noise". International Journal of Modern Physics C. 13
(2): 137–169. arXiv:cond-mat/0110445 (https://arxiv.org/abs/cond-mat/0110445).
Bibcode:2002IJMPC..13..137Z (https://ui.adsabs.harvard.edu/abs/2002IJMPC..13..137Z).
doi:10.1142/S0129183102003024 (https://doi.org/10.1142%2FS0129183102003024).
S2CID 8256563 (https://api.semanticscholar.org/CorpusID:8256563).
20. M. Zechmeister; M. Kürster (March 2009). "The generalised Lomb–Scargle periodogram. A
new formalism for the floating-mean and Keplerian periodograms". Astronomy &
Astrophysics. 496 (2): 577–584. arXiv:0901.2573 (https://arxiv.org/abs/0901.2573).
Bibcode:2009A&A...496..577Z (https://ui.adsabs.harvard.edu/abs/2009A&A...496..577Z).
doi:10.1051/0004-6361:200811296 (https://doi.org/10.1051%2F0004-6361%3A200811296).
S2CID 10408194 (https://api.semanticscholar.org/CorpusID:10408194).
21. Andrew Cumming; Geoffrey W. Marcy; R. Paul Butler (December 1999). "The Lick Planet
Search: Detectability and Mass Thresholds". The Astrophysical Journal. 526 (2): 890–915.
arXiv:astro-ph/9906466 (https://arxiv.org/abs/astro-ph/9906466).
Bibcode:1999ApJ...526..890C (https://ui.adsabs.harvard.edu/abs/1999ApJ...526..890C).
doi:10.1086/308020 (https://doi.org/10.1086%2F308020). S2CID 12560512 (https://api.sem
anticscholar.org/CorpusID:12560512).
22. Korenberg, Michael J.; Brenan, Colin J. H.; Hunter, Ian W. (1997). "Raman Spectral
Estimation via Fast Orthogonal Search". The Analyst. 122 (9): 879–882.
Bibcode:1997Ana...122..879K (https://ui.adsabs.harvard.edu/abs/1997Ana...122..879K).
doi:10.1039/a700902j (https://doi.org/10.1039%2Fa700902j).
23. Palmer, David M. (2009). "A Fast Chi-squared Technique For Period Search of Irregularly
Sampled Data". The Astrophysical Journal. 695 (1): 496–502. arXiv:0901.1913 (https://arxiv.
org/abs/0901.1913). Bibcode:2009ApJ...695..496P (https://ui.adsabs.harvard.edu/abs/2009
ApJ...695..496P). doi:10.1088/0004-637X/695/1/496 (https://doi.org/10.1088%2F0004-637
X%2F695%2F1%2F496). S2CID 5991300 (https://api.semanticscholar.org/CorpusID:59913
00).
24. "David Palmer: The Fast Chi-squared Period Search" (http://public.lanl.gov/palmer/fastchi.ht
ml).
25. Beard, A.G., Williams, P.J.S., Mitchell, N.J. & Muller, H.G. A special climatology of planetary
waves and tidal variability, J Atm. Solar-Ter. Phys. 63 (09), p.801–811 (2001).
26. Pagiatakis, S. Stochastic significance of peaks in the least-squares spectrum, J of Geodesy
73, p.67-78 (1999).
27. Steeves, R.R. A statistical test for significance of peaks in the least squares spectrum,
Collected Papers of the Geodetic Survey, Department of Energy, Mines and Resources,
Surveys and Mapping, Ottawa, Canada, p.149-166 (1981)
28. Richard A. Muller; Gordon J. MacDonald (2000). Ice Ages and Astronomical Causes: Data,
spectral analysis and mechanisms (1st ed.). Springer Berlin Heidelberg.
Bibcode:2000iaac.book.....M (https://ui.adsabs.harvard.edu/abs/2000iaac.book.....M).
ISBN 978-3-540-43779-6. OL 20645181M (https://openlibrary.org/books/OL20645181M).
Wikidata Q111312009.
29. Timothy A. Davis; Kermit Sigmon (2005). MATLAB Primer (https://books.google.com/books?i
d=MXWypqcHECkC&q=matlab+least-squares+backslash&pg=PA12). CRC Press. ISBN 1-
58488-523-8.
30. Darrell Williamson (1999). Discrete-Time Signal Processing: An Algebraic Approach (https://
books.google.com/books?id=JCKAirWQdqkC&q=fourier-transform+orthogonal+least-square
s&pg=PA314). Springer. ISBN 1-85233-161-5.

External links
LSSA package freeware download (http://web.archive.org/web/20220818070617id_/http://w
ww2.unb.ca/gge/Research/GRL/LSSA/sourceCode.html), FORTRAN, Vaníček's least-
squares spectral analysis method, from the University of New Brunswick.
LSWAVE package freeware download (https://geodesy.noaa.gov/gps-toolbox/LSWAVE.ht
m), MATLAB, includes the Vaníček's least-squares spectral analysis method, from the U.S.
National Geodetic Survey.
LSSA software freeware download (ftp://ftp.geod.nrcan.gc.ca/pub/GSD/craymer/software/lss
a/) (via ftp), FORTRAN, Vaníček's method, from the Natural Resources Canada.
