Least Squares Fitting - From Wolfram MathWorld
Last updated: Thu Aug 18 2016
Created, developed, and nurtured by Eric Weisstein at Wolfram Research

A mathematical procedure for finding the best-fitting curve to a given set
of points by minimizing the sum of the squares of the offsets ("the
residuals") of the points from the curve. The sum of the squares of the
offsets is used instead of the offset absolute values because this allows
the residuals to be treated as a continuous differentiable quantity.
However, because squares of the offsets are used, outlying points can
have a disproportionate effect on the fit, a property which may or may not
be desirable depending on the problem at hand.
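The outlier sensitivity of squared offsets can be seen numerically. The sketch below (plain Python, using the standard closed-form slope and intercept formulas; the data values are invented for illustration) fits a line to points lying exactly on y = 2x, then refits after appending a single outlier:

```python
# Illustration: a single outlier can pull a least-squares line far from
# the trend of the remaining points, because its residual enters squared.

def fit_line(xs, ys):
    """Closed-form least-squares intercept a and slope b for y = a + b*x."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    d = n * sxx - sx * sx
    a = (sy * sxx - sx * sxy) / d
    b = (n * sxy - sx * sy) / d
    return a, b

xs = [0, 1, 2, 3, 4]
ys = [2 * x for x in xs]          # points exactly on y = 2x
a1, b1 = fit_line(xs, ys)         # recovers a = 0, b = 2 exactly

# Append one outlier far above the trend and refit: the slope more
# than doubles, dominated by the one bad point.
a2, b2 = fit_line(xs + [5], ys + [30])
```

A fit minimizing absolute offsets would be far less affected, at the cost of a non-differentiable objective.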
The linear least squares fitting technique is the simplest and most
commonly applied form of linear regression and provides a solution to the
problem of finding the best fitting straight line through a set of points. In
fact, if the functional relationship between the two quantities being
graphed is known to within additive or multiplicative constants, it is
common practice to transform the data in such a way that the resulting
line is a straight line, say by plotting T vs. sqrt(L) instead of T vs. L in the
case of analyzing the period T of a pendulum as a function of its length L.
For this reason, standard forms for exponential, logarithmic, and power
laws are often explicitly computed. The formulas for linear least squares
fitting were independently derived by Gauss and Legendre.
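The pendulum case can be sketched concretely: since T = 2*pi*sqrt(L/g), plotting T against sqrt(L) gives a straight line through the origin with slope 2*pi/sqrt(g). The example below uses synthetic, noise-free data and assumes g = 9.81 m/s^2:

```python
import math

# Linearizing a power law: T = 2*pi*sqrt(L/g) is nonlinear in L, but
# T plotted against sqrt(L) is a straight line of slope 2*pi/sqrt(g)
# through the origin. Synthetic, noise-free data; g = 9.81 m/s^2 assumed.
g = 9.81
lengths = [0.1, 0.25, 0.5, 1.0, 2.0]                       # lengths (m)
periods = [2 * math.pi * math.sqrt(L / g) for L in lengths]

# Least-squares line of T on sqrt(L) using the closed-form formulas.
xs = [math.sqrt(L) for L in lengths]
n = len(xs)
sx, sy = sum(xs), sum(periods)
sxx = sum(x * x for x in xs)
sxy = sum(x * t for x, t in zip(xs, periods))
d = n * sxx - sx * sx
slope = (n * sxy - sx * sy) / d
intercept = (sy * sxx - sx * sxy) / d

g_est = (2 * math.pi / slope) ** 2   # recover g from the fitted slope
```

With noise-free data the fitted intercept is zero and g is recovered exactly, up to floating-point error.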
For nonlinear least squares fitting to a number of unknown parameters,
linear least squares fitting may be applied iteratively to a linearized form
of the function until convergence is achieved. However, it is often also
possible to linearize a nonlinear function at the outset and still use linear
methods for determining fit parameters without resorting to iterative
procedures. This approach does commonly violate the implicit
assumption that the distribution of errors is normal, but often still gives
acceptable results using normal equations, a pseudoinverse, etc.
Depending on the type of fit and initial parameters chosen, the nonlinear
fit may have good or poor convergence properties. If uncertainties (in the
most general case, error ellipses) are given for the points, points can be
weighted differently in order to give the high-quality points more weight.
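One common weighting scheme (an assumption here, not the only choice) sets w_i = 1/sigma_i^2 for a point with uncertainty sigma_i; the normal equations then simply use weighted sums in place of plain sums. A minimal sketch:

```python
# Weighted linear least squares: minimize sum w_i * (y_i - a - b*x_i)^2.
# With per-point uncertainties sigma_i, a standard choice of weights is
# w_i = 1/sigma_i**2, so noisier points influence the fit less.

def weighted_fit(xs, ys, sigmas):
    ws = [1.0 / s ** 2 for s in sigmas]
    sw = sum(ws)
    swx = sum(w * x for w, x in zip(ws, xs))
    swy = sum(w * y for w, y in zip(ws, ys))
    swxx = sum(w * x * x for w, x in zip(ws, xs))
    swxy = sum(w * x * y for w, x, y in zip(ws, xs, ys))
    d = sw * swxx - swx ** 2
    a = (swy * swxx - swx * swxy) / d
    b = (sw * swxy - swx * swy) / d
    return a, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 4.0, 7.0, 10.0]            # exactly y = 1 + 3x
a, b = weighted_fit(xs, ys, [0.1, 0.5, 0.1, 0.5])
# Since this data is exactly linear, any positive weighting
# recovers a = 1, b = 3.
```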
Vertical least squares fitting proceeds by finding the sum of the squares of
the vertical deviations R^2 of a set of n data points,

   R^2 = \sum_{i=1}^n [y_i - f(x_i, a_1, a_2, \ldots, a_n)]^2,   (1)

from a function f. Note that this procedure does not minimize the actual
deviations from the line (which would be measured perpendicular to the
given function). In addition, although the unsquared sum of distances
might seem a more appropriate quantity to minimize, use of the absolute
value results in discontinuous derivatives which cannot be treated
analytically. The square deviations from each point are therefore summed,
and the resulting residual is then minimized to find the best fit line. This
procedure results in outlying points being given disproportionately large
weighting.
The condition for R^2 to be a minimum is that

   \partial(R^2)/\partial a_i = 0   (2)

for i = 1, ..., n. For a linear fit

   f(a, b) = a + b x,   (3)

so

   R^2(a, b) = \sum_{i=1}^n [y_i - (a + b x_i)]^2,   (4)

   \partial(R^2)/\partial a = -2 \sum_{i=1}^n [y_i - (a + b x_i)] = 0   (5)

   \partial(R^2)/\partial b = -2 \sum_{i=1}^n [y_i - (a + b x_i)] x_i = 0.   (6)

These lead to the equations

   n a + b \sum x_i = \sum y_i   (7)

   a \sum x_i + b \sum x_i^2 = \sum x_i y_i.   (8)

In matrix form,

   \begin{pmatrix} n & \sum x_i \\ \sum x_i & \sum x_i^2 \end{pmatrix}
   \begin{pmatrix} a \\ b \end{pmatrix}
   = \begin{pmatrix} \sum y_i \\ \sum x_i y_i \end{pmatrix},   (9)

so

   \begin{pmatrix} a \\ b \end{pmatrix}
   = \begin{pmatrix} n & \sum x_i \\ \sum x_i & \sum x_i^2 \end{pmatrix}^{-1}
   \begin{pmatrix} \sum y_i \\ \sum x_i y_i \end{pmatrix}.   (10)

The 2x2 matrix inverse is

   \begin{pmatrix} a \\ b \end{pmatrix}
   = \frac{1}{n \sum x_i^2 - (\sum x_i)^2}
   \begin{pmatrix} \sum y_i \sum x_i^2 - \sum x_i \sum x_i y_i \\
                   n \sum x_i y_i - \sum x_i \sum y_i \end{pmatrix},   (11)

so

   a = \frac{\sum y_i \sum x_i^2 - \sum x_i \sum x_i y_i}
            {n \sum x_i^2 - (\sum x_i)^2}   (12)

     = \frac{\bar{y} \sum x_i^2 - \bar{x} \sum x_i y_i}
            {\sum x_i^2 - n \bar{x}^2}   (13)

   b = \frac{n \sum x_i y_i - \sum x_i \sum y_i}
            {n \sum x_i^2 - (\sum x_i)^2}   (14)

     = \frac{\sum x_i y_i - n \bar{x} \bar{y}}
            {\sum x_i^2 - n \bar{x}^2}.   (15)

These can be rewritten in a simpler form by defining the sums of squares

   ss_xx = \sum (x_i - \bar{x})^2   (16)
         = \sum x_i^2 - n \bar{x}^2   (17)

   ss_yy = \sum (y_i - \bar{y})^2   (18)
         = \sum y_i^2 - n \bar{y}^2   (19)

   ss_xy = \sum (x_i - \bar{x})(y_i - \bar{y})   (20)
         = \sum x_i y_i - n \bar{x} \bar{y},   (21)

which are also written as

   \sigma_x^2 = ss_xx / n   (22)

   \sigma_y^2 = ss_yy / n   (23)

   cov(x, y) = ss_xy / n.   (24)

Here, cov(x, y) is the covariance and \sigma_x^2 and \sigma_y^2 are
variances. Note that the quantities \sum x_i^2 and \sum x_i y_i can also be
interpreted as the dot products

   \sum x_i^2 = x \cdot x   (25)

   \sum x_i y_i = x \cdot y.   (26)

In terms of the sums of squares, the regression coefficient b is given by

   b = cov(x, y) / \sigma_x^2 = ss_xy / ss_xx,   (27)

and a is given in terms of b by

   a = \bar{y} - b \bar{x}.   (28)

The overall quality of the fit is then parameterized in terms of a quantity
known as the correlation coefficient, defined by

   r^2 = ss_xy^2 / (ss_xx \, ss_yy),   (29)

which gives the proportion of ss_yy which is accounted for by the
regression. Let \hat{y}_i be the vertical coordinate of the best-fit line with
x-coordinate x_i, so

   \hat{y}_i = a + b x_i;   (30)

then the error between the actual vertical point y_i and the fitted point is
given by

   e_i = y_i - \hat{y}_i.   (31)

Now define s^2 as an estimator for the variance in e_i,

   s^2 = \frac{\sum_{i=1}^n e_i^2}{n - 2}.   (32)

Then s can be given by

   s = \sqrt{\frac{ss_yy - b \, ss_xy}{n - 2}}
     = \sqrt{\frac{ss_yy - ss_xy^2 / ss_xx}{n - 2}}.   (33)

The standard errors for a and b are

   SE(a) = s \sqrt{\frac{1}{n} + \frac{\bar{x}^2}{ss_xx}}   (34)

   SE(b) = \frac{s}{\sqrt{ss_xx}}.   (35)
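The full closed-form computation described above (slope, intercept, correlation coefficient, and standard errors) can be sketched in a few lines of Python, using the conventional names ss_xx, ss_yy, ss_xy for the centered sums of squares; the sample data is invented for illustration:

```python
import math

# Closed-form simple linear regression via the centered sums of squares:
# slope b, intercept a, correlation coefficient r^2, and standard errors.

def regress(xs, ys):
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    ss_xx = sum((x - xbar) ** 2 for x in xs)
    ss_yy = sum((y - ybar) ** 2 for y in ys)
    ss_xy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
    b = ss_xy / ss_xx                      # slope
    a = ybar - b * xbar                    # intercept
    r2 = ss_xy ** 2 / (ss_xx * ss_yy)      # proportion of ss_yy explained
    s = math.sqrt((ss_yy - b * ss_xy) / (n - 2))   # residual scale
    se_a = s * math.sqrt(1.0 / n + xbar ** 2 / ss_xx)
    se_b = s / math.sqrt(ss_xx)
    return a, b, r2, se_a, se_b

a, b, r2, se_a, se_b = regress([1, 2, 3, 4, 5], [2, 5, 4, 7, 9])
# For this data: b = 1.6, a = 0.6, r2 = 256/292.
```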
SEE ALSO:
ANOVA, Correlation Coefficient, Interpolation, Least Squares Fitting--Exponential, Least Squares Fitting--Logarithmic,
Least Squares Fitting--Perpendicular Offsets, Least Squares Fitting--Polynomial, Least Squares Fitting--Power Law,
MANOVA, Matrix 1-Inverse, Moore-Penrose Matrix Inverse, Nonlinear Least Squares Fitting, Pseudoinverse,
Regression Coefficient, Residual, Spline
REFERENCES:
Acton, F. S. Analysis of Straight-Line Data. New York: Dover, 1966.
Bevington, P. R. Data Reduction and Error Analysis for the Physical Sciences.
New York: McGraw-Hill, 1969.
Chatterjee, S.; Hadi, A.; and Price, B. "Simple Linear Regression." Ch. 2 in
Regression Analysis by Example, 3rd ed. New York: Wiley, pp. 21-50, 2000.
Edwards, A. L. "The Regression Line of Y on X." Ch. 3 in An Introduction to Linear
Regression and Correlation. San Francisco, CA: W. H. Freeman, pp. 20-32, 1976.
Farebrother, R. W. Fitting Linear Relationships: A History of the Calculus of
Observations 1750-1900. New York: Springer-Verlag, 1999.
Gauss, C. F. "Theoria combinationis observationum erroribus minimis
obnoxiae." Werke, Vol. 4. Göttingen, Germany: p. 1, 1823.
Gonick, L. and Smith, W. The Cartoon Guide to Statistics. New York: Harper
Perennial, 1993.
Kenney, J. F. and Keeping, E. S. "Linear Regression, Simple Correlation, and
Contingency." Ch. 8 in Mathematics of Statistics, Pt. 2, 2nd ed. Princeton, NJ:
Van Nostrand, pp. 199-237, 1951.
Kenney, J. F. and Keeping, E. S. "Linear Regression and Correlation." Ch. 15 in
Mathematics of Statistics, Pt. 1, 3rd ed. Princeton, NJ: Van Nostrand, pp. 252-
285, 1962.
Lancaster, P. and Šalkauskas, K. Curve and Surface Fitting: An Introduction.
London: Academic Press, 1986.
Laplace, P. S. "Des méthodes analytiques du Calcul des Probabilités." Ch. 4 in
Théorie analytique des probabilités, Livre 2, 3rd ed. Paris: Courcier, 1820.
Lawson, C. and Hanson, R. Solving Least Squares Problems. Englewood Cliffs,
NJ: Prentice-Hall, 1974.
Ledvij, M. "Curve Fitting Made Easy." Industrial Physicist 9, 24-27, Apr./May
2003.
Nash, J. C. Compact Numerical Methods for Computers: Linear Algebra and
Function Minimisation, 2nd ed. Bristol, England: Adam Hilger, pp. 21-24, 1990.
Press, W. H.; Flannery, B. P.; Teukolsky, S. A.; and Vetterling, W. T. "Fitting Data
to a Straight Line," "Straight-Line Data with Errors in Both Coordinates," and
"General Linear Least Squares." §15.2, 15.3, and 15.4 in Numerical Recipes in
FORTRAN: The Art of Scientific Computing, 2nd ed. Cambridge, England:
Cambridge University Press, pp. 655-675, 1992.
Whittaker, E. T. and Robinson, G. "The Method of Least Squares." Ch. 9 in The
Calculus of Observations: A Treatise on Numerical Mathematics, 4th ed. New
York: Dover, pp. 209-, 1967.
York, D. "Least-Square Fitting of a Straight Line." Canad. J. Phys. 44, 1079-
1086, 1966.
© 1999-2016 Wolfram Research, Inc.