Curve Fitting
Generally, if we have n data points, there is exactly one polynomial of degree at most n−1
passing through all of them. The interpolation error is proportional to the nth power of the
spacing between the data points. Furthermore, the interpolant is a polynomial and thus
infinitely differentiable. So polynomial interpolation overcomes most of the problems of
linear interpolation.
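As a rough illustration of the idea (assuming NumPy is available; the data values below are made up), fitting a polynomial of degree n − 1 to n points recovers the unique interpolating polynomial:

import numpy as np

# n data points (illustrative values); the unique interpolating polynomial
# has degree at most n - 1, so we fit with deg = n - 1.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.7, 5.8, 6.6])

coeffs = np.polyfit(x, y, deg=len(x) - 1)   # coefficients, highest power first
p = np.poly1d(coeffs)                        # callable polynomial

print(p(1.5))                 # interpolated value between the data points
print(np.allclose(p(x), y))   # True: the polynomial passes through every point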
The interpolating polynomial of least degree is unique. Since it can be arrived at through
multiple methods, however, referring to "the Lagrange polynomial" is perhaps not as correct
as referring to "the Lagrange form" of that unique polynomial.
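That uniqueness can be seen by evaluating the Lagrange form directly. The helper below is a minimal sketch of the construction (lagrange_interp is an illustrative name, not a library function):

def lagrange_interp(x_pts, y_pts, x):
    """Evaluate the Lagrange form of the interpolating polynomial at x."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(x_pts, y_pts)):
        # Basis polynomial L_i(x): equals 1 at x_i and 0 at every other node
        Li = 1.0
        for j, xj in enumerate(x_pts):
            if j != i:
                Li *= (x - xj) / (xi - xj)
        total += yi * Li
    return total

# The value agrees with any other construction (e.g. Newton's form)
# because the interpolating polynomial itself is unique.
print(lagrange_interp([0.0, 1.0, 2.0], [1.0, 3.0, 2.0], 1.5))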
Divided Difference Notation
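The notes list only the heading here, so the following is an illustrative sketch rather than the original material: Newton's divided differences computed in place, followed by a Horner-style evaluation of the resulting Newton form (function names are made up):

def divided_differences(x, y):
    """Return f[x0], f[x0,x1], ..., the coefficients of Newton's form."""
    n = len(x)
    coef = list(y)
    for k in range(1, n):
        for i in range(n - 1, k - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (x[i] - x[i - k])
    return coef

def newton_eval(x_pts, coef, x):
    """Evaluate Newton's form with nested multiplication."""
    result = coef[-1]
    for i in range(len(coef) - 2, -1, -1):
        result = result * (x - x_pts[i]) + coef[i]
    return result

# Data from f(x) = x^2 + x + 1; the interpolant reproduces f exactly.
xs, ys = [0.0, 1.0, 2.0], [1.0, 3.0, 7.0]
c = divided_differences(xs, ys)
print(newton_eval(xs, c, 1.5))   # 4.75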
Spline Interpolation
Cubic Spline Interpolation
Example
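The original worked example is not reproduced in these notes; as a stand-in, the sketch below uses SciPy's CubicSpline with a natural boundary condition and made-up data to show what a cubic spline interpolant looks like in practice:

import numpy as np
from scipy.interpolate import CubicSpline

# Illustrative data; a natural spline has zero second derivative at the ends.
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.0, 2.7, 5.8, 6.6])

cs = CubicSpline(x, y, bc_type='natural')
print(cs(1.5))      # spline value between the knots
print(cs(1.5, 1))   # first derivative at the same point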
In mathematics, extrapolation is the process of estimating, beyond the original observation range,
the value of a variable on the basis of its relationship with another variable. It is similar to
interpolation, which produces estimates between known observations, but extrapolation is subject to
greater uncertainty and a higher risk of producing meaningless results.
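A small sketch of the difference (assuming NumPy; the data values are illustrative): a straight line fitted to observations on [0, 3] is reasonably trustworthy inside that range, but far less so well outside it:

import numpy as np

# Fit a straight line to data observed on [0, 3].
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 1.9, 3.2, 3.9])
a, b = np.polyfit(x, y, deg=1)   # slope and intercept

print(a * 2.5 + b)    # interpolation: inside the observed range
print(a * 10.0 + b)   # extrapolation: far outside the range, much less reliable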
Linear Regression
• The aim of linear regression is to fit a straight line, ŷ = ax + b, that gives the best prediction of y
for any value of x
ŷ = predicted value
yᵢ = true (observed) value
ε = residual error
Least Squares Regression
• To find the best line we must minimise the sum of the squares of the residuals (the vertical
distances from the data points to our line)
Residual (εᵢ) = yᵢ − ŷᵢ
Minimising the sum of squared residuals, Σεᵢ², gives the coefficients a and b:
a = Σ(xᵢ − x̄)(yᵢ − ȳ) / Σ(xᵢ − x̄)²
b = ȳ − a·x̄
where x̄ and ȳ are the means of the x and y values.
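As a minimal sketch of these formulas (assuming NumPy; the data are illustrative), the slope and intercept can be computed directly and cross-checked against np.polyfit:

import numpy as np

# Illustrative data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 2.9, 4.2, 4.8, 6.1])

x_bar, y_bar = x.mean(), y.mean()
a = np.sum((x - x_bar) * (y - y_bar)) / np.sum((x - x_bar) ** 2)   # slope
b = y_bar - a * x_bar                                              # intercept

y_hat = a * x + b
residuals = y - y_hat
print(a, b)
print(np.sum(residuals ** 2))      # the quantity least squares minimises

# Cross-check against NumPy's own least-squares line fit
print(np.polyfit(x, y, deg=1))     # [a, b] to within rounding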