Curve Fitting

Two types of curve fitting:

• Interpolation
Given data for discrete values, fit a curve or a series of curves that pass directly through each of the points.
- Used when the data are very precise.

• Least squares regression
Given data for discrete values, derive a single curve that represents the general trend of the data.
- Used when the given data exhibit a significant degree of error or noise.
Linear interpolation
One of the simplest methods is linear interpolation, which connects each pair of adjacent data points with a straight-line segment.
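As a minimal sketch of two-point linear interpolation in Python (the function name and the sample values are illustrative, not taken from the slides):

def linear_interpolate(x, x0, y0, x1, y1):
    # Straight line through (x0, y0) and (x1, y1), evaluated at x
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Example: estimate f(2.5) from f(2) = 4 and f(3) = 9
print(linear_interpolate(2.5, 2.0, 4.0, 3.0, 9.0))  # 6.5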
Polynomial interpolation

Polynomial interpolation is a generalization of linear interpolation: the linear interpolant, a straight line, is replaced with a polynomial of higher degree.

Generally, if we have n data points (with distinct x values), there is exactly one polynomial of degree at most n−1 passing through all of them. The interpolation error is proportional to the spacing between the data points raised to the power n. Furthermore, the interpolant is a polynomial and thus infinitely differentiable. In these respects, polynomial interpolation overcomes most of the problems of linear interpolation.

However, polynomial interpolation also has some disadvantages. Computing the interpolating polynomial is computationally expensive compared to linear interpolation. Furthermore, polynomial interpolation can exhibit oscillatory artifacts, especially near the end points (Runge's phenomenon).
The number of data points minus one defines the order of the interpolation; thus, linear (two-point) interpolation is first-order interpolation.
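As a sketch of the uniqueness statement above, NumPy can recover the single degree n−1 polynomial through n points (numpy.polyfit performs a least-squares fit, which reduces to exact interpolation when the degree equals the number of points minus one; the data values are illustrative):

import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])          # n = 4 data points
y = np.array([1.0, 2.0, 0.0, 5.0])
coeffs = np.polyfit(x, y, deg=len(x) - 1)   # degree n - 1 = 3
p = np.poly1d(coeffs)
print(p(x))      # reproduces y at the data points
print(p(1.5))    # interpolated value between the points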
Lagrange Interpolating Polynomial
In numerical analysis, Lagrange polynomials are used for polynomial interpolation. For a
given set of points (xj, yj) with no two xj values equal, the Lagrange polynomial is the
polynomial of lowest degree that assumes at each value xj the corresponding value yj (i.e.
the functions coincide at each point).

The interpolating polynomial of the least degree is unique, however, and since it can be
arrived at through multiple methods, referring to "the Lagrange polynomial" is perhaps not
as correct as referring to "the Lagrange form" of that unique polynomial.
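A minimal sketch of evaluating the Lagrange form directly (a plain-Python illustration, not an excerpt from the slides):

def lagrange_eval(xs, ys, x):
    # L(x) = sum_j ys[j] * prod_{m != j} (x - xs[m]) / (xs[j] - xs[m])
    total = 0.0
    for j in range(len(xs)):
        basis = 1.0
        for m in range(len(xs)):
            if m != j:
                basis *= (x - xs[m]) / (xs[j] - xs[m])
        total += ys[j] * basis
    return total

# The same points as above give the same interpolated value,
# since the interpolating polynomial of least degree is unique.
print(lagrange_eval([0.0, 1.0, 2.0, 3.0], [1.0, 2.0, 0.0, 5.0], 1.5))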
Divided Difference Notation
Spline interpolation
Cubic Spline Interpolation
Example
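A minimal sketch of cubic spline interpolation using SciPy's CubicSpline (the data values below are illustrative, not from the original example):

import numpy as np
from scipy.interpolate import CubicSpline

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.0, 0.8, 0.9, 0.1, -0.8])
cs = CubicSpline(x, y)   # piecewise cubic passing through every point
print(cs(2.5))           # smooth estimate between the known points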
Extrapolation
In mathematics, extrapolation is the process of estimating, beyond the original observation range,
the value of a variable on the basis of its relationship with another variable. It is similar to
interpolation, which produces estimates between known observations, but extrapolation is subject to
greater uncertainty and a higher risk of producing meaningless results.
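As a minimal sketch (the values are illustrative), the same straight-line formula used for linear interpolation becomes extrapolation when it is evaluated outside the range of the known points:

def linear_extrapolate(x, x0, y0, x1, y1):
    # Same two-point formula as before, but evaluated outside [x0, x1]
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Known values f(2) = 4 and f(3) = 9; x = 5 lies well beyond the observed range
print(linear_extrapolate(5.0, 2.0, 4.0, 3.0, 9.0))  # 19.0, increasingly unreliable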
Linear regression

• The aim of linear regression is to fit a straight line, ŷ = ax + b, to data so as to give the best prediction of y for any value of x.

• This will be the line that minimises the distance between the data and the fitted line, i.e. the residuals. In ŷ = ax + b, a is the slope and b is the intercept; for each data point, ŷ is the predicted value, yᵢ is the true (observed) value, and ε = yᵢ – ŷ is the residual error.
Least Squares Regression

• To find the best line we must minimise the sum of the squares of the residuals (the vertical
distances from the data points to our line)

Model line: ŷ = ax + b  (a = slope, b = intercept)

Residual: ε = y – ŷ

Sum of squares of residuals: Σ (y – ŷ)²

⇒ We must find the values of a and b that minimise Σ (y – ŷ)².
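A minimal sketch of the quantity being minimised, evaluated for a candidate slope a and intercept b (the data and candidate values are illustrative):

def sum_squared_residuals(xs, ys, a, b):
    # SSE = sum over all points of (y - y_hat)^2, with y_hat = a*x + b
    return sum((y - (a * x + b)) ** 2 for x, y in zip(xs, ys))

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
print(sum_squared_residuals(xs, ys, a=2.0, b=0.0))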
How to find the regression line coefficients a and b: setting the derivatives of Σ (y – ŷ)² with respect to a and b to zero gives

a = Σ (xᵢ – x̄)(yᵢ – ȳ) / Σ (xᵢ – x̄)²
b = ȳ – a·x̄

where x̄ and ȳ are the means of the x and y values.
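A minimal plain-Python sketch of these formulas (the data values are made up):

def least_squares_line(xs, ys):
    n = len(xs)
    x_mean = sum(xs) / n
    y_mean = sum(ys) / n
    # a = sum((x - x_mean)(y - y_mean)) / sum((x - x_mean)^2), b = y_mean - a*x_mean
    a = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, ys)) / \
        sum((x - x_mean) ** 2 for x in xs)
    b = y_mean - a * x_mean
    return a, b

xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.1, 3.9, 6.2, 7.8]
a, b = least_squares_line(xs, ys)
print(a, b)   # fitted slope and intercept of y_hat = a*x + b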