Chapter Five discusses approximation and interpolation methods for estimating values between discrete data points. It outlines two primary approaches to curve fitting: least-squares regression for noisy data and interpolation for precise data. The chapter also covers polynomial approximation techniques and various forms of interpolation, including linear and quadratic methods.

CHAPTER FIVE

Approximation and Interpolation


Class of Common Approximation Function

• Data is often given for discrete values along a continuum.
• However, you may require estimates at points between the discrete values.
• In addition, you may require a simplified version of a complicated function.
• One way to do this is to compute values of the function at a number of discrete values along the range of interest.
• Then, a simpler function may be derived to fit these values.
• Both of these applications are known as curve fitting.
• A discrete data set can be approximated with linear, quadratic, polynomial, logarithmic, etc. functions.
Class of Common Approximation Function

There are two general approaches for curve fitting that are distinguished from
each other on the basis of the amount of error associated with the data.
• First, where the data exhibits a significant degree of error or “noise,” the
strategy is to derive a single curve that represents the general trend of the data.
• Because any individual data point may be incorrect, we make no effort to intersect
every point.
• Rather, the curve is designed to follow the pattern of the points taken as a group.
One approach of this nature is called least-squares regression.
• Second, where the data is known to be very precise, the basic approach is to fit a
curve or a series of curves that pass directly through each of the points.
• The estimation of values between such well-known discrete points is called
interpolation.
Criteria for the Choice of the Approximate Function
A good approximating function should introduce only a small (ideally minimum) error
when estimating intermediate values, and should be simple to implement in
computer software.
Least Square Approximation by Polynomials

• The simplest example of a least-squares approximation is fitting a
straight line to a set of paired observations
(x1, y1), (x2, y2), …, (xn, yn).
• The mathematical expression for the straight line is y = a0 + a1x + e
• where a0 and a1 are coefficients representing the intercept and the
slope, respectively, and e is the error, or residual, between the model
and the observations, which can be represented by rearranging:
e = y − a0 − a1x.
Least Square Approximation by Polynomials

• One strategy for fitting a “best” line through the data is to
minimize the sum of the squares of the residual errors for all the available
data:
  Sr = Σ ei² = Σ (yi − a0 − a1xi)²,  i = 1, …, n
where n = total number of points.
• We can find the values of a0 and a1 by differentiating the above
equation with respect to a0 and a1 and setting the derivatives to zero.
Solving the resulting normal equations gives
  a1 = (n Σxiyi − Σxi Σyi) / (n Σxi² − (Σxi)²)
  a0 = ȳ − a1x̄
where x̄ and ȳ are the means of the x and y values, respectively.
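As a minimal sketch of how the straight-line normal equations translate into code (illustrative only, not from the chapter; the function name `fit_line` and the sample data are my own):

```python
def fit_line(xs, ys):
    """Fit y = a0 + a1*x by least squares using the normal equations."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope
    a0 = sy / n - a1 * sx / n                       # intercept: a0 = ybar - a1*xbar
    return a0, a1

# Data lying exactly on y = 1 + 2x, so the fit should recover a0 = 1, a1 = 2
a0, a1 = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

With noisy data the recovered coefficients would instead track the general trend of the points, as the slide describes.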
Approximation by Polynomials

• The least-squares procedure can be readily extended to fit the data to a
higher-order polynomial. For example, suppose that we fit a second-
order polynomial, or quadratic: y = a0 + a1x + a2x² + e
• The derivatives of the sum of the squared residuals with respect to each
coefficient can be set equal to zero and rearranged to develop the
following set of normal equations:
  n·a0 + (Σxi)a1 + (Σxi²)a2 = Σyi
  (Σxi)a0 + (Σxi²)a1 + (Σxi³)a2 = Σxiyi
  (Σxi²)a0 + (Σxi³)a1 + (Σxi⁴)a2 = Σxi²yi
Approximation by Polynomials
• The standard error of the estimate, with n = number of data points and m =
degree of the approximating polynomial, is
  sy/x = √( Sr / (n − (m + 1)) )
Examples and exercises
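As one worked example in code (illustrative only; `fit_quadratic` and the sample data are assumptions, not from the chapter), the three quadratic normal equations can be assembled from power sums and solved directly:

```python
def fit_quadratic(xs, ys):
    """Fit y = a0 + a1*x + a2*x^2 by least squares via the 3x3 normal equations."""
    s = [sum(x ** k for x in xs) for k in range(5)]               # s[k] = sum of xi^k (s[0] = n)
    t = [sum((x ** k) * y for x, y in zip(xs, ys)) for k in range(3)]
    A = [[s[0], s[1], s[2]],
         [s[1], s[2], s[3]],
         [s[2], s[3], s[4]]]
    b = t[:]
    # Gaussian elimination (no pivoting; adequate for this small, well-conditioned system)
    for i in range(3):
        for j in range(i + 1, 3):
            f = A[j][i] / A[i][i]
            for k in range(i, 3):
                A[j][k] -= f * A[i][k]
            b[j] -= f * b[i]
    a = [0.0] * 3
    for i in range(2, -1, -1):
        a[i] = (b[i] - sum(A[i][k] * a[k] for k in range(i + 1, 3))) / A[i][i]
    return a  # [a0, a1, a2]

# Data on y = 1 + 2x + 3x^2, so the fit should recover those coefficients
a = fit_quadratic([0.0, 1.0, 2.0, 3.0], [1.0, 6.0, 17.0, 34.0])
```

The same pattern extends to any polynomial degree m, producing an (m + 1) × (m + 1) system.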


Interpolation
For n + 1 data points, there is one and only one polynomial
of order n that passes through all the points.
Polynomial interpolation consists of determining the unique
nth-order polynomial that fits n + 1 data points.
Although there is one and only one nth-order polynomial
that fits n + 1 points, there are a variety of mathematical
formats in which this polynomial can be expressed.
Interpolation
• Newton’s Divided-Difference Interpolating Polynomials
 Linear Interpolation: The simplest form of interpolation is to connect two data
points with a straight line. Using similar triangles,
  [f1(x) − f(x0)] / (x − x0) = [f(x1) − f(x0)] / (x1 − x0)
which can be rearranged to yield the linear-interpolation formula
  f1(x) = f(x0) + [f(x1) − f(x0)] / (x1 − x0) · (x − x0)
• The term [f(x1) − f(x0)] / (x1 − x0) is a finite-divided-difference approximation of
the first derivative.
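The linear-interpolation formula is a one-liner in code. This is an illustrative sketch, not from the chapter; the example of estimating ln 2 from ln 1 and ln 4 is a common textbook exercise, not necessarily the one this chapter uses:

```python
import math

def linear_interp(x, x0, y0, x1, y1):
    """f1(x) = f(x0) + [f(x1) - f(x0)]/(x1 - x0) * (x - x0)."""
    return y0 + (y1 - y0) / (x1 - x0) * (x - x0)

# Estimate ln(2) from the known values ln(1) = 0 and ln(4)
est = linear_interp(2.0, 1.0, 0.0, 4.0, math.log(4.0))
```

The estimate (about 0.462) is well below the true value ln 2 ≈ 0.693, showing how a straight line over a wide interval can miss the curvature of the underlying function.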
Interpolation
• Quadratic Interpolation:
If three data points are available, interpolation can be accomplished with a second-order
polynomial (also called a quadratic polynomial or a parabola).
A particularly convenient form for this purpose is
  f2(x) = b0 + b1(x − x0) + b2(x − x0)(x − x1)
with
  b0 = f(x0)
  b1 = [f(x1) − f(x0)] / (x1 − x0)
  b2 = { [f(x2) − f(x1)] / (x2 − x1) − [f(x1) − f(x0)] / (x1 − x0) } / (x2 − x0)
Interpolation
• In general, the coefficients can be found from finite divided differences:
  b0 = f(x0), b1 = f[x1, x0], b2 = f[x2, x1, x0], …, bn = f[xn, xn−1, …, x0]
where the bracketed divided differences are defined recursively as
  f[xi, xj] = [f(xi) − f(xj)] / (xi − xj)
  f[xi, xj, xk] = ( f[xi, xj] − f[xj, xk] ) / (xi − xk)
  f[xn, …, x0] = ( f[xn, …, x1] − f[xn−1, …, x0] ) / (xn − x0)
Interpolation
Therefore, Newton’s divided-difference interpolating polynomial can
be written as
  fn(x) = f(x0) + (x − x0) f[x1, x0] + (x − x0)(x − x1) f[x2, x1, x0] + …
          + (x − x0)(x − x1)…(x − xn−1) f[xn, …, x0]
For 4 data points (n = 3):
  f3(x) = f(x0) + (x − x0) f[x1, x0] + (x − x0)(x − x1) f[x2, x1, x0]
          + (x − x0)(x − x1)(x − x2) f[x3, x2, x1, x0]
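The divided-difference table and the nested evaluation can be sketched as follows (illustrative only; `newton_interp` and the sample data are my own, not from the chapter):

```python
def newton_interp(xs, ys, x):
    """Evaluate Newton's divided-difference interpolating polynomial at x."""
    n = len(xs)
    # Build the coefficients b0, b1, ..., b(n-1) in place:
    # after pass j, coef[i] holds the j-th order divided difference ending at xi
    coef = list(ys)
    for j in range(1, n):
        for i in range(n - 1, j - 1, -1):
            coef[i] = (coef[i] - coef[i - 1]) / (xs[i] - xs[i - j])
    # Evaluate with nested (Horner-like) multiplication
    result = coef[-1]
    for i in range(n - 2, -1, -1):
        result = result * (x - xs[i]) + coef[i]
    return result

# 4 data points sampled from y = x^3; the unique cubic through them is x^3 itself
val = newton_interp([0.0, 1.0, 2.0, 4.0], [0.0, 1.0, 8.0, 64.0], 3.0)
```

Because the points come from a cubic, the 4-point (n = 3) polynomial reproduces it exactly, so the value at x = 3 is 27.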


Interpolation
• Lagrange Interpolating Polynomials
A reformulation of the Newton polynomial that avoids the computation of divided
differences:
  fn(x) = Σ Li(x) f(xi),  i = 0, …, n
where
  Li(x) = Π (x − xj) / (xi − xj),  the product taken over j = 0, …, n with j ≠ i.
For example, the linear (n = 1) version is
  f1(x) = [(x − x1) / (x0 − x1)] f(x0) + [(x − x0) / (x1 − x0)] f(x1)
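The Lagrange form maps directly onto two nested loops (an illustrative sketch; `lagrange_interp` and the data are my own, not from the chapter):

```python
def lagrange_interp(xs, ys, x):
    """Evaluate the Lagrange interpolating polynomial at x."""
    n = len(xs)
    total = 0.0
    for i in range(n):
        # Li(x) = product over j != i of (x - xj)/(xi - xj)
        li = 1.0
        for j in range(n):
            if j != i:
                li *= (x - xs[j]) / (xs[i] - xs[j])
        total += li * ys[i]
    return total

# Same 4 points from y = x^3 as before: the interpolant agrees with Newton's form
val = lagrange_interp([0.0, 1.0, 2.0, 4.0], [0.0, 1.0, 8.0, 64.0], 3.0)
```

Since the interpolating polynomial through n + 1 points is unique, this returns the same value (27 at x = 3) as the Newton divided-difference form; only the arithmetic route differs.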
Interpolation

Assignments
Inverse interpolation
Spline interpolation
