Lecture 5: Curve Fitting
SEMM/SKMM3023
Why curve fitting?
Applications of Curve Fitting
■ Data fitting
■ Optimisation
Curve Fitting
1. Collocation polynomial
2. Least-squares regression (statistical method) – mth-order polynomial regression (linear, quadratic, cubic, 4th, 5th and so on) and multiple regression (more than one independent variable)
3. Interpolation (calculus method) – Lagrange and divided-difference polynomials
Collocation Polynomial
■ The collocation method fits a curve that passes through all the data points. It is also referred to as an exact fit or a direct fit.
■ It is usually used when the data is known to be accurate and a specific trend is clear.
■ Not suited for scattered data.
Collocation Polynomial
■ Given four data points $(x_1, y_1), \dots, (x_4, y_4)$, a cubic polynomial $f(x) = a_3x^3 + a_2x^2 + a_1x + a_0$ can be made to pass through every point, giving four equations in the four unknown coefficients.
Collocation Polynomial
■ In matrix form:
$$\begin{bmatrix} x_1^3 & x_1^2 & x_1 & 1 \\ x_2^3 & x_2^2 & x_2 & 1 \\ x_3^3 & x_3^2 & x_3 & 1 \\ x_4^3 & x_4^2 & x_4 & 1 \end{bmatrix} \begin{bmatrix} a_3 \\ a_2 \\ a_1 \\ a_0 \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ y_4 \end{bmatrix}$$
■ The system above can now be solved using any of the methods you have learnt previously.
Collocation Polynomial
■ Since there are 5 data points, a 4th-order polynomial can be used.
Collocation Polynomial

i    1      2       3       4        5
x    4      8       12      16       18
y    486    12138   68750   228690   372368

■ In matrix form, where row i contains $x_i^4, x_i^3, x_i^2, x_i, 1$:

$$\begin{bmatrix} 256 & 64 & 16 & 4 & 1 \\ 4096 & 512 & 64 & 8 & 1 \\ 20736 & 1728 & 144 & 12 & 1 \\ 65536 & 4096 & 256 & 16 & 1 \\ 104976 & 5832 & 324 & 18 & 1 \end{bmatrix} \begin{bmatrix} a_4 \\ a_3 \\ a_2 \\ a_1 \\ a_0 \end{bmatrix} = \begin{bmatrix} 486 \\ 12138 \\ 68750 \\ 228690 \\ 372368 \end{bmatrix}$$
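■ As a minimal MATLAB sketch, the system above can be assembled and solved with the backslash operator (the one-liner in the last comment uses the built-in polyfit):

```matlab
% Exact (collocation) fit: 4th-order polynomial through the 5 points above
x = [4; 8; 12; 16; 18];
y = [486; 12138; 68750; 228690; 372368];
A = [x.^4, x.^3, x.^2, x, ones(size(x))];  % coefficient matrix, one row per point
a = A \ y;                                 % a = [a4; a3; a2; a1; a0]
% polyfit(x, y, 4) returns the same coefficients (descending powers)
```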
Collocation Polynomial
■ Sometimes, although you have many data points, you may want to ignore some of them because the trend follows a lower-order polynomial.
Collocation Polynomial

x     y
1     1
3     3
7     7
8     8
10    10

■ Although there are 5 data points, the trend is clearly linear, so we only need to choose two of them to fit a line. If you fit a 4th-order polynomial instead, you will simply find a4 = a3 = a2 = 0 (up to round-off).
Least Square Regression
■ This method finds the best-fit curve of a given type: the curve with the minimal sum of squared deviations (the least-square error) from the given data. It represents the general trend of the data, and the curve may not pass through all the data points.
■ Usually applied to scattered data.
Least Square Regression
■ For a candidate curve f(x) and data points $(x_i, y_i)$, the least-square error is

$$E = \sum_{i=1}^{n} \left[ y_i - f(x_i) \right]^2$$

■ The best-fit curve is the one whose coefficients minimise E, i.e. the partial derivative of E with respect to each coefficient is zero.
Linear Regression Method
■ Fit a straight line $f(x) = a_1 x + a_0$. Setting $\partial E/\partial a_0 = 0$ and $\partial E/\partial a_1 = 0$ gives the normal equations

$$n a_0 + a_1 \sum x_i = \sum y_i, \qquad a_0 \sum x_i + a_1 \sum x_i^2 = \sum x_i y_i$$

■ Solving for the coefficients:

$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left( \sum x_i \right)^2}, \qquad a_0 = \frac{1}{n}\left( \sum y_i - a_1 \sum x_i \right)$$
Linear Regression Method
■ Note that $a_1$ is the gradient and $a_0$ is the y-intercept of the fitted line.
Linear Regression Method
■ Example: fit a straight line to the following n = 5 data points.

i      xi     xi²    yi     xiyi
1      1      1      0.7    0.7
2      2      4      2.2    4.4
3      3      9      2.8    8.4
4      4      16     4.4    17.6
5      5      25     4.9    24.5
Σ      15     55     15     55.6

$$a_1 = \frac{5(55.6) - (15)(15)}{5(55) - (15)^2} = 1.06, \qquad a_0 = \frac{1}{5}\left( 15 - 1.06 \times 15 \right) = -0.18$$

$$f(x) = 1.06x - 0.18$$
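■ The coefficients can be checked in MATLAB; this minimal sketch just transcribes the formulas above:

```matlab
% Least-squares straight line through the tabulated data (n = 5)
x = 1:5;
y = [0.7 2.2 2.8 4.4 4.9];
n = numel(x);
a1 = (n*sum(x.*y) - sum(x)*sum(y)) / (n*sum(x.^2) - sum(x)^2);  % gradient
a0 = (sum(y) - a1*sum(x)) / n;                                  % y-intercept
% polyfit(x, y, 1) returns [a1 a0] directly
```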
Quadratic Regression Method
■ Fit a quadratic polynomial $f(x) = a_2 x^2 + a_1 x + a_0$, with least-square error

$$E = \sum_{i=1}^{n} \left( y_i - a_2 x_i^2 - a_1 x_i - a_0 \right)^2$$

■ Setting $\partial E/\partial a_0 = \partial E/\partial a_1 = \partial E/\partial a_2 = 0$ gives three normal equations in the three unknown coefficients.
Quadratic Regression Method
■ In matrix form:

$$\begin{bmatrix} n & \sum x_i & \sum x_i^2 \\ \sum x_i & \sum x_i^2 & \sum x_i^3 \\ \sum x_i^2 & \sum x_i^3 & \sum x_i^4 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i \end{bmatrix}$$
Quadratic Regression Method
■ Example: for a set of n = 8 data points, substituting the sums into the normal equations gives

$$\begin{bmatrix} 8 & 36 & 204 \\ 36 & 204 & 1296 \\ 204 & 1296 & 8772 \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 61 \\ 284.72 \\ 1707.84 \end{bmatrix}$$
Quadratic Regression Method
■ Solution:

$$\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix} = \begin{bmatrix} 11.5693 \\ -2.8263 \\ 0.3432 \end{bmatrix}$$

$$f(x) = 0.3432x^2 - 2.8263x + 11.5693$$
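■ A minimal MATLAB sketch of assembling these normal equations from raw data; since the slides give only the sums, the y-values below are placeholders:

```matlab
% Assemble and solve the quadratic-regression normal equations
x = (1:8)';  y = rand(8,1);        % placeholder data: replace with measurements
A = [numel(x)   sum(x)     sum(x.^2);
     sum(x)     sum(x.^2)  sum(x.^3);
     sum(x.^2)  sum(x.^3)  sum(x.^4)];
b = [sum(y); sum(x.*y); sum(x.^2 .* y)];
a = A \ b;                         % a = [a0; a1; a2]
% polyfit(x, y, 2) gives the same fit as [a2 a1 a0]
```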
mth Order Regression Method
■ To approximate a set of data that has m + 1 or more data points, the least-squares regression method uses an mth-degree polynomial,

$$f(x) = a_m x^m + a_{m-1} x^{m-1} + \dots + a_2 x^2 + a_1 x + a_0$$

■ The best-fitting curve f(x) has the least-square error, meaning

$$E = \sum_{i=1}^{n} \left[ y_i - f(x_i) \right]^2 \;\; \text{is minimised.}$$
mth Order Regression Method
■ Again, to obtain the least-square error, the unknown coefficients must yield zero first derivatives:

$$\frac{\partial E}{\partial a_0} = 0, \quad \frac{\partial E}{\partial a_1} = 0, \quad \frac{\partial E}{\partial a_2} = 0, \quad \dots, \quad \frac{\partial E}{\partial a_m} = 0$$
mth Order Regression Method
■ These conditions form the following set of linear equations, which can then be solved:

$$\begin{bmatrix} n & \sum x_i & \cdots & \sum x_i^m \\ \sum x_i & \sum x_i^2 & \cdots & \sum x_i^{m+1} \\ \vdots & \vdots & \ddots & \vdots \\ \sum x_i^m & \sum x_i^{m+1} & \cdots & \sum x_i^{2m} \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ \vdots \\ a_m \end{bmatrix} = \begin{bmatrix} \sum y_i \\ \sum x_i y_i \\ \vdots \\ \sum x_i^m y_i \end{bmatrix}$$
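■ As an illustrative sketch, a function (the name polyreg is hypothetical) that builds and solves these normal equations for any order m; in practice MATLAB's built-in polyfit does the same job via a more stable QR factorisation:

```matlab
function a = polyreg(x, y, m)
% mth-order least-squares regression via the normal equations.
% Returns a = [a0; a1; ...; am].
x = x(:); y = y(:);
A = zeros(m+1);  b = zeros(m+1, 1);
for r = 0:m
    for c = 0:m
        A(r+1, c+1) = sum(x.^(r+c));   % entries are sums of powers of x
    end
    b(r+1) = sum(x.^r .* y);           % right-hand side: sums of x^r * y
end
a = A \ b;
end
```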
Multiple Regression Method
■ This method is used for equations with more than one independent variable. To approximate a set of data with three or more data points in the linear case, the model is

$$f(x, y) = a + bx + cy$$

■ The best-fitting surface f(x, y) has the least-square error.
■ Here a, b and c are unknown coefficients, while all the $(x_i, y_i)$ and the corresponding measured values $z_i$ of f are given.
Multiple Regression Method
■ The least-square error is

$$E = \sum_{i=1}^{n} \left( z_i - a - b x_i - c y_i \right)^2$$

and the conditions $\partial E/\partial a = \partial E/\partial b = \partial E/\partial c = 0$ give three normal equations.
Multiple Regression Method
■ In matrix form:

$$\begin{bmatrix} n & \sum x_i & \sum y_i \\ \sum x_i & \sum x_i^2 & \sum x_i y_i \\ \sum y_i & \sum x_i y_i & \sum y_i^2 \end{bmatrix} \begin{bmatrix} a \\ b \\ c \end{bmatrix} = \begin{bmatrix} \sum z_i \\ \sum x_i z_i \\ \sum y_i z_i \end{bmatrix}$$
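■ A MATLAB sketch with illustrative data (chosen here to satisfy z = 1 + 2x + 3y exactly), writing z for the measured values of f(x, y):

```matlab
% Multiple linear regression: fit f(x,y) = a + b*x + c*y
x = [1; 2; 3; 4];  y = [2; 1; 4; 3];
z = [9; 8; 19; 18];                 % illustrative data: z = 1 + 2x + 3y
n = numel(x);
A = [n        sum(x)     sum(y);
     sum(x)   sum(x.^2)  sum(x.*y);
     sum(y)   sum(x.*y)  sum(y.^2)];
b = [sum(z); sum(x.*z); sum(y.*z)];
abc = A \ b;                        % abc = [a; b; c] = [1; 2; 3] here
% Equivalently: abc = [ones(n,1), x, y] \ z
```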
Least Square Regression
How to calculate errors?
■ Tips for least-squares regression:
1. Select a method for the curve fitting: 1st-, 2nd-, mth-order, or multiple regression.
2. Calculate the error for each method.
3. The smaller the error, the better the fit.

i      xi     xi²    yi     xiyi    f(xi)    [yi − f(xi)]²
1      1      1      0.7    0.7     0.88     0.0324
2      2      4      2.2    4.4     1.94     0.0676
3      3      9      2.8    8.4     3.00     0.0400
4      4      16     4.4    17.6    4.06     0.1156
5      5      25     4.9    24.5    5.12     0.0484
Σ      15     55     15     55.6             Error = 0.304
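■ In MATLAB the error column amounts to one line; a sketch using the fitted line from the earlier example:

```matlab
% Sum of squared residuals for the fitted line
x = 1:5;  y = [0.7 2.2 2.8 4.4 4.9];
f = 1.06*x - 0.18;            % fitted values f(xi)
err = sum((y - f).^2);        % least-square error = 0.304
```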
Interpolation
■ In interpolation, the curve fits through all the data points.
Lagrange Polynomials
■ Can be used for both unequally spaced and equally spaced data.
■ However, for values of n > 3 this approach needs a computer program and may not be very accurate.
■ It is based on fitting a second-degree Lagrange polynomial to a (sub)set of three discrete data pairs (a quadratic function through 3 points) such as $(x_0, y_0), (x_1, y_1), (x_2, y_2)$.
■ The second-degree Lagrange polynomial is:

$$f(x) = \frac{(x - x_1)(x - x_2)}{(x_0 - x_1)(x_0 - x_2)}\, y_0 + \frac{(x - x_0)(x - x_2)}{(x_1 - x_0)(x_1 - x_2)}\, y_1 + \frac{(x - x_0)(x - x_1)}{(x_2 - x_0)(x_2 - x_1)}\, y_2$$
Lagrange Polynomials
■ Since the fit becomes less accurate for n > 3, it may be better to divide the data points into sets of 3 and fit multiple 2nd-order Lagrange polynomials.
■ Third-order Lagrange polynomial:

$$f(x) = \frac{(x - x_1)(x - x_2)(x - x_3)}{(x_0 - x_1)(x_0 - x_2)(x_0 - x_3)}\, y_0 + \frac{(x - x_0)(x - x_2)(x - x_3)}{(x_1 - x_0)(x_1 - x_2)(x_1 - x_3)}\, y_1 + \frac{(x - x_0)(x - x_1)(x - x_3)}{(x_2 - x_0)(x_2 - x_1)(x_2 - x_3)}\, y_2 + \frac{(x - x_0)(x - x_1)(x - x_2)}{(x_3 - x_0)(x_3 - x_1)(x_3 - x_2)}\, y_3$$

■ General equation:

$$f(x) = \sum_{k=0}^{n} y_k \prod_{\substack{j=0 \\ j \neq k}}^{n} \frac{x - x_j}{x_k - x_j}$$
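■ A short MATLAB sketch of the general equation (the function name lagrange_interp is illustrative):

```matlab
function fx = lagrange_interp(x, y, xq)
% Evaluate the Lagrange interpolating polynomial through (x, y) at xq.
n = numel(x);
fx = 0;
for k = 1:n
    L = 1;                           % k-th Lagrange basis polynomial at xq
    for j = [1:k-1, k+1:n]
        L = L * (xq - x(j)) / (x(k) - x(j));
    end
    fx = fx + y(k) * L;
end
end
```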
Divided Difference Polynomials
■ Can be used for both unequally spaced and equally spaced data.
■ Given the points (x0, y0), (x1, y1), (x2, y2), (x3, y3), the function f(x) can be expressed as a 3rd-order polynomial

$$f(x) = a_0 + a_1(x - x_0) + a_2(x - x_0)(x - x_1) + a_3(x - x_0)(x - x_1)(x - x_2)$$

■ The unknown coefficients a0, a1, a2 and a3 can be found using the divided-difference table, where each coefficient is a divided difference of the data:

$$a_0 = y_0, \qquad a_1 = f[x_0, x_1] = \frac{y_1 - y_0}{x_1 - x_0}, \qquad a_2 = f[x_0, x_1, x_2] = \frac{f[x_1, x_2] - f[x_0, x_1]}{x_2 - x_0}, \qquad a_3 = f[x_0, x_1, x_2, x_3] = \frac{f[x_1, x_2, x_3] - f[x_0, x_1, x_2]}{x_3 - x_0}$$
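■ A MATLAB sketch of the divided-difference recursion (the function name divided_diff is illustrative):

```matlab
function a = divided_diff(x, y)
% Coefficients a0..a(n-1) of the Newton divided-difference polynomial
% f(x) = a0 + a1*(x-x0) + a2*(x-x0)*(x-x1) + ...
x = x(:);  y = y(:);
n = numel(x);
d = y;                               % current column of divided differences
a = zeros(n, 1);
a(1) = d(1);
for k = 2:n
    d = (d(2:end) - d(1:end-1)) ./ (x(k:n) - x(1:n-k+1));
    a(k) = d(1);                     % a(k) = f[x0, ..., x(k-1)]
end
end
```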
Example
■ With only 3 data points, a second-order polynomial must be used!
Example
1. Collocation polynomial: $y = a_2 x^2 + a_1 x + a_0$ (the largest we can fit here is a 2nd-order polynomial!)

$$y_0 = a_2 x_0^2 + a_1 x_0 + a_0$$
$$y_1 = a_2 x_1^2 + a_1 x_1 + a_0$$
$$y_2 = a_2 x_2^2 + a_1 x_2 + a_0$$

Substituting the data:

$$0.294118 = a_2 (3.4)^2 + a_1 (3.4) + a_0$$
$$0.285714 = a_2 (3.5)^2 + a_1 (3.5) + a_0$$
$$0.277778 = a_2 (3.6)^2 + a_1 (3.6) + a_0$$
Example
In matrix form:

$$\begin{bmatrix} 3.4^2 & 3.4 & 1 \\ 3.5^2 & 3.5 & 1 \\ 3.6^2 & 3.6 & 1 \end{bmatrix} \begin{bmatrix} a_2 \\ a_1 \\ a_0 \end{bmatrix} = \begin{bmatrix} 0.294118 \\ 0.285714 \\ 0.277778 \end{bmatrix}$$

This can be solved using any of the methods you have previously learnt.
Example
Hence

$$\begin{bmatrix} a_2 \\ a_1 \\ a_0 \end{bmatrix} = \begin{bmatrix} 0.023400 \\ -0.245500 \\ 0.858314 \end{bmatrix}, \qquad f(x) = 0.0234x^2 - 0.2455x + 0.858314$$
Example
2. Lagrange polynomial: applying the second-degree formula to the same three points,

$$f(x) = \frac{(x - 3.5)(x - 3.6)}{(3.4 - 3.5)(3.4 - 3.6)}(0.294118) + \frac{(x - 3.4)(x - 3.6)}{(3.5 - 3.4)(3.5 - 3.6)}(0.285714) + \frac{(x - 3.4)(x - 3.5)}{(3.6 - 3.4)(3.6 - 3.5)}(0.277778)$$
Example
Differentiating (the expanded Lagrange polynomial is the same quadratic as before) and evaluating at x = 3.5:

$$f'(3.5) = 2(0.0234)(3.5) - 0.2455 = -0.0817$$
$$f''(3.5) = 2(0.0234) = 0.0468$$
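■ In MATLAB, the same derivatives follow from the fitted coefficients with the built-in polyder and polyval:

```matlab
% Differentiate the fitted quadratic and evaluate at x = 3.5
p   = [0.0234, -0.2455, 0.858314];   % [a2 a1 a0] from the example
dp  = polyder(p);                    % first-derivative coefficients
ddp = polyder(dp);                   % second-derivative coefficients
f1  = polyval(dp, 3.5);              % f'(3.5)  = -0.0817
f2  = polyval(ddp, 3.5);             % f''(3.5) =  0.0468
```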
Example
3. Divided-difference polynomial: with

$$f[x_0, x_1] = \frac{0.285714 - 0.294118}{3.5 - 3.4} = -0.08404, \qquad f[x_1, x_2] = \frac{0.277778 - 0.285714}{3.6 - 3.5} = -0.07936,$$
$$f[x_0, x_1, x_2] = \frac{-0.07936 - (-0.08404)}{3.6 - 3.4} = 0.0234,$$

the polynomial is

$$f(x) = 0.294118 - 0.08404(x - 3.4) + 0.0234(x - 3.4)(x - 3.5)$$
Example
Differentiating the divided-difference form:

$$f'(x) = -0.08404 + 0.0234(2x - 6.9), \qquad f'(3.5) = -0.0817$$
$$f''(x) = 2(0.0234) = 0.0468$$
Example
■ Note that all 3 methods here give exactly the same answer if the same points are used!
■ The error in f′(3.5), relative to the exact value −0.081633, is

$$\text{Error} = 100\% \times \frac{\left| -0.081700 - (-0.081633) \right|}{\left| -0.081633 \right|} = 0.082\%$$
Curve Fitting in MATLAB
1. polyfit
2. polyval

polyfit takes at least 3 inputs: x, y and n, where x and y are the data points and n is the order/degree of the polynomial. It returns the coefficients of the fitted polynomial (in descending powers).

polyval needs at least two inputs: p and x, where p is a vector of coefficients obtained from polyfit and x is a vector of evaluation points. It returns the y-values of the polynomial.
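For example, refitting the linear-regression data from earlier:

```matlab
% Fit a straight line and evaluate it on a fine grid
x = 1:5;
y = [0.7 2.2 2.8 4.4 4.9];
p  = polyfit(x, y, 1);        % p = [a1 a0], descending powers
xf = linspace(1, 5, 100);     % evaluation points
yf = polyval(p, xf);          % y-values of the fitted line
plot(x, y, 'o', xf, yf, '-')  % data points vs. fitted line
```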
Thank you