Lec4 Numerical Model
Mechanical Engineering
Chapter 17 in Textbook
Lesson Outcomes
By the end of the lesson, students should be able to:
• perform curve fitting numerically on given data
• distinguish between two curve-fitting techniques:
  i. Regression: fits the general trend of the data
  ii. Interpolation: fits a curve that passes through every data point

[Figure: a function f(x) fitted to scattered data, comparing regression and interpolation]
CURVE FITTING TECHNIQUES
1) Least-Squares Regression
• Used when the data is not accurate/precise and exhibits a significant degree of scatter, e.g. experimental data
• Strategy: use a single curve that best fits the general trend of the data. Application: trend analysis
• The best-fit line can be straight or polynomial and does not necessarily pass through the individual points.

y = a_0 + a_1 x
CURVE FITTING TECHNIQUES
2) Interpolation
• Used when the data is very precise/accurate
• Strategy: construct a curve that passes through each of the discrete points
• Application: estimate intermediate values between precise data points
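To make the strategy concrete, here is the simplest possible case, a sketch of straight-line interpolation between two bracketing data points (the function name `lerp` and the sample points are illustrative, not from the slides):

```python
# Minimal sketch of interpolation between precise data points:
# connect the two bracketing points with a straight line and
# read off the intermediate value.

def lerp(x0, y0, x1, y1, x):
    """Linearly interpolate y at x between (x0, y0) and (x1, y1)."""
    return y0 + (y1 - y0) * (x - x0) / (x1 - x0)

# Halfway between (1, 10) and (3, 30) the estimate is 20.
y_mid = lerp(1, 10, 3, 30, 2)
```

Later chapters replace the straight line with a polynomial through all the points, but the idea of passing exactly through the data is the same.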
1) LEAST-SQUARES REGRESSION
a_0, a_1 = coefficients of the line y = a_0 + a_1 x
• The best-fit line is determined by minimizing the sum of the squares of the errors, S_r
BEST-FIT CRITERIA
Find the coefficients that give the least total squared error:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - y_{i,\text{model}})^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2

Setting \partial S_r / \partial a_0 = 0 and \partial S_r / \partial a_1 = 0 and rearranging gives two linear algebraic equations (the normal equations):

n a_0 + \left(\sum x_i\right) a_1 = \sum y_i
\left(\sum x_i\right) a_0 + \left(\sum x_i^2\right) a_1 = \sum x_i y_i

which are solved for the unknowns a_0, a_1.
LINEAR REGRESSION
In matrix form:

\begin{bmatrix} n & \sum x_i \\ \sum x_i & \sum x_i^2 \end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \end{bmatrix}
=
\begin{bmatrix} \sum y_i \\ \sum x_i y_i \end{bmatrix}
LINEAR REGRESSION
• Solving the normal equations gives closed-form formulas.

Slope:

a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}

Intercept (substitute a_1):

a_0 = \bar{y} - a_1 \bar{x}

where the mean values are \bar{y} = \frac{\sum y_i}{n} and \bar{x} = \frac{\sum x_i}{n}, with i = 1, 2, \ldots, n.

Therefore, the linear regression line is y = a_0 + a_1 x.
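The closed-form formulas above translate directly into code. A minimal sketch (the helper name `linear_fit` and the sample data are made up for illustration):

```python
# Least-squares fit of y = a0 + a1*x using the closed-form formulas:
#   a1 = (n*Sxy - Sx*Sy) / (n*Sxx - Sx^2),  a0 = ybar - a1*xbar

def linear_fit(xs, ys):
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    sxx = sum(x * x for x in xs)
    a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)  # slope
    a0 = sy / n - a1 * (sx / n)                     # intercept = ybar - a1*xbar
    return a0, a1

# Points lying exactly on y = 2 + 3x should recover a0 = 2, a1 = 3.
a0, a1 = linear_fit([0, 1, 2, 3], [2, 5, 8, 11])
```

For scattered data the same call returns the trend line; the exact-fit data here just makes the result easy to verify by hand.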
ERROR QUANTIFICATION
Sum of squared regression errors:

S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2
ERROR QUANTIFICATION
• The difference between S_t and S_r provides a measure of the accuracy of the regression, its "goodness of fit".
• To quantify the goodness of fit, we calculate the R-squared value (coefficient of determination), the fraction of the variability in the data explained by the regression line:

r^2 = \frac{S_t - S_r}{S_t}

where

S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2, \qquad S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2

• Example: r² = 0.8 means that 80% of the variability of the data is explained by the regression model; r² = 1 means a perfect fit; r² = 0 means zero correlation. A good fit typically has r² > 0.8.
SOLUTION
x̄ = 3, ȳ = 3
a_1 = 1.06, a_0 = −0.18, so y = −0.18 + 1.06x
S_t = 11.54, S_r = 0.304
S_{y/x} = 0.3183
r = 0.9867, r² = 0.9737
CLASS ACTIVITY #2
Given the data:

x: 1  3  5  7  10
y: 4  5  6  5  8

Find:
(a) the least-squares equation y = a_0 + a_1 x
(b) S_t and S_r
(c) the standard error of the estimate, S_{y/x} = \sqrt{\dfrac{S_r}{n-2}}

Useful formulas:

a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}

S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2, \qquad S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2

Follow-up exercise: the data below was obtained from a creep test performed on a metal specimen. The table shows the increase in strain (%) over time (min) while a constant load was applied to the test specimen. Using the linear regression method, find the equation of the line which best fits the data.
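The whole workflow (fit, then error quantification) can be sketched in a few lines. This uses the Class Activity #2 data purely as a self-check of the formulas; treat it as a sketch, not the official worked solution:

```python
import math

# Class Activity #2 data from the slide.
xs = [1, 3, 5, 7, 10]
ys = [4, 5, 6, 5, 8]
n = len(xs)

sx, sy = sum(xs), sum(ys)
sxy = sum(x * y for x, y in zip(xs, ys))
sxx = sum(x * x for x in xs)

# (a) least-squares line y = a0 + a1*x
a1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
a0 = sy / n - a1 * (sx / n)

# (b) error sums
ybar = sy / n
Sr = sum((y - a0 - a1 * x) ** 2 for x, y in zip(xs, ys))  # squared regression error
St = sum((y - ybar) ** 2 for y in ys)                     # total spread about the mean

# goodness of fit and (c) standard error of the estimate
r2 = (St - Sr) / St
Syx = math.sqrt(Sr / (n - 2))
```

Printing `a0`, `a1`, `Sr`, `St`, `r2`, and `Syx` gives the quantities asked for in parts (a) to (c).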
POLYNOMIAL REGRESSION
The least-squares idea extends to polynomials. For a second-order (quadratic) model:

y = a_0 + a_1 x + a_2 x^2 + e

S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2

[Figure: the same data fitted with a linear model vs. a 2nd-order model]

(see Chapters 17/18)
POLYNOMIAL REGRESSION
Setting each partial derivative of S_r to zero:

coefficient a_0: \quad \frac{\partial S_r}{\partial a_0} = -2 \sum \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right) = 0

coefficient a_1: \quad \frac{\partial S_r}{\partial a_1} = -2 \sum x_i \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right) = 0

coefficient a_2: \quad \frac{\partial S_r}{\partial a_2} = -2 \sum x_i^2 \left(y_i - a_0 - a_1 x_i - a_2 x_i^2\right) = 0
POLYNOMIAL REGRESSION
Rearranging gives three linear normal equations:

n a_0 + \left(\sum x_i\right) a_1 + \left(\sum x_i^2\right) a_2 = \sum y_i
\left(\sum x_i\right) a_0 + \left(\sum x_i^2\right) a_1 + \left(\sum x_i^3\right) a_2 = \sum x_i y_i
\left(\sum x_i^2\right) a_0 + \left(\sum x_i^3\right) a_1 + \left(\sum x_i^4\right) a_2 = \sum x_i^2 y_i
POLYNOMIAL REGRESSION
• Or, in matrix form AX = B:

\begin{bmatrix}
n & \sum x_i & \sum x_i^2 \\
\sum x_i & \sum x_i^2 & \sum x_i^3 \\
\sum x_i^2 & \sum x_i^3 & \sum x_i^4
\end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}
=
\begin{bmatrix}
\sum y_i \\ \sum x_i y_i \\ \sum x_i^2 y_i
\end{bmatrix}

(all sums taken over i = 1, \ldots, n)

• Then, solve for a_0, a_1, a_2 using the Gauss elimination method.
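The matrix A and vector B are assembled mechanically from the data sums; a sketch (the helper name and the small exact-quadratic data set are made up for illustration):

```python
# Build the quadratic-regression normal equations A · [a0, a1, a2]^T = B
# from the sums of powers of x and of x^p * y.

def quad_normal_equations(xs, ys):
    n = len(xs)
    s = lambda p: sum(x ** p for x in xs)                    # sum of x_i^p
    t = lambda p: sum((x ** p) * y for x, y in zip(xs, ys))  # sum of x_i^p * y_i
    A = [[n,    s(1), s(2)],
         [s(1), s(2), s(3)],
         [s(2), s(3), s(4)]]
    B = [t(0), t(1), t(2)]
    return A, B

# Three points lying on y = 1 + x + x^2.
A, B = quad_normal_equations([0, 1, 2], [1, 3, 7])
```

The same pattern extends to higher-order polynomials: an order-m fit needs sums of x^p up to p = 2m and an (m+1)×(m+1) system.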
EXAMPLE
\begin{bmatrix} 6 & 15 & 55 \\ 15 & 55 & 225 \\ 55 & 225 & 979 \end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \\ a_2 \end{bmatrix}
=
\begin{bmatrix} 152.6 \\ 585.6 \\ 2488.8 \end{bmatrix}
By using Gauss elimination, we obtain:
a_0 = 2.47857, a_1 = 2.35929, a_2 = 1.86071
Hence, the least-squares quadratic equation:
y = 2.47857 + 2.35929x + 1.86071x2
(see Example 17.5 in the textbook)
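The Gauss elimination step can be sketched directly on the 3×3 system above. This is a naive version without pivoting, which is adequate for this small, well-conditioned system (a sketch, not a production solver):

```python
def gauss_solve(A, b):
    """Naive Gauss elimination with back substitution (no pivoting)."""
    n = len(b)
    A = [row[:] for row in A]  # work on copies, keep inputs intact
    b = b[:]
    # Forward elimination: zero out the entries below each pivot.
    for k in range(n - 1):
        for i in range(k + 1, n):
            f = A[i][k] / A[k][k]
            for j in range(k, n):
                A[i][j] -= f * A[k][j]
            b[i] -= f * b[k]
    # Back substitution, from the last row upward.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (b[i] - sum(A[i][j] * x[j] for j in range(i + 1, n))) / A[i][i]
    return x

# The system from the slide:
a0, a1, a2 = gauss_solve([[6, 15, 55],
                          [15, 55, 225],
                          [55, 225, 979]],
                         [152.6, 585.6, 2488.8])
```

The result agrees with the coefficients quoted on the slide (a_0 ≈ 2.47857, a_1 ≈ 2.35929, a_2 ≈ 1.86071). For larger or ill-conditioned systems, partial pivoting should be added.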
SUMMARY
• Two techniques for curve fitting: regression and interpolation.
• Types of least-squares regression:
  • Linear regression
  • Polynomial regression