
NUMERICAL ANALYSIS

Prof. Dr. Süheyla ÇEHRELİ

11
CURVE FITTING
There are two general approaches for curve fitting:

• Least squares regression:
Data exhibit a significant degree of scatter. The strategy is to derive a single curve that represents the general trend of the data.

• Interpolation:
Data is very precise. The strategy is to pass a
curve or a series of curves through each of the
points.
MATHEMATICAL BACKGROUND
• Arithmetic mean. The sum of the individual data points (yi) divided by the number of points (n):

$$\bar{y} = \frac{\sum y_i}{n}, \qquad i = 1, \ldots, n$$

• Standard deviation. The most common measure of spread for a sample:

$$S_y = \sqrt{\frac{S_t}{n-1}}, \qquad S_t = \sum (y_i - \bar{y})^2$$
MATHEMATICAL BACKGROUND
• Variance. Representation of spread by the square of the standard deviation:

$$S_y^2 = \frac{\sum (y_i - \bar{y})^2}{n-1} \qquad \text{or} \qquad S_y^2 = \frac{\sum y_i^2 - \left(\sum y_i\right)^2/n}{n-1}$$

• Coefficient of variation. Quantifies the spread of the data relative to the mean:

$$\mathrm{c.v.} = \frac{S_y}{\bar{y}} \times 100\%$$
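These background statistics translate directly into code. The following is a minimal Python sketch that follows the formulas above; the function name descriptive_stats and the sample data are illustrative, not part of the lecture.

```python
import math

def descriptive_stats(y):
    """Arithmetic mean, standard deviation, variance and coefficient of
    variation of a sample, following the formulas above."""
    n = len(y)
    y_bar = sum(y) / n                        # arithmetic mean
    S_t = sum((yi - y_bar) ** 2 for yi in y)  # sum of squares around the mean
    S_y2 = S_t / (n - 1)                      # variance
    S_y = math.sqrt(S_y2)                     # standard deviation
    cv = S_y / y_bar * 100                    # coefficient of variation, in percent
    return y_bar, S_y, S_y2, cv

# Illustrative call with arbitrary sample data:
print(descriptive_stats([200, 230, 240, 270, 290]))
```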
LEAST SQUARES REGRESSION
Linear Regression
Fitting a straight line to a set of paired
observations: (x1, y1), (x2, y2),…,(xn, yn).
$$y = a_0 + a_1 x + e$$

a1 - slope
a0 - intercept
e - error, or residual, between the model and the observations
CRITERIA FOR A "BEST" FIT

• How can a0 and a1 be found so that the error is minimized? One option is to minimize the sum of the residuals, or the sum of their absolute values:

$$\min \sum_{i=1}^{n} e_i = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)$$

$$\min \sum_{i=1}^{n} |e_i| = \sum_{i=1}^{n} |y_i - a_0 - a_1 x_i|$$
CRITERIA FOR A "BEST" FIT

• Another strategy is to minimize the sum of the squares of the residuals between the measured y and the y calculated with the linear model:

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} \left(y_{i,\mathrm{measured}} - y_{i,\mathrm{model}}\right)^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$

$$\min S_r = \min \sum_{i=1}^{n} e_i^2 = \min \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$
• The coefficients a0 and a1 that minimize Sr must satisfy the following conditions:

$$\frac{\partial S_r}{\partial a_0} = -2\sum (y_i - a_0 - a_1 x_i) = 0$$

$$\frac{\partial S_r}{\partial a_1} = -2\sum (y_i - a_0 - a_1 x_i)\,x_i = 0$$

Expanding and noting that $\sum a_0 = n a_0$ gives the normal equations:

$$0 = \sum y_i - \sum a_0 - \sum a_1 x_i$$

$$0 = \sum y_i x_i - \sum a_0 x_i - \sum a_1 x_i^2$$

$$n a_0 + \left(\sum x_i\right) a_1 = \sum y_i$$

$$\left(\sum x_i\right) a_0 + \left(\sum x_i^2\right) a_1 = \sum x_i y_i$$

These are 2 equations with 2 unknowns and can be solved simultaneously:

$$a_1 = \frac{n\sum x_i y_i - \sum x_i \sum y_i}{n\sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}$$
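As an illustration, the closed-form solution above can be coded directly; the following is a minimal Python sketch, with the illustrative function name fit_line and the assumption that x and y are equally long sequences of numbers.

```python
def fit_line(x, y):
    """Least-squares straight line y = a0 + a1*x, using the closed-form
    solution of the normal equations derived above."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)

    a1 = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)  # slope
    a0 = sum_y / n - a1 * sum_x / n                                # intercept: y_bar - a1*x_bar
    return a0, a1
```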

• The total sum of the squares around the mean for the dependent variable, y, is St:

$$S_t = \sum (y_i - \bar{y})^2$$

• The sum of the squares of the residuals around the regression line is Sr:

$$S_r = \sum_{i=1}^{n} e_i^2 = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2$$
• The standard deviation for the regression line can be determined as

$$S_{y/x} = \sqrt{\frac{S_r}{n-2}}$$

where Sy/x is the standard error of the estimate.

The subscript y/x designates that the error is for a predicted value of y corresponding to a particular value of x.

• St − Sr quantifies the improvement or error reduction due to describing the data in terms of a straight line rather than as an average value:

$$r^2 = \frac{S_t - S_r}{S_t}$$

r2: coefficient of determination
r: correlation coefficient

For a perfect fit:
• Sr = 0 and r = r2 = 1, signifying that the line explains 100 percent of the variability of the data.
• For r = r2 = 0, Sr = St and the fit represents no improvement.
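A sketch of how St, Sr, Sy/x and r2 could be computed once a0 and a1 are known (for example, from the fit_line sketch above); the function name goodness_of_fit is illustrative.

```python
import math

def goodness_of_fit(x, y, a0, a1):
    """St, Sr, standard error of the estimate S_{y/x} and r^2
    for a fitted line y = a0 + a1*x, per the formulas above."""
    n = len(y)
    y_bar = sum(y) / n
    S_t = sum((yi - y_bar) ** 2 for yi in y)                     # spread around the mean
    S_r = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # spread around the line
    s_yx = math.sqrt(S_r / (n - 2))                              # standard error of the estimate
    r2 = (S_t - S_r) / S_t                                       # coefficient of determination
    return S_t, S_r, s_yx, r2
```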
• EXP 11-1: Fit a straight line to the x, y values given below.

x y
0 200
3 230
5 240
8 270
10 290
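One way to cross-check a hand calculation for this exercise is NumPy's polyfit with degree 1, which fits the same least-squares straight line; this is only an independent check, not the normal-equation procedure derived in this lecture.

```python
import numpy as np

x = np.array([0, 3, 5, 8, 10])
y = np.array([200, 230, 240, 270, 290])

# np.polyfit returns the coefficients from the highest degree down: [a1, a0]
a1, a0 = np.polyfit(x, y, 1)
print(f"y = {a0:.4f} + {a1:.4f} x")
```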
• EXP 11-2: Fit a straight line to the x, y values given below, and calculate the standard deviation, the standard error of the estimate, and the correlation coefficient of the data.

x y
1 0.5
2 2.5
3 2.0
4 4.0
5 3.5
6 6.0
7 5.5
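A sketch of how the quantities asked for in EXP 11-2 might be computed with the formulas from this lecture; the variable names are illustrative and the printed values should be checked against a hand calculation.

```python
import math

x = [1, 2, 3, 4, 5, 6, 7]
y = [0.5, 2.5, 2.0, 4.0, 3.5, 6.0, 5.5]
n = len(x)

# Least-squares line y = a0 + a1*x from the normal equations
sx, sy = sum(x), sum(y)
sxy = sum(xi * yi for xi, yi in zip(x, y))
sx2 = sum(xi ** 2 for xi in x)
a1 = (n * sxy - sx * sy) / (n * sx2 - sx ** 2)
a0 = sy / n - a1 * sx / n

# Spread statistics
y_bar = sy / n
S_t = sum((yi - y_bar) ** 2 for yi in y)
S_r = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))

S_y = math.sqrt(S_t / (n - 1))     # standard deviation of y
S_yx = math.sqrt(S_r / (n - 2))    # standard error of the estimate
r = math.sqrt((S_t - S_r) / S_t)   # correlation coefficient

print(a0, a1, S_y, S_yx, r)
```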
