Lecture 11

The document discusses least squares curve fitting and linear regression. It introduces the concepts of selecting appropriate functions to fit data, deciding on a criterion for determining the best fit, and deriving the normal equations for linear regression. Examples are provided to illustrate linear and polynomial regression as well as multiple linear regression. The objective is to minimize the sum of the squared residuals by determining the coefficients that best fit the observed data to the chosen function.


LEAST SQUARES CURVE FITTING

Read sections 17.1, 17.2 and 17.3 in the textbook

CURVE FITTING
SELECTION OF THE FUNCTIONS
DECIDE ON THE CRITERION
LINEAR REGRESSION

$$S_r = \sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2, \qquad S_t = \sum_{i=1}^{n} (y_i - \bar{y})^2$$

$$a_1 = \frac{n \sum x_i y_i - \sum x_i \sum y_i}{n \sum x_i^2 - \left(\sum x_i\right)^2}, \qquad a_0 = \bar{y} - a_1 \bar{x}, \qquad r^2 = \frac{S_t - S_r}{S_t}$$
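As a minimal sketch of these closed-form formulas in NumPy (the function name and the sample data are illustrative, not from the lecture):

```python
import numpy as np

def linear_regression(x, y):
    """Least-squares line y ~ a0 + a1*x using the closed-form normal-equation
    solution above, plus the coefficient of determination r^2 = (St - Sr)/St."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    a1 = (n * np.sum(x * y) - np.sum(x) * np.sum(y)) / (n * np.sum(x**2) - np.sum(x)**2)
    a0 = np.mean(y) - a1 * np.mean(x)
    Sr = np.sum((y - a0 - a1 * x) ** 2)    # sum of squared residuals
    St = np.sum((y - np.mean(y)) ** 2)     # total sum of squares about the mean
    return a0, a1, (St - Sr) / St

if __name__ == "__main__":
    # Illustrative data only (not from the slides)
    print(linear_regression([1, 2, 3, 4, 5], [2.1, 3.9, 6.2, 7.8, 10.1]))
```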
EXAMPLE

Given:

xi   1    2   3
yi   2.4  5   9

Find a function $f(x) = a e^{bx}$ that best fits the data.

$$S_r(a, b) = \sum_{i=1}^{n} \left( y_i - a e^{b x_i} \right)^2$$

The normal equations are obtained using:

$$\frac{\partial S_r}{\partial a} = -2 \sum_{i=1}^{n} \left( y_i - a e^{b x_i} \right) e^{b x_i} = 0$$

$$\frac{\partial S_r}{\partial b} = -2 \sum_{i=1}^{n} \left( y_i - a e^{b x_i} \right) a\, x_i e^{b x_i} = 0$$

These equations are nonlinear in $a$ and $b$ and are difficult to solve directly.
LINEARIZATION METHOD

Find a function $f(x) = a e^{bx}$ that best fits the data.

Define $g(x) = \ln(f(x)) = \ln(a) + b x$, so that $z_i = \ln(y_i) \approx \ln(a) + b x_i$.

Let $\alpha = \ln(a)$ and $z_i = \ln(y_i)$.

Instead of minimizing
$$S_r(a, b) = \sum_{i=1}^{n} \left( y_i - a e^{b x_i} \right)^2,$$
minimize
$$S_r(\alpha, b) = \sum_{i=1}^{n} \left( z_i - \alpha - b x_i \right)^2 \qquad \text{(easier to solve)}.$$
EXAMPLE: EQUATIONS

$$S_r(\alpha, b) = \sum_{i=1}^{n} \left( z_i - \alpha - b x_i \right)^2$$

The normal equations are obtained using:

$$\frac{\partial S_r}{\partial \alpha} = -2 \sum_{i=1}^{n} \left( z_i - \alpha - b x_i \right) = 0$$

$$\frac{\partial S_r}{\partial b} = -2 \sum_{i=1}^{n} \left( z_i - \alpha - b x_i \right) x_i = 0$$

which give

$$n\,\alpha + b \sum_{i=1}^{n} x_i = \sum_{i=1}^{n} z_i
\qquad \text{and} \qquad
\alpha \sum_{i=1}^{n} x_i + b \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} x_i z_i$$
EVALUATING SUMS AND SOLVING

xi            1         2         3         ∑ = 6
yi            2.4       5         9
zi = ln(yi)   0.875469  1.609438  2.197225  ∑ = 4.68213
xi^2          1         4         9         ∑ = 14
xi zi         0.875469  3.218876  6.591674  ∑ = 10.6860

Equations:
$$3\alpha + 6b = 4.68213$$
$$6\alpha + 14b = 10.686$$

Solving the equations:
$$\alpha = 0.23897, \qquad b = 0.66087$$

Since $\alpha = \ln(a)$, $a = e^{\alpha} = e^{0.23897} = 1.26994$, so
$$f(x) = a e^{bx} = 1.26994\, e^{0.66087 x}$$

See pages 467 and 468 in the textbook.
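A minimal NumPy sketch of this linearization step (variable names are illustrative); it should reproduce the slide's values of roughly 0.23897, 0.66087, and 1.26994:

```python
import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.4, 5.0, 9.0])

z = np.log(y)                       # z_i = ln(y_i)
n = len(x)

# Normal equations of the linearized model z ~ alpha + b*x
A = np.array([[n,        x.sum()],
              [x.sum(),  (x**2).sum()]])
rhs = np.array([z.sum(), (x * z).sum()])
alpha, b = np.linalg.solve(A, rhs)

a = np.exp(alpha)                   # recover a from alpha = ln(a)
print(alpha, b, a)                  # expect roughly 0.23897, 0.66087, 1.26994
```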
EXAMPLE SHOWING THAT QUADRATIC REGRESSION IS PREFERABLE TO LINEAR REGRESSION

[Figure: two y-vs-x panels of the same data, "Linear Regression" and "Quadratic Regression".]
POLYNOMIAL REGRESSION

The least squares method can be extended to fit the data to a higher-order polynomial:

$$f(x) = a + b x + c x^2, \qquad e_i^2 = \left( y_i - f(x_i) \right)^2$$

Minimize
$$S_r = \sum_{i=1}^{n} \left( y_i - a - b x_i - c x_i^2 \right)^2$$

Necessary conditions:
$$\frac{\partial S_r}{\partial a} = 0, \qquad \frac{\partial S_r}{\partial b} = 0, \qquad \frac{\partial S_r}{\partial c} = 0$$
NORMAL EQUATIONS

$$a\,n + b \sum_{i=1}^{n} x_i + c \sum_{i=1}^{n} x_i^2 = \sum_{i=1}^{n} y_i$$

$$a \sum_{i=1}^{n} x_i + b \sum_{i=1}^{n} x_i^2 + c \sum_{i=1}^{n} x_i^3 = \sum_{i=1}^{n} x_i y_i$$

$$a \sum_{i=1}^{n} x_i^2 + b \sum_{i=1}^{n} x_i^3 + c \sum_{i=1}^{n} x_i^4 = \sum_{i=1}^{n} x_i^2 y_i$$
EXAMPLE 1: POLYNOMIAL REGRESSION

Fit a second-order polynomial to the following data:

xi       0    1    2     3      4      5       ∑ = 15
yi       2.1  7.7  13.6  27.2   40.9   61.1    ∑ = 152.6
xi^2     0    1    4     9      16     25      ∑ = 55
xi^3     0    1    8     27     64     125     ∑ = 225
xi^4     0    1    16    81     256    625     ∑ = 979
xi yi    0    7.7  27.2  81.6   163.6  305.5   ∑ = 585.6
xi^2 yi  0    7.7  54.4  244.8  654.4  1527.5  ∑ = 2488.8
EXAMPLE 1: EQUATIONS AND SOLUTION

$$6a + 15b + 55c = 152.6$$
$$15a + 55b + 225c = 585.6$$
$$55a + 225b + 979c = 2488.8$$

Solving:
$$a = 2.4786, \qquad b = 2.3593, \qquad c = 1.8607$$
$$f(x) = 2.4786 + 2.3593\,x + 1.8607\,x^2$$
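As a quick cross-check, NumPy's built-in polynomial fit on the same data should agree with the slide's coefficients (np.polyfit returns the highest power first):

```python
import numpy as np

x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = np.array([2.1, 7.7, 13.6, 27.2, 40.9, 61.1])

# Highest power first: approximately [1.8607, 2.3593, 2.4786]
print(np.polyfit(x, y, 2))
```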
EXAMPLE 2

Fit a linear regression, a second-order polynomial, and a third-order polynomial to the data in the following table.

X 1 1.5 2 2.75 3.25 4.1 4.9 5 5.5 5.7 7 7.5 8.5 9.2 9.8 11 11.2 11.6
Y 2 1 1.8 2.2 1 3 4 5.3 6 7 7.5 9.3 8.9 7.5 7 6 5.3 4
x  y  x^2  x^3  x^4  x^5  x^6  xy  x^2y  x^3y
1 2 1 1 1 1 1 2 2 2
1.5 1 2.25 3.375 5.0625 7.59375 11.390625 1.5 2.25 3.375
2 1.8 4 8 16 32 64 3.6 7.2 14.4
2.75 2.2 7.5625 20.796875 57.19141 157.276367 432.51001 6.05 16.6375 45.75313
3.25 1 10.5625 34.328125 111.5664 362.59082 1178.42017 3.25 10.5625 34.32813
4.1 3 16.81 68.921 282.5761 1158.56201 4750.10424 12.3 50.43 206.763
4.9 4 24.01 117.649 576.4801 2824.75249 13841.2872 19.6 96.04 470.596
5 5.3 25 125 625 3125 15625 26.5 132.5 662.5
5.5 6 30.25 166.375 915.0625 5032.84375 27680.6406 33 181.5 998.25
5.7 7 32.49 185.193 1055.6 6016.92057 34296.4472 39.9 227.43 1296.351
7 7.5 49 343 2401 16807 117649 52.5 367.5 2572.5
7.5 9.3 56.25 421.875 3164.063 23730.4688 177978.516 69.75 523.125 3923.438
8.5 8.9 72.25 614.125 5220.063 44370.5313 377149.516 75.65 643.025 5465.713
9.2 7.5 84.64 778.688 7163.93 65908.1523 606355.001 69 634.8 5840.16
9.8 7 96.04 941.192 9223.682 90392.0797 885842.381 68.6 672.28 6588.344
11 6 121 1331 14641 161051 1771561 66 726 7986
11.2 5.3 125.44 1404.928 15735.19 176234.168 1973822.69 59.36 664.832 7446.118
11.6 4 134.56 1560.896 18106.39 210034.166 2436396.32 46.4 538.24 6243.584
∑ = 111.5  88.8  893.115  8126.342  79300.86  807246.106  8444635.22  654.96  5496.352  49800.17
EXAMPLE 2

Linear regression:

$$\begin{bmatrix} 18 & 111.5 \\ 111.5 & 893.115 \end{bmatrix}
\begin{bmatrix} a_0 \\ a_1 \end{bmatrix} =
\begin{bmatrix} 88.8 \\ 654.96 \end{bmatrix}
\qquad \Longrightarrow \qquad
\begin{bmatrix} a_0 \\ a_1 \end{bmatrix} =
\begin{bmatrix} 1.7236 \\ 0.5182 \end{bmatrix}$$
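A minimal check of this 2 x 2 system with NumPy (illustrative only):

```python
import numpy as np

A = np.array([[18.0, 111.5],
              [111.5, 893.115]])
b = np.array([88.8, 654.96])

# Expect approximately a0 = 1.7236, a1 = 0.5182
print(np.linalg.solve(A, b))
```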
EXAMPLE 2

[Second-order and third-order polynomial fits: the corresponding normal-equation systems and coefficients appear on the original slides.]
EXAMPLE 2

[Figure: the data points ("Points") plotted together with the linear ("linear reg"), second-order ("2nd Order"), and third-order ("3rd Order") fits for x from 0 to 14.]
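A sketch, assuming Matplotlib is available, that reproduces a comparison like the one in the figure by fitting the Example 2 data with first-, second-, and third-order polynomials:

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.array([1, 1.5, 2, 2.75, 3.25, 4.1, 4.9, 5, 5.5, 5.7,
              7, 7.5, 8.5, 9.2, 9.8, 11, 11.2, 11.6])
y = np.array([2, 1, 1.8, 2.2, 1, 3, 4, 5.3, 6, 7,
              7.5, 9.3, 8.9, 7.5, 7, 6, 5.3, 4])

xs = np.linspace(0, 14, 200)
plt.plot(x, y, 'o', label='Points')
for degree, label in [(1, 'linear reg'), (2, '2nd Order'), (3, '3rd Order')]:
    coeffs = np.polyfit(x, y, degree)        # least-squares fit of the given degree
    plt.plot(xs, np.polyval(coeffs, xs), label=label)
plt.legend()
plt.show()
```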
MULTIPLE LINEAR REGRESSION

$$f(x_1, x_2) = a + b x_1 + c x_2, \qquad e_i^2 = \left( y_i - f(x_{1i}, x_{2i}) \right)^2$$

Minimize
$$S_r = \sum_{i=1}^{n} \left( y_i - a - b x_{1i} - c x_{2i} \right)^2$$

Necessary conditions:
$$\frac{\partial S_r}{\partial a} = 0, \qquad \frac{\partial S_r}{\partial b} = 0, \qquad \frac{\partial S_r}{\partial c} = 0$$
EQUATIONS FOR MULTIPLE LINEAR REGRESSION

Minimize
$$S_r = \sum_{i=1}^{n} \left( y_i - a - b x_{1i} - c x_{2i} \right)^2$$

$$\frac{\partial S_r}{\partial a} = -2 \sum_{i=1}^{n} \left( y_i - a - b x_{1i} - c x_{2i} \right) = 0$$

$$\frac{\partial S_r}{\partial b} = -2 \sum_{i=1}^{n} \left( y_i - a - b x_{1i} - c x_{2i} \right) x_{1i} = 0$$

$$\frac{\partial S_r}{\partial c} = -2 \sum_{i=1}^{n} \left( y_i - a - b x_{1i} - c x_{2i} \right) x_{2i} = 0$$
NORMAL EQUATIONS

$$a\,n + b \sum_{i=1}^{n} x_{1i} + c \sum_{i=1}^{n} x_{2i} = \sum_{i=1}^{n} y_i$$

$$a \sum_{i=1}^{n} x_{1i} + b \sum_{i=1}^{n} x_{1i}^2 + c \sum_{i=1}^{n} x_{1i} x_{2i} = \sum_{i=1}^{n} x_{1i} y_i$$

$$a \sum_{i=1}^{n} x_{2i} + b \sum_{i=1}^{n} x_{1i} x_{2i} + c \sum_{i=1}^{n} x_{2i}^2 = \sum_{i=1}^{n} x_{2i} y_i$$
EXAMPLE AND SOLUTION

[A worked multiple linear regression example and its solution appear on the original slides.]
FITTING WITH NONLINEAR FUNCTIONS

xi  0.24  0.65   0.95  1.24   1.73  2.01  2.23   2.52
yi  0.23  -0.23  -1.1  -0.45  0.27  0.1   -0.29  0.24

It is required to find a function of the form
$$f(x) = a \ln(x) + b \cos(x) + c\, e^x$$
to fit the data.

$$S_r = \sum_{i=1}^{n} \left( y_i - f(x_i) \right)^2$$
FITTING WITH NONLINEAR FUNCTIONS

$$S_r = \sum_{i=1}^{n} \left( y_i - a \ln(x_i) - b \cos(x_i) - c\, e^{x_i} \right)^2$$

Necessary condition for the minimum:

$$\frac{\partial S_r}{\partial a} = 0, \qquad \frac{\partial S_r}{\partial b} = 0, \qquad \frac{\partial S_r}{\partial c} = 0 \qquad \Longrightarrow \qquad \text{Normal Equations}$$
NORMAL EQUATIONS

$$a \sum_{i=1}^{n} (\ln x_i)^2 + b \sum_{i=1}^{n} (\ln x_i)(\cos x_i) + c \sum_{i=1}^{n} (\ln x_i)\, e^{x_i} = \sum_{i=1}^{n} y_i \ln x_i$$

$$a \sum_{i=1}^{n} (\ln x_i)(\cos x_i) + b \sum_{i=1}^{n} (\cos x_i)^2 + c \sum_{i=1}^{n} (\cos x_i)\, e^{x_i} = \sum_{i=1}^{n} y_i \cos x_i$$

$$a \sum_{i=1}^{n} (\ln x_i)\, e^{x_i} + b \sum_{i=1}^{n} (\cos x_i)\, e^{x_i} + c \sum_{i=1}^{n} \left( e^{x_i} \right)^2 = \sum_{i=1}^{n} y_i\, e^{x_i}$$

Evaluate the sums and solve the normal equations.
xi              0.24    0.65     0.95     1.24     1.73     2.01     2.23     2.52     ∑ = 11.57
yi              0.23    -0.23    -1.1     -0.45    0.27     0.1      -0.29    0.24     ∑ = -1.23
(ln xi)^2       2.036   0.1856   0.0026   0.0463   0.3004   0.4874   0.6432   0.8543   ∑ = 4.556
ln(xi) cos(xi)  -1.386  -0.3429  -0.0298  0.0699   -0.0869  -0.2969  -0.4912  -0.7514  ∑ = -3.316
ln(xi) e^xi     -1.814  -0.8252  -0.1326  0.7433   3.0918   5.2104   7.4585   11.487   ∑ = 25.219
yi ln(xi)       -0.328  0.0991   0.0564   -0.0968  0.1480   0.0698   -0.2326  0.2218   ∑ = -0.0625
cos(xi)^2       0.943   0.6337   0.3384   0.1055   0.0251   0.1808   0.3751   0.6609   ∑ = 3.26307
cos(xi) e^xi    1.235   1.5249   1.5041   1.1224   -0.8942  -3.1735  -5.696   -10.104  ∑ = -14.481
yi cos(xi)      0.223   -0.1831  -0.6399  -0.1462  -0.0428  -0.0425  0.1776   -0.1951  ∑ = -0.8485
(e^xi)^2        1.616   3.6693   6.6859   11.941   31.817   55.701   86.488   154.47   ∑ = 352.39
yi e^xi         0.2924  -0.4406  -2.844   -1.555   1.523    0.7463   -2.697   2.9829   ∑ = -1.9923
EXAMPLE 3: EQUATIONS & SOLUTION

$$4.55643\,a - 3.31547\,b + 25.2192\,c = -0.062486$$
$$-3.31547\,a + 3.26307\,b - 14.4815\,c = -0.848514$$
$$25.2192\,a - 14.4815\,b + 352.388\,c = -1.992283$$

Solving the above equations:

$$a = -0.88815, \qquad b = -1.1074, \qquad c = 0.012398$$

Therefore,
$$f(x) = -0.88815 \ln(x) - 1.1074 \cos(x) + 0.012398\, e^x$$
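A minimal NumPy sketch of this fit using a design matrix whose columns are ln(x), cos(x), and e^x (an equivalent way of forming and solving the same normal equations); it should reproduce roughly a = -0.888, b = -1.107, c = 0.0124 from the slide:

```python
import numpy as np

x = np.array([0.24, 0.65, 0.95, 1.24, 1.73, 2.01, 2.23, 2.52])
y = np.array([0.23, -0.23, -1.1, -0.45, 0.27, 0.1, -0.29, 0.24])

# Design matrix with one column per basis function: ln(x), cos(x), e^x
Z = np.column_stack([np.log(x), np.cos(x), np.exp(x)])

# Normal equations (Z^T Z) [a, b, c]^T = Z^T y
coeffs = np.linalg.solve(Z.T @ Z, Z.T @ y)
print(coeffs)   # expect approximately [-0.888, -1.107, 0.0124]
```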
