
Computational Techniques

Module 5: Regression and Interpolation

Dr. Niket Kaisare


Department of Chemical Engineering
Indian Institute of Technology - Madras
Regression Example
• Given the following data:

  x    0.8    1.4    2.7    3.8    4.8    4.9
  y    0.69   1.00   2.02   2.39   2.34   2.83

• Regression: obtain a straight line that best fits the data.
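
A minimal Python sketch of such a straight-line fit (assuming NumPy is available; this is an illustration, not part of the slides), using the data from the table above:

```python
# Straight-line (degree-1) least-squares fit to the tabulated data.
import numpy as np

x = np.array([0.8, 1.4, 2.7, 3.8, 4.8, 4.9])
y = np.array([0.69, 1.00, 2.02, 2.39, 2.34, 2.83])

# np.polyfit returns coefficients from highest degree down: [slope, intercept]
slope, intercept = np.polyfit(x, y, deg=1)
print(f"best-fit line: y = {intercept:.2f} + {slope:.2f} x")
```

The coefficients should come out close to the line y ≈ 0.45 + 0.47x quoted on a later slide.
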
Interpolation Example
• Given the following data:

  x    0.8    1.4    2.7    3.8    4.8    4.9
  y    0.69   1.00   2.02   2.39   2.34   2.83

• Interpolation: "join the dots" and find a curve passing through the data.
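
As a sketch (assuming NumPy only; the Lagrange and Newton methods later in this module are the systematic way to do this), two simple ways to "join the dots" for the table above:

```python
# Two simple ways to pass a curve through the tabulated points.
import numpy as np

x = np.array([0.8, 1.4, 2.7, 3.8, 4.8, 4.9])
y = np.array([0.69, 1.00, 2.02, 2.39, 2.34, 2.83])

# Piecewise-linear interpolation: straight segments between neighbouring points.
y_linear = np.interp(2.0, x, y)

# A single degree-5 polynomial through all six points (exact at every data point).
coeffs = np.polyfit(x, y, deg=len(x) - 1)
y_poly = np.polyval(coeffs, 2.0)

print(y_linear, y_poly)
```

Both interpolants reproduce every data point exactly but can differ between the points; querying them at an intermediate x, such as x = 2.0, is exactly the kind of use shown on the next slide.
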
Regression vs. Interpolation
  x    0.8    1.4    2.7    3.8    4.8    4.9
  y    0.69   1.00   2.02   2.39   2.34   2.83

• In regression, we are interested in fitting a chosen function to the data:
  y = 0.45 + 0.47x

• In interpolation, given a finite amount of data, we are interested in obtaining new data points within this range:
  at x = 2.0, y = 1.87
Example: Kinetic Rate Constants (Regression)
• Experiments with conversion measured at various temperatures

  $r = k_0 \, e^{-E/(RT)} \, c_a$

  $\ln(r) = \ln(k_0) - \dfrac{E}{R}\left(\dfrac{1}{T}\right) + \ln(c_a)$

  [Plot: log(r) versus −1/T; slope = E/R, intercept = ln(k0)]
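
A minimal sketch of this linearization, assuming NumPy; the rate data are synthetic, generated from assumed values of k0 and E purely to illustrate recovering slope = E/R and intercept = ln(k0) by linear regression:

```python
# Linearised Arrhenius fit: ln(r) = ln(k0) - (E/R)(1/T) + ln(ca).
# Rate data are synthetic (generated from assumed k0 and E) for illustration.
import numpy as np

R = 8.314                          # J/(mol K)
k0_true, E_true = 1.0e7, 5.0e4     # assumed pre-exponential factor and activation energy
T = np.linspace(300.0, 500.0, 8)   # temperatures, K
ca = np.full_like(T, 2.0)          # assumed constant concentration
r = k0_true * np.exp(-E_true / (R * T)) * ca

# Regress ln(r) - ln(ca) against -1/T, as in the plot on this slide.
slope, intercept = np.polyfit(-1.0 / T, np.log(r) - np.log(ca), deg=1)
print("E  =", slope * R)          # slope = E/R
print("k0 =", np.exp(intercept))  # intercept = ln(k0)
```
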
Example: Viscosity of Oil (Interpolation)
• The viscosity of a lubricant oil was measured between −20 and 200 °C in steps of 20 °C.
• Interpolation is used if the viscosity is desired at an intermediate temperature.
General Setup
• Let x be an independent variable and y be a dependent variable.

• Given the data:

  $(x_1, y_1),\; (x_2, y_2),\; \ldots,\; (x_N, y_N)$

  find parameters $\theta$ to get a "best-fit" curve

  $y = f(x; \theta)$
Regression vs. Interpolation
Regression:
• Choose a function form for $f(x; \theta)$
• For a given $\theta$, obtain the values $\hat{y}_i$ from the model
• The best $\theta$ minimizes the error $y_i - \hat{y}_i$

Interpolation:
• Various standard function forms exist
• The interpolating function passes through all the points
• Can be used to "fill in" the data at new points
Regression vs. Interpolation
• Regression ("curve fitting"): obtain a functional form to fit the data.

• Interpolation ("joining the dots"): obtain the value of y at an intermediate point.
Outline: Regression
• Linear Regression in One Variable
• Linear Regression in Multiple Variables
• Polynomial Regression
• Analysis and Extension
• Non-Linear Regression
Linear Regression: One Variable
• Model:       $y = f(x; \theta)$

• Actual data: $(x_1, y_1),\; (x_2, y_2),\; \ldots,\; (x_N, y_N)$

• Prediction:  $\hat{y}_i = f(x_i; \theta)$

• Errors:      $e_i = y_i - \hat{y}_i$, so that $y_i = \underbrace{f(x_i; \theta)}_{\hat{y}_i} + e_i$

• Mean / variance:

  $\bar{x} = \dfrac{\sum_i x_i}{N}, \qquad s_x^2 = \dfrac{\sum_i (x_i - \bar{x})^2}{N - 1}, \qquad s_x = \sqrt{s_x^2}$
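
A minimal sketch of these quantities, assuming NumPy and a straight-line choice of $f(x;\theta)$ (the linear form is an assumption made for illustration), using the example data from earlier slides:

```python
# Quantities from this slide for a straight-line model y = theta0 + theta1*x.
import numpy as np

x = np.array([0.8, 1.4, 2.7, 3.8, 4.8, 4.9])
y = np.array([0.69, 1.00, 2.02, 2.39, 2.34, 2.83])
N = len(x)

x_bar = x.sum() / N                          # mean of x
s_x2 = ((x - x_bar) ** 2).sum() / (N - 1)    # sample variance; s_x = sqrt(s_x2)

# Least-squares estimates of slope and intercept for the straight line
theta1 = ((x - x_bar) * (y - y.mean())).sum() / ((x - x_bar) ** 2).sum()
theta0 = y.mean() - theta1 * x_bar

y_hat = theta0 + theta1 * x    # predictions  y_hat_i = f(x_i; theta)
e = y - y_hat                  # errors       e_i = y_i - y_hat_i
```
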
Extension to Multi-Variables
• Let $x_1, x_2, \ldots, x_n$ be n variables, and let there be N data points for each:

  $(x_{11}, x_{21}, \ldots, x_{n1};\; y_1),\; (x_{12}, x_{22}, \ldots, x_{n2};\; y_2),\; \ldots,\; (x_{1N}, x_{2N}, \ldots, x_{nN};\; y_N)$

• Obtain $\theta$ for

  $y = f(\mathbf{x}; \theta)$
Linear Regression (multi-variable)
• Data:   $(x_i, u_i, w_i;\; y_i)$

• Model:  $y = a_0 + a_1 x + a_2 u + a_3 w$

• Error:  $e_i = y_i - (a_0 + a_1 x_i + a_2 u_i + a_3 w_i)$

• Least-squares criterion:

  $\displaystyle \min_{a_0, a_1, a_2, a_3} \sum_{i=1}^{N} \Big[\underbrace{y_i - (a_0 + a_1 x_i + a_2 u_i + a_3 w_i)}_{e_i}\Big]^2$
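
As an illustration, the criterion can be handed directly to a general-purpose minimizer; this sketch assumes SciPy and uses synthetic data generated inside the script from assumed coefficients (in practice, the closed-form solution on the next slide is preferred):

```python
# The least-squares criterion written out and minimised numerically.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x, u, w = rng.random(10), rng.random(10), rng.random(10)
# Synthetic response from assumed coefficients, plus small noise.
y = 1.0 + 2.0 * x - 0.5 * u + 0.3 * w + 0.01 * rng.standard_normal(10)

def sse(a):
    """Sum of squared errors for parameters a = (a0, a1, a2, a3)."""
    a0, a1, a2, a3 = a
    e = y - (a0 + a1 * x + a2 * u + a3 * w)
    return (e ** 2).sum()

result = minimize(sse, x0=np.zeros(4))
print(result.x)   # estimates of a0, a1, a2, a3
```
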
Linear Regression (alternate)

  $\begin{bmatrix} 1 & x_1 & u_1 & w_1 \\ 1 & x_2 & u_2 & w_2 \\ \vdots & \vdots & \vdots & \vdots \\ 1 & x_N & u_N & w_N \end{bmatrix} \begin{bmatrix} a_0 \\ a_1 \\ a_2 \\ a_3 \end{bmatrix} = \begin{bmatrix} y_1 \\ y_2 \\ \vdots \\ y_N \end{bmatrix}$

  $X \theta = Y$

• Least squares:

  $\theta = \left(X^T X\right)^{-1} X^T Y$
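
A minimal sketch of this closed-form solution, assuming NumPy; the helper name is illustrative, and the linear system $(X^T X)\theta = X^T Y$ is solved directly rather than forming the inverse, which is algebraically equivalent but numerically preferable:

```python
# Normal-equations solution of X theta = Y in the least-squares sense.
import numpy as np

def least_squares(X, Y):
    """Solve (X^T X) theta = X^T Y; equivalent to theta = (X^T X)^(-1) X^T Y."""
    return np.linalg.solve(X.T @ X, X.T @ Y)

# Hypothetical usage with the [1, x, u, w] design matrix of this slide:
# X = np.column_stack([np.ones_like(x), x, u, w])
# theta = least_squares(X, y)
```
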
Polynomial / Functional Regression
• Example: specific heat as a function of T
  ◦ Methane: $c_p = 85.8 + 1.126\times 10^{-2}\, T - 2.1141\times 10^{-6}\, T^2$

• Example: Antoine's vapor pressure relationship
  ◦ $\ln\left(p^{\text{sat}}\right) = a - \dfrac{b}{T + c}$
Polynomial / Functional Regression
• Model: $y = a_0 + a_1 x + a_2 x^2 + a_3 x^3$

  $X = \begin{bmatrix} 1 & x_1 & x_1^2 & x_1^3 \\ 1 & x_2 & x_2^2 & x_2^3 \\ \vdots & \vdots & \vdots & \vdots \\ 1 & x_N & x_N^2 & x_N^3 \end{bmatrix}$

• Model: $y = a_0 + a_1 \ln(x) + \dfrac{a_2}{x} + a_3 x$

  $X = \begin{bmatrix} 1 & \ln x_1 & 1/x_1 & x_1 \\ 1 & \ln x_2 & 1/x_2 & x_2 \\ \vdots & \vdots & \vdots & \vdots \\ 1 & \ln x_N & 1/x_N & x_N \end{bmatrix}$
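
A minimal sketch, assuming NumPy, of building the second design matrix above and solving the resulting linear least-squares problem; the function name and the data vectors x and y are hypothetical placeholders:

```python
# Design matrix for y = a0 + a1*ln(x) + a2/x + a3*x, which is linear in the
# parameters even though it is nonlinear in x.
import numpy as np

def functional_design_matrix(x):
    """x: 1-D NumPy array of positive values."""
    return np.column_stack([np.ones_like(x), np.log(x), 1.0 / x, x])

# Hypothetical usage with data vectors x and y:
# X = functional_design_matrix(x)
# a, *_ = np.linalg.lstsq(X, y, rcond=None)   # a = [a0, a1, a2, a3]
```
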

Outline: Interpolation
• Polynomial fitting and limitations
• Lagrange interpolating polynomials
• Newton's methods
• Spline interpolation
Lagrange Polynomials
$P_i(x) = \dfrac{(x - x_1)(x - x_2)\cdots(x - x_{i-1})(x - x_{i+1})\cdots(x - x_N)}{(x_i - x_1)(x_i - x_2)\cdots(x_i - x_{i-1})(x_i - x_{i+1})\cdots(x_i - x_N)} = \prod_{j \neq i} \left(\dfrac{x - x_j}{x_i - x_j}\right)$

The interpolating polynomial becomes

$f(x) = y_1 P_1(x) + y_2 P_2(x) + \cdots + y_N P_N(x)$
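
A minimal sketch of this formula in plain Python (0-based indexing in code versus 1-based on the slide; the helper name is illustrative), evaluated on the example data used earlier in the module:

```python
# Lagrange interpolation exactly as defined above.
def lagrange_eval(x_nodes, y_nodes, x):
    """Evaluate f(x) = sum_i y_i * P_i(x), with P_i(x) = prod_{j != i} (x - x_j)/(x_i - x_j)."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(x_nodes, y_nodes)):
        Pi = 1.0
        for j, xj in enumerate(x_nodes):
            if j != i:
                Pi *= (x - xj) / (xi - xj)
        total += yi * Pi
    return total

# Example with the data from the earlier slides:
x_nodes = [0.8, 1.4, 2.7, 3.8, 4.8, 4.9]
y_nodes = [0.69, 1.00, 2.02, 2.39, 2.34, 2.83]
print(lagrange_eval(x_nodes, y_nodes, 2.0))
```
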
Newton's Divided Differences

$y[i+1, i] = \dfrac{y_{i+1} - y_i}{x_{i+1} - x_i}$

$y[i+2, i+1, i] = \dfrac{y[i+2, i+1] - y[i+1, i]}{x_{i+2} - x_i}$

$y[i+3, i+2, i+1, i] = \dfrac{y[i+3, i+2, i+1] - y[i+2, i+1, i]}{x_{i+3} - x_i}$

$c_0 = y_1, \quad c_1 = y[2,1], \quad c_2 = y[3,2,1], \quad c_3 = y[4,3,2,1], \;\ldots$

$f(x) = c_0 + c_1 (x - x_1) + c_2 (x - x_1)(x - x_2) + \cdots$
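
A minimal sketch of building the divided-difference coefficients and evaluating the Newton-form polynomial (0-based indexing in code versus 1-based on the slide; helper names are illustrative), again on the earlier example data:

```python
# Newton divided-difference coefficients c0, c1, ... and evaluation of
# f(x) = c0 + c1(x - x1) + c2(x - x1)(x - x2) + ...
def newton_coefficients(x, y):
    """Build the divided-difference table in place; c[i] = y[x_0, ..., x_i]."""
    n = len(x)
    c = list(y)
    for level in range(1, n):
        for i in range(n - 1, level - 1, -1):
            c[i] = (c[i] - c[i - 1]) / (x[i] - x[i - level])
    return c

def newton_eval(x_nodes, c, x):
    """Evaluate the Newton-form polynomial at x using nested (Horner-like) multiplication."""
    result = c[-1]
    for k in range(len(c) - 2, -1, -1):
        result = result * (x - x_nodes[k]) + c[k]
    return result

# Example with the data from the earlier slides:
x_nodes = [0.8, 1.4, 2.7, 3.8, 4.8, 4.9]
y_nodes = [0.69, 1.00, 2.02, 2.39, 2.34, 2.83]
coeffs = newton_coefficients(x_nodes, y_nodes)
print(newton_eval(x_nodes, coeffs, 2.0))
```
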
