
Application of Linear Algebra in Real Life

Interpolation and Linear Regression Model


CIA 3

Ayush Gupta
2040816

Govind Arun Nampoothiri


2040825

Indrajith Ajith
2040827

Moksh Jain
2040836

Submitted to
Dr. Joseph T. V

Bachelor of Science
in
Economics, Mathematics and Statistics

Contents

1 Abstract
2 Interpolation
  2.1 Definition
  2.2 Types of Interpolation
    2.2.1 Linear Interpolation
  2.3 System of Equations
  2.4 Runge's Phenomenon
    2.4.1 Lagrange's Polynomial Interpolation
3 Linear Regression
  3.1 Least-Squares Lines
  3.2 Multiple Linear Regression Model
4 Conclusion

1 Abstract
Linear algebra, otherwise known as the mathematics of data, draws its name from the fact that it is the study of linear combinations of vectors and vector spaces. In real life, quantities are interrelated and interdependent, and it is with the help of linear algebra that we are able to find solutions to systems of equations in many variables. A guiding principle of the field is to unify the two common views of a vector: the physicist's geometric arrow and the computer scientist's list of numbers.

This report discusses an application of linear algebra through the concept of linear interpolation using two different methodologies: a system of linear equations in matrix form, and Lagrange's polynomial interpolation. As EMS students, inferring meaningful conclusions from given data is a recurring task in our careers. Interpolation simplifies experimental exploration by keeping the number of measurement points to a minimum, allowing us to estimate the remaining data while also learning more about the behaviour in between.

2 Interpolation
2.1 Definition
Interpolation is the computation of points or values between ones that are known or tabulated, using the surrounding points or values. In particular, given a univariate function f = f(x), interpolation is the process of using known values f(x0), f(x1), ..., f(xn) to find values of f(x) at points x ≠ xi, where i = 0, 1, 2, ..., n.

In general, this technique involves the construction of a function L(x), called the interpolant, which agrees with f at the points x = xi and is then used to compute the desired values.

2.2 Types of Interpolation


There are multiple ways to interpolate any given set of data points. The simplest of these methods is linear interpolation.

2.2.1 Linear Interpolation

Linear interpolation is the simplest method for estimating the value of a function between two known values, which makes it useful for curve fitting with linear polynomials. It produces new values for a function from an existing set of known values.

2.3 System of Equations


A system of equations is a set of one or more equations involving a number of variables. The solutions to a system of equations are the variable assignments that satisfy every component equation simultaneously; geometrically, these are the points at which the graphs of the equations intersect. Such systems can be solved using the theory of matrices.

A given system of equations can be represented in the form AX = B

     
\[
A = \begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix}, \quad
X = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}
\quad \text{and} \quad
B = \begin{pmatrix} B_1 \\ B_2 \\ \vdots \\ B_m \end{pmatrix},
\]

where A is the coefficient matrix, X is the variable matrix and B is the constant matrix.

For example, consider four data points:

X 1 2 3 4
Y 0 1 1 1

To fit this data with a cubic polynomial of the form f(x) = a + bx + cx^2 + dx^3, we can find the coefficients a, b, c and d for which f(x) passes through the discrete data points by solving a system of equations. In this case, the system is:

a + b + c + d = 0 (1)
a + 2b + 4c + 8d = 1 (2)
a + 3b + 9c + 27d = 1 (3)
a + 4b + 16c + 64d = 1 (4)
and the respective matrices would be:

     
\[
A = \begin{pmatrix}
1 & 1 & 1 & 1 \\
1 & 2 & 4 & 8 \\
1 & 3 & 9 & 27 \\
1 & 4 & 16 & 64
\end{pmatrix}, \quad
X = \begin{pmatrix} a \\ b \\ c \\ d \end{pmatrix}
\quad \text{and} \quad
B = \begin{pmatrix} 0 \\ 1 \\ 1 \\ 1 \end{pmatrix}
\]

Solving for X using matrix multiplication in X = A^{-1}B, we find

 
\[
X = \begin{pmatrix} -3 \\ 13/3 \\ -3/2 \\ 1/6 \end{pmatrix}
\]

Then, the interpolated function becomes f(x) = -3 + (13/3)x - (3/2)x^2 + (1/6)x^3.
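As a quick illustration, here is a minimal sketch of this computation, assuming Python with NumPy (the variable names are our own):

import numpy as np

# Data points from the example above
xs = np.array([1.0, 2.0, 3.0, 4.0])
ys = np.array([0.0, 1.0, 1.0, 1.0])

# Coefficient matrix for the basis {1, x, x^2, x^3}:
# row i is [1, x_i, x_i^2, x_i^3]
A = np.vander(xs, N=4, increasing=True)

# Solve A @ coeffs = ys; solve() factorizes A rather than forming
# A^{-1} explicitly, which is numerically preferable
coeffs = np.linalg.solve(A, ys)
print(coeffs)  # approximately [-3, 4.3333, -1.5, 0.1667], i.e. [-3, 13/3, -3/2, 1/6]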

Figure 1: Graph of the Interpolated Polynomial

2.4 Runge’s Phenomenon


However, the more we complicate the function that we fit to the data points, the more likely we are to see wiggles such as the ones in the graph above, and to find that the function does not hold for data points outside the interval already provided. This makes it difficult to interpolate values that lie between, or beyond, the discrete observations. The effect is known as Runge's phenomenon, and it illustrates the error that can occur when constructing a polynomial interpolant of high degree.

One way of dealing with this phenomenon is simply to pick better basis functions while keeping the degree of the polynomial interpolant low. For example, we can replace x^3 with 1/x to flatten the curve and reduce the wiggles. In doing so, the coefficient matrix changes to:
 
\[
A = \begin{pmatrix}
1 & 1 & 1 & 1 \\
1 & 2 & 4 & 1/2 \\
1 & 3 & 9 & 1/3 \\
1 & 4 & 16 & 1/4
\end{pmatrix}
\]

and the interpolated function becomes f(x) = 16/3 - (3/2)x + (1/6)x^2 - 4(1/x).
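The same machinery handles the modified basis; in the sketch below (again assuming NumPy), only the last column of the coefficient matrix changes:

import numpy as np

xs = np.array([1.0, 2.0, 3.0, 4.0])
ys = np.array([0.0, 1.0, 1.0, 1.0])

# Basis {1, x, x^2, 1/x}: the x^3 column is replaced by 1/x
A = np.column_stack([np.ones_like(xs), xs, xs**2, 1.0 / xs])

coeffs = np.linalg.solve(A, ys)
print(coeffs)  # approximately [5.3333, -1.5, 0.1667, -4], i.e. [16/3, -3/2, 1/6, -4]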

Figure 2: Graph of the Adjusted Interpolated Polynomial

However, choosing such replacements requires experience in selecting suitable functions and amounts to trial and error, which defeats the purpose of using linear algebra. A more systematic way of dealing with the problem is a slightly more complicated but much more effective method: Lagrange's polynomial interpolation.

2.4.1 Lagrange’s Polynomial Interpolation

The Lagrange interpolating polynomial is the polynomial P(x) of degree ≤ (n − 1) that passes through the n points (x1, y1 = f(x1)), (x2, y2 = f(x2)), ..., (xn, yn = f(xn)) and is given by
\[
P(x) = \sum_{j=1}^{n} f_j(x) \tag{5}
\]

where

\[
f_j(x) = y_j \prod_{\substack{k=1 \\ k \neq j}}^{n} \frac{x - x_k}{x_j - x_k} \tag{6}
\]

Written explicitly,

\[
f(x) = y_1 \frac{(x - x_2)(x - x_3)\cdots(x - x_n)}{(x_1 - x_2)(x_1 - x_3)\cdots(x_1 - x_n)}
 + y_2 \frac{(x - x_1)(x - x_3)\cdots(x - x_n)}{(x_2 - x_1)(x_2 - x_3)\cdots(x_2 - x_n)}
 + \cdots + y_n \frac{(x - x_1)(x - x_2)\cdots(x - x_{n-1})}{(x_n - x_1)(x_n - x_2)\cdots(x_n - x_{n-1})} \tag{7}
\]

For the same data points as in the previous example, we find the Lagrange basis polynomials to be:

\[
P_1(x) = \frac{(x - 2)(x - 3)(x - 4)}{(1 - 2)(1 - 3)(1 - 4)} \tag{8}
\]
\[
P_2(x) = \frac{(x - 1)(x - 3)(x - 4)}{(2 - 1)(2 - 3)(2 - 4)} \tag{9}
\]
\[
P_3(x) = \frac{(x - 1)(x - 2)(x - 4)}{(3 - 1)(3 - 2)(3 - 4)} \tag{10}
\]
\[
P_4(x) = \frac{(x - 1)(x - 2)(x - 3)}{(4 - 1)(4 - 2)(4 - 3)} \tag{11}
\]
Therefore, we get f(x) = P1(x)·y1 + P2(x)·y2 + P3(x)·y3 + P4(x)·y4. Substituting the values of yi, we obtain f(x) as the linear combination of the Lagrange basis polynomials weighted by their corresponding y-values, i.e.,

\[
f(x) = 0\,\frac{(x - 2)(x - 3)(x - 4)}{(1 - 2)(1 - 3)(1 - 4)}
 + 1\,\frac{(x - 1)(x - 3)(x - 4)}{(2 - 1)(2 - 3)(2 - 4)}
 + 1\,\frac{(x - 1)(x - 2)(x - 4)}{(3 - 1)(3 - 2)(3 - 4)}
 + 1\,\frac{(x - 1)(x - 2)(x - 3)}{(4 - 1)(4 - 2)(4 - 3)} \tag{12}
\]
which simplifies to the same polynomial obtained using a system of equations with the basis {1, x, x^2, x^3}, i.e.,

\[
f(x) = -3 + \frac{13}{3}x - \frac{3}{2}x^2 + \frac{1}{6}x^3
\]
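To check numerically that the two approaches agree, the short sketch below (our own helper function, not taken from a library) evaluates the Lagrange form of equations (5) and (6) directly:

def lagrange_eval(x, xs, ys):
    """Evaluate the Lagrange interpolating polynomial through (xs, ys) at x."""
    total = 0.0
    for j in range(len(xs)):
        # Basis polynomial P_j evaluated at x: product over all k != j
        basis = 1.0
        for k in range(len(xs)):
            if k != j:
                basis *= (x - xs[k]) / (xs[j] - xs[k])
        total += ys[j] * basis
    return total

xs = [1.0, 2.0, 3.0, 4.0]
ys = [0.0, 1.0, 1.0, 1.0]

# Agrees with -3 + (13/3)x - (3/2)x^2 + (1/6)x^3 at any x
print(lagrange_eval(2.5, xs, ys))  # 1.0625

SciPy also provides a ready-made implementation, scipy.interpolate.lagrange, for the same task.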
Our initial focus on the coefficients a, b, c and d, which led to a linear system of equations, can be interpreted as working with the basis {1, x, x^2, x^3}; the alternative approach works with the Lagrange basis {P1, P2, P3, P4}. The Lagrange basis is often easier to work with, since all that is required to interpolate the data is the linear combination of the basis polynomials whose coefficients are simply the observed y-values. Although this might suggest that the Lagrange basis is always better, that is not necessarily true: the best choice varies from case to case, and there is no one-size-fits-all basis for every situation.

3 Linear Regression
In statistics, linear regression analysis is used to predict the value of a variable
based on the value of another variable. The variable you want to predict is
called the dependent variable and the variable you are using to predict the
other variable’s value is called the independent variable. Linear regression fits a
straight line or surface that minimizes the discrepancies between predicted and
actual output values.

3.1 Least-Squares Lines


The simplest relation between two variables x and y is the linear equation y = β0 + β1x. Experimental data often produce points (x1, y1), ..., (xn, yn) that, when graphed, seem to lie close to a line. We want to determine the parameters β0 and β1 that make the line as "close" to the points as possible.

Suppose β0 and β1 are fixed, and consider the line y = β0 + β1x in Fig. 3. Corresponding to each data point (xj, yj) there is a point (xj, β0 + β1xj) on the line with the same x-coordinate. We call yj the observed value of y and β0 + β1xj the predicted y-value (determined by the line). The difference between an observed y-value and a predicted y-value is called a residual.
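In matrix terms, the least-squares parameters can be found by solving the normal equations X^T X β = X^T y. A minimal sketch, assuming NumPy and using made-up data points:

import numpy as np

# Hypothetical data points (x_j, y_j) that lie roughly on a line
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.1, 1.9, 3.2, 3.8, 5.1])

# Design matrix: a column of ones for the intercept beta_0, then x
X = np.column_stack([np.ones_like(x), x])

# Solve the normal equations X^T X beta = X^T y
beta = np.linalg.solve(X.T @ X, X.T @ y)
print(beta)  # [beta_0, beta_1]: intercept and slope of the least-squares line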

Figure 3: Fitting a Line of Best Fit

3.2 Multiple Linear Regression Model
Suppose an experiment involves two independent variables, say u and v, and one dependent variable y. A simple equation for predicting y from u and v has the form:

y = β0 + β1u + β2v
A more general prediction equation might have the form:

y = β0 + β1u + β2v + β3uv + β4u^2 + β5v^2
This equation is used in geology, for instance, to model erosion surfaces,
glacial cirques, soil pH, and other quantities. In such cases, the least-squares fit
is called a trend surface.
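A sketch of fitting the two-variable form y = β0 + β1u + β2v with NumPy's built-in least-squares solver (the observations below are illustrative only):

import numpy as np

# Illustrative observations of (u, v, y)
u = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
v = np.array([2.0, 1.0, 4.0, 3.0, 5.0])
y = np.array([5.1, 4.9, 11.2, 10.8, 15.0])

# Design matrix for y = beta_0 + beta_1*u + beta_2*v
X = np.column_stack([np.ones_like(u), u, v])

# lstsq minimizes ||X @ beta - y||, i.e. the sum of squared residuals
beta, residuals, rank, sv = np.linalg.lstsq(X, y, rcond=None)
print(beta)  # estimated [beta_0, beta_1, beta_2]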

However, although linear regression is an indirect application of linear interpolation, the major difference between them lies in the distinction between interpolation and extrapolation. Interpolation joins the known points and evaluates unknowns in between them, while extrapolation uses the known points to estimate values outside the given range; the latter is the essence of linear regression.

4 Conclusion
Linear algebra, with the vast array of topics it encompasses, has an equally wide scope. In this report, we looked into one of its specific applications, namely linear interpolation, which deals with estimating the value of a function between any two given values on a given interval. We also looked at two ways of fitting data points to a required function: using a system of linear equations and using Lagrange's polynomial interpolation, both of which can result in the same function.

Then, we explored an application of linear interpolation in linear regression and multiple linear regression, which deal with predicting the value of a dependent variable given the values of one or more independent variables, and went a little more in depth into least squares and its role in the estimation process. Finally, we concluded with a short note on the difference between interpolation, which is the primary use of linear interpolation, and extrapolation, which is the primary use of linear regression.


