WEEK5

The document describes using linear least squares approximation and Lagrange interpolation to fit a polynomial to data points. It involves: 1) finding the best-fit polynomial using linear least squares by constructing the design matrix and solving the normal equations for degrees m = 0, 1, 2, ... and choosing the optimal m; 2) solving the least squares problem using QR factorization without forming the normal equations; and 3) performing Lagrange interpolation of a given function over an interval using uniform meshes and Chebyshev nodes, and comparing the results.


{

"cells": [
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# I. Linear least squares approximation"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Consider a function $y = f(x)$ which is defined by a set of values $y_0,
y_1, \\cdots, y_n$ at points $x_0, x_1, \\cdots, x_n$."
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"x = [-1, -0.7, -0.43, -0.14, -0.14, 0.43, 0.71, 1, 1.29, 1.57, 1.86, 2.14,
2.43, 2.71, 3]\n",
"y = [-2.25, -0.77, 0.21, 0.44, 0.64, 0.03, -0.22, -0.84, -1.2, -1.03, -0.37,
0.61, 2.67, 5.04, 8.90]"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### I.I. Find a best fit polynomial\n",
"\n",
"$$\n",
"P_m(x) = a_0 + a_1 x + \\cdots + a_m x^m\n",
"$$\n",
"\n",
"using the linear least squares approach. To this end\n",
"\n",
"1. implement a function which constructs the design matrix using $1, x,
\\cdots, x^m$ as the basis functions.\n",
"\n",
"2. construct explicitly the normal system of equations of the linear least
squares problem at fixed $m$.\n",
"\n",
"3. Solve the normal equations to find the coefficients of $P_m(x)$ for $m = 0,
1, 2, \\dots$. For the linear algebra problem, you can either use library functions
(`numpy.linalg.solve`) or your LU factorization code from week 1.\n",
"\n",
"(20% of the total grade)"
]
},
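{
"cell_type": "markdown",
"metadata": {},
"source": [
"*A possible sketch for the task above, for reference only (the helper names `design_matrix` and `lls_normal` are illustrative, not required by the assignment); the actual solution goes in the code cell below.*"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"def design_matrix(x, m):\n",
"    # Columns are the basis functions 1, x, ..., x^m evaluated at the data points.\n",
"    x = np.asarray(x, dtype=float)\n",
"    return np.column_stack([x**j for j in range(m + 1)])\n",
"\n",
"def lls_normal(x, y, m):\n",
"    # Form and solve the normal equations A^T A a = A^T y for the coefficients of P_m.\n",
"    A = design_matrix(x, m)\n",
"    return np.linalg.solve(A.T @ A, A.T @ np.asarray(y, dtype=float))"
]
},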
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# ... ENTER YOUR CODE HERE"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### I.II \n",
"\n",
"To find the optimal value of m, use the following criterion: take $m=0, 1,
2, \\dots$, for each value of $m$ compute \n",
"\n",
"$$\n",
"\\sigma_m^2 = \\frac{1}{n - m} \\sum_{k=0}^n \\left( P_m(x_k) - y_k
\\right)^2\n",
"$$\n",
"\n",
"And take the value of $m$, at which $\\sigma_m$ stabilizes or starts
increasing.\n",
"\n",
"(20% of the total grade)"
]
},
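{
"cell_type": "markdown",
"metadata": {},
"source": [
"*A possible sketch of this criterion, assuming the `design_matrix` and `lls_normal` helpers from the sketch above; the actual solution goes in the code cell below.*"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"xa, ya = np.asarray(x, dtype=float), np.asarray(y, dtype=float)\n",
"n = len(xa) - 1\n",
"for m in range(8):\n",
"    a = lls_normal(xa, ya, m)\n",
"    resid = design_matrix(xa, m) @ a - ya\n",
"    sigma = np.sqrt(np.sum(resid**2) / (n - m))\n",
"    print(m, sigma)   # look for the m where sigma stops decreasing"
]
},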
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# ... ENTER YOUR CODE HERE ..."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"Plot your polynomials $P_m(x)$ on one plot, together with the datapoints.
Visually compare best-fit polynomials of different degrees. Is the visual
comparison consistent with the optimal value of $m$?"
]
},
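{
"cell_type": "markdown",
"metadata": {},
"source": [
"*One possible way to produce such a plot (matplotlib assumed; the degrees shown are illustrative); the actual solution goes in the code cell below.*"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"%matplotlib inline\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"\n",
"xx = np.linspace(min(x), max(x), 200)\n",
"plt.plot(x, y, 'o', label='data')\n",
"for m in (1, 2, 3, 5):   # illustrative degrees; include your optimal m\n",
"    a = lls_normal(x, y, m)\n",
"    plt.plot(xx, design_matrix(xx, m) @ a, label='m = %d' % m)\n",
"plt.legend()\n",
"plt.show()"
]
},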
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# ... ENTER YOUR CODE HERE"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### I.III. Linear least-squares using the QR factorization.\n",
"\n",
"For the optimal value of $m$ from the previous part, solve the LLS problem
using the QR factorization, withou ever forming the normal equations explicitly.
For linear algebra, you can use standard library functions (look up
`numpy.linalg.solve`, `numpy.linalg.qr` etc) or your code from previous weeks.\n",
"\n",
"Compare the results with the results of solving the normal system of
equations.\n",
"\n",
"(20% of the grade)"
]
},
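{
"cell_type": "markdown",
"metadata": {},
"source": [
"*A possible QR sketch, assuming the `design_matrix` helper from part I.I (the name `lls_qr` is illustrative); the actual solution goes in the code cell below.*"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"def lls_qr(x, y, m):\n",
"    # A = QR (reduced), so the LLS solution solves the triangular system R a = Q^T y.\n",
"    A = design_matrix(x, m)\n",
"    Q, R = np.linalg.qr(A)\n",
"    return np.linalg.solve(R, Q.T @ np.asarray(y, dtype=float))\n",
"\n",
"# e.g. compare with the normal-equations route for your optimal m:\n",
"# print(lls_qr(x, y, 3) - lls_normal(x, y, 3))"
]
},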
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# ... ENTER YOUR CODE HERE ..."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"# II. Lagrange interpolation"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### II.1 \n",
"\n",
"Consider the function, $f(x) = x^2 \\cos{x}$. On the interval $x\\in
[\\pi/2, \\pi]$, interpolate the function using the Lagrange interpolating
polynomial of degree $m$ with $m=1, 2, 3, 4, 5$. Use the uniform mesh. Plot the
resulting interpolants together with $f(x)$.\n",
"\n",
"(20% of the total grade)"
]
},
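{
"cell_type": "markdown",
"metadata": {},
"source": [
"*A minimal sketch of Lagrange interpolation on a uniform mesh (the helper `lagrange_eval` is illustrative); the actual solution goes in the code cell below.*"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"%matplotlib inline\n",
"import numpy as np\n",
"import matplotlib.pyplot as plt\n",
"\n",
"def f(x):\n",
"    return x**2 * np.cos(x)\n",
"\n",
"def lagrange_eval(xn, yn, t):\n",
"    # Evaluate the Lagrange interpolating polynomial with nodes (xn, yn) at points t.\n",
"    t = np.asarray(t, dtype=float)\n",
"    p = np.zeros_like(t)\n",
"    for k in range(len(xn)):\n",
"        lk = np.ones_like(t)\n",
"        for j in range(len(xn)):\n",
"            if j != k:\n",
"                lk *= (t - xn[j]) / (xn[k] - xn[j])\n",
"        p += yn[k] * lk\n",
"    return p\n",
"\n",
"tt = np.linspace(np.pi/2, np.pi, 200)\n",
"plt.plot(tt, f(tt), 'k', label='f(x)')\n",
"for m in (1, 2, 3, 4, 5):\n",
"    nodes = np.linspace(np.pi/2, np.pi, m + 1)   # uniform mesh: m+1 nodes for degree m\n",
"    plt.plot(tt, lagrange_eval(nodes, f(nodes), tt), label='m = %d' % m)\n",
"plt.legend()\n",
"plt.show()"
]
},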
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# ... ENTER YOUR CODE HERE ..."
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### II.2. \n",
"\n",
"Repeat the previous task using the Chebyshev nodes. Compare the quality of
interpolation on a uniform mesh and Chebyshev nodes for $m=3$.\n",
"\n",
"(20% of the total grade)"
]
},
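{
"cell_type": "markdown",
"metadata": {},
"source": [
"*A possible sketch for Chebyshev nodes on $[a, b]$, reusing `f` and `lagrange_eval` from the previous sketch; the actual solution goes in the code cell below.*"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"import numpy as np\n",
"\n",
"def chebyshev_nodes(a, b, m):\n",
"    # Roots of T_{m+1} on [-1, 1], mapped affinely to [a, b].\n",
"    k = np.arange(m + 1)\n",
"    t = np.cos((2*k + 1) * np.pi / (2*(m + 1)))\n",
"    return 0.5*(a + b) + 0.5*(b - a)*t\n",
"\n",
"# e.g. compare the maximum interpolation errors for m = 3:\n",
"# tt = np.linspace(np.pi/2, np.pi, 200)\n",
"# nodes_u = np.linspace(np.pi/2, np.pi, 4)\n",
"# nodes_c = chebyshev_nodes(np.pi/2, np.pi, 3)\n",
"# print(np.max(np.abs(lagrange_eval(nodes_u, f(nodes_u), tt) - f(tt))),\n",
"#       np.max(np.abs(lagrange_eval(nodes_c, f(nodes_c), tt) - f(tt))))"
]
},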
{
"cell_type": "code",
"execution_count": null,
"metadata": {
"collapsed": true
},
"outputs": [],
"source": [
"# ... ENTER YOUR CODE HERE ..."
]
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.5.2"
}
},
"nbformat": 4,
"nbformat_minor": 2
}
