
hw5

February 8, 2024

1 Homework 5
In this homework we test the approximation of functions with polynomials.
We start by loading three Julia packages. The first one provides tools for working with polynomials. The second one provides pre-made methods for numerical integration. The third is the standard package for plotting functions.
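If any of these packages is not installed yet, it can be added once through Julia's package manager. A minimal sketch, assuming a standard Julia setup with network access:

using Pkg
Pkg.add(["Polynomials", "QuadGK", "Plots"])   # one-time installation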
[ ]: using Polynomials
using QuadGK
using Plots

Polynomials can be defined from a list of coefficients. Arguably, it looks better to define a global variable x that refers to the polynomial 𝑥 and then do basic arithmetic with it. Let us see.
[ ]: global x = Polynomial([0,1])
display(1 + 2x + 3x^2 + 5x^3)
display((x-1.5)*(x-2))

1 + 2⋅x + 3⋅x² + 5⋅x³
3.0 − 3.5⋅x + 1.0⋅x²
Now, the following function returns the Bernstein basis polynomial βₙ,ᵢ.
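For reference, this basis polynomial on [0,1] is

βₙ,ᵢ(x) = binomial(n, i) · x^i · (1 − x)^(n − i),   i = 0, 1, …, n,

which is exactly what the code builds.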

[ ]: function bernstein_beta(n::Integer, i::Integer)
         return binomial(n,i) * x^i * (1-x)^(n-i)
     end

[ ]: bernstein_beta (generic function with 1 method)

Let us see how it looks.


[ ]: plot(bernstein_beta(20,6),0,1,label="beta_{20,6}")
1.1 Exercise 1
It is your job to implement a function that returns the degree-n Bernstein polynomial corresponding to any function f.
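Recall that the Bernstein polynomial of degree n associated with f on [0,1] weights the basis polynomials by samples of f on a uniform grid,

Bₙf(x) = Σ_{i=0}^{n} f(i/n) · βₙ,ᵢ(x),

which is the formula the implementation below follows.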
[ ]: function bernstein(f::Function, n::Integer)
         b = 0
         for i = 0:n
             b += f(i/n) * bernstein_beta(n,i)
         end
         return b
     end

[ ]: bernstein (generic function with 1 method)

[ ]: # Let us test if it seems to be working fine
     f(t) = sin(2*pi*t)
     p = bernstein(f,10)
     xg = 0:0.01:1
     plot(xg,map(p,xg), label="Bernstein 10")
     plot!(f,0,1, label="sin(2 pi x)")
2 Lagrange interpolation
You implemented Lagrange interpolation in last week’s homework. Here is a possible way to do it.
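As a reminder (standard notation, not spelled out in the code), newton below computes the divided difference f[x₁, …, xₙ] recursively, and interpolate assembles the interpolating polynomial in Newton form,

p(x) = Σ_{i=1}^{n} f[x₁, …, xᵢ] · (x − x₁)⋯(x − xᵢ₋₁).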
[ ]: # Divided difference of f at the nodes in x, computed recursively
     function newton(f::Function, x::Vector{<:Real})
         n = length(x)
         if (n==1) return f(x[1]) end
         yn = x[1:end-1]
         yl = vcat(x[1:end-2],[x[n]])
         return (newton(f,yn)-newton(f,yl))/(x[n-1]-x[n])
     end

     # Interpolating polynomial assembled in Newton form
     function interpolate(f::Function, points::Vector{<:Real})
         n = length(points) # note that the polynomial will be of degree n-1
         p = Polynomial(0)
         terms = []
         for i in 1:n
             term = Polynomial(newton(f,points[1:i]))
             for j in 1:i-1
                 term *= (x-points[j])
             end
             p += term
             push!(terms,term)
         end
         return p
     end

[ ]: interpolate (generic function with 1 method)

2.1 Exercise 2
We now want to generate a sequence of orthogonal polynomials on [0,1] and use them to find the least squares approximation of any given function by a polynomial of degree ≤ n on the interval [0,1].
First of all, we need to generate a list of orthogonal polynomials. The following function is supposed to return an array with the orthogonal polynomials p₀, p₁, p₂, p₃, … so that each pₖ has degree k.
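The approach used below is Gram–Schmidt orthogonalization with respect to the inner product ⟨p, q⟩ = ∫₀¹ p(x) q(x) dx: each pₖ starts from the monomial x^k and has its projections onto the previously built polynomials subtracted,

pₖ = x^k − Σ_{j<k} (⟨x^k, pⱼ⟩ / ⟨pⱼ, pⱼ⟩) · pⱼ.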

[ ]: function orthogonal_polynomials(n::Integer)
         # This is going to be a list of orthogonal polynomials.
         p = []

         # The first polynomial is just constant 1
         if n>0 push!(p,Polynomial(1)) end

         for k in 1:n
             # Start with the standard basis polynomial x^k
             pk = Polynomial([zeros(k); 1])

             for j in 1:length(p)
                 pj = p[j]
                 # Define functions for the inner products
                 f_pk_pj(x) = pk(x) * pj(x)
                 f_pj_pj(x) = pj(x)^2

                 # Numerically integrate to compute the inner products
                 inner_pk_pj, _ = quadgk(f_pk_pj, 0, 1)
                 inner_pj_pj, _ = quadgk(f_pj_pj, 0, 1)

                 # Subtract the projection
                 pk -= (inner_pk_pj / inner_pj_pj) * pj
             end

             push!(p, pk)
         end
         return p
     end

     # The following is a test
     orthogonal_polynomials(3)

[ ]: 4-element Vector{Any}:
Polynomial(1)
Polynomial(-0.5 + 1.0*x)
Polynomial(0.16666666666666669 - 1.0*x + 1.0*x^2)
Polynomial(-0.04999999999999993 + 0.5999999999999999*x - 1.5*x^2 + 1.0*x^3)

I provide the function that generates the polynomial of a given degree that best approximates f with respect to the L² distance. It requires a correct implementation of orthogonal_polynomials to work.
Note that we use the function integrate to integrate polynomials. It should be fast and essentially exact. We use quadgk to integrate any generic function numerically. It is presumably slower and maybe less accurate. We will study how this is done in the next chapter.
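Concretely, with orthogonal polynomials p₀, …, pₙ the least squares approximation is the orthogonal projection

p = Σ_{k=0}^{n} (⟨f, pₖ⟩ / ⟨pₖ, pₖ⟩) · pₖ,   with ⟨f, g⟩ = ∫₀¹ f(x) g(x) dx,

where each numerator is computed with quadgk and each denominator with integrate.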
[ ]: function least_squares_approximation(f::Function, degree::Integer)
         ops = orthogonal_polynomials(degree)
         p = Polynomial(0)
         for po in ops
             p += quadgk(t -> f(t)*po(t), 0, 1, rtol=1e-14)[1] / integrate(po*po, 0, 1) * po
         end
         return p
     end

[ ]: least_squares_approximation (generic function with 1 method)

[ ]: # Let us test if it seems to be working fine
     f(t) = sin(2*pi*t)
     p = least_squares_approximation(f,4)
     xg = 0:0.01:1
     plot(xg,map(p,xg), label="Least squares approximation")
     plot!(f,0,1, label="sin(2 pi x)")
2.2 Some pictures.
Let us plot a few functions and their corresponding polynomials.
We use a uniform grid on [0,1] for the interpolation; a polynomial of degree n is built from (n+1) points.
[ ]: function plot_all_polynomials(f::Function, n::Integer)
         pB = bernstein(f,n)
         pI = interpolate(f,Array(0:1/n:1))
         pL = least_squares_approximation(f,n)
         xi = 0:0.002:1
         scatter(0:1/n:1, map(f,0:1/n:1), label="Interpolation points")
         plot!(xi,map(f,xi), label="function")
         plot!(xi,map(pB,xi), label="Bernstein polynomial")
         plot!(xi,map(pI,xi), label="Lagrange interpolation")
         plot!(xi,map(pL,xi), label="Least squares approximation")
     end

[ ]: plot_all_polynomials (generic function with 1 method)

[ ]: plot_all_polynomials(t->sin(2*pi*t),4)
In this first example, the sine function is very smooth and looks very much like a polynomial. Both the Lagrange interpolation and the least squares approximation do a very good job.
Let us now try a Gaussian, which is very localized near 0.5. We use 9 points.
[ ]: plot_all_polynomials(t->exp(-50*(t-0.5)^2),8)
The Lagrange interpolation polynomial does not look so good anymore. The least squares approximation stays reasonably close to the Gaussian. The Bernstein polynomial is still far from the graph, but at least it has more or less the same overall shape.
What if we use a few more points?
[ ]: plot_all_polynomials(t->exp(-50*(t-0.5)^2),12)
If I try with more points, the least squares approximation seems to lose symmetry. Honestly, I do not understand why; there is no reason for that to happen. I can only think of two explanations:
* The truncation errors in the numerical integration grow out of control.
* I made a mistake in the implementation of the function.
I would appreciate it if someone sheds some light on this issue.
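One way to probe the issue (a quick sketch; the names g, p12 and asym are just for illustration) is to quantify the asymmetry by comparing p(t) with p(1−t) for the symmetric Gaussian. If the value is large and shrinks when the rtol of the quadgk calls inside orthogonal_polynomials is tightened, the numerical integration is the likely culprit.

g(t) = exp(-50*(t-0.5)^2)                      # symmetric about t = 0.5
p12 = least_squares_approximation(g, 12)       # degree-12 approximation
asym = maximum(abs(p12(t) - p12(1-t)) for t in 0:0.01:1)
println("maximum asymmetry of the degree-12 approximation: ", asym)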
Let us now try other functions. Recall that the error bound for Lagrange interpolation involves a higher order derivative of the function. What would happen, then, if we apply it to a continuous function that is not differentiable? Let us try the tent function f(x) = (1 − 4|x − 0.5|)₊, where (·)₊ denotes the positive part.

[ ]: pos(t) = t>0 ? t : 0.

plot_all_polynomials(t->pos(1 - 4*abs(t-0.5)),5)
[ ]: plot_all_polynomials(t->pos(1 - 4*abs(t-0.5)),15)
Noticeably, the Lagrange interpolation has some severe oscillations near the endpoints of the interval.
Let us now try an even more singular function: the Heaviside function!
[ ]: plot_all_polynomials(t-> t>0.5 ? 0. : 1.,5)

[ ]: plot_all_polynomials(t-> t>0.5 ? 0. : 1.,10)


2.3 We want a movie!!
In all the pictures above it seems that the Bernstein polynomials do a poor job of approximating the function. However, for a continuous function the Bernstein polynomials always converge uniformly to it as n → ∞; it just takes a large n to get there. Let us see how the sequence of Bernstein polynomials converges to the Gaussian.
[ ]: anim = @animate for n in 1:30
         f(t) = exp(-50*(t-0.5)^2)
         pb = bernstein(f,n)
         xg = 0:0.01:1
         yg = map(pb,xg)
         plot(xg,yg, label="Bernstein with degree "*string(n))
         plot!(f,0,1, label="Gaussian")
     end
     gif(anim,fps=5)

[ Info: Saved animation to /Users/bradleyyu/Documents/Coding-Things/MATH212/tmp.gif

[ ]: Plots.AnimatedGif("/Users/bradleyyu/Documents/Coding-Things/MATH212/tmp.gif")

I stopped the computation at degree 30 because I was getting rubbish afterwards. The Bernstein polynomials are supposed to converge to the function as their degree → ∞. But the numerical errors due to floating point truncation eventually become dominant.
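A possible workaround (a sketch only; xb, bernstein_beta_big and bernstein_big are hypothetical helpers, not part of the homework code) is to carry the construction out in extended precision, so that the huge binomial coefficients and the sign cancellations in (1−x)^(n−i) are handled in BigFloat rather than Float64. This should push the usable degree higher, at the cost of speed:

xb = Polynomial(BigFloat[0, 1])                  # the polynomial x with BigFloat coefficients
bernstein_beta_big(n, i) = binomial(big(n), i) * xb^i * (1 - xb)^(n - i)
bernstein_big(f, n) = sum(f(BigFloat(i)/n) * bernstein_beta_big(n, i) for i in 0:n)

pb60 = bernstein_big(t -> exp(-50*(t-0.5)^2), 60)   # e.g. degree 60
plot(0:0.01:1, [Float64(pb60(t)) for t in 0:0.01:1], label="Bernstein with degree 60")
plot!(t -> exp(-50*(t-0.5)^2), 0, 1, label="Gaussian")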
