Brent Optimization

The lecture discusses root-finding and optimization algorithms for functions of a single variable. It covers hybrid root-finding methods like Brent's method and NewtSafe that combine bisection, secant, and Newton's methods. Optimization methods discussed include bracketing using golden section search, interpolation using successive parabolic interpolation, and hybrid methods implemented in MATLAB's fminbnd function. Examples and demos are provided to illustrate the algorithms.

Lecture 06: Rootfinding and Optimization for functions of a single variable f(x)
February 06, 2008

Outline

1) Fail-safe hybrid methods
   NewtSafe (Newton + Bisection; Numerical Recipes)
   Brent's method (Secant + IQI + Bisection)
   Matlab's fzero (Brent's method)
2) Examples and demos
3) Optimization algorithms to find min(f(x)) on [a,b]
   Bracketing algorithms: Golden Section Search
   Interpolation algorithms: Successive Parabolic Interpolation
   Hybrid methods: fminbnd

Hybrid Methods:
Design Goals:
1) Robustness: given a bracket [a,b], maintain the bracket
2) Efficiency: use superlinearly convergent methods when possible

Some Options:
With derivatives:
   NewtSafe (or RootSafe, Numerical Recipes)
   Newton's method within the bracket, bisection otherwise

Without derivatives:
   Brent's algorithm (zbrent in Numerical Recipes, fzero in Matlab)
   Returns a minimal bracket using a combination of
      Bisection
      Secant method
      Inverse Quadratic Interpolation (IQI)


NewtSafe: Bisection + Newton


The geometric picture
Example: f(x) = sin(2x)
[Figure: bracket endpoints a and b with values fa and fb, and the trial point c]
NewtSafe Algorithm:



NewtSafe Code:
function [xFinal, xN, errorN] = newtSafe(func,a,b,tol)
% NEWTSAFE - root-finder using hybrid Newton-Bisection method to always maintain bracket
%
% [xFinal, xN, errorN] = newtSafe(func,a,b,tol)
%
% func: function [f, df] =func(x); returns both the function and its derivative
% a,b: initial bracket
% tol: stopping condition: |f(x)| < tol or |b-a| < tol*b
%
% xFinal: final value
% xN: vector of intermediate iterates
% errorN: vector of errors
MAX_ITERATIONS = 100;
VERBOSE = true;
% initialize the problem
h = b-a;
fa = func(a);
fb = func(b);
if ( sign(fa) == sign(fb) )
error('function must be bracketed to begin with' );
end
c = a; % start on the left side (could also choose the middle)
[ fc, df] = func(c);
xN(1) = c;
errorN(1,:) = [ abs(fc) h ];
% begin iteration until convergence or Maximum Iterations
for i = 1:MAX_ITERATIONS
% try a Newton step
useNewton = true;
c = c - fc/df;
% if not in bracket choose bisection
if ( ~(a <= c && c <= b) )
c = a + h/2;
useNewton = false;
end
% Evaluate function and derivative at new c
[fc,df]=func(c);
% check and maintain bracket
if ( sign(fc) ~= sign(fb) )
a=c;
fa=fc;
else
b = c;
fb = fc;
end
h = b-a;
% calculate errors and track solutions
absError = abs(fc);
relError = h;
xN(i+1) = c; % index i+1 so the initial iterate in xN(1) is kept
errorN(i+1,:) = [ absError, relError ];
% be chatty
if VERBOSE
if useNewton
disp(sprintf('i=%d Newton : a=%12.8e, b=%12.8e, c=%12.8e, f(c)=%12.8e, h=%12.8e',i,a,b,c,fc,h));
else
disp(sprintf('i=%d Bisect : a=%12.8e, b=%12.8e, c=%12.8e, f(c)=%12.8e, h=%12.8e',i,a,b,c,fc,h));
end
end
% check if converged
if ( absError < tol || relError < tol*b )
break;
end
end
% clean up
if ( i == MAX_ITERATIONS)
warning('Maximum iterations exceeded' );
end
xFinal = xN(end);
xN = xN(:); % convert output to column vectors
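
As a quick check, a hypothetical driver for newtSafe (the helper funcSin2x and the bracket [1,2] are illustrative, not from the lecture):

function [f, df] = funcSin2x(x)
% FUNCSIN2X - returns f(x) = sin(2x) and its derivative, for newtSafe
f  = sin(2*x);
df = 2*cos(2*x);

% at the command line; the root of sin(2x) in [1,2] is pi/2:
% [xFinal, xN, errorN] = newtSafe(@funcSin2x, 1, 2, 1e-12)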

Brent-Dekker Algorithm:
Hybrid method using IQI + Secant + Bisection (foolproof)
Given: f(x) and a bracket [a,b]
Initialize: use the Secant method to find c between a and b
Until converged ( abs(b-a) < tol*abs(b) or f(c) = 0 ):
   Arrange a, b, and c so that
   - a and b form a bracket
   - |f(b)| <= |f(a)|
   - c is the previous value of b
   if c ~= a
      try c = IQI
   else (c == a)
      try c = Secant
   end
   if c is in the bracket
      keep it
   else
      use c = Bisection
   end
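
For reference, the IQI step interpolates x as a quadratic function of y through the three points (a, f(a)), (b, f(b)), (c, f(c)) and evaluates it at y = 0. A minimal Matlab sketch, assuming the three function values are distinct (the name iqiStep is illustrative, not the lecture's code):

% next root estimate from three points via inverse quadratic interpolation
iqiStep = @(a,fa,b,fb,c,fc) ...
    a*fb*fc/((fa-fb)*(fa-fc)) + ...
    b*fa*fc/((fb-fa)*(fb-fc)) + ...
    c*fa*fb/((fc-fa)*(fc-fb));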


Brent's method: Bisection + Secant + IQI


The geometric picture
Example: f(x) = sin(2x)
[Figure: bracket endpoints a and b with values fa and fb, and the trial point c]

Brent-Dekker Algorithm:
Comments:
This algorithm is bullet-proof:
   It is guaranteed to always maintain a bracket
   It doesn't require derivatives
   It uses rapidly converging methods when they are reliable
   It uses the slow-but-sure method when necessary
In Matlab: fzero...demonstrate with fzerogui (code in fzerotx)
Basic syntax (see help fzero, help optimset):
   options = optimset('disp','iter')
   x = fzero(@func,x0,options)
If x0 is a scalar, fzero searches for a bracket; if x0 = [a b], it tests for
a bracket and fails if sign(f(a)) == sign(f(b)).
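
For example (the bracket is illustrative, not from the slides), finding the root of sin(2x) in [1,2]:

options = optimset('disp','iter');
x = fzero(@(x) sin(2*x), [1 2], options)   % converges to pi/2 = 1.5707963...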


Optimization (finding extrema) for functions of one variable

Closely related to root finding: rather than solving f(x) = 0 on some
interval, find min(f(x)) on some interval...

Optimization (finding extrema) for functions of one variable

General algorithms (similar to rootfinding algorithms):
Bracketing algorithms: Golden Section Search (linear convergence)
Interpolation algorithms: Successive Parabolic Interpolation
Hybrid algorithms: Matlab's fminbnd(func,a,b,options)


Bracketing Algorithm: Golden Section Search

Like bisection: given f(x) in C[a,b] that is convex (unimodal) over the interval
[a,b], reduce the interval size until it "brackets" the minimum.

Note: bracketing a minimum is not as well defined as bracketing a root;
you should always plot your function.

Questions:
1) How many points are required to approximate a minimum?
2) How many points are needed to subdivide [a,b]?
3) How do we choose those points efficiently?

Bracketing Algorithm: Golden Section Search

How to divide an interval for successive minimum brackets...

Consider the unit interval [0,1], divided at the interior points 1-τ and τ
(with 1/2 < τ < 1). Choosing τ so that the reduced interval is divided in the
same proportions, i.e. τ² = 1-τ, lets each iteration reuse one interior point.
The solution is τ = (√5-1)/2 ≈ 0.618, the golden ratio.


Golden Section Search: The Algorithm

Given f(x) and a unimodal bracket [a,b]

Initialize: x1 = a + (1-τ)(b-a), x2 = a + τ(b-a), with τ = (√5-1)/2
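
A minimal Matlab sketch of the iteration, assuming the standard update in which one interior point is reused per step (the function name goldenSection is illustrative, not the lecture's code):

function xMin = goldenSection(func,a,b,tol)
% GOLDENSECTION - minimize a unimodal func on [a,b] by golden section search
tau = (sqrt(5)-1)/2;              % golden ratio, ~0.618
x1 = a + (1-tau)*(b-a);           % interior points
x2 = a + tau*(b-a);
f1 = func(x1);
f2 = func(x2);
while (b-a) > tol
    if f1 < f2                    % minimum is in [a,x2]: reuse x1 as new x2
        b = x2;
        x2 = x1; f2 = f1;
        x1 = a + (1-tau)*(b-a);
        f1 = func(x1);            % only one new evaluation per iteration
    else                          % minimum is in [x1,b]: reuse x2 as new x1
        a = x1;
        x1 = x2; f1 = f2;
        x2 = a + tau*(b-a);
        f2 = func(x2);
    end
end
xMin = (a+b)/2;

% e.g. goldenSection(@(x) sin(2*pi*x), 0.5, 1, 1e-8) returns ~0.75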

Interpolation Algorithm: Successive Parabolic Interpolation

Like the secant method: use multiple samples of the function to build a local
model; here, fit a parabola through three points and take its vertex as the
next estimate of the minimum.


Successive Parabolic Interpolation: The Algorithm

Given: f(x) and [a,b]

initialize: x = [ a b (a+b)/2 ];
n = 2:-1:0;                          % powers of x in the fitted quadratic
for i = 1:MAX_ITERATIONS
    f = func(x);
    p = polyfit(x,f,2);              % parabola through the 3 points
    pPrime = n.*p;                   % coefficients of p'(x)
    xNew(i) = -pPrime(2)/pPrime(1);  % vertex: where p'(x) = 0
    x = [ x(2:end) xNew(i) ];        % drop the oldest point
    if i > 1                         % need two estimates before testing
        relErr = abs(xNew(i)-xNew(i-1))/abs(xNew(i));
        if relErr < tol
            break
        end
    end
end
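
A hypothetical setup for running the loop above (func, the interval, tol, and MAX_ITERATIONS are assumed values for illustration, not from the slides):

func = @(x) sin(2*pi*x);   % unimodal on [0.5, 1] with minimum at x = 0.75
a = 0.5; b = 1;
tol = 1e-8;
MAX_ITERATIONS = 50;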

Hybrid schemes: fminbnd

Successive parabolic interpolation + golden section search
in Matlab: use fminbnd
syntax:
options = optimset('disp','iter')
x = fminbnd(@func,x1,x2,options)

examples:
x = fminbnd(@(x) sin(2*pi*x),0,1,options)
x = fminbnd(@(x) -humps(x),.1,.4,options)
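
(The second example finds a local maximum of humps by minimizing its negative; humps is a standard Matlab demo function with two peaks on [0,1], one near x = 0.3.)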
