Golden Section Method - New Practice

The document discusses various numerical optimization methods, including finding local and global optima, gradient descent, Newton-Raphson, bisection, golden-section search, and parabolic interpolation. It also covers multidimensional optimization problems, visualization techniques, and the MATLAB optimization functions fminbnd and fminsearch.

Numerical Analysis

2015 Fall

Optimization
Optimization
• Optimization is the process of creating something that is as effective as possible.

• From a mathematical perspective, optimization deals with finding the maxima and minima of a function that depends on one or more variables.
Optimization
• Numerically, optimization means guessing and searching for a point on the function (similar to root-finding methods).

→ Equivalent to root finding on f′(x) = 0.

Multidimensional Optimization
• One-dimensional problems involve functions that depend on a single independent variable, for example f(x).
• Multidimensional problems involve functions that depend on two or more independent variables, for example f(x, y).
Multidimensional Optimization
• Face detection: fitting a face model to an image is a multidimensional optimization problem.

Face model → 5 variables
Global vs. Local
• A global optimum represents the very best
solution while a local optimum is better than
its immediate neighbors.
• Cases that include local optima are called
multimodal.
• Generally, we desire to find the global optimum.
Golden Ratio
• Two quantities are in the golden ratio if their
ratio is the same as the ratio of their sum to
the larger of the two quantities.
Golden Ratio

For quantities a > b > 0: (a + b) / a = a / b = φ = (1 + √5) / 2 = 1.6180…
Bisection
• The bisection method is a variation of the incremental search method in which the interval is always divided in half.
• If a function changes sign over an interval, the function value at the midpoint is evaluated.
• The location of the root is then determined as lying within the subinterval where the sign change occurs.
• The absolute error is reduced by a factor of 2 with each iteration.
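A minimal MATLAB sketch of the bisection method described above (the function handle, bracket, and tolerance in the usage line are illustrative assumptions, not from the slides):

function r = bisect(f, xl, xu, tol)
% Bisection root finding on [xl, xu]; assumes f(xl) and f(xu) differ in sign.
if f(xl)*f(xu) > 0, error('No sign change on [xl, xu]'); end
while (xu - xl)/2 > tol
    xm = (xl + xu)/2;        % midpoint of the current interval
    if f(xl)*f(xm) < 0       % sign change in the left half
        xu = xm;
    else                     % sign change in the right half (or xm is a root)
        xl = xm;
    end
end
r = (xl + xu)/2;

Usage: bisect(@(x) x.^2 - 2, 0, 2, 1e-8) returns approximately 1.4142; each pass halves the interval, matching the factor-of-2 error reduction noted above.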
Golden-Section Search
• A search algorithm for finding a minimum on an interval [xl, xu] that contains a single minimum (a unimodal interval).

• Uses the golden ratio φ = 1.6180… to determine the locations of two interior points x1 and x2.

• By using the golden ratio, one of the interior points can be reused in the next iteration, so the function is evaluated only once per iteration.
Golden-Section Search (cont)
d  ( 1)(x u  x l )
x1  x l  d
x2  xu  d

• If f(x1)<f(x2), x2 becomes the new


lower limit and x1 becomes the new
x2
(as in figure).
• If f(x2)<f(x1), x1 becomes the new
upper limit and x2 becomes the new
x1.
• In either case, only one new interior
point is needed and the function is
only evaluated one more time.
 Golden ratio…
Example

Error

• Each iteration shrinks the bracketing interval by a factor of φ − 1 ≈ 0.618, so the maximum possible error decreases geometrically.
• Max error: taking the better interior point as the estimate xopt, the error bound is εa = (2 − φ) |(xu − xl) / xopt| × 100%.
Code for Golden-Section
Homework
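Since the slide leaves the code as homework, what follows is only a minimal illustrative sketch of golden-section search for a minimum (not the assigned solution; the function names and the usage example are my own):

function [xopt, fopt] = goldmin(f, xl, xu, tol)
% Golden-section search for the minimum of a unimodal f on [xl, xu].
phi = (1 + sqrt(5))/2;
d  = (phi - 1)*(xu - xl);
x1 = xl + d;  f1 = f(x1);    % upper interior point
x2 = xu - d;  f2 = f(x2);    % lower interior point
while (xu - xl) > tol
    if f1 < f2               % minimum lies in [x2, xu]
        xl = x2;
        x2 = x1;  f2 = f1;   % reuse old x1 as the new x2
        x1 = xl + (phi - 1)*(xu - xl);
        f1 = f(x1);
    else                     % minimum lies in [xl, x1]
        xu = x1;
        x1 = x2;  f1 = f2;   % reuse old x2 as the new x1
        x2 = xu - (phi - 1)*(xu - xl);
        f2 = f(x2);
    end
end
if f1 < f2, xopt = x1; fopt = f1; else, xopt = x2; fopt = f2; end

Usage: goldmin(@(x) (x - 2).^2 + 1, 0, 5, 1e-6) returns xopt ≈ 2. Note that each pass performs exactly one new function evaluation, which is the point of using the golden ratio.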
Parabolic Interpolation
• Uses parabolic interpolation of three points to estimate the optimum location.
• The location of the maximum/minimum of the parabola defined by interpolating three points (x1, x2, and x3) is:
1 x 2  x1   f x 2   f x 3  x 2  x 3   f x 2   f x1
2 2

x4  x2 
2 x 2  x1  f x 2   f x 3  x 2  x 3  f x 2   f x1

• The new point x4 and the two points surrounding it (either x1 and x2, or x2 and x3) are used for the next iteration of the algorithm.
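A short MATLAB sketch of one parabolic-interpolation step using the formula above (the example function and the three starting points are my own illustrative choices):

f  = @(x) 2*sin(x) - x.^2/10;    % example function to maximize
x1 = 0; x2 = 1; x3 = 4;          % three points bracketing the optimum
num = (x2 - x1)^2*(f(x2) - f(x3)) - (x2 - x3)^2*(f(x2) - f(x1));
den = (x2 - x1)*(f(x2) - f(x3)) - (x2 - x3)*(f(x2) - f(x1));
x4 = x2 - 0.5*num/den            % estimated optimum location, about 1.5055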
fminbnd Function Homework

• MATLAB has a built-in function, fminbnd, which combines the golden-section search and parabolic interpolation.

– [xmin, fval] = fminbnd(function, x1, x2)

• Options may be passed through a fourth argument using optimset, similar to fzero.
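For example, minimizing a simple quadratic with fminbnd (the function here is my own illustration):

f = @(x) (x - 2).^2 + 1;         % minimum at x = 2
[xmin, fval] = fminbnd(f, 0, 5)  % returns xmin ≈ 2, fval ≈ 1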
Multidimensional Visualization
• Functions of two dimensions may be visualized using contour or surface/mesh plots.
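A minimal MATLAB sketch showing both plot types, using the same two-variable function that appears in the fminsearch example later in these slides:

[X, Y] = meshgrid(-2:0.1:2, -2:0.1:2);    % grid of (x, y) sample points
Z = 2 + X - Y + 2*X.^2 + 2*X.*Y + Y.^2;   % evaluate f(x, y) on the grid
subplot(1, 2, 1); surf(X, Y, Z); title('Surface plot');
subplot(1, 2, 2); contour(X, Y, Z); title('Contour plot');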
Gradient Descent
• Gradient descent is based on the observation that if a multivariable function F is defined and differentiable in a neighborhood of a point, then F decreases fastest if one moves from that point in the direction of the negative gradient, −∇F.

• Gradient descent is a first-order optimization algorithm.
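A minimal MATLAB sketch of gradient descent with a fixed step size, applied to the same function used in the fminsearch example later (the step size, iteration count, and starting point are my own illustrative choices):

f     = @(x) 2 + x(1) - x(2) + 2*x(1)^2 + 2*x(1)*x(2) + x(2)^2;
gradf = @(x) [1 + 4*x(1) + 2*x(2); -1 + 2*x(1) + 2*x(2)];  % analytic gradient
x = [-0.5; 0.5];                 % starting point
alpha = 0.1;                     % fixed step size
for k = 1:200
    x = x - alpha*gradf(x);      % move in the direction of the negative gradient
end
x                                % approaches the minimum at [-1; 1.5]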
Newton-Raphson
• Based on forming the tangent line to the curve of the first derivative g′(x) at some guess xi, then following the tangent line to where it crosses the x-axis:

xi+1 = xi − g′(xi) / g″(xi)

which is the Newton-Raphson root-finding iteration xi+1 = xi − f(xi) / f′(xi) applied to f(x) = g′(x).
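A minimal MATLAB sketch of this iteration for a 1-D optimum (the example function, its derivatives, and the starting guess are my own illustrative choices):

g   = @(x) 2*sin(x) - x.^2/10;   % function to maximize
dg  = @(x) 2*cos(x) - x/5;       % g'(x)
d2g = @(x) -2*sin(x) - 1/5;      % g''(x)
x = 2.5;                         % initial guess
for k = 1:10
    x = x - dg(x)/d2g(x);        % Newton step applied to g'(x)
end
x                                % converges to about 1.4276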
fminsearch Function Homework
• MATLAB has a built-in function, fminsearch, that can be used to determine the minimum of a multidimensional function.
– [xmin, fval] = fminsearch(function, x0)
– xmin in this case will be a row vector containing the location of the minimum, while x0 is an initial guess. Note that x0 must contain as many entries as the function expects.
• The function must be written in terms of a single variable, where different dimensions are represented by different indices of that variable.
fminsearch Function Homework
• To minimize
f(x, y) = 2 + x − y + 2x² + 2xy + y²
rewrite it in terms of a single indexed variable:
f(x1, x2) = 2 + x1 − x2 + 2(x1)² + 2 x1 x2 + (x2)²
• f = @(x) 2 + x(1) - x(2) + 2*x(1)^2 + 2*x(1)*x(2) + x(2)^2;
[x, fval] = fminsearch(f, [-0.5, 0.5])

• Note that x0 has two entries: f expects it to contain two values.
• MATLAB reports a minimum value of 0.7500 at the location [-1.0000 1.5000].
