Optimization & 1-D Unconstrained Optimization

The document discusses optimization methods for finding the maximum or minimum of an objective function subject to constraints. It begins by introducing the mathematical formulation of optimization problems and classifying them as linear, quadratic, or nonlinear programming based on the nature of the objective function and constraints. It then describes several approaches for solving unconstrained and constrained optimization problems, including golden-section search, quadratic interpolation, and Newton's method for finding local optima of functions. The document emphasizes that the goal is to find global rather than local optima and to do so efficiently with as few function evaluations as possible.
Optimization: Introduction & 1-D Unconstrained Optimization
Mathematical Background
Objective: Maximize or minimize f(x)
subject to

    d_i(x) ≤ a_i,  i = 1, 2, …, m   (*)
    e_i(x) = b_i,  i = 1, 2, …, p   (*)

x = {x1, x2, …, xn}
f(x): objective function
d_i(x): inequality constraints
e_i(x): equality constraints
a_i and b_i are constants

Note: Maximizing f(x) is equivalent to minimizing −f(x).
Classification of Optimization Problems
• If f(x) and the constraints are linear, we have linear programming.
  – e.g.: Maximize x + y subject to
        3x + 4y ≤ 2
        y ≤ 5
    (see the sketch below)
• If f(x) is quadratic and the constraints are linear, we have quadratic programming.
• If f(x) is neither linear nor quadratic and/or the constraints are nonlinear, we have nonlinear programming.
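As an illustrative aside (not from the slides), the small LP above can be solved with SciPy's linprog. Note that linprog minimizes, so the objective is negated; the non-negativity bounds x, y ≥ 0 are an added assumption here (without them this particular LP is unbounded).

    # Sketch: solving the example LP with SciPy (assumes x, y >= 0).
    from scipy.optimize import linprog

    # Maximize x + y  <=>  minimize -(x + y)
    c = [-1.0, -1.0]
    A_ub = [[3.0, 4.0],   # 3x + 4y <= 2
            [0.0, 1.0]]   #      y <= 5
    b_ub = [2.0, 5.0]

    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None), (0, None)])
    print(res.x, -res.fun)   # optimum at x = 2/3, y = 0, objective = 2/3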
Classification of Optimization Problems
When constraints (the equations marked with *) are included, we have a constrained optimization problem; otherwise, we have an unconstrained optimization problem.
Optimization Methods
One-Dimensional Unconstrained Optimization
  • Golden-Section Search
  • Quadratic Interpolation
  • Newton's Method

Multi-Dimensional Unconstrained Optimization
  • Non-gradient or direct methods
  • Gradient methods

Linear Programming (Constrained)
  • Graphical Solution
  • Simplex Method
Global and Local Optima
A function is said to be multimodal on a given interval if there is more than one minimum/maximum point in the interval.
Characteristics of Optima
To find the optima, we can find the zeroes of f'(x). The sign of f"(x) then distinguishes the two cases: f"(x) < 0 at a maximum and f"(x) > 0 at a minimum.
Newton's Method
Let g(x) = f'(x).
Thus the zeroes of g(x) are the optima of f(x).
Substituting g(x) into the updating formula of the Newton-Raphson method, we have

    x_{i+1} = x_i − g(x_i)/g'(x_i) = x_i − f'(x_i)/f"(x_i)

Note: Other root-finding methods will also work.
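As a minimal sketch (not from the slides), the update above in Python; the test function f(x) = 2 sin x − x²/10, its derivatives, and the starting point are illustrative assumptions.

    # Sketch: Newton's method for a 1-D optimum (illustrative example).
    import math

    def newton_opt(df, d2f, x0, tol=1e-8, max_iter=50):
        # Find a zero of f'(x), i.e. a local optimum of f(x).
        x = x0
        for _ in range(max_iter):
            step = df(x) / d2f(x)      # x_{i+1} = x_i - f'(x_i)/f''(x_i)
            x -= step
            if abs(step) < tol:
                break
        return x

    f   = lambda x: 2 * math.sin(x) - x**2 / 10   # assumed test function
    df  = lambda x: 2 * math.cos(x) - x / 5
    d2f = lambda x: -2 * math.sin(x) - 1 / 5

    x_opt = newton_opt(df, d2f, x0=2.5)
    print(x_opt, f(x_opt))   # x ≈ 1.4276, f ≈ 1.7757 (f"(x_opt) < 0: a maximum)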
Newton's Method
• Shortcomings
  – Need to derive f'(x) and f"(x).
  – May diverge.
  – May "jump" to another solution far away.
• Advantages
  – Fast convergence rate near the solution.
  – Hybrid approach: use a bracketing method to find an approximation near the solution, then switch to Newton's method.
Bracketing Method
[Figure: f(x) on [xl, xu] with interior points xa and xb]

Suppose f(x) is unimodal on the interval [xl, xu]. That is, there is only one local maximum point in [xl, xu].
Let xa and xb be two points in (xl, xu) where xa < xb.
Bracketing Method
[Figure: the two cases, showing the sub-interval [xb, xu] being eliminated]

If f(xa) > f(xb), then the maximum point will not reside in the interval [xb, xu], and as a result we can eliminate the portion to the right of xb.
In other words, in the next iteration we can make xb the new xu.
Generic Bracketing Method (Pseudocode)
// xl, xu: Lower and upper bounds of the interval
// es: Acceptable relative error
function BracketingMax(xl, xu, es) {
    optimal = -infinity;
    do {
        prev_optimal = optimal;
        Select xa and xb s.t. xl <= xa < xb <= xu;

        if (f(xa) < f(xb))
            xl = xa;    // the maximum cannot lie in [xl, xa]
        else if (f(xa) > f(xb))
            xu = xb;    // the maximum cannot lie in [xb, xu]

        optimal = max(f(xa), f(xb));
        ea = abs((optimal - prev_optimal) / optimal);
    } while (ea > es);

    return optimal;
}
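The skeleton above can be made runnable in Python; this is only a sketch, where the point-selection rule (two points just left and right of the midpoint, the naive strategy discussed on the next slide) and the test function are assumptions for demonstration.

    # Sketch: a runnable rendering of the generic bracketing maximizer.
    # Assumes f is unimodal on [xl, xu]; the midpoint-based point selection
    # and the test function are illustrative choices, not from the slides.
    import math

    def bracketing_max(f, xl, xu, es=1e-6, eps=1e-9, max_iter=200):
        optimal = float("-inf")
        for _ in range(max_iter):
            prev_optimal = optimal
            mid = (xl + xu) / 2
            xa, xb = mid - eps, mid + eps   # costs two evaluations per iteration
            if f(xa) < f(xb):
                xl = xa                     # maximum cannot lie in [xl, xa]
            elif f(xa) > f(xb):
                xu = xb                     # maximum cannot lie in [xb, xu]
            optimal = max(f(xa), f(xb))
            if abs((optimal - prev_optimal) / optimal) <= es:
                break
        return (xl + xu) / 2

    print(bracketing_max(lambda x: 2 * math.sin(x) - x**2 / 10, 0.0, 4.0))
    # ≈ 1.43 (the true maximum of this illustrative function is near 1.4276)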
Bracketing Method
How would you suggest we select xa and xb (with the objective of minimizing computation)?
• Reduce the interval as much as possible in each iteration
  – Set xa and xb close to the center so that we can halve the interval in each iteration
  – Drawback: function evaluation is usually a costly operation, and this approach needs two new evaluations per iteration.
• Reduce the number of function evaluations
  – Select xa and xb such that one of them can be reused in the next iteration (so that we only need to evaluate f(x) once in each iteration).
  – How should we select such points?
[Figure: the current iteration's interval [xl, xu] with interior points xa and xb and lengths l0 and l1; the next iteration's interval [x'l, x'u] with interior points x'a and x'b and lengths l'0 and l'1]

Objective:

    l1/l0 = l'1/l'0 = R  and  x'b = xa or x'a = xb

If we can calculate xa and xb based on the ratio R w.r.t. the current interval length in each iteration, then we can reuse one of xa and xb in the next iteration.
In this example, xa is reused as x'b in the next iteration, so in the next iteration we only need to evaluate f(x'a).
[Figure: the same construction, showing that l'0 = l1 and l'1 = l0 − l1]

Since l'0 = l1 and l'1 = l0 − l1, requiring the same ratio in both iterations gives

    R = l1/l0 = l'1/l'0 = (l0 − l1)/l1

Substituting l1 = R·l0:

    (l0 − R·l0)/(R·l0) = R
    ⇒ R²·l0 + R·l0 − l0 = 0
    ⇒ R² + R − 1 = 0
    ⇒ R = (−1 + √(1 + 4(1)(1))) / 2(1) = (√5 − 1)/2 ≈ 0.61803

the Golden Ratio.
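A quick numerical check of this root (illustrative only):

    # Verify that R = (sqrt(5) - 1)/2 solves R^2 + R - 1 = 0.
    import math

    R = (math.sqrt(5) - 1) / 2
    print(R)              # 0.6180339887...
    print(R**2 + R - 1)   # ~0, up to floating-point rounding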
Golden-Section Search
• Starts with two initial guesses, xl and xu.
• Two interior points xa and xb are calculated based on the golden ratio as

      xa = xu − d  and  xb = xl + d,  where  d = ((√5 − 1)/2)(xu − xl)

• In the first iteration, both xa and xb need to be calculated.
• In subsequent iterations, xl and xu are updated accordingly and only one of the two interior points needs to be calculated. (The other one is inherited from the previous iteration; a code sketch follows below.)
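A minimal Python sketch of the search, assuming f is unimodal on [xl, xu]; the interval-width stopping rule and the test function are illustrative choices, not from the slides.

    # Sketch: golden-section search for a maximum (assumes f unimodal on [xl, xu]).
    import math

    R = (math.sqrt(5) - 1) / 2   # golden ratio, ~0.618

    def golden_section_max(f, xl, xu, es=1e-8, max_iter=200):
        d = R * (xu - xl)
        xa, xb = xu - d, xl + d          # first iteration: evaluate both points
        fa, fb = f(xa), f(xb)
        for _ in range(max_iter):
            if fa > fb:                  # maximum is in [xl, xb]: reuse xa as new xb
                xu, xb, fb = xb, xa, fa
                xa = xu - R * (xu - xl)
                fa = f(xa)               # only one new evaluation
            else:                        # maximum is in [xa, xu]: reuse xb as new xa
                xl, xa, fa = xa, xb, fb
                xb = xl + R * (xu - xl)
                fb = f(xb)
            if abs(xu - xl) < es * (abs(xl) + abs(xu)):
                break
        return (xl + xu) / 2

    print(golden_section_max(lambda x: 2 * math.sin(x) - x**2 / 10, 0.0, 4.0))
    # ≈ 1.4276 for this illustrative function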
Golden-Section Search
• In each iteration the interval is reduced to about 61.8% (the golden ratio) of its previous length.
• After 10 iterations, the interval is shrunk to about (0.618)^10, or 0.8%, of its initial length.
• After 20 iterations, the interval is shrunk to about (0.618)^20, or 0.0066%.
Quadratic Interpolation
[Figure: f(x) with a fitted parabola through x0, x1, x2 and its optimum at x3]

Idea:
(i) Approximate f(x) using a quadratic function g(x) = ax² + bx + c
(ii) Optima of f(x) ≈ Optima of g(x)
Quadratic Interpolation
• The shape near an optimum typically looks like a parabola. We can therefore approximate the original function f(x) using a quadratic function g(x) = ax² + bx + c.
• At the optimum point of g(x), g'(x) = 2ax + b = 0. Let x3 be the optimum point; then x3 = −b/(2a).
• How do we compute a and b?
  – 2 points => unique straight line (1st-order polynomial)
  – 3 points => unique parabola (2nd-order polynomial)
  – So, we need to pick three points that surround the optimum.
  – Let these points be x0, x1, x2 such that x0 < x1 < x2.
Quadratic Interpolation
• a and b can be obtained by solving the system of linear equations

      a·x0² + b·x0 + c = f(x0)
      a·x1² + b·x1 + c = f(x1)
      a·x2² + b·x2 + c = f(x2)

• Substituting a and b into x3 = −b/(2a) yields

      x3 = [f(x0)(x1² − x2²) + f(x1)(x2² − x0²) + f(x2)(x0² − x1²)]
           / [2f(x0)(x1 − x2) + 2f(x1)(x2 − x0) + 2f(x2)(x0 − x1)]
Quadratic Interpolation
• The process can be repeated to improve the approximation.
• Next step: decide which sub-interval to discard.
  – Since f(x3) > f(x1):
    • If x3 > x1, discard the interval to the left of x1, i.e., set x0 = x1 and x1 = x3.
    • If x3 < x1, discard the interval to the right of x1, i.e., set x2 = x1 and x1 = x3.
• Calculate x3 based on the new x0, x1, x2. (A full iteration sketch follows below.)
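Putting the last two slides together, here is a minimal Python sketch of the iteration for a maximum. It is illustrative only: the test function, the initial bracket, and the stopping rule are assumptions, and the discard rule above presumes f(x3) > f(x1) holds in each iteration.

    # Sketch: quadratic (parabolic) interpolation search for a maximum.
    import math

    def quadratic_interp_max(f, x0, x1, x2, tol=1e-8, max_iter=100):
        for _ in range(max_iter):
            f0, f1, f2 = f(x0), f(x1), f(x2)
            # Optimum of the parabola through (x0,f0), (x1,f1), (x2,f2):
            num = f0*(x1**2 - x2**2) + f1*(x2**2 - x0**2) + f2*(x0**2 - x1**2)
            den = 2*f0*(x1 - x2) + 2*f1*(x2 - x0) + 2*f2*(x0 - x1)
            x3 = num / den
            if abs(x3 - x1) < tol:
                return x3
            if x3 > x1:          # discard the interval left of x1
                x0, x1 = x1, x3
            else:                # discard the interval right of x1
                x2, x1 = x1, x3
        return x1

    f = lambda x: 2 * math.sin(x) - x**2 / 10   # assumed test function
    print(quadratic_interp_max(f, 0.0, 1.0, 4.0))   # ≈ 1.4276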
