Classical Optimization
Solving optimization problems
[Diagram: the optimizer sends design variables to the model (together with constants); the model returns responses and derivatives of responses (design sensitivities) to the optimizer.]
Defining an optimization problem
1. Choose design variables and their bounds
2. Formulate objective
3. Formulate constraints (restrictions)
4. Choose suitable optimization algorithm
Example – Design of a SODA Can
Design a SODA can to hold a specified amount of SODA and meet other requirements.
The cans will be produced in billions, so it
is desirable to minimize the cost of
manufacturing.
Since the cost is related directly to the
surface area of the sheet metal used, it is
reasonable to minimize the sheet metal
required to fabricate the can.
Example – Design of a SODA Can (Cont.)
Requirements:
1. The diameter of the can should be no
more than 8 cm and no less than 3.5
cm.
2. The height of the can should be no more than 18 cm and no less than 8 cm.
3. The can is required to hold at least
400 ml of fluid.
Example – Design of a SODA Can (Cont.)
Design variables
D = diameter of the can (cm)
H = height of the can (cm)
Objective function
The design objective is to minimize the surface area of the sheet metal: S = πDH + πD²/2 (cylinder wall plus two end caps).
Example – Design of a SODA Can (Cont.)
The constraints must be formulated in terms of design variables.
The first constraint is that the can must hold at least 400 ml of fluid: πD²H/4 ≥ 400 (1 ml = 1 cm³).
Together with the bounds 3.5 ≤ D ≤ 8 and 8 ≤ H ≤ 18, the problem has two independent design variables and five explicit constraints.
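As a sketch, the whole formulation can be written down and handed to a general-purpose constrained optimizer. The objective, bounds, and volume constraint come from the example; the solver choice (SciPy's SLSQP) and the starting point are illustrative assumptions, not part of the slides.

```python
# Soda-can design problem: minimize sheet-metal area subject to a volume
# requirement and bounds on diameter D and height H (both in cm).
import numpy as np
from scipy.optimize import minimize

def surface_area(x):
    D, H = x
    return np.pi * D * H + np.pi * D**2 / 2      # wall + two end caps (cm^2)

def volume_constraint(x):
    D, H = x
    return np.pi * D**2 * H / 4 - 400.0          # >= 0 means at least 400 cm^3

result = minimize(surface_area,
                  x0=[6.0, 12.0],                       # illustrative initial guess
                  bounds=[(3.5, 8.0), (8.0, 18.0)],     # D and H bounds
                  constraints=[{"type": "ineq", "fun": volume_constraint}],
                  method="SLSQP")
print(result.x, surface_area(result.x))   # optimal (D, H) and minimum area
```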
Optimization Problem Characteristics
Linearity
Nonlinear objective functions can have multiple
local optima:
[Plots: a nonlinear objective f(x) of one variable and a nonlinear objective f(x1, x2) of two variables, each with multiple local optima.]
VARIABLES
x = No. of HONDA CITY cars produced
y = No. of HONDA CIVIC cars produced
COST of manufacturing:
C(x, y) = 6x² + 12y²
OBJECTIVE:
MINIMIZE COST
CONSTRAINT: 90 cars per day
x + y = 90
Original problem is rewritten as:
minimize L(x, λ) = f(x) − λ·h1(x)
EXAMPLE (Cont.)
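A minimal sketch of carrying this out for the HONDA example with SymPy; the cost function and constraint come from the slides, while the use of SymPy and the symbol names are my choices. The stationarity conditions of L are solved for x, y, and the multiplier.

```python
# Lagrange-multiplier solution of: min 6x^2 + 12y^2  s.t.  x + y = 90
import sympy as sp

x, y, lam = sp.symbols("x y lam", real=True)

f = 6 * x**2 + 12 * y**2        # manufacturing cost C(x, y)
h = x + y - 90                  # equality constraint h1(x, y) = 0
L = f - lam * h                 # Lagrangian L(x, y, lam) = f - lam * h1

# Set all partial derivatives of L to zero and solve the resulting system.
stationarity = [sp.diff(L, v) for v in (x, y, lam)]
print(sp.solve(stationarity, (x, y, lam), dict=True))
# [{x: 60, y: 30, lam: 720}]
```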
Unconstrained optimization algorithms
Single-variable methods
0th order (involving only f )
1st order (involving f and f ’ )
2nd order (involving f, f ’ and f ” )
Multiple variable methods
Gradient Descent Methods
Simplex Method
Sequential Linear Programming
Sequential Quadratic Programming
Etc.
Single-variable methods
Bisection method
Optimality conditions: minimum at stationary point
Root finding of f’
● Similar to sectioning methods, but uses the derivative f′:
[Plot: f and its derivative f′; the minimum of f is a root of f′.]
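A minimal bisection sketch under this view (root finding on f′); the bracket must be chosen so that f′ changes sign, and the example bracket, tolerance, and test function are illustrative assumptions.

```python
# Bisection for 1-D minimization (1st order): find a root of f' inside a
# bracket [a, b] where f'(a) and f'(b) have opposite signs.
import math

def bisection_minimize(df, a, b, tol=1e-8):
    if df(a) * df(b) > 0:
        raise ValueError("f' must change sign on [a, b]")
    while (b - a) > tol:
        m = 0.5 * (a + b)
        if df(a) * df(m) <= 0:    # sign change in the left half
            b = m
        else:                     # sign change in the right half
            a = m
    return 0.5 * (a + b)

# Example: f(x) = x*cos(2x), so f'(x) = cos(2x) - 2x*sin(2x); bracket [1, 2]
df = lambda x: math.cos(2 * x) - 2 * x * math.sin(2 * x)
print(bisection_minimize(df, 1.0, 2.0))   # stationary point near x ≈ 1.71
```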
Newton’s method
● New guess: xk+1 = xk + hk = xk − f′(xk) / f″(xk)
Newton’s method (cont.)
Fastest convergence of all the single-variable methods (quadratic near the solution):
[Plots: Newton iterates xk, xk+1, xk+2 on f′ for a converging case and a diverging case.]
● Unless it diverges
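A minimal sketch of this Newton iteration in code. The test function x·cos(2x) anticipates the example used a few slides further on; the starting point, tolerance, and iteration cap are illustrative assumptions.

```python
# Newton's method for 1-D minimization: root finding on f', assuming f is
# twice differentiable and f'' is nonzero at the iterates.
import math

def newton_minimize(df, d2f, x0, tol=1e-8, max_iter=50):
    """Iterate xk+1 = xk - f'(xk) / f''(xk)."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x  # may be a minimum, a maximum, or not converged at all

# Example: f(x) = x*cos(2x)
f   = lambda x: x * math.cos(2 * x)
df  = lambda x: math.cos(2 * x) - 2 * x * math.sin(2 * x)
d2f = lambda x: -4 * math.sin(2 * x) - 4 * x * math.cos(2 * x)
print(newton_minimize(df, d2f, x0=1.5))   # local minimum near x ≈ 1.71
```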
Summary single variable methods
0th order: bracketing + dichotomous sectioning, Fibonacci sectioning, golden ratio sectioning, quadratic interpolation, cubic interpolation
1st order: bisection method, secant method
2nd order: Newton method
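As one concrete 0th-order entry from this summary, here is a minimal golden-ratio sectioning sketch; it assumes the bracket [a, b] contains a single minimum, and the example bracket, tolerance, and test function (the one used on the next slide) are illustrative choices.

```python
# Golden-ratio sectioning (0th order): shrink a bracket [a, b] assumed to
# contain a single minimum, using function values only.
import math

def golden_section(f, a, b, tol=1e-6):
    inv_phi = (math.sqrt(5) - 1) / 2          # 1/phi ≈ 0.618
    c = b - inv_phi * (b - a)                 # interior points
    d = a + inv_phi * (b - a)
    while (b - a) > tol:
        if f(c) < f(d):                       # minimum lies in [a, d]
            b, d = d, c
            c = b - inv_phi * (b - a)
        else:                                 # minimum lies in [c, b]
            a, c = c, d
            d = a + inv_phi * (b - a)
    return 0.5 * (a + b)

# Example: x*cos(2x) on the bracket [1, 2]
print(golden_section(lambda x: x * math.cos(2 * x), 1.0, 2.0))   # ≈ 1.71
```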
Single variable Minimization (cont.)
Function: x·cos(2x)
(1) Change starting points
(2) Discuss and show sensitivity of solutions (see the sketch below)
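Reusing newton_minimize, df, and d2f from the Newton sketch above, one way to show this sensitivity is to run the same iteration from several arbitrarily chosen starting points; depending on the start, the result can be a different stationary point, or the iteration may not settle at all.

```python
# Illustrative starting-point study for x*cos(2x) using the Newton sketch above.
for x0 in (-1.0, 0.5, 1.5, 3.0):
    print(f"start {x0:5.1f}  ->  {newton_minimize(df, d2f, x0):9.4f}")
```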
Multiple variable methods
GRADIENT DESCENT METHODS
Consider a function J(x), where x = (x1, x2, ..., xn).
The gradient of J(x) at a point x0 is the vector of length n:
∇J(x0) = [ ∂J/∂x1 (x0), ∂J/∂x2 (x0), ..., ∂J/∂xn (x0) ]ᵀ
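A minimal gradient-descent sketch, assuming the gradient of J is available in closed form; the fixed step size, stopping rule, and the reuse of the HONDA cost function from the earlier example as an unconstrained test case are illustrative choices.

```python
# Gradient descent: repeatedly step against the gradient of J.
import numpy as np

def gradient_descent(grad_J, x0, step=0.05, tol=1e-6, max_iter=1000):
    """Iterate x_{k+1} = x_k - step * grad_J(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_J(x)
        if np.linalg.norm(g) < tol:
            break
        x = x - step * g
    return x

# Example: J(x) = 6 x1^2 + 12 x2^2 (the HONDA cost function, unconstrained)
grad_J = lambda x: np.array([12.0 * x[0], 24.0 * x[1]])
print(gradient_descent(grad_J, x0=[10.0, 10.0]))   # approaches [0, 0]
```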
This is an LP problem with the design variables contained in δx. The functions and
gradients evaluated at x0 are constant coefficients.
SEQUENTIAL LINEAR PROGRAMMING (Cont.)
1. Initial guess x0
2. Linearize about x0 using first order Taylor series
3. Solve resulting LP to find δx
4. Update x1=x0 + δx
5. Linearize about x1 and repeat:
xq=xq-1 + δx
where δx is the solution of the LP (model linearized about xq-1).
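A hedged sketch of these steps using scipy.optimize.linprog for the LP subproblem; the shrinking move limit, the test problem, and the iteration count are illustrative assumptions, not something given in the slides.

```python
# Illustrative SLP loop: linearize f and g about the current point, solve the
# LP in dx subject to move limits, update, and shrink the move limit to damp
# the oscillation SLP is prone to.
import numpy as np
from scipy.optimize import linprog

def slp(f, grad_f, g, grad_g, x0, move=0.5, shrink=0.7, iters=25):
    """Minimize f(x) subject to g(x) <= 0 by sequential linearization."""
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        # LP in dx:  min grad_f(x)^T dx
        #            s.t. g(x) + grad_g(x)^T dx <= 0,  |dx_i| <= move
        res = linprog(c=grad_f(x),
                      A_ub=np.atleast_2d(grad_g(x)),
                      b_ub=np.atleast_1d(-g(x)),
                      bounds=[(-move, move)] * x.size,
                      method="highs")
        x = x + res.x
        move *= shrink
    return x

# Assumed test problem: min x1^2 + x2^2  s.t.  1 - x1 - x2 <= 0
f      = lambda x: x[0]**2 + x[1]**2
grad_f = lambda x: np.array([2.0 * x[0], 2.0 * x[1]])
g      = lambda x: 1.0 - x[0] - x[1]
grad_g = lambda x: np.array([-1.0, -1.0])
print(slp(f, grad_f, g, grad_g, x0=[2.0, 2.0]))   # near the optimum [0.5, 0.5]
```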
SEQUENTIAL QUADRATIC PROGRAMMING
Create a quadratic approximation to the Lagrangian
Create linear approximations to the constraints
Solve the quadratic problem to find the search direction, S
Perform the 1-D search
Update the approximation to the Lagrangian
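For the equality-constrained case, one way to make these steps concrete is a single Newton/SQP step on the KKT conditions of the Lagrangian L(x, λ) = f(x) − λ·h(x): a quadratic model of L plus a linearized constraint gives a small linear system for the step and the multiplier. The sketch below reuses the HONDA example under that interpretation; a full SQP method would add the 1-D search and inequality handling listed above.

```python
# One SQP-style step for min f(x) s.t. h(x) = 0, using the slides' sign
# convention L(x, lam) = f(x) - lam*h(x).
import numpy as np

def sqp_step(grad_f, hess_L, h, grad_h, x, lam=0.0):
    """Solve [hess_L, -grad_h; grad_h^T, 0][d; dlam] = [-(grad_f - lam*grad_h); -h]."""
    n = x.size
    K = np.zeros((n + 1, n + 1))
    K[:n, :n] = hess_L(x, lam)
    K[:n, n] = -grad_h(x)
    K[n, :n] = grad_h(x)
    rhs = np.concatenate([-(grad_f(x) - lam * grad_h(x)), [-h(x)]])
    sol = np.linalg.solve(K, rhs)
    return x + sol[:n], lam + sol[n]

# HONDA example: f = 6 x1^2 + 12 x2^2,  h = x1 + x2 - 90
grad_f = lambda x: np.array([12.0 * x[0], 24.0 * x[1]])
hess_L = lambda x, lam: np.diag([12.0, 24.0])   # h is linear, so Hessian of f
h      = lambda x: x[0] + x[1] - 90.0
grad_h = lambda x: np.array([1.0, 1.0])

x_new, lam_new = sqp_step(grad_f, hess_L, h, grad_h, np.array([0.0, 0.0]))
print(x_new, lam_new)   # one step solves this quadratic problem: [60, 30], 720
```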
Newton method
Expand f(x) in a second-order Taylor series about the point xk:
f(x) ≈ f(xk) + ∇f(xk)ᵀ(x − xk) + ½ (x − xk)ᵀ H(xk) (x − xk), where H(xk) is the Hessian of f at xk.
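Minimizing this quadratic model gives the Newton step xk+1 = xk − H(xk)⁻¹ ∇f(xk). A minimal sketch, with the test function and starting point chosen only for illustration:

```python
# Multivariable Newton's method: at each iterate, minimize the local quadratic
# model f(xk) + g^T d + 0.5 d^T H d by solving H d = -g.
import numpy as np

def newton(grad_f, hess_f, x0, tol=1e-8, max_iter=50):
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:
            break
        d = np.linalg.solve(hess_f(x), -g)   # Newton direction
        x = x + d                            # full step (no line search here)
    return x

# Example: f(x) = 6 x1^2 + 12 x2^2; Newton reaches the minimum in one step.
grad_f = lambda x: np.array([12.0 * x[0], 24.0 * x[1]])
hess_f = lambda x: np.diag([12.0, 24.0])
print(newton(grad_f, hess_f, x0=[10.0, -7.0]))   # [0, 0]
```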