Numerical Optimization I
Lecture 7
Karen Willcox
$$\min_{\mathbf{x}}\; J(\mathbf{x})$$
$$\text{s.t.}\;\; g_j(\mathbf{x}) \le 0 \qquad j = 1,\ldots,m_1$$
$$h_k(\mathbf{x}) = 0 \qquad k = 1,\ldots,m_2$$
$$x_i^l \le x_i \le x_i^u \qquad i = 1,\ldots,n$$
• For now, we consider a single objective function, J(x).
• There are n design variables and a total of m constraints (m = m1 + m2).
• The bounds are known as side constraints.
Linear vs. Nonlinear
$6x_1 + x_2 + 2x_3 = 10$ is linear in $\mathbf{x}$
$6x_1 + x_2 + 2x_3^2 = 10$ is nonlinear in $\mathbf{x}$
MATLAB® demo
The gradient vector:
$$\nabla J(\mathbf{x}) = \left[\frac{\partial J}{\partial x_1}\;\cdots\;\frac{\partial J}{\partial x_n}\right]^T$$
The Hessian matrix:
$$H(\mathbf{x}) = \begin{bmatrix}\dfrac{\partial^2 J}{\partial x_1^2} & \cdots & \dfrac{\partial^2 J}{\partial x_1\,\partial x_n}\\ \vdots & \ddots & \vdots\\ \dfrac{\partial^2 J}{\partial x_n\,\partial x_1} & \cdots & \dfrac{\partial^2 J}{\partial x_n^2}\end{bmatrix}$$
Example: $J(\mathbf{x}) = 3x_1^2 + x_1 x_2 + x_3^3 + 6x_2 x_3$
A Taylor series expansion of f about z0:
$$f(z) \approx f(z_0) + \left.\frac{df}{dz}\right|_{z_0}(z - z_0) + \frac{1}{2}\left.\frac{d^2 f}{dz^2}\right|_{z_0}(z - z_0)^2$$
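As a quick check of the formula, here is an illustrative expansion (my example, not from the slides) of $f(z) = z^3$ about $z_0 = 1$:
$$f(1) = 1,\qquad \left.\frac{df}{dz}\right|_{1} = 3,\qquad \left.\frac{d^2 f}{dz^2}\right|_{1} = 6$$
$$f(z) \approx 1 + 3(z - 1) + 3(z - 1)^2$$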
When the function depends on a vector:
$$J(\mathbf{x}) \approx J(\mathbf{x}^0) + \nabla J(\mathbf{x}^0)^T(\mathbf{x} - \mathbf{x}^0) + \frac{1}{2}(\mathbf{x} - \mathbf{x}^0)^T H(\mathbf{x}^0)(\mathbf{x} - \mathbf{x}^0)$$
Here $\nabla J(\mathbf{x}^0)^T$ is $1 \times n$, $(\mathbf{x} - \mathbf{x}^0)$ is $n \times 1$, and $H(\mathbf{x}^0)$ is $n \times n$.
The gradient vector and Hessian matrix can be approximated using finite differences if they are not available analytically or via adjoints (Lecture 8); a finite-difference sketch follows below.
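A minimal MATLAB sketch of a forward-difference gradient approximation (the function name fd_gradient and the step size h are illustrative choices, not from the lecture):

    % Forward-difference approximation of the gradient of J at x.
    function g = fd_gradient(J, x, h)
        if nargin < 3, h = 1e-6; end   % assumed default step size
        n = numel(x);
        g = zeros(n, 1);
        J0 = J(x);                     % baseline objective value
        for i = 1:n
            xp = x;
            xp(i) = xp(i) + h;         % perturb the i-th design variable
            g(i) = (J(xp) - J0) / h;   % (J(x + h*e_i) - J(x)) / h
        end
    end

For example, g = fd_gradient(@(x) 3*x(1)^2 + x(1)*x(2) + x(3)^3 + 6*x(2)*x(3), [1;1;1]) approximates the gradient of the example function above. Forward differences cost n extra function evaluations and have O(h) truncation error; central differences halve the error order at twice the cost.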
Existence & Uniqueness of an Optimum Solution
• Usually cannot guarantee that a global optimum is found:
  – multiple solutions may exist
  – numerical ill-conditioning
  → start from several initial solutions
• Can determine mathematically whether a relative (local) minimum has been found
• Under certain conditions, a global optimum can be guaranteed (for special classes of optimization problems, or with global optimization methods)
• It is very important to interrogate the “optimum” solution
$$\nabla J(\mathbf{x}^*) = 0 \quad\text{and}\quad H(\mathbf{x}^*) > 0 \;\Rightarrow\; \text{local minimum at } \mathbf{x}^*$$
[Figure: a one-dimensional J(x) with several stationary points, illustrating local vs. global minima.]
The minimum is only guaranteed to be a global optimum if H(x) > 0 for all values of x (e.g., a simple quadratic).
Example:
$$\min\; J(\mathbf{x}) = x_1^2 + 3x_2^2$$
$$\text{s.t.}\;\; x_1 + x_2 = 1$$
The Lagrange multipliers satisfy $\lambda_j \ge 0$ for the $m_1$ inequality constraints, while the multipliers on equality constraints are unrestricted in sign.
[Figure: contours of J in the (x1, x2) plane with the constraint line x1 + x2 = 1.]
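A worked solution of this example using Lagrange multipliers (my sketch of the standard calculation, not reproduced from the slides):
$$L(\mathbf{x}, \lambda) = x_1^2 + 3x_2^2 + \lambda(x_1 + x_2 - 1)$$
$$\frac{\partial L}{\partial x_1} = 2x_1 + \lambda = 0,\qquad \frac{\partial L}{\partial x_2} = 6x_2 + \lambda = 0,\qquad x_1 + x_2 = 1$$
$$\Rightarrow\; x_1 = 3x_2 \;\Rightarrow\; \mathbf{x}^* = \left(\tfrac{3}{4}, \tfrac{1}{4}\right),\qquad \lambda = -\tfrac{3}{2}$$
The multiplier is negative, which is admissible because the constraint is an equality.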
Convex Spaces
Pick any two points in the feasible region. If all points on the line
connecting these points lie in the feasible region, then the
constraint surfaces are convex.
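Stated formally (the standard definition of a convex set, included here for reference):
$$S \text{ is convex} \;\iff\; \lambda\mathbf{x} + (1 - \lambda)\mathbf{y} \in S \quad \forall\, \mathbf{x}, \mathbf{y} \in S,\; \forall\, \lambda \in [0, 1]$$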
In general, for engineering problems, the design space is not convex ...
• Global optimization
• Local derivative-free optimization
• Local gradient-based optimization
• Heuristic methods

Most methods have some convergence analysis and/or proofs.
• Repeat:
  – Identify potentially optimal hyper-rectangles
  – Divide potentially optimal hyper-rectangles
  – Sample at centers of new hyper-rectangles
[Flowchart: generic gradient-based iteration]
1. Choose x0; set q = 0.
2. Calculate J(x^q).
3. Calculate the search direction S^q.
4. Set q = q + 1.
5. Converged? If no, return to step 2; if yes, done.
Algorithm (steepest descent):
    choose x0, set x = x0
    repeat until converged:
        S = -∇J(x)
        choose α to minimize J(x + αS)
        x = x + αS
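A runnable MATLAB sketch of this loop (steepest_descent and the simple backtracking line search are my illustrative choices; the slides leave the one-dimensional search unspecified):

    % Steepest descent with a crude backtracking line search.
    function x = steepest_descent(J, gradJ, x0, tol, maxit)
        x = x0;
        for q = 1:maxit
            S = -gradJ(x);                % steepest-descent direction
            if norm(S) < tol, break; end  % converged: gradient is ~0
            alpha = 1;                    % backtrack until J decreases
            while J(x + alpha*S) >= J(x) && alpha > 1e-12
                alpha = alpha / 2;
            end
            x = x + alpha*S;
        end
    end

For example, steepest_descent(@(x) x(1)^2 + 3*x(2)^2, @(x) [2*x(1); 6*x(2)], [1; 1], 1e-6, 500) converges toward the unconstrained minimum at the origin.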
$$\mathbf{S}^1 = -\nabla J(\mathbf{x}^0)$$
$$\mathbf{S}^q = -\nabla J(\mathbf{x}^{q-1}) + \beta_q \mathbf{S}^{q-1}, \qquad \beta_q = \frac{\left\|\nabla J(\mathbf{x}^{q-1})\right\|^2}{\left\|\nabla J(\mathbf{x}^{q-2})\right\|^2}$$
• search directions are now conjugate
• directions S^j and S^k are conjugate if (S^j)^T H S^k = 0 (also called H-orthogonal)
• makes use of information from previous iterations without having to store a matrix
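A MATLAB sketch of the Fletcher-Reeves-style update above (conjugate_gradient and the backtracking search are illustrative assumptions, as in the steepest-descent sketch):

    % Nonlinear conjugate gradient with the beta ratio from the slide.
    function x = conjugate_gradient(J, gradJ, x0, tol, maxit)
        x = x0;
        g = gradJ(x);
        S = -g;                                % S^1 = -grad J(x^0)
        for q = 1:maxit
            if norm(g) < tol, break; end
            alpha = 1;                         % crude backtracking search
            while J(x + alpha*S) >= J(x) && alpha > 1e-12
                alpha = alpha / 2;
            end
            x = x + alpha*S;
            gNew = gradJ(x);
            beta = (gNew' * gNew) / (g' * g);  % ||grad new||^2 / ||grad old||^2
            S = -gNew + beta * S;              % conjugate direction update
            g = gNew;
        end
    end

Note that with an inexact line search the directions are only approximately conjugate; an exact one-dimensional minimization recovers the textbook behavior on quadratics.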
[Figure: two side-by-side contour plots in the (X1, X2) plane comparing optimizer iteration paths.]
Adapted from: "Optimal Design in Multidisciplinary System." AIAA Professional Development Short Course Notes. September 2002.
At the optimum, $\nabla J(\mathbf{x}^*) = 0$. Linearizing the gradient about $\mathbf{x}^0$ and setting it to zero:
$$\nabla J(\mathbf{x}^0) + H(\mathbf{x}^0)\,\Delta\mathbf{x} = 0$$
$$\Rightarrow\; \Delta\mathbf{x} = -H(\mathbf{x}^0)^{-1}\,\nabla J(\mathbf{x}^0)$$
Newton’s Method
$$\mathbf{S} = -H(\mathbf{x}^0)^{-1}\,\nabla J(\mathbf{x}^0)$$
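In code, one Newton step looks like the following MATLAB fragment (gradJ and hessJ are assumed function handles returning the gradient and Hessian; they are not defined in the slides):

    % One Newton iteration: solve H(x0)*S = -grad J(x0) instead of
    % explicitly inverting the Hessian.
    S  = -(hessJ(x0) \ gradJ(x0));
    x1 = x0 + S;   % full step; in practice a line search scales S

Solving the linear system with backslash is cheaper and numerically safer than forming the inverse of H.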
• Polynomial interpolation
  – pick several values of α
  – fit a polynomial to J(α)
  – efficient, but need to be careful with implementation
• Golden section search
  – easy to implement, but inefficient (a sketch follows below)
• The one-dimensional search is one of the more challenging aspects of implementing a gradient-based optimization algorithm
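A MATLAB sketch of golden section search for the one-dimensional problem of minimizing J(x + αS) over α (golden_section and the initial bracket [a, b] are my illustrative choices):

    % Golden section search on a unimodal 1-D function phi over [a, b].
    function alpha = golden_section(phi, a, b, tol)
        tau = (sqrt(5) - 1) / 2;           % golden ratio, ~0.618
        c = b - tau*(b - a);               % interior points
        d = a + tau*(b - a);
        while (b - a) > tol
            if phi(c) < phi(d)
                b = d; d = c; c = b - tau*(b - a);   % keep [a, d]
            else
                a = c; c = d; d = a + tau*(b - a);   % keep [c, b]
            end
        end
        alpha = (a + b) / 2;
    end

Called as alpha = golden_section(@(a) J(x + a*S), 0, 1, 1e-6). For clarity this version re-evaluates phi at both interior points on each pass; caching one of the two values gives the classic one-evaluation-per-iteration efficiency.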
$$\frac{dJ}{d\alpha} = \frac{\partial J}{\partial x_1}\frac{dx_1}{d\alpha} + \frac{\partial J}{\partial x_2}\frac{dx_2}{d\alpha} = \nabla J^T \mathbf{S}$$
α    J    dJ/dα
0   10    -5
1    6    -5
2    8    -5
For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.