Numerical Optimization
f(x) = 0.65 − 0.75/(1 + x^2) − 0.65·x·tan^-1(1/x)
General outline of NLP
• Start with an initial trial point X1
• Find a suitable direction Si that points in the direction of the optimum
• Find an appropriate step length λi for movement in the direction Si
• Obtain the new approximation Xi+1 as Xi+1 = Xi + λi·Si
• Test whether Xi+1 is optimum; if not, set i = i + 1 and go to step 2, else stop (a sketch follows below)
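A minimal MATLAB sketch of this outline, assuming steepest descent (Si = −∇f) as the direction rule and a fixed step length; both choices, the test objective, and the tolerance are illustrative, not part of the outline itself:
% General iterative outline with steepest descent and a fixed step.
f    = @(x) x.*(x - 1.5);        % illustrative objective
grad = @(x) 2*x - 1.5;           % its derivative
X    = 0.0;                      % step 1: initial trial point X1
lam  = 0.05;                     % step length lambda_i (kept fixed here)
for i = 1:200
    S = -grad(X);                % step 2: direction toward the optimum
    Xnew = X + lam*S;            % step 4: Xi+1 = Xi + lambda_i*Si
    if abs(grad(Xnew)) < 1e-4    % step 5: optimality test
        break                    % optimum reached
    end
    X = Xnew;                    % not optimal: set i = i + 1, go to step 2
end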
Classification of NLP
(continuing the fixed-step search of f(x) = x(x − 1.5) from x1 = 0 with step 0.05)
• F0(0) = 0
• Let x2 = 0.0 + 0.05 = 0.05
• F1(0.05) = 0.05(0.05 − 1.5) = −0.0725
• F1 < F0, continue
• x3 = 0.05 + 0.05 = 0.1
• F2(0.1) = 0.1(0.1 − 1.5) = −0.14
• F2 < F1, continue
• x4 = 0.15
• F3(0.15) = 0.15(0.15 − 1.5) = −0.2025
• F3 < F2, continue
• x5 = 0.2
• F4(0.2) = 0.2(0.2 − 1.5) = −0.26
• F4 < F3, continue
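These iterations can be reproduced with a short MATLAB loop; the objective x(x − 1.5), the start 0.0, and the step 0.05 all come from the example itself:
% Fixed-step search: advance by 0.05 while f keeps decreasing.
f = @(x) x.*(x - 1.5);
x = 0.0; Fprev = f(x);            % F0(0) = 0
while true
    x = x + 0.05;                 % next trial point
    Fcur = f(x);
    fprintf('x = %.2f, F = %.4f\n', x, Fcur);
    if Fcur >= Fprev, break, end  % F stopped decreasing: minimum bracketed
    Fprev = Fcur;                 % F decreased: continue
end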
Fibonacci
• Makes use of Fibonacci numbers: F1 = F2 = 1, Fn = Fn-1 + Fn-2
• Algorithm
  – Assume the initial interval of uncertainty is L0 = [a, b]
  – Let the total number of experiments be n; then define two points x1 and x2 as
    x1 = a + (F(n-2)/F(n))·L0 and x2 = a + (F(n-1)/F(n))·L0
  – Compare f(x1) and f(x2)
  – Delete the infeasible interval and form the new interval
  – Repeat the procedure until the specified number of iterations (a sketch follows below)
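A compact MATLAB sketch of the method. This simplified variant re-evaluates f at both interior points every iteration (the classic method reuses one of the two evaluations), and the test function and n are illustrative:
% Fibonacci search: shrink [a, b] using ratios of Fibonacci numbers.
f = @(x) x.*(x - 1.5);            % illustrative unimodal objective
a = 0; b = 1.5; n = 12;           % initial interval L0 and experiment count
F = [1 1];                        % Fibonacci numbers: F(1) = F(2) = 1
for k = 3:n, F(k) = F(k-1) + F(k-2); end
for k = n:-1:3
    L  = b - a;
    x1 = a + (F(k-2)/F(k))*L;     % two interior points placed by
    x2 = a + (F(k-1)/F(k))*L;     % Fibonacci ratios
    if f(x1) > f(x2)
        a = x1;                   % minimum cannot lie in [a, x1]: delete it
    else
        b = x2;                   % minimum cannot lie in [x2, b]: delete it
    end
end
xmin = (a + b)/2;                 % final interval of uncertainty holds x*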
Flow chart of the Fibonacci method
Newton method
• Update: x_{i+1} = x_i − f'(x_i)/f''(x_i)
• Stopping criterion: |f'(x_{i+1})| ≤ ε
Example: Find the minimum of the function
f(x) = 0.65 − 0.75/(1 + x^2) − 0.65·x·tan^-1(1/x)
with starting point x1 = 0.1 and stopping criterion ε = 0.01.
Solution: x1 = 0.1
f'(x) = 1.5x/(1 + x^2)^2 + 0.65x/(1 + x^2) − 0.65·tan^-1(1/x)
f''(x) = 1.5(1 − 3x^2)/(1 + x^2)^3 + 0.65(1 − x^2)/(1 + x^2)^2 + 0.65/(1 + x^2) = (2.8 − 3.2x^2)/(1 + x^2)^3
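The resulting Newton iteration, sketched in MATLAB with the derivatives derived above:
% Newton's method: xi+1 = xi - f'(xi)/f''(xi), start 0.1, eps = 0.01.
df  = @(x) 1.5*x./(1 + x.^2).^2 + 0.65*x./(1 + x.^2) - 0.65*atan(1./x);
d2f = @(x) (2.8 - 3.2*x.^2)./(1 + x.^2).^3;
x = 0.1;
while abs(df(x)) > 0.01           % stopping criterion |f'(x)| <= eps
    x = x - df(x)/d2f(x);         % Newton update
end
fprintf('x* = %.4f\n', x)         % converges to about x = 0.48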
• Gradient ∇f
  – N-component vector of partial derivatives
  – its negative gives the direction of steepest descent
  – the next iterate is obtained as Xi+1 = Xi − Ji^-1·∇f(Xi), where ∇f is the gradient of f
• Ji is the Hessian matrix of second-order derivatives
• Disadvantages
  – Needs storage of J
  – Computation of J is difficult
  – Needs the inversion of J
• Solution: use the quasi-Newton method
Example: minimize f(x1,x2) given by
f(x1, x2) = x1 − x2 + 2x1^2 + 2x1·x2 + x2^2
starting from X1 = [0; 0], with the update Xi+1 = Xi − J^-1·∇f(Xi).

Xi          Xi+1        J^-1                  ∇f(Xi)    check
[0; 0]      [-1; 1.5]   [0.5 -0.5; -0.5 1]    [1; -1]   ||∇f|| = 1.414 > ε
[-1; 1.5]   [-1; 1.5]   [0.5 -0.5; -0.5 1]    [0; 0]    ||∇f|| = 0 < ε → optimum
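A MATLAB sketch reproducing the table; the gradient and Hessian follow from the quadratic above, and for a quadratic Newton's method reaches the optimum in a single step:
% Newton's method for f(x1,x2) = x1 - x2 + 2*x1^2 + 2*x1*x2 + x2^2.
gradf = @(x) [1 + 4*x(1) + 2*x(2); -1 + 2*x(1) + 2*x(2)];
J = [4 2; 2 2];                   % Hessian (constant for a quadratic)
x = [0; 0];                       % X1
while norm(gradf(x)) > 1e-6       % check ||grad f|| against eps
    x = x - J\gradf(x);           % Xi+1 = Xi - J^-1 * grad f(Xi)
end
disp(x')                          % -1.0000  1.5000, as in the table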
Using MATLAB
• Steepest descent
• Quasi-Newton method
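A sketch of both options through the Optimization Toolbox's fminunc, applied to the previous example (quasi-Newton is fminunc's standard algorithm; forcing steepest descent relies on a legacy option):
% Quasi-Newton minimization with fminunc.
obj  = @(x) x(1) - x(2) + 2*x(1)^2 + 2*x(1)*x(2) + x(2)^2;
opts = optimoptions('fminunc', 'Algorithm', 'quasi-newton');
[x, fval] = fminunc(obj, [0; 0], opts);
% Steepest descent via the legacy option set:
% opts = optimset('HessUpdate', 'steepdesc');
% [x, fval] = fminunc(obj, [0; 0], opts);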
Constrained NLP
• Problem statement: minimize f(X) subject to gj(X) ≤ 0, hk(X) = 0, and lb ≤ X ≤ ub
Sequential quadratic programming
• One of the most widely used and effective methods for constrained NLP
• Converts the constrained NLP to a quadratic problem using
  – the gradient of the function
  – Lagrange multipliers
• Derivation
  – the Lagrangian of the above NLP is L(X, λ) = f(X) + Σj λj·gj(X)
• Converted problem: find ΔX minimizing Q(ΔX) = ∇f^T·ΔX + (1/2)·ΔX^T·[∇^2 L]·ΔX subject to the linearized constraints gj(X) + ∇gj^T·ΔX ≤ 0
Solution
• The solution procedure has the following steps
  – Step 1: start with an initial guess X
  – Step 2: update X as X ← X + ΔX, where ΔX solves the quadratic subproblem above; repeat until convergence
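For reference, fmincon itself provides an SQP implementation; selecting it is a documented option of the Optimization Toolbox:
% Choose the SQP algorithm when calling fmincon:
opts = optimoptions('fmincon', 'Algorithm', 'sqp');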
• MATLAB solution
– Steps :
• Write the objective function as m-file and save it
• Write the constraint function as a separate m-file and
save it
• Prepare the upper and lower bounds as vectors
• Call the built-in function fmincon() using the objective function, constraint functions, and upper and lower bounds as arguments
Example: solve the following minimization problem using MATLAB
• Minimize f(x1, x2) = 0.1x1 + 0.05773x2
  subject to 0.6/x1 + 0.3464/x2 ≤ 0.1, x1 ≥ 6, x2 ≥ 7
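The objective m-file is not shown on the slide; an objective consistent with the reported optimum below (x ≈ [9.464, 9.464], fx = 1.4928) is f = 0.1x1 + 0.05773x2, so obj2.m would read:
function f = obj2(x)
% Objective reconstructed from the reported optimum (the slide omits it).
f = 0.1*x(1) + 0.05773*x(2);
end
Save this in a separate m-file named obj2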
• function [c, ceq] = constr(x)
      ceq = [];
      c = [(0.6/x(1)) + (0.3464/x(2)) - 0.1;
           6 - x(1);
           7 - x(2)];
  end
Save this also in a separate m-file named constr
• Write the main calling function
• xo = [11.8756 7];
• [x, fx] = fmincon(@obj2, xo, [], [], [], [], [], [], @constr);
The answer will be
x=
9.4639 9.4642
fx =
1.4928
Modern optimization techniques
• Drawbacks of classical optimization techniques
  – need derivatives
  – search in a single direction
  – get stuck at local extrema
• AI-based techniques
  – Fuzzy logic systems
  – Neural networks
  – Evolutionary algorithms
    • Genetic algorithm
    • Simulated annealing
    • Particle swarm