Optimization
Optimization is an iterative process by which a desired solution (maximum/minimum) of a problem is found while satisfying all of its constraints or bound conditions.
Figure 2: Optimum solution is found while satisfying its constraint (derivative must be zero at optimum).
An optimization problem may be linear or non-linear. Non-linear optimization is accomplished by numerical search methods. Search methods are applied iteratively until a solution is achieved; the search procedure is termed an algorithm.
What is Optimization? (Cont.)
A linear problem is solved by the Simplex or graphical methods, and its solution lies on the boundaries of the feasible region. The solution of a non-linear problem may lie within the feasible region as well as on its boundaries.
Optimal points:
Local minimum/maximum: a point (solution) x* is a local optimum if no other x in its neighborhood gives a smaller (larger) value of f than f(x*).
Global minimum/maximum: a point (solution) x** is a global optimum if no other x in the entire search space gives a smaller (larger) value of f than f(x**).
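For instance (an illustrative example added here, not from the original), a one-variable function can have both kinds of minima, and a local search returns whichever one is closest to its starting point:

f = @(x) x.^4 - 3*x.^2 + x;    % local minimum near x = 1.13, global minimum near x = -1.30
xLocal  = fminsearch(f, 2);    % started near the local minimum  -> about  1.13
xGlobal = fminsearch(f, -2);   % started near the global minimum -> about -1.30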
Mathematical Background
The slope or gradient of the objective function f gives the direction in which the function increases most rapidly; the negative gradient gives the direction of steepest decrease.
df/dx = lim_{Δx→0} [ f(x + Δx) − f(x) ] / Δx

For a function of several variables, the gradient is the vector of partial derivatives:

∇f(x) = [ ∂f/∂x_1, ∂f/∂x_2, ....... , ∂f/∂x_p ]^T

First-order condition (FOC) at an optimum X*:

∇f(X*) = 0
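As a simple illustration of the FOC (an added example): for f(x) = (x − 2)²,

df/dx = 2(x − 2) = 0  ⇒  x* = 2,   and d²f/dx² = 2 > 0, so x* = 2 is a minimum.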
Hessian: the matrix of second derivatives of f of several variables; for two variables x and y,

H = [ ∂²f/∂x²     ∂²f/∂x∂y
      ∂²f/∂y∂x    ∂²f/∂y²  ]
Second-order condition (SOC): the eigenvalues of H(X*) are all positive (equivalently, all leading principal minors of H(X*) are positive).
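A small illustration of the SOC (an added example): for f(x, y) = x² + xy + y², the stationary point is (0, 0) and

H = [ 2  1
      1  2 ],   eigenvalues 1 and 3 (both positive),   leading minors 2 and 3 (both positive),

so (0, 0) satisfies the SOC and is a minimum.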
Optimization Algorithm
Deterministic: specific rules (using, e.g., the gradient or Hessian) determine the move from one iteration to the next.
Stochastic: probabilistic rules are used for the subsequent iterations.
Optimal Design: engineering design based on an optimization algorithm.
Lagrangian method: the sum of the objective function and a linear combination of the constraints.
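In symbols (the standard form of this construction, added for clarity):

L(x, λ) = f(x) + Σ_i λ_i h_i(x)

where h_i(x) = 0 are the constraints and the λ_i are the Lagrange multipliers.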
Optimization Methods
Deterministic methods:
Direct Search: uses only objective function values to locate the minimum.
Gradient Based: uses first- or second-order derivatives of the objective function. Minimization of f(x) is assumed; a maximization problem is handled by minimizing -f(x).
Single-variable techniques: Newton-Raphson is a gradient-based technique (FOC); Golden Section search is a step-size (interval) reducing iterative method (a minimal sketch follows below).
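A minimal sketch of the golden-section idea (an illustrative implementation, not the textbook's routine; assumes f is unimodal on [a, b]):

function xmin = goldenSection(f, a, b, tol)
% Shrinks the bracket [a, b] by the golden ratio each pass until it is
% narrower than tol, then returns the midpoint as the estimated minimizer.
g = (sqrt(5) - 1)/2;                 % golden-ratio factor, about 0.618
x1 = b - g*(b - a);
x2 = a + g*(b - a);
while (b - a) > tol
    if f(x1) < f(x2)                 % minimum lies in [a, x2]
        b = x2;  x2 = x1;  x1 = b - g*(b - a);
    else                             % minimum lies in [x1, b]
        a = x1;  x1 = x2;  x2 = a + g*(b - a);
    end
end
xmin = (a + b)/2;

For example, goldenSection(@(x) (x - 2).^2, 0, 5, 1e-4) returns approximately 2.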
Multivariable techniques (these make use of the single-variable techniques, especially Golden Section) for unconstrained optimization:
a) Powell's Method: fits a quadratic (degree-2) polynomial to the objective function; non-gradient based.
b) Gradient based: Steepest Descent (FOC) or least-squares minimum (LMS).
c) Hessian based: Conjugate Gradient (FOC) and BFGS (SOC).
Global
Simulated Annealing (SA): based on the minimum-energy principle of the crystalline structure formed as a metal cools.
Genetic Algorithm (GA): based on the survival-of-the-fittest principle of evolutionary theory.
The contours of the J paraboloid shrink as it decreases.

function retval = Example6_1(x)   % example 6.1
retval = 3 + (x(1) - 1.5*x(2))^2 + (x(2) - 2)^2;
>> SteepestDescent('Example6_1', [0.5 0.5], 20, 0.0001, 0, 1, 20)

where:
  [0.5 0.5] - initial guess value
  20        - number of iterations
  0.0001    - golden search tolerance
  0         - initial step size
  1         - step interval
  20        - scanning step

>> ans
  2.7585  1.8960
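SteepestDescent above is the author's own routine; a generic sketch of the same idea (forward-difference gradient and a crude step-halving rule in place of the golden-section line search) could look like this:

f = @(x) 3 + (x(1) - 1.5*x(2))^2 + (x(2) - 2)^2;   % Example 6.1 objective
x = [0.5 0.5];                                     % initial guess
h = 1e-6;                                          % finite-difference step
for k = 1:200
    g = [(f(x + [h 0]) - f(x))/h, (f(x + [0 h]) - f(x))/h];   % approximate gradient
    t = 1;
    while f(x - t*g) > f(x)                        % halve the step until f decreases
        t = t/2;
    end
    x = x - t*g;                                   % move in the steepest-descent direction
end
disp(x)                                            % approaches the analytical minimum [3 2]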
PART II
Presentation Outline
Introduction
Function Optimization
Optimization Toolbox - Routines / Algorithms available
Minimization Problems
Unconstrained
Constrained
Multiobjective Optimization
Optimal PID Control Example
Function Optimization
Optimization concerns the minimization or maximization of functions
Standard Optimization Problem:
min_x f(x)

Subject to:
  h_i(x) = 0              Equality Constraints
  g_j(x) ≤ 0              Inequality Constraints
  x_k^L ≤ x_k ≤ x_k^U     Side Constraints
Where:
  f(x) is the objective function, which measures and evaluates the performance of a system. In a standard problem the function is minimized; maximization is equivalent to minimizing the negative of the objective function.
  x is a column vector of design variables, which affect the performance of the system.
  h_i(x) = 0 are the equality constraints.
  g_j(x) ≤ 0 are the inequality constraints.
  x_k^L ≤ x_k ≤ x_k^U are the side constraints (bounds on the design variables).
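For illustration (an added example), a small two-variable problem written in this standard form is:

  min f(x) = (x1 - 1)^2 + (x2 - 2)^2
  subject to:  h1(x) = x1 + x2 - 2 = 0,   g1(x) = -x1 ≤ 0,   0 ≤ x1, x2 ≤ 5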
Optimization Toolbox
The Optimization Toolbox is a collection of functions that extends the capabilities of MATLAB.
The toolbox includes routines for:
  Unconstrained optimization
  Constrained nonlinear optimization, including goal attainment problems, minimax problems, and semi-infinite minimization problems
  Quadratic and linear programming
  Nonlinear least squares and curve fitting
Minimization Algorithm
Least-Squares Algorithms
Unconstrained Minimization
Consider the problem of finding a set of values x = [x1 x2]^T that solves

  min_x f(x) = e^(x1) * (4x1^2 + 2x2^2 + 4x1*x2 + 2x2 + 1)
Steps:
1. Create an M-file that returns the function value (the objective function); call it objfun.m.
2. Invoke the unconstrained minimization routine fminunc.
Objective function (objfun.m):

function f = objfun(x)
f = exp(x(1))*(4*x(1)^2 + 2*x(2)^2 + 4*x(1)*x(2) + 2*x(2) + 1);

Input arguments (initial guess) and the call returning the output arguments:

x0 = [-1, 1];
[xmin, feval, exitflag, output] = fminunc(@objfun, x0);
Results
xmin =
    0.5000   -1.0000          (minimum point of the design variables)

feval =
    1.3028e-010               (objective function value at the minimum)

exitflag =
    1                         (exitflag tells whether the algorithm converged; exitflag > 0 means a local minimum was found)

output =
       iterations: 7
        funcCount: 40
         stepsize: 1
    firstorderopt: 8.1998e-004
[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,...)

fun       : The objective function (returns the objective function value).
x0        : The initial guess; a vector whose size equals the number of design variables.
options   : Sets some of the optimization parameters (see a few slides later).
P1,P2,... : Additional parameters to pass to the objective function.
[xmin,feval,exitflag,output,grad,hessian] = fminunc(fun,x0,options,P1,P2,...)

xmin     : Vector of the minimum point (optimal point); its size is the number of design variables.
feval    : The objective function value at the optimal point.
exitflag : A value that shows whether the optimization routine terminated successfully (converged if > 0).
output   : A structure that gives more details about the optimization.
grad     : The gradient value at the optimal point.
hessian  : The Hessian value at the optimal point.
options = optimset('param1',value1,'param2',value2,...)
The routines in the Optimization Toolbox have a set of default optimization parameters. However, the toolbox allows you to alter some of those parameters, for example: the tolerance, the step size, the gradient or Hessian values, the maximum number of iterations, etc.
There is also a list of features available, for example: displaying the values at each iteration, comparing a user-supplied gradient or Hessian, etc. You can also choose the algorithm you wish to use.
options = optimset('param1',value1,'param2',value2,...)
Type help optimset in the command window to display the list of available option settings. How do you read them? For example:
LargeScale - Use large-scale algorithm if possible [ {on} | off ]

Here LargeScale is the parameter name (param1), the allowed values are listed in brackets, and the default value is shown in braces: {on}.
Since the default is on, if we would like to turn it off, we just type:
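options = optimset('LargeScale','off');   % turn the large-scale algorithm off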
Display - Level of display [ off | iter | notify | final ]
MaxIter - Maximum number of iterations allowed [ positive integer ]
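For example, a usage sketch (the specific values here are only illustrative) combining these settings with the earlier objfun example:

options = optimset('Display','iter','MaxIter',100);
[xmin, feval] = fminunc(@objfun, [-1, 1], options);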
This function may only give local solutions. fminsearch is generally less efficient than fminunc for problems of order greater than two; however, when the problem is highly discontinuous, fminsearch may be more robust.
fminsearch is a direct search method that does not use numerical or analytic gradients, unlike fminunc. This function may only give local solutions.
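A usage sketch of fminsearch on the same objective (assuming the objfun.m file from the earlier example):

[xmin, feval] = fminsearch(@objfun, [-1, 1]);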
Constrained Minimization
lambda: vector of Lagrange multipliers at the optimal point.
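The multipliers are returned as an additional output of fmincon, for example:

[x, feval, exitflag, output, lambda] = fmincon(fun, x0, A, B, Aeq, Beq, LB, UB, nonlcon);
% lambda is a structure of Lagrange multipliers (fields include lower, upper, ineqlin, eqlin, ineqnonlin, eqnonlin)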
Example
function f = myfun(x)
f = -x(1)*x(2)*x(3);   % minimize the negative product to maximize x1*x2*x3
min f(x) = -x1*x2*x3

Subject to:
  0 ≤ x1 + 2x2 + 2x3 ≤ 72
  2x1^2 + x2 ≤ 0
  0 ≤ x1, x2, x3 ≤ 30
A = [-1 -2 -2; 1 2 2],   B = [0; 72],   LB = [0; 0; 0],   UB = [30; 30; 30]
Example (Cont.)
For the nonlinear inequality constraint 2x1^2 + x2 ≤ 0, create a function called nonlcon which returns the two constraint vectors [C, Ceq]:

function [C,Ceq] = nonlcon(x)
C = 2*x(1)^2 + x(2);   % nonlinear inequality, enforced as C(x) <= 0
Ceq = [];              % no nonlinear equality constraints
Example (Cont.)
A = [-1 -2 -2; 1 2 2],   B = [0; 72],   LB = [0; 0; 0],   UB = [30; 30; 30]

In MATLAB:

A = [-1 -2 -2; 1 2 2];
B = [0 72]';
LB = [0 0 0]'; UB = [30 30 30]';
[x,feval]=fmincon(@myfun,x0,A,B,[],[],LB,UB,@nonlcon)
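A complete call might look as follows; the initial point x0 is not given in the example, so the value below is only an assumed starting guess inside the bounds:

x0 = [10; 10; 10];   % assumed initial guess (illustrative only)
[x, feval] = fmincon(@myfun, x0, A, B, [], [], LB, UB, @nonlcon)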
CAREFUL!!!
fmincon(fun,x0,A,B,Aeq,Beq,LB,UB,NONLCON,options,P1,P2,...)
Example (Cont.)
Warning: Large-scale (trust region) method does not currently solve this type of problem, switching to medium-scale (line search). > In D:\Programs\MATLAB6p1\toolbox\optim\fmincon.m at line 213
In D:\usr\CHINTANG\OptToolbox\min_con.m at line 6
Optimization terminated successfully:
 Magnitude of directional derivative in search direction less than 2*options.TolFun
 and maximum constraint violation is less than options.TolCon
Active Constraints:
     2
     9
x =
   0.00050378663220   0.00000000000000  30.00000000000000
feval =
  -4.657237250542452e-035

The nine constraints (Const. 1-9) are: the two linear inequalities -x1 - 2x2 - 2x3 ≤ 0 and x1 + 2x2 + 2x3 ≤ 72, the six side constraints 0 ≤ x1 ≤ 30, 0 ≤ x2 ≤ 30, 0 ≤ x3 ≤ 30, and the nonlinear inequality 2x1^2 + x2 ≤ 0.
Argument sequence: A, B, Aeq, Beq, LB, UB, then the nonlinear constraints (C, Ceq) via NONLCON.