Classical Optimization Techniques
OPTIMIZATION TECHNIQUES
Chapter 1
Design constraints
- Behavioral constraints: restrictions on the design variables (e.g., power cannot be negative)
- Geometric constraints: constraints due to geometry

Problem statement:
Find x which minimizes f(x)
Subject to:
gi(x) ≤ 0 for i = 1, 2, ..., h (inequality constraints)
li(x) = 0 for i = h+1, ..., m (equality constraints)
Constraint surface
The set of values which satisfy a single constraint; the plot of gi(x) = 0.
Four possible points:
- Free and acceptable
- Free and unacceptable
- Bound and acceptable
- Bound and unacceptable
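The four cases can be expressed as a small Python sketch (a hypothetical helper, not from the slides, assuming each constraint is given by its value gi(x) at the point, with gi(x) ≤ 0 meaning satisfied):

```python
# Classify a design point against inequality constraints g_i(x) <= 0.
# "Bound" means the point lies on at least one constraint surface (g_i = 0);
# "acceptable" means it violates no constraint.
def classify(g_values, tol=1e-9):
    acceptable = all(g <= tol for g in g_values)
    bound = any(abs(g) <= tol for g in g_values)
    kind = "bound" if bound else "free"
    return f"{kind} and {'acceptable' if acceptable else 'unacceptable'}"

print(classify([-1.0, -0.5]))   # free and acceptable
print(classify([0.0, -2.0]))    # bound and acceptable
print(classify([0.0, 1.0]))     # bound and unacceptable
print(classify([2.0, -1.0]))    # free and unacceptable
```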
Example
Draw the constraint surface for the problem of minimizing
f(x) = 9.82 x1 x2 + 2 x1
Subject to:
2500/(x1 x2) − 500 ≤ 0
2 ≤ x1 ≤ 14
(Figure: constraint surfaces and objective function contours in the x1–x2 plane.)
Classification of optimization problems
1) Based on existence of constraints
- Constrained optimization
  Formulation: Find X which minimizes F(X)
  subject to gj(X) ≤ 0
- Unconstrained optimization
  Formulation: Min F(X) (no constraints)
Classification cont…
2) Based on nature of design variables
- Static: design variables are simple variables
- Dynamic: design variables are functions of other variables
Classification cont…
4)Based on
expression of
Geometric programming – objective
objective function function and/or constraint are
or constrains
expressed as power terms
Quadratic programming
Special case of NLP
Objective function is quadratic form
Classical Optimization Techniques
- Used for continuous and differentiable functions
- Make use of differential calculus
- Disadvantage: not applicable when functions are discontinuous or non-differentiable
Theorem 2
Example
f(x) = 12x^5 − 45x^4 + 40x^3 + 5

Soln.
f'(x) = 60x^4 − 180x^3 + 120x^2 = 60x^2 (x − 1)(x − 2)
The extreme points are x = 0, 1 and 2.
f''(x) = 240x^3 − 540x^2 + 240x
x = 0 is an inflection point (f''(0) = 0), x = 1 is a local maximum (f''(1) = −60 < 0) and x = 2 is a local minimum (f''(2) = 240 > 0).
(Figure: plot of f(x) for −1 ≤ x ≤ 3.)
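The classification above can be checked numerically; a minimal Python sketch using the hand-computed derivatives of f (no external libraries):

```python
# Classify the stationary points of f(x) = 12x^5 - 45x^4 + 40x^3 + 5
# using its first and second derivatives.

def f(x):
    return 12*x**5 - 45*x**4 + 40*x**3 + 5

def f1(x):  # f'(x) = 60x^2 (x - 1)(x - 2)
    return 60*x**4 - 180*x**3 + 120*x**2

def f2(x):  # f''(x)
    return 240*x**3 - 540*x**2 + 240*x

for x in (0, 1, 2):
    assert f1(x) == 0                      # each is a stationary point
    if f2(x) > 0:
        kind = "local minimum"
    elif f2(x) < 0:
        kind = "local maximum"
    else:
        kind = "inflection candidate (f'' = 0)"
    print(x, f(x), kind)
# prints:
# 0 5 inflection candidate (f'' = 0)
# 1 12 local maximum
# 2 -11 local minimum
```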
Multivariable optimization
Without constraints and with constraints
The unconstrained case has optimality conditions similar to the single-variable case.
Theorem 3
Theorem 4
Example: find the extreme points of a function of two variables.

Soln.
1. Evaluate the first partial derivatives and set them to zero to find the stationary points.
2. Check the Hessian matrix by determining the second derivatives and its determinants (leading principal minors).
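As an illustration of this procedure, a Python sketch for a hypothetical function f(x, y) = x^3 + y^3 − 3xy (not from the slides), whose stationary points (0, 0) and (1, 1) are classified with the Hessian determinant test:

```python
# Hypothetical example: f(x, y) = x^3 + y^3 - 3xy.
# Gradient: (3x^2 - 3y, 3y^2 - 3x); Hessian: [[6x, -3], [-3, 6y]].

def grad(x, y):
    return (3*x**2 - 3*y, 3*y**2 - 3*x)

def hessian(x, y):
    return [[6*x, -3], [-3, 6*y]]

for (x, y) in [(0, 0), (1, 1)]:
    gx, gy = grad(x, y)
    assert gx == 0 and gy == 0            # stationary point
    H = hessian(x, y)
    det = H[0][0]*H[1][1] - H[0][1]*H[1][0]
    if det > 0 and H[0][0] > 0:
        kind = "local minimum"
    elif det > 0:
        kind = "local maximum"
    elif det < 0:
        kind = "saddle point"
    else:
        kind = "inconclusive"
    print((x, y), kind)
# prints:
# (0, 0) saddle point
# (1, 1) local minimum
```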
Multivariable optimization with equality constraints

Problem formulation
Find X which minimizes f(X)
Subject to gj(X) = 0, j = 1, 2, ..., m

Constrained variation (two variables, one constraint):
Minimize f(x1, x2) subject to g(x1, x2) = 0.
At a minimum, df = (∂f/∂x1) dx1 + (∂f/∂x2) dx2 = 0.
Rewriting the equation dg = (∂g/∂x1) dx1 + (∂g/∂x2) dx2 = 0 as dx2 = −[(∂g/∂x1)/(∂g/∂x2)] dx1 and substituting gives the necessary condition
(∂f/∂x1)(∂g/∂x2) − (∂f/∂x2)(∂g/∂x1) = 0.
Method of Lagrange multipliers

Problem formulation
Minimize f(X)
s.t. gj(X) = 0, j = 1, 2, ..., m

Procedure:
A function L (the Lagrange function) can be formed as
L(X, λ1, ..., λm) = f(X) + Σ λj gj(X)
The necessary conditions are ∂L/∂xi = 0 (i = 1, ..., n) and ∂L/∂λj = 0 (j = 1, ..., m), the latter recovering the constraints.
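For a hypothetical instance (not from the slides): minimize f(x, y) = x^2 + y^2 subject to x + y = 1. Stationarity of L = x^2 + y^2 + λ(x + y − 1) gives the linear system 2x + λ = 0, 2y + λ = 0, x + y = 1, which a short Python sketch can solve by Gaussian elimination:

```python
# Solve the 3x3 stationarity system of the Lagrange function:
#   2x + lam = 0
#   2y + lam = 0
#   x + y    = 1
def solve3(A, b):
    # Gaussian elimination with partial pivoting on a 3x3 system.
    n = 3
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

A = [[2, 0, 1], [0, 2, 1], [1, 1, 0]]
b = [0, 0, 1]
x, y, lam = solve3(A, b)
print(x, y, lam)   # x = y = 0.5, lam = -1.0
```

The minimum is at x = y = 1/2 with multiplier λ = −1, as symmetry suggests.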
Linear programming
History Problem statement
George B. Dantzing
1947, simplex method Subject to the constraint
Kuhn and Tucker –
duality theory
Charles and Cooper -
industrial application
Properties of LP
The objective function is Transformations
minimization type If problem is maximization
use –f
All constraints are linear If there is negative
equality type variable, write it as
difference of two
All decision variables
If constraint is inequality,
are nonnegative add slack or surplus
variables
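These transformations can be sketched in Python (a hypothetical helper, not from the slides): convert a maximization with ≤ constraints into the standard minimization-with-equalities form by negating the objective and appending slack variables.

```python
# Sketch: convert  max c^T x  s.t.  A x <= b, x >= 0
# into  min c'^T x'  s.t.  A' x' = b, x' >= 0
# by negating c and appending one slack column per constraint.
def to_standard_form(c, A, b):
    m = len(A)
    c_std = [-ci for ci in c] + [0.0] * m           # max -> min, slacks cost 0
    A_std = [row[:] + [1.0 if i == j else 0.0 for j in range(m)]
             for i, row in enumerate(A)]            # identity slack columns
    return c_std, A_std, b[:]

c_std, A_std, b_std = to_standard_form([5, 2], [[3, 4], [1, -1]], [24, 3])
print(c_std)   # [-5, -2, 0.0, 0.0]
print(A_std)   # [[3, 4, 1.0, 0.0], [1, -1, 0.0, 1.0]]
```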
Simplex algorithm
The objective of the simplex algorithm is to find a vector X ≥ 0 which minimizes f(X) and satisfies the equality constraints AX = b.

Algorithm:
1. Convert the system of equations to canonical form
2. Identify the basic solution
3. Test optimality and stop if optimum
4. If not, select the next pivot element and rewrite the equations
5. Go to step 2
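The steps above can be sketched as a minimal dense-tableau simplex in Python (the chapter's own code is MATLAB; Python is used here so the sketch is self-contained). It assumes the problem is already in canonical form with the last m columns of A forming an identity (a slack basis) and b ≥ 0; unbounded and degenerate cases are not handled. The data in the usage line are the first two constraints of the worked example that follows.

```python
# Minimal tableau simplex for: min c^T x  s.t.  A x = b, x >= 0,
# assuming the last m columns of A are an identity (slack basis) and b >= 0.
def simplex(c, A, b):
    m, n = len(A), len(A[0])
    T = [A[i][:] + [b[i]] for i in range(m)]     # constraint rows
    T.append(c[:] + [0.0])                       # reduced-cost row
    basis = list(range(n - m, n))                # slack columns start basic
    for i, bi in enumerate(basis):               # zero reduced costs of basis
        f = T[m][bi]
        if f:
            T[m] = [T[m][j] - f * T[i][j] for j in range(n + 1)]
    while True:
        # Step 3: optimality test -- stop when all reduced costs >= 0.
        piv_col = min(range(n), key=lambda j: T[m][j])
        if T[m][piv_col] >= -1e-9:
            break
        # Step 4: ratio test picks the leaving row (bounded problem assumed).
        rows = [i for i in range(m) if T[i][piv_col] > 1e-9]
        piv_row = min(rows, key=lambda i: T[i][n] / T[i][piv_col])
        # Pivot: rewrite the equations in canonical form for the new basis.
        p = T[piv_row][piv_col]
        T[piv_row] = [v / p for v in T[piv_row]]
        for i in range(m + 1):
            if i != piv_row and T[i][piv_col] != 0:
                f = T[i][piv_col]
                T[i] = [T[i][j] - f * T[piv_row][j] for j in range(n + 1)]
        basis[piv_row] = piv_col                 # step 5: repeat
    x = [0.0] * n
    for i, bi in enumerate(basis):
        x[bi] = T[i][n]
    return x, -T[m][n]                           # optimal x, objective value

# Maximize 5x1 + 2x2 s.t. 3x1 + 4x2 <= 24, x1 - x2 <= 3 (slacks appended)
# by minimizing the negated objective.
x, val = simplex([-5, -2, 0, 0], [[3, 4, 1, 0], [1, -1, 0, 1]], [24, 3])
print(x, val)   # x1 = 36/7, x2 = 15/7, val = -30 (i.e. the maximum is 30)
```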
Simplex algorithm
Example: Maximize f = 5x1 + 2x2
Subject to:
3x1 + 4x2 ≤ 24
x1 − x2 ≤ 3
x1 + 4x2 ≥ 4
3x1 + x2 ≥ 3
x1, x2 ≥ 0

Step 1. Convert to canonical form: multiply each ≥ constraint by −1 so that all inequalities are of ≤ type.

Soln. (MATLAB)
1. Form the matrices containing the coefficients of the objective function, the constraint equations and the constants separately:
f=[5 2];
A=[3 4;1 -1;-1 -4;-3 -1];
b=[24;3;-4;-3];
lb=zeros(2,1);
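For comparison, the same example can be solved in Python, assuming SciPy is available. Like MATLAB's linprog, scipy.optimize.linprog minimizes, so the objective coefficients are negated to maximize 5x1 + 2x2:

```python
# Solve the example LP with SciPy; negate c because linprog minimizes.
from scipy.optimize import linprog

c = [-5, -2]                              # maximize 5x1 + 2x2
A = [[3, 4], [1, -1], [-1, -4], [-3, -1]]  # all constraints as A x <= b
b = [24, 3, -4, -3]
res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
print(res.x, -res.fun)   # optimum at x1 = 36/7, x2 = 15/7, maximum = 30
```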