ECEG-6311
Power System Optimization
and AI
Lecture 3:
Classical Optimization Techniques
Contd..
Yoseph Mekonnen (Ph.D.)
Page 1
Outlines
Multivariable Optimization with No Constraints
Multivariable Optimization with Equality Constraints
Direct substitution
Constrained variation
Lagrange multipliers
..Contd..
Necessary Condition
If f(X) has an extreme point (maximum or minimum) at X = X∗ and if the first partial derivatives of f(X) exist at X∗, then:
∂f/∂x1 (X∗) = ∂f/∂x2 (X∗) = · · · = ∂f/∂xn (X∗) = 0
Sufficient Condition
A sufficient condition for a stationary point X∗ to be an extreme point is that the matrix of second partial derivatives (Hessian matrix) of f(X) evaluated at X∗ is
(i) positive definite, in which case X∗ is a relative minimum point;
(ii) negative definite, in which case X∗ is a relative maximum point.
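These two conditions can be checked numerically. As a minimal sketch, the function below is illustrative (it is not from the slides): its gradient vanishes at the candidate point, and its Hessian is positive definite there, so the point is a relative minimum.

```python
import numpy as np

# Hypothetical example (not from the slides): f(x1, x2) = x1^2 + x2^2 - 2*x1
def grad(x):
    # first partial derivatives of f
    return np.array([2 * x[0] - 2, 2 * x[1]])

def hessian(x):
    # matrix of second partial derivatives (constant for this quadratic f)
    return np.array([[2.0, 0.0], [0.0, 2.0]])

x_star = np.array([1.0, 0.0])                 # candidate extreme point
assert np.allclose(grad(x_star), 0.0)         # necessary condition: gradient vanishes

eigvals = np.linalg.eigvalsh(hessian(x_star))
print(np.all(eigvals > 0))                    # True -> positive definite -> relative minimum
```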
..Contd..
A matrix A will be positive definite if all its eigenvalues are positive; that is, all the values of λ that satisfy the determinantal equation
|A − λI| = 0
should be positive. Similarly, the matrix A will be negative definite if all its eigenvalues are negative.
Another test for the positive definiteness of a matrix A of order n involves evaluating the leading principal minors
A1 = a11,  A2 = det [a11 a12; a21 a22],  . . . ,  An = det A
..Contd..
The matrix A will be positive definite if and only if all the minors A1, A2, A3, . . . , An are positive.
The matrix A will be negative definite if and only if the sign of Aj is (−1)^j for j = 1, 2, . . . , n; that is, the minors alternate in sign, starting negative.
If some of the Aj are positive and the remaining Aj are zero, the matrix A will be positive semidefinite.
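Both tests are easy to carry out in code. The sketch below applies them to an illustrative symmetric matrix (chosen here for demonstration; it is not from the slides) and shows that they agree.

```python
import numpy as np

# Two equivalent positive-definiteness tests for a symmetric matrix A
# (example matrix chosen for illustration; it is not from the slides).
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])

# Test 1: all eigenvalues (roots of |A - lambda*I| = 0) are positive
eig_test = bool(np.all(np.linalg.eigvalsh(A) > 0))

# Test 2: all leading principal minors A1, A2, ..., An are positive
minors = [np.linalg.det(A[:k, :k]) for k in range(1, A.shape[0] + 1)]
minor_test = all(m > 0 for m in minors)

print(eig_test, minor_test)  # True True
```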
..Contd..
Example: Equilibrium of a mechanical system. A system with potential energy U(x1, x2) is in equilibrium when U is at a minimum.
Necessary Condition
The values of x1 and x2 corresponding to the equilibrium state satisfy
∂U/∂x1 = 0,   ∂U/∂x2 = 0
..Contd..
The sufficiency condition for the minimum at (x1∗, x2∗) can be verified by testing the positive definiteness of the Hessian matrix J of U evaluated at (x1∗, x2∗); that is, the determinants of the square submatrices of J (its leading principal minors) must all be positive.
Reading Assignment
Semidefinite Case
Saddle Point
MULTIVARIABLE OPTIMIZATION WITH
EQUALITY CONSTRAINTS
Definition
Minimize f(X) subject to gj(X) = 0, j = 1, 2, . . . , m, where X = (x1, x2, . . . , xn)ᵀ.
Here m is less than or equal to n; otherwise (if m > n), the problem becomes overdefined and, in general, there will be no solution.
Solution by Direct Substitution
For a problem with n variables and m equality constraints,
it is theoretically possible to solve simultaneously the m
equality constraints and express any set of m variables in
terms of the remaining n − m variables.
When these expressions are substituted into the original
objective function, there results a new objective function
involving only n − m variables.
The new objective function is not subject to any constraint, and hence its optimum can be found by using the unconstrained optimization techniques discussed earlier.
In short, direct substitution converts a constrained optimization problem into an unconstrained one.
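The substitution step can be sketched on a small illustrative problem (the function and constraint below are hypothetical, not the slides' example): one equality constraint eliminates one variable, leaving an unconstrained function of the remaining variable.

```python
# Direct substitution on a hypothetical problem (not the slides' example):
# minimize f(x1, x2) = x1^2 + x2^2  subject to  x1 + x2 = 1.
# The constraint gives x2 = 1 - x1; substituting leaves an unconstrained
# function of the single remaining variable x1.
def f_reduced(x1):
    x2 = 1.0 - x1              # eliminate x2 via the equality constraint
    return x1**2 + x2**2

# Unconstrained necessary condition: d/dx1 [x1^2 + (1 - x1)^2] = 4*x1 - 2 = 0
x1_star = 0.5
x2_star = 1.0 - x1_star
print(x1_star, x2_star, f_reduced(x1_star))  # 0.5 0.5 0.5
```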
..Contd..
Example
Maximize f(x1, x2, x3) = 8 x1 x2 x3 (the volume of a box, with sides 2x1, 2x2, 2x3, inscribed in the unit sphere)
Subject to g(x1, x2, x3) = x1² + x2² + x3² − 1 = 0
This problem has three design variables and one equality constraint, so the equality constraint can be used to eliminate any one of the design variables from the objective function. If we choose to eliminate x3, the constraint gives
x3 = (1 − x1² − x2²)^(1/2)
..Contd..
Thus the objective function becomes
f(x1, x2) = 8 x1 x2 (1 − x1² − x2²)^(1/2)
Now it can be maximized as an unconstrained function of two variables.
Necessary Condition
∂f/∂x1 = 8 x2 [(1 − x1² − x2²)^(1/2) − x1²/(1 − x1² − x2²)^(1/2)] = 0
∂f/∂x2 = 8 x1 [(1 − x1² − x2²)^(1/2) − x2²/(1 − x1² − x2²)^(1/2)] = 0
which, for x1 ≠ 0 and x2 ≠ 0, can be simplified to obtain
1 − 2x1² − x2² = 0,   1 − x1² − 2x2² = 0
..Contd..
From which it follows that x1∗ = x2∗ = 1/√3 and hence x3∗ = 1/√3. This solution gives the maximum volume of the box as:
f∗ = 8 x1∗ x2∗ x3∗ = 8/(3√3) ≈ 1.54
To find whether the solution found corresponds to a maximum or a minimum, we apply the sufficiency conditions to f(x1, x2); that is, we have to find the Hessian matrix of f.
..Contd..
Hessian Matrix
Evaluated at (x1∗, x2∗) = (1/√3, 1/√3), the second partial derivatives are
∂²f/∂x1² = ∂²f/∂x2² = −32/√3,   ∂²f/∂x1∂x2 = −16/√3
The leading principal minors are −32/√3 < 0 and (−32/√3)² − (−16/√3)² = 256 > 0, so the Hessian matrix of f is negative definite at (x1∗, x2∗). Hence the point (x1∗, x2∗) corresponds to the maximum of f.
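The result can be spot-checked numerically. The sketch below assumes the inscribed-box reading of this example (maximize f = 8·x1·x2·x3 subject to x1² + x2² + x3² = 1, with x3 eliminated by direct substitution) and confirms that the candidate point beats nearby points, as a maximum should.

```python
import numpy as np

# Numerical check of a box-volume example of this form (assumed here:
# maximize f = 8*x1*x2*x3 subject to x1^2 + x2^2 + x3^2 = 1, with x3
# eliminated by direct substitution).
def f(x1, x2):
    return 8.0 * x1 * x2 * np.sqrt(1.0 - x1**2 - x2**2)

x_star = 1.0 / np.sqrt(3.0)
best = f(x_star, x_star)       # = 8 / (3*sqrt(3)) ~ 1.54

# the candidate point beats small perturbations around it (a maximum)
for dx in (-1e-3, 1e-3):
    assert f(x_star + dx, x_star) < best
    assert f(x_star, x_star + dx) < best

print(round(best, 4))
```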
Solution by the Method of Lagrange Multipliers
Problem with Two Variables and One Constraint
Consider the problem:
Minimize f(x1, x2) subject to g(x1, x2) = 0
A necessary condition for f to have a minimum at some point (x1∗, x2∗) is that the total derivative of f(x1, x2) with respect to x1 must be zero at (x1∗, x2∗). By setting the total differential of f(x1, x2) equal to zero, we obtain
df = (∂f/∂x1) dx1 + (∂f/∂x2) dx2 = 0
..Contd..
Since g(x1∗, x2∗) = 0 at the minimum point, any variations dx1 and dx2 taken about the point (x1∗, x2∗) are called admissible variations provided that the new point lies on the constraint:
g(x1∗ + dx1, x2∗ + dx2) = 0
The Taylor series expansion of this function about the point (x1∗, x2∗) gives
g(x1∗ + dx1, x2∗ + dx2) ≈ g(x1∗, x2∗) + (∂g/∂x1) dx1 + (∂g/∂x2) dx2 = 0
where dx1 and dx2 are assumed to be small. Since g(x1∗, x2∗) = 0, this reduces to
(∂g/∂x1) dx1 + (∂g/∂x2) dx2 = 0
..Contd..
Assuming that ∂g/∂x2 ≠ 0, this can be rewritten as:
dx2 = − (∂g/∂x1)/(∂g/∂x2) dx1
This relation indicates that once the variation in x1 (dx1) is chosen arbitrarily, the variation in x2 (dx2) is decided automatically in order for dx1 and dx2 to be a set of admissible variations. Substituting this expression into df = 0 gives:
..Contd..
df = (∂f/∂x1 − (∂g/∂x1)/(∂g/∂x2) · ∂f/∂x2) dx1 = 0
The expression in parentheses is called the constrained variation of f. Note that this equation has to be satisfied for all values of dx1. Since dx1 can be chosen arbitrarily, it leads to
∂f/∂x1 − (∂g/∂x1)/(∂g/∂x2) · ∂f/∂x2 = 0   at (x1∗, x2∗)
This equation represents a necessary condition in order to have (x1∗, x2∗) as an extreme point (minimum or maximum).
..Contd..
By defining a quantity λ, called the Lagrange multiplier, as
λ = − (∂f/∂x2)/(∂g/∂x2)   evaluated at (x1∗, x2∗)
the necessary condition equation becomes
∂f/∂x1 + λ ∂g/∂x1 = 0   at (x1∗, x2∗)
and the definition of λ can also be written as
∂f/∂x2 + λ ∂g/∂x2 = 0   at (x1∗, x2∗)
..Contd..
In addition, the constraint equation has to be satisfied at the extreme point, that is,
g(x1∗, x2∗) = 0
Thus these three equations represent the necessary conditions for the point (x1∗, x2∗) to be an extreme point.
..Contd..
Notice that the partial derivative ∂g/∂x2 evaluated at (x1∗, x2∗) has to be nonzero to be able to define λ by:
λ = − (∂f/∂x2)/(∂g/∂x2)
This is because the variation dx2 was expressed in terms of dx1 in the derivation of:
dx2 = − (∂g/∂x1)/(∂g/∂x2) dx1
On the other hand, if we choose to express dx1 in terms of dx2, we would obtain the requirement that ∂g/∂x1 evaluated at (x1∗, x2∗) be nonzero to define λ. Thus the derivation of the necessary conditions by the method of Lagrange multipliers requires that at least one of the partial derivatives of g(x1, x2) be nonzero at an extreme point.
..Contd..
The preceding expressions can be combined into a single function of three variables, called the Lagrange function:
L(x1, x2, λ) = f(x1, x2) + λ g(x1, x2)
By treating L as a function of the three variables x1, x2, and λ, the necessary conditions for its extremum are given by
∂L/∂x1 = ∂f/∂x1 + λ ∂g/∂x1 = 0
∂L/∂x2 = ∂f/∂x2 + λ ∂g/∂x2 = 0
∂L/∂λ = g(x1, x2) = 0
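For a quadratic objective and a linear constraint, the three necessary conditions form a linear system in (x1, x2, λ) that can be solved directly. The sketch below uses a hypothetical problem (not the slides' example) to illustrate this.

```python
import numpy as np

# Stationarity of the Lagrange function on a hypothetical problem
# (not the slides' example): minimize f = x1^2 + x2^2
# subject to g = x1 + x2 - 2 = 0.
#
# dL/dx1     = 2*x1 + lambda      = 0
# dL/dx2     = 2*x2 + lambda      = 0
# dL/dlambda = x1 + x2 - 2        = 0
A = np.array([[2.0, 0.0, 1.0],
              [0.0, 2.0, 1.0],
              [1.0, 1.0, 0.0]])
b = np.array([0.0, 0.0, 2.0])

x1, x2, lam = np.linalg.solve(A, b)
print(x1, x2, lam)  # solution: x1 = x2 = 1, lambda = -2
```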
..Contd..
Sufficient Condition (General Case)
A sufficient condition for f to have a relative minimum (maximum) at (x1∗, x2∗) is that each root z of the following determinantal equation be positive (negative):

| L11 − z   L12      g1 |
| L21       L22 − z  g2 |  = 0
| g1        g2       0  |

where Lij = ∂²L/∂xi∂xj and gi = ∂g/∂xi, all evaluated at (x1∗, x2∗) with λ = λ∗.
..Contd..
Example
Find the dimensions of a cylindrical tin (with top and bottom) that maximize its volume, given that its total surface area is A0. With x1 the radius and x2 the height:
Maximize f(x1, x2) = π x1² x2
Subject to g(x1, x2) = 2π x1² + 2π x1 x2 − A0 = 0
The Lagrange function is:
L(x1, x2, λ) = π x1² x2 + λ (2π x1² + 2π x1 x2 − A0)
Necessary Condition
∂L/∂x1 = 2π x1 x2 + λ (4π x1 + 2π x2) = 0
∂L/∂x2 = π x1² + 2π λ x1 = 0   ⟹   λ = −x1/2
∂L/∂λ = 2π x1² + 2π x1 x2 − A0 = 0
Gives
x1∗ = (A0/6π)^(1/2),   x2∗ = 2 x1∗,   λ∗ = −x1∗/2
The Maximum Becomes
f∗ = π x1∗² x2∗ = (A0/3)(A0/6π)^(1/2)
If A0 = 24π, the optimum solution becomes
x1∗ = 2,   x2∗ = 4,   λ∗ = −1,   f∗ = 16π
..Contd..
Sufficient Condition
Substituting the values of Lij and gi evaluated at (x1∗, x2∗) with λ = λ∗ into the determinantal equation yields a single root z, which turns out to be negative. Since the value of z is negative, the point (x1∗, x2∗) corresponds to the maximum of f.
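The conclusion can be cross-checked numerically. The sketch below assumes the cylindrical-tin reading of this example (maximize volume π·x1²·x2 with total surface area 2π·x1² + 2π·x1·x2 = A0, A0 = 24π); substituting the constraint gives x2 = 12/x1 − x1 and a one-variable volume function, whose maximum should sit at the candidate radius.

```python
import math

# Check of a cylindrical-tin example of this form (assumed here: maximize
# volume pi*x1^2*x2 with total surface area 2*pi*x1^2 + 2*pi*x1*x2 = A0,
# A0 = 24*pi). The constraint gives x2 = 12/x1 - x1, so the volume reduces
# to the one-variable function V(x1) = pi*(12*x1 - x1**3).
def volume(x1):
    return math.pi * (12.0 * x1 - x1**3)

x1_star = 2.0
x2_star = 12.0 / x1_star - x1_star     # = 4
v_star = volume(x1_star)               # = 16*pi

# the candidate beats nearby feasible radii, as expected at a maximum
assert volume(1.9) < v_star and volume(2.1) < v_star
print(x2_star, round(v_star / math.pi, 6))  # 4.0 16.0
```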
Reading Assignment
Multivariable Optimization With Inequality Constraints
Thank You!