ECEG-6311 Power System Optimization and AI

This document outlines classical optimization techniques, including single variable and multivariable optimization with no constraints. For single variable optimization, it discusses necessary conditions where the derivative must equal zero at a relative minimum. For multivariable optimization, it states the necessary condition that the partial derivatives must equal zero at an extreme point, and gives sufficient conditions involving the Hessian matrix being positive or negative definite for a minimum or maximum. Examples are provided to illustrate the concepts.


ECEG-6311: Power System Optimization and AI
Lecture 2: Classical Optimization Techniques

Yoseph Mekonnen (Ph.D.)

Page 1
Outline
Single-Variable Optimization
Multivariable Optimization with No Constraints

Page 2
Introduction
The classical methods of optimization are useful in finding the optimum solution of continuous and differentiable functions.
These methods are analytical and make use of the techniques of differential calculus to locate the optimum points.
Since some practical problems involve objective functions that are not continuous and/or differentiable, the classical optimization techniques have limited scope in practical applications.
There are necessary and sufficient conditions for locating the optimum solution.

Page 3
SINGLE-VARIABLE OPTIMIZATION
A function of one variable f(x) is said to have a relative or local minimum at x = x∗ if f(x∗) ≤ f(x∗ + h) for all sufficiently small positive and negative values of h.
Similarly, a point x∗ is called a relative or local maximum if f(x∗) ≥ f(x∗ + h) for all values of h sufficiently close to zero.
A function f(x) is said to have a global or absolute minimum at x∗ if f(x∗) ≤ f(x) for all x in the domain over which f(x) is defined, and not just for all x close to x∗.
Similarly, a point x∗ will be a global maximum of f(x) if f(x∗) ≥ f(x) for all x in the domain.

Page 4
…Contd..
Relative and Global Minima

Page 5
Single-Variable Optimization Problem
A single-variable optimization problem is one in which the
value of x = x∗ is to be found in the interval [a, b] such that
x∗ minimizes f (x).

MINIMIZATION

Page 6
Conditions
Theorem 1: Necessary Condition
If a function f(x) is defined in the interval a ≤ x ≤ b and has a relative minimum at x = x∗, where a < x∗ < b, and if the derivative df(x)/dx = f′(x) exists as a finite number at x = x∗, then f′(x∗) = 0.
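Theorem 1 can be illustrated symbolically. The following is a minimal sketch using SymPy, with an assumed example function (not one from these slides): the stationary point is found by solving f′(x) = 0.

```python
import sympy as sp

x = sp.symbols('x')
f = (x - 2)**2 + 1            # assumed illustrative function, not from the slides

fprime = sp.diff(f, x)        # f'(x) = 2(x - 2)
stationary = sp.solve(sp.Eq(fprime, 0), x)   # points where f'(x*) = 0
print(stationary)             # [2]
```

Here f′ vanishes only at x∗ = 2, the relative (and global) minimum of this convex example.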

Page 7
…Contd..
Important Points
The theorem also holds if x∗ is a relative maximum.
The theorem does not say what happens if a minimum or
maximum occurs at a point x∗ where the derivative fails to
exist.
The theorem does not say what happens if a minimum or
maximum occurs at an endpoint of the interval of definition
of the function.
The theorem does not say that the function necessarily
will have a minimum or maximum at every point where the
derivative is zero.
In general, a point x∗ at which f′(x∗)=0 is called a
stationary point.

Page 8
…Contd..
Example:

The derivative f′(x) = 0 at x = 0 for the function shown in the figure.
However, this point is neither a minimum nor a maximum.

Page 9
…Contd..
If the function f(x) possesses continuous derivatives of every order in question in the neighborhood of x = x∗, a sufficient condition is needed to decide whether the stationary point is a minimum or a maximum of the function.

Theorem 2:Sufficient Condition

Let f′(x∗) = f′′(x∗) = · · · = f(n−1)(x∗) = 0, but f(n)(x∗) ≠ 0. Then f(x∗) is:
(i) a minimum value of f(x) if f(n)(x∗) > 0 and n is even;
(ii) a maximum value of f(x) if f(n)(x∗) < 0 and n is even;
(iii) neither a maximum nor a minimum if n is odd.
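Theorem 2 translates directly into a small procedure: keep differentiating until the first non-vanishing derivative at x∗, then inspect its order and sign. A sketch with SymPy, using assumed test functions (x⁴ and x³ are not examples from the slides, just standard illustrations):

```python
import sympy as sp

x = sp.symbols('x')

def classify(f, x0, max_order=8):
    """Theorem 2: find the first non-vanishing derivative f^(n)(x0)."""
    for n in range(1, max_order + 1):
        dn = sp.diff(f, x, n).subs(x, x0)
        if dn != 0:
            if n % 2 == 1:
                return 'neither (inflection)'          # n odd
            return 'minimum' if dn > 0 else 'maximum'  # n even
    return 'inconclusive'

print(classify(x**4, 0))   # f''''(0) = 24 > 0, n = 4 even -> 'minimum'
print(classify(x**3, 0))   # f'''(0) = 6, n = 3 odd -> 'neither (inflection)'
```

The x³ case reproduces the earlier example: f′(0) = 0, yet x = 0 is neither a minimum nor a maximum because the first non-vanishing derivative is of odd order.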

Page 10
…Contd..
Example
Determine the maximum and minimum values of the
function

Page 11
Multivariable Optimization with No Constraints
Necessary Condition
If f(X) has an extreme point (maximum or minimum) at X = X∗ and if the first partial derivatives of f(X) exist at X∗, then:
∂f/∂x1 (X∗) = ∂f/∂x2 (X∗) = · · · = ∂f/∂xn (X∗) = 0

Sufficient Condition
A sufficient condition for a stationary point X∗ to be an
extreme point is that the matrix of second partial
derivatives (Hessian matrix) of f(X) evaluated at X∗ is
(i) positive definite when X∗ is a relative minimum point
(ii) negative definite when X∗ is a relative maximum point.
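The Hessian test above can be sketched numerically with NumPy. The quadratic f(x1, x2) = x1² + x1·x2 + 2·x2² used here is an assumed example (not from the slides); its Hessian is constant, so evaluating it at the stationary point (0, 0) is trivial.

```python
import numpy as np

# Hessian of the assumed example f(x1, x2) = x1**2 + x1*x2 + 2*x2**2,
# evaluated at the stationary point (0, 0); constant since f is quadratic.
H = np.array([[2.0, 1.0],
              [1.0, 4.0]])

eigvals = np.linalg.eigvalsh(H)    # eigenvalues of the symmetric Hessian
if np.all(eigvals > 0):
    verdict = 'positive definite -> relative minimum'
elif np.all(eigvals < 0):
    verdict = 'negative definite -> relative maximum'
else:
    verdict = 'indefinite or semidefinite -> further tests needed'
print(eigvals, verdict)            # eigenvalues 3 ± sqrt(2), both positive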

Page 12
..Contd..
A matrix A will be positive definite if all its eigenvalues are positive; that is, all the values of λ that satisfy the determinantal equation |A − λI| = 0 should be positive. Similarly, the matrix A will be negative definite if all its eigenvalues are negative.
Another test that can be used to check the positive definiteness of a matrix A of order n involves evaluating the determinants A1, A2, . . . , An of its leading principal submatrices.

Page 13
..Contd..
The matrix A will be positive definite if and only if all the values A1, A2, A3, . . . , An are positive.
The matrix A will be negative definite if and only if the sign of Aj is (−1)^j for j = 1, 2, . . . , n.
If some of the Aj are positive and the remaining Aj are zero, the matrix A will be positive semidefinite.

Page 14
..Contd..
Example: Equilibrium

Necessary Condition

The values of x1 and x2 corresponding to the equilibrium state:

Page 15
..Contd..
The sufficiency conditions for the minimum at (x∗1 , x∗2 )
can also be verified by testing the positive definiteness of
the Hessian matrix of U.
The Hessian matrix of U evaluated at (x∗1 , x∗2 ) is:

The determinants of the square submatrices of J are

Page 16
Reading Assignment
Semi definite Case
Saddle Point

Page 17
…Contd..
Example

Page 18
…Contd..
Necessary Condition

Page 19
…Contd..
Sufficient Condition

Page 20
…Contd..
Minimum and Maximum

The matrix A will be positive definite if and only if all the values A1, A2, A3, . . . , An are positive.
The matrix A will be negative definite if and only if the sign of Aj is (−1)^j for j = 1, 2, . . . , n.

Page 21
Thank You!

Page 22
