
Introduction to Optimization (Part 1)

Daniel Kirschen

Economic dispatch problem

Several generating units serving the load
What share of the load should each generating unit produce?
Consider the limits of the generating units
Ignore the limits of the network

Characteristics of the generating units

[Figure: input/output curve of a thermal generating unit — fuel input (J/h) vs. electric power output (MW), between Pmin and Pmax]

Thermal generating units
Consider the running costs only
Input/output curve: fuel input vs. electric power output
Fuel consumption measured by its energy content
Upper and lower limit on the output of the generating unit

Cost Curve

Multiply the fuel input by the fuel cost
No-load cost: cost of keeping the unit running if it could produce zero MW

[Figure: cost curve — cost ($/h) vs. output (MW); the no-load cost is the intercept at zero output, with output limited to the range Pmin to Pmax]

Incremental Cost Curve

Derivative of the cost curve: dF/dP
Expressed in $/MWh
Cost of the next MWh

[Figure: cost curve F(P) in $/h and the corresponding incremental cost curve dF/dP in $/MWh, both plotted against output in MW]

Mathematical formulation

Objective function:

C = CA(PA) + CB(PB) + CC(PC)

Constraints:

Load/generation balance:

L = PA + PB + PC

Unit constraints:

PAmin ≤ PA ≤ PAmax
PBmin ≤ PB ≤ PBmax
PCmin ≤ PC ≤ PCmax

This is an optimization problem.
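For readers who want to experiment, here is a minimal numerical sketch of this three-unit dispatch problem in Python. The quadratic cost coefficients, the load value and the unit limits are made-up illustrative numbers, and scipy.optimize.minimize is simply one convenient general-purpose solver, not the method developed in these slides.

```python
# Minimal sketch of the three-unit economic dispatch formulation above.
# All numerical values below are assumed for illustration only.
import numpy as np
from scipy.optimize import minimize

# Assumed cost curves C_i(P_i) = a_i + b_i*P_i + c_i*P_i^2, in $/h
a = np.array([100.0, 120.0, 90.0])
b = np.array([20.0, 18.0, 22.0])
c = np.array([0.05, 0.06, 0.07])

L = 450.0                                    # load to be served, MW
bounds = [(100, 300), (80, 250), (50, 200)]  # (Pmin, Pmax) for each unit, MW

def total_cost(P):
    return np.sum(a + b * P + c * P**2)

# Load/generation balance constraint: PA + PB + PC = L
constraints = [{"type": "eq", "fun": lambda P: np.sum(P) - L}]

P0 = np.array([150.0, 150.0, 150.0])         # feasible starting point
result = minimize(total_cost, P0, bounds=bounds, constraints=constraints)
print("Dispatch (MW):", result.x, " Total cost ($/h):", result.fun)
```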

Introduction to Optimization

"An engineer can do with one dollar what any bungler can do with two."
A. M. Wellington (1847-1895)

Objective

Most engineering activities have an objective:
Achieve the best possible design
Achieve the most economical operating conditions

This objective is usually quantifiable

Examples:
Minimize the cost of building a transformer
Minimize the cost of supplying power
Minimize the losses in a power system
Maximize the profit from a bidding strategy

Decision Variables

The value of the objective is a function of some decision variables:

F = f(x1, x2, x3, ..., xn)

Examples of decision variables:
Dimensions of the transformer
Output of the generating units, position of the taps
Parameters of bids for selling electrical energy

Optimization Problem

What value should the decision variables x1, x2, x3, ..., xn take so that F = f(x1, x2, x3, ..., xn) is minimum or maximum?

Example: function of one variable

[Figure: a function f(x) of one variable with a maximum f(x*) at x = x*]

f(x) is maximum for x = x*

Minimization and Maximization

[Figure: f(x) with a maximum at x = x*, and -f(x) with a minimum at the same x*]

If x = x* maximizes f(x), then it minimizes -f(x)

Minimization and Maximization

Maximizing f(x) is thus the same thing as minimizing g(x) = -f(x)
Minimization and maximization problems are therefore interchangeable
Depending on the problem, the optimum is either a maximum or a minimum

Necessary Condition for Optimality

[Figure: f(x) with a maximum f(x*) at x*; df/dx > 0 to the left of x* and df/dx < 0 to the right of x*]

If x = x* maximises f(x), then:

f(x) ≤ f(x*) for x ≤ x*  ⟹  df/dx ≥ 0 for x ≤ x*
f(x) ≤ f(x*) for x ≥ x*  ⟹  df/dx ≤ 0 for x ≥ x*

Necessary Condition for Optimality

[Figure: f(x) with a maximum at x*, where the tangent is horizontal]

If x = x* maximises f(x), then df/dx = 0 for x = x*

Example

[Figure: a function f(x) with several stationary points]

For what values of x is df/dx = 0?
In other words, for what values of x is the necessary condition for optimality satisfied?

Example

[Figure: f(x) with four stationary points labelled A, B, C and D]

A, B, C, D are stationary points
A and D are maxima
B is a minimum
C is an inflexion point

How can we distinguish minima and maxima?

[Figure: f(x) with maxima at A and D]

For x = A and x = D, we have: d²f/dx² < 0

The objective function is concave around a maximum

How can we distinguish minima and maxima?

[Figure: f(x) with a minimum at B]

For x = B, we have: d²f/dx² > 0

The objective function is convex around a minimum

How can we distinguish minima and maxima?

[Figure: f(x) with an inflexion point at C]

For x = C, we have: d²f/dx² = 0

The objective function is flat around an inflexion point

Necessary and Sufficient Conditions of Optimality

Necessary condition: df/dx = 0

Sufficient condition:
For a maximum: d²f/dx² < 0
For a minimum: d²f/dx² > 0
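These two conditions are easy to check symbolically. The short sketch below applies them with SymPy to an example function chosen here purely for illustration (it does not appear in the slides).

```python
# Checking the necessary and sufficient conditions on an assumed example function.
import sympy as sp

x = sp.symbols("x")
f = x**3 - 3*x                              # illustrative objective function

stationary = sp.solve(sp.diff(f, x), x)     # necessary condition: df/dx = 0
for xs in stationary:
    curvature = sp.diff(f, x, 2).subs(x, xs)  # sufficient condition: sign of d2f/dx2
    kind = "maximum" if curvature < 0 else "minimum" if curvature > 0 else "inconclusive"
    print(f"x* = {xs}: d2f/dx2 = {curvature} -> {kind}")
```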

Isn't all this obvious?

Can't we tell all this by looking at the objective function?
Yes, for a simple, one-dimensional case when we know the shape of the objective function
For complex, multi-dimensional cases (i.e. with many decision variables) we can't visualize the shape of the objective function
We must then rely on mathematical techniques

Feasible Set

The values that the decision variables can take are usually limited

Examples:
Physical dimensions of a transformer must be positive
Active power output of a generator may be limited to a certain range (e.g. 200 MW to 500 MW)
Reactive power output of a generator may be limited to a certain range (e.g. -100 MVAr to 150 MVAr)

Feasible Set

[Figure: f(x) with the feasible set limited to the interval from xMIN to xMAX]

The values of the objective function outside the feasible set do not matter

Interior and Boundary Solutions

[Figure: f(x) on the feasible set [xMIN, xMAX] with interior stationary points A, B, D, E and the boundary points xMIN and xMAX]

A and D are interior maxima
B and E are interior minima
xMIN is a boundary minimum and xMAX is a boundary maximum; boundary solutions do not satisfy the optimality conditions!

Two-Dimensional Case

[Figure: surface plot of f(x1, x2) with a minimum at (x1*, x2*)]

f(x1, x2) is minimum for x1 = x1*, x2 = x2*

Necessary Conditions for Optimality

[Figure: surface plot of f(x1, x2) with a horizontal tangent plane at (x1*, x2*)]

∂f(x1, x2)/∂x1 = 0 at (x1*, x2*)
∂f(x1, x2)/∂x2 = 0 at (x1*, x2*)

Multi-Dimensional Case

At a maximum or minimum value of f(x1, x2, x3, ..., xn) we must have:

∂f/∂x1 = 0
∂f/∂x2 = 0
...
∂f/∂xn = 0

A point where these conditions are satisfied is called a stationary point

Sufficient Conditions for Optimality

[Figure: surface plots of f(x1, x2) showing a minimum and a maximum]

Sufficient Conditions for Optimality

[Figure: surface plot of f(x1, x2) showing a saddle point]

Sufficient Conditions for Optimality

Calculate the Hessian matrix at the stationary point:

[ ∂²f/∂x1²      ∂²f/∂x1∂x2    ...   ∂²f/∂x1∂xn ]
[ ∂²f/∂x2∂x1    ∂²f/∂x2²      ...   ∂²f/∂x2∂xn ]
[    ...            ...       ...      ...     ]
[ ∂²f/∂xn∂x1    ∂²f/∂xn∂x2    ...   ∂²f/∂xn²   ]

Sufficient Conditions for Optimality

Calculate the eigenvalues of the Hessian matrix at the stationary point

If all the eigenvalues are greater than or equal to zero:
The matrix is positive semi-definite
The stationary point is a minimum

If all the eigenvalues are less than or equal to zero:
The matrix is negative semi-definite
The stationary point is a maximum

If some of the eigenvalues are positive and others are negative:
The stationary point is a saddle point
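As a quick numerical illustration, the sketch below applies this eigenvalue test with NumPy to the Hessian of Example 1 later in the deck (C = x1² + 4x2² - 2x1x2), whose Hessian is constant because the function is quadratic.

```python
# Eigenvalue test on the Hessian of Example 1 (C = x1**2 + 4*x2**2 - 2*x1*x2).
import numpy as np

H = np.array([[ 2.0, -2.0],
              [-2.0,  8.0]])          # Hessian of Example 1

eigenvalues = np.linalg.eigvalsh(H)   # H is symmetric, so eigvalsh is appropriate
print("Eigenvalues:", eigenvalues)

if np.all(eigenvalues >= 0):
    print("Positive semi-definite -> the stationary point is a minimum")
elif np.all(eigenvalues <= 0):
    print("Negative semi-definite -> the stationary point is a maximum")
else:
    print("Indefinite -> the stationary point is a saddle point")
```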

Contours

[Figure: surface plot of f(x1, x2) cut by horizontal planes at the values F1 and F2, and the corresponding contours projected onto the (x1, x2) plane]

Contours

A contour is the locus of all the points that give the same value to the objective function

[Figure: nested contours in the (x1, x2) plane closing in on a minimum or maximum]

Example 1

Minimise C = x1² + 4x2² - 2x1x2

Necessary conditions for optimality:

∂C/∂x1 = 2x1 - 2x2 = 0
∂C/∂x2 = -2x1 + 8x2 = 0

x1 = 0, x2 = 0 is a stationary point

Example 1

Sufficient conditions for optimality:

Hessian matrix:

[ ∂²C/∂x1²      ∂²C/∂x1∂x2 ]   [  2  -2 ]
[ ∂²C/∂x2∂x1    ∂²C/∂x2²   ] = [ -2   8 ]

It must be positive definite (i.e. all eigenvalues must be positive):

| 2-λ   -2  |
| -2    8-λ | = 0  ⟹  λ² - 10λ + 12 = 0  ⟹  λ = (10 ± √52)/2 > 0

The stationary point is a minimum

Example 1

[Figure: elliptical contours of C in the (x1, x2) plane for C = 1, C = 4 and C = 9, centred on the minimum at the origin where C = 0]

Example 2

Minimise C = x1² - 3x2² + 2x1x2

Necessary conditions for optimality:

∂C/∂x1 = 2x1 + 2x2 = 0
∂C/∂x2 = 2x1 - 6x2 = 0

x1 = 0, x2 = 0 is a stationary point

Example 2

Sufficient conditions for optimality:

Hessian matrix:

[ ∂²C/∂x1²      ∂²C/∂x1∂x2 ]   [ 2   2 ]
[ ∂²C/∂x2∂x1    ∂²C/∂x2²   ] = [ 2  -6 ]

| 2-λ    2   |
|  2   -6-λ  | = 0  ⟹  λ² + 4λ - 16 = 0

λ = (-4 + √80)/2 > 0  or  λ = (-4 - √80)/2 < 0

The stationary point is a saddle point

Example 2

[Figure: contours of C for C = 0, 1, 4 and 9 around the saddle point at the origin in the (x1, x2) plane]

This stationary point is a saddle point

Optimization with Constraints

Optimization with Equality Constraints

There are usually restrictions on the values that the decision variables can take

Minimise f(x1, x2, ..., xn)        (objective function)

subject to:

ω1(x1, x2, ..., xn) = 0
...                                (equality constraints)
ωm(x1, x2, ..., xn) = 0

Number of Constraints

N decision variables
M equality constraints

If M > N, the problem is overconstrained
There is usually no solution

If M = N, the problem is determined
There may be a solution

If M < N, the problem is underconstrained
There is usually room for optimization

Example 1

Minimise f(x1, x2) = 0.25x1² + x2²
Subject to ω(x1, x2) = 5 - x1 - x2 = 0

[Figure: contours of f(x1, x2) = 0.25x1² + x2² in the (x1, x2) plane, the constraint line ω(x1, x2) = 5 - x1 - x2 = 0, and the constrained minimum where they touch]

Example 2: Economic Dispatch

[Figure: two generating units G1 and G2 with outputs x1 and x2 supplying a load L]

C1 = a1 + b1x1²        (cost of running unit 1)
C2 = a2 + b2x2²        (cost of running unit 2)
C = C1 + C2 = a1 + a2 + b1x1² + b2x2²        (total cost)

Optimization problem:

Minimise C = a1 + a2 + b1x1² + b2x2²
Subject to: x1 + x2 = L

Solution by substitution

Minimise C = a1 + a2 + b1x1² + b2x2²
Subject to: x1 + x2 = L

x2 = L - x1
C = a1 + a2 + b1x1² + b2(L - x1)²        (unconstrained minimization)

dC/dx1 = 2b1x1 - 2b2(L - x1) = 0

x1 = b2L / (b1 + b2)
x2 = b1L / (b1 + b2)

d²C/dx1² = 2b1 + 2b2 > 0  ⟹  minimum
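The same substitution can be checked symbolically. The sketch below reproduces the steps above with SymPy, using the cost coefficients as symbols rather than numbers.

```python
# Symbolic check of the solution by substitution for the two-unit dispatch.
import sympy as sp

x1, L, a1, a2, b1, b2 = sp.symbols("x1 L a1 a2 b1 b2", positive=True)

# Substitute x2 = L - x1 into the total cost and minimize over x1 alone.
C = a1 + a2 + b1*x1**2 + b2*(L - x1)**2

x1_star = sp.solve(sp.diff(C, x1), x1)[0]      # dC/dx1 = 0
print("x1* =", sp.simplify(x1_star))           # b2*L/(b1 + b2)
print("x2* =", sp.simplify(L - x1_star))       # b1*L/(b1 + b2)
print("d2C/dx1^2 =", sp.diff(C, x1, 2))        # 2*b1 + 2*b2 > 0 -> minimum
```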

Solution by substitution

Difficult
Usually impossible when the constraints are non-linear
Provides little or no insight into the solution

Alternative: solution using Lagrange multipliers

Gradient

Consider a function f(x1, x2, ..., xn)

The gradient of f is the vector

∇f = [ ∂f/∂x1, ∂f/∂x2, ..., ∂f/∂xn ]ᵀ

Properties of the Gradient

Each component of the gradient vector indicates the rate of change of the function in that direction
The gradient indicates the direction in which a function of several variables increases most rapidly
The magnitude and direction of the gradient usually depend on the point considered
At each point, the gradient is perpendicular to the contour of the function

Example 3

f(x, y) = ax² + by²

∇f = [ ∂f/∂x, ∂f/∂y ]ᵀ = [ 2ax, 2by ]ᵀ

[Figure: elliptical contours of f in the (x, y) plane with the gradient drawn at points A, B, C and D, perpendicular to the contours]

Example 4

f(x, y) = ax + by

∇f = [ ∂f/∂x, ∂f/∂y ]ᵀ = [ a, b ]ᵀ

[Figure: parallel straight-line contours f = f1, f = f2, f = f3 with the same gradient vector ∇f at every point, perpendicular to the contours]
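A one-line symbolic computation confirms the contrast between these two examples: the gradient of the quadratic function depends on the point, while the gradient of the linear function is the same everywhere. This is a small SymPy sketch, not part of the original slides.

```python
# Gradients of Examples 3 and 4 computed symbolically.
import sympy as sp

x, y, a, b = sp.symbols("x y a b")

f3 = a*x**2 + b*y**2                 # Example 3
f4 = a*x + b*y                       # Example 4

grad3 = [sp.diff(f3, v) for v in (x, y)]
grad4 = [sp.diff(f4, v) for v in (x, y)]

print("grad f3 =", grad3)            # [2*a*x, 2*b*y] -> depends on the point
print("grad f4 =", grad4)            # [a, b]         -> the same at every point
```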

Lagrange multipliers

Minimise f(x1, x2) = 0.25x1² + x2² subject to ω(x1, x2) = 5 - x1 - x2 = 0

[Figure: the constraint line ω(x1, x2) = 5 - x1 - x2 = 0 and the contours f(x1, x2) = 6 and f(x1, x2) = 5 in the (x1, x2) plane]

Lagrange multipliers

∇f = [ ∂f/∂x1, ∂f/∂x2 ]ᵀ

[Figure: the contours f(x1, x2) = 6 and f(x1, x2) = 5 with the gradient ∇f drawn perpendicular to them]

Lagrange multipliers

∇ω = [ ∂ω/∂x1, ∂ω/∂x2 ]ᵀ

[Figure: the constraint line ω(x1, x2) = 0 with its gradient ∇ω drawn perpendicular to it, together with the contours f(x1, x2) = 6 and f(x1, x2) = 5]

Lagrange multipliers

The solution must be on the constraint ω(x1, x2) = 0
To reduce the value of f, we must move in a direction opposite to the gradient ∇f

[Figure: points A and B on the constraint line, with the gradient ∇f drawn at each; the question is how far to move along the constraint]

Lagrange multipliers

We stop when the gradient of the function is perpendicular to the constraint, because moving further would increase the value of the function

At the optimum, the gradient of the function is parallel to the gradient of the constraint

[Figure: at the constrained optimum on ω(x1, x2) = 0, the gradient ∇f is perpendicular to the constraint line]

Lagrange multipliers

At the optimum, we must have: ∇f = -λ∇ω

This can be expressed as: ∇f + λ∇ω = 0

In terms of the coordinates:

∂f/∂x1 + λ ∂ω/∂x1 = 0
∂f/∂x2 + λ ∂ω/∂x2 = 0

The constraint must also be satisfied: ω(x1, x2) = 0

λ is called the Lagrange multiplier

Lagrangian function

To simplify the writing of the conditions for optimality, it is useful to define the Lagrangian function:

ℓ(x1, x2, λ) = f(x1, x2) + λ ω(x1, x2)

The necessary conditions for optimality are then given by the partial derivatives of the Lagrangian:

∂ℓ(x1, x2, λ)/∂x1 = ∂f/∂x1 + λ ∂ω/∂x1 = 0
∂ℓ(x1, x2, λ)/∂x2 = ∂f/∂x2 + λ ∂ω/∂x2 = 0
∂ℓ(x1, x2, λ)/∂λ = ω(x1, x2) = 0

Example

Minimise f(x1, x2) = 0.25x1² + x2² subject to ω(x1, x2) = 5 - x1 - x2 = 0

ℓ(x1, x2, λ) = 0.25x1² + x2² + λ(5 - x1 - x2)

∂ℓ(x1, x2, λ)/∂x1 = 0.5x1 - λ = 0
∂ℓ(x1, x2, λ)/∂x2 = 2x2 - λ = 0
∂ℓ(x1, x2, λ)/∂λ = 5 - x1 - x2 = 0

Example

∂ℓ(x1, x2, λ)/∂x1 = 0.5x1 - λ = 0  ⟹  x1 = 2λ
∂ℓ(x1, x2, λ)/∂x2 = 2x2 - λ = 0  ⟹  x2 = λ/2
∂ℓ(x1, x2, λ)/∂λ = 5 - x1 - x2 = 0  ⟹  5 - 2λ - λ/2 = 0

λ = 2, x1 = 4, x2 = 1

Example

Minimise f(x1, x2) = 0.25x1² + x2²
Subject to ω(x1, x2) = 5 - x1 - x2 = 0

[Figure: the constraint line and the contour f(x1, x2) = 5 touching at the constrained minimum x1 = 4, x2 = 1]
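The same optimality conditions can be solved mechanically. The sketch below sets up the Lagrangian of this example in SymPy and recovers x1 = 4, x2 = 1, λ = 2.

```python
# SymPy check of the Lagrange-multiplier example above.
import sympy as sp

x1, x2, lam = sp.symbols("x1 x2 lam")

f = 0.25*x1**2 + x2**2
omega = 5 - x1 - x2
lagrangian = f + lam*omega

# Necessary conditions: all partial derivatives of the Lagrangian are zero.
equations = [sp.diff(lagrangian, v) for v in (x1, x2, lam)]
solution = sp.solve(equations, (x1, x2, lam), dict=True)[0]
print(solution)   # x1 = 4, x2 = 1, lam = 2
```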

Important Note!

If the constraint is of the form:

ax1 + bx2 = L

it must be included in the Lagrangian as follows:

ℓ = f(x1, ..., xn) + λ(L - ax1 - bx2)

and not as follows:

ℓ = f(x1, ..., xn) + λ(ax1 + bx2)

Application to Economic Dispatch

[Figure: two generating units G1 and G2 with outputs x1 and x2 supplying a load L]

Minimise f(x1, x2) = C1(x1) + C2(x2)
s.t. ω(x1, x2) = L - x1 - x2 = 0

ℓ(x1, x2, λ) = C1(x1) + C2(x2) + λ(L - x1 - x2)

∂ℓ/∂x1 = dC1/dx1 - λ = 0
∂ℓ/∂x2 = dC2/dx2 - λ = 0
∂ℓ/∂λ = L - x1 - x2 = 0

dC1/dx1 = dC2/dx2 = λ        (equal incremental cost solution)
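Here is a minimal numeric sketch of the equal incremental cost condition, using the quadratic cost curves Ci = ai + bi·xi² from the earlier example. The values of b1, b2 and L are assumed for illustration only.

```python
# Equal incremental cost dispatch for two units with Ci = ai + bi*xi**2.
# Coefficients and load are illustrative, not from the slides.
b1, b2 = 0.01, 0.02     # $/MW^2/h
L = 300.0               # MW

# Closed-form solution of dC1/dx1 = dC2/dx2 with x1 + x2 = L
x1 = b2 * L / (b1 + b2)
x2 = b1 * L / (b1 + b2)
lam = 2 * b1 * x1       # common incremental cost, $/MWh

print(f"x1 = {x1:.1f} MW, x2 = {x2:.1f} MW")
print(f"dC1/dx1 = {2*b1*x1:.2f} $/MWh, dC2/dx2 = {2*b2*x2:.2f} $/MWh, lambda = {lam:.2f} $/MWh")
```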

Equal incremental cost solution

[Figure: cost curves C1(x1) and C2(x2), and the corresponding incremental cost curves dC1/dx1 and dC2/dx2, plotted against x1 and x2]

Interpretation of this solution

[Figure: the incremental cost curves dC1/dx1 and dC2/dx2 with a horizontal line at λ; the intersections give x1* and x2*]

For a trial value of λ, read x1* and x2* from the incremental cost curves and compute L - x1* - x2*:
If L - x1* - x2* < 0, reduce λ
If L - x1* - x2* > 0, increase λ

Physical interpretation

dC/dx = lim(Δx→0) ΔC/Δx

For Δx sufficiently small: ΔC ≈ (dC/dx) Δx

If Δx = 1 MW: ΔC ≈ dC/dx

[Figure: cost curve C(x) with its slope dC(x)/dx at a given output x]

The incremental cost is the cost of one additional MW for one hour.
This cost depends on the output of the generator.

Physical interpretation

dC1/dx1: cost of one more MW from unit 1
dC2/dx2: cost of one more MW from unit 2

Suppose that dC1/dx1 > dC2/dx2

Decrease the output of unit 1 by 1 MW: decrease in cost = dC1/dx1
Increase the output of unit 2 by 1 MW: increase in cost = dC2/dx2

Net change in cost = dC2/dx2 - dC1/dx1 < 0

Physical interpretation

It pays to increase the output of unit 2 and decrease the output of unit 1 until we have:

dC1/dx1 = dC2/dx2 = λ

The Lagrange multiplier λ is thus the cost of one more MW at the optimal solution.
This is a very important result with many applications in economics.

Generalization

Minimise f(x1, x2, ..., xn)

subject to:

ω1(x1, x2, ..., xn) = 0
...
ωm(x1, x2, ..., xn) = 0

Lagrangian:

ℓ = f(x1, ..., xn) + λ1 ω1(x1, ..., xn) + ... + λm ωm(x1, ..., xn)

One Lagrange multiplier for each constraint
n + m variables: x1, ..., xn and λ1, ..., λm

Optimality conditions

ℓ = f(x1, ..., xn) + λ1 ω1(x1, ..., xn) + ... + λm ωm(x1, ..., xn)

∂ℓ/∂x1 = ∂f/∂x1 + λ1 ∂ω1/∂x1 + ... + λm ∂ωm/∂x1 = 0
...
∂ℓ/∂xn = ∂f/∂xn + λ1 ∂ω1/∂xn + ... + λm ∂ωm/∂xn = 0
(n equations)

∂ℓ/∂λ1 = ω1(x1, ..., xn) = 0
...
∂ℓ/∂λm = ωm(x1, ..., xn) = 0
(m equations)

n + m equations in n + m variables
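As a final sketch, the n + m optimality conditions can be generated and solved automatically for a small problem. The objective and the two equality constraints below are made up for illustration; only the structure (Lagrangian, derivatives with respect to all variables and multipliers) follows the slides.

```python
# General optimality conditions for an assumed small problem (n = 3, m = 2).
import sympy as sp

x1, x2, x3, l1, l2 = sp.symbols("x1 x2 x3 lambda1 lambda2")

f = x1**2 + 2*x2**2 + 3*x3**2            # assumed objective
omegas = [x1 + x2 + x3 - 10,             # assumed equality constraints
          x1 - x2]
lams = [l1, l2]

# Lagrangian: f + lambda1*omega1 + lambda2*omega2
lagrangian = f + sum(l*w for l, w in zip(lams, omegas))

# n + m equations: derivatives w.r.t. the decision variables and the multipliers
variables = [x1, x2, x3] + lams
equations = [sp.diff(lagrangian, v) for v in variables]
print(sp.solve(equations, variables, dict=True))
```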
