Good Optimization Modeling Practices with Pyomo
All You Wanted to Know About Practical Optimization but Were Afraid to Ask
Andres Ramos
https://www.iit.comillas.edu/aramos/
[email protected]
[email protected]
Do not confuse the ingredients of the recipe
• Mathematical formulation
• LP, MIP, NLP, QCP, MCP
• Modeling language
• GAMS, Pyomo
• Solver
• CPLEX, Gurobi
• Optimization algorithm
• Primal simplex, dual simplex, interior point
• Input/output interface
• Text file, CSV, Microsoft Excel
• Operating system
• Windows, Linux, macOS
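The same Pyomo model can be handed to different solvers and algorithms without touching the formulation. A minimal sketch of that separation (it assumes Gurobi is installed; the Method option is Gurobi-specific and selects the interior-point algorithm):

from pyomo.environ import ConcreteModel, Var, Objective, NonNegativeReals, minimize
from pyomo.opt import SolverFactory

# the model: one trivial LP, independent of any solver
m     = ConcreteModel()
m.x   = Var(within=NonNegativeReals, bounds=(3, None), doc='a single decision variable')
m.obj = Objective(expr=2*m.x, sense=minimize, doc='minimize 2x')

# the solver and the algorithm are chosen afterwards and can be swapped freely
Solver = SolverFactory('gurobi')     # or 'cplex', 'cbc', 'ipopt', ...
Solver.options['Method'] = 2         # Gurobi option: 2 = barrier (interior point)
Solver.solve(m, tee=True)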
What is relevant in an optimization model?
• Appealing Name
• Documentation
• Maintainability and reusability
• Code attributes:
  • Clarity
  • Modularity
  • Completeness
  • Interoperability
  • Maintainability
  • Standardization
Developer's Principle
GAMS
• Pros:
  • Consistency
  • Maturity. Everything has already been written
  • Documentation
  • Customer support

Pyomo
• Pros:
  • Flexibility, multiple choices
  • Powerful Python libraries to be used (e.g., input data, output results, visualization)
• Cons:
  • Documentation is a Tower of Babel, and a minefield when relying on ChatGPT
  • Getting the duals

“Performance in Optimization Models: A Comparative Analysis of GAMS, Pyomo, GurobiPy, and JuMP”, July 2023
Python vs. Julia
GAMS:
1. GAMS has been around for longer than Pyomo, which means it has a more established user community and more extensive documentation and support.
2. GAMS is specifically designed for mathematical programming, whereas Pyomo is a general-purpose modeling tool that includes mathematical programming as one of its features.
3. GAMS provides a powerful, integrated modeling language that allows you to specify models using a concise syntax, while Pyomo requires you to write Python code to define your models.
4. GAMS has a wide range of solvers available, including commercial and open-source options, and it can seamlessly switch between solvers, whereas Pyomo requires more effort to switch between solvers.
5. GAMS includes built-in functionality for handling linear, nonlinear, and mixed-integer programming problems, whereas Pyomo requires additional libraries to handle some types of optimization problems.

Pyomo:
1. Pyomo is open-source software, which means it is free to use and modify, while GAMS is commercial software that requires a license to use.
2. Pyomo is based on Python, a popular general-purpose programming language that has a large and active user community, while GAMS has its own proprietary modeling language that can have a steeper learning curve.
3. Pyomo provides more flexibility than GAMS, allowing users to integrate their optimization models with other Python libraries and tools.
4. Pyomo can be used for a wide range of optimization problems beyond plain mathematical programming, such as stochastic programming, optimization under uncertainty, and mixed-integer nonlinear programming.
5. Pyomo has a more modular design that makes it easier to add new features or extensions, while GAMS has a more monolithic architecture that can be harder to customize.
• Data manipulation
  • pandas – Python Data Analysis Library (https://pandas.pydata.org/)
• Plotting
  • Plotly Open Source Graphing Library for Python (https://plotly.com/python/)
  • Matplotlib – Visualization with Python (https://matplotlib.org/)
  • Vega-Altair – Declarative Visualization in Python (https://altair-viz.github.io/)
• Documentation
  • reStructuredText (https://www.sphinx-doc.org/en/master/usage/restructuredtext/index.html)
  • Sphinx makes it easy to create intelligent and beautiful documentation (https://www.sphinx-doc.org/en/master/)
  • Read the Docs. Build, host, and share documentation, all with a single platform (https://about.readthedocs.com/?ref=readthedocs.com)
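These libraries plug naturally into a Pyomo workflow. A minimal sketch, assuming a hypothetical CSV file oTransport_Demand.csv with columns Destination and Demand, of reading input data with pandas and turning it into a dictionary that can initialize a Param:

import pandas as pd

# read the demand table and convert it into a plain dictionary
dfDemand = pd.read_csv('oTransport_Demand.csv').set_index('Destination')
pDemand  = dfDemand['Demand'].to_dict()    # e.g., {'Madrid': 400, 'Barcelona': 450, 'Valencia': 150}

# the dictionary can then initialize a Pyomo parameter:
# mTransport.pB = Param(mTransport.j, initialize=pDemand, doc='destination demand [units]')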
My first example
Code conventions
• Items must be defined in blocks. For example, a set and all its subsets
should constitute one block in the set section.
• Names are intended to be meaningful. Follow the conventions:
• Items with the same name represent the same concept in different
models
• Units should be used in all definitions
• Parameters are named pParameterName (e.g., pOperReserveDw)
• Variables are named vVariableName (e.g., vReserveDown)
• Equations are named eEquationName (e.g., eOperReserveDw)
• Use short set names (one or two letters) for easier reading
• Equations are laid out as clearly as possible
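A sketch of how these conventions look in Pyomo declarations (the indices and values are illustrative, not taken from a specific model):

# parameter, variable, and constraint names carry the p/v/e prefixes; units go in the doc string
mSDUC.pOperReserveDw = Param(mSDUC.sc, mSDUC.n, initialize=pOperReserveDw, doc='down operating reserve [GW]')
mSDUC.vReserveDown   = Var  (mSDUC.sc, mSDUC.n, mSDUC.nr, within=NonNegativeReals, doc='downward operating reserve [GW]')

def eOperReserveDw(mSDUC, sc, n):
    return sum(mSDUC.vReserveDown[sc,n,nr] for nr in mSDUC.nr) >= mSDUC.pOperReserveDw[sc,n]
mSDUC.eOperReserveDw = Constraint(mSDUC.sc, mSDUC.n, rule=eOperReserveDw, doc='down operating reserve [GW]')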
Transportation problem formulation:

$\min \sum_{i,j} c_{ij} x_{ij}$
$\text{s.t.} \quad \sum_{j} x_{ij} \le a_i \quad \forall i$
$\qquad\;\; \sum_{i} x_{ij} \ge b_j \quad \forall j$
$\qquad\;\; x_{ij} \ge 0$

GAMS formulation:

sets
   I origins      / VIGO, ALGECIRAS /
   J destinations / MADRID, BARCELONA, VALENCIA /
parameters
   pA(i) origin capacity
      / VIGO      350
        ALGECIRAS 700 /
   pB(j) destination demand
      / MADRID    400
        BARCELONA 450
        VALENCIA  150 /
variables
   vX(i,j) units transported
   vCost   transportation cost
positive variable vX
equations
   eCost        transportation cost
   eCapacity(i) maximum capacity of each origin
   eDemand (j)  demand supply at destination ;

(Map image: A. Mizielinska and D. Mizielinski, Atlas del mundo: Un insólito viaje por las mil curiosidades y maravillas del mundo, Ed. Maeva, 2015)
TransportationCost = {
    ('Vigo',      'Madrid'   ): 0.06,
    ('Vigo',      'Barcelona'): 0.12,
    ('Vigo',      'Valencia' ): 0.09,
    ('Algeciras', 'Madrid'   ): 0.05,
    ('Algeciras', 'Barcelona'): 0.15,
    ('Algeciras', 'Valencia' ): 0.11,
}
def eCost(mTransport):
return sum(mTransport.pC[i,j]*mTransport.vX[i,j] for i,j in mTransport.i*mTransport.j)
mTransport.eCost = Objective(rule=eCost, sense=minimize, doc='transportation cost')
mTransport.dual = Suffix(direction=Suffix.IMPORT)
Solver = SolverFactory('gurobi')
Solver.options['LogFile'] = 'mTransport.log'
SolverResults = Solver.solve(mTransport, tee=True)
SolverResults.write()
mTransport.pprint()
mTransport.vX.display()
for j in mTransport.j:
print(mTransport.dual[mTransport.eDemand[j]])
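Since getting the duals was flagged above as a Pyomo pain point, a small sketch of collecting them into a pandas Series once the Suffix has been declared and the model solved, as above:

import pandas as pd

# gather the duals of the demand constraints (the marginal cost of serving each destination)
DemandDuals = pd.Series({j: mTransport.dual[mTransport.eDemand[j]] for j in mTransport.j}, name='marginal cost')
print(DemandDuals)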
import pandas as pd

capacities = pd.DataFrame(
    [["seattle", 350], ["san-diego", 600]], columns=["city", "capacity"]
).set_index("city")
demands = pd.DataFrame(
    [["new-york", 325], ["chicago", 300], ["topeka", 275]], columns=["city", "demand"]
).set_index("city")
distances = pd.DataFrame(
    [
        ["seattle", "new-york", 2.5],
        ["seattle", "chicago", 1.7],
        ["seattle", "topeka", 1.8],
        ["san-diego", "new-york", 2.5],
        ["san-diego", "chicago", 1.8],
        ["san-diego", "topeka", 1.4],
    ],
    columns=["from", "to", "distance"],
).set_index(["from", "to"])
freight_cost = 90

from gamspy import Container, Set, Parameter, Variable, Equation, Model, Sum, Sense

m = Container()
i = Set(container=m, name="i", description="plants")
i.setRecords(capacities.index)
j = Set(container=m, name="j", description="markets", records=demands.index)
a = Parameter(
    container=m,
    name="a",
    domain=i,
    description="supply of commodity at plant i (in cases)",
    records=capacities.reset_index(),
)
b = Parameter(
    container=m,
    name="b",
    domain=j,
    description="demand for commodity at market j (in cases)",
    records=demands.reset_index(),
)
c = Parameter(
    container=m,
    name="c",
    domain=[i, j],
    description="cost per unit of shipment between plant i and market j",
)
cost = freight_cost * distances / 1000
c.setRecords(cost.reset_index())
x = Variable(
    container=m,
    name="x",
    domain=[i, j],
    type="Positive",
    description="amount of commodity to ship from plant i to market j",
)
supply = Equation(
    container=m, name="supply", domain=i, description="observe supply limit at plant i"
)
demand = Equation(
    container=m, name="demand", domain=j, description="satisfy demand at market j"
)
obj = Sum((i, j), c[i, j] * x[i, j])
transport = Model(
    m,
    name="transport",
    equations=[supply, demand],
    problem="LP",
    sense=Sense.MIN,
    objective=obj,
)

import sys
transport.solve(output=sys.stdout)
x.records.set_index(["i", "j"])
transport.objective_value
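The excerpt above omits the equation definitions; in GAMSPy they would look roughly like this (an assumption based on the standard transport example, not shown on the slide):

# assumed equation definitions for the supply and demand constraints
supply[i] = Sum(j, x[i, j]) <= a[i]
demand[j] = Sum(i, x[i, j]) >= b[j]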
Additional Features

• Sets
  • Subsets
    mSDUC.g = Set(initialize=mSDUC.gg, ordered=False, doc='generating units', filter=lambda mSDUC,gg: gg in mSDUC.gg and pRatedMaxPower[gg] > 0.0)
    mSDUC.t = Set(initialize=mSDUC.g , ordered=False, doc='thermal units',    filter=lambda mSDUC,g : g  in mSDUC.g  and pLinearVarCost [g]  > 0.0)

• Variable methods: .fix, .unfix, .free, .setlb, .setub
    # fixing the ESS inventory at the last load level
    for sc,p,es in mTEPES.sc*mTEPES.p*mTEPES.es:
        mTEPES.vESSInventory[sc,p,mTEPES.n.last(),es].fix(pESSInitialInventory[es])

• Counting constraints
    print('eBalance ... ', len(mTEPES.eBalance), ' rows')

• Counting time
    StartTime = time.time()
    # constraint creation

• Distributed computing
  • Create the problems and send them to be solved in parallel
  • Retrieve the solution once solved

• Activating/deactivating constraints and deleting components
    model.ConstraintName.deactivate()
    model.ConstraintName.activate()
    model.del_component(model.SetName)
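A minimal sketch of the timing pattern mentioned above (the message text is illustrative):

import time

StartTime = time.time()
# ... constraint creation ...
ElapsedTime = time.time() - StartTime
print('Generating constraints ...', round(ElapsedTime), 's')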
# Developed by
#
# Andres Ramos
# Instituto de Investigacion Tecnologica
# Escuela Tecnica Superior de Ingenieria - ICAI
# UNIVERSIDAD PONTIFICIA COMILLAS
# Alberto Aguilera 23
# 28015 Madrid, Spain
# [email protected]
# https://pascua.iit.comillas.edu/aramos/Ramos_CV.htm
#
# May 8, 2023
mTransport.pA = Param(mTransport.i, initialize={'Vigo'  : 350, 'Algeciras': 700 }, doc='origin capacity'   )
mTransport.pB = Param(mTransport.j, initialize={'Madrid': 400, 'Barcelona': 450, 'Valencia': 150}, doc='destination demand', mutable=True)
mTransport.pD = Param(mTransport.j, initialize={'Madrid': 400, 'Barcelona': 450, 'Valencia': 150}, doc='destination demand', mutable=True)

TransportationCost = {
    ('Vigo',      'Madrid'   ): 0.06,
    ('Vigo',      'Barcelona'): 0.12,
    ('Vigo',      'Valencia' ): 0.09,
    ('Algeciras', 'Madrid'   ): 0.05,
    ('Algeciras', 'Barcelona'): 0.15,
    ('Algeciras', 'Valencia' ): 0.11,
}

mTransport.vX = Var(mTransport.i, mTransport.j, bounds=(0.0,None), within=NonNegativeReals, doc='units transported')

def eCapacity(mTransport, i):
    return sum(mTransport.vX[i,j] for j in mTransport.j) <= mTransport.pA[i]
mTransport.eCapacity = Constraint(mTransport.i, rule=eCapacity, doc='maximum capacity of each origin')

def eDemand(mTransport, j):
    return sum(mTransport.vX[i,j] for i in mTransport.i) >= mTransport.pB[j]
mTransport.eDemand = Constraint(mTransport.j, rule=eDemand, doc='demand supply at destination')

def eCost(mTransport):
    return sum(mTransport.pC[i,j]*mTransport.vX[i,j] for i,j in mTransport.i*mTransport.j)
mTransport.eCost = Objective(rule=eCost, sense=minimize, doc='transportation cost')
mTransport.dual = Suffix(direction=Suffix.IMPORT)
opt = appsi.solvers.Gurobi()
timer = HierarchicalTimer()
for p_val in np.linspace(0.8, 1):
for j in mTransport.j:
mTransport.pB[j] = float(p_val)*mTransport.pD[j]
res = opt.solve(mTransport, timer=timer)
assert res.termination_condition == appsi.base.TerminationCondition.optimal
print(mTransport.eCost())
print(timer)
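A sketch of collecting the optimal cost for each demand scaling factor, mirroring the loop above (the use of pandas and the grid of scaling factors are illustrative):

import numpy as np
import pandas as pd

ParametricCost = {}
for p_val in np.linspace(0.8, 1.0, 5):
    for j in mTransport.j:
        mTransport.pB[j] = float(p_val)*mTransport.pD[j]     # rescale the mutable demand parameter
    res = opt.solve(mTransport)
    ParametricCost[round(float(p_val), 2)] = mTransport.eCost()
print(pd.Series(ParametricCost, name='transportation cost'))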
Hashi puzzle (https://en.wikipedia.org/wiki/Hashiwokakero)
Connect bridges between islands, according to their values, to form
one interconnecting path
[Figure: Hashi puzzle board with island values]
# Developed by
# Andres Ramos
https://github.com/IIT-EnergySystemModels/Fixed-Charge-Transportation-Problem-Benders-Decomposition/blob/main/HashiPuzzle.py
# Instituto de Investigacion Tecnologica
# Escuela Tecnica Superior de Ingenieria - ICAI
# UNIVERSIDAD PONTIFICIA COMILLAS
# Alberto Aguilera 23
# 28015 Madrid, Spain
# [email protected]
import pandas as pd
from pyomo.environ import ConcreteModel, NonNegativeIntegers, Set, Param, Var, Constraint, Objective, minimize
from pyomo.opt import SolverFactory
from collections import defaultdict
SolverName = 'gurobi'
# model declaration
mHP = ConcreteModel('Hashi puzzle')
mHP.x = Set(initialize=['x00', 'x01', 'x02', 'x03', 'x04', 'x05', 'x06', 'x07', 'x08', 'x09', 'x10', 'x11'], ordered=True, doc='abscissas')
mHP.y = Set(initialize=['y00', 'y01', 'y02', 'y03', 'y04', 'y05', 'y06', 'y07', 'y08', 'y09', 'y10', 'y11'], ordered=True, doc='ordinates')
Nodes = {
    ('x05', 'y01'): 1,
    ('x02', 'y02'): 3,
    ('x04', 'y02'): 3,
    ('x06', 'y02'): 4,
    ('x03', 'y03'): 7,
    ('x05', 'y03'): 6,
    ('x09', 'y03'): 1,
    ('x04', 'y04'): 7,
    ('x06', 'y04'): 6,
    ('x10', 'y04'): 7,
    ('x07', 'y05'): 5,
    ('x09', 'y05'): 7,
    ('x11', 'y05'): 4,
    ('x06', 'y06'): 5,
    ('x08', 'y06'): 6,
}

Arcs      = set()
Neighbors = defaultdict(list)
for x,y,xx,yy in mHP.x*mHP.y*mHP.x*mHP.y:
    if (x,y) in Nodes and (xx,yy) in Nodes:
        if mHP.x.ord(x) > 2 and mHP.x.ord(x) < len(mHP.x)-2 and mHP.y.ord(y) > 1 and mHP.y.ord(y) < len(mHP.y)-1:
            if ((xx,yy) == (mHP.x.next(x),mHP.y.next(y)) or (xx,yy) == (mHP.x.next(x,2),y) or (xx,yy) == (mHP.x.next(x),mHP.y.prev(y)) or
                (xx,yy) == (mHP.x.prev(x),mHP.y.prev(y)) or (xx,yy) == (mHP.x.prev(x,2),y) or (xx,yy) == (mHP.x.prev(x),mHP.y.next(y))):
                # add the arc and its neighbors to the list
                Arcs.add((x, y, xx, yy))
                Arcs.add((xx, yy, x, y))
                Neighbors[x,y].append((xx,yy))

# parameters
mHP.pNodes = Param(mHP.x, mHP.y, initialize=Nodes, doc='number of connections per node')
# Variables
mHP.vConnection = Var(Arcs, within=NonNegativeIntegers, doc='connections between two neighbor nodes')
def eTotalConnections(mHP):
return sum(mHP.vConnection[x,y,xx,yy] for x,y,xx,yy in Arcs)
mHP.eTotalConnections = Objective(rule=eTotalConnections, sense=minimize, doc='total number of connections')
Solver = SolverFactory(SolverName)
mHP.write('connection.lp', io_options={'symbolic_solver_labels': True})
SolverResults = Solver.solve(mHP, tee=True)
SolverResults.write() # summary of the solver results
mHP.vConnection.pprint()
if __name__ == "__main__":
plain_run(mHP, SolverName)
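The excerpt does not show the puzzle constraints. A sketch of the island-degree constraint such a model presumably contains (the constraint name and the bound of two bridges per pair of islands are assumptions, not taken from the script):

# assumed constraint: the connections incident to each island must match its value
def eNodeDegree(mHP, x, y):
    if (x,y) in Nodes:
        return sum(mHP.vConnection[x,y,xx,yy] for (xx,yy) in Neighbors[x,y]) == mHP.pNodes[x,y]
    return Constraint.Skip
mHP.eNodeDegree = Constraint(mHP.x, mHP.y, rule=eNodeDegree, doc='island value equals number of incident bridges')

# assumed bound: at most two bridges between any pair of neighboring islands
for x,y,xx,yy in Arcs:
    mHP.vConnection[x,y,xx,yy].setub(2)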
Picking the right campsite
https://pubsonline.informs.org/do/10.1287/orms.2015.03.06/full
• Selecting the ideal camping site requires careful consideration of many factors
such as the proximity to water, availability of firewood and protection from the
elements. Your potential camping sites are shown in the corresponding map.
The map consists of 64 sites each with varying characteristics, including water,
wood, swamp and mosquitoes.
• The quality of a site is determined by a points system. A site that contains water
receives +3 points and a site that is near water receives +1 point. A site that
contains wood receives +2 points and a site that is near wood receives +1 point.
A site that contains a swamp is -2 points and a site that is near a swamp is -1
point. A site that contains mosquitoes receives -3 points and a site that is near
mosquitoes is -2 points.
• “Near” is defined as adjacent sites including diagonals. For example, site B5 is
worth 1 point (based on +3 on water, +1 near wood, -2 near mosquitoes, -1
near swamp). Note that you only count points once for each type of
characteristic.
• Where is the best campsite?
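A sketch of the scoring rule described above (the data structures are illustrative, not taken from the referenced article):

# (points if the site contains the feature, points if a neighboring site contains it)
Points = {'water': (3, 1), 'wood': (2, 1), 'swamp': (-2, -1), 'mosquitoes': (-3, -2)}

def score(site, neighbors, characteristics):
    # characteristics: dict mapping each site to the set of features it contains
    total = 0
    for feature, (own, near) in Points.items():
        if feature in characteristics[site]:
            total += own                     # count each feature type only once
        elif any(feature in characteristics[nb] for nb in neighbors[site]):
            total += near
    return total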
[Figure: evolution of the lower and upper bounds over the Benders iterations]
import pandas as pd
import pyomo.environ as pyo
from pyomo.environ import ConcreteModel, Set, Param, Var, Binary, NonNegativeReals, RealSet, Constraint, Objective, minimize, Suffix, TerminationCondition
from pyomo.opt import SolverFactory
mMaster_Bd.l = Set(initialize=['it1', 'it2', 'it3', 'it4', 'it5', 'it6', 'it7', 'it8', 'it9', 'it10'], ordered=True, doc='iterations')
mMaster_Bd.ll = Set( doc='iterations')
mFCTP.pA = Param(mFCTP.i, initialize={'i1': 20, 'i2': 30, 'i3': 40, 'i4': 20}, doc='origin capacity' )
mFCTP.pB = Param(mFCTP.j, initialize={'j1': 20, 'j2': 50, 'j3':30 }, doc='destination demand')
FixedCost = {
('i1', 'j1'): 10,
('i1', 'j2'): 20,
('i1', 'j3'): 30,
('i2', 'j1'): 20,
('i2', 'j2'): 30,
('i2', 'j3'): 40,
('i3', 'j1'): 30,
('i3', 'j2'): 40,
('i3', 'j3'): 50,
('i4', 'j1'): 40,
('i4', 'j2'): 50,
('i4', 'j3'): 60,
}
TransportationCost = {
('i1', 'j1'): 1,
('i1', 'j2'): 2,
('i1', 'j3'): 3,
('i2', 'j1'): 3,
('i2', 'j2'): 2,
('i2', 'j3'): 1,
('i3', 'j1'): 2,
('i3', 'j2'): 3,
('i3', 'j3'): 4,
('i4', 'j1'): 4,
('i4', 'j2'): 3,
('i4', 'j3'): 2,
}
FCTP solved by Benders decomposition (ii)
mFCTP.pF = Param(mFCTP.i, mFCTP.j, initialize=FixedCost, doc='fixed investment cost' )
mFCTP.pC = Param(mFCTP.i, mFCTP.j, initialize=TransportationCost, doc='per unit transportation cost')
def eCostMst(mMaster_Bd):
return sum(mFCTP.pF[i,j]*mMaster_Bd.vY[i,j] for i,j in mFCTP.i*mFCTP.j) + mMaster_Bd.vTheta
mMaster_Bd.eCostMst = Objective(rule=eCostMst, sense=minimize, doc='total cost')
def eCostSubp(mFCTP):
return sum(mFCTP.pC[i,j]*mFCTP.vX[i,j] for i,j in mFCTP.i*mFCTP.j) + sum(mFCTP.vDNS[j]*1000 for j in mFCTP.j)
mFCTP.eCostSubp = Objective(rule=eCostSubp, sense=minimize, doc='transportation cost')
Solver = SolverFactory('gurobi')
Solver.options['LogFile'] = 'mFCTP.log'
mFCTP.dual = Suffix(direction=Suffix.IMPORT)
# initialization
Z_Lower = float('-inf')
Z_Upper = float('inf')
BdTol = 1e-6
# solving subproblem
SolverResultsSbp = Solver.solve(mFCTP)
Z2 = mFCTP.eCostSubp()
Z2_L[l] = Z2
mMaster_Bd.vTheta.free()
Delta[l] = 1
mMaster_Bd.vY.unfix()
mFCTP.eCostSubp.deactivate()
mFCTP.vY.unfix()
def eCost(mFCTP):
return sum(mFCTP.pF[i,j]*mFCTP.vY[i,j] for i,j in mFCTP.i*mFCTP.j) + sum(mFCTP.pC[i,j]*mFCTP.vX[i,j] for i,j in mFCTP.i*mFCTP.j) + sum(mFCTP.vDNS[j]*1000 for j in mFCTP.j)
mFCTP.eCost = Objective(rule=eCost, sense=minimize, doc='total cost')
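The slides do not show the Benders optimality cut added to the master problem at each iteration. A sketch of a typical cut for the FCTP (pM, PiFlowBound_L, and pY_L are hypothetical names for the big-M bounds of the coupling constraints x <= M*y, their duals, and the master solution fixed at iteration l; Z2_L and Delta are the values stored in the loop above); in an iterative scheme the component is deleted and rebuilt each iteration, as suggested by del_component earlier:

# assumed Benders optimality cuts; only iterations already performed (Delta[l] == 1) generate a cut
def eBendersCuts(mMaster_Bd, l):
    if Delta.get(l, 0) == 1:
        return mMaster_Bd.vTheta >= Z2_L[l] + sum(PiFlowBound_L[l][i,j]*pM[i,j]*(mMaster_Bd.vY[i,j] - pY_L[l][i,j]) for i,j in mFCTP.i*mFCTP.j)
    return Constraint.Skip
mMaster_Bd.eBendersCuts = Constraint(mMaster_Bd.l, rule=eBendersCuts, doc='Benders optimality cuts')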
Disclaimer: This model is a work in progress and will be updated accordingly.
# Developed by
# Andres Ramos
# Instituto de Investigacion Tecnologica
# Escuela Tecnica Superior de Ingenieria - ICAI
# UNIVERSIDAD PONTIFICIA COMILLAS
# Alberto Aguilera 23
# 28015 Madrid, Spain
# [email protected]
# https://pascua.iit.comillas.edu/aramos/Ramos_CV.htm
# with the very valuable collaboration from David Dominguez ([email protected]) and Alejandro Rodriguez ([email protected]), our local Python gurus
#%% Libraries
import argparse
import os
import pandas as pd
import time # count clock time
import psutil # access the number of CPUs
import pyomo.environ as pyo
from pyomo.environ import Set, Var, Binary, NonNegativeReals, RealSet, Constraint, ConcreteModel, Objective, minimize, Suffix, DataPortal
from pyomo.opt import SolverFactory
print('\n #### Academic research license - for non-commercial use only #### \n')
StartTime = time.time()
DIR = os.path.dirname(__file__)
CASE = '16g'
SOLVER = 'gurobi'
# compute the demand as the mean over the time step load levels and assign it to active load levels. Idem for operating reserve, variable max power, variable min and max storage capacity, and inflows
pDemand = pDemand.rolling (pTimeStep).mean()
pOperReserveUp = pOperReserveUp.rolling (pTimeStep).mean()
pOperReserveDw = pOperReserveDw.rolling (pTimeStep).mean()
pVariableMinPower = pVariableMinPower.rolling (pTimeStep).mean()
pVariableMaxPower = pVariableMaxPower.rolling (pTimeStep).mean()
pVariableMinStorage = pVariableMinStorage.rolling(pTimeStep).mean()
pVariableMaxStorage = pVariableMaxStorage.rolling(pTimeStep).mean()
pEnergyInflows = pEnergyInflows.rolling (pTimeStep).mean()
if pTimeStep > 1:
# assign duration 0 to load levels not being considered, active load levels are at the end of every pTimeStep
for i in range(pTimeStep-2,-1,-1):
pDuration[range(i,len(mSDUC.nn),pTimeStep)] = 0
#%% defining subsets: active load levels (n), thermal units (t), ESS units (es), all the lines (la), candidate lines (lc) and lines with losses (ll)
mSDUC.n = Set(initialize=mSDUC.nn, ordered=True , doc='load levels' , filter=lambda mSDUC,nn: nn in mSDUC.nn and pDuration [nn] > 0 )
mSDUC.n2 = Set(initialize=mSDUC.nn, ordered=True , doc='load levels' , filter=lambda mSDUC,nn: nn in mSDUC.nn and pDuration [nn] > 0 )
mSDUC.g = Set(initialize=mSDUC.gg, ordered=False, doc='generating units', filter=lambda mSDUC,gg: gg in mSDUC.gg and pRatedMaxPower[gg] > 0.0)
mSDUC.t = Set(initialize=mSDUC.g , ordered=False, doc='thermal units', filter=lambda mSDUC,g : g in mSDUC.g and pLinearVarCost [g] > 0.0)
mSDUC.r = Set(initialize=mSDUC.g , ordered=False, doc='RES units', filter=lambda mSDUC,g : g in mSDUC.g and pLinearVarCost [g] == 0.0 and pRatedMaxStorage[g] == 0.0)
mSDUC.es = Set(initialize=mSDUC.g , ordered=False, doc='ESS units', filter=lambda mSDUC,g : g in mSDUC.g and pRatedMaxStorage[g] > 0.0)
# non-RES units
mSDUC.nr = mSDUC.g - mSDUC.r
if pTimeStep > 1:
# drop levels with duration 0
pDuration = pDuration.loc [mSDUC.sc*mSDUC.n]
pDemand = pDemand.loc [mSDUC.sc*mSDUC.n]
pOperReserveUp = pOperReserveUp.loc [mSDUC.sc*mSDUC.n]
pOperReserveDw = pOperReserveDw.loc [mSDUC.sc*mSDUC.n]
pVariableMinPower = pVariableMinPower.loc [mSDUC.sc*mSDUC.n]
pVariableMaxPower = pVariableMaxPower.loc [mSDUC.sc*mSDUC.n]
pVariableMinStorage = pVariableMinStorage.loc[mSDUC.sc*mSDUC.n]
pVariableMaxStorage = pVariableMaxStorage.loc[mSDUC.sc*mSDUC.n]
pEnergyInflows = pEnergyInflows.loc [mSDUC.sc*mSDUC.n]
# values < 1e-5 times the maximum system demand are converted to 0
pEpsilon = pDemand.max()*1e-5
# these parameters are in GW
pDemand [pDemand < pEpsilon] = 0.0
pOperReserveUp [pOperReserveUp < pEpsilon] = 0.0
pOperReserveDw [pOperReserveDw < pEpsilon] = 0.0
pMinPower [pMinPower < pEpsilon] = 0.0
pMaxPower [pMaxPower < pEpsilon] = 0.0
pMaxPower2ndBlock[pMaxPower2ndBlock < pEpsilon] = 0.0
pMaxCharge [pMaxCharge < pEpsilon] = 0.0
pEnergyInflows [pEnergyInflows < pEpsilon/pTimeStep] = 0.0
# these parameters are in GWh
pMinStorage [pMinStorage < pEpsilon] = 0.0
pMaxStorage [pMaxStorage < pEpsilon] = 0.0
# fixing the ESS inventory at the last load level at the end of the time scope
for sc,es in mSDUC.sc*mSDUC.es:
mSDUC.vESSInventory[sc,mSDUC.n.last(),es].fix(pInitialInventory[es])
#%% definition of the time-steps leap to observe the stored energy at ESS
pCycleTimeStep = pUpTime*0
for es in mSDUC.es:
if pStorageType[es] == 'Daily' :
pCycleTimeStep[es] = 1
if pStorageType[es] == 'Weekly' :
pCycleTimeStep[es] = int( 24/pTimeStep)
if pStorageType[es] == 'Monthly' :
pCycleTimeStep[es] = int( 168/pTimeStep)
# fixing the ESS inventory at the end of the following pCycleTimeStep (weekly, yearly), i.e., for daily ESS is fixed at the end of the week, for weekly/monthly ESS is fixed at the end of the year
for sc,n,es in mSDUC.sc*mSDUC.n*mSDUC.es:
if pStorageType[es] == 'Daily' and mSDUC.n.ord(n) % ( 168/pTimeStep) == 0:
mSDUC.vESSInventory[sc,n,es].fix(pInitialInventory[es])
if pStorageType[es] == 'Weekly' and mSDUC.n.ord(n) % (8736/pTimeStep) == 0:
mSDUC.vESSInventory[sc,n,es].fix(pInitialInventory[es])
if pStorageType[es] == 'Monthly' and mSDUC.n.ord(n) % (8736/pTimeStep) == 0:
mSDUC.vESSInventory[sc,n,es].fix(pInitialInventory[es])
def eTotalECost(mSDUC):
return mSDUC.vTotalECost == sum(pScenProb[sc] * pCO2Cost * pCO2EmissionRate[nr] * mSDUC.vTotalOutput[sc,n,nr] for sc,n,nr in mSDUC.sc*mSDUC.n*mSDUC.nr)
mSDUC.eTotalECost = Constraint(rule=eTotalECost, doc='total system emission cost [MEUR]')
def eTotalTCost(mSDUC):
return mSDUC.vTotalVCost + mSDUC.vTotalECost
mSDUC.eTotalTCost = Objective(rule=eTotalTCost, sense=minimize, doc='total system cost [MEUR]')
#%% constraints
def eOperReserveUp(mSDUC,sc,n):
if pOperReserveUp[sc,n]:
return sum(mSDUC.vReserveUp [sc,n,nr] for nr in mSDUC.nr) >= pOperReserveUp[sc,n]
else:
return Constraint.Skip
mSDUC.eOperReserveUp = Constraint(mSDUC.sc, mSDUC.n, rule=eOperReserveUp, doc='up operating reserve [GW]')
def eOperReserveDw(mSDUC,sc,n):
if pOperReserveDw[sc,n]:
return sum(mSDUC.vReserveDown[sc,n,nr] for nr in mSDUC.nr) >= pOperReserveDw[sc,n]
else:
return Constraint.Skip
mSDUC.eOperReserveDw = Constraint(mSDUC.sc, mSDUC.n, rule=eOperReserveDw, doc='down operating reserve [GW]')
def eBalance(mSDUC,sc,n):
return sum(mSDUC.vTotalOutput[sc,n,g] for g in mSDUC.g) - sum(mSDUC.vESSCharge[sc,n,es] for es in mSDUC.es) + mSDUC.vENS[sc,n] == pDemand[sc,n]
mSDUC.eBalance = Constraint(mSDUC.sc, mSDUC.n, rule=eBalance, doc='load generation balance [GW]')
def eESSInventory(mSDUC,sc,n,es):
if mSDUC.n.ord(n) == pCycleTimeStep[es]:
return pInitialInventory[es] + sum(pDuration[n2]*(pEnergyInflows[es][sc,n2] - mSDUC.vTotalOutput[sc,n2,es] + pEfficiency[es]*mSDUC.vESSCharge[sc,n2,es]) for n2
in list(mSDUC.n2)[mSDUC.n.ord(n)-pCycleTimeStep[es]:mSDUC.n.ord(n)]) == mSDUC.vESSInventory[sc,n,es] + mSDUC.vESSSpillage[sc,n,es]
elif mSDUC.n.ord(n) > pCycleTimeStep[es] and mSDUC.n.ord(n) % pCycleTimeStep[es] == 0:
return mSDUC.vESSInventory[sc,mSDUC.n.prev(n,pCycleTimeStep[es]),es] + sum(pDuration[n2]*(pEnergyInflows[es][sc,n2] - mSDUC.vTotalOutput[sc,n2,es] + pEfficiency[es]*mSDUC.vESSCharge[sc,n2,es]) for n2
in list(mSDUC.n2)[mSDUC.n.ord(n)-pCycleTimeStep[es]:mSDUC.n.ord(n)]) == mSDUC.vESSInventory[sc,n,es] + mSDUC.vESSSpillage[sc,n,es]
else:
return Constraint.Skip
mSDUC.eESSInventory = Constraint(mSDUC.sc, mSDUC.n, mSDUC.es, rule=eESSInventory, doc='ESS inventory balance [GWh]')
def eMinOutput2ndBlock(mSDUC,sc,n,nr):
if pOperReserveDw[sc,n] and pMaxPower2ndBlock[nr][sc,n]:
return (mSDUC.vOutput2ndBlock[sc,n,nr] + mSDUC.vReserveDown[sc,n,nr]) / pMaxPower2ndBlock[nr][sc,n] >= 0.0
else:
return Constraint.Skip
mSDUC.eMinOutput2ndBlock = Constraint(mSDUC.sc, mSDUC.n, mSDUC.nr, rule=eMinOutput2ndBlock, doc='min output of the second block of a committed unit [p.u.]')
def eTotalOutput(mSDUC,sc,n,nr):
if pMinPower[nr][sc,n] == 0.0:
return mSDUC.vTotalOutput[sc,n,nr] == mSDUC.vOutput2ndBlock[sc,n,nr]
else:
return mSDUC.vTotalOutput[sc,n,nr] / pMinPower[nr][sc,n] == mSDUC.vCommitment[n,nr] + mSDUC.vOutput2ndBlock[sc,n,nr] / pMinPower[nr][sc,n]
mSDUC.eTotalOutput = Constraint(mSDUC.sc, mSDUC.n, mSDUC.nr, rule=eTotalOutput, doc='total output of a unit [GW]')
def eUCStrShut(mSDUC,n,nr):
if n == mSDUC.n.first():
return mSDUC.vCommitment[n,nr] - pInitialUC[nr] == mSDUC.vStartUp[n,nr] - mSDUC.vShutDown[n,nr]
else:
return mSDUC.vCommitment[n,nr] - mSDUC.vCommitment[mSDUC.n.prev(n),nr] == mSDUC.vStartUp[n,nr] - mSDUC.vShutDown[n,nr]
mSDUC.eUCStrShut = Constraint(mSDUC.n, mSDUC.nr, rule=eUCStrShut, doc='relation among commitment startup and shutdown')
#%%
def eRampUp(mSDUC,sc,n,t):
if pRampUp[t] and pRampUp[t] < pMaxPower2ndBlock[t][sc,n] and n == mSDUC.n.first():
return (mSDUC.vOutput2ndBlock[sc,n,t] - max(pInitialOutput[t]-pMinPower[t][sc,n],0.0) + mSDUC.vReserveUp [sc,n,t]) / pDuration[n] / pRampUp[t] <= mSDUC.vCommitment[n,t] - mSDUC.vStartUp[n,t]
elif pRampUp[t] and pRampUp[t] < pMaxPower2ndBlock[t][sc,n]:
return (mSDUC.vOutput2ndBlock[sc,n,t] - mSDUC.vOutput2ndBlock[sc,mSDUC.n.prev(n),t] + mSDUC.vReserveUp [sc,n,t]) / pDuration[n] / pRampUp[t] <= mSDUC.vCommitment[n,t] - mSDUC.vStartUp[n,t]
else:
return Constraint.Skip
mSDUC.eRampUp = Constraint(mSDUC.sc, mSDUC.n, mSDUC.t, rule=eRampUp, doc='maximum ramp up [p.u.]')
def eRampDw(mSDUC,sc,n,t):
if pRampDw[t] and pRampDw[t] < pMaxPower2ndBlock[t][sc,n] and n == mSDUC.n.first():
return (mSDUC.vOutput2ndBlock[sc,n,t] - max(pInitialOutput[t]-pMinPower[t][sc,n],0.0) - mSDUC.vReserveDown[sc,n,t]) / pDuration[n] / pRampDw[t] >= - pInitialUC[t] + mSDUC.vShutDown[n,t]
elif pRampDw[t] and pRampDw[t] < pMaxPower2ndBlock[t][sc,n]:
return (mSDUC.vOutput2ndBlock[sc,n,t] - mSDUC.vOutput2ndBlock[sc,mSDUC.n.prev(n),t] - mSDUC.vReserveDown[sc,n,t]) / pDuration[n] / pRampDw[t] >= - mSDUC.vCommitment[mSDUC.n.prev(n),t] + mSDUC.vShutDown[n,t]
else:
return Constraint.Skip
mSDUC.eRampDw = Constraint(mSDUC.sc, mSDUC.n, mSDUC.t, rule=eRampDw, doc='maximum ramp down [p.u.]')
def eMinDownTime(mSDUC,n,t):
if pDwTime[t] > 1 and mSDUC.n.ord(n) >= pDwTime[t]:
return sum(mSDUC.vShutDown[n2,t] for n2 in list(mSDUC.n2)[mSDUC.n.ord(n)-pDwTime[t]:mSDUC.n.ord(n)]) <= 1 - mSDUC.vCommitment[n,t]
else:
return Constraint.Skip
mSDUC.eMinDownTime = Constraint(mSDUC.n, mSDUC.t, rule=eMinDownTime, doc='minimum down time [h]')
#%% fix values of binary variables to get dual variables and solve it again
for n,nr in mSDUC.n*mSDUC.nr:
mSDUC.vCommitment[n,nr].fix(round(mSDUC.vCommitment[n,nr]()))
mSDUC.vStartUp [n,nr].fix(round(mSDUC.vStartUp [n,nr]()))
mSDUC.vShutDown [n,nr].fix(round(mSDUC.vShutDown [n,nr]()))
mSDUC.dual = Suffix(direction=Suffix.IMPORT)
SolverResults = Solver.solve(mSDUC, tee=True) # tee=True displays the output of the solver
SolverResults.write() # summary of the solver results
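Once the binaries are fixed and the model re-solved, the duals are meaningful. A sketch of extracting the dual of the balance constraint as the short-run marginal cost used in the SRMC plot below (the 1e3 scaling from MEUR/GW to EUR/MWh and the output path follow the conventions of this script and are assumptions):

import pandas as pd

# dual of the balance constraint, scaled by the level duration, as the short-run marginal cost
SRMC = pd.Series({(sc,n): 1e3*mSDUC.dual[mSDUC.eBalance[sc,n]]/pDuration[n] for sc,n in mSDUC.sc*mSDUC.n})
SRMC.to_frame(name='EUR/MWh').to_csv(_path+'/oUC_Result_SRMC_'+CaseName+'.csv', sep=',')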
fig, fg = plt.subplots()
for r in mSDUC.r:
fg.plot(range(len(mSDUC.sc*mSDUC.n)), RESCurtailment[:,:,r], label=r)
fg.set(xlabel='Hours', ylabel='MW')
fg.set_ybound(lower=0)
plt.title('RES Curtailment')
fg.tick_params(axis='x', rotation=90)
fg.legend()
plt.tight_layout()
#plt.show()
plt.savefig(_path+'/oUC_Plot_RESCurtailment_'+CaseName+'.png', bbox_inches=None)
OutputResults = pd.Series(data=[sum(OutputResults[sc,n,es] for es in mSDUC.es if (gt,es) in mSDUC.t2g) for sc,n,gt in mSDUC.sc*mSDUC.n*mSDUC.gt if sum(1 for es in mSDUC.es if (gt,es) in mSDUC.t2g)], index=pd.MultiIndex.from_tuples([(sc,n,gt) for sc,n,gt in mSDUC.sc*mSDUC.n*mSDUC.gt if sum(1 for es in mSDUC.es if (gt,es) in mSDUC.t2g)]))
OutputResults.to_frame(name='MW' ).reset_index().pivot_table(index=['level_0','level_1'], columns='level_2', values='MW' ).rename_axis(['Scenario','LoadLevel'], axis=0).rename_axis([None], axis=1).to_csv(_path+'/oUC_Result_TechnologyCharge_'+CaseName+'.csv', sep=',')
TechnologyCharge = OutputResults.loc[:,:,:]

OutputResults = pd.Series(data=[sum(OutputResults[sc,n,es] for es in mSDUC.es if (gt,es) in mSDUC.t2g) for sc,n,gt in mSDUC.sc*mSDUC.n*mSDUC.gt if sum(1 for es in mSDUC.es if (gt,es) in mSDUC.t2g)], index=pd.MultiIndex.from_tuples([(sc,n,gt) for sc,n,gt in mSDUC.sc*mSDUC.n*mSDUC.gt if sum(1 for es in mSDUC.es if (gt,es) in mSDUC.t2g)]))
OutputResults.to_frame(name='GWh').reset_index().pivot_table(index=['level_0','level_1'], columns='level_2', values='GWh').rename_axis(['Scenario','LoadLevel'], axis=0).rename_axis([None], axis=1).to_csv(_path+'/oUC_Result_ESSTechnologyEnergy_'+CaseName+'.csv', sep=',')
TechnologyOutput = OutputResults.loc[:,:,:]
fig, fg = plt.subplots()
for sc in mSDUC.sc:
fg.plot(range(len(mSDUC.n)), SRMC[sc], label=sc)
fg.set(xlabel='Hours', ylabel='EUR/MWh')
fg.set_ybound(lower=0, upper=100)
plt.title('SRMC')
fg.tick_params(axis='x', rotation=90)
fg.legend()
plt.tight_layout()
#plt.show()
plt.savefig(_path+'/oUC_Plot_SRMC_'+CaseName+'.png', bbox_inches=None)
if __name__ == '__main__':
main()
Disclaimer: This model is a work in progress and will be updated accordingly.