Optimizer Methods HYSYS PDF

The document describes several optimization methods available in HYSYS: 1) SQP is a rigorous sequential quadratic programming method that can handle inequality and equality constraints. 2) BOX and Mixed methods can handle inequality constraints but not equality constraints, with Mixed attempting to leverage BOX and SQP. 3) SQP, Fletcher-Reeves, and Quasi-Newton methods calculate derivatives but only SQP can handle constraints.


HYSYS Optimizer Methods

• Hyprotech SQP is a rigorous sequential quadratic programming (SQP)
optimization solver.
• The MDC Optim optimization solver is only available with a
HYSYS.RTO license. The MDC Optim model requires an
associated Derivative Analysis.
• Selection Optimization consists of algorithms that solve Mixed Integer
Non-Linear Programming (MINLP) problems. There are two MINLP
methods available: Stochastic (also known as the simulated
annealing method), and Branch and Bound. These methods use Non-
Linear Programming (NLP) optimizers to solve sub-problems.
• The DMO solver implements a variant of the successive quadratic
programming (SQP) algorithm to solve small or large-scale optimization
problems. It performs the optimization by solving a sequence of quadratic
programming sub-problems.
• LSSQP (Large-scale Sparse Successive Quadratic Programming algorithm)
implements a variant of a class of successive quadratic programming (SQP)
algorithms, for large scale optimization. It performs the optimization by
solving a sequence of quadratic programming subproblems.
• The BOBYQA (Bound Optimization BY Quadratic Approximation) method is used
for flowsheet optimization, simultaneously converging optimization
problems with equality or inequality constraints.
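The Stochastic (simulated annealing) MINLP method named above can be illustrated with a minimal, generic sketch. This is not the HYSYS implementation; the integer objective, the linear cooling schedule, and the unit-move proposal rule below are all illustrative assumptions:

```python
import math
import random

def simulated_annealing(f, x0, lo, hi, steps=5000, t0=50.0, seed=42):
    """Minimize f over the integers in [lo, hi] by simulated annealing:
    propose random unit moves and accept uphill moves with a
    temperature-dependent probability, so the search can escape local
    minima of a non-convex (mixed-integer) objective."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    best, fbest = x, fx
    for k in range(steps):
        t = t0 * (1.0 - k / steps) + 1e-9          # linear cooling schedule
        cand = min(hi, max(lo, x + rng.choice((-1, 1))))
        fc = f(cand)
        # Metropolis rule: always accept improvements, sometimes accept worse
        if fc < fx or rng.random() < math.exp((fx - fc) / t):
            x, fx = cand, fc
        if fx < fbest:
            best, fbest = x, fx
    return best, fbest

# Integer objective with a local minimum at x = 4 and a global minimum at x = -3
obj = lambda x: (x - 4) ** 2 * (x + 3) ** 2 + x
best, fbest = simulated_annealing(obj, x0=8, lo=-10, hi=10)
```

Because acceptance of uphill moves decays with the temperature, early iterations explore broadly while late iterations behave like a local descent.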
BOX Method
• The procedure is loosely based on the “Complex” method of Box, the
Downhill Simplex algorithm of Press et al., and the BOX algorithm of
Kuester and Mize.
• The BOX method is a sequential search technique which solves problems
with non-linear objective functions, subject to non-linear inequality
constraints. No derivatives are required, and it handles inequality
constraints but not equality constraints. The BOX method is not very
efficient in terms of the required number of function evaluations and
generally requires a large number of iterations to converge on the
solution. However, where applicable, this method can be very robust.
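The flavor of a derivative-free, inequality-constrained sequential search can be sketched as follows. This is a simple coordinate pattern search with constraint rejection, not the actual Box “Complex” algorithm (which maintains a simplex of feasible points); the objective and constraint are illustrative assumptions:

```python
def box_style_search(f, g, x0, step=1.0, tol=1e-6, max_evals=10000):
    """Derivative-free sequential search: try coordinate moves, reject any
    point violating the inequality constraints g_i(x) <= 0, and halve the
    step when no feasible move improves the objective."""
    x = list(x0)
    fx = f(x)
    evals = 1
    while step > tol and evals < max_evals:
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                cand = x[:]
                cand[i] += d
                if any(gi(cand) > 0 for gi in g):   # infeasible: reject
                    continue
                fc = f(cand)
                evals += 1
                if fc < fx:
                    x, fx, improved = cand, fc, True
        if not improved:
            step *= 0.5        # no feasible improving move: refine the mesh
    return x, fx, evals

# Minimize (x-3)^2 + (y-2)^2 subject to x <= 2.5; the constrained
# optimum sits on the boundary at (2.5, 2) with objective value 0.25.
f = lambda v: (v[0] - 3) ** 2 + (v[1] - 2) ** 2
g = [lambda v: v[0] - 2.5]
x, fx, evals = box_style_search(f, g, [0.0, 0.0])
```

The `evals` counter illustrates the point made above: progress costs many function evaluations, but the method needs nothing beyond feasibility tests and objective values.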
SQP Method
• The Sequential Quadratic Programming (SQP) Method handles
inequality and equality constraints.
• SQP is considered to be the most efficient method for minimization
with general linear and non-linear constraints, provided a reasonable
initial point is used and the number of primary variables is small.
• The implemented procedure is based entirely on the Harwell subroutines
VF13 and VE17 and closely follows the algorithm of Powell.
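The core of each SQP iteration is a quadratic programming subproblem; with equality constraints only, that subproblem reduces to a linear KKT system. The sketch below shows one such step. It is not the VF13/Powell implementation; the toy problem and helper names are assumptions:

```python
def solve(A, b):
    """Tiny dense Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            k = M[r][c] / M[c][c]
            M[r] = [mr - k * mc for mr, mc in zip(M[r], M[c])]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def sqp_step(grad_f, hess_f, c, jac_c, x):
    """One SQP iteration for min f(x) s.t. c(x) = 0: solve the KKT system
    [H A^T; A 0][d; lam] = [-g; -c] of the quadratic subproblem, then step."""
    g, H, A = grad_f(x), hess_f(x), jac_c(x)
    n, m = len(x), len(A)
    K = [[0.0] * (n + m) for _ in range(n + m)]
    rhs = [0.0] * (n + m)
    for i in range(n):
        for j in range(n):
            K[i][j] = H[i][j]
        for k in range(m):
            K[i][n + k] = A[k][i]
            K[n + k][i] = A[k][i]
        rhs[i] = -g[i]
    for k in range(m):
        rhs[n + k] = -c(x)[k]
    sol = solve(K, rhs)
    return [xi + di for xi, di in zip(x, sol[:n])]

# min x^2 + y^2  s.t.  x + y = 2  ->  optimum (1, 1). For a quadratic
# objective with a linear constraint, a single SQP step is exact.
grad = lambda v: [2 * v[0], 2 * v[1]]
hess = lambda v: [[2.0, 0.0], [0.0, 2.0]]
con  = lambda v: [v[0] + v[1] - 2.0]
jac  = lambda v: [[1.0, 1.0]]
x = sqp_step(grad, hess, con, jac, [5.0, -3.0])
```

On general non-linear problems the step is repeated, with the Hessian usually replaced by a quasi-Newton approximation, which is why a good initial point matters.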
Mixed Method
• The Mixed method attempts to take advantage of the global
convergence characteristics of the BOX method and the efficiency of
the SQP method. It starts the minimization with the BOX method
using a very loose convergence tolerance (50 times the desired
tolerance). After convergence, the SQP method is then used to locate
the final solution using the desired tolerance.
• The Mixed Method handles inequality constraints only.
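The two-phase idea can be sketched in one dimension: a derivative-free phase run only to a loose tolerance (50 times the target, per the description above), then a derivative-based phase to the final tolerance. Here a simple grid scan stands in for BOX and Newton's method stands in for SQP; the double-well objective is an illustrative assumption:

```python
def mixed_minimize(f, df, d2f, lo, hi, tol=1e-8):
    """Two-phase sketch: robust derivative-free search to a loose
    tolerance, then fast derivative-based refinement to the target."""
    loose = 50 * tol
    # Phase 1 (robust, no derivatives): shrink a bracket around the best
    # grid point until it is narrower than the loose tolerance.
    while hi - lo > loose:
        pts = [lo + (hi - lo) * i / 10 for i in range(11)]
        best = min(pts, key=f)
        h = (hi - lo) / 10
        lo, hi = best - h, best + h
    x = (lo + hi) / 2
    # Phase 2 (efficient, uses derivatives): Newton iterations from the
    # loosely converged starting point.
    while abs(df(x)) > tol:
        x -= df(x) / d2f(x)
    return x

# Double-well objective: local minimum near x = +1, global minimum near x = -1
f = lambda x: (x * x - 1) ** 2 + 0.3 * x
df = lambda x: 4 * x ** 3 - 4 * x + 0.3
d2f = lambda x: 12 * x * x - 4
x = mixed_minimize(f, df, d2f, -3.0, 3.0)
```

The global phase keeps the cheap local phase from being started in the wrong basin, which is exactly the trade-off the Mixed method exploits.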
Fletcher Reeves Method
• The procedure implemented is the Polak-Ribiere modification of the
Fletcher-Reeves conjugate gradient scheme. The approach closely
follows that of Press et al., with modifications to allow for lower and
upper variable bounds. This method is efficient for general
minimization with no constraints.
• The Fletcher Reeves (Conjugate Gradient) Method does not handle
constraints.
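A minimal Polak-Ribiere conjugate-gradient sketch follows (unconstrained, with a backtracking line search; the bound handling mentioned above is omitted, and the quadratic test objective is an assumption):

```python
def pr_cg(f, grad, x, iters=200, tol=1e-12):
    """Polak-Ribiere conjugate gradient with Armijo backtracking."""
    def dot(a, b):
        return sum(ai * bi for ai, bi in zip(a, b))
    g = grad(x)
    d = [-gi for gi in g]                 # first direction: steepest descent
    for _ in range(iters):
        if dot(g, g) < tol:
            break
        slope = dot(g, d)
        if slope >= 0:                    # not a descent direction: restart
            d = [-gi for gi in g]
            slope = dot(g, d)
        t, fx = 1.0, f(x)
        # Armijo backtracking: shrink the step until sufficient decrease
        while t > 1e-14 and f([xi + t * di for xi, di in zip(x, d)]) > fx + 1e-4 * t * slope:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        # Polak-Ribiere beta uses g_new . (g_new - g); plain Fletcher-Reeves
        # would use g_new . g_new. Clamping at 0 gives the "PR+" restart.
        beta = max(0.0, dot(g_new, [a - b for a, b in zip(g_new, g)]) / dot(g, g))
        d = [-gn + beta * di for gn, di in zip(g_new, d)]
        g = g_new
    return x

# Ill-scaled quadratic with minimum at (1, -2)
f = lambda v: (v[0] - 1) ** 2 + 10 * (v[1] + 2) ** 2
grad = lambda v: [2 * (v[0] - 1), 20 * (v[1] + 2)]
x = pr_cg(f, grad, [0.0, 0.0])
```

Only gradients and function values are needed; no Hessian is ever formed, which keeps the per-iteration cost low.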
Quasi-Newton Method
• The Quasi-Newton method of Broyden-Fletcher-Goldfarb-Shanno (BFGS)
according to Press et al. has been implemented. In terms of
applicability and limitations, this method is similar to the
Fletcher-Reeves method.
• The Quasi-Newton Method does not handle constraints. The Quasi-
Newton method calculates the new search directions from
approximations of the inverse of the Hessian Matrix.
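The inverse-Hessian update at the heart of BFGS can be shown directly. The numbers below are illustrative assumptions; the key property, that the updated approximation maps the observed gradient change y back to the step s (the secant condition), holds by construction:

```python
def bfgs_inverse_update(H, s, y):
    """BFGS update of the inverse-Hessian approximation H:
    H+ = (I - rho*s*y^T) H (I - rho*y*s^T) + rho*s*s^T, rho = 1/(y^T s).
    The updated H+ satisfies the secant condition H+ y = s, and the
    next search direction is taken as d = -H+ g."""
    n = len(s)
    rho = 1.0 / sum(yi * si for yi, si in zip(y, s))
    I = [[float(i == j) for j in range(n)] for i in range(n)]
    A = [[I[i][j] - rho * s[i] * y[j] for j in range(n)] for i in range(n)]
    B = [[I[i][j] - rho * y[i] * s[j] for j in range(n)] for i in range(n)]
    def matmul(X, Y):
        return [[sum(X[i][k] * Y[k][j] for k in range(n))
                 for j in range(n)] for i in range(n)]
    AHB = matmul(matmul(A, H), B)
    return [[AHB[i][j] + rho * s[i] * s[j] for j in range(n)] for i in range(n)]

H = [[1.0, 0.0], [0.0, 1.0]]   # start from the identity, as BFGS typically does
s = [0.5, -0.25]               # step taken: x_new - x_old (illustrative)
y = [1.0, -0.5]                # gradient change: g_new - g_old (illustrative)
H = bfgs_inverse_update(H, s, y)
Hy = [sum(H[i][k] * y[k] for k in range(2)) for i in range(2)]  # should equal s
```

Because only gradient differences feed the update, the method inherits the applicability of Fletcher-Reeves while converging faster near the solution.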
Methods Functional Summary

Method            Unconstrained   Constrained     Constrained    Calculates
                  Problems        Problems:       Problems:      Derivatives
                                  Inequality      Equality

BOX                     X               X
Mixed                   X               X                             X
SQP                     X               X               X             X
Fletcher-Reeves         X                                             X
Quasi-Newton            X                                             X
