5 Optimization Techniques
Contents
• Types of optimization: constrained and unconstrained optimization
• Methods of Optimization
• Numerical Optimization
• Bracketing Methods: Bisection Method
• False Position Method
• Newton’s Method
• Steepest Descent Method
• Penalty Function Method.
Queen Dido’s problem
What is optimization?
• “Optimization” comes from the same root as “optimal”, which means
best. When you optimize something, you are “making it best”.
• But “best” can vary.
1. Transportation Routing: Companies like Uber and Ola use optimization to calculate the most efficient routes for drivers picking up and dropping off passengers, minimizing travel time and costs.
2. Supply Chain Management: Businesses optimize their supply chain logistics to minimize costs while ensuring timely delivery of products to consumers.
3. Product Design: Engineers optimize the design of cars, airplanes, and other vehicles to improve performance, fuel efficiency, and safety while considering manufacturing constraints.
• Both maximizing and minimizing are types of optimization problems.
What is optimization?
• Mathematical optimization is the process of finding the
maximum or minimum value of an objective function by
systematically choosing the best available values from a
set of permissible inputs.
Application areas
• Manufacturing
• Production
• Inventory control
• Transportation
• Scheduling
• Networks
• Finance
• Engineering
• Mechanics
• Economics
• Control engineering
• Marketing
• Policy modeling
Ingredients of Optimization
• An objective function
• Decision variable
• Constraints
Ingredients of Optimization
• The objective function, f(x), which is the output you’re
trying to maximize or minimize
• Examples:
• yield per time in a chemical reaction
• mileage per liter in a car
• Revenue from the production of TV sets
• Tensile strength of a rope
• Cost per unit production of a radio
• Operating cost of a power plant
• Time of travel from one city to another
Ingredients of Optimization
• Decision/control variables: the variables x1, x2, x3, and so on, which are the inputs – the things you can control. They are written xn to refer to an individual variable, or x to refer to them as a group.
• Example: Efficiency of air conditioning system: pressure,
temperature, moisture content, area, etc.
• In optimization theory, we develop methods for the optimal choice of control variables to maximize (or minimize) the objective function.
Ingredients of Optimization
• Constraints are equations or inequalities that place limits on how big or small some variables can get. Constraints arise from the nature of the problem and of the variables.
• Example: If x1 is production cost, then x1 ≥ 0.
• Constraints can be in the form of equalities or inequalities.
• Equality constraints are usually denoted hi(x) and inequality constraints are denoted gj(x).
Statement of an Optimization
Problem
• Find x = (x1, ..., xn) which minimizes f(x)
• Subject to
• hi(x1, ..., xn) = 0, i = 1, 2, ..., m (m equality constraints)
• gj(x1, ..., xn) ≤ 0, j = 1, 2, ..., p (p inequality constraints)
• A solver-based sketch of this general form is given below.
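This general form maps directly onto numerical solvers. Below is a minimal sketch using SciPy's minimize; the toy objective and constraints are illustrative assumptions, not taken from the slides, and SciPy writes inequality constraints as g(x) ≥ 0, so the g(x) ≤ 0 form above is negated.

```python
# Minimal sketch: minimize f(x) subject to one equality and one inequality constraint.
import numpy as np
from scipy.optimize import minimize

def f(x):
    # Toy objective (an assumption for illustration): f(x) = (x1 - 1)^2 + (x2 - 2)^2
    return (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 2.0},   # h(x) = x1 + x2 - 2 = 0
    {"type": "ineq", "fun": lambda x: -(x[0] - 1.5)},        # g(x) = x1 - 1.5 <= 0, negated for SciPy
]

result = minimize(f, x0=np.array([0.0, 0.0]), constraints=constraints)
print(result.x, result.fun)   # constrained minimizer and its objective value
```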
Examples:
• For each of the following tasks, write an objective
function (“maximize ____”) and at least two constraints
(“subject to _____ ≤ c1 ”, or ≥ or =)
1. A student must create a poster project for a class.
2. A shipping company must deliver packages to
customers.
3. A grocery store must decide how to organize the store
layout
Regression
Types of optimization
• Optimization problems can be classified based on the
type of constraints, nature of design variables, physical
structure of the problem, nature of the equations
involved, deterministic nature of the variables,
permissible value of the design variables, separability of
the functions and number of objective functions.
Types of optimization
• Constrained optimization problems: problems that are subject to one or more constraints.
• Unconstrained optimization problems: problems in which no constraints exist.
• Example:
• Find the path between two points that minimizes the distance
traveled
• We need to enclose a field with a fence. We have 500 ft of fencing material. There is a building on one side of the field, for which no fencing is needed. Determine the maximum area of the rectangular field that can be enclosed by the fence (a worked sketch follows below).
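As a worked sketch of the fence example (assuming the building replaces the side of length x, so the fence covers one length x and two widths y, giving the constraint x + 2y = 500):

```python
# Maximize the enclosed area A = x*y subject to the fencing constraint x + 2*y = 500.
import numpy as np

y = np.linspace(0.0, 250.0, 100001)   # candidate widths
x = 500.0 - 2.0 * y                   # length forced by the constraint
area = x * y

best = np.argmax(area)
print(x[best], y[best], area[best])   # 250 ft by 125 ft, area 31250 sq ft
# Calculus agrees: A(y) = (500 - 2y)*y, dA/dy = 500 - 4y = 0  =>  y = 125, x = 250.
```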
Methods of Optimization: Numerical
Optimization-Unconstrained
Optimization
• A point x* is said to be a critical point of a function f(x) if f '(x*) = 0.
• This is the first-order condition for x* to be a maximum/minimum.
• Second-order condition:
• x* is a maximum of f(x) if f ''(x*) < 0;
• x* is a minimum of f(x) if f ''(x*) > 0;
• x* can be a maximum, a minimum, or neither if f ''(x*) = 0.
Example
• Suppose you have the following function:
• f(x) = x³ – 6x² + 9x
• Then the first order condition to find the critical
points is:
• f '(x) = 3x² – 12x + 9 = 0
• This implies that the critical points are at x = 1 and x = 3.
[Figure: plot of f(x) = x³ – 6x² + 9x for x between –0.5 and 4]
Example
• The next step is to determine whether the critical points are local maxima or local minima.
• These can be found by using the second-order condition.
• f ''(x) = 6x – 12 = 6(x – 2)
• Testing x = 1 implies:
• f ''(1) = 6(1 – 2) = –6 < 0.
• Hence at x = 1, we have a local maximum.
• Testing x = 3 implies:
• f ''(3) = 6(3 – 2) = 6 > 0.
• Hence at x = 3, we have a local minimum (a numerical check follows below).
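The same steps can be checked programmatically. A minimal sketch, assuming SymPy is available (the tool choice is ours, not the slides'):

```python
# Verify the critical points and the second-order test for f(x) = x^3 - 6x^2 + 9x.
import sympy as sp

x = sp.symbols("x")
f = x**3 - 6 * x**2 + 9 * x

critical_points = sp.solve(sp.diff(f, x), x)   # roots of f'(x) = 3x^2 - 12x + 9
f2 = sp.diff(f, x, 2)                          # f''(x) = 6x - 12

for p in critical_points:
    curvature = f2.subs(x, p)
    kind = "local max" if curvature < 0 else ("local min" if curvature > 0 else "inconclusive")
    print(p, kind)                             # 1 -> local max, 3 -> local min
```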
Local and global minima/maxima
• A local maximum is a point x* such that f(x*) ≥ f(x) for all x in some open interval containing x*, and a local minimum is a point x* such that f(x*) ≤ f(x) for all x in some open interval containing x*.
• A global maximum is a point x* such that f(x*) ≥ f(x) for all x in the domain of f, and a global minimum is a point x* such that f(x*) ≤ f(x) for all x in the domain of f.
• For the previous example, f(x) → ∞ as x → ∞ and f(x) → –∞ as x → –∞. Neither critical point is a global max or min of f(x).
Local and global minima/maxima
• When f ''(x) ≥ 0 for all x, i.e., f(x) is a convex function, then the local minimum x* is the global minimum of f(x).
• When f ''(x) ≤ 0 for all x, i.e., f(x) is a concave function, then the local maximum x* is the global maximum of f(x).
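A rough way to see why the earlier example's local minimum is not global: its second derivative changes sign, so f is neither convex nor concave. A minimal numerical check (the grid, step size, and tolerance are assumptions of this sketch):

```python
# Sample an approximation of f''(x) on a grid to test the convexity/concavity conditions.
import numpy as np

def second_derivative(f, x, h=1e-4):
    # Central finite-difference approximation of f''(x).
    return (f(x + h) - 2.0 * f(x) + f(x - h)) / h**2

f = lambda x: x**3 - 6 * x**2 + 9 * x          # example from the previous slides
grid = np.linspace(-5.0, 5.0, 1001)
d2 = second_derivative(f, grid)                 # approximates 6x - 12

print("convex on grid: ", bool(np.all(d2 >= -1e-6)))   # False: f'' < 0 for x < 2
print("concave on grid:", bool(np.all(d2 <=  1e-6)))   # False: f'' > 0 for x > 2
```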
Conditions for a Minimum or a
Maximum Value of a Function of
Several Variables
• Correspondingly, for a function f(x) of several independent variables x:
• Calculate the gradient ∇f(x) and set it to zero.
• Solve the resulting equation set to get a solution vector x*.
• Calculate ∇²f(x).
• Evaluate it at x*.
• Inspect the Hessian matrix H(x) = ∇²f(x) at the point x*.
Hessian Matrix of f(x)
f(x, y) = x³ – y³ + 9xy
Firstly, computing the first-order partial derivatives (i.e., the gradient of f(x, y)) and setting them to zero:
∂f/∂x = 3x² + 9y = 0
∂f/∂y = –3y² + 9x = 0
The critical points (x*, y*) are (0, 0) and (3, –3).
Example (Cont.)
We now compute the Hessian of f(x,y)
∂²f/∂x² = 6x, ∂²f/∂x∂y = ∂²f/∂y∂x = 9, ∂²f/∂y² = –6y,
so the Hessian is
H(x, y) = [ 6x   9 ]
          [  9  –6y ]
The first-order leading principal minor is 6x and the second-order leading principal minor (the determinant) is –36xy – 81.
At (0,0), these two minors are 0 and -81, respectively.
Since the second order leading principal minor is
negative, (0,0) is a saddle of f(x,y), i.e., neither a max
nor a min.
At (3, -3), these two minors are 18 and 243. So, the
Hessian is positive definite and (3,-3) is a local min of
f(x,y).
Is (3, -3) a global min?
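A minimal sketch that reproduces these computations, again assuming SymPy (not prescribed by the slides):

```python
# Gradient, critical points, and Hessian classification for f(x, y) = x^3 - y^3 + 9xy.
import sympy as sp

x, y = sp.symbols("x y", real=True)
f = x**3 - y**3 + 9 * x * y

grad = [sp.diff(f, v) for v in (x, y)]        # (3x^2 + 9y, -3y^2 + 9x)
critical_points = sp.solve(grad, (x, y), dict=True)

H = sp.hessian(f, (x, y))                     # [[6x, 9], [9, -6y]]
for pt in critical_points:
    if not all(v.is_real for v in pt.values()):   # keep only real critical points
        continue
    Hp = H.subs(pt)
    print(pt, (Hp[0, 0], Hp.det()), "positive definite:", Hp.is_positive_definite)
# Expected: (0, 0) gives minors (0, -81) -> saddle; (3, -3) gives (18, 243) -> local min.
```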
Global Maxima and Minima of a
Function of Several Variables
• Let f(x) be a C² function on Rⁿ. If f is convex (i.e., its Hessian ∇²f(x) is positive semidefinite for all x), then a local minimum is also a global minimum; if f is concave, then a local maximum is also a global maximum.
• Given an initial interval [a, b] on which f(a) and f(b) have opposite signs (so that the interval brackets a root), we can now proceed with the bisection method to locate that root.
Working of Bisection
Algorithm
• Suppose an interval [a, b] contains at least one root, i.e., f(a) and f(b) have opposite signs. Then, using the bisection method, we determine the root as follows:
• Bisect the initial interval and set the new value to x0, i.e., x0 = (a + b) / 2.
• Note: x0 is the midpoint of the interval [a, b].
• Evaluate f(x0). If f(a) and f(x0) have opposite signs, the root lies in [a, x0]; otherwise it lies in [x0, b]. Repeat the bisection on that half-interval until it is sufficiently small (a minimal implementation is sketched below).
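A minimal implementation sketch of this procedure (the stopping tolerance and the demo function are our own choices):

```python
# Bisection method: repeatedly halve an interval [a, b] on which f changes sign.
def bisection(f, a, b, tol=1e-8, max_iter=100):
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        x0 = (a + b) / 2.0          # midpoint of the current interval
        if f(x0) == 0 or (b - a) / 2.0 < tol:
            return x0
        if f(a) * f(x0) < 0:        # sign change in the left half: root lies in [a, x0]
            b = x0
        else:                       # otherwise the root lies in [x0, b]
            a = x0
    return (a + b) / 2.0

# Demo: the root of f(x) = x^2 - 2 on [1, 2] is sqrt(2) ≈ 1.41421356
print(bisection(lambda x: x**2 - 2.0, 1.0, 2.0))
```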