
Fundamentals of Optimization

What is Optimization?

• Optimization involves maximizing or minimizing a function relative to constraints.

• Applications: Cost minimization, profit maximization, optimal design, and control.

• Key Areas: Linear programming, nonlinear programming, integer programming.

• Optimization is essential in engineering, economics, machine learning, and logistics.


Common Optimization Problems in Engg.
• Design aircraft for minimum weight and maximum strength.
• Optimal trajectories of space vehicles.
• Design civil engineering structures for minimum cost.
• Design water-resource projects like dams to mitigate flood damage while yielding maximum hydropower.
• Predict structural behavior by minimizing potential energy.
• Material-cutting strategy for minimum cost.
• Design pump and heat transfer equipment for maximum efficiency.
• Maximize power output of electrical networks and machinery while minimizing heat generation.
• Shortest route of a salesperson visiting various cities during one sales trip.
• Optimal planning and scheduling.
• Statistical analysis and models with minimum error.
• Optimal pipeline networks.
• Inventory control.
• Maintenance planning to minimize cost.
• Minimize waiting and idling times.
• Design waste treatment systems to meet water-quality standards at least cost.
Problem Formulation
Setting up an optimization problem

• Define the objective function (minimize or maximize).

• Specify the constraints (equality and inequality constraints).

• Identify feasible and optimal solutions.

• Differentiate between local and global optima.

• Ensure the problem is well posed for numerical stability (a compact standard form is sketched below).
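For reference, a common compact statement of such a problem (the conventional textbook form, assumed here rather than copied from the slides; it is presumably what the later reference to Eq. (1.1) denotes):

$$
\begin{aligned}
\text{Find } X = (x_1, x_2, \dots, x_n)^{T} \ \text{which minimizes} \quad & f(X) \\
\text{subject to} \quad & g_j(X) \le 0, \quad j = 1, 2, \dots, m, \\
& l_k(X) = 0, \quad k = 1, 2, \dots, p .
\end{aligned}
$$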


Unconstrained Minimization
Optimization without constraints
• Focuses on finding minima of a function without any restrictions.
• Uses first-order and second-order conditions.
• Methods (a gradient-descent sketch follows below):
– Gradient Descent,
– Newton's Method,
– Quasi-Newton Methods.
• Applications:
– Machine learning,
– physics,
– economics.
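As a minimal sketch of the first method listed above, the following Python snippet implements plain gradient descent on an assumed quadratic test function (the step size, tolerance, and objective are illustrative choices, not values from the slides):

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, tol=1e-8, max_iter=10_000):
    """Plain gradient descent: x_{k+1} = x_k - lr * grad(x_k)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:   # first-order optimality condition approximately met
            break
        x = x - lr * g
    return x

# Example: minimize f(x, y) = (x - 3)^2 + 2*(y + 1)^2, whose minimizer is (3, -1)
grad_f = lambda v: np.array([2 * (v[0] - 3), 4 * (v[1] + 1)])
print(gradient_descent(grad_f, x0=[0.0, 0.0]))
```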
Constrained Minimization
Optimization with constraints
• Constraints define feasible regions for optimization.

• Methods: Lagrange multipliers, Karush-Kuhn-Tucker (KKT) conditions (a numerical sketch follows this section).
• Used in structural engineering, economics, and resource allocation.
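As a small numerical illustration (an assumed toy problem, solved here with SciPy's SLSQP routine, which enforces the KKT conditions iteratively):

```python
from scipy.optimize import minimize

# Toy problem: minimize (x-1)^2 + (y-2)^2 subject to x + y <= 2 and x, y >= 0
objective = lambda v: (v[0] - 1) ** 2 + (v[1] - 2) ** 2
constraints = [{"type": "ineq", "fun": lambda v: 2 - v[0] - v[1]}]  # 2 - x - y >= 0
bounds = [(0, None), (0, None)]

res = minimize(objective, x0=[0.0, 0.0], method="SLSQP",
               bounds=bounds, constraints=constraints)
print(res.x, res.fun)   # the optimum (0.5, 1.5) lies on the constraint x + y = 2
```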
Lagrange Multipliers
Handling constraints in optimization
• A method to solve constrained optimization problems.
• Introduces auxiliary variables (multipliers) to incorporate constraints.
• Transforms the problem into an unconstrained optimization problem.
• Key equation: ∇f(x) + λ∇g(x) = 0 together with g(x) = 0, i.e. stationarity of the Lagrangian L(x, λ) = f(x) + λ g(x) (a worked example follows this section).
• Used in physics, engineering, and machine learning.
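A short worked example (an assumed problem, solved symbolically with SymPy by forming the Lagrangian and solving the stationarity conditions):

```python
import sympy as sp

x, y, lam = sp.symbols("x y lam", real=True)

f = x**2 + y**2              # objective: squared distance from the origin
g = x + y - 1                # constraint: x + y = 1

L = f + lam * g              # Lagrangian L(x, y, lam) = f + lam * g
stationarity = [sp.diff(L, v) for v in (x, y, lam)]
sol = sp.solve(stationarity, (x, y, lam), dict=True)
print(sol)   # [{x: 1/2, y: 1/2, lam: -1}]
```

The solution x = y = 1/2 is the point on the line x + y = 1 closest to the origin, as expected.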
Games and Duality
Optimization and strategic decision-making
• Game theory: Models competition and cooperation in optimization.
• Nash Equilibrium: A key concept in non-cooperative games.
• Duality: Provides an alternative formulation of an optimization problem.
• Strong and weak duality theorems help in solving complex problems (a small LP illustration follows this section).
• Applications: Economics, finance, network design, and AI.
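To make duality concrete, the following sketch solves a small assumed LP and its dual with SciPy; at the optimum the two objective values coincide, which is what the strong duality theorem asserts for feasible LPs:

```python
import numpy as np
from scipy.optimize import linprog

# Assumed primal LP:  min 2*x1 + 3*x2  s.t.  x1 + x2 >= 3,  x1 + 2*x2 >= 4,  x >= 0
c = np.array([2.0, 3.0])
A = np.array([[1.0, 1.0], [1.0, 2.0]])
b = np.array([3.0, 4.0])

primal = linprog(c, A_ub=-A, b_ub=-b, bounds=[(0, None)] * 2)

# Dual LP:  max b^T y  s.t.  A^T y <= c,  y >= 0  (solved as minimization of -b^T y)
dual = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(0, None)] * 2)

print(primal.fun, -dual.fun)   # both equal 7.0 here: strong duality holds
```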
Exercises
Practice problems and case studies

• Solve unconstrained and constrained optimization problems.


• Apply Lagrange multipliers to real-world scenarios.
• Analyze game theory models and duality principles.
• Implement numerical optimization techniques.
• Test optimization algorithms in engineering and economics.
Practice Example
• Determine the terminal velocity of a free-falling body near the earth's surface.
• If the parachutist is initially at rest (v = 0 at t = 0), calculus can be used to solve the governing equation analytically.
• Problem statement: A parachutist of mass 68.1 kg jumps out of a stationary hot air balloon. Use Eq. (1.10) to compute the velocity prior to opening the chute. The drag coefficient is equal to 12.5 kg/s.
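Eq. (1.10) itself is not reproduced in these notes. Assuming it is the usual analytical solution of the linear-drag model dv/dt = g − (c/m)v with v(0) = 0, namely v(t) = (g m / c)(1 − e^(−(c/m)t)), the velocity can be evaluated directly; the 10 s evaluation time below is an illustrative assumption:

```python
import math

g, m, c = 9.81, 68.1, 12.5          # gravity (m/s^2), mass (kg), drag coefficient (kg/s)

def velocity(t):
    """Analytical solution of dv/dt = g - (c/m) v with v(0) = 0 (assumed Eq. 1.10 form)."""
    return g * m / c * (1.0 - math.exp(-(c / m) * t))

print(velocity(10.0))               # ~44.9 m/s after 10 s of free fall
print(g * m / c)                    # terminal velocity ~53.4 m/s
```

The terminal velocity g m / c ≈ 53.4 m/s is the value approached as t grows large.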
Assignment problem (to work out a solution by the next session)
• You are an engineer working for an agency planning to airlift supplies to refugees in a war zone. The supplies will be dropped at low altitude (500 m) so that the drop is not detected and the supplies fall as close as possible to the refugee camp. The chutes open immediately upon leaving the plane. To reduce damage, the vertical velocity on impact must be below a critical value of vc = 20 m/s.
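One possible line of attack (a sketch only, not the assigned solution): with the same linear-drag model as the practice example, integrating v(t) gives the distance fallen, and a root finder can estimate the smallest drag coefficient c that keeps the impact velocity from a 500 m drop below vc = 20 m/s. The package mass of 50 kg and the bracketing intervals are assumptions for illustration:

```python
import math
from scipy.optimize import brentq

g, m, drop, v_crit = 9.81, 50.0, 500.0, 20.0   # mass is assumed; drop height and v_crit from the problem

def v(t, c):                                    # velocity under linear drag, v(0) = 0
    return g * m / c * (1.0 - math.exp(-(c / m) * t))

def z(t, c):                                    # distance fallen, z(0) = 0
    return g * m / c * t - g * m**2 / c**2 * (1.0 - math.exp(-(c / m) * t))

def impact_velocity(c):
    t_impact = brentq(lambda t: z(t, c) - drop, 1e-6, 1e3)   # time to fall 500 m
    return v(t_impact, c)

# Smallest drag coefficient for which the impact velocity just equals v_crit
c_min = brentq(lambda c: impact_velocity(c) - v_crit, 1.0, 200.0)
print(c_min, impact_velocity(c_min))
```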
Week 2
Lecture 2

CLASSIFICATION OF OPTIMIZATION PROBLEMS
Classification Based on the Existence of Constraints
• Any optimization problem can be classified as constrained or unconstrained, depending on whether constraints exist
in the problem.
• In this category, the problem is to find values for a set of design parameters that make some prescribed function of these parameters a minimum, subject to certain constraints.
• For example, the problem of minimum-weight design of a prismatic beam shown in the figure, subject to a limitation on the maximum deflection, can be stated as follows:
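The equation itself is not reproduced in this extract; for a prismatic beam of rectangular cross-section b × d and length l, the usual statement of this example (an assumed reconstruction) is

$$
\begin{aligned}
\text{Find } X = \{b, d\} \ \text{which minimizes} \quad & f(X) = \rho\, l\, b\, d \\
\text{subject to} \quad & \delta_{\text{tip}}(b, d) \le \delta_{\max}, \qquad b \ge 0, \quad d \ge 0,
\end{aligned}
$$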

where 𝜌 is the density and 𝛿tip is the tip deflection of the beam.
Such problems are called parameter or static optimization problems.
Classification Based on the Nature of the Design Variables
• In the second category of problems, the objective is to find a set of design parameters, which are all continuous
functions of some other parameter, that minimizes an objective function subject to a set of constraints.
• If the cross-sectional dimensions of the rectangular beam are allowed to vary along its length as shown in Figure, the
optimization problem can be stated as:

subject to the constraints,

Here the design variables are functions of the length parameter t.
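The objective and constraints are likewise not reproduced; a plausible reconstruction, assuming a rectangular cross-section b(t) × d(t) along the span 0 ≤ t ≤ l, is

$$
\begin{aligned}
\text{Find } b(t), d(t) \ \text{which minimize} \quad & f = \rho \int_0^{l} b(t)\, d(t)\, \mathrm{d}t \\
\text{subject to} \quad & \delta_{\text{tip}}\big(b(t), d(t)\big) \le \delta_{\max}, \qquad b(t) \ge 0, \quad d(t) \ge 0 .
\end{aligned}
$$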


This type of problem, where each design variable is a function of one or more parameters, is known as a trajectory or
dynamic optimization problem.
Classification Based on the Nature of the Design Variables
• A rocket is designed to travel a distance of 12s (s being a fixed length increment) in a vertically upward direction. The thrust of the rocket can be changed only at the discrete points located at distances of 0, s, 2s, 3s, ..., 12s. If the maximum thrust that can be developed at point i, in either the positive or negative direction, is restricted to a value of Fi, formulate the problem of minimizing the total time of travel under the following assumptions:
1. The rocket travels against the gravitational force.
2. The mass of the rocket reduces in proportion to the distance traveled.
3. The air resistance is proportional to the velocity of the rocket.
Classification Based on the Physical Structure of the Problem
• Depending on the physical structure of the problem, optimization problems can be classified as optimal control and
nonoptimal control problems.
• An optimal control (OC) problem is a mathematical programming problem involving a number of stages, where each
stage evolves from the preceding stage in a prescribed manner.
• It is described by two types of variables: the control (design) and the state variables.
• The control variables define the system and govern the evolution of the system from one stage to the next,
• and the state variables describe the behavior or status of the system in any stage.
• The problem is to find a set of control or design variables such that the total objective function (also known as the
performance index, PI) over all the stages is minimized subject to a set of constraints on the control and state variables.

• An OC problem can be stated as shown below.
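The full statement is not reproduced in this extract; the standard form of a multistage OC problem with l stages, control variables x_i, and state variables y_i (an assumed reconstruction following the usual textbook notation) is

$$
\begin{aligned}
\text{Find } X \ \text{which minimizes} \quad & f(X) = \sum_{i=1}^{l} f_i(x_i, y_i) \\
\text{subject to} \quad & q_i(x_i, y_i) + y_i = y_{i+1}, \quad i = 1, 2, \dots, l, \\
& g_j(x_j) \le 0, \quad j = 1, 2, \dots, l, \\
& h_k(y_k) \le 0, \quad k = 1, 2, \dots, l,
\end{aligned}
$$

where q_i describes the evolution of the state from stage i to stage i + 1.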
Classification Based on the Nature of the Equations Involved
• Optimization problems can be classified as linear, nonlinear, geometric, and quadratic programming problems.
• This classification is extremely useful from the computational point of view, since many special methods are available for the efficient solution of a particular class of problems.
• This will, in many cases, dictate the types of solution procedures to be adopted in solving the problem.

• Nonlinear Programming Problem - If any of the functions among the objective and constraint functions in Eq. (1.1)
is nonlinear, the problem is called an NLP problem. This is the most general programming problem and all other
problems can be considered as special cases of the NLP problem.
• Geometric Programming Problem - A GMP problem is one in which the objective function and constraints are expressed as posynomials in X.
• A function h(X) is called a posynomial if h can be expressed as a sum of power terms, each of the form shown below.
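Reconstructing the standard definition (the expression is not reproduced in this extract), each power term has the form

$$
c_i\, x_1^{a_{i1}} x_2^{a_{i2}} \cdots x_n^{a_{in}}, \qquad c_i > 0, \ x_j > 0,
$$

so that h(X) is a sum of such terms with positive coefficients.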

Classification Based on the Nature of the Equations Involved
• Quadratic Programming Problem - A quadratic programming problem is an NLP problem with a quadratic objective function and linear constraints.

• Linear Programming Problem - If the objective function and all the constraints are linear functions of the design variables, the mathematical programming problem is called a linear programming (LP) problem.
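In conventional matrix notation (a standard form assumed here, not copied from the slides), the two classes read

$$
\text{QP:} \quad \min_{X} \ \tfrac{1}{2} X^{T} Q X + c^{T} X \ \ \text{s.t.} \ \ A X \le b, \qquad\quad
\text{LP:} \quad \min_{X} \ c^{T} X \ \ \text{s.t.} \ \ A X \le b, \ X \ge 0 .
$$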
A manufacturing firm produces two products, A and B, using two limited resources. The maximum amounts of
resources 1 and 2 available per day are 1000 and 250 units, respectively. The production of 1 unit of product A
requires 1 unit of resource 1 and 0.2 unit of resource 2, and the production of 1 unit of product B requires 0.5 unit of
resource 1 and 0.5 unit of resource 2. The unit costs of resources 1 and 2 are given by the relations (0.375 -
0.00005u1) and (0.75 - 0.0001u2), respectively, where
ui denotes the number of units of resource i used (i = 1, 2). The selling prices per unit
of products A and B, pA and pB, are given by

pA = 2.00 - 0.0005xA - 0.00015xB


pB = 3.50 - 0.0002xA - 0.0015xB

where xA and xB indicate, respectively, the number of units of products A and B sold. Formulate the problem of
maximizing the profit assuming that the firm can sell all the units it manufactures.
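A sketch of one way to set this up numerically (the functions below simply encode the relations stated in the problem; SciPy's SLSQP is an assumed solver choice, and note that because prices and unit resource costs depend on the quantities, the resulting problem is quadratic rather than purely linear):

```python
from scipy.optimize import minimize

def profit(x):
    xA, xB = x
    u1 = 1.0 * xA + 0.5 * xB            # units of resource 1 used
    u2 = 0.2 * xA + 0.5 * xB            # units of resource 2 used
    pA = 2.00 - 0.0005 * xA - 0.00015 * xB
    pB = 3.50 - 0.0002 * xA - 0.0015 * xB
    revenue = pA * xA + pB * xB
    cost = (0.375 - 0.00005 * u1) * u1 + (0.75 - 0.0001 * u2) * u2
    return revenue - cost

constraints = [
    {"type": "ineq", "fun": lambda x: 1000 - (1.0 * x[0] + 0.5 * x[1])},  # resource 1 limit
    {"type": "ineq", "fun": lambda x: 250 - (0.2 * x[0] + 0.5 * x[1])},   # resource 2 limit
]

res = minimize(lambda x: -profit(x), x0=[100.0, 100.0],
               bounds=[(0, None), (0, None)], constraints=constraints, method="SLSQP")
print(res.x, -res.fun)   # production quantities and the corresponding maximum profit
```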
Classification Based on the Permissible Values of the Design Variables
• Depending on the values permitted for the design variables, optimization problems can be classified as integer and
real-valued programming problems.
• Integer Programming Problem - If some or all of the design variables x1, x2, . . . , xn of an optimization problem are restricted to take on only integer (or discrete) values, the problem is called an integer programming problem (a small illustration follows this section).
• On the other hand, if all the design variables are permitted to take any real value, the optimization problem is called a real-valued programming problem.
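A small assumed illustration of the integer case (a 0/1 knapsack), solved with SciPy's mixed-integer linear programming routine, available in SciPy 1.9 and later:

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Assumed toy problem: pick items (0/1 decisions) to maximize value under a weight limit
values = np.array([10.0, 13.0, 7.0, 4.0])
weights = np.array([5.0, 7.0, 3.0, 2.0])
capacity = 10.0

res = milp(
    c=-values,                                                      # milp minimizes, so negate to maximize
    constraints=LinearConstraint(weights.reshape(1, -1), ub=capacity),  # total weight <= capacity
    integrality=np.ones_like(values),                               # every variable must be an integer
    bounds=Bounds(0, 1),                                            # restricted here to 0 or 1
)
print(res.x, -res.fun)   # optimal selection here is items 0, 2, 3 with total value 21
```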
Classification Based on the Deterministic Nature of the Variables
• Stochastic Programming Problem - A stochastic programming problem is an optimization problem in which some or all of the parameters (design variables and/or preassigned parameters) are probabilistic (nondeterministic or stochastic).
• Deterministic Programming Problem - A deterministic programming problem is one in which, for a given set of input data, the solution is always the same: the algorithm produces a predictable, unique output every time it runs, with no element of randomness, so the outcome is completely determined by the inputs and the defined set of operations.
Classification Based on the Separability of the Functions
• Optimization problems can be classified as separable and non-separable programming problems based on the
separability of the objective and constraint functions.
• Separable Programming Problem
– A function f(X) is said to be separable if it can be expressed as the sum of n single-variable functions f1(x1), f2(x2), . . . , fn(xn).
– A separable programming problem is one in which the objective function and the constraints are separable.
• A retail store stocks and sells three different models of TV sets. The store cannot afford to have an inventory worth more than $45,000 at any time. The TV sets are ordered in lots. It costs $aj for the store whenever a lot of TV model j is ordered. The cost of one TV set of model j is cj. The demand rate of TV model j is dj units per year. The rate at which the inventory costs accumulate is known to be proportional to the investment in inventory at any time, with qj = 0.5 denoting the constant of proportionality for TV model j. Each TV set occupies an area of sj = 0.40 m2, and the maximum storage space available is 90 m2. The data known from past experience are given below.

• Formulate the problem of minimizing the average annual cost of ordering and storing the TV sets.
Classification Based on the Number of Objective Functions
• Depending on the number of objective functions to be minimized, optimization problems can be classified as single-
and multi-objective programming problems.
• Multi-objective Programming Problem - A multi-objective programming problem can be stated as follows:

Find X which minimizes f1(X), f2(X), . . . , fk(X)

subject to

gj(X) ≤ 0, j = 1, 2, . . . , m

where f1, f2, . . . , fk denote the objective functions to be minimized simultaneously.
