RM 6 Optimization

This document provides an overview of various research methodology optimization methods. It begins with an introduction to optimization, defining key terms like design variables, objective functions, and constraints. It then classifies optimization methods based on different criteria like constraints, design variable types, and problem structure. Specific methods covered include sequential uniform sampling, Monte Carlo simulation, the simplex method, and gradient optimization. Examples of applying optimization in areas like engineering and manufacturing are also provided.

Uploaded by meyu721

Research Methodology

OPTIMIZATION
METHODS
Outline

Introduction
Classification of optimization methods
Sequential uniform sampling
Monte Carlo Method
Simplex Method
Gradient Optimization Method
Examples
Introduction

Optimization is the process of obtaining the best result under given circumstances.
It can be defined as the process of finding the conditions that give the maximum or minimum of a function.
Optimum-seeking methods are also known as mathematical programming techniques and are generally studied as part of operations research or numerical methods.
The eventual intention is to obtain the best possible solution to a problem mathematically, one that improves or optimizes the performance of the system.
Design variables:
The formulation of an optimization problem begins with
identifying the underlying design variables, which are
primarily varied during the optimization process.
A design problem usually involves many design
parameters, of which some are highly sensitive to the
proper working of the design. These parameters are
called design variables in optimization procedures.
Other (not so important) design parameters usually
remain fixed or vary in relation to the design variables.
Introduction

Objective Function (also called Cost function)
The conventional design procedures aim at finding an
acceptable or adequate design which merely satisfies the
functional and other requirements of the problem.
In general, there will be more than one acceptable
design, and the purpose of optimization is to choose the
best one of the many acceptable designs available.
Thus a criterion has to be chosen for comparing the
different alternative acceptable designs and for selecting
the best one.
The criterion with respect to which the design is
optimized, when expressed as a function of the design
variables, is known as the objective function.
Introduction

In mechanical/electrical engineering, the maximization of
the efficiency may be the obvious choice of an objective
function.
In aerospace structural design problems, the objective
function for minimization may be generally taken as
weight.
In civil engineering, the objective may be taken as the
minimization of the cost.
In some situations, there may be more than one criterion to
be satisfied simultaneously. In such cases the optimization
may be simplified by combining them using weighting
factors.
Introduction

 Constraints
 The constraints represent functional relationships among the design variables and other design parameters that satisfy certain physical phenomena and certain resource limitations. The nature and number of constraints to be included in the formulation depend on the user. Constraints may or may not have exact mathematical expressions.
 For example, maximum stress is a constraint of a structure. If a structure has a regular shape, there is an exact mathematical relation between the maximum stress and the dimensions. But in the case of an irregular shape, finite element simulation software may be necessary to compute the maximum stress.
 The following two types of constraints emerge from most
considerations:
 Inequality type constraints.
 Equality type constraints.
Introduction

Variable Ranges:
The final task of the formulation procedure is to set the minimum and maximum bounds, or ranges, on each design variable. Certain optimization algorithms do not require this information; in those problems, the constraints completely surround the feasible region. Other algorithms confine their search within these bounds.
Classification of Optimization Methods

Classification based on:
 Constraints
 Constrained optimization problem
 Unconstrained optimization problem

 Nature of the design variables
 Static optimization problems
 Dynamic optimization problems
Classification of Optimization Methods

Classification based on:
 Physical structure of the problem
 Optimal control problems
 Non-optimal control problems

 Nature of the equations involved
 Linear programming problem
 Nonlinear programming problem
 Geometric programming problem
 Quadratic programming problem
Classification of Optimization Methods

Classification based on:
 Permissible values of the design variables
 Integer programming problems
 Real valued programming problems

 Deterministic nature of the variables
 Stochastic programming problem
 Deterministic programming problem
Classification of Optimization Methods

Classification based on:

 Separability of the functions
 Separable programming problems
 Non-separable programming problems

 Number of the objective functions
 Single objective programming problem
 Multiobjective programming problem
Sequential Uniform Sampling

Let us consider an objective function or Cost
function C to be optimized with respect to two
variables L and ω.
The most obvious approach to optimization is to test
all possible values of L and ω.
It will be necessary to choose the range limits for L
and ω, a relatively small increment for both L and ω,
and sequentially step through all possible values of
both parameters in the solution space range. This
technique is referred to as sequential uniform
sampling.
Sequential Uniform Sampling

The flow chart in the figure outlines the sequential uniform sampling algorithm for two-parameter optimization.
The two parameters to be optimized are L and ω, and C is the cost function.
The optimum values of L and ω correspond to the minimum value of the cost function.
Sequential Uniform Sampling

The first attempt should use a coarse grid with large incremental steps in each of the parameters. Once a possible optimized parameter range has been obtained, a smaller range with smaller incremental steps can be used to improve the resolution of the solution.
A flaw in this approach is that 'local minimum' values of the cost function might not be detected and investigated. It is possible that the best minimum of the cost function will be missed if the initial incremental values are set too large for all parameters.
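As a sketch, sequential uniform sampling of the two parameters can be implemented as a nested loop over a uniform grid. The quadratic cost function below is an illustrative assumption (with its minimum placed at L = 2, ω = 3), not the one from the figures; ω is written as w in the code.

```python
def cost(L, w):
    # Hypothetical cost surface; its minimum is at L = 2, w = 3.
    return (L - 2.0) ** 2 + (w - 3.0) ** 2

def sequential_uniform_sampling(L_range, w_range, dL, dw):
    # Step through every (L, w) pair on the grid and keep the
    # pair giving the lowest cost C.
    best = (None, None, float("inf"))
    L = L_range[0]
    while L <= L_range[1]:
        w = w_range[0]
        while w <= w_range[1]:
            c = cost(L, w)
            if c < best[2]:
                best = (L, w, c)
            w += dw
        L += dL
    return best

# Coarse grid first; a refined grid around the coarse optimum
# would then improve the resolution, as described above.
L_opt, w_opt, c_min = sequential_uniform_sampling((0, 5), (0, 5), 0.5, 0.5)
print(L_opt, w_opt, c_min)   # → 2.0 3.0 0.0
```

Note that a grid step of 0.5 happens to land exactly on this optimum; in general the answer is only accurate to within one grid step, which is exactly the resolution limitation the slide describes.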
Monte Carlo Method

The Monte Carlo optimization
technique is based on random
selection of parameter values
within the defined range.
The routine checks C for the
randomly chosen values of each
parameter until the predefined
total number of samples has been
completed. The optimum values
correspond to the minimum value
of the cost function.
Monte Carlo Method

The Monte Carlo method is an unguided
optimization technique.
If the search range for L and ω is restricted to
ranges closer to the correct value, then the
probability of gaining useful results increases
significantly. This is only possible if an approximate
solution is known.
In this case the Monte Carlo method is used to
improve the accuracy of the solution.
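A minimal sketch of the Monte Carlo search follows, again with a hypothetical quadratic cost surface standing in for C (an assumption, minimum at L = 2, ω = 3); a fixed random seed makes the run repeatable.

```python
import random

def cost(L, w):
    # Hypothetical cost surface; its minimum is at L = 2, w = 3.
    return (L - 2.0) ** 2 + (w - 3.0) ** 2

def monte_carlo_search(L_range, w_range, n_samples, seed=1):
    rng = random.Random(seed)          # fixed seed for repeatability
    best_L, best_w, best_c = None, None, float("inf")
    for _ in range(n_samples):
        L = rng.uniform(*L_range)      # random point in the defined range
        w = rng.uniform(*w_range)
        c = cost(L, w)
        if c < best_c:                 # keep the lowest C seen so far
            best_L, best_w, best_c = L, w, c
    return best_L, best_w, best_c

# Unguided search over the full range; restricting the range near an
# approximate solution raises the probability of a useful result.
L_opt, w_opt, c_min = monte_carlo_search((0, 5), (0, 5), 10000)
```

With 10,000 samples the best point lands close to the true optimum with high probability, but unlike the grid search there is no guaranteed resolution.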
Simplex Method

This optimization method employs a directed strategy to determine the minimum value of the cost function.
The start point for L and ω in the range is selected randomly. The cost function is then calculated for two adjacent points: the first a δω step in the ω direction from the start point, and the second a δL step in the L direction.
These three adjacent points are called a simplex. In a three-dimensional analysis the simplex has four points, and in an N-dimensional problem the simplex has N + 1 points.
Simplex Method

The path towards the optimal solution is guided by
comparing the cost functions for each of the three
adjacent points in the simplex.
From this information, a new point is selected by
stepping in the direction of the minimum C value
(away from the position of maximum C in the
simplex).
This new point is added to the simplex and the point
with the highest C value is eliminated. This forms a
new three point simplex. The process is repeated
until no reduction in C is observed.
Simplex Method

The simplex algorithm flow chart. The start point (L_i,j, ω_i,j) is selected randomly, and the next two points in the simplex are determined.
The cost function is calculated for all three points, and the point with the maximum cost function is replaced. The process continues until a minimum value of the cost function C is determined.
Gradient Optimization Method

This method is similar to the simplex method, but
uses a slightly more direct route to the solution by
placing more emphasis on the relative values of the
three points of the cost function in the simplex.
The slope of the path through the solution space is calculated using the gradient vector ∇C = (∂C/∂L, ∂C/∂ω).
Gradient Optimization Method

The gradient optimization algorithm.
Using a randomly selected start
position, the cost function for all
points on the simplex is calculated.
The gradient from these points
defines the direction of the path
through the solution space.
When the cost function no longer
decreases, a new simplex is defined
and the process is repeated until no
further decrease in C is possible.
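A sketch of the gradient-guided descent follows, estimating the gradient vector ∇C by finite differences. The cost function, step size, and tolerances are illustrative assumptions.

```python
def cost(L, w):
    # Hypothetical cost surface; its minimum is at L = 2, w = 3.
    return (L - 2.0) ** 2 + (w - 3.0) ** 2

def gradient_search(start, step=0.1, h=1e-6, tol=1e-9, max_iter=1000):
    L, w = start
    for _ in range(max_iter):
        c = cost(L, w)
        # Finite-difference estimate of the gradient vector (dC/dL, dC/dw).
        gL = (cost(L + h, w) - c) / h
        gw = (cost(L, w + h) - c) / h
        # Step down the slope of the solution space.
        L_new, w_new = L - step * gL, w - step * gw
        if cost(L_new, w_new) >= c - tol:
            break                  # cost function no longer decreases
        L, w = L_new, w_new
    return L, w

L_opt, w_opt = gradient_search((0.0, 0.0))
```

On this smooth quadratic surface the descent converges rapidly to (2, 3); on surfaces with several minima it shares the simplex method's risk of stopping in a local minimum.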
Other Methods

Other methods that you may refer to:

Quadratic Method
Newton Method
Quasi-Newton Method
Optimization Solver in Excel
Optimization Formulation Case
Optimization Solver in Excel

PANEL MATERIAL OPTIMIZATION
Panel Manufacture
There are four panel options, which require different inputs of glue, pressing time, pine chips, and oak chips.
The profit per pallet depends on the panel type.
The available resources are limited.
The task is to find the number of pallets of each panel type that maximizes the total profit subject to the constraints of the available resources.
Panel type            T     P     S     A
Pallets               0     0     0     0
Profit per pallet   450  1150   800   400   Total profit: 0

Resource need per pallet               Used   Available
Glue            50    50   100    50      0   5800 quarts
Pressing         5    15    10     5      0   730 hours
Pine chips     500   400   300   200      0   29200 pounds
Oak chips      500   750   250   500      0   60500 pounds
Result

Panel type            T     P     S     A
Pallets              23    15    39     0
Profit per pallet   450  1150   800   400   Total profit: 58800

Resource need per pallet               Used   Available
Glue            50    50   100    50   5800   5800 quarts
Pressing         5    15    10     5    730   730 hours
Pine chips     500   400   300   200  29200   29200 pounds
Oak chips      500   750   250   500  32500   60500 pounds
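The optimal plan in the result table can be checked arithmetically. The short script below (plain Python, not the Excel Solver itself) recomputes the profit and the resource usage from the table's data:

```python
# Data taken from the panel optimization tables above.
profit_per_pallet = [450, 1150, 800, 400]   # panel types T, P, S, A
pallets = [23, 15, 39, 0]                   # Solver result

# Resource needs per pallet and the available amounts.
resources = {
    "Glue (quarts)":       ([50, 50, 100, 50], 5800),
    "Pressing (hours)":    ([5, 15, 10, 5], 730),
    "Pine chips (pounds)": ([500, 400, 300, 200], 29200),
    "Oak chips (pounds)":  ([500, 750, 250, 500], 60500),
}

total_profit = sum(p * n for p, n in zip(profit_per_pallet, pallets))
print("Total profit:", total_profit)        # Total profit: 58800

for name, (need, available) in resources.items():
    used = sum(r * n for r, n in zip(need, pallets))
    assert used <= available                # each constraint is respected
    print(f"{name}: used {used} of {available}")
```

Glue, pressing time, and pine chips are used to their limits (binding constraints), while oak chips have 28,000 pounds of slack, which is why no amount of extra oak would raise the profit.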
