Lecture 3 - Optimization PDF

This document summarizes optimization techniques for finding the optimum values of variables in single-variable and multi-variable functions. It introduces the golden-section search and parabolic interpolation methods for locating the optimum of a single-variable function, as well as direct and gradient methods for multi-variable functions. Examples are provided to illustrate the application of golden-section search and parabolic interpolation to optimize specific functions. MATLAB code for implementing golden-section search is also demonstrated.

Uploaded by Haziq Khairi

Lecture 3: Optimization

Dr. Nor Alafiza Yunus


Faculty of Chemical and Energy Engineering
Universiti Teknologi Malaysia, 81310 UTM
Johor Bahru, Malaysia
 Locate the optimum of a single-variable function
using the golden-section search and parabolic
interpolation methods.
 Locate the optimum of a function of several
variables using direct and gradient methods.
 Recognize the trade-offs among these
approaches, with particular attention to initial
guesses and convergence.

 Introduction to optimization
 One-Dimensional Unconstrained
 Golden-Section Search
 Parabolic Interpolation

 Multidimensional Unconstrained
 Direct Methods
 Gradient Methods

 MATLAB program
 Optimization is the process of creating something that is
as effective as possible.
 Engineers are always confronting optimization problems
that attempt to balance performance and limitations.
 From a mathematical perspective, optimization deals
with finding the maxima and minima of a function that
depends on one or more variables.
 The goal is to determine the values of the variables that
yield maxima or minima for the function.

 One-dimensional problems involve functions that depend
on a single independent variable, for example f(x).
 Multidimensional problems involve functions that depend
on two or more independent variables, for example f(x, y).
 A global optimum represents the very best
solution, while a local optimum is better than its
immediate neighbours. Cases that include local
optima are called multimodal.
 We generally desire to find the global optimum.
 Introduction to optimization
 One-Dimensional Unconstrained
 Golden-Section Search
 Parabolic Interpolation

 Multidimensional Unconstrained
 Direct Methods
 Gradient Methods

 MATLAB program
 A search algorithm for finding a minimum on
an interval [xl, xu] that contains a single minimum
(a unimodal interval).
 Uses the golden ratio Φ to determine the
locations of two interior points x1 and x2; because of
the golden ratio, one of the interior points can be
reused in the next iteration.

Φ = (1 + √5)/2 = 1.618033988
To determine the minimum of a one-dimensional function:

x1 = xl + d
x2 = xu − d

where d = (Φ − 1)(xu − xl)

 If f(x1) < f(x2), the minimum lies in [x2, xu]:
 x2 becomes the new lower limit
 x1 becomes the new x2 (as in fig. (a)).
 If f(x2) < f(x1), the minimum lies in [xl, x1]:
 x1 becomes the new upper limit
 x2 becomes the new x1.
 In either case, only one new interior point is
needed, and the function is evaluated only one
more time.
 Finding minimum
 If f(x1)<f(x2), x2 => xl and x1 => x2
 If f(x2)<f(x1), x1 => xu and x2 => x1

 Finding maximum
 If f(x1)>f(x2), x2 => xl and x1 => x2
 If f(x2)>f(x1), x1 => xu and x2 => x1

 Relative approximate error, εa:

εa = (2 − Φ) |(xu − xl)/xopt| × 100%
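As a sketch of how the update rules and error estimate above fit together, here is an illustrative Python translation. The lecture's own implementation is the MATLAB goldmin shown later; this version, its name, and its argument defaults are assumptions for illustration only.

```python
import math

PHI = (1 + math.sqrt(5)) / 2  # golden ratio, ~1.6180339887

def goldmin(f, xl, xu, es=1e-2, maxit=50):
    """Golden-section search for a minimum of f on [xl, xu].
    es is the stopping tolerance on the relative error ea (in percent)."""
    d = (PHI - 1) * (xu - xl)
    x1, x2 = xl + d, xu - d
    f1, f2 = f(x1), f(x2)
    ea = 100.0
    for it in range(1, maxit + 1):
        xint = xu - xl  # interval width at the start of this iteration
        if f1 < f2:
            # minimum lies in [x2, xu]: x2 -> new xl, x1 -> new x2
            xopt = x1
            xl, x2, f2 = x2, x1, f1
            x1 = xl + (PHI - 1) * (xu - xl)
            f1 = f(x1)  # only one new function evaluation per iteration
        else:
            # minimum lies in [xl, x1]: x1 -> new xu, x2 -> new x1
            xopt = x2
            xu, x1, f1 = x1, x2, f2
            x2 = xu - (PHI - 1) * (xu - xl)
            f2 = f(x2)
        if xopt != 0:  # guard the error estimate against division by zero
            ea = (2 - PHI) * abs(xint / xopt) * 100
        if ea <= es:
            break
    return xopt, f(xopt), ea, it
```

For f(x) = x²/10 − 2 sin x on [0, 4], this converges to x ≈ 1.4276, matching the worked example that follows.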
 Use the golden-section search to find the
minimum of

f(x) = x²/10 − 2 sin x

within the interval from xl = 0 to xu = 4.
 Iteration 1
d = (Φ − 1)(xu − xl) = 0.6180(4 − 0) = 2.4721
x1 = xl + d = 0 + 2.4721 = 2.4721
x2 = xu − d = 4 − 2.4721 = 1.5279
f(x1) = 2.4721²/10 − 2 sin(2.4721) = −0.6300
f(x2) = 1.5279²/10 − 2 sin(1.5279) = −1.7647 (minimum)

Since f(x2) < f(x1): x1 => xu, x2 => x1

xopt = x2 = 1.5279
εa = (2 − 1.6180) × |(4 − 0)/1.5279| × 100% = 100.0000%
 Iteration 2
d = (Φ − 1)(xu − xl) = 0.6180(2.4721 − 0) = 1.5279
x1 = xl + d = 0 + 1.5279 = 1.5279
x2 = xu − d = 2.4721 − 1.5279 = 0.9443
f(x1) = 1.5279²/10 − 2 sin(1.5279) = −1.7647 (minimum)
f(x2) = 0.9443²/10 − 2 sin(0.9443) = −1.5310

Since f(x1) < f(x2): x2 => xl, x1 => x2

xopt = x1 = 1.5279
εa = (2 − 1.6180) × |(2.4721 − 0)/1.5279| × 100% = 61.8030%
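The hand calculations above can be checked numerically. This short Python sketch (an illustration, not part of the lecture) reproduces both iterations; small last-digit differences arise from rounded intermediate values on the slides.

```python
import math

phi = (1 + math.sqrt(5)) / 2
f = lambda x: x**2 / 10 - 2 * math.sin(x)

# Iteration 1 on [0, 4]
d = (phi - 1) * (4 - 0)                    # ~2.4721
x1, x2 = 0 + d, 4 - d                      # ~2.4721, ~1.5279
fa, fb = f(x1), f(x2)                      # ~-0.6300, ~-1.7647 -> f(x2) smaller
ea1 = (2 - phi) * abs((4 - 0) / x2) * 100  # 100.0000

# Iteration 2: x1 becomes the new xu, x2 the new x1
xu = x1
d = (phi - 1) * (xu - 0)                   # ~1.5279
x1b, x2b = 0 + d, xu - d                   # ~1.5279, ~0.9443
ea2 = (2 - phi) * abs(xu / x1b) * 100      # ~61.803
```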
i   xl      x2      x1      xu      f(x2)    f(x1)    d       ea
1   0.0000  1.5279  2.4721  4.0000  -1.7647  -0.6300  2.4721  100.0000
2   0.0000  0.9443  1.5279  2.4721  -1.5310  -1.7647  1.5278  61.8030
3   0.9443  1.5279  1.8885  2.4721  -1.7647  -1.5432  0.9443  38.1961
4   0.9443  1.3050  1.5279  1.8885  -1.7595  -1.7647  0.5836  23.6063
5   1.3050  1.5279  1.6656  1.8885  -1.7647  -1.7136  0.3607  13.3829
6   1.3050  1.4427  1.5279  1.6656  -1.7755  -1.7647  0.2229  9.5490
7   1.3050  1.3901  1.4427  1.5279  -1.7742  -1.7755  0.1378  5.9022
8   1.3901  1.4427  1.4753  1.5279  -1.7755  -1.7732  0.0851  3.6477
 Given the formula

y(t) = t² − 8t + 12

a) Determine the minimum and the
corresponding value of t for this function
analytically.
b) Verify the results in (a) using the golden-section
search based on initial t = 0 and t = 5.
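As a hedged sketch of how the two parts fit together (illustrative Python, not a provided solution): setting dy/dt = 2t − 8 = 0 gives t = 4 with y(4) = −4, and a bare-bones golden-section loop on [0, 5] converges to the same point. Note this simplified loop re-evaluates both interior points each pass; reusing one of them is the efficiency gain of the real method.

```python
import math

y = lambda t: t**2 - 8*t + 12

# Part (a): dy/dt = 2t - 8 = 0 at t = 4, where y(4) = -4.
# Part (b): a bare-bones golden-section loop on [0, 5].
phi = (1 + math.sqrt(5)) / 2
tl, tu = 0.0, 5.0
for _ in range(30):
    d = (phi - 1) * (tu - tl)
    t1, t2 = tl + d, tu - d   # interior points
    if y(t1) < y(t2):
        tl = t2               # minimum lies in [t2, tu]
    else:
        tu = t1               # minimum lies in [tl, t1]
t_min = (tl + tu) / 2         # ~4.0, agreeing with the analytic answer
```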
 Using the golden-section search, solve for the
value of x that maximizes f(x) in the problem

f(x) = −1.5x⁶ − 2x⁴ + 12x

 Employ initial guesses of xl = 0 and xu = 2, and
perform three iterations.
i xl x2 x1 xu f(x2) f(x1) d ea
1 0.0000 0.7639 1.2361 2.0000 8.1879 4.8144 1.2361 100.0000
2 0.0000 0.4721 0.7639 1.2361 5.5497 8.1879 0.7639 61.8030
3 0.4721 0.7639 0.9443 1.2361 8.1879 8.6779 0.4721 30.9019
4 0.7639 0.9443 1.0557 1.2361 8.6779 8.1074 0.2918 19.0980
5 0.7639 0.8754 0.9443 1.0557 8.6552 8.6779 0.1803 11.8031
6 0.8754 0.9443 0.9868 1.0557 8.6779 8.5599 0.1115 7.2947
7 0.8754 0.9180 0.9443 0.9868 8.6979 8.6779 0.0689 4.6375
8 0.8754 0.9017 0.9180 0.9443 8.6920 8.6979 0.0426 2.8658
9 0.9017 0.9180 0.9280 0.9443 8.6979 8.6947 0.0263 1.7711
10 0.9017 0.9117 0.9180 0.9280 8.6972 8.6979 0.0163 1.0946
11 0.9117 0.9180 0.9218 0.9280 8.6979 8.6973 0.0100 0.6765

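One way to reproduce the trend in this table in code: maximizing f is the same as minimizing −f, so a minimum-seeking golden-section loop applies unchanged. This is an illustrative Python sketch (again re-evaluating both interior points for brevity):

```python
import math

f = lambda x: -1.5 * x**6 - 2 * x**4 + 12 * x
g = lambda x: -f(x)           # maximizing f == minimizing -f

phi = (1 + math.sqrt(5)) / 2
xl, xu = 0.0, 2.0
for _ in range(40):
    d = (phi - 1) * (xu - xl)
    x1, x2 = xl + d, xu - d
    if g(x1) < g(x2):
        xl = x2               # maximum of f lies in [x2, xu]
    else:
        xu = x1               # maximum of f lies in [xl, x1]
x_max = (xl + xu) / 2         # ~0.9169, where f ~ 8.6979
```

This agrees with the table, whose best function value settles near f ≈ 8.6979.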
 Another algorithm uses parabolic interpolation of three
points to estimate the optimum location.
 The location of the maximum/minimum of the parabola
interpolating three points (x1, x2, and x3) is:

x4 = x2 − (1/2) · [(x2 − x1)²(f(x2) − f(x3)) − (x2 − x3)²(f(x2) − f(x1))]
              / [(x2 − x1)(f(x2) − f(x3)) − (x2 − x3)(f(x2) − f(x1))]

 The new point x4 and the two points
surrounding it (either x1 and x2,
or x2 and x3) are used for the
next iteration of the algorithm.
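The formula translates directly into code. A minimal Python sketch (the function name parabolic_step is my own, not from the lecture):

```python
import math

def parabolic_step(x1, x2, x3, f1, f2, f3):
    """Vertex x4 of the parabola through (x1, f1), (x2, f2), (x3, f3)."""
    num = (x2 - x1)**2 * (f2 - f3) - (x2 - x3)**2 * (f2 - f1)
    den = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * num / den

# For instance, for f(x) = x^2/10 - 2 sin x with points 0, 1, 4:
f = lambda x: x**2 / 10 - 2 * math.sin(x)
x4 = parabolic_step(0, 1, 4, f(0), f(1), f(4))   # ~1.5055
```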
 Use parabolic interpolation to find the
minimum of

f(x) = x²/10 − 2 sin x

with initial guesses x1 = 0, x2 = 1, and x3 = 4.
x4 = x2 − 0.5·A/B

εa = |(x4,new − x4,old)/x4,new| × 100%

where A and B are the numerator and denominator, respectively, of the
correction term in the parabolic-interpolation formula.

i  x1      x2      x3      A        B        x4      f(x1)    f(x2)    f(x3)    f(x4)    ea
1  0.0000  1.0000  4.0000  9.5499   -9.4454  1.5055  0.0000   -1.5829  3.1136   -1.7691  -
2  1.0000  1.5055  4.0000  -0.0896  -2.9327  1.4903  -1.5829  -1.7691  3.1136   -1.7714  1.0255
3  1.0000  1.4903  1.5055  -0.0005  -0.0040  1.4256  -1.5829  -1.7714  -1.7691  -1.7757  4.5325
4  1.0000  1.4256  1.4903  0.0000   -0.0143  1.4266  -1.5829  -1.7757  -1.7714  -1.7757  0.0677
5  1.4256  1.4266  1.4903  0.0000   0.0000   1.4275  -1.7757  -1.7757  -1.7714  -1.7757  0.0663
6  1.4266  1.4275  1.4903  0.0000   0.0000   1.4276  -1.7757  -1.7757  -1.7714  -1.7757  0.0002
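The table can be regenerated with a short loop. The bracket-update rule below (keep the three points that still bracket the minimum) is my reading of "the new point x4 and the two surrounding it"; this is an illustrative Python sketch under that assumption:

```python
import math

f = lambda x: x**2 / 10 - 2 * math.sin(x)

def vertex(x1, x2, x3, f1, f2, f3):
    # x4 = x2 - 0.5*A/B with A, B as in the table columns
    A = (x2 - x1)**2 * (f2 - f3) - (x2 - x3)**2 * (f2 - f1)
    B = (x2 - x1) * (f2 - f3) - (x2 - x3) * (f2 - f1)
    return x2 - 0.5 * A / B

x1, x2, x3 = 0.0, 1.0, 4.0
history = []
for _ in range(6):
    f1, f2, f3 = f(x1), f(x2), f(x3)
    x4 = vertex(x1, x2, x3, f1, f2, f3)
    history.append(x4)
    # keep the three points that bracket the minimum (assumed rule)
    if x4 > x2:
        x1, x2, x3 = (x2, x4, x3) if f(x4) < f2 else (x1, x2, x4)
    else:
        x1, x2, x3 = (x1, x4, x2) if f(x4) < f2 else (x4, x2, x3)
# history starts near 1.5055 and ends near 1.4276, as in the table
```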
Consider the following function:

f(x) = x⁴ + 2x³ + 8x² + 5x

Employ the following methods to find the
minimum of the function:
a) Golden-section search (xl = −2, xu = 1, εs = 1%)
b) Parabolic interpolation (x1 = −2, x2 = −1,
x3 = 1, iterations = 5)
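A quick way to check whatever either method gives you (illustrative Python; bisection on f′ is a different method from the two requested): f′(x) = 4x³ + 6x² + 16x + 5 is strictly increasing, since its derivative 12x² + 12x + 16 is always positive, so its single root on [−2, 1] can be bisected.

```python
f = lambda x: x**4 + 2*x**3 + 8*x**2 + 5*x
df = lambda x: 4*x**3 + 6*x**2 + 16*x + 5   # f'(x), strictly increasing

lo, hi = -2.0, 1.0                          # df(-2) < 0 < df(1)
for _ in range(60):
    mid = (lo + hi) / 2
    if df(mid) < 0:
        lo = mid                            # root is to the right
    else:
        hi = mid                            # root is to the left
x_min = (lo + hi) / 2                       # ~ -0.3473, f(x_min) ~ -0.8408
```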
 Introduction to optimization
 One-Dimensional Unconstrained
 Golden-Section Search
 Parabolic Interpolation

 Multidimensional Unconstrained
 Direct Methods
 Gradient Methods

 MATLAB program
% golden-section search
Line 1. Define the function
Line 2. Set the lower and upper bounds
Line 3. Call the goldmin function

Example of MATLAB commands:

>> fun=@(x) x^2/10-2*sin(x);
>> xl=0; xu=4;
>> [x,fx,ea,iter]=goldmin(fun,xl,xu)
 MATLAB has a built-in function, fminbnd, which
combines the golden-section search and parabolic
interpolation.

[xmin, fval] = fminbnd(function, x1, x2)

 where xmin and fval are the location and value of the
minimum, function is the name of the function being
evaluated, and x1 and x2 are the bounds of the
interval being searched.

>> f=@(x) x^2/10-2*sin(x);
>> x1=0; x2=4;
>> [xmin,fval]=fminbnd(f, x1, x2)
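For readers working in Python rather than MATLAB, SciPy's minimize_scalar with method='bounded' plays a similar role to fminbnd (a rough counterpart, not an exact port of the same algorithm):

```python
import math
from scipy.optimize import minimize_scalar

f = lambda x: x**2 / 10 - 2 * math.sin(x)
res = minimize_scalar(f, bounds=(0, 4), method="bounded")
# res.x ~ 1.4276, res.fun ~ -1.7757, matching the fminbnd example
```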
