MathEng5-M - Part 7-1
Optimization
A. Bracketing Method
1. Golden-Section Search
2. Quadratic Interpolation
B. Open Method
Newton’s Method
3. The function is evaluated at these two interior points. Two results can occur.
a. If f(x1) > f(x2), then the domain of x to the left of x2, from xl to x2, can be eliminated
because it does not contain the maximum. For this case, x2 becomes the new xl for the
next round.
b. If f(x2) > f(x1), then the domain of x to the right of x1, from x1 to xu, can be
eliminated because it does not contain the maximum. For this case, x1 becomes the new xu
for the next round.
4. Termination criteria
εa = (1 − R) |(xu − xl) / xopt| × 100%,  where R = (√5 − 1) / 2
I | xl | f(xl) | x2 | f(x2) | x1 | f(x1) | xu | f(xu) | d
1 | 0.00000000 | 0.00000000 | 0.76393202 | 8.18788519 | 1.23606798 | 4.81418201 | 2.00000000 | -104.00000000 | 1.23606798
… | … | … | … | … | … | … | … | … | …
26 | 0.91691033 | 8.69792982 | 0.91691489 | 8.6979298252 | 0.91691770 | 8.6979298250 | 0.91692225 | 8.69792982 | 0.00000737
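As a sketch of the procedure above, the following Python snippet (an illustrative implementation, not part of the original handout) applies golden-section search to f(x) = −1.5x^6 − 2x^4 + 12x on [0, 2]:

```python
import math

def golden_section_max(f, xl, xu, tol=1e-6, max_iter=100):
    """Golden-section search for a maximum of f on [xl, xu]."""
    R = (math.sqrt(5) - 1) / 2            # golden ratio conjugate, ~0.6180
    d = R * (xu - xl)
    x1, x2 = xl + d, xu - d               # two interior points
    f1, f2 = f(x1), f(x2)
    for _ in range(max_iter):
        if f1 > f2:                       # maximum lies in [x2, xu]
            xl, x2, f2 = x2, x1, f1
            x1 = xl + R * (xu - xl)
            f1 = f(x1)
        else:                             # maximum lies in [xl, x1]
            xu, x1, f1 = x1, x2, f2
            x2 = xu - R * (xu - xl)
            f2 = f(x2)
        xopt = x1 if f1 > f2 else x2
        ea = (1 - R) * abs((xu - xl) / xopt) * 100  # percent error estimate
        if ea < tol:
            break
    return xopt, f(xopt)

f = lambda x: -1.5 * x**6 - 2 * x**4 + 12 * x
x, fx = golden_section_max(f, 0.0, 2.0)
```

The first pass reproduces row 1 of the table (x1 = 1.23606798, x2 = 0.76393202), and the search converges to x ≈ 0.916915, f(x) ≈ 8.69792983.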
Steps:
1. Choose three initial guesses, x0, x1, and x2, that bracket the optimum of f(x).
2. Solve for x3:
x3 = [f(x0)(x1^2 − x2^2) + f(x1)(x2^2 − x0^2) + f(x2)(x0^2 − x1^2)] / [2f(x0)(x1 − x2) + 2f(x1)(x2 − x0) + 2f(x2)(x0 − x1)]
3. A strategy similar to golden-section search can be employed to determine
which point should be discarded.
I | x0 | f(x0) | x1 | f(x1) | x2 | f(x2) | x3 | f(x3)
1 | 0.00000000 | 0.00000000 | 1.00000000 | 8.50000000 | 2.00000000 | -104.00000000 | 0.57024793 | 6.57990854
… | … | … | … | … | … | … | … | …
9 | 0.91687266 | 8.69792978 | 0.91690077 | 8.69792982 | 1.00000000 | 8.50000000 | 0.91691250 | 8.69792983
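A single step of the x3 formula can be checked against row 1 of the table with a short Python sketch (illustrative, not from the handout):

```python
def quadratic_interp_step(f, x0, x1, x2):
    """One step of quadratic interpolation: the vertex of the parabola
    through (x0, f(x0)), (x1, f(x1)), (x2, f(x2))."""
    f0, f1, f2 = f(x0), f(x1), f(x2)
    num = (f0 * (x1**2 - x2**2)
           + f1 * (x2**2 - x0**2)
           + f2 * (x0**2 - x1**2))
    den = (2 * f0 * (x1 - x2)
           + 2 * f1 * (x2 - x0)
           + 2 * f2 * (x0 - x1))
    return num / den

f = lambda x: -1.5 * x**6 - 2 * x**4 + 12 * x
x3 = quadratic_interp_step(f, 0.0, 1.0, 2.0)  # first row of the table
```

With the guesses x0 = 0, x1 = 1, x2 = 2 this gives x3 = 0.57024793, matching iteration 1.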
Formula: xi+1 = xi − f′(xi) / f″(xi)
Problem:
Solve for the value of x that maximizes f(x) = −1.5x^6 − 2x^4 + 12x using Newton's
method. Employ an initial guess of x0 = 2 and iterate until f′(x) ≈ 0.
Solution:
f(x) = −1.5x^6 − 2x^4 + 12x
f′(x) = −9x^5 − 8x^3 + 12    f″(x) = −45x^4 − 24x^2

i | x | f(x) | f′(x) | f″(x)
… | … | … | … | …
4 | 1.04771560 | 8.17861564 | -8.56281128 | -80.56831818
5 | 0.94143547 | 8.68184520 | -1.33088363 | -56.62002371
6 | 0.91792994 | 8.69790303 | -0.05284773 | -52.17080295
7 | 0.91691697 | 8.69792983 | -0.00009395 | -51.98540229
8 | 0.91691516 | 8.69792983 | 0.00000000 | -51.98507199
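The iteration above can be sketched in Python (an illustrative implementation, not part of the handout), using the derivatives worked out in the solution:

```python
def newton_optimize(df, d2f, x0, tol=1e-8, max_iter=50):
    """Newton's method for an optimum: drives f'(x) to zero via
    x_{i+1} = x_i - f'(x_i) / f''(x_i)."""
    x = x0
    for _ in range(max_iter):
        step = df(x) / d2f(x)
        x -= step
        if abs(step) < tol:
            break
    return x

df  = lambda x: -9 * x**5 - 8 * x**3 + 12    # f'(x)
d2f = lambda x: -45 * x**4 - 24 * x**2       # f''(x)
x = newton_optimize(df, d2f, 2.0)
```

Starting from x0 = 2 the iterates follow the table and settle at x ≈ 0.91691516, where f″ < 0 confirms a maximum.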
• Direct Method
Random Search
• Gradient Methods
1. Gradients and Hessians
2. Steepest Ascent Method
Direct methods range from simple brute-force approaches to more elegant techniques that
attempt to exploit the nature of the function.
Random Search – a simple example of a brute-force approach. This method repeatedly
evaluates the function at randomly selected values of the independent variables. If a
sufficient number of samples is taken, the optimum will eventually be located.
For a given f(x, y), let r be a uniformly distributed random number on [0, 1]:
x = xl + (xu − xl) r    y = yl + (yu − yl) r
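A minimal Python sketch of random search using the scaling formulas above. The test function f(x, y) = y − x − 2x^2 − 2xy − y^2 and the search bounds are assumptions for illustration, not part of the handout; that function's true maximum is f(−1, 1.5) = 1.25.

```python
import random

def random_search_max(f, xl, xu, yl, yu, n=10000, seed=1):
    """Brute-force random search: sample n points uniformly in the
    rectangle [xl, xu] x [yl, yu] and keep the best one found."""
    rng = random.Random(seed)
    best_x = best_y = None
    best_f = float("-inf")
    for _ in range(n):
        x = xl + (xu - xl) * rng.random()  # scale r in [0, 1) to [xl, xu)
        y = yl + (yu - yl) * rng.random()
        fxy = f(x, y)
        if fxy > best_f:
            best_x, best_y, best_f = x, y, fxy
    return best_x, best_y, best_f

# Hypothetical test function (an assumption, not from the handout);
# its analytic maximum is f(-1, 1.5) = 1.25.
f = lambda x, y: y - x - 2 * x**2 - 2 * x * y - y**2
bx, by, bf = random_search_max(f, -2.0, 2.0, 1.0, 3.0)
```

With 10,000 samples the best value found lands close to, but never above, the true maximum of 1.25, which illustrates why random search needs many samples for tight accuracy.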