Lec 5
Nonlinear programming:
One dimensional minimization methods
Optimality Criteria
• Local optimal point: A point or solution x∗ is said to be a local
optimal point, if there exists no point in the neighbourhood of x∗
which is better than x∗. In the parlance of minimization problems, a
point x∗ is a locally minimal point if no point in the neighbourhood
has a function value smaller than f (x∗).
• Analytical methods
• Direct root methods: root-finding methods that can be considered
equivalent to quadratic interpolation
Bracketing Method – Exhaustive Search
• The exhaustive search begins from a lower bound on the variable
and three consecutive function values are compared at a time based
on the assumption of unimodality of the function.
• Based on the outcome of the comparison, the search is either
terminated or continued by replacing one of the three points by a
new point. The search continues until the minimum is bracketed.
2. If f(x1) ≥ f(x2) ≤ f(x3), the minimum point lies in (x1, x3); terminate.
Else set x1 = x2, x2 = x3, x3 = x2 + Δx, and go to Step 3.
Iteration   x1    x2    x3    f(x1)    f(x2)    f(x3)
1           0     0.5   1     —        108.25   55
3           1     1.5   2     55       38.25    31
5           2     2.5   3     31       27.85    27
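The exhaustive search described above can be sketched in a few lines of Python. The test function f(x) = x² + 54/x is an assumption inferred from the tabulated values (f(0.5) = 108.25, f(1) = 55, f(3) = 27), not stated explicitly in the slides:

```python
def f(x):
    # Test function assumed from the tabulated values: f(x) = x**2 + 54/x
    return float('inf') if x == 0 else x**2 + 54.0 / x

def exhaustive_search(f, a, dx, max_steps=1000):
    """Bracket the minimum of a unimodal function starting from lower bound a."""
    x1, x2, x3 = a, a + dx, a + 2 * dx
    for _ in range(max_steps):
        if f(x1) >= f(x2) <= f(x3):
            return x1, x3                 # minimum lies in (x1, x3)
        x1, x2, x3 = x2, x3, x3 + dx      # shift the three-point pattern right
    raise RuntimeError("minimum not bracketed")

lo, hi = exhaustive_search(f, 0.0, 0.5)   # brackets the minimizer x* = 3
```

With the assumed function and Δx = 0.5, the search walks through exactly the x values shown in the table before bracketing the minimum in (2.5, 3.5).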
k      0      1      2      3      4
x      0.6    1.1    2.1    4.1    8.1
f(x)   90.36  50.3   30.1   29.98  72.27
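In the second table the gap between successive x values doubles each step (0.5, 1, 2, 4), which matches a bounding-phase-style bracketing. A minimal sketch, again assuming the test function f(x) = x² + 54/x (consistent with the listed f(x) values):

```python
def f(x):
    # Test function assumed from the listed values: f(x) = x**2 + 54/x
    return x**2 + 54.0 / x

def bounding_phase(f, x0, delta, max_steps=50):
    """Bounding-phase-style bracketing: step sizes double (2**k * delta)
    until f starts to increase; the last three points bracket the minimum."""
    if f(x0 + delta) > f(x0):       # stepping uphill? reverse direction
        delta = -delta              # (simplified one-sided direction check)
    x_prev, x_cur = x0, x0 + delta
    for k in range(max_steps):
        x_next = x_cur + 2**(k + 1) * delta
        if f(x_next) >= f(x_cur):
            return x_prev, x_next   # minimum lies in (x_prev, x_next)
        x_prev, x_cur = x_cur, x_next
    raise RuntimeError("minimum not bracketed")

lo, hi = bounding_phase(f, 0.6, 0.5)   # reproduces the x sequence 0.6 ... 8.1
```

Starting from x0 = 0.6 with Δ = 0.5 this visits 0.6, 1.1, 2.1, 4.1, 8.1 as in the table, returning the bracket (2.1, 8.1).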
[Figure: a unimodal function, with the three possible outcomes at two interior points x1 and x2: f1 < f2, f1 > f2, f1 = f2]
• If the outcome is f1 < f2, the
minimum cannot lie to
the right of x2
• In Fig (c), two more experiments are to be placed in the new interval in
order to find a reduced interval of uncertainty.
Based on the relative values of the objective function at the two points,
almost half of the interval of uncertainty is eliminated.
Dichotomous Search
• The interval of uncertainty remaining after different
numbers of function evaluations (performed in pairs) is
given in the following table.
Number of evaluations      2             4              6
Final interval of
uncertainty                (L0 + δ)/2    L0/4 + 3δ/4    L0/8 + 7δ/8

(δ is the small separation between the two experiments of each pair.)
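A minimal sketch of dichotomous search: each pair of experiments is placed δ apart around the midpoint of the current interval, eliminating just under half of it. The test function and the starting interval below are illustrative assumptions:

```python
def dichotomous_search(f, a, b, delta=0.001, tol=0.01):
    """Dichotomous search: each pair of evaluations, placed delta apart
    around the midpoint, removes just under half of the interval."""
    while (b - a) > tol:
        mid = (a + b) / 2
        x1, x2 = mid - delta / 2, mid + delta / 2
        if f(x1) < f(x2):
            b = x2        # minimum cannot lie to the right of x2
        else:
            a = x1        # minimum cannot lie to the left of x1
    return a, b

def f(x):
    # Illustrative test function with its minimum at x = 3
    return x**2 + 54.0 / x

a, b = dichotomous_search(f, 2.0, 4.0)
```

Each pass costs two function evaluations and maps an interval of length L to L/2 + δ/2, which reproduces the entries in the table above.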
4. If f(x2) < f(x0), set a = x0, x0 = x2, and go to Step 5; else set
a = x1, b = x2, and go to Step 5.
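The step quoted above appears to belong to the interval-halving scheme: the midpoint x0 and two quarter points x1, x2 are compared, and half of the interval is discarded per pass. A sketch under that reading (function and interval are illustrative assumptions):

```python
def interval_halving(f, a, b, tol=0.01):
    """Interval halving: compare f at the midpoint x0 and the two quarter
    points x1, x2; each pass discards half of the current interval."""
    x0, f0 = (a + b) / 2, f((a + b) / 2)
    while (b - a) > tol:
        L = b - a
        x1, x2 = a + L / 4, b - L / 4
        f1, f2 = f(x1), f(x2)
        if f1 < f0:
            b, x0, f0 = x0, x1, f1    # minimum lies in (a, x0)
        elif f2 < f0:
            a, x0, f0 = x0, x2, f2    # minimum lies in (x0, b)
        else:
            a, b = x1, x2             # minimum lies in (x1, x2)
    return (a + b) / 2

def f(x):
    # Illustrative test function with its minimum at x = 3
    return x**2 + 54.0 / x

x_min = interval_halving(f, 2.0, 4.0)
```

Note that x0 and f0 are reused between passes, so each pass costs only two new evaluations.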
The optimal step length λ∗ is a root of f′(λ) = 0.
Three root finding methods will be considered here:
• Newton method
• Quasi-Newton method
• Secant method
Newton method
Consider the quadratic approximation of the function f(λ)
at λ = λᵢ using the Taylor's series expansion:

f(λ) = f(λᵢ) + f′(λᵢ)(λ − λᵢ) + ½ f″(λᵢ)(λ − λᵢ)²
f(λ) = 0.65 − 0.75/(1 + λ²) − 0.65 λ tan⁻¹(1/λ)
Iteration 1
λ₁ = 0.1, f(λ₁) = −0.188197, f′(λ₁) = −0.744832, f″(λ₁) = 2.68659
λ₂ = λ₁ − f′(λ₁)/f″(λ₁) = 0.377241
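Iteration 1 can be reproduced in code. The analytic first and second derivatives below were worked out by hand from f(λ) and are worth re-checking:

```python
import math

def f(lam):
    return 0.65 - 0.75 / (1 + lam**2) - 0.65 * lam * math.atan(1 / lam)

def df(lam):
    # First derivative of f, derived by hand -- worth re-checking
    return (1.5 * lam / (1 + lam**2)**2
            + 0.65 * lam / (1 + lam**2)
            - 0.65 * math.atan(1 / lam))

def d2f(lam):
    # Second derivative of f, derived by hand -- worth re-checking
    return (1.5 * (1 - 3 * lam**2) / (1 + lam**2)**3
            + 0.65 * (1 - lam**2) / (1 + lam**2)**2
            + 0.65 / (1 + lam**2))

def newton(lam, eps=1e-5, max_iter=50):
    """Newton's method for f'(lam) = 0: lam <- lam - f'(lam)/f''(lam)."""
    for _ in range(max_iter):
        if abs(df(lam)) < eps:
            break
        lam = lam - df(lam) / d2f(lam)
    return lam

lam2 = 0.1 - df(0.1) / d2f(0.1)   # first step from lambda_1 = 0.1, ~0.3772
lam_star = newton(0.1)            # converges near lambda* ~ 0.48
```

Evaluating at λ₁ = 0.1 reproduces the tabulated values f(λ₁) ≈ −0.188197, f′(λ₁) ≈ −0.744832, f″(λ₁) ≈ 2.68659, and hence λ₂ ≈ 0.3772.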
f′(λᵢ) = [f(λᵢ + Δλ) − f(λᵢ − Δλ)] / (2Δλ)

f″(λᵢ) = [f(λᵢ + Δλ) − 2f(λᵢ) + f(λᵢ − Δλ)] / (Δλ)²
Substituting these into

λᵢ₊₁ = λᵢ − f′(λᵢ)/f″(λᵢ)

leads to

λᵢ₊₁ = λᵢ − Δλ[f(λᵢ + Δλ) − f(λᵢ − Δλ)] / (2[f(λᵢ + Δλ) − 2f(λᵢ) + f(λᵢ − Δλ)])
Quasi-Newton Method
This iterative process is known as the quasi-Newton method.
To test the convergence of the iterative process, the following
criterion can be used:
|f′(λᵢ₊₁)| = |[f(λᵢ₊₁ + Δλ) − f(λᵢ₊₁ − Δλ)] / (2Δλ)| < ε
Remarks:
The iteration λᵢ₊₁ = λᵢ − Δλ[f(λᵢ + Δλ) − f(λᵢ − Δλ)] / (2[f(λᵢ + Δλ) − 2f(λᵢ) + f(λᵢ − Δλ)])
requires three function evaluations per iteration, at λᵢ − Δλ, λᵢ, and λᵢ + Δλ.
Example: Find the minimum of

f(λ) = 0.65 − 0.75/(1 + λ²) − 0.65 λ tan⁻¹(1/λ)

using the quasi-Newton method with the starting point λ₁ = 0.1 and the
step size Δλ = 0.01 in the central difference formulas. Use ε = 0.01 in the
convergence criterion.
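This worked example can be sketched directly from the update and convergence formulas above; the method is derivative-free, using only evaluations of f:

```python
import math

def f(lam):
    return 0.65 - 0.75 / (1 + lam**2) - 0.65 * lam * math.atan(1 / lam)

def quasi_newton(f, lam, dlam=0.01, eps=0.01, max_iter=100):
    """Quasi-Newton iteration with central-difference derivative estimates."""
    for _ in range(max_iter):
        fp = f(lam + dlam) - f(lam - dlam)                # numerator of f'
        fpp = f(lam + dlam) - 2 * f(lam) + f(lam - dlam)  # numerator of f''
        if abs(fp / (2 * dlam)) < eps:                    # |f'(lam)| < eps?
            break
        lam -= dlam * fp / (2 * fpp)                      # the update above
    return lam

lam_star = quasi_newton(f, 0.1)   # converges near lambda* ~ 0.48
```

With λ₁ = 0.1, Δλ = 0.01, and ε = 0.01, the iterates closely track the exact Newton steps, reaching the neighbourhood of λ∗ in a handful of iterations.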
Secant method

λᵢ₊₁ = λᵢ − f′(λᵢ)/s = A − f′(A)(B − A) / (f′(B) − f′(A))

where s = [f′(B) − f′(A)] / (B − A) is the slope of the secant line through (A, f′(A)) and (B, f′(B)).
1. Set λ₁ = A = 0 and evaluate f′(A). The value of f′(A) will be negative.
Assume an initial trial step length t0.
2. Evaluate f′(t0).
3. If f′(t0) < 0, set A = λᵢ = t0, f′(A) = f′(t0), new t0 = 2t0, and go to Step 2.
4. If f′(t0) ≥ 0, set B = t0, f′(B) = f′(t0), and go to Step 5.
5. Find the new approximate solution of the problem as:

λᵢ₊₁ = A − f′(A)(B − A) / (f′(B) − f′(A))
6. Test for convergence: |f′(λᵢ₊₁)| ≤ ε. If converged, take λ∗ ≈ λᵢ₊₁ and stop.
7. If f′(λᵢ₊₁) ≥ 0, set new B = λᵢ₊₁, f′(B) = f′(λᵢ₊₁), i = i + 1, and go to Step 5.
8. If f′(λᵢ₊₁) < 0, set new A = λᵢ₊₁, f′(A) = f′(λᵢ₊₁), i = i + 1, and go to Step 5.
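The steps above can be sketched as follows, applied to the example function from earlier; its derivative is computed analytically here (worked out by hand, worth re-checking):

```python
import math

def df(lam):
    # Analytic derivative of the example f, derived by hand -- worth re-checking
    if lam == 0:
        return -0.65 * math.pi / 2      # limiting value of f'(0+)
    return (1.5 * lam / (1 + lam**2)**2
            + 0.65 * lam / (1 + lam**2)
            - 0.65 * math.atan(1 / lam))

def secant(df, t0, eps=1e-4, max_iter=100):
    """Secant method for df(lam) = 0, following Steps 1-8 above."""
    A, dfA = 0.0, df(0.0)          # step 1: f'(A) is negative
    while df(t0) < 0:              # steps 2-3: double t0 until f'(t0) >= 0
        A, dfA = t0, df(t0)
        t0 *= 2
    B, dfB = t0, df(t0)            # step 4: B brackets the root from above
    lam = A
    for _ in range(max_iter):
        lam = A - dfA * (B - A) / (dfB - dfA)   # step 5: secant estimate
        d = df(lam)
        if abs(d) <= eps:          # step 6: convergence test
            break
        if d < 0:                  # step 8: root lies to the right of lam
            A, dfA = lam, d
        else:                      # step 7: root lies to the left of lam
            B, dfB = lam, d
    return lam

lam_star = secant(df, t0=0.1)      # converges near lambda* ~ 0.48
```

Because A and B always bracket the root of f′, the method is a bracketing variant of the secant iteration (regula falsi) and cannot step outside (A, B).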
Secant method
Remarks:
• When the first derivatives of the function being minimized are available,
the cubic interpolation method or the secant method is expected to be
very efficient.
• On the other hand, if both the first and the second derivatives of the
function are available, the Newton method will be the most efficient one in
finding the optimal step length, λ∗.