Opt ch6
University of Basrah
Fall 2024
Chapter Six
6.1 Optimality Conditions
Consider the unconstrained optimization problem,

    minimize_{x ∈ R^n}  f(x) .                                              (1)

A point x* is a stationary point of (1) if it satisfies the first-order condition

    ∇f(x*) = 0 ,                                                            (2)

and a stationary point x* is a strict local minimum if, in addition, the Hessian satisfies the second-order condition

    ∇²f(x*) ≻ 0  (positive definite) .                                      (3)
Example: Let f(x) = x1² + x2² + x3² + x2·x3 + x1 + 2x3. Find the stationary points and local minima.
Setting the gradient to zero,

    ∇f(x) = 0  ⇒  ∂f(x)/∂x1 = 1 + 2x1 = 0
                  ∂f(x)/∂x2 = 2x2 + x3 = 0
                  ∂f(x)/∂x3 = 2 + x2 + 2x3 = 0

Solving this system, the stationary point is x* = (x1 = −1/2, x2 = 2/3, x3 = −4/3).
The Hessian is

    ∇²f(x*) = [ 2  0  0 ;
                0  2  1 ;
                0  1  2 ] .

Principal minors: 2 ;  det[2 0; 0 2] = 4 ;  det ∇²f(x*) = 2·(2·2 − 1·1) = 6.

All the principal minors are positive ⇒ ∇²f(x*) is positive definite, so the stationary point x* is a strict local minimum (in fact the global minimum, since f is convex).
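A quick numerical check of this example (a minimal sketch in Python with numpy; the function names are illustrative, not part of the slides):

import numpy as np

def grad_f(x):
    # gradient of f(x) = x1^2 + x2^2 + x3^2 + x2*x3 + x1 + 2*x3
    x1, x2, x3 = x
    return np.array([1 + 2*x1, 2*x2 + x3, 2 + x2 + 2*x3])

def hess_f(x):
    # Hessian of f (constant, since f is quadratic)
    return np.array([[2.0, 0.0, 0.0],
                     [0.0, 2.0, 1.0],
                     [0.0, 1.0, 2.0]])

x_star = np.array([-0.5, 2/3, -4/3])
print(grad_f(x_star))                      # ~[0, 0, 0]  -> x_star is stationary
print(np.linalg.eigvalsh(hess_f(x_star)))  # all > 0     -> positive definite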
6.2 Iterative Descent Algorithms
In very limited cases, we can find an analytic solution of problem (1) by solving the optimality conditions (2) and (3).
Usually, problem (1) must be solved by an iterative algorithm that computes a sequence of points x0, x1, … with f(xk) → f* as k → ∞, where f* is the (local) optimal value.
An iterative descent algorithm uses the information ∇f(x) and/or ∇²f(x), and applies the following recursion formula to reach a local minimum,

    xk+1 = xk + sk dk ,   k = 1, 2, … ,                                     (4)

-) k = 1, 2, … is the iteration index.
-) sk > 0 is the step size at the k-th iteration.
-) dk ∈ R^n is a descent direction at the k-th iteration.
-) xk ∈ R^n is the point at the k-th iteration and xk+1 ∈ R^n is the next point, such that f(xk+1) < f(xk).
The descent direction is chosen such that

    ∇f(xk)ᵀ dk < 0 .                                                        (5)

Often, the descent direction has the common form

    dk = −Dk ∇f(xk) ,                                                       (6)

with
    Dk = I                 (gradient step),
    Dk = (∇²f(xk))⁻¹       (Newton step).
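As an illustration of the update (4) with the directions in (6), a minimal Python sketch (numpy; the helper name descent_step is illustrative, not from the slides):

import numpy as np

def descent_step(x, grad, hess=None, step_size=1.0):
    """One update x_{k+1} = x_k - s_k * D_k * grad f(x_k).

    D_k = I gives the gradient step; D_k = hess(x)^(-1) gives the Newton step."""
    g = grad(x)
    if hess is None:
        d = -g                              # gradient direction
    else:
        d = -np.linalg.solve(hess(x), g)    # Newton direction
    return x + step_size * d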
6.2 Iterative Descent Algorithms (cont.)
Selection of the step size sk:
1. Exact line search,

    sk = minimize_{s ≥ 0} f(xk + s dk) .                                    (7)

   In exact line search, sk provides the greatest reduction of the objective function f(·) along the direction dk.
2. Backtracking line search (see the sketch after this list),
   -) Use parameters α ∈ (0, 1/2), β ∈ (0, 1).
   -) Start at t = 1 and repeat t = β·t until

    f(xk + t dk) < f(xk) + α t ∇f(xk)ᵀ dk .                                 (8)

   Once you reach a t that satisfies (8), set sk = t.
3. Diminishing step size,

    sk = a / √k ,   a > 0 .                                                 (9)
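A minimal sketch of the backtracking rule (8) in Python (numpy; the defaults alpha = 0.3, beta = 0.8 are illustrative choices within the stated ranges):

import numpy as np

def backtracking(f, grad, x, d, alpha=0.3, beta=0.8):
    """Shrink t until the sufficient-decrease condition
    f(x + t d) < f(x) + alpha * t * grad(x)^T d holds, then return t."""
    t = 1.0
    slope = alpha * (grad(x) @ d)
    while f(x + t*d) >= f(x) + t*slope:
        t *= beta
    return t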
Example: Apply the iterative descent method with the gradient step to minimize f(x) = x1 − x2 + 2x1² + 2x1x2 + x2², starting from the point (x1 = 0, x2 = 0).
-) The first derivative is ∇f(x) = (1 + 4x1 + 2x2, −1 + 2x1 + 2x2).
-) The gradient algorithm: xk+1 = xk − sk ∇f(xk).
-) First iteration, k = 1: ∇f(x1) = (1, −1),

    s1 = minimize_{s ≥ 0} f(x1 − s ∇f(x1))
       = minimize_{s ≥ 0} f(−s, s)
       = minimize_{s ≥ 0} (s² − 2s)  ⇒  s1 = 1 .

   Then, x2 = x1 − s1 ∇f(x1) = (0, 0) − 1·(1, −1) = (−1, 1).
-) Second iteration, k = 2: ∇f(x2) = (−1, −1),

    s2 = minimize_{s ≥ 0} f(x2 − s ∇f(x2))
       = minimize_{s ≥ 0} f(−1 + s, 1 + s)
       = minimize_{s ≥ 0} (5s² − 2s − 1)  ⇒  s2 = 1/5 .

   Then, x3 = x2 − s2 ∇f(x2) = (−1, 1) − (1/5)·(−1, −1) = (−0.8, 1.2).
-) Third iteration, k = 3: ∇f(x3) = (0.2, −0.2),

    s3 = minimize_{s ≥ 0} f(x3 − s ∇f(x3))
       = minimize_{s ≥ 0} f(−0.8 − 0.2s, 1.2 + 0.2s)
       = minimize_{s ≥ 0} (0.04s² − 0.08s − 1.2)  ⇒  s3 = 1 .

   Then, x4 = x3 − s3 ∇f(x3) = (−0.8, 1.2) − 1·(0.2, −0.2) = (−1, 1.4).
-) Proceed with the iterative procedure until you reach the optimum x* = (−1, 1.5).
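A minimal Python sketch reproducing these iterations (numpy; writing f as the quadratic form 0.5·xᵀQx + bᵀx with Q = [[4, 2], [2, 2]], b = (1, −1), and using the closed-form exact line-search step for a quadratic, are conveniences assumed here rather than stated on the slides):

import numpy as np

# f(x) = x1 - x2 + 2*x1^2 + 2*x1*x2 + x2^2 = 0.5*x^T Q x + b^T x
Q = np.array([[4.0, 2.0],
              [2.0, 2.0]])
b = np.array([1.0, -1.0])

def grad(x):
    return Q @ x + b

x = np.array([0.0, 0.0])            # starting point (x1 = 0, x2 = 0)
for k in range(1, 6):
    g = grad(x)
    s = (g @ g) / (g @ Q @ g)       # exact line search along -g for a quadratic f
    x = x - s * g                   # gradient step x_{k+1} = x_k - s_k * grad f(x_k)
    print(k, s, x)
# prints s1 = 1, x2 = (-1, 1); s2 = 0.2, x3 = (-0.8, 1.2); s3 = 1, x4 = (-1, 1.4); ...
# the iterates approach the optimum x* = (-1, 1.5)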