Cauchy Gradient Based Technique Lecture 5

The document discusses the Steepest Descent (Cauchy) Method for unconstrained optimization problems, detailing its algorithm and steps for minimizing a function. It explains the process of calculating search direction, determining optimal step length, and checking for optimality through iterative updates. Additionally, it provides illustrative examples and numerical methods for optimizing the step length.


Steepest Descent (Cauchy) Method
for Unconstrained Optimization Problems
(Gradient Based Technique)

By: Dr. R. S. Bisht

Email: [email protected],
[email protected]
Unconstrained Minimization Methods

Direct Search Methods (zeroth-order methods):
• Random Search method
• Univariate method
• Pattern Search method
• Powell's method
• Hooke–Jeeves method
• Simplex method

Descent Methods (gradient methods):
• Gradient descent
• Steepest Descent (Cauchy) method
• Fletcher–Reeves method
• Newton's method (for single & multiple variables)
• Quasi-Newton methods, e.g. the Davidon–Fletcher–Powell method
General strategy for gradient methods:

Minimize f(x)

Step 1: Calculate the search direction S_i.

Step 2: Select a step length λ_i in that direction to reduce f(X) and set

X_{i+1} = X_i + λ_i S_i

Note: if we move along the gradient direction from any point in n-dimensional space, the function value increases at the fastest rate; the negative gradient is therefore the direction of steepest descent.

Steepest Descent:
The search direction is S_i = −∇f_i, so the update becomes X_{i+1} = X_i − λ_i ∇f_i, where λ_i is the optimal step length along the search direction S_i.
Gradient Descent (Cauchy) Method:

• It uses the negative of the gradient vector as the direction for minimization.

• Start from the initial point X_1 and iteratively move along the steepest descent directions until the optimum point is reached.
Algorithm: Steepest Descent

Minimize f(x)

Start with an arbitrary point X_1. Set the iteration number i = 1.

Step 1: Find the search direction S_i as

S_i = −∇f_i = −∇f(X_i)

Step 2: Determine the optimal step length λ_i in the direction of S_i and set

X_{i+1} = X_i + λ_i S_i = X_i − λ_i ∇f_i

Step 3: Test the optimality of the new point X_{i+1}; if it is not optimal, set i = i + 1 and go to Step 1.


How to pick λ?

• Analytically.
• Numerically.

The task of λ_i is to minimize the function along the search direction S_i.

Numerical Method:
Use a search technique with a fixed λ_i (= 1) or a variable λ (= 1, 2, 1/2, etc.).

Options for optimizing λ:

1) Interpolation (e.g. quadratic, cubic)
2) Region elimination (golden section search); see the sketch after this list
3) Newton, secant, quasi-Newton
4) Random search
5) Analytical optimization

(1), (3), and (5) are preferred. However, it may not be desirable to optimize λ_i exactly (it is often better to spend the effort generating new search directions).
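As one concrete illustration of option (2), a minimal golden-section line-search sketch in Python/NumPy; the function name, the bracket [0, lam_hi], and the tolerance are illustrative assumptions, not prescribed by the lecture.

```python
import numpy as np

def golden_section_lambda(f, x, s, lam_hi=2.0, tol=1e-6):
    """Golden-section search for the step length lam in [0, lam_hi]
    minimizing phi(lam) = f(x + lam*s) along the direction s."""
    phi = lambda lam: f(x + lam * s)
    gr = (np.sqrt(5.0) - 1.0) / 2.0      # golden ratio factor, ~0.618
    a, b = 0.0, lam_hi
    c, d = b - gr * (b - a), a + gr * (b - a)
    while b - a > tol:
        if phi(c) < phi(d):              # minimum lies in [a, d]
            b, d = d, c
            c = b - gr * (b - a)
        else:                            # minimum lies in [c, b]
            a, c = c, d
            d = a + gr * (b - a)
    return 0.5 * (a + b)
```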
Analytical Method:

How does one minimize a function in a search direction using an analytical method? Here S_i is fixed and we pick λ_i, the step length, to minimize f(X). Note that ΔX_i = X_{i+1} − X_i = λ_i S_i.

Expanding by the Taylor series formula,

f(X_{i+1}) = f(X_i + λ_i S_i) ≅ f(X_i) + ∇^T f(X_i) ΔX_i + (1/2)(ΔX_i)^T H(X_i)(ΔX_i)

Setting d f(X_i + λ_i S_i) / dλ_i = 0 gives

∇f_i^T S_i + λ_i (S_i)^T H_i S_i = 0

so that

λ_i = −(∇f_i^T S_i) / (S_i^T H_i S_i) = (S_i^T S_i) / (S_i^T H_i S_i),

where the last equality uses S_i = −∇f_i for steepest descent. This yields a minimum of the approximating quadratic function.
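As a quick numerical check of this formula (the variable names are my own), using the gradient and Hessian from Iteration 1 of the worked example below:

```python
import numpy as np

def analytic_step_length(grad, s, H):
    """lambda_i = -(grad^T s) / (s^T H s); equals (s^T s)/(s^T H s)
    when s = -grad (steepest descent)."""
    return -(grad @ s) / (s @ H @ s)

H = np.array([[4.0, 2.0], [2.0, 2.0]])   # Hessian of the example f
g = np.array([1.0, -1.0])                # gradient at X_1 = (0, 0)
s = -g                                   # steepest descent direction
print(analytic_step_length(g, s, H))     # -> 1.0, matching lambda_1
```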


Summarize:

Steps of the Steepest Descent Method to minimize the function f(x)
(a 3-step rule to solve the problems)

Steps for numerical examples:

Step 1: Calculate S_i at X_i by S_i = −∇f_i.

Step 2: Calculate λ_i using λ_i = (S_i^T S_i) / (S_i^T H_i S_i) and find the new point

X_{i+1} = X_i + λ_i S_i = X_i − λ_i ∇f_i

Step 3: Check the optimality of X_{i+1} by ∇f(X_{i+1}) ≅ 0. If met, stop; otherwise implement Step 1 again for this new point X_{i+1}.
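Putting the three steps together, a minimal Python/NumPy sketch for a quadratic objective with constant Hessian H (the function and parameter names are my own, not from the lecture):

```python
import numpy as np

def steepest_descent(grad, H, x, tol=1e-6, max_iter=100):
    """3-step rule: S_i = -grad(X_i); analytic step length
    lambda_i = (S^T S)/(S^T H S); stop when the gradient is ~0."""
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:      # Step 3: optimality check
            break
        s = -g                            # Step 1: search direction
        lam = (s @ s) / (s @ H @ s)       # Step 2: optimal step length
        x = x + lam * s                   # new point X_{i+1}
    return x
```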
Illustrative Examples

Example: Minimize

f(x_1, x_2) = x_1 − x_2 + 2x_1² + 2x_1 x_2 + x_2²

starting from the point X_1 = (0, 0).

Solution:

The gradient of f is

∇f = (∂f/∂x_1, ∂f/∂x_2)^T = (1 + 4x_1 + 2x_2, −1 + 2x_1 + 2x_2)^T

The Hessian matrix is

A = [[∂²f/∂x_1², ∂²f/∂x_1∂x_2], [∂²f/∂x_2∂x_1, ∂²f/∂x_2²]] = [[4, 2], [2, 2]]
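A hedged finite-difference check of the hand-computed gradient (the test point and step size h are illustrative choices):

```python
import numpy as np

f = lambda x: x[0] - x[1] + 2*x[0]**2 + 2*x[0]*x[1] + x[1]**2
grad = lambda x: np.array([1 + 4*x[0] + 2*x[1], -1 + 2*x[0] + 2*x[1]])

# central finite differences at a test point
x, h = np.array([0.3, -0.7]), 1e-6
fd = np.array([(f(x + h*e) - f(x - h*e)) / (2*h) for e in np.eye(2)])
print(np.allclose(fd, grad(x)))   # -> True
```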
Iteration 1: At X_1 = (0, 0)

Step 1: ∇f_1 = (1, −1)^T, so S_1 = −∇f_1 = (−1, 1)^T.

Step 2: Compute λ_1 at X_1:

λ_1 = (∇f_1^T ∇f_1) / (S_1^T A S_1) = 2/2 = 1

Hence, the new point is

X_2 = X_1 + λ_1 S_1 = (0, 0)^T + 1·(−1, 1)^T = (−1, 1)^T

Step 3: Check the optimum:

∇f_2 = (−1, −1)^T ≠ (0, 0)^T

so X_2 is not optimum; move to the next iteration.

Iteration 2: At X_2 = (−1, 1)

Step 1: S_2 = −∇f(X_2) = (1, 1)^T.

Step 2: λ_2 = (S_2^T S_2) / (S_2^T A S_2) = 2/10 = 1/5.

Hence, the new point is

X_3 = X_2 + λ_2 S_2 = (−1, 1)^T + (1/5)(1, 1)^T = (−0.8, 1.2)^T

Step 3: Since ∇f(X_3) = (0.2, −0.2)^T ≠ (0, 0)^T, X_3 is not optimum; move to the next iteration.
Iteration 3: At X_3 = (−0.8, 1.2)

Step 1: S_3 = −∇f(X_3) = (−0.2, 0.2)^T.

Step 2: λ_3 = (S_3^T S_3) / (S_3^T A S_3) = 1.

Hence, the new point is

X_4 = X_3 + λ_3 S_3 = (−0.8, 1.2)^T + 1·(−0.2, 0.2)^T = (−1.0, 1.4)^T

Step 3: Since ∇f(X_4) = (−0.2, −0.2)^T ≠ (0, 0)^T, X_4 is not optimum; move to the next iteration.

Iteration 4: At X_4 = (−1.0, 1.4)

Step 1: S_4 = −∇f(X_4) = (0.2, 0.2)^T.

Step 2: λ_4 = (S_4^T S_4) / (S_4^T A S_4) = 1/5.

Hence, the new point is

X_5 = X_4 + λ_4 S_4 = (−1.0, 1.4)^T + (1/5)(0.2, 0.2)^T = (−0.96, 1.44)^T

Step 3: Since ∇f(X_5) = (0.04, −0.04)^T ≠ (0, 0)^T, X_5 is not yet optimum.
This process has to be continued until the optimum point

X* = (−1.0, 1.5)^T

is reached; the path of the iterates can also be shown by a graphical (contour) technique.
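Running the steepest_descent sketch given after the summary on this example reproduces the iterates above (X_2 = (−1, 1), X_3 = (−0.8, 1.2), ...) and converges to X*; a quick check, assuming that sketch is in scope:

```python
grad = lambda x: np.array([1 + 4*x[0] + 2*x[1],
                           -1 + 2*x[0] + 2*x[1]])
A = np.array([[4.0, 2.0], [2.0, 2.0]])
print(steepest_descent(grad, A, np.array([0.0, 0.0])))  # ~ [-1.0, 1.5]
```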
Additional Questions

Perform at most three iterations of the steepest descent method for the following functions:

Example 1) f(X) = 3(x_1 − 2)² + 4(x_2 − 3)² + 2(x_1 + 5)², with X = (0, 0).

Example 2) f(X) = 4x_1² + 6x_2² − 8x_1 x_2, with X = (1, 1).
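The same steepest_descent routine sketched earlier can check these exercises; the gradients and Hessians below are worked out from the given functions, and max_iter=3 matches the three-iteration limit (the setup names are my own):

```python
# Example 1: f(X) = 3(x1-2)^2 + 4(x2-3)^2 + 2(x1+5)^2, starting at (0, 0)
grad1 = lambda x: np.array([10*x[0] + 8, 8*(x[1] - 3)])
H1 = np.array([[10.0, 0.0], [0.0, 8.0]])
print(steepest_descent(grad1, H1, np.array([0.0, 0.0]), max_iter=3))

# Example 2: f(X) = 4x1^2 + 6x2^2 - 8x1x2, starting at (1, 1)
grad2 = lambda x: np.array([8*x[0] - 8*x[1], 12*x[1] - 8*x[0]])
H2 = np.array([[8.0, -8.0], [-8.0, 12.0]])
print(steepest_descent(grad2, H2, np.array([1.0, 1.0]), max_iter=3))
```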
Thank you for your attention.
