Cauchy Gradient Based Technique Lecture 5
Email: [email protected],
[email protected]
Unconstrained Minimization Methods
Minimize 𝑓(𝑥)
𝑆𝑖 = −𝛻𝑓𝑖 = −𝛻𝑓(𝑋𝑖 )
Methods for choosing the step length 𝜆𝑖:
1) Analytical
2) Numerical
3) …
4) Random
5) Analytical optimization
(1), (3), and (5) are preferred. However, it may not be desirable to exactly optimize 𝜆𝑖 (it is often better to generate new search directions).
Analytical Method:
How does one minimize a function along a search direction analytically? 𝑆𝑖 is fixed and you want to pick 𝜆𝑖, the step length, to minimize 𝑓(𝑋). Note ∆𝑋𝑖 = 𝑋𝑖+1 − 𝑋𝑖 = 𝜆𝑖 𝑆𝑖.

𝑓(𝑋𝑖+1) = 𝑓(𝑋𝑖 + 𝜆𝑖 𝑆𝑖) ≅ 𝑓(𝑋𝑖) + 𝛻𝑇𝑓(𝑋𝑖) ∆𝑋𝑖 + (1/2)(∆𝑋𝑖)𝑇 𝐻(𝑋𝑖)(∆𝑋𝑖)

Setting 𝑑𝑓/𝑑𝜆𝑖 = 0 gives

𝜆𝑖 = −(𝛻𝑓𝑖𝑇 𝑆𝑖)/(𝑆𝑖𝑇 𝐻𝑖 𝑆𝑖) = (𝑆𝑖𝑇 𝑆𝑖)/(𝑆𝑖𝑇 𝐻𝑖 𝑆𝑖), since 𝑆𝑖 = −𝛻𝑓𝑖.
Step 1: Calculate 𝑆𝑖 at 𝑋𝑖 by 𝑆𝑖 = −𝛻𝑓𝑖.
Step 2: Calculate 𝜆𝑖 by using 𝜆𝑖 = (𝑆𝑖𝑇 𝑆𝑖)/(𝑆𝑖𝑇 𝐻𝑖 𝑆𝑖) and the new point
𝑋𝑖+1 = 𝑋𝑖 + 𝜆𝑖 𝑆𝑖 = 𝑋𝑖 − 𝜆𝑖 𝛻𝑓𝑖
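The two steps above can be sketched in code. This is a minimal Python sketch, assuming a quadratic objective with constant Hessian 𝐻 (the names `steepest_descent` and `cauchy_step` are illustrative, not from the lecture):

```python
import numpy as np

def cauchy_step(grad, H):
    """One Cauchy step: S = -grad, lambda = (S^T S) / (S^T H S)."""
    S = -grad
    lam = (S @ S) / (S @ H @ S)
    return lam, S

def steepest_descent(grad_f, H, x0, tol=1e-8, max_iter=100):
    """Minimize a quadratic f (constant Hessian H) by Cauchy's method."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) < tol:   # stop when the gradient vanishes
            break
        lam, S = cauchy_step(g, H)
        x = x + lam * S               # X_{i+1} = X_i + lambda_i * S_i
    return x
```

For a quadratic 𝑓(𝑋) = (1/2)𝑋𝑇𝐻𝑋 + 𝑐𝑇𝑋 the gradient is 𝐻𝑋 + 𝑐, so `steepest_descent(lambda x: H @ x + c, H, x0)` iterates toward the minimizer.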
Example: Minimize 𝑓(𝑥1, 𝑥2) = 𝑥1 − 𝑥2 + 2𝑥1² + 2𝑥1𝑥2 + 𝑥2² starting from 𝑋1 = (0, 0)𝑇.
Solution:
The gradient of 𝑓 is
𝛻𝑓 = [𝜕𝑓/𝜕𝑥1, 𝜕𝑓/𝜕𝑥2]𝑇 = [1 + 4𝑥1 + 2𝑥2, −1 + 2𝑥1 + 2𝑥2]𝑇
The Hessian matrix is
𝐴 = [𝜕²𝑓/𝜕𝑥1²  𝜕²𝑓/𝜕𝑥1𝜕𝑥2; 𝜕²𝑓/𝜕𝑥2𝜕𝑥1  𝜕²𝑓/𝜕𝑥2²] = [4 2; 2 2]
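As a quick check, the gradient and Hessian above can be verified by central finite differences. A short Python sketch (the function names are illustrative; 𝑓 is the objective whose gradient is given above):

```python
import numpy as np

# Objective whose gradient matches the expressions above:
# f(x1, x2) = x1 - x2 + 2 x1^2 + 2 x1 x2 + x2^2
def f(x):
    x1, x2 = x
    return x1 - x2 + 2*x1**2 + 2*x1*x2 + x2**2

def grad_f(x):
    x1, x2 = x
    return np.array([1 + 4*x1 + 2*x2, -1 + 2*x1 + 2*x2])

A = np.array([[4.0, 2.0], [2.0, 2.0]])   # constant Hessian

# Central differences are exact (up to rounding) for a quadratic f
x, h, I = np.array([0.3, -0.7]), 1e-6, np.eye(2)
fd_grad = np.array([(f(x + h*e) - f(x - h*e)) / (2*h) for e in I])
fd_hess = np.array([(grad_f(x + h*e) - grad_f(x - h*e)) / (2*h) for e in I])
```

Both `fd_grad` and `fd_hess` should agree with the analytical 𝛻𝑓 and 𝐴 at any point, since 𝑓 is quadratic.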
Iteration 1: At 𝑋1 = (0, 0)𝑇
Step 1: 𝛻𝑓1 = (1, −1)𝑇, so 𝑆1 = −𝛻𝑓1 = (−1, 1)𝑇
Step 2: 𝜆1 = (𝑆1𝑇 𝑆1)/(𝑆1𝑇 𝐴 𝑆1) = 2/2 = 1
𝑋2 = 𝑋1 + 𝜆1 𝑆1 = (−1, 1)𝑇

Iteration 2: At 𝑋2 = (−1, 1)𝑇
Step 1: 𝑆2 = −𝛻𝑓(𝑋2) = (1, 1)𝑇
Step 2: 𝜆2 = (𝑆2𝑇 𝑆2)/(𝑆2𝑇 𝐴 𝑆2) = 2/10 = 1/5
𝑋3 = 𝑋2 + 𝜆2 𝑆2 = (−0.8, 1.2)𝑇
−0.2 0.2
Step 1: 𝑆3 = −𝛻𝑓 𝑋3 = Step 1: 𝑆4 = −𝛻𝑓 𝑋4 =
0.2 0.2
𝑆3𝑇 𝑆3
Step 2:𝜆3 = =1 𝑆4𝑇 𝑆4 1
𝑆2𝑇 𝐻𝑆3 Step 2:𝜆4 = =5
𝑆4𝑇 𝐻𝑆4
𝑋4 = 𝑋3 + 𝜆3 𝑆3 𝑋5 = 𝑋4 + 𝜆4 𝑆4
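The four iterations can be reproduced numerically. A short Python sketch using the gradient and Hessian from the solution:

```python
import numpy as np

A = np.array([[4.0, 2.0], [2.0, 2.0]])         # Hessian of f

def grad(x):                                    # gradient of f
    return np.array([1 + 4*x[0] + 2*x[1], -1 + 2*x[0] + 2*x[1]])

x = np.array([0.0, 0.0])                        # X1
for i in range(4):
    S = -grad(x)                                # Step 1: search direction
    lam = (S @ S) / (S @ A @ S)                 # Step 2: exact step length
    x = x + lam * S                             # new point
    print(f"X{i + 2} = {x}")
# prints X2 = (-1, 1), X3 = (-0.8, 1.2), X4 = (-1.0, 1.4), X5 = (-0.96, 1.44)
```

Raising the loop count shows the characteristic zig-zag convergence toward 𝑋∗.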
Continuing in this way, the iterates converge to the optimum
𝑋∗ = (−1.0, 1.5)𝑇
[Figure: graphical technique showing the iterates]
Additional Questions
1) … 2(𝑥1 + 5)², 𝑋1 = (0, 0)𝑇.
2) … (1, 1).
Thank you for your attention