t1 Sol
1. Starting from the initial point $x_0 = \begin{pmatrix} 1 \\ -1 \end{pmatrix}$, compute $x_1$ using the steepest descent approach with optimal stepsize (exact line search) to find the minimum of the function $f(x_1, x_2) = x_1^3 - 2x_1^2 + x_2^3 + 3x_2^2$.

The gradient is
$$\nabla f(x_1, x_2) = \begin{pmatrix} 3x_1^2 - 4x_1 \\ 3x_2^2 + 6x_2 \end{pmatrix}$$
$$\nabla f(1, -1) = \begin{pmatrix} -1 \\ -3 \end{pmatrix}$$
$$d = -\nabla f(1, -1) = \begin{pmatrix} 1 \\ 3 \end{pmatrix}$$
$$x + \lambda d = \begin{pmatrix} 1 + \lambda \\ -1 + 3\lambda \end{pmatrix}$$
$$\nabla f(x + \lambda d) = \begin{pmatrix} 3(1+\lambda)^2 - 4(1+\lambda) \\ 3(-1+3\lambda)^2 + 6(-1+3\lambda) \end{pmatrix}$$
Setting the directional derivative to zero:
$$f'(\lambda) = \nabla f(x + \lambda d)^T d = 3(1+\lambda)^2 - 4(1+\lambda) + 9(-1+3\lambda)^2 + 18(-1+3\lambda) = -10 + 2\lambda + 84\lambda^2 = 0$$
Solving the quadratic $84\lambda^2 + 2\lambda - 10 = 0$ and taking the positive root gives $\lambda_1 = \frac{-2 + \sqrt{4 + 3360}}{168} = \frac{56}{168} = \frac{1}{3}$.
$$x_1 = x_0 + \lambda_1 d = \begin{pmatrix} 1 \\ -1 \end{pmatrix} + \frac{1}{3}\begin{pmatrix} 1 \\ 3 \end{pmatrix} = \begin{pmatrix} 4/3 \\ 0 \end{pmatrix}$$
$$\nabla f(x_1) = \begin{pmatrix} 0 \\ 0 \end{pmatrix}$$
so $x_1$ is a stationary point and the method converges in a single step.
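The exact line search above can be checked numerically; a minimal sketch, assuming the cubic objective $f(x_1, x_2) = x_1^3 - 2x_1^2 + x_2^3 + 3x_2^2$ inferred from the gradient in the solution:

```python
import math

# Gradient of f(x1, x2) = x1^3 - 2*x1^2 + x2^3 + 3*x2^2
# (the objective itself is an assumption inferred from the gradient above).
def grad(x1, x2):
    return (3*x1**2 - 4*x1, 3*x2**2 + 6*x2)

x0 = (1.0, -1.0)
g0 = grad(*x0)                     # (-1, -3)
d = (-g0[0], -g0[1])               # steepest descent direction (1, 3)

# Exact line search: f'(lambda) = 84*lambda^2 + 2*lambda - 10 = 0;
# take the positive root of the quadratic.
lam = (-2 + math.sqrt(2**2 + 4*84*10)) / (2*84)

x1 = (x0[0] + lam*d[0], x0[1] + lam*d[1])
print(lam)        # ~ 1/3
print(x1)         # ~ (4/3, 0)
print(grad(*x1))  # ~ (0, 0): x1 is stationary
```

The script confirms that a single exact-line-search step lands on the stationary point $(4/3, 0)$.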
2. Use the steepest descent method to approximate the optimal solution to the problem
$$\min f(x_1, x_2) = x_1^2 - 2x_1 x_2 + 2x_2^2 - 2x_2$$
Starting from the point $x_0 = \begin{pmatrix} 0.5 \\ 0.5 \end{pmatrix}$, show the first few iterations.
$$g(x_1, x_2) = \nabla f(x_1, x_2) = \begin{pmatrix} 2x_1 - 2x_2 \\ 4x_2 - 2x_1 - 2 \end{pmatrix}$$
The first few iterations (the updates are consistent with a fixed step size $\alpha = 0.1$, i.e. $x_{k+1} = x_k - \alpha g_k$):

  k      x1        x2        g1        g2      f(x1,x2)
  1    0.50000   0.50000   0.00000  -1.00000  -0.75000
  2    0.50000   0.60000  -0.20000  -0.60000  -0.83000
  3    0.52000   0.66000  -0.28000  -0.40000  -0.86480
  4    0.54800   0.70000  -0.30400  -0.29600  -0.88690
  5    0.57840   0.72960  -0.30240  -0.23840  -0.90402
  ...
 96    0.99969   0.99981  -0.00024  -0.00015  -1.00000
 97    0.99972   0.99982  -0.00022  -0.00013  -1.00000
 98    0.99974   0.99984  -0.00020  -0.00012  -1.00000
 99    0.99976   0.99985  -0.00019  -0.00011  -1.00000
100    0.99978   0.99986  -0.00017  -0.00011  -1.00000
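The table can be reproduced with a short script; a sketch assuming a fixed step size of 0.1 (the rule consistent with the tabulated updates) and the objective inferred from the gradient above:

```python
# Steepest descent with a fixed step size alpha = 0.1; the objective
# f(x1, x2) = x1^2 - 2*x1*x2 + 2*x2^2 - 2*x2 is inferred from the gradient.
def f(x1, x2):
    return x1**2 - 2*x1*x2 + 2*x2**2 - 2*x2

def grad(x1, x2):
    return (2*x1 - 2*x2, 4*x2 - 2*x1 - 2)

alpha = 0.1
x = (0.5, 0.5)
rows = []
for k in range(1, 101):
    g = grad(*x)
    rows.append((k, x[0], x[1], g[0], g[1], f(*x)))
    # fixed-step update: x_{k+1} = x_k - alpha * g_k
    x = (x[0] - alpha*g[0], x[1] - alpha*g[1])

for row in rows[:5]:
    print("%3d  %8.5f  %8.5f  %9.5f  %9.5f  %9.5f" % row)
```

The iterates drift toward the minimizer $(1, 1)$ with $f(1,1) = -1$, matching the tail of the table.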
Compute the Hessian of $f(x_1, x_2) = (x_1^3 - x_2)^2 + 2(x_2 - x_1)^4$ and its inverse, then update $x$ with one Newton step from $x_0 = \begin{pmatrix} 0 \\ 1 \end{pmatrix}$:
$$H(x_1, x_2) = \begin{pmatrix} 12x_1(x_1^3 - x_2) + 6x_1^2(3x_1^2) + 24(x_2 - x_1)^2 & -6x_1^2 - 24(x_2 - x_1)^2 \\ -6x_1^2 - 24(x_2 - x_1)^2 & 2 + 24(x_2 - x_1)^2 \end{pmatrix}$$
$$H(0, 1) = \begin{pmatrix} 24 & -24 \\ -24 & 26 \end{pmatrix}$$
$$H^{-1}(0, 1) = \begin{pmatrix} \frac{13}{24} & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} \end{pmatrix}$$
$$x_1 = x_0 - H^{-1}(0, 1)\nabla f(0, 1) = \begin{pmatrix} 0 \\ 1 \end{pmatrix} - \begin{pmatrix} \frac{13}{24} & \frac{1}{2} \\ \frac{1}{2} & \frac{1}{2} \end{pmatrix}\begin{pmatrix} -8 \\ 10 \end{pmatrix} = \begin{pmatrix} -\frac{2}{3} \\ 0 \end{pmatrix}$$
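The Newton update can be verified numerically; a sketch, assuming $f(x_1, x_2) = (x_1^3 - x_2)^2 + 2(x_2 - x_1)^4$, the function inferred from the Hessian and gradient values above:

```python
# One Newton step x1 = x0 - H^{-1}(x0) grad(x0) for
# f(x1, x2) = (x1^3 - x2)^2 + 2*(x2 - x1)^4 (inferred from H above).
def grad(x1, x2):
    return (6*x1**2*(x1**3 - x2) - 8*(x2 - x1)**3,
            -2*(x1**3 - x2) + 8*(x2 - x1)**3)

def hess(x1, x2):
    h11 = 12*x1*(x1**3 - x2) + 18*x1**4 + 24*(x2 - x1)**2
    h12 = -6*x1**2 - 24*(x2 - x1)**2
    h22 = 2 + 24*(x2 - x1)**2
    return ((h11, h12), (h12, h22))

x0 = (0.0, 1.0)
g = grad(*x0)                        # (-8, 10)
(a, b), (_, d) = hess(*x0)           # [[24, -24], [-24, 26]]
det = a*d - b*b                      # det H(0,1) = 48

# Apply the explicit 2x2 inverse to g: s = H^{-1} g, then step.
s = ((d*g[0] - b*g[1]) / det, (a*g[1] - b*g[0]) / det)
x1 = (x0[0] - s[0], x0[1] - s[1])
print(x1)                            # ~ (-2/3, 0)
```

The computed step matches the hand calculation: $H^{-1}\nabla f = (2/3,\, 1)^T$, so $x_1 = (-2/3,\, 0)^T$.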
6. Consider the quadratic optimization problem:
$$\min\ -15x_1 - 30x_2 - 4x_1 x_2 + 2x_1^2 + 4x_2^2$$
subject to
$$x_1 + 2x_2 \le 30, \quad x_1 \ge 0, \quad x_2 \ge 0$$
(a) Find the matrix $Q$ and vector $c$ such that the objective function above can be expressed as $\frac{1}{2}x^T Q x + c^T x$.
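For part (a), the symmetric choice $Q = \begin{pmatrix} 4 & -4 \\ -4 & 8 \end{pmatrix}$ and $c = \begin{pmatrix} -15 \\ -30 \end{pmatrix}$ reproduces the objective; a quick numerical check of this proposed answer:

```python
import random

# Proposed answer for (a): symmetric Q and c such that
# 0.5*x^T Q x + c^T x equals -15*x1 - 30*x2 - 4*x1*x2 + 2*x1^2 + 4*x2^2.
Q = ((4.0, -4.0), (-4.0, 8.0))
c = (-15.0, -30.0)

def quad_form(x1, x2):
    # 0.5 * x^T Q x + c^T x, expanded for the 2x2 case
    qx = 0.5*(Q[0][0]*x1*x1 + 2*Q[0][1]*x1*x2 + Q[1][1]*x2*x2)
    return qx + c[0]*x1 + c[1]*x2

def objective(x1, x2):
    return -15*x1 - 30*x2 - 4*x1*x2 + 2*x1**2 + 4*x2**2

random.seed(0)
for _ in range(5):
    u, v = random.uniform(-5, 5), random.uniform(-5, 5)
    assert abs(quad_form(u, v) - objective(u, v)) < 1e-9
print("Q and c reproduce the objective")
```

Note that $\frac{1}{2}(4x_1^2) = 2x_1^2$, $\frac{1}{2}\cdot 2(-4)x_1 x_2 = -4x_1 x_2$, and $\frac{1}{2}(8x_2^2) = 4x_2^2$ term by term.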