Multi-Variable Optimization: Min f(x1, x2, ..., xN)
$x^{(0)} = (0, 0)^T$, $f = f((0, 0)^T) = 170$
$(x^{(0)} - \Delta_1 x) = (-0.5, 0.5)^T$, $f = f((-0.5, 0.5)^T) = 171.81$
[Figure: three unidirectional searches in the (x1, x2) plane through the points x1, y1, x2, y2]
Or, in short:
FROM x1, GET y1 ALONG (1,0)
FROM y1, GET x2 ALONG (0,1)
FROM x2, GET y2 ALONG (1,0)
TAKE (y2 − y1)
An alternative to the above method:
- One point (x1) and both coordinate directions ((1,0)^T and (0,1)^T)
- can be used to create a pair of conjugate directions (d and (y2 − y1)).
• Point y1 is obtained by a unidirectional search along (1,0)^T from the point x1.
• Point x2 is obtained by a unidirectional search along (0,1)^T from the point y1.
• Point y2 is obtained by a unidirectional search along (1,0)^T from the point x2 (a sketch of the construction follows).
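Below is a minimal Python sketch of this construction (not from the slides): it assumes the Himmelblau function from this lecture as the objective, and the golden-section helper line_minimize with the bracket [−5, 5] is an illustrative stand-in for any unidirectional search.

```python
import numpy as np

def himmelblau(x):
    # f(x1, x2) = (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

def line_minimize(f, x, d, lo=-5.0, hi=5.0, tol=1e-6):
    """Unidirectional search: golden-section minimization of f(x + a*d) over a."""
    g = (np.sqrt(5.0) - 1.0) / 2.0
    a, b = lo, hi
    while b - a > tol:
        c, e = b - g * (b - a), a + g * (b - a)
        if f(x + c * d) < f(x + e * d):
            b = e       # minimum lies in [a, e]
        else:
            a = c       # minimum lies in [c, b]
    return x + 0.5 * (a + b) * d

x1 = np.array([0.0, 0.0])                                 # an arbitrary start
y1 = line_minimize(himmelblau, x1, np.array([1.0, 0.0]))  # along (1,0)^T from x1
x2 = line_minimize(himmelblau, y1, np.array([0.0, 1.0]))  # along (0,1)^T from y1
y2 = line_minimize(himmelblau, x2, np.array([1.0, 0.0]))  # along (1,0)^T from x2
d = y2 - y1                                               # conjugate to (1,0)^T
print("new direction:", d)
```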
• Solution
Step 1: Begin with a point $x^{(0)} = (0, 4)^T$.
ELSE set $s^{(j)} = s^{(j-1)}\ \forall j$, $s^{(1)} = d/\|d\|$, and GOTO Step 2.
Powell’s method with N variables
• Start from x1.
• Get y1 by searching along s1.
• Find y2 by searching along s2, s3, s4, ..., sN, and then s1 again.
• (y2 − y1) is conjugate to s1.
• Replace sN by (y2 − y1) and repeat the same procedure starting from s2 (see the sketch below).
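A compact sketch of this N-variable loop (reusing himmelblau and line_minimize from the sketch above); the convergence test on ‖y2 − y1‖ and the iteration cap are assumptions, and practical implementations also restart with fresh coordinate directions to avoid linearly dependent direction sets.

```python
import numpy as np

def powell(f, x, tol=1e-8, max_iter=200):
    n = len(x)
    S = [np.eye(n)[i] for i in range(n)]     # start with the coordinate directions
    for _ in range(max_iter):
        y1 = line_minimize(f, x, S[0])       # search along s1 to get y1
        y2 = y1
        for s in S[1:] + [S[0]]:             # then along s2, ..., sN and s1 again
            y2 = line_minimize(f, y2, s)
        d = y2 - y1                          # (y2 - y1) is conjugate to s1
        if np.linalg.norm(d) < tol:
            return y2
        S = S[1:] + [d / np.linalg.norm(d)]  # drop s1, append the new direction
        x = y2
    return x

print(powell(himmelblau, np.array([0.0, 4.0])))  # start from the slides' (0,4)^T
```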
GRADIENT BASED METHODS
• These methods exploit derivative information of the function and are typically faster than direct search methods.
• Cannot be applied to problems where the
objective function is discrete or discontinuous.
• Efficient when the derivative information is easily
available
• Some algorithms require first-order derivatives, while others require both first- and second-order derivatives of the objective function.
• The derivatives can be obtained by numerical computation, as in the sketch below.
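For instance, a first-order derivative can be approximated by central differences; this helper is a sketch (the step size h is an assumed choice), reusing himmelblau from the earlier sketch.

```python
import numpy as np

def num_gradient(f, x, h=1e-6):
    """Central-difference approximation of the gradient of f at x."""
    x = np.asarray(x, dtype=float)
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

print(num_gradient(himmelblau, [1.0, 1.0]))  # approximately (-46, -38)
```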
• Methods in gradient search:
– Descent direction
– Cauchy's (steepest descent) method
– Newton's method
– Marquardt's method
– Conjugate gradient method
– Variable-metric method
• By definition, the first derivative $\nabla f(x^{(t)})$ at any point $x^{(t)}$ represents the direction of maximum increase of the function value.
• If we are interested in finding a point with the minimum function value, we should ideally search along the direction opposite to the first derivative, that is, along $-\nabla f(x^{(t)})$.
• Any search made in this direction will, at least locally, yield a smaller function value.
• DESCENT DIRECTION
A search direction $d^{(t)}$ is a descent direction at a point $x^{(t)}$ if the condition $\nabla f(x^{(t)}) \cdot d^{(t)} \le 0$ is satisfied in the vicinity of the point $x^{(t)}$. The steepest descent direction is the one along which $\nabla f(x^{(t)}) \cdot d^{(t)}$ is maximally negative.
Note: $\nabla = \left[ \frac{\partial}{\partial x_1}\ \frac{\partial}{\partial x_2}\ \frac{\partial}{\partial x_3}\ \cdots\ \frac{\partial}{\partial x_n} \right]$
Example
• Consider the Himmelblau function: $f(x_1, x_2) = (x_1^2 + x_2 - 11)^2 + (x_1 + x_2^2 - 7)^2$. At the point $x^{(t)} = (1, 1)^T$ the gradient is $\nabla f(x^{(t)}) = (-46, -38)^T$, so for the search direction $d^{(t)} = (1, 0)^T$ the dot product is $\nabla f(x^{(t)}) \cdot d^{(t)} = -46$.
• The above is a negative quantity, thus the search direction is a descent direction.
• The magnitude of the negative dot product suggests the extent of descent in the direction.
• If the search direction $d^{(t)} = -\nabla f(x^{(t)}) = (46, 38)^T$ is used, the magnitude of the above dot product is $(-46, -38) \cdot (46, 38)^T = -3560$.
• The direction $(46, 38)^T$, or normalized $(0.771, 0.637)^T$, gives a steeper descent than $(1, 0)^T$.
• The above direction is the steepest-descent direction at the point $x^{(t)}$.
• For a nonlinear function, the steepest-descent direction at a point may not pass exactly through the true minimum.
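A quick numerical check of this example, using the analytic gradient of the Himmelblau function (the point (1, 1)^T and the two search directions are as read from the example above):

```python
import numpy as np

def grad_himmelblau(x):
    # analytic gradient of (x1^2 + x2 - 11)^2 + (x1 + x2^2 - 7)^2
    df1 = 4.0 * x[0] * (x[0]**2 + x[1] - 11) + 2.0 * (x[0] + x[1]**2 - 7)
    df2 = 2.0 * (x[0]**2 + x[1] - 11) + 4.0 * x[1] * (x[0] + x[1]**2 - 7)
    return np.array([df1, df2])

g = grad_himmelblau(np.array([1.0, 1.0]))
print(g)                            # [-46. -38.]
print(g @ np.array([1.0, 0.0]))     # -46.0: (1,0)^T is a descent direction
print(g @ -g)                       # -3560.0: along the steepest descent direction
```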
Cauchy's (steepest descent) method:
• Search direction: $s^{(k)} = -\nabla f(x^{(k)})$
• STEP 3: IF $\|\nabla f(x^{(k)})\| \le \epsilon_1$, TERMINATE. IF $k \ge M$, TERMINATE.
• STEP 5: IF $\|x^{(k+1)} - x^{(k)}\| / \|x^{(k)}\| \le \epsilon_1$, TERMINATE; ELSE set $k = k + 1$ and GOTO STEP 2.
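A minimal sketch of this loop, reusing himmelblau, grad_himmelblau, and line_minimize from the earlier sketches; the tolerances, the normalization of s, and the positive step bracket are assumptions.

```python
import numpy as np

def steepest_descent(f, grad, x0, eps1=1e-6, M=200):
    x = np.asarray(x0, dtype=float)
    for k in range(M):                           # stop if k reaches M
        g = grad(x)
        if np.linalg.norm(g) <= eps1:            # STEP 3: gradient small enough
            break
        s = -g / np.linalg.norm(g)               # s^(k) = -grad f(x^(k)), normalized
        x_new = line_minimize(f, x, s, lo=0.0)   # unidirectional search along s
        # STEP 5: relative change in x small enough
        if np.linalg.norm(x_new - x) / max(np.linalg.norm(x), 1e-12) <= eps1:
            x = x_new
            break
        x = x_new
    return x

print(steepest_descent(himmelblau, grad_himmelblau, [0.0, 0.0]))
```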
[Figure: feasible and infeasible regions in the (x1, x2) plane, with the minimum point marked]
Process
1. Choose $\epsilon_1$, $\epsilon_2$, R, $\Omega$.
2. Form the modified objective function $P(x^{(k)}, R^{(k)}) = f(x^{(k)}) + \Omega(R^{(k)}, g(x^{(k)}), h(x^{(k)}))$
– bracket operator: $\langle A \rangle = A$ if $A < 0$, and 0 otherwise
– exterior penalty: the search approaches the feasible region from outside
– R is initially small and is increased between iterations (see the sketch below)
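A hedged sketch of the exterior bracket-operator penalty; the specific form Ω = R·Σ⟨g⟩², the example constraint g1, and the schedule for R are illustrative assumptions, and powell and himmelblau are reused from the earlier sketches.

```python
import numpy as np

def bracket(a):
    return a if a < 0.0 else 0.0        # <A> = A if A < 0, else 0

def penalized(f, gs, R):
    # P(x, R) = f(x) + R * sum_i <g_i(x)>^2  (exterior penalty form; an assumption)
    return lambda x: f(x) + R * sum(bracket(g(x))**2 for g in gs)

g1 = lambda x: x[0] - 2.0               # hypothetical constraint: g1(x) = x1 - 2 >= 0
R = 0.1                                 # initially small R
x = np.array([0.0, 0.0])
for _ in range(6):
    x = powell(penalized(himmelblau, [g1], R), x)  # any unconstrained minimizer works
    R *= 10.0                           # increase the penalty parameter
print(x)
```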
Direct search
• Variable elimination
– for equality constraints
– express one variable as a function of the others, and
– eliminate that variable
– repeating this removes all equality constraints (a worked example follows)
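As a quick worked illustration (an assumed example, not from the slides): to minimize $f(x_1, x_2) = x_1^2 + x_2^2$ subject to $h(x) = x_1 + x_2 - 1 = 0$, express $x_2 = 1 - x_1$ and substitute, leaving the unconstrained problem $\min_{x_1}\; x_1^2 + (1 - x_1)^2$; setting the derivative $4x_1 - 2 = 0$ gives $x_1 = x_2 = 1/2$.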
Complex search
– generate a set of points at random
– if a point is infeasible, push it towards the centroid of the remaining feasible points
– take the worst point and reflect it beyond the centroid of the remaining points
Complex Search Algo
• S1: Assume bounds on x (xL, xU), a reflection parameter α, and termination parameters ε and δ.
• S2: Generate a set of P (= 2N) initial points
– For each point:
• sample N times to determine xi(p) within the given bounds
• if x(p) is infeasible, calculate the centroid x̄ of the current set of points and set x(p) = x(p) + ½(x̄ − x(p)) until x(p) is feasible
• if x(p) is feasible, continue until you have P points
Complex Search Algo (contd)
• S3: Reflection step
– select xR such that f(xR) = max f(x(p)) = Fmax
– calculate the centroid x̄ of the remaining points
– xm = x̄ + α(x̄ − xR)
– if xm is feasible and f(xm) ≥ Fmax, retract half the distance to the centroid x̄; continue until f(xm) < Fmax
– if xm is feasible and f(xm) < Fmax, go to S5
– if xm is infeasible, go to S4
Complex Search Algo (contd)
• S4: Check the feasibility of the solution
– for all i, reset violated variable bounds:
• if xim < xiL, set xim = xiL
• if xim > xiU, set xim = xiU
– if the resulting xm is still infeasible, retract half the distance to the centroid; repeat until xm is feasible
Complex Search Algo (contd)
• S5: Replace xR by xm and check for termination:
– fmean = mean of the f(x(p)) values, xmean = mean of the points x(p)
– terminate when
$\sum_p \left( f(x^{(p)}) - f_{mean} \right)^2 \le \epsilon$ and $\sum_p \left\| x^{(p)} - x_{mean} \right\|^2 \le \delta$
(a condensed implementation sketch follows)
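A condensed sketch of steps S1–S5, reusing himmelblau from the earlier sketches; the feasibility test (here only the variable bounds), α = 1.3, the random seed, and the termination test on the spread of f are simplifying assumptions.

```python
import numpy as np

def complex_search(f, xL, xU, feasible, alpha=1.3, eps=1e-10, max_iter=1000):
    P = 2 * len(xL)                             # S2: P = 2N points
    rng = np.random.default_rng(0)
    x = rng.uniform(xL, xU)
    while not feasible(x):                      # first point must be feasible
        x = rng.uniform(xL, xU)
    pts = [x]
    while len(pts) < P:
        x = rng.uniform(xL, xU)
        c = np.mean(pts, axis=0)
        while not feasible(x):
            x = x + 0.5 * (c - x)               # push halfway towards the centroid
        pts.append(x)
    pts = np.array(pts)
    for _ in range(max_iter):
        fv = np.array([f(p) for p in pts])
        if np.sum((fv - fv.mean())**2) <= eps:  # S5: terminate on small f-spread
            break
        r = int(np.argmax(fv))                  # S3: worst point xR
        c = np.mean(np.delete(pts, r, axis=0), axis=0)
        xm = c + alpha * (c - pts[r])           # reflect beyond the centroid
        xm = np.clip(xm, xL, xU)                # S4: reset violated variable bounds
        while (not feasible(xm)) or f(xm) >= fv[r]:
            xm = xm + 0.5 * (c - xm)            # retract half the distance
            if np.linalg.norm(xm - c) < 1e-14:
                break
        pts[r] = xm                             # S5: replace the worst point
    return pts[int(np.argmin([f(p) for p in pts]))]

best = complex_search(himmelblau, np.array([0.0, 0.0]), np.array([5.0, 5.0]),
                      feasible=lambda x: True)
print(best)
```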
Characteristics of complex search
• Works for complex feasible regions.
• If the optimum is well inside the search space, the algorithm is efficient.
• Not as good if the search space is narrow, or if the optimum is close to the constraint boundary.