5 - Improving Search
OPTIMIZATION
Moises Sudit
Bell Hall 415
[email protected]
Office: 716-645-2423
Cell: 716-316-6617 (emergency only)
Introduction Improving Search
Two important components of Improving Search:
- Moving direction Δx
- Step size multiplier λ > 0
The update rule: x^(t+1) = x^t + λ·Δx
[Figure: from x1 = (1,3), direction Δx = (0.75, 0.5) with λ = 4 reaches x2 = (4,5); from x2, direction Δx = (-1.5, 2) with λ = 2 reaches x3 = (1,9).]
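The update is easy to check numerically; a minimal Python sketch reproducing the figure's two moves:

import numpy as np

# x_{t+1} = x_t + lambda * dx, using the directions and step sizes above
x1 = np.array([1.0, 3.0])
x2 = x1 + 4.0 * np.array([0.75, 0.5])   # -> [4. 5.]
x3 = x2 + 2.0 * np.array([-1.5, 2.0])   # -> [1. 9.]
print(x2, x3)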
Introduction Improving Search
Vector Δx is an improving direction at current solution x^t if the objective function value at x^t + λ·Δx is superior to that at x^t for all λ > 0 sufficiently small.
Introduction Improving Search
Vector Δx is a feasible direction at current x^t if point x^t + λ·Δx violates no model constraint for λ > 0 sufficiently small.
- Is Δx = (0,1) feasible at x1 = (7,0)?
- Is Δx = (0,1) feasible at x2 = (4,4)?
- Is Δx = (0,1) feasible at x3 = (0.75,6)?
[Figure: feasible region with the points x1 = (7,0), x2 = (4,4), and x3 = (0.75,6) marked.]
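(Answers, using the constraints spelled out on the later DCLUB slides: at x1 = (7,0) only x2 ≥ 0 is active and Δx2 = 1 ≥ 0, so the direction is feasible; at x2 = (4,4) no constraint is active, so every direction is feasible; at x3 = (0.75,6) the constraint x2 ≤ 6 is active and Δx2 = 1 > 0, so the direction is not feasible.)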
Introduction Improving Search
How big a step size λ > 0?
Take the maximum step λ > 0 such that the selected move direction continues to retain feasibility and improve the objective function.

Min 10w1 + 3w2
s.t.: w1 + w2 ≤ 9
      w1, w2 ≥ 0
Current solution w^19 = (4,5), direction Δw = (-3,-8).

Since the objective only keeps improving as λ > 0 grows along the given direction, the only thing we need to be concerned about is feasibility.
Introduction Improving Search
w^20 = w^19 + λ·Δw = (4 - 3λ, 5 - 8λ)
Substituting into the constraints:
w1 + w2 ≤ 9: (4 - 3λ) + (5 - 8λ) ≤ 9 ⟹ -11λ ≤ 0, satisfied for every λ ≥ 0
w1 ≥ 0: 4 - 3λ ≥ 0 ⟹ λ ≤ 4/3
w2 ≥ 0: 5 - 8λ ≥ 0 ⟹ λ ≤ 5/8
So how big can λ be?
λ = min(4/3, 5/8) = 5/8
So w^20 = (2.125, 0), which means the objective improves from 55 to 21.25.
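This minimum-ratio computation is easy to automate; a small Python sketch with the numbers above:

# Step size via minimum ratio: w19 = (4, 5), dw = (-3, -8).
# w1 + w2 <= 9 only gets slacker along dw (-11*lam <= 0), so it never limits.
w, dw = [4.0, 5.0], [-3.0, -8.0]
ratios = [wj / -dwj for wj, dwj in zip(w, dw) if dwj < 0]  # only w_j >= 0 limits
lam = min(ratios)                                          # min(4/3, 5/8) = 5/8
w_new = [wj + lam * dwj for wj, dwj in zip(w, dw)]
print(lam, w_new)                                          # 0.625 [2.125, 0.0]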
Introduction Improving Search
Algorithm
Step 0: Initialization. Choose any starting feasible solution x^0, and set solution index t ← 0.
Step 1: Local Optimum. If no improving feasible direction Δx exists at x^t, stop: x^t is a local optimum.
Step 2: Move Direction. Construct an improving feasible direction at x^t as Δx^(t+1).
Step 3: Step Size. Choose the largest step size λ_(t+1) such that direction Δx^(t+1) continues to both improve the objective function and retain feasibility. If there is no limit, stop: the model is unbounded.
Step 4: Advance. Update
x^(t+1) ← x^t + λ_(t+1)·Δx^(t+1)
Then increment t ← t + 1 and return to Step 1.
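A minimal Python sketch of this template; the direction and step-size rules are left as plug-in functions, since the slides specify only their contracts (the helper names are hypothetical):

import numpy as np

def improving_search(x0, find_direction, max_step, max_iters=1000):
    """Generic continuous improving search (Steps 0-4 above)."""
    x = np.asarray(x0, dtype=float)      # Step 0: starting feasible solution
    for t in range(max_iters):
        dx = find_direction(x)           # Steps 1-2: improving feasible direction
        if dx is None:                   # Step 1: none exists -> local optimum
            return x, "local optimum"
        lam = max_step(x, dx)            # Step 3: largest step keeping both
        if lam == np.inf:                #   properties; no limit -> unbounded
            return x, "unbounded"
        x = x + lam * dx                 # Step 4: advance, then repeat
    return x, "iteration limit reached"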
Introduction Improving Search
Some important conclusions:
- No optimization model solution at which an improving feasible direction is available can be a local optimum.
- When a continuous improving search terminates at a solution admitting no improving feasible direction, that point is a local optimum.
- If an improving search discovers an improving feasible direction for a model that can be pursued forever without ceasing to improve or losing feasibility, the model is unbounded.
Introduction Improving Search
Algebraic Conditions for Improving and Feasible Directions
The gradient of f(x) = f(x1, …, xn), denoted ∇f(x), is the vector of partial derivatives ∇f(x) = (∂f/∂x1, …, ∂f/∂xn) evaluated at x.
Recall the basic derivative rules:
d(x^a)/dx = a·x^(a-1)
d(g(x)/h(x))/dx = [h(x)·(dg/dx) - g(x)·(dh/dx)] / h(x)²
The difference between “d” and “∂” is simply that partial derivatives show rates of change with respect to a single variable, with all others held constant.
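Gradients can be sanity-checked numerically; a small Python sketch using central finite differences on an illustrative function (my example, not one from the slides) that exercises both rules above:

def grad(f, x, h=1e-6):
    """Approximate (∂f/∂x1, ..., ∂f/∂xn) one coordinate at a time."""
    g = []
    for i in range(len(x)):
        xp, xm = list(x), list(x)
        xp[i] += h
        xm[i] -= h
        g.append((f(xp) - f(xm)) / (2 * h))
    return g

f = lambda x: x[0] ** 3 / (1 + x[1] ** 2)   # uses the power and quotient rules
print(grad(f, [2.0, 1.0]))                  # analytic gradient there: (6.0, -4.0)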
Introduction Improving Search
Conditions for Improving Directions
Given ∇f(x)·Δx ≠ 0, we can conclude the following two facts:
- If ∇f(x)·Δx > 0, then Δx is an improving direction at x for a maximize objective f.
- If ∇f(x)·Δx < 0, then Δx is an improving direction at x for a minimize objective f.
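These two facts reduce the improving-direction test to the sign of one dot product; a minimal Python sketch:

def is_improving(grad_at_x, dx, sense="max"):
    d = sum(g * v for g, v in zip(grad_at_x, dx))   # gradient dot direction
    return d > 0 if sense == "max" else d < 0

# Using the DCLUB numbers quoted a few slides below:
print(is_improving((-1.60, 1.45), (-1, 1), "max"))  # True: 1.60 + 1.45 = 3.05 > 0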
DCLUB Example
The objective is an inverse-gravity population function: each population center contributes population / (1 + squared distance to the center).
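A small Python sketch of this objective form. The centers and populations below are reconstructed to match the gradient ∇p(2,0) = (-1.60, 1.45) quoted on the next slide, so treat them as inferred rather than official data:

def p(x1, x2, centers):
    """Inverse-gravity objective: sum of population / (1 + squared distance)."""
    return sum(pop / (1.0 + (x1 - cx) ** 2 + (x2 - cy) ** 2)
               for pop, cx, cy in centers)

centers = [(60, -1, 3), (20, 1, 3), (30, 0, -4)]   # inferred (pop, cx, cy) data
print(p(2.0, 0.0, centers))                        # objective value at (2, 0)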
Introduction Improving Search
Example for a Maximization Problem (DCLUB)
At x = (2,0) the gradient is ∇p(2,0) = (-1.60, 1.45). For Δx = (-1, 1):
∇p(2,0)·Δx = (-1.60)(-1) + (1.45)(1) = 3.05 > 0,
so Δx is an improving direction for this maximize objective.
For x = (7,0), the only constraint that can be violated by a small step is x2 ≥ 0, since it is the only constraint active there.
[Figure: feasible region defined by 0.4x1 + 0.2x2 ≥ 1.5, x2 ≤ 6, and x2 ≥ 0, with the points (0.75, 6) and (7, 0) marked.]
Introduction Improving Search
Whether a direction is feasible at a solution x depends on whether it would lead to immediate violation of any active constraint (also called a tight or binding constraint), i.e., any constraint satisfied as an equality at x.
This gives us an algorithm to check feasible directions.
Take x = (0.75, 6) and Δx = (Δx1, Δx2). The constraint 0.4x1 + 0.2x2 ≥ 1.5 is active there: 0.4(0.75) + 0.2(6) = 1.5 ≥ 1.5.
By taking a step in the desired improving direction we get:
x* = (0.75 + λΔx1, 6 + λΔx2)
0.4(0.75 + λΔx1) + 0.2(6 + λΔx2) ≥ 1.5
1.5 + λ(0.4Δx1 + 0.2Δx2) ≥ 1.5 ⟹ 0.4Δx1 + 0.2Δx2 ≥ 0
Similarly, for the active constraint x2 ≤ 6: 6 + λΔx2 ≤ 6 ⟹ Δx2 ≤ 0
Introduction Improving Search
So in general we can create a test to check whether an improving direction is also feasible:
Direction Δx = (Δx1, …, Δxn) is feasible for a linearly constrained optimization model at solution x = (x1, …, xn) if and only if, for each constraint active at x (sums over j = 1, …, n):
For Σj aj·xj ≥ b:  a·Δx = Σj aj·Δxj ≥ 0
For Σj aj·xj ≤ b:  a·Δx = Σj aj·Δxj ≤ 0
For Σj aj·xj = b:  a·Δx = Σj aj·Δxj = 0
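A direct Python implementation of these three rules, checked at the point x = (0.75, 6) used above with the two constraints active there (the test directions are illustrative):

def direction_ok(a, sense, b, x, dx, tol=1e-9):
    """Test one linear constraint a·x (sense) b against direction dx."""
    if abs(sum(ai * xi for ai, xi in zip(a, x)) - b) > tol:
        return True                     # inactive constraints never bind dx
    d = sum(ai * di for ai, di in zip(a, dx))
    if sense == ">=":
        return d >= -tol
    if sense == "<=":
        return d <= tol
    return abs(d) <= tol                # equality constraints

x = (0.75, 6.0)
for dx in [(1.0, -2.0), (0.0, 1.0)]:
    ok = (direction_ok((0.4, 0.2), ">=", 1.5, x, dx)
          and direction_ok((0.0, 1.0), "<=", 6.0, x, dx))
    print(dx, ok)   # (1,-2) -> True; (0,1) -> False (violates x2 <= 6)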
Introduction Improving Search
So given an optimization problem, we know how to obtain an improving direction and how to check it for feasibility.
This also makes it clear that problems in which every local optimum is a global optimum become tractable by some kind of improving-search procedure.
To determine whether a problem is tractable in this sense (every local optimum is also a global optimum), we need to analyze the two components of an optimization problem: the objective function and the constraints.
Introduction Improving Search
Properties of the objective function needed for tractability
An objective function f(x) is unimodal if the straight-line direction from every point in its domain to every better point is an improving direction. That is, for every x1 and every x2 with a better objective function value, direction Δx = (x2 - x1) should be improving at x1.
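A quick numeric illustration (the double-well function and points below are my own example, not from the slides): for a function that is not unimodal, the straight-line direction from one local minimum toward a strictly better point can initially worsen the objective, so it is not an improving direction.

# f has two local minima; x2 is strictly better than x1 (minimization),
# yet a small step from x1 toward x2 increases f.
f = lambda x: x**4 - 2 * x**2 + 0.5 * x

x1, x2 = 0.93, -1.05        # near the worse and the better local minimum
assert f(x2) < f(x1)        # x2 is the better point
lam, dx = 1e-3, x2 - x1
print(f(x1 + lam * dx) < f(x1))   # False: the direction is not improving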
Introduction Improving Search
Theorem: Linear objective functions are unimodal in both maximize and minimize optimization models.
Proof (maximize case):
Let x1 and x2 be two feasible points in a maximization linear program with f(x2) > f(x1). Then
f(x2) - f(x1) = Σj cj·x2_j - Σj cj·x1_j > 0   (since f(x2) > f(x1))
⟹ Σj cj·(x2_j - x1_j) > 0
⟹ c·(x2 - x1) > 0
For a linear objective, the gradient ∇f(x) = (∂f/∂x1, …, ∂f/∂xn) equals the constant cost vector c (e.g., f(x) = 2x1 - 4x2 gives ∇f(x) = (2, -4) = c), so
∇f(x1)·(x2 - x1) > 0
⟹ ∇f(x1)·Δx > 0, by defining Δx = (x2 - x1) as the improving direction. ∎
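A quick Monte Carlo check of this argument with the slide's cost vector c = (2, -4) (the random sampling is illustrative):

import numpy as np

c = np.array([2.0, -4.0])                 # ∇f(x) = c everywhere for linear f
rng = np.random.default_rng(0)
for _ in range(1000):
    x1, x2 = rng.uniform(-10, 10, 2), rng.uniform(-10, 10, 2)
    if c @ x2 > c @ x1:                   # x2 is the better point (maximize)
        assert c @ (x2 - x1) > 0          # so x2 - x1 is improving at x1
print("every sampled better point gives an improving direction")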
Introduction Improving Search
Properties of the constraints needed for tractability
The feasible set of an optimization problem is convex if the line segment between every pair of feasible points falls entirely within the feasible region.
The line segment between vector solutions x1 and x2 consists of all points of the form x1 + λ(x2 - x1) with 0 ≤ λ ≤ 1.
Introduction Improving Search
Theorem: If all constraints of an optimization model are linear, its feasible space is convex.
Proof:
Let x1 and x2 be feasible, so for a representative constraint Σj aj·xj ≤ b we have Σj aj·x1_j ≤ b and Σj aj·x2_j ≤ b. Then we need to show that any point on the segment between them, x1 + λ(x2 - x1) with 0 ≤ λ ≤ 1, is also feasible:
Σj aj·(x1_j + λ(x2_j - x1_j)) = Σj aj·x1_j + λ·Σj aj·(x2_j - x1_j)
= Σj aj·x1_j + λ·Σj aj·x2_j - λ·Σj aj·x1_j
= (1 - λ)·Σj aj·x1_j + λ·Σj aj·x2_j
≤ (1 - λ)·b + λ·b
= b
The same argument applies to ≥ and = constraints, so every segment point satisfies every constraint. ∎
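A numeric illustration using the earlier example's constraint w1 + w2 ≤ 9 and two feasible points from that example:

import numpy as np

a, b = np.array([1.0, 1.0]), 9.0
x1, x2 = np.array([4.0, 5.0]), np.array([2.125, 0.0])   # both satisfy a·x <= b
for lam in np.linspace(0.0, 1.0, 11):
    x = x1 + lam * (x2 - x1)              # point on the segment
    assert a @ x <= b + 1e-12             # (1-lam)·a·x1 + lam·a·x2 <= b
print("the whole segment between x1 and x2 is feasible")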
Introduction Improving Search
For every Linear Program the Objective Function is Unimodal and the Constraints form a Convex Feasible Region. IMPLICATIONS ARE HUGE!!!
Linear Programs are TRACTABLE, since every Local Optimum is a Global Optimum.