Lecture 8
\[
F = \bigcup_{i=1}^{k} P_i + \operatorname{intcone}\{r^1, \ldots, r^t\}
\]
Valid Disjunctions
Definition 1. Let $\{X_i\}_{i=1}^{k}$ be a collection of subsets of $\mathbb{R}^n$. Then if
$\bigcup_{1 \le i \le k} X_i \supseteq S$, the disjunction associated with $\{X_i\}_{i=1}^{k}$ is said to be
valid for an MILP with feasible set $S$.
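As a concrete example (standard variable branching, not stated explicitly on this slide): if $\hat{x}$ solves the LP relaxation and $\hat{x}_j \notin \mathbb{Z}$ for some integer-constrained index $j$, then

\[
X_1 = \{x \in \mathbb{R}^n : x_j \le \lfloor \hat{x}_j \rfloor\}, \qquad
X_2 = \{x \in \mathbb{R}^n : x_j \ge \lceil \hat{x}_j \rceil\}
\]

is a valid disjunction, since every point of $S$ has an integer value of $x_j$ and therefore lies in $X_1 \cup X_2$.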
Valid Inequalities
Optimality Conditions
• In other words, we can optimize over each subset separately.
• Idea: If we can’t solve the original problem directly, we might be able to
solve the smaller subproblems recursively.
• Dividing the original problem into subproblems is called branching.
• Taken to the extreme, this scheme is equivalent to complete enumeration.
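A minimal sketch of this idea (illustrative only; the brute-force recursion below is my own, not code from the lecture): branch on one binary variable at a time and recurse on each subproblem. With no bounding at all, this is exactly complete enumeration.

def enumerate_by_branching(c, A, b, fixed):
    """Recursively branch on the next unfixed binary variable of
    max c'x s.t. Ax <= b, x binary; at a leaf, check feasibility and
    evaluate the objective. With no bounding, this is complete enumeration."""
    n = len(c)
    if len(fixed) == n:   # leaf: every variable has been fixed by branching
        if all(sum(A[i][k] * fixed[k] for k in range(n)) <= b[i]
               for i in range(len(A))):
            return sum(c[k] * fixed[k] for k in range(n)), tuple(fixed)
        return float('-inf'), None   # infeasible leaf
    # branch: the disjunction x_j = 0 or x_j = 1 is valid for binary x_j
    return max(enumerate_by_branching(c, A, b, fixed + [0]),
               enumerate_by_branching(c, A, b, fixed + [1]))

# tiny instance: max 3x0 + 2x1  s.t.  x0 + x1 <= 1,  x binary
print(enumerate_by_branching([3, 2], [[1, 1]], [1], []))   # (3, (1, 0))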
• There are many ways to construct a bounding problem and this will be
the topic of later lectures.
• The easiest of these is to form the LP relaxation $\max\{c^\top x : x \in P \cap \mathbb{R}^n_+ \cap X_i\}$,
obtained by dropping the integrality constraints (see the sketch after this list).
• For the rest of the lecture, assume all variables have finite upper and
lower bounds.
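For instance, the LP bound for a small instance can be computed by just dropping integrality and solving the relaxation. This is only an illustrative sketch using SciPy (not the course's code), and the instance data below is made up:

import numpy as np
from scipy.optimize import linprog

# LP relaxation of  max c'x  s.t.  Ax <= b, 0 <= x <= u, x integer:
# drop integrality and keep only the variable bounds.
# linprog minimizes, so we negate the objective.
c = np.array([5.0, 4.0])
A = np.array([[6.0, 4.0], [1.0, 2.0]])
b = np.array([24.0, 6.0])
u = [3.0, 3.0]   # finite upper bounds, as assumed above

res = linprog(-c, A_ub=A, b_ub=b, bounds=[(0.0, ui) for ui in u])
print("LP bound on the MILP optimum:", -res.fun)   # 21.0, an upper bound for a max problem
print("fractional LP solution:", res.x)            # [3.  1.5]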
[Figure: a partially explored branch-and-bound tree. Edges are labeled with branching constraints (e.g., x5 ≤ 0.0 / x5 ≥ 1.0), nodes with their LP bounds (root bound 185.9); the key distinguishes candidate nodes from pruned nodes.]
Termination Conditions
A Thousand Words
Global Bounds
• The pictures show the evolution of the branch and bound process.
• Nodes are pictured at a height equal to their lower bound (we are
minimizing in this case!).
– Red: candidates for processing/branching
– Green: branched or infeasible
– Turquoise: pruned by bound (possibly having produced a feasible
solution) or infeasible.
• The red line is the level of the current best solution (global upper bound).
• The level of the highest red node is the global lower bound.
• As the procedure evolves, the two bounds converge toward each other (see the sketch after this list).
• The goal is for this to happen as quickly as possible.
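A minimal sketch of how the two global bounds would be computed for a minimization problem (the function and data layout here are hypothetical, not GrUMPy's):

def global_bounds(candidate_node_bounds, incumbent_value=None):
    """Global bounds for a minimization problem:
    - the global upper bound is the value of the best feasible solution found
      so far (the red line in the pictures), or +inf if none exists yet;
    - the global lower bound is the smallest lower bound among the candidate
      (not yet processed) nodes."""
    upper = incumbent_value if incumbent_value is not None else float('inf')
    # if no candidates remain, the search is finished and the bounds coincide
    lower = min(candidate_node_bounds) if candidate_node_bounds else upper
    return lower, upper

lb, ub = global_bounds([178.3, 180.9, 181.6], incumbent_value=182.2)
print(f"gap = {ub - lb:.1f}")   # the search terminates once this gap closes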
Tradeoffs
• We will see that there are many tradeoffs to be managed in branch and
bound.
• Note that in the final tree:
– Nodes below the line were pruned by bound (and may or may not have
generated a feasible solution) or were infeasible.
– Nodes above the line were either branched or were infeasible or
generated an optimal solution.
• There is a tradeoff between the goals of moving the upper and lower
bounds:
– The nodes below the line serve to move the upper bound.
– The nodes above the line serve to move the lower bound.
• It is clear that these two goals are somewhat antithetical.
• The search strategy has to strike a balance between them (a sketch of the
two extreme node-selection rules follows).
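The following sketch (illustrative only; the function name and tuple layout are my own, not GrUMPy's) contrasts the two extreme node-selection rules for a minimization problem: best-first picks the candidate with the smallest bound and so drives the global lower bound, while depth-first dives to reach feasible solutions, and hence a better upper bound, sooner.

def select_node(candidates, strategy="best_first"):
    """candidates: list of (lower_bound, depth, node) tuples (hypothetical layout).
    best_first: process the candidate with the smallest lower bound next,
                which is what moves the global lower bound.
    depth_first: process the deepest candidate next, which tends to reach
                 feasible solutions (and a better upper bound) sooner."""
    if strategy == "best_first":
        return min(candidates, key=lambda c: c[0])
    if strategy == "depth_first":
        return max(candidates, key=lambda c: c[1])
    raise ValueError(strategy)

nodes = [(178.3, 4, "A"), (181.6, 6, "B"), (180.5, 7, "C")]
print(select_node(nodes, "best_first"))    # node "A": smallest bound
print(select_node(nodes, "depth_first"))   # node "C": deepest node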
Tradeoffs in Practice
# Assumes GrUMPy (the coinor.grumpy package) is installed; the exact import
# path of BBTree and the strategy constants may vary between versions.
from coinor.grumpy import BBTree, PSEUDOCOST_BRANCHING, BEST_FIRST

T = BBTree()
T.set_display_mode('xdot')   # draw the tree with xdot as the search proceeds
# generate a random MILP instance and solve it by branch and bound
CONSTRAINTS, VARIABLES, OBJ, MAT, RHS = T.GenerateRandomMIP(rand_seed = 19)
T.BranchAndBound(CONSTRAINTS, VARIABLES, OBJ, MAT, RHS,
                 branch_strategy = PSEUDOCOST_BRANCHING,
                 search_strategy = BEST_FIRST,
                 display_interval = 10000)
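Here, pseudocost branching chooses the branching variable based on the objective degradation observed in earlier branchings, and best-first search processes the candidate node with the best bound next, favoring movement of the global lower bound; a depth-first strategy would instead favor finding feasible solutions (and thus moving the upper bound), which is exactly the tradeoff discussed above.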