
Computational Integer Programming

Lecture 8: Branch and Bound

Dr. Ted Ralphs


Computational MILP Lecture 8 1

Reading for This Lecture

• Nemhauser and Wolsey Sections II.3.1, II.3.6, II.4.1, II.4.2, II.5.4


• Wolsey Chapter 7
Computational MILP Lecture 8 2

Computational Integer Optimization

• We now turn to the details of how integer optimization problems are


solved in practice.
• Computationally, the most important aspects of solving integer
optimization problems are
– A method for obtaining good bounds on the value of the optimal
solution (usually by solving a relaxation or dual); and
– A method for generating valid disjunctions violated by a given
(infeasible) solution.
• In this lecture, we will motivate this fact by introducing the branch and
bound algorithm.
• We will then look at various methods of obtaining bounds.
• Later, we will examine branch and bound in more detail.
Computational MILP Lecture 8 3

Integer Optimization and Disjunction

• As we know, the difficulty in solving an integer optimization problem


arises from the requirement that certain variables take on integer values.
• Such requirements can be described in terms of logical disjunctions,
constraints of the form

  x ∈ ∪1≤i≤k Xi

for Xi ⊆ Rn, i = 1, . . . , k.
• The integer variables in a given formulation may represent logical
conditions that were originally expressed in terms of disjunction.
• In fact, the MILP Representability Theorem tells us that any MILP can
be re-formulated as an optimization problem whose feasible region is a
disjunctive set of the form

  F = ∪1≤i≤k Pi + intcone{r1, . . . , rt},

for some appropriately chosen polytopes P1, . . . , Pk and vectors
r1, . . . , rt ∈ Zn.
Computational MILP Lecture 8 4

Two Conceptual Reformulations

• From what we have seen so far, we have two conceptual reformulations


of a given integer optimization problem.
• The first is in terms of disjunction:
  max { c>x | x ∈ ∪1≤i≤k Pi + intcone{r1, . . . , rt} }   (DIS)

• The second is in terms of valid inequalities:

  max { c>x | x ∈ conv(S) }   (CP)

where S is the feasible region.


• In principle, if we had a method for generating either of these
reformulations, this would lead to a practical method of solution.
• Unfortunately, these reformulations are necessarily of exponential size in
general, so there can be no way of generating them efficiently.
Computational MILP Lecture 8 5

Valid Disjunctions

• In practice, we dynamically generate parts of the reformulations (CP) and


(DIS) in order to obtain a proof of optimality for a particular instance.
• The concept of a valid disjunction arises from a desire to approximate the
feasible region of (DIS).

Definition 1. Let {Xi}ki=1 be a collection of subsets of Rn. Then if
∪1≤i≤k Xi ⊇ S, the disjunction associated with {Xi}ki=1 is said to be
valid for an MILP with feasible set S.

Definition 2. If {Xi}ki=1 is a disjunction valid for S and Xi is polyhedral


for all i ∈ {1, . . . , k}, then we say the disjunction is linear.

Definition 3. If {Xi}ki=1 is a disjunction valid for S and Xi ∩ Xj = ∅
for all i, j ∈ {1, . . . , k} with i ≠ j, we say the disjunction is partitive.

Definition 4. If {Xi}ki=1 is a disjunction valid for S that is both linear


and partitive, we call it admissible.
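
For example, for an MILP in which xj is required to be binary, the variable
disjunction X1 = {x ∈ Rn | xj ≤ 0}, X2 = {x ∈ Rn | xj ≥ 1} is valid, linear,
and partitive, and hence admissible.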
Computational MILP Lecture 8 6

Valid Inequalities

• Likewise, we can think of the concept of a valid inequality as arising from


our desire to approximate conv(S) (the feasible region of (CP)).
• The inequality denoted by (π, π0) is called a valid inequality for S if
π>x ≤ π0 for all x ∈ S.
• Note (π, π0) is a valid inequality if and only if S ⊆ {x ∈ Rn | π>x ≤ π0}.
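
For example, if S = {x ∈ Z2+ | 2x1 + 2x2 ≤ 3}, then (π, π0) with π = (1, 1) and
π0 = 1, i.e., x1 + x2 ≤ 1, is a valid inequality for S: every integer point of S
satisfies it, even though it cuts off fractional points of the formulation such
as (3/2, 0).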
Computational MILP Lecture 8 7

Optimality Conditions

• Let us now consider an MILP (A, b, c, p) with feasible set S = P ∩ (Zp+ ×
Rn−p+ ), where P is the given formulation.
• Further, let {Xi}ki=1 be a linear disjunction valid for this MILP so that
Xi ∩ P ⊆ Rn is polyhedral.
• Then maxx∈Xi∩S c>x is an MILP for all i = 1, . . . , k.
• For each i = 1, . . . , k, let Pi be a polyhedron such that Xi ∩ S ⊆ Pi ⊆
P ∩ Xi.
• In other words, Pi is a valid formulation for subproblem i, possibly
strengthened by additional valid inequalities.
• Note that {Pi}ki=1 is itself a valid linear disjunction.
• We will see why there is a distinction between Xi and Pi later on.
• Conceptually, we are combining and relaxing the formulations (CP) and
(DIS).
Computational MILP Lecture 8 8

Optimality Conditions (cont’d)

• From the disjunction on the previous slide, we obtain a relaxation of a


general MILP.
• This relaxation yields a practical set of optimality conditions.
• In particular,

  maxi∈1,...,k maxx∈Pi∩Rn+ c>x ≥ zIP,   (1)

which implies that if we have x∗ ∈ S such that

  maxi∈1,...,k maxx∈Pi∩Rn+ c>x = c>x∗,   (OPT)

then x∗ must be optimal.
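• To see why, note that ∪1≤i≤k Xi ⊇ S and Xi ∩ S ⊆ Pi for each i, so every
feasible solution lies in some Pi ∩ Rn+; this yields the bound (1), and if a
feasible x∗ attains that bound, no feasible solution can have a larger
objective value.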


Computational MILP Lecture 8 9

More on Optimality Conditions

• Although it is not obvious, these optimality conditions can be seen as a


generalization of those from LP.
• They are also the optimality conditions implicitly underlying many
advanced algorithms.
• There is an associated duality theory that we will see later.
• By parameterizing (1), we obtain a “dual function” that is the solution
to a dual that generalizes the LP dual.
Computational MILP Lecture 8 10

Branch and Bound

• Branch and bound is the most commonly-used algorithm for solving


MILPs.
• It is a recursive, divide-and-conquer approach.
• Suppose S is the feasible set for an MILP and we wish to compute
maxx∈S c>x.
• Consider a partition of S into subsets S1, . . . , Sk. Then

  maxx∈S c>x = max1≤i≤k { maxx∈Si c>x }.
• In other words, we can optimize over each subset separately.
• Idea: If we can’t solve the original problem directly, we might be able to
solve the smaller subproblems recursively.
• Dividing the original problem into subproblems is called branching.
• Taken to the extreme, this scheme is equivalent to complete enumeration.
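
As a small sanity check on the identity above, here is a tiny illustration in
Python with made-up data (the feasible points and objective are hypothetical,
chosen only to show that optimizing over the pieces of a partition recovers the
overall optimum):

# Hypothetical feasible points (x1, x2) and objective c>x = 2*x1 + 1*x2
S = [(0, 3), (1, 2), (2, 0), (3, 1)]
obj = lambda x: 2 * x[0] + 1 * x[1]

# Partition S using the disjunction x1 <= 1 OR x1 >= 2
S1 = [x for x in S if x[0] <= 1]
S2 = [x for x in S if x[0] >= 2]

# The maximum over S equals the maximum of the maxima over the parts
assert max(map(obj, S)) == max(max(map(obj, S1)), max(map(obj, S2)))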
Computational MILP Lecture 8 11

Branching in Branch and Bound

• Branching is achieved by selecting an admissible disjunction {Xi}ki=1 and


using it to partition S, e.g., Si = S ∩ Xi.
• We only consider linear disjunctions so that the subproblems remain
MILPs after branching.
• The reason for choosing partitive disjunctions is self-evident.
• The way this disjunction is selected is called the branching method and
is a topic we will examine in some depth.
• Generally speaking, we want x∗ ∉ ∪1≤i≤k Xi, where x∗ is the (infeasible)
solution produced by solving the bounding problem associated with a
given subproblem.
• A typical disjunction is

  X1 = {xj ≤ ⌊x∗j⌋},   (2)
  X2 = {xj ≥ ⌈x∗j⌉},   (3)

where x∗ ∈ argmaxx∈P c>x.
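
As an illustration, here is a minimal Python sketch (with hypothetical helper
and argument names) of how this disjunction translates into bound changes on
the two child subproblems, assuming each subproblem is represented simply by
lower and upper bound vectors on the variables:

import math

def variable_branch(lower, upper, j, xstar_j):
    # Sketch of the disjunction (2)-(3): given the bound vectors of a subproblem
    # and the fractional value xstar_j of variable j, return the bounds of the
    # two children. (Hypothetical helper, not part of any particular solver.)
    left_lower, left_upper = list(lower), list(upper)
    right_lower, right_upper = list(lower), list(upper)
    left_upper[j] = math.floor(xstar_j)      # X1: x_j <= floor(x*_j)
    right_lower[j] = math.ceil(xstar_j)      # X2: x_j >= ceil(x*_j)
    return (left_lower, left_upper), (right_lower, right_upper)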


Computational MILP Lecture 8 12

Bounding in Branch and Bound

• The bounding problem is a problem solved to obtain a bound on the


optimal solution value of a subproblem maxx∈Si c>x.
• Typically, the bounding problem is either a relaxation or a dual of the
subproblem (these concepts will be defined formally in Lecture 7).
• Solving the bounding problem serves two purposes.
– In some cases, the solution x∗ to the relaxation may actually be a
feasible solution (x∗ ∈ S), in which case c>x∗ is a global lower bound
l(S).
– Bounding enables us to inexpensively obtain a bound b(Si) on the
optimal solution value of subproblem i.
• If b(Si) ≤ l(S), then Si can’t contain a solution strictly better than the
best one found so far.
• Thus, we may discard or prune subproblem i.
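
For example (with illustrative numbers), if the best solution found so far has
value l(S) = 10 and the LP relaxation of subproblem i yields b(Si) = 9.3, then
no solution in Si can improve on the incumbent in this maximization problem,
so subproblem i is pruned immediately.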
Computational MILP Lecture 8 13

Constructing a Bounding Problem

• There are many ways to construct a bounding problem and this will be
the topic of later lectures.
• The easiest of these is to form the LP relaxation maxx∈P∩Rn+∩Xi c>x,
obtained by dropping the integrality constraints.
• For the rest of the lecture, assume all variables have finite upper and
lower bounds.
Computational MILP Lecture 8 14

LP-based Branch and Bound: Initial Subproblem

• In LP-based branch and bound, we first solve the LP relaxation of the


original problem. The result is one of the following:
1. The LP is infeasible ⇒ MILP is infeasible.
2. We obtain a feasible solution for the MILP ⇒ optimal solution.
3. We obtain an optimal solution to the LP that is not feasible for the
MILP ⇒ upper bound.
• In the first two cases, we are finished.
• In the third case, we must branch and recursively solve the resulting
subproblems.
Computational MILP Lecture 8 15

Branching in LP-based Branch and Bound

• In LP-based branch and bound, the most commonly used disjunctions


are the variable disjunctions, imposed as follows:
– Select a variable i whose value x̂i is fractional in the LP solution.
– Create two subproblems.
∗ In one subproblem, impose the constraint xi ≤ ⌊x̂i⌋.
∗ In the other subproblem, impose the constraint xi ≥ ⌈x̂i⌉.
• What does it mean in a 0-1 problem?
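
In a 0-1 problem, this disjunction simply fixes the selected variable to 0 in
one subproblem and to 1 in the other. As an illustration of one common (but by
no means the only) selection rule, the sketch below picks the integer variable
whose LP value is farthest from an integer (“most fractional” branching); the
function name and tolerance are assumptions made for this example:

import math

def most_fractional_index(x, integer_indices, tol=1e-6):
    # Return the index of the integer variable whose LP value is farthest from
    # an integer, or None if the LP solution is already integer feasible.
    best_j, best_score = None, tol
    for j in integer_indices:
        frac = x[j] - math.floor(x[j])
        score = min(frac, 1.0 - frac)   # distance to the nearest integer
        if score > best_score:
            best_j, best_score = j, score
    return best_j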
Computational MILP Lecture 8 16

The Geometry of Branching

Figure 1: The original feasible region


Computational MILP Lecture 8 17

The Geometry of Branching (cont’d)

Figure 2: Branching on disjunction x1 ≤ 2 OR x1 ≥ 3


Computational MILP Lecture 8 18

Continuing the Algorithm After Branching

• After branching, we solve each of the subproblems recursively.


• Now we have an additional factor to consider.
• As mentioned earlier, if the optimal solution value to the LP relaxation
is smaller than the current lower bound, we need not consider the
subproblem further.
• This is the key to the efficiency of the algorithm.
• Terminology
– If we picture the subproblems graphically, they form a search tree.
– Each subproblem is linked to its parent and eventually to its children.
– Eliminating a problem from further consideration is called pruning.
– The act of bounding and then branching is called processing.
– A subproblem that has not yet been considered is called a candidate
for processing.
– The set of candidates for processing is called the candidate list.
Computational MILP Lecture 8 19

The Geometry of Branching

Figure 3: Branching on disjunction x1 ≤ 4 OR x1 ≥ 5 in Subproblem 2


Computational MILP Lecture 8 20

LP-based Branch and Bound Algorithm

1. To start, derive a lower bound L using a heuristic method.


2. Put the original problem on the candidate list.
3. Select a problem S from the candidate list and solve the LP relaxation
to obtain the bound b(S).
• If the LP is infeasible ⇒ node can be pruned.
• Otherwise, if b(S) ≤ L ⇒ node can be pruned.
• Otherwise, if b(S) > L and the solution is feasible for the MILP ⇒
set L ← b(S).
• Otherwise, branch and add the new subproblems to the candidate list.
4. If the candidate list is nonempty, go to Step 3. Otherwise, the algorithm
is completed.
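
To make these steps concrete, here is a minimal, self-contained Python sketch
(an illustration of the steps above, not production code). It assumes SciPy is
available to solve the LP relaxations, uses a simple depth-first candidate list
rather than a heuristic for the initial lower bound, branches on the first
fractional variable, and relies on the earlier assumption that all variables
have finite bounds:

import math
import numpy as np
from scipy.optimize import linprog   # assumption: SciPy solves the LP relaxations

def lp_branch_and_bound(c, A_ub, b_ub, bounds, integer_indices, tol=1e-6):
    # Maximize c @ x subject to A_ub @ x <= b_ub and the given variable bounds,
    # with x[j] integer for every j in integer_indices.
    L, incumbent = -math.inf, None       # Step 1: global lower bound (no heuristic here)
    candidates = [list(bounds)]          # Step 2: candidate list of per-variable bounds
    while candidates:                    # Steps 3-4: process candidates until none remain
        node = candidates.pop()          # depth-first selection, for simplicity
        res = linprog(-np.asarray(c), A_ub=A_ub, b_ub=b_ub, bounds=node)
        if not res.success:              # LP infeasible => prune
            continue
        b_S = -res.fun                   # upper bound b(S) for this subproblem
        if b_S <= L + tol:               # pruned by bound
            continue
        fractional = [j for j in integer_indices
                      if min(res.x[j] - math.floor(res.x[j]),
                             math.ceil(res.x[j]) - res.x[j]) > tol]
        if not fractional:               # LP solution is feasible for the MILP
            L, incumbent = b_S, res.x
            continue
        j = fractional[0]                # branch on the first fractional variable
        lo, hi = node[j]
        left, right = list(node), list(node)
        left[j] = (lo, math.floor(res.x[j]))     # child with x_j <= floor(x*_j)
        right[j] = (math.ceil(res.x[j]), hi)     # child with x_j >= ceil(x*_j)
        candidates.extend([left, right])
    return L, incumbent

As a purely illustrative call, lp_branch_and_bound([2, 1], A_ub=[[2, 2]],
b_ub=[3], bounds=[(0, 3), (0, 3)], integer_indices=[0, 1]) should return the
optimal value 2 with solution (1, 0).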
Computational MILP Lecture 8 21

Branch and Bound Tree

Figure: An example branch-and-bound tree. Each node is labeled with its LP
bound, and each edge with the variable disjunction used for branching (e.g.,
x5 ≤ 0.0 OR x5 ≥ 1.0). The key distinguishes candidate, infeasible, solution,
and pruned nodes.
Computational MILP Lecture 8 22

Termination Conditions

• Note that although we use multiple disjunctions to branch during the


algorithm, the tree can still be seen as encoding a single disjunction.
• To see this, consider the set T of subproblems associated with the leaf
nodes in the tree.
– Provided that we use admissible disjunctions for branching, the feasible
regions of these subproblems are a partition of S.
– Furthermore, we will see that there exists a collection of polyhedra
{Pi}i∈T , where
∗ Pi is a formulation for subproblem i; and
∗ {Pi}i∈T is admissible with respect to S.
• When this disjunction, along with the best solution found so far, satisfies
the optimality conditions (OPT), the algorithm terminates.
• We will revisit this more formally as we further develop the supporting
theory.
Computational MILP Lecture 8 23

Ensuring Finite Convergence

• For LP-based branch and bound, ensuring convergence requires a


convergent branching method.
• Roughly speaking, a convergent branching method is one which will
– produce a violated admissible disjunction whenever the solution to the
bounding problem is infeasible; and
– if applied recursively, guarantee that at some finite depth, any resulting
bounding problem will either
∗ produce a feasible solution (to the original MILP); or
∗ be proven infeasible; or
∗ be pruned by bound.
• Typically, we achieve this by ensuring that at some finite depth, the
feasible region of the bounding problem contains at most one feasible
solution.
• We will also revisit this result more formally as we develop the supporting
theory.
Computational MILP Lecture 8 24

Algorithmic Choices in Branch and Bound

• Although the basic algorithm is straightforward, the efficiency of it in


practice depends strongly on making good algorithmic choices.
• These algorithmic choices are made largely by heuristics that guide the
algorithm.
• Basic decisions to be made include
– The bounding method(s).
– The method of selecting the next candidate to process.
∗ “Best-first” always chooses the candidate with the highest upper
bound (see the sketch after this list).
∗ This rule minimizes the size of the tree (why?).
∗ There may be practical reasons to deviate from this rule.
– The method of branching.
∗ Branching wisely is extremely important.
∗ A “poor” branching can slow the algorithm significantly.
• We will cover the last two topics in more detail in later lectures.
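
As a small illustration of the “best-first” rule mentioned in the list above,
the candidate list can be kept as a priority queue keyed on the negated upper
bound, so the candidate with the highest bound is always processed next; the
node representation here is left abstract and the helper names are assumptions:

import heapq
import itertools

candidate_list = []              # heap of (-upper_bound, count, node) entries
_counter = itertools.count()     # tie-breaker so nodes themselves are never compared

def add_candidate(upper_bound, node):
    # heapq is a min-heap, so store the negated bound to pop the largest bound first
    heapq.heappush(candidate_list, (-upper_bound, next(_counter), node))

def next_candidate():
    # Best-first selection: return the candidate with the highest upper bound
    neg_bound, _, node = heapq.heappop(candidate_list)
    return -neg_bound, node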
Computational MILP Lecture 8 25

A Thousand Words

Figure 4: Tree after 400 nodes

Note that we are minimizing here!


Computational MILP Lecture 8 26

A Thousand Words

Figure 5: Tree after 1200 nodes


Computational MILP Lecture 8 27

A Thousand Words

Figure 6: Final tree


Computational MILP Lecture 8 28

Global Bounds

• The pictures show the evolution of the branch and bound process.
• Nodes are pictured at a height equal to that of their lower bound (we
are minimizing in this case!!).
– Red: candidates for processing/branching
– Green: branched or infeasible
– Turquoise: pruned by bound (possibly having produced a feasible
solution) or infeasible.
• The red line is the level of the current best solution (global upper bound).
• The level of the highest red node is the global lower bound.
• As the procedure evolves, the two bounds grow together.
• The goal is for this to happen as quickly as possible.
Computational MILP Lecture 8 29

Tradeoffs

• We will see that there are many tradeoffs to be managed in branch and
bound.
• Note that in the final tree:
– Nodes below the line were pruned by bound (and may or may not have
generated a feasible solution) or were infeasible.
– Nodes above the line were either branched or were infeasible or
generated an optimal solution.
• There is a tradeoff between the goals of moving the upper and lower
bounds
– The nodes below the line serve to move the upper bound.
– The nodes above the line serve to move the lower bound.
• It is clear that these two goals are somewhat antithetical.
• The search strategy has to achieve a balance between these two
antithetical goals.
Computational MILP Lecture 8 30

Tradeoffs in Practice

• In a practical implementation, there are many more choices and tradeoffs


than those we have indicated so far.
• The complexity of the problem of optimizing the algorithm itself is
immense.
• We have additional auxiliary methods, such as preprocessing and primal
heuristics, to which we can choose to devote more or less effort.
• We also have the choice of how much effort to devote to choosing a
good candidate for branching.
• Finally, we have the choice of how much effort to devote to proving a
good bound on the subproblem.
• It is the careful balance of the levels of effort devoted to each of these
algorithmic processes that leads to a good algorithmic implementation.
Computational MILP Lecture 8 31

Exercise: Install Graphviz, xdot, and GrUMPy

• pip install coinor.grumpy


• Graphviz
– Linux: Install with package manager
– OS X: brew install graphviz
– Windows: http://graphviz.org/Download.php
• xdot: pip install xdot
– Linux: Install with package manager
– OS X: brew install pygtk
– Windows: http://pygtk.org/downloads.html
• python -m coinor.grumpy.BB
Computational MILP Lecture 8 32

Exercise 2: Branch and Bound


from coinor.grumpy import BBTree, PSEUDOCOST_BRANCHING, MOST_FRACTIONAL
from coinor.grumpy import DEPTH_FIRST, BEST_FIRST, BEST_ESTIMATE

T = BBTree()
T.set_display_mode('xdot')   # display the tree with xdot
# Generate a random MILP instance and solve it by branch and bound
CONSTRAINTS, VARIABLES, OBJ, MAT, RHS = T.GenerateRandomMIP(rand_seed = 19)
T.BranchAndBound(CONSTRAINTS, VARIABLES, OBJ, MAT, RHS,
                 branch_strategy = PSEUDOCOST_BRANCHING,
                 search_strategy = BEST_FIRST,
                 display_interval = 10000)
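
As a follow-up, you might re-run the example with MOST_FRACTIONAL in place of
PSEUDOCOST_BRANCHING, or DEPTH_FIRST in place of BEST_FIRST (both are already
imported above), and compare the sizes of the resulting trees.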
