Branch‐and‐Bound Algorithms
Kiavash Kianfar
Texas A&M University
Wiley Encyclopedia of Operations Research and Management Science, edited by James J. Cochran
Copyright © 2010 John Wiley & Sons, Inc.
[Figure: Flowchart of the generic branch-and-bound algorithm for a maximization problem. Initialization: put the initial problem S at the root node, set the lower bound z = −∞, and set the incumbent to void. While there is any active node, choose the next active node on the tree, say Si; if Si is infeasible, prune by infeasibility; otherwise calculate an upper bound zi for problem Si and, if zi ≤ z, prune by bound; if the bounding solution x̃i is feasible to Si, prune by optimality and update the incumbent and z; otherwise branch by creating nodes Sik with Si = ∪k Sik and adding them to the tree as children of Si. When no active node remains, stop: the incumbent solution is optimal with objective z.]
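The loop in this flowchart can be written down compactly. The following is a minimal sketch in Python, assuming an abstract subproblem interface; the callbacks relax, is_feasible, branch, and value are illustrative assumptions of this sketch, not something prescribed by the article.

import math

def branch_and_bound(root, relax, is_feasible, branch, value):
    """Generic branch-and-bound for a maximization problem.

    relax(S)          -> (upper bound, relaxation solution), or None if S is infeasible
    is_feasible(x, S) -> True if x is feasible for subproblem S itself
    branch(S, x)      -> children S_k whose union is S
    value(x)          -> objective value of a feasible point x
    (Illustrative interface; the article does not fix one.)
    """
    z = -math.inf                        # lower bound (best value found so far)
    incumbent = None                     # best feasible solution found so far
    active = [root]                      # active nodes of the tree

    while active:                        # is there any active node?
        S = active.pop()                 # choose the next active node (here: depth first)
        relaxation = relax(S)
        if relaxation is None:           # prune by infeasibility
            continue
        z_i, x_i = relaxation            # upper bound for S and the bounding solution
        if z_i <= z:                     # prune by bound
            continue
        if is_feasible(x_i, S):          # prune by optimality: the bounding solution solves S
            if value(x_i) > z:
                z, incumbent = value(x_i), x_i
            continue
        active.extend(branch(S, x_i))    # branch: add the children of S to the tree

    return incumbent, z                  # incumbent is optimal with objective z

The call active.pop() takes the most recently added node, which corresponds to the depth-first node selection discussed later in this article; any other selection rule can be substituted at that point.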
[Figure: Flowchart of the LP-relaxation-based branch-and-bound algorithm for a maximization problem. Initialization: put the initial problem S with formulation P at the root node, set the lower bound z = −∞, and set the incumbent to void. While there is any active node, choose the next active node Si with formulation Pi; if Pi is infeasible, prune by infeasibility; otherwise solve the LP relaxation of Pi to obtain the upper bound zi and, if zi ≤ z, prune by bound; if the LP relaxation solution is integer feasible, prune by optimality and update the incumbent; otherwise branch by creating two subproblems Si1 and Si2 with formulations Pi1 and Pi2 and add them to the tree as children of Si. When no active node remains, stop: the incumbent solution is optimal with objective z.]
[Figure 4: Example branch-and-bound tree. Solution to the LP relaxation at the root node S: x1 = 1.6, x2 = 1.8, z = 8.4; initialize z = −∞.]
Ideas’’ with respect to branch-and-bound for solving the IP problem.

Bounding

Solving the LP relaxation of any IP subproblem Si gives an upper bound on its objective value. This is the bound that is most commonly used in practice. The LP is usually solved using simplex-based algorithms; on very large models, interior point methods may be best for the solution of the first LP [3]. A desirable feature of LP relaxation with simplex is that an optimal or near-optimal basis of the problem can be stored, so that the LP relaxations in subsequent nodes can be reoptimized rapidly.
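As a concrete illustration of LP-relaxation bounding, the sketch below represents a node by its per-variable bounds and solves the node's LP relaxation with scipy.optimize.linprog. The two-variable instance is hypothetical (it is not the example of Fig. 4), and linprog is used here only as a convenient LP solver.

import numpy as np
from scipy.optimize import linprog

# Hypothetical instance: max 5 x1 + 4 x2  s.t.  6 x1 + 4 x2 <= 24,  x1 + 2 x2 <= 6,  x >= 0 integer.
c = np.array([5.0, 4.0])
A = np.array([[6.0, 4.0],
              [1.0, 2.0]])
b = np.array([24.0, 6.0])

def lp_bound(lower, upper):
    """Solve the LP relaxation of a node defined by its variable bounds.

    Returns (upper bound, relaxation solution), or None if the node is infeasible.
    linprog minimizes, so the maximization objective is negated.
    """
    res = linprog(-c, A_ub=A, b_ub=b,
                  bounds=list(zip(lower, upper)), method="highs")
    if not res.success:                  # LP infeasible: prune the node by infeasibility
        return None
    return -res.fun, res.x               # LP optimal value is an upper bound for the node

# Root node: original bounds x >= 0 with no upper bounds.
print(lp_bound(lower=[0, 0], upper=[None, None]))
# -> approximately (21.0, [3.0, 1.5]); x2 is fractional, so the node must be branched.

Branching on x2 then amounts to calling lp_bound again with upper = [None, 1] on one child and lower = [0, 2] on the other, the two branches described in the next subsection. A production code would reuse the stored basis mentioned above to reoptimize the child LPs; this sketch simply re-solves each node from scratch.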
Branching

An important question is how to branch, that is, how to split a subproblem into smaller subproblems. Here we review the most common methods.

Single Variable Branching. The simplest idea to split the feasible region of a subproblem Si with formulation Pi is to pick an integer variable with fractional value in the LP relaxation optimal solution, say xj with value x̄j, and create branches by adding simple linear constraints to formulation Pi as follows:

Si1 = Si ∩ {x : xj ≤ ⌊x̄j⌋}
Si2 = Si ∩ {x : xj ≥ ⌈x̄j⌉}.

Si1 is usually referred to as the left (down) branch and Si2 as the right (up) branch. This is a desirable choice because clearly we have Si = Si1 ∪ Si2 and Si1 ∩ Si2 = ∅, and furthermore, the current LP solution is not feasible for either Si1 or Si2, so in the absence of multiple optima for the LP the upper bound strictly decreases in each of these branches. This branching scheme was first introduced in Ref. 6.

A generalization of the above branching scheme uses branches such as Si1 = Si ∩ {x : dx ≤ d0} and Si2 = Si ∩ {x : dx ≥ d0 + 1}, in which d and d0 are integer. However, in practice, only the simplest form, which is the above single variable branching, is used in all solvers.

With the single variable branching strategy, an important question is which variable should be chosen out of all variables with fractional value. A recent comprehensive study of this issue is Ref. 7; also see Refs 2, 3, 8, 9. An old and common rule is the most fractional variable. If C is the set of all integer variables with fractional LP relaxation value, then the chosen variable is

argmax j∈C min{fj, 1 − fj},

where fj = x̄j − ⌊x̄j⌋. When branching is based on this rule, it is recommended that the xj ≤ ⌊x̄j⌋ branch be considered first only if fj ≤ 1 − fj; otherwise the xj ≥ ⌈x̄j⌉ branch should be selected first. The example in Fig. 4 obeys this rule: accordingly, the most fractional variable, that is, x1, is chosen as the branching variable, and then the right branch is selected after the root node because f1 = x̄1 − ⌊x̄1⌋ = 0.6 > 0.4 = 1 − (x̄1 − ⌊x̄1⌋) = 1 − f1.

The most fractional rule is simple to implement, but in Ref. 7 it is shown computationally that this rule is not better than just any random variable selection rule.
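A small sketch of this selection rule and the recommended branch order follows, using the root relaxation values of Fig. 4 (x̄1 = 1.6, x̄2 = 1.8); the function name and the tuple encoding of the branches are illustrative assumptions of the sketch.

import math

INTEGRALITY_TOL = 1e-6        # tolerance for treating an LP value as integral

def most_fractional_branch(x_bar, integer_indices):
    """Pick the branching variable j = argmax_{j in C} min(f_j, 1 - f_j).

    Returns None if x_bar is integral on integer_indices; otherwise returns
    (j, first branch, second branch), with the down branch listed first
    only if f_j <= 1 - f_j, as recommended above.
    """
    frac = {j: x_bar[j] - math.floor(x_bar[j]) for j in integer_indices}
    C = [j for j, f in frac.items() if min(f, 1.0 - f) > INTEGRALITY_TOL]
    if not C:
        return None                                   # LP solution already integral
    j = max(C, key=lambda k: min(frac[k], 1.0 - frac[k]))
    down = ("<=", j, math.floor(x_bar[j]))            # x_j <= floor(x_bar_j)
    up = (">=", j, math.ceil(x_bar[j]))               # x_j >= ceil(x_bar_j)
    return (j, down, up) if frac[j] <= 1.0 - frac[j] else (j, up, down)

# Root relaxation of Fig. 4 (0-based indices, so index 0 is x1):
print(most_fractional_branch([1.6, 1.8], integer_indices=[0, 1]))
# -> (0, ('>=', 0, 2), ('<=', 0, 1)): branch on x1 and explore the up (right) branch first.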
More effective and more sophisticated strategies have been proposed and studied over the years. The method of pseudocost branching goes back to [10] and works by calculating a pseudocost for each variable from a history of the success (change in LP relaxation value) of the left and right branchings performed on that variable. The branching variable with the highest pseudocost is chosen each time. Different functions have been proposed for the pseudocost, such as the maximum of the sum of the left and right degradations (decrease in upper bound) [11] and the maximum of the smaller of the left and right degradations [10]; also see Ref. 8.

Strong branching [8,12] is in a sense an extreme effort to find the best branching variable. In its full version, at any node one tentatively branches on each candidate variable with fractional value and solves the LP relaxations of the right and left branches of each variable, either to optimality or for a specified number of dual simplex iterations. The degradation in bound for left and right
GUB Branching. Consider a generalized upper bound (GUB) constraint x1 + x2 + · · · + xn = 1 with xj ∈ {0, 1} for j = 1, . . . , n. If single variable branching on one of the variables xj, j = 1, . . . , n, is performed, then the two branches will be Si1 = Si ∩ {x : xj = 0} and Si2 = Si ∩ {x : xj = 1}. Because of the GUB constraint, {x : xj = 0} leaves n − 1 possibilities {x : xi = 1}, i ≠ j, while {x : xj = 1} leaves only one possibility. So Si1 is usually much larger than Si2 and the tree is unbalanced. A more balanced split is desired, and GUB branching, proposed in Ref. 13, is a way to get a balanced tree. It works as follows: the user provides a special order of the variables in the GUB set, say j1, . . . , jn. Then the two branches fix complementary parts of this ordered set to zero, Si1 = Si ∩ {x : xjt = 0, t = 1, . . . , r} and Si2 = Si ∩ {x : xjt = 0, t = r + 1, . . . , n} for some 1 ≤ r < n, so that each branch removes roughly half of the possibilities and the tree remains balanced.
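A minimal sketch of this split follows; the rule used below to place the split point r (so that the LP-relaxation weight on each side is about one half) is one common choice and an assumption of this sketch, not taken from the text.

def gub_branch(order, x_bar):
    """Split an ordered GUB set x_{j1} + ... + x_{jn} = 1 into two branches.

    order : the variable indices j1, ..., jn in the user-provided special order
    x_bar : fractional LP relaxation values of the variables

    Returns two lists of indices: the first branch fixes the first list to
    zero, the second branch fixes the second list to zero.  The split point
    r is chosen so that the LP weight of each side is roughly one half
    (a common rule; an assumption of this sketch).
    """
    total, r = 0.0, 0
    for t, j in enumerate(order, start=1):
        total += x_bar[j]
        if total >= 0.5:
            r = t
            break
    r = max(1, min(r, len(order) - 1))   # keep both branches nonempty
    return list(order[:r]), list(order[r:])

# Usage sketch with hypothetical data: four GUB variables and their LP values.
order = ["x1", "x2", "x3", "x4"]
x_bar = {"x1": 0.1, "x2": 0.5, "x3": 0.3, "x4": 0.1}
print(gub_branch(order, x_bar))          # -> (['x1', 'x2'], ['x3', 'x4'])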
Node Selection

Depth-First Node Selection. In the depth-first rule the next node selected is a child of the node just processed, so the search dives down the tree. This rule has three main advantages:

1. For pruning the tree a good lower bound is needed. The depth-first method descends quickly in the tree to find a first feasible solution, which gives a lower bound that hopefully causes the pruning of many future nodes.
2. The depth-first method tends to minimize the memory requirements for storing the tree at any given time during the algorithm.
3. Passing from a node to its immediate child has the advantage that the LP relaxation can be easily resolved by adding just an additional constraint.
However, the depth-first search can result in an extremely large search tree. This is a consequence of the fact that we may need to consider many nodes that would have been fathomed if we had a better lower bound.

Best-Bound Node Selection. In this rule the next node is the active node with the best (largest) upper bound. With this rule one would never branch a node whose upper bound is less than the optimal value. As a result, this rule minimizes the number of nodes considered. However, the memory requirements for this method may become prohibitive if good lower bounds are not found early, leading to relatively little pruning. In terms of reoptimizing the LP relaxations this method is also at a disadvantage, because one LP problem has little relation to the next one.

Adaptive Node Selection. Adaptive methods make intelligent use of information about the nodes to select the next node. The estimate-based methods attempt to select nodes that may lead to improved integer feasible solutions. The best projection criterion [14] and the best estimate criterion found in Refs 10 and 15 are among these methods.

Many adaptive methods are two-phase methods that essentially use a hybrid of depth-first and best-bound searches. At the beginning the depth-first strategy is used to find a feasible solution, and then the best-bound method is used [16]. Variations of two-phase methods using estimate-based approaches have also been proposed [15,17]. Some other suggested rules use an estimate of the optimal IP solution value to avoid considering superfluous nodes, that is, the nodes in which zi is below this estimate. The tree is searched in a depth-first fashion as long as zi is greater than the estimate; after that a different criterion such as best-bound is used [10,11].
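These node-selection rules differ mainly in the data structure that holds the active nodes: depth-first corresponds to a last-in, first-out stack, while best-bound keeps the active nodes in a priority queue keyed by their upper bounds. A minimal sketch follows; the Node record and the dummy bounds are illustrative.

import heapq
from collections import namedtuple

# Illustrative node record: its upper bound plus an opaque payload
# (the subproblem itself in a real solver).
Node = namedtuple("Node", ["upper_bound", "payload"])

def depth_first_pop(active):
    """Depth-first: the active set is a stack; take the most recently added node."""
    return active.pop()

def best_bound_pop(active):
    """Best-bound: take the active node with the largest upper bound.

    heapq is a min-heap, so bounds are stored negated.
    """
    _, node = heapq.heappop(active)
    return node

# Usage sketch with dummy nodes.
nodes = [Node(8.4, "S1"), Node(9.1, "S2"), Node(7.9, "S3")]

stack = list(nodes)                          # depth-first active set
print(depth_first_pop(stack).payload)        # -> S3 (the node added last)

heap = [(-n.upper_bound, n) for n in nodes]  # best-bound active set
heapq.heapify(heap)
print(best_bound_pop(heap).payload)          # -> S2 (the largest bound, 9.1)

A two-phase adaptive rule can be mimicked by switching from the stack to the heap once the first incumbent has been found.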
EXTENSIONS OF BRANCH-AND-BOUND

In solving integer programming problems, pure branch-and-bound is seldom used. In most cases cutting planes are added to the root problem or the node subproblems to tighten the feasible region of the LP relaxation and hence obtain better bounds faster. A branch-and-bound algorithm in which cutting planes are used is known under the general name of branch-and-cut [2,3,18]. In branch-and-cut, at any node, after optimizing the LP relaxation, a separation problem is solved to find valid inequalities for the feasible integer solutions that are violated by the LP relaxation solution. These valid inequalities are then added to the problem and the problem is reoptimized to improve the LP relaxation bound. Branching happens when no further valid inequalities can be found. Of course, the amount of effort spent to find valid inequalities is one of the parameters of the algorithm that should be decided. We refer the reader to the article titled Branch and Cut in this encyclopedia for further information.

Another extension of branch-and-bound is branch-and-price [2,3,19]. When the number of variables in an integer program is huge, one solution method is to use column generation within the branch-and-bound framework. More specifically, implicit pricing of nonbasic variables is used to generate new columns or to prove LP optimality at a node of the branch-and-bound tree. Branching happens when no columns price out to enter the basis and the LP solution does not satisfy the integrality constraints. We note that if cutting planes are also used in branch-and-price, the algorithm is called branch-cut-and-price. We refer the reader to the article titled Branch-Price-and-Cut Algorithms in this encyclopedia for further information.

PARALLEL BRANCH-AND-BOUND

The divide-and-conquer nature of branch-and-bound makes it a suitable framework for attacking huge problems using today’s parallel computing capabilities. For a survey of parallel branch-and-bound algorithms refer to Ref. 20. There are three types of parallelism that can be implemented for a branch-and-bound algorithm. Type 1 is parallel execution of operations on generated subproblems, for example, parallel bounding operations for each subproblem to accelerate execution. Type 2 consists of building the tree in parallel by performing operations on several subproblems simultaneously. Type 3 is building several trees in parallel; in each tree some operation such as branching, bounding, or selection is performed differently, but the trees share their information and use the best bounds among themselves. For further reading refer to Ref. 20.
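As a purely illustrative sketch of Type 2 parallelism (everything below, including the placeholder bounding routine and the dummy subproblems, is hypothetical), a batch of active subproblems can be bounded simultaneously with a worker pool:

from concurrent.futures import ThreadPoolExecutor

def bound_node(node):
    """Placeholder bounding routine; a real solver would solve the node's
    LP relaxation here."""
    return sum(node)                         # dummy "bound", for illustration only

active_nodes = [(1, 2), (3, 4), (5, 6)]      # dummy subproblems

# Perform the bounding operation on several subproblems at the same time.
with ThreadPoolExecutor(max_workers=4) as pool:
    bounds = list(pool.map(bound_node, active_nodes))

print(bounds)                                # -> [3, 7, 11]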
REFERENCES

1. Land AH, Doig AG. An automated method of solving discrete programming problems. Econometrica 1960;28(2):497–520.
2. Nemhauser GL, Wolsey LA. Integer and combinatorial optimization. New York: Wiley-Interscience; 1988.
3. Wolsey LA. Integer programming. New York: Wiley; 1998.
4. Bertier P, Roy B. Procédure de résolution pour une classe de problèmes pouvant avoir un caractère combinatoire. Cah Cent Étud Rech Oper 1964;6:202–208.
5. Balas E. A note on the branch-and-bound principle. Oper Res 1968;16(2):442–445.
6. Dakin RJ. A tree search algorithm for mixed integer programming. Comput J 1965;8:250–255.
7. Achterberg T, Koch T, Martin A. Branching rules revisited. Oper Res Lett 2005;33:42–54.
8. Linderoth JT, Savelsbergh MWP. A computational study of search strategies for mixed integer programming. INFORMS J Comput 1999;11:173–187.
9. Lodi A. Mixed integer programming computation. In: Junger M, Liebling T, Naddef D, et al., editors. 50 years of integer programming 1958–2008. Berlin: Springer; 2010. pp. 619–645.
10. Benichou M, Gauthier JM, Girodet P, et al. Experiments in mixed-integer programming. Math Program 1971;1:76–94.
11. Gauthier JM, Ribière G. Experiments in mixed-integer linear programming using pseudocosts. Math Program 1977;12:26–47.
12. Applegate D, Bixby RE, Chvátal V, et al. The traveling salesman problem: a computational study. Princeton (NJ): Princeton University Press; 2007.
13. Beale EML, Tomlin JA. Special facilities in a generalized mathematical programming system for nonconvex problems using ordered sets of variables. In: Lawrence J, editor. Proceedings of the 5th Annual Conference on Operational Research. London: Tavistock Publications; 1970. pp. 447–457.
14. Mitra G. Investigation of some branch-and-bound strategies for the solution of mixed integer linear programs. Math Program 1973;4:155–170.
15. Forrest JJH, Hirst JPH, Tomlin JA. Practical solution of large scale mixed integer programming problems with UMPIRE. Manage Sci 1974;20:736–773.
16. Eckstein J. Parallel branch-and-bound algorithms for general mixed integer programming on the CM-5. SIAM J Optim 1994;4:794–814.
17. Beale EML. Branch-and-bound methods for mathematical programming systems. In: Hammer PL, Johnson EL, Korte BH, editors. Discrete optimization II. Amsterdam: North Holland Publishing Co.; 1979.
18. Caprara A, Fischetti M. Branch-and-cut algorithms. In: Dell’Amico M, Maffioli F, Martello S, editors. Annotated bibliographies in combinatorial optimization. New York: Wiley; 1997. pp. 45–63.
19. Barnhart C, Johnson EL, Nemhauser GL, et al. Branch-and-price: column generation for solving huge integer programs. Oper Res 1998;46(3):316–329.
20. Gendron B, Crainic TG. Parallel branch-and-bound algorithms: survey and synthesis. Oper Res 1994;42(6):1042–1066.