The Lagrangian Relaxation Method For Solving Integer Programming Problems
MANAGEMENT SCIENCE
Vol. 50, No. 12 Supplement, December 2004, pp. 1861-1871
DOI 10.1287/mnsc.1040.0263. ISSN 0025-1909 | EISSN 1526-5501. © 2004 INFORMS

Marshall L. Fisher
University of Pennsylvania, Philadelphia, Pennsylvania
One of the most computationally useful ideas of the 1970s is the observation that many hard integer programming problems can be viewed as easy problems complicated by a relatively small set of side constraints. Dualizing the side constraints produces a Lagrangian problem that is easy to solve and whose optimal value is a lower bound (for minimization problems) on the optimal value of the original problem. The Lagrangian problem can thus be used in place of a linear programming relaxation to provide bounds in a branch and bound algorithm. This approach has led to dramatically improved algorithms for a number of important problems in the areas of routing, location, scheduling, assignment and set covering. This paper is a review of Lagrangian relaxation based on what has been learned in the last decade.

Keywords: programming: integer algorithms; programming: integer algorithms, branch and bound; programming: integer algorithms, heuristic

History: Accepted by Donald Erlenkotter, special editor; received June 13, 1979. This paper has been with the author 5 months for 1 revision.
2. Basic Constructions

We begin with a combinatorial optimization problem formulated as the integer program

Z = min cx
s.t. Ax = b,
     Dx ≤ e,
     x ≥ 0 and integral,     (P)

where x is n × 1, b is m × 1, e is k × 1 and all other matrices have conformable dimensions. Let (LP) denote problem (P) with the integrality constraint on x relaxed, and let ZLP denote the optimal value of (LP).

We assume that the constraints of (P) have been partitioned into the two sets Ax = b and Dx ≤ e so as to make it easy to solve the Lagrangian problem

ZD(u) = min cx + u(Ax − b),
        Dx ≤ e,
        x ≥ 0 and integral,     (LRu)

where u = (u_1, ..., u_m) is a vector of Lagrange multipliers. By "easy to solve" we of course mean easy relative to (P). For all applications of which I am aware, the Lagrangian problem has been solvable in polynomial or pseudo-polynomial time.

For convenience we assume that (P) is feasible and that the set X = {x | Dx ≤ e, x ≥ 0 and integral} of feasible solutions to (LRu) is finite. Then ZD(u) is finite for all u. It is straightforward to extend the development when these assumptions are violated or when inequality constraints are included in the set to be dualized.

It is well known that ZD(u) ≤ Z. This is easy to show by assuming an optimal solution x* to (P) and observing that

ZD(u) ≤ cx* + u(Ax* − b) = Z.

The Lagrangian problem can thus be used in place of a linear programming relaxation to provide lower bounds in a branch and bound algorithm for (P). While this is the most obvious use of (LRu), it has a number of other uses. It can be a medium for selecting branching variables and choosing the next branch to explore. Good feasible solutions to (P) can frequently be obtained by perturbing nearly feasible solutions to (LRu). Finally, Lagrangian relaxation has been used recently (Cornuejols et al. 1977, Fisher et al. 1979) as an analytic tool for establishing worst-case bounds on the performance of certain heuristics.

3. Example

The generalized assignment problem is an excellent example for illustrating Lagrangian relaxation because it is rich with readily apparent structure. The generalized assignment problem (GAP) is the integer program

Z = min Σ_{i=1}^{m} Σ_{j=1}^{n} c_ij x_ij     (1)
s.t. Σ_{i=1}^{m} x_ij = 1,  j = 1, ..., n,     (2)
     Σ_{j=1}^{n} a_ij x_ij ≤ b_i,  i = 1, ..., m,     (3)
     x_ij = 0 or 1, all i and j.     (4)

There are two natural Lagrangian relaxations for the generalized assignment problem. The first is obtained by dualizing constraints (2):

ZD1(u) = min Σ_{i=1}^{m} Σ_{j=1}^{n} c_ij x_ij + Σ_{j=1}^{n} u_j (Σ_{i=1}^{m} x_ij − 1)
         subject to (3) and (4)
       = min Σ_{i=1}^{m} Σ_{j=1}^{n} (c_ij + u_j) x_ij − Σ_{j=1}^{n} u_j.
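Because the capacity constraints (3) link only variables within a single knapsack i, (LR1u) separates into m independent 0-1 knapsack problems with item costs c_ij + u_j. A minimal sketch of this decomposition in Python (the instance data are hypothetical, and the knapsacks and the GAP itself are solved by brute force purely for clarity):

```python
from itertools import product

# Hypothetical GAP instance (not from the paper): m = 2 knapsacks, n = 3 items.
c = [[4, 7, 5],   # c[i][j]: cost of assigning item j to knapsack i
     [6, 3, 6]]
a = [[2, 3, 2],   # a[i][j]: capacity used by item j in knapsack i
     [3, 2, 2]]
b = [4, 4]        # knapsack capacities
m, n = 2, 3

def knapsack_min(costs, weights, cap):
    """Min-cost 0-1 knapsack: any subset with total weight <= cap is
    allowed (including the empty set, of cost 0)."""
    best = 0.0
    for pick in product([0, 1], repeat=len(costs)):
        if sum(w * x for w, x in zip(weights, pick)) <= cap:
            best = min(best, sum(cst * x for cst, x in zip(costs, pick)))
    return best

def z_d1(u):
    """Lagrangian bound from dualizing the assignment constraints (2):
    one independent 0-1 knapsack per knapsack i, minus sum of u_j."""
    return sum(knapsack_min([c[i][j] + u[j] for j in range(n)], a[i], b[i])
               for i in range(m)) - sum(u)

def z_gap():
    """Optimal GAP value by brute force over all feasible assignments."""
    best = float("inf")
    for assign in product(range(m), repeat=n):  # assign[j] = knapsack of item j
        load = [0] * m
        for j, i in enumerate(assign):
            load[i] += a[i][j]
        if all(load[i] <= b[i] for i in range(m)):
            best = min(best, sum(c[i][j] for j, i in enumerate(assign)))
    return best

Z = z_gap()
for u in ([0, 0, 0], [-4, -3, -5], [-5, -4, -5.5]):
    assert z_d1(u) <= Z + 1e-9  # ZD1(u) is a lower bound for every u
```

On this toy instance the multipliers u = (−4, −3, −5) give ZD1(u) = Z, illustrating that a well-chosen u can close the duality gap entirely.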
The second is obtained by dualizing constraints (3) with multipliers v = (v_1, ..., v_m):

ZD2(v) = min Σ_{i=1}^{m} Σ_{j=1}^{n} (c_ij + v_i a_ij) x_ij − Σ_{i=1}^{m} v_i b_i
         subject to (2) and (4).

This relaxation is defined for v ≥ 0, which is a necessary condition for ZD2(v) ≤ Z to hold. Since constraints (2) are generalized upper bound (GUB) constraints, we will call a problem like (LR2v) a 0-1 GUB problem. Such a problem is easily solved in time proportional to nm by determining min_i (c_ij + v_i a_ij) for each j and setting the associated x_ij = 1. Remaining x_ij are set to zero.
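The O(nm) rule just described translates directly into code. A minimal sketch with hypothetical data (all values invented for illustration):

```python
# Hypothetical GAP data; the multipliers v must be nonnegative.
c = [[4, 7, 5],   # c[i][j]
     [6, 3, 6]]
a = [[2, 3, 2],   # a[i][j]
     [3, 2, 2]]
b = [4, 4]
m, n = 2, 3

def z_d2(v):
    """Solve the 0-1 GUB Lagrangian (LR2v) in O(nm): for each item j,
    set x[i][j] = 1 for the i minimizing c[i][j] + v[i]*a[i][j]."""
    assert all(vi >= 0 for vi in v)
    x = [[0] * n for _ in range(m)]
    total = 0.0
    for j in range(n):
        i_best = min(range(m), key=lambda i: c[i][j] + v[i] * a[i][j])
        x[i_best][j] = 1
        total += c[i_best][j] + v[i_best] * a[i_best][j]
    return total - sum(v[i] * b[i] for i in range(m)), x

bound, x = z_d2([0.0, 0.0])
```

Here the v = 0 solution happens to respect the capacities b_i as well, so the lower bound is attained by a feasible assignment; when capacities are violated, the repair heuristics discussed later apply.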
4. Issues

A little thought about using (LR1u) or (LR2v) within a branch and bound algorithm for the generalized assignment problem quickly brings to mind a number of issues that need to be resolved. Foremost among these is:

(1) How will we select an appropriate value for u? A closely related question is:
(2) Can we find a value for u for which ZD(u) is equal to or nearly equal to Z?

The generalized assignment problem also shows that different Lagrangian relaxations can be devised for the same problem. Comparing (LR1u) and (LR2v), we see that the first is harder to solve but might provide better bounds. There is also the question of how either of these relaxations compares with the LP relaxation. This leads us to ask:

(3) How can we choose between competing relaxations, i.e., different Lagrangian relaxations and the linear programming relaxation?

Lagrangian relaxations also can be used to provide good feasible solutions. For example, a solution to (LR2v) will be feasible in the generalized assignment problem unless the "weight" of items assigned to one or more of the "knapsacks" corresponding to constraints (3) exceeds the capacity b_i. If this happens, we could reassign items from overloaded knapsacks to other knapsacks, perhaps using a variant of a bin-packing heuristic, to attempt to achieve primal feasibility. In general we would like to know:

(4) How can (LRu) be used to obtain feasible solutions for (P)? How good are these solutions likely to be?

Finally, we note that the ultimate use of Lagrangian relaxation is for fathoming in a branch and bound algorithm, which leads us to ask:

(5) How can the lower and upper bounding capabilities of the Lagrangian problem be integrated within branch and bound?

The remainder of this paper is organized around these five issues, which arise in any application of Lagrangian relaxation. A separate section is devoted to each one. In some cases (issues (1) and (3)) general theoretical results are available. But more often, the "answers" to the questions we have posed must be extrapolated from computational experience or theoretical results that have been obtained for specific applications.

5. Existing Applications

Table 1 is a compilation of the applications of Lagrangian relaxation of which I am aware. I have not attempted to include algorithms, like those given in Bilde and Krarup (1977) and Camerini and Maffioli (1978), that are described without reference to Lagrangian relaxation, but can be described in terms of Lagrangian relaxation with sufficient insight. Nor have I included references describing applications of the algorithms in Table 1. For example, Mulvey and Crowder (1979) describe a successful application of the Lagrangian relaxation in Cornuejols et al. (1977) to a specialized uncapacitated location problem involving data clustering. Finally, the breadth and developing nature of this field makes it certain that other omissions exist. I would be happy to learn of any applications that I have overlooked.

This list speaks for itself in terms of the range of hard problems that have been addressed and the types of embedded structures that have been exploited in Lagrangian problems. Most of these structures are well known but two require comment. The pseudo-polynomial dynamic programming problems arising in scheduling are similar to the 0-1 knapsack problem if we regard the scheduling horizon as the knapsack size and the set of jobs to be scheduled as the set of items available for packing. The notation VUB stands for "variable upper bound" (Schrage 1975) and denotes a problem structure in which some variables are upper bounded by other 0-1 variables. An example of this structure is given in §7.

6. Determining u

It is clear that the best choice for u would be an optimal solution to the dual problem

ZD = max_u ZD(u).     (D)

Most schemes for determining u have as their objective finding optimal or near optimal solutions to (D). Problem (D) has a number of important structural properties that make it feasible to solve. We have assumed that the set X = {x | Dx ≤ e, x ≥ 0 and integral} of feasible solutions for (LRu) is finite, so we can represent X as X = {x^t, t = 1, ..., T}. This allows us to express (D) as the following linear program with many constraints:

ZD = max w,
w ≤ cx^t + u(Ax^t − b),  t = 1, ..., T.     (D)
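A common way to search for a good u is subgradient ascent on ZD(u). The sketch below uses toy, hypothetical data and solves (LRu) by enumerating the finite set X; the step-size rule t_k = λ(Z* − ZD(u_k))/‖Ax^k − b‖² is the one commonly used in the subgradient literature (an assumption here, not taken from the surviving text), with Z* an upper bound on Z such as the value of a known feasible solution:

```python
import itertools

# Toy instance (hypothetical): min 2*x1 + 3*x2 s.t. x1 + x2 = 1, x binary.
c = [2.0, 3.0]
A = [[1.0, 1.0]]          # one dualized equality constraint
b = [1.0]
X = list(itertools.product([0, 1], repeat=2))  # finite set {x | Dx<=e, x integral}

def lagrangian(u):
    """Return ZD(u) and a minimizing x (solves (LRu) by enumeration)."""
    def val(x):
        return (sum(ci * xi for ci, xi in zip(c, x))
                + sum(u[k] * (sum(A[k][j] * x[j] for j in range(len(x))) - b[k])
                      for k in range(len(b))))
    x_best = min(X, key=val)
    return val(x_best), x_best

def subgradient_ascent(z_target, iters=50, lam=1.0):
    """Maximize ZD(u) by stepping along the primitive subgradient Ax - b."""
    u = [0.0] * len(b)
    best = float("-inf")
    for _ in range(iters):
        zd, x = lagrangian(u)
        best = max(best, zd)
        g = [sum(A[k][j] * x[j] for j in range(len(x))) - b[k]
             for k in range(len(b))]
        norm2 = sum(gk * gk for gk in g)
        if norm2 == 0:           # 0 is a subgradient: u is dual optimal
            break
        t = lam * (z_target - zd) / norm2
        u = [uk + t * gk for uk, gk in zip(u, g)]
    return best

Z = 2.0  # optimal value of the toy primal, attained at x = (1, 0)
assert subgradient_ascent(Z) <= Z + 1e-9
```

On this instance the dual bound reaches Z exactly; in general the method only guarantees ZD(u) ≤ Z at every iterate.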
Table 1   Applications of Lagrangian Relaxation

Problem                            Researchers                        Lagrangian problem
Traveling salesman
  Symmetric                        Held and Karp (1970, 1971)         Spanning tree
  Symmetric                        Helbig Hansen and Krarup (1974)    Spanning tree
  Asymmetric                       Bazarra and Goode (1977)           Spanning tree
  Symmetric                        Balas and Christofides (1976)      Perfect 2-matching
  Asymmetric                       Balas and Christofides (1976)      Assignment
Scheduling
  n|m weighted tardiness           Fisher (1973)                      Pseudo-polynomial DP
  1-machine weighted tardiness     Fisher (1976)                      Pseudo-polynomial DP
  Power systems generation         Muckstadt and Koenig (1977)        Pseudo-polynomial DP
General IP
  Unbounded variables              Fisher and Shapiro (1974)          Group problem
  Unbounded variables              Burdet and Johnson (1977)          Group problem
  0-1 variables                    Etcheberry et al. (1978)           0-1 GUB
Location
  Uncapacitated                    Cornuejols et al. (1977)           0-1 VUB
  Uncapacitated                    Erlenkotter (1978)                 0-1 VUB
  Capacitated                      Geoffrion and McBride (1978)       0-1 VUB
  Databases in computer networks   Fisher and Hochbaum (1980)         0-1 VUB
Generalized assignment             Ross and Soland (1975)             Knapsack
                                   Chalmet and Gelders (1976)         Knapsack, 0-1 GUB
                                   Fisher et al. (1980)               Knapsack
Set covering-partitioning
  Covering                         Etcheberry (1977)                  0-1 GUB
  Partitioning                     Nemhauser and Weber (1978)         Matching
The LP dual of (D) is a linear program with many columns:

ZD = min Σ_{t=1}^{T} λ_t c x^t,
Σ_{t=1}^{T} λ_t A x^t = b,
Σ_{t=1}^{T} λ_t = 1,
λ_t ≥ 0,  t = 1, ..., T.     (P̄)

Problem (P̄) with λ_t required to be integral is equivalent to (P), although (P̄) and (LP) generally are not equivalent problems.

Both (D) and (P̄) have been important constructs in the formulation of algorithms for (D). Problem (D) makes it apparent that ZD(u) is the lower envelope of a finite family of linear functions. The form of ZD(u) is shown in Figure 1 for m = 1 and T = 4. The function ZD(u) has all the nice properties, like continuity and concavity, that make life easy for a hill-climbing algorithm, except one: differentiability. The function is nondifferentiable at any ū where (LR_ū) has multiple optima. Although it is differentiable almost everywhere, it generally is nondifferentiable at an optimal point.

An m-vector γ is called a subgradient of ZD(u) at ū if it satisfies

ZD(u) ≤ ZD(ū) + γ(u − ū),  for all u.

It is apparent that ZD(u) is subdifferentiable everywhere. The vector (Ax^t − b) is a subgradient at any u for which x^t solves (LRu). Any other subgradient is a convex combination of these primitive subgradients. With this perspective, the well-known result that u* and λ* are optimal for (D) and (P̄) if and only if they are feasible and satisfy a complementary slackness condition can be seen to be equivalent to the obvious fact that u* is optimal in (D) if and only if 0 is a subgradient of ZD(u) at u*.

Stimulated in large part by applications in Lagrangian relaxation, the field of nondifferentiable optimization using subgradients has recently become an important topic of study in its own right with a large and growing literature. Our review of algorithms for (D) will be brief and limited to the following three approaches that have been popular in Lagrangian relaxation applications: (1) the subgradient method, (2) various versions of the simplex method implemented using column generation techniques, and (3) multiplier adjustment methods. Fisher et al. (1975) and Held et al. (1974) contain general discussions on the solution of (D) within the …
Figure 1   The Form of ZD(u)
[Figure: ZD(u) plotted against u as the lower envelope of linear pieces; the pieces w = cx2 + u(Ax2 − b), w = cx3 + u(Ax3 − b), and w = cx4 + u(Ax4 − b) are labeled.]
This line search problem is easily solved by Fibonacci methods.

Generally, the simplex-based methods are harder to program and have not performed quite so well computationally as the subgradient method. They should not be counted out, however. Further research could produce attractive variants. We note also that the dual, primal-dual and BOXSTEP methods can all be used in tandem with the subgradient method by initiating them with a point determined by the subgradient method. Using them in this fashion to finish off a dual optimization probably best exploits their comparative advantages.

The third approach, multiplier adjustment methods, are specialized algorithms for (D) that exploit the structure of a particular application. In these methods, a sequence u^k is generated by the rule u^{k+1} = u^k + t_k d^k, where t_k is a positive scalar and d^k is a direction. To determine d^k we define a finite and usually small set S of primitive directions for which it is easy to evaluate the directional derivative of ZD(u). Usually directions in S involve changes in only one or two multipliers. Directions in S are scanned in fixed order and d^k is taken to be either the first direction found along which ZD(u) increases or the direction of steepest ascent within S. The step size t_k can be chosen either to maximize ZD(u^k + t d^k) or to take us to the first point at which the directional derivative changes. If S contains no improving direction we terminate, which, of course, can happen prior to finding an optimal solution to (D).

Successful implementation of primitive-direction ascent for a particular problem requires an artful specification of the set S. S should be manageably small, but still include directions that allow ascent to at least a near optimal solution. Held and Karp (1970) experimented with primitive-direction ascent in their early work on the traveling salesman problem. They had limited success using a set S consisting of all positive and negative coordinate vectors. This seemed to discourage other researchers for some time, but recently Erlenkotter (1978) devised a multiplier adjustment method for the Lagrangian relaxation of the uncapacitated location problem given in Cornuejols et al. (1977) in the case where the number of facilities located is unconstrained. Although discovered independently, Erlenkotter's algorithm is a variation on a method of Bilde and Krarup that was first described in 1967 in a Danish working paper and later published in English as Bilde and Krarup (1977). While there has been no direct comparison, Erlenkotter's method appears to perform considerably better than the subgradient method. Fisher and Hochbaum (1980) have experimented with multiplier adjustment for another location problem and found the method to work well, but not quite so well as the subgradient method.

Fisher et al. (1980) have successfully developed a multiplier adjustment method for the generalized assignment problem in which one multiplier at a time is increased. This method has led to a substantially improved algorithm for the generalized assignment problem.

7. How Good Are the Bounds?

The "answer" to this question that is available in the literature is completely problem specific and largely empirical. Most of the empirical results are summarized in Table 2. Each line of this table corresponds to a paper on a particular application of Lagrangian relaxation and gives the problem type, the source in which the computational experience is given, the number of problems attempted, the percentage of problems for which a u was discovered with ZD(u) = ZD = Z, and the average value of ZD(u*) × 100 divided by the average value of Z, where ZD(u*) denotes the largest bound discovered for each problem instance. Except as noted for the generalized assignment problem, all samples included a reasonable number of large problems. In some cases the sample included significantly larger problems than had been previously attempted. Frequently, standard test problems known for their difficulty were included. Table 2 is based on the results reported in each reference for all problems for which complete information was given. Of course, Table 2 gives highly aggregated information, and interested readers are urged to consult the appropriate references.

These results provide overwhelming evidence that the bounds provided by Lagrangian relaxation are extremely sharp. It is natural to ask why Lagrangian bounds are so sharp. I am aware of only one analytic result that even begins to answer this question. This result was developed by Cornuejols et al. (1977) for the K-median problem.

Given n possible facility locations, m markets, and a nonnegative value c_ij for serving market i from a facility at location j, the K-median problem asks where K facilities should be located to maximize total value. Let

y_j = 1 if a facility is placed in location j, 0 otherwise;
x_ij = 1 if market i is served from location j, 0 otherwise.
If y_j = 0 we must have x_ij = 0 for all i. Thus the K-median problem can be formulated as the integer program

Z = max Σ_{i=1}^{m} Σ_{j=1}^{n} c_ij x_ij,     (5)
s.t. Σ_{j=1}^{n} x_ij = 1,  i = 1, ..., m,     (6)
     x_ij ≤ y_j, all i and j,     (7)
     Σ_{j=1}^{n} y_j = K,     (8)
     x_ij, y_j = 0 or 1, all i and j.     (9)

Dualizing constraints (6) gives the Lagrangian problem

ZD(u) = max Σ_{i=1}^{m} Σ_{j=1}^{n} c_ij x_ij + Σ_{i=1}^{m} u_i (Σ_{j=1}^{n} x_ij − 1)
        subject to (7), (8) and (9)
      = max Σ_{i=1}^{m} Σ_{j=1}^{n} (c_ij + u_i) x_ij − Σ_{i=1}^{m} u_i.

Hence, defining c̄_j = Σ_{i=1}^{m} max(0, c_ij + u_i), optimal y_j's must solve

max Σ_{j=1}^{n} c̄_j y_j,
Σ_{j=1}^{n} y_j = K,
y_j = 0 or 1,  j = 1, ..., n.

… the best possible. This is an interesting first step towards understanding why Lagrangian relaxation has worked well on so many problems. Further study of this type is needed to understand and better exploit the power of Lagrangian relaxation.
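Evaluating the K-median Lagrangian bound described above amounts to computing c̄_j for each location and keeping the K largest values. A minimal sketch with hypothetical data (brute force for the exact optimum; since this is a maximization problem, ZD(u) ≥ Z for every u):

```python
from itertools import combinations

# Hypothetical K-median data: cval[i][j] = value of serving market i from j.
cval = [[3.0, 1.0, 0.0],
        [0.0, 2.0, 4.0],
        [1.0, 0.0, 2.0]]
m, n, K = 3, 3, 1

def z_d(u):
    """Lagrangian bound: cbar_j = sum_i max(0, c_ij + u_i); open the K
    locations with the largest cbar_j and subtract sum_i u_i."""
    cbar = [sum(max(0.0, cval[i][j] + u[i]) for i in range(m)) for j in range(n)]
    return sum(sorted(cbar, reverse=True)[:K]) - sum(u)

def z_exact():
    """Brute-force optimum: open K locations, serve each market from the
    best open location."""
    return max(sum(max(cval[i][j] for j in S) for i in range(m))
               for S in combinations(range(n), K))

Z = z_exact()
for u in ([0.0, 0.0, 0.0], [-1.0, -2.0, 0.0]):
    assert z_d(u) >= Z - 1e-9   # an upper bound for the maximization problem
```

On this instance u = 0 already gives ZD(u) = Z; perturbing u shows the bound can only weaken, never drop below Z.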
… able for solving the (generally large) LP relaxation of (P). The important message of these applications is that combinatorial optimization problems frequently can be formulated as a large IP whose LP relaxation closely approximates the IP and can be solved quickly by dual methods. To exploit this fact, future research should be broadly construed to develop methods for solving the large structured LPs arising from combinatorial problems and to understand the properties of combinatorial problems that give rise to good LP approximations. There has already been significant research on methods other than Lagrangian relaxation for exploiting the special structure of LPs derived from combinatorial problems. Schrage (1975), Miliotis (1976a, b), and Christofides and Whitlock (1978) have given clever LP solution methods that exploit certain types of structure that are common in formulations of combinatorial problems.

9. Feasible Solutions

This section is concerned with using (LRu) to obtain feasible solutions for (P). It is possible in the course of solving (D) that a solution to (LRu) will be discovered that is feasible in (P). Because the dualized constraints Ax = b are equalities, this solution is also optimal for (P). If the dualized constraints contain some inequalities, a Lagrangian problem solution can be feasible but nonoptimal for (P). However, it is rare that a feasible solution of either type is discovered. On the other hand, it often happens that a solution to (LRu) obtained while optimizing (D) will be nearly feasible for (P) and can be made feasible with some judicious tinkering. Such a method might be called a Lagrangian heuristic. After illustrating this approach for the generalized assignment problem and (LR1u), we will discuss computational experience with Lagrangian heuristics for other problems.

It is convenient to think of the generalized assignment problem as requiring a packing of n items into m knapsacks using each item exactly once. In (LR1u) the constraints Σ_{i=1}^{m} x_ij = 1, j = 1, ..., n, requiring that each item be used exactly once are dualized and may be violated. Let x̄ denote an optimal solution to (LR1u). Partition N = {1, ..., n} into three sets defined by

S1 = {j ∈ N | Σ_{i=1}^{m} x̄_ij = 0},
S2 = {j ∈ N | Σ_{i=1}^{m} x̄_ij = 1},
S3 = {j ∈ N | Σ_{i=1}^{m} x̄_ij > 1}.

The constraints of (P) which are violated by x̄ correspond to j ∈ S1 ∪ S3. We wish to modify x̄ so that these constraints are satisfied. This is easy for a j ∈ S3. Simply remove item j from all but one knapsack. A variety of rules could be used to determine in which knapsack to leave item j. For example, it would be reasonable to choose the knapsack that maximizes (u_j − c_ij)/a_ij.

To complete the construction of a feasible solution it is only necessary to assign items in S1 to knapsacks. While there is no guarantee that this can be done, the chances of success should be good unless the knapsack constraints are very tight. Many assignment rules are plausible, such as the following one that is motivated by bin packing heuristics. Order items in S1 by decreasing value of Σ_{i=1}^{m} a_ij and place each item in turn into a knapsack with sufficient capacity that maximizes (u_j − c_ij)/a_ij.

Several researchers have reported success using Lagrangian problem solutions obtained during the application of the subgradient method to construct primal feasible solutions. For example, this is easy to do for the K-median problem. Let x̄, ȳ denote a feasible solution to the Lagrangian problem defined in §7 for the K-median problem. Let S = {j | ȳ_j = 1} and for each i set x̂_ij = 1 for a j that solves max_{j∈S} c_ij. Set x̂_ij = 0 for remaining ij. The solution x̂, ȳ is feasible and represents the best assignment of x given ȳ. Cornuejols et al. (1977) found that this approach performed as well as the best of several other heuristics they tested.

Fisher (1976) reports experience for the problem of sequencing n jobs on one machine to minimize a tardiness function. A Lagrangian solution is a set of start times x_1, ..., x_n for the n jobs that may violate the machine constraints. A primal feasible solution is obtained by sequencing jobs on the machine in order of increasing x_j values. This rule was tested on 63 problems. It was applied in conjunction with the subgradient method after an initial feasible solution had been generated by a greedy heuristic. The greedy heuristic found an optimal solution for 18 of the problems. The Lagrangian heuristic found optimal solutions to 21 of the remaining 45 problems. On average the greedy value was 100.4% of the optimal value while the value of the solution produced by the Lagrangian heuristic was 100.16% of the optimal value.

10. Using Lagrangian Relaxation in Branch and Bound

The issues involved in designing a branch and bound algorithm that uses a Lagrangian relaxation are essentially the same as those that arise when a linear programming relaxation is used. Some of these issues are
illustrated here for the generalized assignment problem and (LR1u) derived in §3.

A natural branching tree for this problem is illustrated in Figure 2. This tree exploits the structure of constraints (2) by selecting a particular index j when branching and requiring exactly one variable in the set x_ij, i = 1, ..., m, to equal 1 along each branch. A Lagrangian relaxation (presumably (LR1u) given the discussion in §8) can be used at each node of this tree to obtain lower bounds and feasible solutions. We note that the Lagrangian problem defined at a particular node of this tree has the same structure as (LR1u) and is no harder to solve. This is an obvious property that must hold for any application. Sometimes it is desirable to design the branching rules to achieve this property (e.g., Held and Karp 1971).

Figure 2   Partial Branching Tree for the Generalized Assignment Problem with m = 3
[Figure: a branching tree whose branches fix x_ij = 1 for the selected index j at each level; branch labels such as x_21 = 1 and x_23 = 1 are visible in the original figure.]

There are several tactical decisions that must be made in any branch and bound scheme such as which node to explore next and what indices (j1, j2 and j3 in Figure 2) to use in branching. Lagrangian relaxation can be used in making these decisions in much the same way that linear programming would be used. For example, we might choose to branch on an index j for which u_j(Σ_{i=1}^{m} x̄_ij − 1) is large in the current Lagrangian problem solution in order to strengthen the bounds as much as possible.

Finally, we note that the method for optimizing (D) must be carefully integrated into the branch and bound algorithm to avoid doing unnecessary work when (D) is reoptimized at a new node. A common strategy when using the subgradient method is to take u^0 equal to the terminal value of u at the previous node. The subgradient method is then run for a fixed number of iterations that depends on the type of node being explored. At the first node a large number of iterations is used. When branching down a small number is used, and when backtracking, an intermediate number.

11. Conclusions and Future Research Directions

Lagrangian relaxation is an important new computational technique in the management scientist's arsenal. This paper has documented a number of successful applications of this technique, and hopefully will inspire other applications. Besides additional applications, what opportunities for further research exist in this area? The most obvious is development of more powerful technology for optimizing the nondifferentiable dual function. Nondifferentiable optimization has become an important general research area that surely will continue to grow. One corner of this area that seems to hold great promise for Lagrangian relaxation is the development of multiplier adjustment methods of the type described at the end of §6. The enormous success that has been obtained with this approach on the uncapacitated location (Erlenkotter 1978) and the generalized assignment problems (Fisher et al. 1980) suggests that it should be tried on other problems. Two other research areas that deserve further attention are the development and analysis of Lagrangian heuristics as described in §9 and the analysis (worst-case or probabilistic) of the quality of the bounds produced by Lagrangian relaxation as discussed in §7 and Cornuejols et al. (1977).

Acknowledgments
This paper was supported in part by NSF Grant ENG-7826500 to the University of Pennsylvania.

References
Balas, E., N. Christofides. 1976. Talk presented at the Ninth International Symposium on Mathematical Programming, Budapest.
Balinski, M. L., P. Wolfe, eds. 1975. Nondifferentiable optimization. Math. Programming Stud. 3 (November).
Bazarra, M. S., J. J. Goode. 1977. The traveling salesman problem: A duality approach. Math. Programming 13 221-237.
Bilde, O., J. Krarup. 1977. Sharp lower bounds and efficient algorithms for the simple plant location problem. Ann. Discrete Math. 1 79-97.
Burdet, C. A., E. L. Johnson. 1977. A subadditive approach to solve linear integer programs. Ann. Discrete Math. 1 117-143.
Camerini, P. M., L. Fratta, F. Maffioli. 1975. On improving relaxation methods by modified gradient techniques. Math. Programming Stud. 3 26-34.
Camerini, P. M., F. Maffioli. 1978. Heuristically guided algorithms for K-parity matroid problems. Discrete Math. 103-116.
Chalmet, L. G., L. F. Gelders. 1976. Lagrangian relaxations for a generalized assignment-type problem. Proc. Second Eur. Congress Oper. Res. North-Holland, Amsterdam, 103-109.
Christofides, N., C. Whitlock. 1978. An LP-based TSP algorithm. Imperial College Report 78-79.
Cornuejols, G., M. L. Fisher, G. L. Nemhauser. 1977. Location of bank accounts to optimize float: An analytic study of exact and approximate algorithms. Management Sci. 23 789-810.
Erlenkotter, D. 1978. A dual-based procedure for uncapacitated facility location. Oper. Res. 26(1) 992-1009.
Etcheberry, J. 1977. The set-covering problem: A new implicit enumeration algorithm. Oper. Res. 25 760-772.
Etcheberry, J., C. Conca, E. Stacchetti. 1978. An implicit enumeration approach for integer programming using subgradient optimization. Pub. No. 78/04/c, Universidad de Chile, Santiago, Chile.
Everett, H. 1963. Generalized Lagrange multiplier method for solving problems of optimum allocation of resources. Oper. Res. 11 399-417.
Fisher, M. L. 1973. Optimal solution of scheduling problems using Lagrange multipliers: Part I. Oper. Res. 21 1114-1127.
Fisher, M. L. 1976. A dual algorithm for the one-machine scheduling problem. Math. Programming 11 229-251.
Fisher, M. L., D. S. Hochbaum. 1980. Database location in computer networks. J. Assoc. Comput. Mach. 27(4) 718-735.
Fisher, M. L., R. Jaikumar, L. Van Wassenhove. 1980. A multiplier adjustment method for the generalized assignment problem. Decision Sciences Working paper, University of Pennsylvania.
Fisher, M. L., G. L. Nemhauser, L. A. Wolsey. 1979. An analysis of approximations for finding a maximum weight Hamiltonian circuit. Oper. Res. 27(4) 799-809.
Fisher, M. L., W. D. Northup, J. F. Shapiro. 1975. Using duality to solve discrete optimization problems: Theory and computational experience. Math. Programming Stud. 3 56-94.
Fisher, M. L., J. F. Shapiro. 1974. Constructive duality in integer programming. SIAM J. Appl. Math. 27 31-52.
Geoffrion, A. M. 1974. Lagrangian relaxation and its uses in integer programming. Math. Programming Stud. 2 82-114.
Geoffrion, A. M., R. McBride. 1978. Lagrangian relaxation applied to capacitated facility location problems. AIIE Trans. 10 40-47.
Gilmore, P. C., R. E. Gomory. 1963. A linear programming approach to the cutting-stock problem, Part II. Oper. Res. 11 863-888.
Goffin, J. L. 1977. On the convergence rates of subgradient optimization methods. Math. Programming 13 329-347.
Helbig Hansen, K., J. Krarup. 1974. Improvements of the Held-Karp algorithm for the symmetric traveling-salesman problem. Math. Programming 7 87-96.
Held, M., R. M. Karp. 1970. The traveling salesman problem and minimum spanning trees. Oper. Res. 18 1138-1162.
Held, M., R. M. Karp. 1971. The traveling salesman problem and minimum spanning trees: Part II. Math. Programming 1 6-25.
Held, M., P. Wolfe, H. D. Crowder. 1974. Validation of subgradient optimization. Math. Programming 6 62-88.
Hogan, W. W., R. E. Marsten, J. W. Blankenship. 1975. The boxstep method for large scale optimization. Oper. Res. 23(3).
Lorie, J., L. J. Savage. 1955. Three problems in capital rationing. J. Bus. 229-239.
Mairs, T. G., G. W. Wakefield, E. L. Johnson, K. Spielberg. 1978. On a production allocation and distribution problem. Management Sci. 24(15) 1622-1630.
Marsten, R. 1975. The use of the boxstep method in discrete optimization. Math. Programming Stud. 3 127-144.
Miliotis, T. 1976a. Integer programming approaches to the travelling salesman problem. Math. Programming 10 367-378.
Miliotis, T. 1976b. An arithmetic all-integer LP-cutting planes code applied to the travelling salesman problem. Working paper, London School of Economics, London, U.K.
Muckstadt, J., S. A. Koenig. 1977. An application of Lagrangian relaxation to scheduling in power generation systems. Oper. Res. 25 387-403.
Mulvey, J., H. Crowder. 1979. Cluster analysis: An application of Lagrangian relaxation. Management Sci. 25 329-340.
Nemhauser, G. L., G. Weber. 1978. Optimal set partitioning, matchings and Lagrangian duality. Talk delivered at the New York ORSA/TIMS Meeting (May).
Polak, B. T. 1967. A general method for solving extremum problems. Soviet Math. Dokl. 8 593-597.
Ross, G. T., R. M. Soland. 1975. A branch and bound algorithm for the generalized assignment problem. Math. Programming 8 91-103.
Schrage, L. 1975. Implicit representation of variable upper bounds in linear programming. Math. Programming Stud. 4 118-132.
Shapiro, J. F. 1971. Generalized Lagrange multipliers in integer programming. Oper. Res. 19 68-76.
Shapiro, J. F. 1979a. A survey of Lagrangian techniques for discrete optimization. Ann. Discrete Math. 5 113-138.
Shapiro, J. F. 1979b. Mathematical Programming: Structures and Algorithms. Wiley, New York.
Williams, H. P. 1974. Experiments in the formulation of integer programming problems. Math. Programming Stud. 2 180-197.
Williams, H. P. 1975. The reformulation of two mixed integer programming problems. Math. Programming 14 325-331.

This article originally appeared in Management Science, January 1981, Volume 27, Number 1, pp. 1-18, published by The Institute of Management Sciences. Copyright is held by the Institute for Operations Research and the Management Sciences (INFORMS), Linthicum, Maryland.