Figure 9.1: A network on the cities Mackinaw City, Traverse City, Saginaw, Grand Rapids, Benton Harbor, and Detroit (nodes 1 to 6), with edge lengths entered on the edges.
Figure 9.2: The shortest path tree rooted at a node may not be a minimum length spanning tree.
edge (r; s), a contradiction since T2 is a tree. So, the claim must be
true.

So, at least one of the T2-edges on P1 has cost = c_rs; suppose it is
(j1; j2). Replace (j1; j2) in T1 by (r; s). This leads to a new spanning
tree T1′ with the same cost as T1, but it has one more edge in common
with T2, and it can be verified that it also satisfies (9.1). If T1′ and T2
are distinct, repeat this process with them. After each repetition
we get another spanning tree satisfying (9.1) and having the same cost
as T1, but containing one more edge in common with T2. So, after at
most n − 1 repetitions this process must lead to T2. Hence T1
and T2 have the same cost.
There are only a finite number of spanning trees in G, and hence
a minimum cost spanning tree exists in G. If T0 is a minimum cost
spanning tree, it satisfies (9.1) by Theorem 9.2. But we have just
proved that any pair of spanning trees satisfying (9.1) have the same
cost. Hence every spanning tree in G satisfying (9.1) is a minimum
cost spanning tree.
one among the spanning trees of G containing all the edges in F, and
it contains (p; q), proving the theorem.
Exercises
PRIM’S ALGORITHM
General step Among all out-of-tree nodes j at this stage, find one
with the smallest dj , suppose it is r with temporary label (Pr , dr ).
Delete this temporary label on r and give it the permanent label
Pr (its predecessor index in the tree), i.e., make (Pr , r) an in-tree
edge and r an in-tree node. If there are no out-of-tree nodes left,
terminate; the spanning tree defined by the permanent labels is
a minimum cost spanning tree in G. If there are some out-of-tree
nodes left, update their temporary labels as follows: for each out-
of-tree node j with temporary label (Pj , dj ), change it to (r, crj )
only if (r; j) ∈ A and crj < dj ; leave it unchanged otherwise.
Then go to the next step.
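As a concrete illustration, the following is a minimal Python sketch of the labeling scheme just described. It is only a sketch: the network is assumed to be connected and given as a dictionary {node: {neighbor: cost}}, and the function name prim and the variable names are not from the text.

import math

def prim(graph, root):
    # temporary labels (P_j, d_j) for the out-of-tree nodes
    dist = {j: graph[root].get(j, math.inf) for j in graph if j != root}
    pred = {j: (root if j in graph[root] else None) for j in graph if j != root}
    tree_edges = []
    while dist:                               # some out-of-tree nodes remain
        r = min(dist, key=dist.get)           # smallest temporary label d_r
        tree_edges.append((pred[r], r))       # (P_r, r) becomes an in-tree edge
        del dist[r], pred[r]
        for j, c_rj in graph[r].items():      # update the remaining temporary labels
            if j in dist and c_rj < dist[j]:
                dist[j], pred[j] = c_rj, r
    return tree_edges

For example, on the three-node network {1: {2: 3, 3: 1}, 2: {1: 3, 3: 1}, 3: {1: 1, 2: 1}} with root 1, the function returns [(1, 3), (3, 2)].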
Discussion
KRUSKAL’S ALGORITHM
General step Get the next least cost remaining edge in A. Suppose
it is (i; j). If both the nodes i, j on this edge lie in the same
component of the forest at this stage, discard this edge and go to
the next step. If i, j lie in different components of the forest at
this stage, include (i; j) as an in-forest edge and merge the two
components that it connects into a single component. If there
are now n − 1 in-forest edges, terminate; they constitute a minimum
cost spanning tree in G. Otherwise, go to the next step.
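A minimal Python sketch of this scheme follows; it keeps track of the components of the forest with a simple union-find structure. The edge-list format (cost, i, j) on nodes 0, ..., n − 1 is an assumption made for the example.

def kruskal(n, edges):
    parent = list(range(n))                  # union-find over the forest components
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]    # path halving
            x = parent[x]
        return x
    forest = []
    for cost, i, j in sorted(edges):         # next least cost remaining edge
        ri, rj = find(i), find(j)
        if ri == rj:
            continue                         # i, j in the same component: discard
        parent[ri] = rj                      # merge the two components
        forest.append((i, j))
        if len(forest) == n - 1:             # the forest is now a spanning tree
            break
    return forest

Merging two components is then a single pointer assignment in the union-find structure, rather than an explicit relabeling of one of the components.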
Discussion
Let (i; j) be the edge that has come up for examination at some
stage of this algorithm. Suppose i, j both appear in a component,
Fr = (Nr, Ar) say, at this stage. All the arcs in Ar were examined
earlier, so the cost of every edge in Ar is ≤ c_ij. So, if CC1 is the
fundamental cycle of (i; j) wrt Fr, then (i; j) is a maximum cost edge
on CC1, and by Theorem 9.5 there exists a minimum cost spanning tree
not containing edge (i; j), among all the spanning trees containing all
the edges in the forest at this stage.
Suppose i, j appear in different components, Fr = (Nr , Ar ), and
Fs = (Ns , As ), say. Then (i; j) is a least cost edge in the cut (Nr ; N \Nr ),
and this cut contains none of the forest edges at this stage. So, by
Theorem 9.4, there exists a minimum cost spanning tree containing edge
(i; j) among the spanning trees containing all the edges in the forest at
this stage.
Applying these arguments in each step from the beginning, we conclude
that the spanning tree obtained at the termination of this algorithm
is a minimum cost spanning tree in G. It can also be verified
that this spanning tree satisfies (9.1).
Computationally, the most expensive operation in this algorithm
is that of arranging the edges in A in ascending order of cost at the
beginning, which requires O(m log m) effort, where m = |A|. Also,
since m < n², this is the same as O(m log n), where n = |N|. This is
the worst case computational complexity of this method.
It is not actually necessary to order all the edges in A in ascending
order of cost, since we will actually select only (n − 1) of the edges for
the final spanning tree. Any partial quick sort scheme which produces
the least cost remaining edge would be adequate. An ideal scheme
for this is a multipass sorting routine in which each pass produces
the next edge in the ordered sequence very efficiently. Even with all
these refinements, this method is clearly best suited to minimum
cost spanning tree problems on relatively sparse networks.
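One simple way to realize such a multipass scheme is to keep the edges in a binary heap and extract them one at a time; the small sketch below is an illustration of this idea, not the text's own routine.

import heapq

def edges_in_cost_order(edges):
    heap = list(edges)             # edges as (cost, i, j) tuples
    heapq.heapify(heap)            # O(m) heap construction
    while heap:
        yield heapq.heappop(heap)  # next edge in the ordered sequence, O(log m)

In the Kruskal sketch given earlier, sorted(edges) could be replaced by edges_in_cost_order(edges); since that scan stops after n − 1 acceptances, much of the ordering work may then be avoided.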
BORUVKA’S ALGORITHM
Figure 9.3: A situation for the selected edges that could lead to a cycle
in Boruvka's algorithm. (The trees F1, . . ., Ft are joined in a ring by the
selected edges e_{p_1}, . . ., e_{p_t}.)
Discussion
We need to show that the addition of all the new edges in a step
does not create a cycle. Suppose at the beginning of some step, the
trees in the forest are F1 = (N1, A1), . . ., Fl = (Nl, Al). Suppose a
cycle is created when all the new edges selected in this step are added
to this forest. Each of the selected edges joins a node in one of these
trees to a node outside this tree. So, a cycle can only be created if a
subset of 2 or more selected edges, e_{p_1}, . . ., e_{p_t} say, are such that: e_{p_r},
the least cost edge in the cut (Nr, N\Nr), joins Fr and another tree,
F_{r+1} say, for r = 1 to t, with t + 1 being 1. So, the trees that these
selected edges join are as in Figure 9.3.
Both edges e_{p_{r−1}} and e_{p_r} are in the cut (Nr, N\Nr), but e_{p_r} has been
selected as the least cost edge in this cut in this step. So, by the arrangement made
at the beginning of this algorithm, c_{p_r} ≤ c_{p_{r−1}}, and if c_{p_r} = c_{p_{r−1}} then
p_r < p_{r−1}, for r = 1 to t with t + 1 being 1. So, c_{p_t} ≤ c_{p_{t−1}} ≤ . . . ≤
c_{p_1} ≤ c_{p_t}, which implies that c_{p_t} = c_{p_{t−1}} = . . . = c_{p_1} = c_{p_t}. Hence we
must have p_t < p_{t−1} < . . . < p_1 < p_t, which is impossible. Hence cycles
can never be created in this algorithm.
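The following is a minimal Python sketch of one way to implement these steps, with cost ties broken by the edge's position in the initial arrangement exactly as the argument above requires; the input format (a list of (cost, i, j) edges on nodes 0, ..., n − 1, connected network) and the names are assumptions made for the example.

def boruvka(n, edges):
    comp = list(range(n))                    # component label of each node
    def find(x):
        while comp[x] != x:
            comp[x] = comp[comp[x]]
            x = comp[x]
        return x
    def key(p):                              # cost first, index breaks ties,
        return (edges[p][0], p)              # as in the initial arrangement
    forest, n_components = [], n
    while n_components > 1:
        best = {}                            # component -> index of its cheapest outgoing edge
        for p, (c, i, j) in enumerate(edges):
            ri, rj = find(i), find(j)
            if ri == rj:
                continue
            for r in (ri, rj):
                if r not in best or key(p) < key(best[r]):
                    best[r] = p
        for p in set(best.values()):         # add all edges selected in this step
            c, i, j = edges[p]
            ri, rj = find(i), find(j)
            if ri != rj:                     # never fails, by the argument above
                comp[ri] = rj
                forest.append((i, j))
                n_components -= 1
    return forest

Without the tie-breaking rule, equal cost edges selected in the same step could indeed close a cycle, which is exactly what the argument above rules out.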
The validity of this algorithm follows from Theorem 9.4. In this
cij                                                ūi
        j =   1     2     3     4     5     6
i = 1         1     2     6    10    17    29      0
    2         3     4     8    11    20    30      2
    3         5     7     9    12    22    33      4
    4        13    14    15    16    23    34      9
    5        18    19    21    24    25    35     16
    6        26    27    28    31    32    36     24
v̄j            1     2     5     7     9    12
It can be verified that the greedy method produces the unit matrix,
with all allocations along the main diagonal, as its answer for the
problem of maximizing the objective function. However, from the ū, v̄
vectors given in the above tableau, it can be verified that
ū_i + v̄_j ≤ c_ij for all i, j, and ū_i + v̄_i = c_ii for i = 1 to
6; by the results in Chapter 3 this establishes that the unit matrix
is the minimum objective value assignment in this problem. Hence
the unit matrix is the worst candidate for the problem of maximizing
the objective function considered here. So, on this problem, any
random selection is a better answer than that produced by the greedy
algorithm.
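These claims can also be checked directly by enumeration; the short script below (an illustration, with rows and columns indexed 0 to 5) runs the greedy rule for the maximization problem on the matrix of the tableau and compares the cost of its answer with all 6! = 720 assignments.

from itertools import permutations

C = [[ 1,  2,  6, 10, 17, 29],
     [ 3,  4,  8, 11, 20, 30],
     [ 5,  7,  9, 12, 22, 33],
     [13, 14, 15, 16, 23, 34],
     [18, 19, 21, 24, 25, 35],
     [26, 27, 28, 31, 32, 36]]
n = len(C)

# greedy for maximization: repeatedly take the largest entry whose row
# and column are still free
free_rows, free_cols, greedy = set(range(n)), set(range(n)), {}
while free_rows:
    i, j = max(((i, j) for i in free_rows for j in free_cols),
               key=lambda ij: C[ij[0]][ij[1]])
    greedy[i] = j
    free_rows.remove(i)
    free_cols.remove(j)

costs = [sum(C[i][p[i]] for i in range(n)) for p in permutations(range(n))]
diagonal_cost = sum(C[i][i] for i in range(n))
print(greedy)                      # row i is matched with column i: the main diagonal
print(diagonal_cost, min(costs))   # both equal 91: the diagonal attains the minimum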
A square matrix C of order n is defined to be gullible (to indicate
that it will be an easy victim of a shady lady) if the greedy
algorithm produces the worst possible answer for the objective maximizing
assignment problem with C as the objective coefficient matrix.
Let û = (û_1, . . ., û_n), v̂ = (v̂_1, . . ., v̂_n) be any pair of vectors in which
of the in-tree arc e_{r+j} wrt T; suppose it is (X; X̄). The edge e_{r+j} is
the unique in-tree edge of T in the cutset (X; X̄), and it is a minimum
cost edge among the edges of this cutset not in {b1, . . ., bs}, since T is a
minimum cost spanning tree among those in τ. The subset τ_j is empty
iff the cutset (X; X̄) ⊂ {b1, . . ., bs, e_{r+j}}, the set of excluded edges in
τ_j. If (X; X̄) ⊄ {b1, . . ., bs, e_{r+j}}, let ē_{r+j} be a minimum cost edge in
the cutset (X; X̄) that is not contained in {b1, . . ., bs, e_{r+j}}. Then the
spanning tree T_j obtained by replacing e_{r+j} in T by ē_{r+j} is a minimum
cost spanning tree among those in τ_j. Clearly, the computational effort
needed to check whether τ_j = ∅, and to find a minimum cost spanning
tree T_j in it if it is nonempty, is at most O(m) by this approach.
Each of the sets τ1, . . ., τ_{n−r−1} is again in the same form as in (9.4),
and clearly they are mutually disjoint, and their union is τ\{T}. These
sets are said to be the new sets generated when τ is partitioned using
the minimum cost spanning tree in it. The number of these sets is
n − 1 − (number of included edges in τ), and some of these sets may be
empty. Clearly, a minimum cost spanning tree among those in τ\{T}
is the best among {T_j : j = 1 to n − r − 1 such that τ_j ≠ ∅}.
The ranking algorithm generates various sets of spanning trees in
G of the form defined in (9.4). Each nonempty set among these is
stored in a list, together with a minimum cost spanning tree in it, in
increasing order of the cost of this tree, going from top to bottom.
Each step of the algorithm generates one additional spanning tree in
the ranked sequence, and adds at most n − 1 new sets to the list. The
computational effort of each step is at most O(mn), and the method
can be terminated any time, after enough spanning trees in the ranked
sequence have been obtained.
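The following Python sketch illustrates this partitioning scheme. It represents each set of form (9.4) by its lists of included and excluded edges and, for simplicity, finds the minimum cost spanning tree in each new set by rerunning a constrained version of Kruskal's algorithm rather than by the O(m) edge-exchange update described above; the function names and the (cost, u, v) edge format are assumptions made for the example.

import heapq
from itertools import count

def constrained_mst(n, edges, included, excluded):
    # min cost spanning tree containing all of `included` and none of
    # `excluded`, or None if the corresponding set of trees is empty
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x
    tree, total = [], 0
    for e in included:                       # force the included edges first
        c, u, v = e
        ru, rv = find(u), find(v)
        if ru == rv:
            return None                      # included edges would form a cycle
        parent[ru] = rv
        tree.append(e); total += c
    for e in sorted(edges):                  # then ordinary Kruskal on the rest
        if e in excluded or e in included:
            continue
        c, u, v = e
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            tree.append(e); total += c
    return (total, tree) if len(tree) == n - 1 else None

def rank_spanning_trees(n, edges, k):
    # yield up to k spanning trees in nondecreasing order of cost
    tick = count()                           # tie-breaker for the heap
    first = constrained_mst(n, edges, [], set())
    if first is None:
        return
    heap = [(first[0], next(tick), [], set(), first[1])]
    while heap and k > 0:
        cost, _, inc, exc, tree = heapq.heappop(heap)
        yield cost, tree
        k -= 1
        # partition the popped set using its minimum cost tree T: the j-th
        # new set includes the first j-1 free edges of T and excludes its j-th
        free = [e for e in tree if e not in inc]
        for j, e in enumerate(free):
            new_inc, new_exc = inc + free[:j], exc | {e}
            best = constrained_mst(n, edges, new_inc, new_exc)
            if best is not None:             # the new set is nonempty
                heapq.heappush(heap, (best[0], next(tick), new_inc, new_exc, best[1]))

Each popped set contributes the next spanning tree of the ranked sequence and adds at most n − 1 new sets to the heap, matching the description above.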
RANKING ALGORITHM
spanning tree in them (as you go from top to bottom) in the list.
Discussion
Figure 9.5:
9.5 Exercises
9.3 Construct a directed network with positive arc lengths to show
that a shortest chain tree rooted at a node may not be a minimum cost
branching rooted at that node (try the network in Figure 9.5, with
node 1 as the root node. Arc lengths are entered on the arcs).
9.4 Given a minimum cost spanning tree in an undirected network,
develop an efficient procedure for doing cost ranging on it, i.e., to
determine the range of values of each cost coefficient which does not affect
the minimality of the tree (Tarjan [1982]).
9.5 We are given a minimum cost spanning tree T in an undirected
connected network. Develop an efficient procedure to update T into a
minimum cost spanning tree when a new node and incident edges are
added to the network (Spira and Pan [1975]).
9.6 Develop good heuristic algorithms based on the greedy approach
for the following combinatorial optimization problems.
minimize    cx
subject to  Ax ≥ e
            x is a 0−1 vector
Figure 9.6: Entries on the edges are a_ij, b_ij in that order.
cost spanning tree T̂(α) is actually a minimum cost spanning tree for
every α in that interval. Hence h(α) is linear in each of these intervals.
Using this result, develop an efficient algorithm for finding a spanning
tree T in G that minimizes z(T), and determine its computational
complexity (Chandrasekaran [1977]).
r on arc (i, j). Let x = (x_ij), y = (y^r_ij). Consider the following system
of constraints.
Prove that the projection of the feasible solution set of (9.5) into the
x-space is the convex hull of the spanning tree incidence vectors in G (Kipp
Martin [1986], Wong [1984]).
heuristic method, for finding a minimum cost feasible spanning tree for
the capacitated problem (Chandy and Lo [1973]).
Σ(x_e : over e ∈ E(N̄)) ≤ |N̄| − 1, for each N̄ ⊆ N with |N̄| ≥ 2
x_e ≥ 0, for all e ∈ A
(Edmonds [1969]).
9.17 Let l_ij denote the number of lines on the simple path between
nodes i, j in a tree T. For each i in T, define θ_i = max. {l_ij : over nodes
j in T}. A center of T is defined to be a node in T that minimizes
θ_i. Show that the following algorithm finds a center for T. Also, find
the center of the tree in Figure 9.8 using this algorithm.
Step 1 If the number of nodes in the tree is ≤ 2, any node in the
tree is a center; terminate. Otherwise go to Step 2.
Step 2 Identify all the terminal nodes in the tree, and delete them
and the lines incident at them from the tree. Go to Step 3.
Step 3 If the number of remaining nodes is ≤ 2, any of these is a
center for the original tree; terminate. Otherwise return to Step
2 with the remaining tree.
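A small Python sketch of this leaf-stripping procedure, assuming the tree is given as an adjacency dictionary {node: set of neighbors}; the function name is illustrative.

def tree_center(adj):
    adj = {i: set(nbrs) for i, nbrs in adj.items()}   # work on a copy
    remaining = set(adj)
    while len(remaining) > 2:
        leaves = [i for i in remaining if len(adj[i]) <= 1]
        for i in leaves:                              # delete the terminal nodes
            for j in adj[i]:
                adj[j].discard(i)                     # and the lines incident at them
            remaining.discard(i)
    return remaining                                  # the one or two center candidates

For the path 1 - 2 - 3, tree_center({1: {2}, 2: {1, 3}, 3: {2}}) returns {2}.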
Figure 9.8:
Figure 9.9:
that the following variant of Prim’s algorithm finds such a tree. In this
algorithm, X denotes the set of in-tree nodes, which grows by one in
each step.
Step 1 For each j ∈ N \X, define dj = min. {cij : over i ∈ X}. Find
a node q ∈ N \X which minimizes dj over j in this set. Let Y be
the set of all i in X which tie for the minimum in the definition
of dq . If Y is a singleton set, let p be the node in it, otherwise
define p to be the node in Y which has the maximum number of
in-tree edges incident at it at this stage. Select (p; q) as the next
in-tree edge, and transfer q into X from N \X. If X is now N ,
terminate. Otherwise, repeat this step.
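A minimal Python sketch of this variant is given below. The graph format (a dict {node: {neighbor: cost}}, connected network), the choice of the initial node in X, and the way further ties within Y are resolved are assumptions made for the example, since the visible part of the exercise leaves them open.

import math

def prim_variant(graph, start):
    X = {start}
    degree = {i: 0 for i in graph}           # number of in-tree edges at each node
    tree = []
    while X != set(graph):
        # d_j = min {c_ij : over i in X} for every out-of-tree node j
        best_q, best_d = None, math.inf
        for j in graph:
            if j in X:
                continue
            d_j = min((graph[i][j] for i in X if j in graph[i]), default=math.inf)
            if d_j < best_d:
                best_q, best_d = j, d_j
        q = best_q
        # Y = in-tree nodes tying for the minimum in the definition of d_q
        Y = [i for i in X if q in graph[i] and graph[i][q] == best_d]
        p = max(Y, key=lambda i: degree[i])  # most in-tree edges incident at it
        tree.append((p, q))                  # (p; q) becomes the next in-tree edge
        degree[p] += 1
        degree[q] += 1
        X.add(q)
    return tree

When several nodes of Y have the same maximum degree, max picks one of them arbitrarily, a point the visible exercise statement leaves open.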
l^r_ij = cost, as defined above, of the least cost path from i to j
that passes through only vertices in the set {1, 2, . . ., r}.
For any i, j ∈ N, the least cost path between i and j with no
intermediate vertex higher than r either passes through r (in this case
l^r_ij = max. {l^r_ir, l^r_rj}) or does not (in this case l^r_ij = l^{r−1}_ij). Using this,
prove that l^r_ij = min. {l^{r−1}_ij, max. {l^{r−1}_ir, l^{r−1}_rj}}.
Prove that the unique minimum cost spanning tree in G in this
case can be recovered from the costs of the least cost paths using the
rule: the edge (i; j) ∈ A is in the minimum cost spanning tree in G iff
l^0_ij = l^n_ij (Maggs and Plotkin [1988]; see the computational sketch below).
2. If all the entries in the cost vector c are distinct, prove that there
is a unique minimum cost spanning tree in G.
3. Suppose that some of the edge cost coefficients are equal to each
other. Arrange the distinct values among {cij : (i; j) ∈ A} in
increasing order, and let them be α1 , . . . , αp . For t = 1 to p, let
St = {(i; j) : cij = αt }. Then S1 is the set of least cost edges in
G, S2 is the set of edges with the second best cost, etc.
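The computational sketch referred to above follows. It is only an illustration: reading the recurrence, the cost of a path is taken to be the maximum edge cost on it, the edge costs are assumed distinct as in the exercise, and the cost-matrix input format and function names are my own.

import math

def minimax_path_costs(c):
    # c is a symmetric cost matrix with math.inf for missing edges;
    # the returned l[i][j] is l^n_ij, computed by the recurrence above
    # in the style of the Floyd-Warshall algorithm.
    n = len(c)
    l = [row[:] for row in c]          # l^0: the direct edge costs
    for i in range(n):
        l[i][i] = 0                    # empty path from a node to itself
    for r in range(n):                 # allow r as an intermediate vertex
        for i in range(n):
            for j in range(n):
                via_r = max(l[i][r], l[r][j])
                if via_r < l[i][j]:
                    l[i][j] = via_r
    return l

def mst_edges_by_rule(c, l):
    # the rule of the exercise: edge (i, j) is in the unique minimum cost
    # spanning tree iff its own cost (l^0_ij) equals l^n_ij
    n = len(c)
    return [(i, j) for i in range(n) for j in range(i + 1, n)
            if c[i][j] < math.inf and c[i][j] == l[i][j]]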
9.6 References
S. AKL, 1986, “An Adaptive and Cost-Optimal Parallel Algorithm for Minimum
Spanning Trees,” Computing, 36(271-277).
I. ALI and J. KENNINGTON, 1986, “The Asymmetric M-Traveling Salesman
Problem: A Duality Based Branch and Bound Algorithm,” DAM, 13(259-276).
O. BORUVKA, 1926, “O Jistém Problému Minimálním,” Práce Moravské Přírodovědecké
Společnosti, 3(37-58), in Czech.
P. M. CAMERINI, Jan. 1978, “The Min-Max Spanning Tree Problem and Some
Extensions,” IPL, 7, no. 1(10-14).
P. M. CAMERINI, L. FRATTA, and F. MAFFIOLI, 1979, “A Note on Finding
Optimum Branchings,” Networks, 9(309-312).
R. CHANDRASEKARAN, 1977, “Minimal Ratio Spanning Trees,” Networks, 7,
no. 4(335-342).
K. M. CHANDY and T. LO, 1973, “The Capacitated Minimum Spanning Tree,”
Networks, 3, no. 2(173-181).
D. CHERITON and R. E. TARJAN, 1976, “Finding Minimum Spanning Trees,”
SIAM Journal on Computing, 5(724-742).
F. CHIN and D. HOUCK, 1978, “Algorithm for Updating Minimum Spanning
Trees,” Journal of Computer and System Science, 16(333-344).
G. CHOQUET, 1938, “Étude de Certains Réseaux de Routes,” C. R. Acad. Sci.
Paris, 206(310-313).
J. EDMONDS, 1970, “Submodular Functions, Matroids and Certain Polyhedra,”
pp. 69-87 in R. Guy (ed.), Combinatorial Structures and Their Applications, Pro-
ceedings of the Calgary International Conference, Gordon and Breach, N.Y.
H. N. GABOW, 1978, “A Good Algorithm for Smallest Spanning Tree With a De-
gree Constraint,” Networks, 8(201-208).
H. N. GABOW and R. E. TARJAN, Mar. 1984, “Efficient Algorithms for a Family
of Matroid Intersection Problems,” Journal of Algorithms, 5, no. 1(80-131).
R. G. GALLAGER, P. A. HUMBLET, and P. M. SPIRA, 1983, “A Distributed
Algorithm for Minimum-Weight Spanning Trees,” ACM Transactions on Program-
ming Languages and Systems, 5(66-77).
B. GAVISH, 1982, “Topological Design of Centralized Computer Networks - For-
mulations and Algorithms,” Networks, 12(355-377).
R. L. GRAHAM and P. HELL, Jan. 1985, “On the History of the Minimum Span-
ning Tree Problem,” Ann. History of Computing, 7, no. 1(43-57).
M. HELD and R. KARP, 1970, “The Traveling Salesman Problem and Minimum
Spanning Trees,” OR, 18(1138-1162).
T. C. HU, 1961, “The Maximum Capacity Route Problem,” OR, 9(898-900).
V. JARNIK, 1930, “O Jistém Problému Minimálním,” Práce Moravské Přírodovědecké
Společnosti, 6(57-63), in Czech.
T. A. JENKYNS, 1986, “The Greedy Algorithm is a Shady Lady,” Congressus Nu-
merantium, 51(209-215).
D. B. JOHNSON, 1975, “Priority Queues with Update and Finding Minimum Span-
ning Trees,” IPL, 4(53-57).
A. KANG and A. AULT, Sept. 1975, “Some Properties of a Centroid of a Free
Tree,” IPL, 4, no. 1(18-20).
A. N. C. KANG, R. C. T. LEE, C. CHANG, and S. K. CHANG, May 1977, “Storage
Reduction Through Minimal Spanning Trees and Spanning Forests,” IEEE