4. GREEDY ALGORITHMS II
‣ Dijkstra′s algorithm
‣ minimum spanning trees
‣ Prim, Kruskal, Boruvka
‣ single-link clustering
‣ min-cost arborescences
SECTION 4.4
Single-pair shortest path problem
Problem. Given a digraph G = (V, E), edge lengths ℓe ≥ 0, source s ∈ V,
and destination t ∈ V, find a shortest directed path from s to t.

[figure: digraph with source s and destination t; length of highlighted path = 9 + 4 + 1 + 11 = 25]
Single-source shortest paths problem
Problem. Given a digraph G = (V, E), edge lengths ℓe ≥ 0, and source s ∈ V,
find a shortest directed path from s to every node.

[figure: digraph with source s; shortest-paths tree from s highlighted]
Shortest paths: quiz 1
A. Add 17.
B. Multiply by 17.
C. Either A or B.
D. Neither A nor B.

[figure: digraph with source s, destination t, and edge lengths]
Shortest paths: quiz 2
Shortest path applications
・PERT/CPM.
・Map routing.
・Seam carving.
・Robot navigation.
・Texture mapping.
・Typesetting in LaTeX.
・Urban traffic planning.
・Telemarketer operator scheduling.
・Routing of telecommunications messages.
・Network routing protocols (OSPF, BGP, RIP).
・Optimal truck routing through given traffic congestion pattern.
Dijkstra′s algorithm (for single-source shortest paths problem)
Greedy approach. Maintain a set of explored nodes S for which the algorithm has
determined d[u] = length of a shortest s↝u path.
・Initialize S ← { s }, d[s] ← 0.
・Repeatedly choose the unexplored node v ∉ S that minimizes
  π(v) = min { d[u] + ℓe : e = (u, v), u ∈ S },
  add v to S, and set d[v] ← π(v).

[figure: explored set S, node u ∈ S with distance d[u], edge e = (u, v) of length ℓe leaving S]
Dijkstra′s algorithm (for single-source shortest paths problem)
[figure: node v ∉ S added to S; d[v] ← π(v) via the minimizing edge e = (u, v) of length ℓe]
Dijkstra′s algorithm: proof of correctness
Invariant. For each node u ∈ S: d[u] = length of a shortest s↝u path.
Pf. [ by induction on ⎜S⎟ ]
Let v be the next node added to S, and let (u, v) be the chosen edge, so that a
shortest s↝u path plus (u, v) is an s↝v path of length π(v). Consider any other
s↝v path P. Let e = (x, y) be the first edge in P that leaves S, and let P′ be the
subpath of P from s to x. The length of P is already ≥ π(v) as soon as it reaches y:

  ℓ(P) ≥ ℓ(P′) + ℓe ≥ d[x] + ℓe ≥ π(y) ≥ π(v)  ▪

[figure: path P leaving S at edge e = (x, y)]
Dijkstra′s algorithm: efficient implementation
Critical optimization. For each unexplored node v ∉ S, explicitly maintain
π[v] = min { d[u] + ℓe : e = (u, v), u ∈ S } rather than recomputing it from the
definition each time; the next node to explore is the one with minimum π[v].
・More specifically, suppose u is added to S and there is an edge e = (u, v) leaving
  u. Then, it suffices to update: π[v] ← min { π[v], d[u] + ℓe }.
Dijkstra′s algorithm: efficient implementation
Implementation.
・Algorithm maintains π[v] for each node v.
・Priority queue stores unexplored nodes, using π[⋅] as priorities.
・Once u is deleted from the PQ, π[u] = length of a shortest s↝u path.

DIJKSTRA (V, E, ℓ, s)
______________________________________________________________________
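As a concrete companion to the pseudocode header above, here is a minimal Python sketch of this priority-queue implementation. Since Python's heapq has no DECREASE-KEY operation, the sketch pushes duplicate entries and skips stale ones on deletion, a common substitute; the adjacency-list format and function name are illustrative assumptions, not from the slides.

```python
import heapq

def dijkstra(adj, s):
    """adj[u] = list of (v, length) pairs with length >= 0; every node is a key of adj.
    Returns dist[v] = length of a shortest s~>v path (inf if unreachable)."""
    dist = {u: float("inf") for u in adj}
    dist[s] = 0
    pq = [(0, s)]                     # priority queue of (pi[v], v)
    explored = set()                  # the set S of the slides
    while pq:
        d_u, u = heapq.heappop(pq)
        if u in explored:             # stale entry (no DECREASE-KEY in heapq)
            continue
        explored.add(u)
        for v, length in adj[u]:      # relax edges leaving u
            if d_u + length < dist[v]:
                dist[v] = d_u + length
                heapq.heappush(pq, (dist[v], v))
    return dist

# tiny usage example (hypothetical graph)
adj = {"s": [("a", 9), ("b", 14)], "a": [("b", 2), ("t", 11)],
       "b": [("t", 6)], "t": []}
print(dijkstra(adj, "s"))             # {'s': 0, 'a': 9, 'b': 11, 't': 17}
```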
Dijkstra′s algorithm: which priority queue?
priority queue                    INSERT           DELETE-MIN       DECREASE-KEY     total
node-indexed array
(A[i] = priority of i)            O(1)             O(n)             O(1)             O(n²)
d-way heap
(Johnson 1975)                    O(d log_d n)     O(d log_d n)     O(log_d n)       O(m log_{m/n} n)
Fibonacci heap
(Fredman–Tarjan 1984)             O(1)             O(log n) †       O(1) †           O(m + n log n)

† amortized; totals assume m ≥ n
Shortest paths: quiz 3
C. Either A or B.
D. Neither A nor B.
Shortest paths: quiz 3
Extensions of Dijkstra′s algorithm
Dijkstra′s algorithm and proof extend to several related problems:
・Maximum capacity paths: π[v] ≥ min { π[u], c(u, v) }.
・Maximum reliability paths: π[v] ≥ π[u] ⋅ γ(u, v).
・…

Key algebraic structure. Closed semiring (min-plus, bottleneck, Viterbi, …).

[background: excerpt from a functional-programming paper on closed semirings and linear algebra]
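To make the "maximum capacity paths" bullet concrete, here is a hedged Python sketch of the same greedy loop with the relaxation rule swapped to the bottleneck form π[v] = max over explored u of min { π[u], c(u, v) }; it uses a max-oriented heap via negated keys. The adjacency-list format and function name are illustrative, not from the slides.

```python
import heapq

def widest_path(adj, s):
    """adj[u] = list of (v, capacity) pairs.
    Returns cap[v] = maximum bottleneck capacity over all s~>v paths."""
    cap = {u: 0 for u in adj}
    cap[s] = float("inf")             # the empty path has unbounded capacity
    pq = [(-cap[s], s)]               # negate keys: heapq is a min-heap
    explored = set()
    while pq:
        c_u, u = heapq.heappop(pq)
        c_u = -c_u
        if u in explored:
            continue
        explored.add(u)
        for v, c_uv in adj[u]:
            bottleneck = min(c_u, c_uv)   # capacity of the path through u, then (u, v)
            if bottleneck > cap[v]:
                cap[v] = bottleneck
                heapq.heappush(pq, (-bottleneck, v))
    return cap

adj = {"s": [("a", 5), ("b", 3)], "a": [("b", 4), ("t", 2)],
       "b": [("t", 6)], "t": []}
print(widest_path(adj, "s"))          # {'s': inf, 'a': 5, 'b': 4, 't': 4}
```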
GOOGLE’S FOO.BAR CHALLENGE
You have maps of parts of the space station, each starting at a prison exit and
ending at the door to an escape pod. The map is represented as a matrix of 0s
and 1s, where 0s are passable space and 1s are impassable walls. The door out
of the prison is at the top left (0, 0) and the door into an escape pod is at the
bottom right (w−1, h−1).
Write a function that generates the length of a shortest path from the prison door
to the escape pod, where you are allowed to remove one wall as part of your
…
GOOGLE’S FOO.BAR CHALLENGE
[figure: two copies G1 and G2 of the grid graph, with destinations t1 and t2]
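A plausible reading of the figure is the standard two-layer construction: layer G1 is the grid before any wall has been removed, layer G2 is the grid after the single allowed removal, and stepping onto a wall moves you from G1 to G2. Under that assumption, here is a minimal Python BFS sketch over states (row, column, walls removed); the function name and I/O format are illustrative.

```python
from collections import deque

def shortest_escape(grid):
    """grid[r][c] == 0 is passable, 1 is a wall; at most one wall may be removed.
    Returns the number of cells on a shortest path from (0, 0) to (h-1, w-1)."""
    h, w = len(grid), len(grid[0])
    start = (0, 0, 0)                 # state = (row, col, walls removed so far)
    dist = {start: 1}                 # count cells visited, matching the puzzle statement
    queue = deque([start])
    while queue:
        r, c, used = queue.popleft()
        if (r, c) == (h - 1, w - 1):
            return dist[(r, c, used)]
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < h and 0 <= nc < w):
                continue
            nused = used + grid[nr][nc]       # stepping onto a wall "removes" it
            state = (nr, nc, nused)
            if nused <= 1 and state not in dist:
                dist[state] = dist[(r, c, used)] + 1
                queue.append(state)
    return -1                                  # no escape route exists

print(shortest_escape([[0, 1, 1, 0],
                       [0, 0, 0, 1],
                       [1, 1, 0, 0],
                       [1, 1, 1, 0]]))         # 7
```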
Edsger Dijkstra
The moral implications of implementing shortest-path algorithms
https://www.facebook.com/pg/npcompleteteens
4. G REEDY A LGORITHMS II
‣ Dijkstra′s algorithm
‣ minimum spanning trees
‣ Prim, Kruskal, Boruvka
‣ single-link clustering
‣ min-cost arborescences
SECTION 6.1
Cycles
Def. A cycle is a path with no repeated nodes or edges other than the starting and
ending nodes.
[figure: 8-node graph illustrating path P and cycle C]
path P = { (1, 2), (2, 3), (3, 4), (4, 5), (5, 6) }
cycle C = { (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 1) }
Cuts
Def. A cut is a partition of the nodes into two nonempty subsets S and V – S.
Def. The cutset of a cut S is the set of edges with exactly one endpoint in S.
[figure: 8-node graph with cut S highlighted]
cut S = { 4, 5, 8 }
cutset D = { (3, 4), (3, 5), (5, 6), (5, 7), (8, 7) }
Minimum spanning trees: quiz 1
B. 1–7
C. 5–7
D. 2–3
[figure: graph on nodes 1–7 with a highlighted cut]
Minimum spanning trees: quiz 2
A. 0
B. 2
C. not 1
D. an even number
Cycle–cut intersection
[figure: 8-node graph with cycle C and cutset D]
cycle C = { (1, 2), (2, 3), (3, 4), (4, 5), (5, 6), (6, 1) }
cutset D = { (3, 4), (3, 5), (5, 6), (5, 7), (8, 7) }
intersection C ∩ D = { (3, 4), (5, 6) }
Cycle–cut intersection
Proposition. A cycle and a cutset intersect in an even number of edges.

[figure: cut S crossed by cycle C]
Spanning tree definition
Def. Let H = (V, T) be a subgraph of an undirected graph G = (V, E).
H is a spanning tree of G if H is both acyclic and connected.

[figure: graph G = (V, E) and a spanning tree H = (V, T)]
Minimum spanning trees: quiz 3
Which of the following properties are true for all spanning trees H?
graph G = (V, E)
spanning tree H = (V, T)
Spanning tree properties
graph G = (V, E)
spanning tree H = (V, T)
A tree containing a cycle
https://maps.roadtrippers.com/places/46955/photos/374771356
Minimum spanning tree (MST)
Def. Given a connected, undirected graph G = (V, E) with edge costs ce,
a minimum spanning tree (V, T ) is a spanning tree of G such that the sum
of the edge costs in T is minimized.
[figure: weighted graph with MST highlighted; MST cost = 50 = 4 + 6 + 8 + 5 + 11 + 9 + 7]
Cayley’s theorem. The complete graph on n nodes has n^(n–2) spanning trees.
Minimum spanning trees: quiz 4
B. c′(e) = 17 ⋅ c(e).
Applications
Fundamental cycle
Def. Let H = (V, T) be a spanning tree of G = (V, E). For any non-tree edge e ∈ E,
T ∪ { e } contains a unique cycle, say C; for any edge f ∈ C, (V, T ∪ { e } – { f })
is a spanning tree.

[figure: graph G = (V, E) and spanning tree H = (V, T)]

Fundamental cutset
Def. Let H = (V, T) be a spanning tree of G = (V, E). For any tree edge f ∈ T,
deleting f from T splits H into two components; let D be the corresponding cutset.
For any edge e ∈ D, (V, T – { f } ∪ { e }) is a spanning tree.

[figure: graph G = (V, E) and spanning tree H = (V, T)]
Red rule.
・Let C be a cycle with no red edges.
・Select an uncolored edge of C of max cost and color it red.
Blue rule.
・Let D be a cutset with no blue edges.
・Select an uncolored edge in D of min cost and color it blue.
Greedy algorithm.
・Apply the red and blue rules (nondeterministically!) until all edges
are colored. The blue edges form an MST.
・Note: can stop once n – 1 edges colored blue.
Greedy algorithm: proof of correctness
Color invariant. There exists an MST (V, T*) containing every blue edge
and no red edge.
Pf. [ by induction on number of iterations ]
Base case. No edges colored ⇒ every MST satisfies the invariant.
Greedy algorithm: proof of correctness
Color invariant. There exists an MST (V, T*) containing every blue edge
and no red edge.
Pf. [ by induction on number of iterations ]
Induction step (blue rule). Suppose color invariant true before blue rule.
・let D be chosen cutset, and let f be edge colored blue.
・if f ∈ T*, then T* still satisfies invariant.
・Otherwise, consider fundamental cycle C by adding f to T*.
・let e ∈ C be another edge in D.
・e is uncolored and ce ≥ cf since
  - e ∈ T* ⇒ e not red
  - blue rule ⇒ e not blue and ce ≥ cf
・Thus, T* ∪ { f } – { e } satisfies invariant.

[figure: cut D containing edges e and f; MST T*]
Greedy algorithm: proof of correctness
Color invariant. There exists an MST (V, T*) containing every blue edge
and no red edge.
Pf. [ by induction on number of iterations ]
Induction step (red rule). Suppose color invariant true before red rule.
・let C be chosen cycle, and let e be edge colored red.
・if e ∉ T*, then T* still satisfies invariant.
・Otherwise, consider fundamental cutset D by deleting e from T*.
・let f ∈ D be another edge in C.
・f is uncolored and ce ≥ cf since
  - f ∉ T* ⇒ f not blue
  - red rule ⇒ f not red and ce ≥ cf
・Thus, T* ∪ { f } – { e } satisfies invariant. ▪

[figure: cutset D containing edges e and f; MST T*]
Greedy algorithm: proof of correctness
Case 1
Greedy algorithm: proof of correctness
Case 2
4. G REEDY A LGORITHMS II
‣ Dijkstra′s algorithm
‣ minimum spanning trees
‣ Prim, Kruskal, Boruvka
‣ single-link clustering
‣ min-cost arborescences
SECTION 6.2
Prim′s algorithm
Initialize S ← { s } for any node s, T ← ∅.
Repeat n – 1 times:
・Add to T a min-cost edge with exactly one endpoint in S.
・Add the other endpoint to S.

Theorem. Prim′s algorithm computes an MST.
Pf. Special case of greedy algorithm (apply blue rule to the cutset corresponding to S).
Prim′s algorithm: implementation
PRIM (V, E, c)
______________________________________________________________________
S ← ∅, T ← ∅.
s ← any node in V.
FOREACH v ≠ s : π[v] ← ∞, pred[v] ← null; π[s] ← 0.
Create an empty priority queue pq.
FOREACH v ∈ V : INSERT(pq, v, π[v]).
WHILE (IS-NOT-EMPTY(pq))                ⟨ π[v] = cost of cheapest known edge between v and S ⟩
  u ← DEL-MIN(pq).
  S ← S ∪ { u }, T ← T ∪ { pred[u] }.
  FOREACH edge e = (u, v) ∈ E with v ∉ S :
    IF (ce < π[v])
      DECREASE-KEY(pq, v, ce).
      π[v] ← ce; pred[v] ← e.
______________________________________________________________________
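A minimal Python rendering of the PRIM pseudocode above, in the "lazy" variant: instead of maintaining one key π[v] per node with DECREASE-KEY, it pushes candidate edges into a heapq and skips those whose far endpoint is already in S. The adjacency-list format and function name are illustrative assumptions.

```python
import heapq

def prim(adj):
    """adj[u] = list of (v, cost) pairs of an undirected, connected graph.
    Returns a list of MST edges (u, v, cost)."""
    start = next(iter(adj))
    explored = {start}                     # the set S
    tree = []
    pq = [(cost, start, v) for v, cost in adj[start]]
    heapq.heapify(pq)
    while pq and len(explored) < len(adj):
        cost, u, v = heapq.heappop(pq)     # cheapest candidate edge leaving S
        if v in explored:                  # both endpoints already in S: skip
            continue
        explored.add(v)
        tree.append((u, v, cost))
        for w, c in adj[v]:
            if w not in explored:
                heapq.heappush(pq, (c, v, w))
    return tree

adj = {1: [(2, 4), (3, 6)], 2: [(1, 4), (3, 5)], 3: [(1, 6), (2, 5)]}
print(prim(adj))    # [(1, 2, 4), (2, 3, 5)]
```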
Kruskal′s algorithm
Consider edges in ascending order of cost:
・Add edge to tree T unless doing so would create a cycle.

Theorem. Kruskal′s algorithm computes an MST.
Pf. Special case of the greedy algorithm.
Kruskal′s algorithm: implementation
KRUSKAL (V, E, c)
______________________________________________________________________
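As a concrete companion to KRUSKAL (V, E, c), here is a minimal Python sketch of the standard UNION-FIND implementation (path compression; union by rank omitted for brevity). The function name and edge format are illustrative assumptions.

```python
def kruskal(n, edges):
    """n nodes labeled 0..n-1; edges = list of (cost, u, v).
    Returns a list of MST edges (cost, u, v)."""
    parent = list(range(n))

    def find(x):                         # root of x's component, with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for cost, u, v in sorted(edges):     # consider edges in ascending order of cost
        ru, rv = find(u), find(v)
        if ru != rv:                     # u and v in different trees: no cycle created
            parent[ru] = rv
            tree.append((cost, u, v))
    return tree

edges = [(4, 0, 1), (6, 0, 2), (5, 1, 2)]
print(kruskal(3, edges))                 # [(4, 0, 1), (5, 1, 2)]
```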
Reverse-delete algorithm
Start with all edges in T and consider them in descending order of cost:
・Delete edge from T unless it would disconnect T.

Theorem. The reverse-delete algorithm computes an MST.
Pf. Special case of greedy algorithm.
・Case 1. [ deleting edge e does not disconnect T ]
  ⇒ apply red rule to cycle C formed by adding e to another path
  in T between its two endpoints
  (no edge in C is more expensive: it would have already been considered and deleted)

Fact. [Thorup 2000] Can be implemented to run in O(m log n (log log n)^3) time.
Review: the greedy MST algorithm
Red rule.
・Let C be a cycle with no red edges.
・Select an uncolored edge of C of max cost and color it red.
Blue rule.
・Let D be a cutset with no blue edges.
・Select an uncolored edge in D of min cost and color it blue.
Greedy algorithm.
・Apply the red and blue rules (nondeterministically!) until all edges
are colored. The blue edges form an MST.
・Note: can stop once n – 1 edges colored blue.
Borůvka′s algorithm
Repeat until there is only one tree (assume edge costs are distinct):
・Apply blue rule to cutset corresponding to each blue tree.
・Color all selected edges blue.

[figure: each blue tree selects the cheapest edge leaving it]
Borůvka′s algorithm: implementation
[figure: Borůvka phases on an example graph]
Borůvka′s algorithm: implementation
Contraction version.
・After each phase, contract each blue tree to a single supernode.
・Delete self-loops and parallel edges (keeping only cheapest one).
・Borůvka phase becomes: take cheapest edge incident to each node.
[figure: graph G; contracting edge 2–5 merges nodes 2 and 5 into a supernode, with self-loops deleted and parallel edges reduced to the cheapest one]
Q. How to contract a set of edges?
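A compact Python sketch of the Borůvka phase loop described above; for simplicity it tracks the blue trees with a union-find structure rather than physically contracting the graph, which is an implementation shortcut relative to the contraction version (distinct edge costs assumed so that the cheapest incident edge is unambiguous). Names and formats are illustrative.

```python
def boruvka(n, edges):
    """n nodes 0..n-1; edges = list of (cost, u, v) with distinct costs.
    Returns the list of MST edges for a connected graph."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree, num_trees = [], n
    while num_trees > 1:
        # cheapest edge incident to each current blue tree (component)
        best = {}
        for cost, u, v in edges:
            ru, rv = find(u), find(v)
            if ru == rv:
                continue                         # would be a self-loop after contraction
            for r in (ru, rv):
                if r not in best or cost < best[r][0]:
                    best[r] = (cost, u, v)
        for cost, u, v in best.values():
            ru, rv = find(u), find(v)
            if ru != rv:                         # the same edge may be chosen by both trees
                parent[ru] = rv
                tree.append((cost, u, v))
                num_trees -= 1
    return tree

edges = [(4, 0, 1), (6, 0, 2), (5, 1, 2)]
print(boruvka(3, edges))                         # [(4, 0, 1), (5, 1, 2)]
```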
CONTRACT A SET OF EDGES
Problem. Given a graph G = (V, E) and a set of edges F, contract all edges
in F, removing any self-loops or parallel edges.
graph G
contracted graph G′
CONTRACT A SET OF EDGES
Problem. Given a graph G = (V, E) and a set of edges F, contract all edges
in F, removing any self-loops or parallel edges.
Solution.
・Compute the n′ connected components in (V, F).
・Suppose id[u] = i means node u is in connected component i.
・The contracted graph G′ has n′ nodes.
・For each edge u–v ∈ E, add an edge i–j to G′, where i = id[u] and j = id[v].
Removing self loops. Easy.
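A minimal Python sketch of this contraction step: it finds the connected components of (V, F) with a union-find structure, relabels nodes by component id, and drops self-loops and all but the cheapest parallel edge. The edge-list format of (cost, u, v) triples is an illustrative assumption.

```python
def contract(n, edges, F):
    """n nodes 0..n-1; edges and F are lists of (cost, u, v), F a subset of edges.
    Returns (n', contracted_edges) after contracting every edge of F."""
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    for _, u, v in F:                       # connected components of (V, F)
        parent[find(u)] = find(v)
    roots = sorted({find(v) for v in range(n)})
    comp_id = {r: i for i, r in enumerate(roots)}   # id[u] = component index

    cheapest = {}                           # keep only the cheapest parallel edge
    for cost, u, v in edges:
        i, j = comp_id[find(u)], comp_id[find(v)]
        if i == j:
            continue                        # self-loop after contraction
        key = (min(i, j), max(i, j))
        if key not in cheapest or cost < cheapest[key]:
            cheapest[key] = cost
    contracted = [(cost, i, j) for (i, j), cost in cheapest.items()]
    return len(roots), contracted

n2, e2 = contract(4, [(1, 0, 1), (5, 0, 2), (3, 1, 2), (2, 2, 3)], [(1, 0, 1)])
print(n2, e2)    # 3 [(3, 0, 1), (2, 1, 2)]
```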
Borůvka–Prim algorithm.
・Run Borůvka (contraction version) for log2 log2 n phases.
・Run Prim on resulting, contracted graph.
Theorem. Borůvka–Prim computes an MST.
Pf. Special case of the greedy algorithm.
Running time of the Prim phase on the contracted graph (at most n / log n nodes):
O(m + (n / log n) log (n / log n)) = O(m + n).
Does a linear-time compare-based MST algorithm exist?
MINIMUM BOTTLENECK SPANNING TREE
Problem. Given a connected graph G with positive edge costs, find a spanning tree
that minimizes the most expensive edge.

[figure: weighted graph with a minimum bottleneck spanning tree highlighted]

Solution 2.
・Sort the edges in increasing order of cost e1 ≤ e2 ≤ … ≤ em.

Note. This algorithm can be improved to O(m) time using linear-time median finding
and edge contractions.
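The remaining steps of Solution 2 are not shown in detail above; as a simplified illustration of the idea, here is a Python sketch that sorts the edges and binary-searches for the smallest bottleneck value c such that the edges of cost ≤ c connect the graph. The O(m) version described in the Note replaces the binary search with linear-time median finding and edge contraction; this simpler variant runs in O(m log m) and is an assumption-laden stand-in, not the slide's exact algorithm.

```python
def min_bottleneck_value(n, edges):
    """n nodes 0..n-1; edges = list of (cost, u, v), graph assumed connected.
    Returns the smallest value c such that edges of cost <= c span the graph."""
    def connects(threshold_index):
        # do edges[0 .. threshold_index] connect all n nodes? (union-find test)
        parent = list(range(n))
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        components = n
        for cost, u, v in edges[:threshold_index + 1]:
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv
                components -= 1
        return components == 1

    edges = sorted(edges)                       # e1 <= e2 <= ... <= em
    lo, hi = 0, len(edges) - 1                  # binary search over edge indices
    while lo < hi:
        mid = (lo + hi) // 2
        if connects(mid):
            hi = mid
        else:
            lo = mid + 1
    return edges[lo][0]

edges = [(4, 0, 1), (10, 0, 2), (9, 1, 2), (6, 1, 3), (5, 2, 3)]
print(min_bottleneck_value(4, edges))           # 6
```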
THE MIN-MAX SPANNING TREE PROBLEM AND SOME EXTENSIONS
P. M. Camerini
Istituto di Elettrotecnica ed Elettronica, Politecnico di Milano,
Piazza L. da Vinci 32, 20133 Milano, Italy
4. GREEDY ALGORITHMS II
‣ Dijkstra′s algorithm
‣ minimum spanning trees
‣ Prim, Kruskal, Boruvka
‣ single-link clustering
‣ min-cost arborescences
SECTION 4.7
Clustering
Goal. Given a set U of n objects labeled p1, …, pn, partition into clusters so that
objects in different clusters are far apart.
Applications.
・Routing in mobile ad-hoc networks.
・Document categorization for web search.
・Similarity searching in medical image databases.
・Cluster celestial objects into stars, quasars, galaxies.
・…
Clustering of maximum spacing
k-clustering. Divide objects into k non-empty groups.
Spacing. Min distance between any pair of objects in different clusters.
Goal. Given an integer k, find a k-clustering of maximum spacing.

[figure: a 4-clustering; spacing = distance between the two closest clusters]
Greedy clustering algorithm
Single-link k-clustering algorithm.
・Form a graph on the node set U, corresponding to n clusters.
・Repeatedly find the closest pair of objects in different clusters and merge their
  two clusters, until exactly k clusters remain.

Key observation. This procedure is precisely Kruskal′s algorithm
(except we stop when there are k connected components).
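A minimal Python sketch of this Kruskal-based procedure: sort the pairwise distances and union clusters until k components remain; the spacing of the resulting clustering is the next edge that would have been added. The input format (a symmetric distance matrix) is an illustrative assumption.

```python
def single_link_clustering(dist, k):
    """dist[i][j] = distance between objects i and j (symmetric); k = #clusters.
    Returns (cluster label per object, spacing of the k-clustering)."""
    n = len(dist)
    parent = list(range(n))

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    pairs = sorted((dist[i][j], i, j) for i in range(n) for j in range(i + 1, n))
    clusters = n
    for d, i, j in pairs:
        ri, rj = find(i), find(j)
        if ri == rj:
            continue
        if clusters == k:
            spacing = d          # closest pair lying in different clusters
            break
        parent[ri] = rj          # merge the two closest clusters
        clusters -= 1
    labels = [find(i) for i in range(n)]
    return labels, spacing

dist = [[0, 1, 9, 10],
        [1, 0, 8, 9],
        [9, 8, 0, 2],
        [10, 9, 2, 0]]
print(single_link_clustering(dist, 2))   # ([1, 1, 3, 3], 8)
```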
Dendrogram of cancers in human
[figure: dendrogram/heat map of gene expression, rows = genes 1..n; legend: gene expressed / gene not expressed. Reference: Botstein & Brown group]
Minimum spanning trees: quiz 5
A. Kruskal (stop when there are k components).   ⟨ number of objects n can be very large ⟩
C. Either A or B.
D. Neither A nor B.
4. G REEDY A LGORITHMS II
‣ Dijkstra′s algorithm
‣ minimum spanning trees
‣ Prim, Kruskal, Boruvka
‣ single-link clustering
‣ min-cost arborescences
SECTION 4.9
Arborescences
Def. Given a digraph G = (V, E) and a root r ∈ V, an arborescence (rooted at r) is
a subgraph T = (V, F) such that T is a spanning tree of G if we ignore the direction
of edges, and there is a directed path in T from r to each other node v ∈ V.
Minimum spanning arborescence: quiz 1
A. No directed cycles.
B. Exactly n − 1 edges.
Arborescences
Proposition. A subgraph T = (V, F) of G is an arborescence rooted at r iff
T has no directed cycles and each node v ≠ r has exactly one entering edge.
Pf.
⇐ Suppose T has no cycles and each node v ≠ r has one entering edge.
・To construct an r↝v path, start at v and repeatedly follow edges in the backward
direction.
・Since T has no directed cycles, the process must terminate.
・It must terminate at r since r is the only node with no entering edge. ▪
Minimum spanning arborescence: quiz 2
C. Either A or B.
D. Neither A nor B.
Min-cost arborescence problem
Problem. Given a digraph G with root node r and edge costs c(u, v) ≥ 0,
find an arborescence rooted at r of minimum cost.

[figure: example digraph with edge costs]
Minimum spanning arborescence: quiz 3
A sufficient optimality condition
Property. For each node v ≠ r, choose one cheapest edge entering v,
and let F* denote this set of n – 1 edges.
If (V, F*) is an arborescence, then it is a min-cost arborescence.

[figure: example digraph; F* highlighted]
A sufficient optimality condition
Pf. An arborescence needs exactly one edge entering each node v ≠ r,
and (V, F*) uses only the cheapest such edges. ▪

[figure: example digraph; F* highlighted]
Reduced costs
Def. For each v ≠ r, let y(v) denote the min cost of any edge entering v.
Define the reduced cost of an edge (u, v) as c′(u, v) = c(u, v) – y(v) ≥ 0.

Observation. T is a min-cost arborescence in G with costs c iff T is a min-cost
arborescence with reduced costs c′, since every arborescence has exactly one edge
entering each node v ≠ r, so its cost changes by exactly Σv y(v).

[figure: example with original costs c and reduced costs c′; y(v) shown per node]
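A tiny Python sketch of this reduction, computing y(v) and the reduced costs for an edge list of (u, v, cost) triples; the format and node labels are illustrative.

```python
def reduced_costs(edges, root):
    """edges: list of (u, v, cost) with cost >= 0. Returns (y, reduced) where
    y[v] = min cost of an edge entering v, and reduced = edges with cost - y[v]."""
    y = {}
    for u, v, cost in edges:
        if v != root:
            y[v] = min(cost, y.get(v, float("inf")))
    reduced = [(u, v, cost - y[v] if v != root else cost) for u, v, cost in edges]
    return y, reduced

edges = [("r", "a", 7), ("r", "b", 4), ("a", "b", 1), ("b", "a", 3)]
print(reduced_costs(edges, "r"))
# ({'a': 3, 'b': 1}, [('r', 'a', 4), ('r', 'b', 3), ('a', 'b', 0), ('b', 'a', 0)])
```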
Edmonds branching algorithm: intuition
[figure: reduced costs shown on edges; F* = thick edges, one 0-cost edge entering each node v ≠ r]
Edmonds branching algorithm: intuition
[figure: the directed cycle of 0-cost edges in F* is contracted to a supernode]
Edmonds branching algorithm
EDMONDS-BRANCHING (G, r, c)
______________________________________________________________________
FOREACH v ≠ r :
  y(v) ← min cost of any edge entering v.
  c′(u, v) ← c(u, v) – y(v) for each edge (u, v) entering v.
FOREACH v ≠ r : choose one 0-cost edge entering v and let F*
  be the resulting set of edges.
IF (F* forms an arborescence) RETURN T = (V, F*).
ELSE
  C ← directed cycle in F*.
  Contract C to a single supernode, yielding G′ = (V′, E′).
  T′ ← EDMONDS-BRANCHING(G′, r, c′).
  Extend T′ to an arborescence T in G by adding all but one edge of C.
RETURN T.
______________________________________________________________________
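A compact Python sketch of this algorithm in its standard iterative form (the Chu–Liu/Edmonds scheme): each round picks the cheapest edge entering every node, adds those costs to the total, contracts any directed cycle among the chosen edges, and replaces entering edges by their reduced costs. It returns only the cost of a min-cost arborescence (None if some node is unreachable from the root); recovering the edge set itself needs extra bookkeeping omitted here, and the instance at the bottom is hypothetical.

```python
def min_cost_arborescence(n, edges, root):
    """n nodes 0..n-1; edges = list of (u, v, cost) with cost >= 0.
    Returns the cost of a min-cost arborescence rooted at root, or None."""
    INF = float("inf")
    total = 0
    while True:
        # cheapest edge entering each node v != root  (this is y(v))
        in_cost = [INF] * n
        in_from = [-1] * n
        for u, v, c in edges:
            if u != v and v != root and c < in_cost[v]:
                in_cost[v], in_from[v] = c, u
        if any(v != root and in_from[v] == -1 for v in range(n)):
            return None                      # some node has no entering edge
        in_cost[root] = 0
        total += sum(in_cost)                # add y(v) for every v != root
        # detect directed cycles among the chosen edges F*
        comp = [-1] * n                      # supernode id (cycle nodes first)
        seen = [-1] * n
        num = 0
        for s in range(n):
            v = s
            while v != root and seen[v] == -1 and comp[v] == -1:
                seen[v] = s
                v = in_from[v]
            if v != root and comp[v] == -1 and seen[v] == s:
                comp[v] = num                # found a new cycle through v
                u = in_from[v]
                while u != v:
                    comp[u] = num
                    u = in_from[u]
                num += 1
        if num == 0:
            return total                     # F* is already an arborescence
        for v in range(n):                   # every non-cycle node becomes its own supernode
            if comp[v] == -1:
                comp[v] = num
                num += 1
        # contract: keep edges between different supernodes, with reduced costs
        edges = [(comp[u], comp[v], c - in_cost[v])
                 for u, v, c in edges if comp[u] != comp[v]]
        root, n = comp[root], num

edges = [(0, 1, 7), (0, 2, 4), (1, 2, 1), (2, 1, 3)]   # small hypothetical digraph
print(min_cost_arborescence(3, edges, 0))              # 7  (edges 0->2 and 2->1)
```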
Edmonds branching algorithm
[figure: min-cost arborescence in G′; cycle C with node b]
Edmonds branching algorithm: key lemma
Lemma. Let C be a cycle in G consisting of 0-cost edges. There exists a min-cost
arborescence rooted at r that has exactly one edge entering C.
Pf.
Edmonds branching algorithm: key lemma
Case 2 construction of T *.
・Let (a, b) be an edge in T entering C that lies on a shortest path from r.
・We delete all edges of T that enter a node in C except (a, b).
・We add in all edges of C except the one that enters b.
  (this path from r to C uses only one node in C)

[figure: arborescence T with cycle C and node b]
Edmonds branching algorithm: key lemma
Case 2 construction of T *.
・Let (a, b) be an edge in T entering C that lies on a shortest path from r.
・We delete all edges of T that enter a node in C except (a, b).
・We add in all edges of C except the one that enters b.
  (this path from r to C uses only one node in C)

[figure: resulting arborescence T*]
Edmonds branching algorithm: analysis
Theorem. [Chu–Liu 1965, Edmonds 1967] The greedy algorithm finds a min-cost
arborescence.
Pf. [ by strong induction on number of nodes ]
・If the edges of F* form an arborescence, then it is a min-cost arborescence.
・Otherwise, we use reduced costs, which is equivalent.
・After contracting a 0-cost cycle C to obtain a smaller graph G′,
  the algorithm finds a min-cost arborescence T′ in G′ (by induction).
・Key lemma: there exists a min-cost arborescence T in G that corresponds to T′.
▪