Data Structures and Algorithms
The Basic Toolbox
Corrections and Remarks
Kurt Mehlhorn and Peter Sanders
September 2, 2013
In this document, we collect corrections and remarks. Negative line numbers
count from the bottom of a page.
Corrections
Page 2, line 4 of second paragraph: section → chapter
Page 7, line 1: sc → c
Page 9, line 16: n-bit → n-digit
Page 9, line -5: a1 · b0 → a0 · b0
Page 11, line 17: n-bit → n-digit
Page 11, statement of Theorem 1.7: Then T_K(n) ≤ 99·n^{log 3} + 48·n + 48·log n
→ Then T_K(n) ≤ 207·n^{log 3}.
Page 16, line -3 to Page 17, line 17: We shall show that

    T_K(2^k + 2) ≤ 69·3^k − 24·2^k − 12

for k ≥ 0. For k = 0, we have

    T_K(2^0 + 2) = T_K(3) ≤ 3·3^2 + 2·3 = 33 = 69·3^0 − 24·2^0 − 12 .

For k ≥ 1, we have

    T_K(2^k + 2) ≤ 3·T_K(2^{k−1} + 2) + 12·(2^k + 2)
                ≤ 3·(69·3^{k−1} − 24·2^{k−1} − 12) + 12·(2^k + 2)
                = 69·3^k − 24·2^k − 12 .

Again, there is no magic in coming up with the right induction hypothesis. It is
obtained by repeated substitution. Namely,

    T_K(2^k + 2) ≤ 3·T_K(2^{k−1} + 2) + 12·(2^k + 2)
                ≤ 3^k·T_K(2^0 + 2) + 12·(3^0·(2^k + 2) + 3^1·(2^{k−1} + 2) + · · · + 3^{k−1}·(2^1 + 2))
                ≤ 33·3^k + 12·(2^k · ((3/2)^k − 1)/(3/2 − 1) + 2 · (3^k − 1)/(3 − 1))
                ≤ 69·3^k − 24·2^k − 12 .

It remains to extend the bound to all n. Let k be the minimal integer such that
n ≤ 2^k + 2. Then k ≤ 1 + log n. Also, multiplying n-digit numbers is no more
costly than multiplying (2^k + 2)-digit numbers, and hence

    T_K(n) ≤ 69·3^k − 24·2^k − 12 ≤ 207·3^{log n} ≤ 207·n^{log 3} ,

where the equality 3^{log n} = 2^{(log 3)·(log n)} = n^{log 3} has been used.
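As a quick numerical sanity check (not part of the original text): if the right-hand sides of the recurrence T_K(n) ≤ 3n^2 + 2n for n ≤ 3 and T_K(n) ≤ 3·T_K(⌈n/2⌉ + 1) + 12n otherwise are taken as exact values, the closed form matches exactly on inputs of the form 2^k + 2:

```python
from math import ceil

def t_k(n):
    # Karatsuba cost recurrence, with the bound's right-hand sides taken as exact
    if n <= 3:
        return 3 * n * n + 2 * n
    return 3 * t_k(ceil(n / 2) + 1) + 12 * n

# For n = 2^k + 2 the recursive calls stay on inputs of the same form,
# since ceil((2^k + 2)/2) + 1 = 2^(k-1) + 2, and the closed form is exact.
for k in range(12):
    assert t_k(2**k + 2) == 69 * 3**k - 24 * 2**k - 12
```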
Page 17, Exercise 1.9: Replace the exercise by

    Solve the recurrence

        T_R(n) ≤ 3n^2 + 2n                      if n < n0,
        T_R(n) ≤ 3·T_R(⌈n/2⌉ + 1) + 12n         if n ≥ n0,

    where n0 is a positive integer. Optimize n0.
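For experimentation, the recurrence can be evaluated directly (a sketch of ours, not part of the exercise; the ≤ is again read as =, and n0 ≥ 4 is needed for the recursion to terminate, since ⌈n/2⌉ + 1 < n only for n > 3):

```python
from math import ceil

def t_r(n, n0):
    # Exercise 1.9 recurrence, with the right-hand sides taken as exact
    if n < n0:
        return 3 * n * n + 2 * n
    return 3 * t_r(ceil(n / 2) + 1, n0) + 12 * n

# Comparing a few (arbitrarily chosen) cutoffs hints at how to optimize n0.
for n0 in (4, 8, 16, 32):
    print(n0, t_r(10**6, n0))
```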
Page 22, line -4: constant → constant c.
Page 38, line -9: d = b = 4 → d = b = 2.
Page 39, line 1: W → We
Page 40, third paragraph: replace the third paragraph by the following:

We make the following additional assumptions: b ≥ 2 is integral and a ≤
c(n0 + 1) + da.

We first show R(n − 1) ≤ R(n) for all n by induction on n. If n ≤ n0, the claim
is obvious. We have R(n0) = a ≤ c(n0 + 1) + da = R(n0 + 1) by the additional
assumption. For n > n0 + 1, we have R(n) = cn + R(⌈n/b⌉ + e) ≥ c(n − 1) +
R(⌈(n − 1)/b⌉ + e) = R(n − 1), where the inequality follows from the induction
hypothesis. Observe that ⌈n/b⌉ + e < n since n > n0. We now have monotonicity
of R and hence R(n) ≤ R(s(n)).

We next turn to the solution of the recurrence. We need the additional
assumption¹ b^0 + z ≤ n0.
Let k0 be maximal such that b^{k0} + z ≤ n0. Then k0 ≥ 0, by the assumption.
The recurrence for numbers of the form b^k + z, k ≥ 0, is

    R(b^k + z) = a                                  if k ≤ k0,
    R(b^k + z) = c·(b^k + z) + d·R(b^{k−1} + z)     if k > k0.

For k ≥ k0, the solution to this recurrence is

    R(b^k + z) = a·d^{k−k0} + c · ∑_{0 ≤ i ≤ k−k0−1} d^i·(b^{k−i} + z) ,

as induction on k shows.
We now distinguish cases as in the proof of the master theorem: d < b, d = b,
and d > b.
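The claimed solution can be checked against the recurrence for concrete parameter values. The values below (b = 2, z = 1, a = c = 1, d = 3, and n0 = 3, giving k0 = 1) are illustrative choices of ours, not taken from the book:

```python
def r_rec(k, b, z, a, c, d, k0):
    # R(b^k + z) computed directly from the recurrence
    if k <= k0:
        return a
    return c * (b**k + z) + d * r_rec(k - 1, b, z, a, c, d, k0)

def r_closed(k, b, z, a, c, d, k0):
    # claimed solution: a*d^(k-k0) + c * sum_{0 <= i <= k-k0-1} d^i * (b^(k-i) + z)
    return a * d**(k - k0) + c * sum(d**i * (b**(k - i) + z)
                                     for i in range(k - k0))

# Illustrative parameters: with n0 = 3, k0 = 1 is maximal with 2^1 + 1 <= 3.
b, z, a, c, d, k0 = 2, 1, 1, 1, 3, 1
for k in range(k0, 12):
    assert r_rec(k, b, z, a, c, d, k0) == r_closed(k, b, z, a, c, d, k0)
```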
Page 44, line 7: (pi − p j ) → (p j − pi ).
Page 47, line 2 of second paragraph: smallest L primes with → smallest L
primes p_1, . . . , p_L with
Page 47, line -12: L = 10^{12}·n → L = 10^{12}·(n/k)
Page 55, line 6: K33 → K3,3 .
Page 55: Exercise 2.9 → Exercise 2.20
Page 56: Exercise 2.20 → Exercise 2.21
¹ As it stands, we only know z ≤ n0, since by definition of z, ⌈z/b⌉ + e = z.
Page 56: Exercise 2.21 → Exercise 2.22
Page 61, line -13: elements → items
Page 61, line -2: positions → position
Page 63, lines -15, -9, and -7: element → item
Page 71, line 7: replace “followed by k − 1 zeros” by “followed by k zeros”.
Page 74/75: stacks, queues, and deques also support the operation isEmpty.
Page 83, line 3: m's → m_ℓ's
Page 83, lines -6 and -4: h(e) = k → key(e) = k
Page 85, line 5: We have to define X so as to exclude a possible list element with
key k, which will certainly be in the list regardless of how the hash function is
chosen.
Page 85, line 10: Xi → Xe .
Page 86, line 20: Xi → Xe .
Page 87, line -14: “g : 0..αn → {0, 1, 2}” → “g : 0..m − 1 → {0, 1, 2}”.
Page 87, lines 4, 15, -4: ha (x) → ha (x).
Page 97, line -5 k-wise → k-way
Page 110, Theorem 5.6: 1.45n log n → 1.39n log n.
Page 110, line -2: Sect. 2.8 → Sect. 2.7
Page 114, line 9: e_{⌊n/2⌋} → e′_{⌊n/2⌋}.
Page 123, line 4: C := 0 → C := 1.
Function ABItem::locateRec(k : Key, h : ℕ) : Handle
    i := locateLocally(k)
    if h = 1 then
        if c[i] → e ≥ k then return c[i]
        else return c[i] → next
    else return c[i] → locateRec(k, h − 1)

Figure 1: Corrected version of function locateRec for (a, b)-trees. The framed
pieces are new. (The accompanying tree illustration of an example search with
k = 12 is omitted here.)
Page 131, line 17: no larger → no smaller.
Page 140, line 19: Change “If the minimum is in Q′ and comes from sequence
Si , the next largest element of Si is inserted into Q′ ” into “If the minimum is in Q′
and comes from sequence Si , the first element of Si is moved to Q′ ” for increased
clarity.
Pages 147 and 150: Implementation of locate. Once again, we see that the
phrase “It is clear” is dangerous. Our invariants for both binary search trees and
(a, b)-trees do not guarantee that the leaf x we reach is actually the list element
specified as the result of locate(k). When we remove an element that is still used
as a splitter, we may end up at an element x < k. This is easy to fix, however:
if x < k, return the successor of x in the linked list. Figure 1 shows that
we need to change only two lines of pseudocode. We also need to change the
insertion method: if we end up at an element x < k, we simply swap it with the
element to be inserted before proceeding with the insertion.
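The leaf-level fix can be sketched in isolation. The following is a hypothetical illustration of ours (the names Leaf, make_list, and corrected_locate do not appear in the book): tree navigation may stop at a leaf x with x.key < k when a deleted key survives as a splitter, and the fix moves on to the successor in the underlying sorted linked list:

```python
class Leaf:
    """A leaf of the sorted linked list underlying the search tree."""
    def __init__(self, key):
        self.key = key
        self.next = None  # successor in key order

def make_list(keys):
    # Build the sorted list of leaves and link successors.
    leaves = [Leaf(k) for k in sorted(keys)]
    for a, b in zip(leaves, leaves[1:]):
        a.next = b
    return leaves

def corrected_locate(leaf, k):
    # leaf: where the (possibly stale) tree navigation ended up.
    # If a removed key is still used as a splitter, leaf.key may be < k;
    # the correction then returns the successor in the linked list.
    if leaf.key < k:
        return leaf.next
    return leaf

# Example: key 12 was removed but survives as a splitter, so navigation
# for locate(12) may stop at the leaf 10; the fix returns the leaf 15.
leaves = make_list([5, 10, 15])
assert corrected_locate(leaves[1], 12).key == 15
```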
Page 158, line 1: hh → hk .
Page 162, line -6: two kind of operation → two kinds of operations.
Page 169, line 5: out of V → out of v.
Page 170, line -15: discussed in Chap. 2.9 → discussed in Section 2.9.
Page 172, line 3: Change “disconnected” into “not connected” for increased
linguistic quality.
Page 173, line -17: Change
The algorithms should access the graph data structure only through
a small set of operations, such as . . . . The interface can be captured
in an interface description, and a graph algorithm can be run on any
representation that realizes the interface.
into
The algorithms should access the graph data structure only through a
small interface – a set of operations, such as . . . . An algorithm
that accesses the graph only through this interface can be run on
any representation realizing the interface.
for increased clarity.
Page 187, line -8: than → that
Page 211, lines −9 to −7: d(·) → d[·].
Page 222, Line 4: O(V ) → O(n).
Page 227: The original edge (u0 , v0 ) stored at the end of the priority queue tu-
ples should consistently be put into parentheses.
Page 247: The instance used here violates the assertion that the profit densities
should be sorted. Better use the profit vector p = (10, 24, 14, 20). Furthermore,
∑_{k<i} x_i w_i → ∑_{k<i} x_k w_k, ∑_{k<i} x_i p_i → ∑_{k<i} x_k p_k, and
∑_{j<i} x_i w_i → ∑_{j<i} x_j w_j.
Remarks
Page 40, Section 2.6.2, Theorem 2.6, master theorem for the analysis of
recurrences: Recent papers with many interesting generalizations of the master
theorem are [1, 3, 2].
References
[1] M. Akra and L. Bazzi. On the solution of linear recurrence equations.
Computational Optimization and Applications, 10(2):195–210, 1998.

[2] T. Leighton. Notes on better master theorems for divide and conquer
recurrences. http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.39.1636.

[3] C. Yap. A real elementary approach to the master recurrence and
generalizations. http://www.cs.nyu.edu/cs/faculty/yap/papers/SYNOP.htm.