Lecture 5 Sorting Revisited (Heap and Linear Sorts)

The document covers Heapsort: the properties of heaps, the HEAPIFY operation, and the BUILDHEAP process, with their time complexities. It then proves the Ω(n log n) lower bound for comparison sorting and concludes with the linear-time Counting Sort and Radix Sort algorithms.


SWEN 5012

ADVANCED ALGORITHMS
AND PROBLEM SOLVING

LECTURE 5 LINEAR-TIME SORTING


Beakal Gizachew

Based on slides of David Luebke, Jennifer Welch, Michael Goodrich, Roberto Tamassia, Cevdet Aykanat, and
Alptekin Küpçü
SORTING REVISITED
• So far:
• Quicksort
• O(n²) worst-case, O(n log n) average-case
• O(n log n) expected time for Randomized Quicksort
• Merge Sort
• O(n log n) worst-case running time
• Insertion Sort
• Sorts in-place
• O(n²) worst-case but O(n) best-case
• Next: Heapsort
• Combines advantages of Merge Sort and Insertion Sort
• O(n log n) worst-case, in-place
• Another design paradigm

2
HEAP
• A heap can be seen as a nearly-complete
binary tree: 16

14 10

8 7 9 3

2 4 1

• What makes a binary tree complete?


• Is the example above complete?

3
ARRAY REPRESENTATION
• Represent a nearly-complete binary tree as an array:
• The root node is the first element, A[1]
• Node i is A[i]
• The parent of node i is A[i/2] (integer division)
• The left child of node i is A[2i]
• The right child of node i is A[2i + 1]

16
16 14 10 8 7 9 3 2 4 1

14 10

8 7 9 3

2 4 1

4
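The index arithmetic above can be sketched in Python (the helper names and the example list are mine; slot 0 is left unused so the array stays 1-based as on the slide):

```python
# 1-based heap index arithmetic; A[0] is unused padding.
def parent(i): return i // 2
def left(i):   return 2 * i
def right(i):  return 2 * i + 1

A = [None, 16, 14, 10, 8, 7, 9, 3, 2, 4, 1]  # the heap from the slide

# Children of the root (index 1) are at indices 2 and 3.
assert (A[left(1)], A[right(1)]) == (14, 10)
# The parent of node 5 (value 7) is node 2 (value 14).
assert A[parent(5)] == 14
```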
HEAP PROPERTY
• Heaps also satisfy the heap property:
A[Parent(i)] ≥ A[i] for all nodes i > 1
• The value of a node is at most the value of its
parent
• Where is the largest element in a heap
stored?
• The largest element in a sub-tree of a heap is at the root of the sub-tree (this is a max-heap)
• For a min-heap, the relation is reversed:
• A[Parent(i)] ≤ A[i]

5
HEAP HEIGHT
• The height of a node in the tree
is the number of edges on the
longest (leftmost) path to a leaf
• The height of a tree is the height
of its root
• What is the height of an n-element heap?

6
HEAP OPERATIONS: HEAPIFY()
• HEAPIFY(i): maintain the heap property
• Given: a node i in the heap with left child l and right child r
• Given: two sub-trees rooted at l and r, assumed to be
heaps
• Problem: The sub-tree rooted at i may violate the heap
property (How?)
• Action: let the value of the parent node “float down” so
sub-tree rooted at i satisfies the heap property
• What will be the basic operation between i, l, and r?

7
HEAP OPERATIONS: HEAPIFY()
HEAPIFY(A, i)
l = Left(i)
r = Right(i)
largest = index of the maximum among A[i], A[l], A[r] (children outside the heap are ignored)
if (largest ≠ i) then
swap A[i] ↔ A[largest]
HEAPIFY(A, largest)

8
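A minimal executable sketch of HEAPIFY, assuming the 1-based array layout from the earlier slide (this Python rendering is mine, not the authors' code; the explicit heap-size parameter n makes the "ignore missing children" check concrete):

```python
def heapify(A, i, n):
    """Float A[i] down until the sub-tree rooted at i is a max-heap.
    Assumes the sub-trees rooted at its children are already heaps.
    A[0] is unused padding; n is the current heap size."""
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= n and A[l] > A[largest]:
        largest = l
    if r <= n and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, n)

# The example from the next slides: node 2 (value 4) violates the heap property.
A = [None, 16, 4, 10, 14, 7, 9, 3, 2, 8, 1]
heapify(A, 2, 10)
assert A[1:] == [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
```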
HEAPIFY() EXAMPLE

16

4 10

14 7 9 3

2 8 1

A= 16 4 10 14 7 9 3 2 8 1

9
HEAPIFY() EXAMPLE

16

14 10

4 7 9 3

2 8 1

A= 16 14 10 4 7 9 3 2 8 1

12
HEAPIFY() EXAMPLE

16

14 10

8 7 9 3

2 4 1

A= 16 14 10 8 7 9 3 2 4 1

15
HEAPIFY() RUNNING TIME
• Within a single recursive call, what is the running time of HEAPIFY()?

• How many times can HEAPIFY() recursively call itself in the worst-case?

• What is the worst-case running time of HEAPIFY() on a heap of size n?

18
HEAPIFY() RUNNING TIME
• Within a single recursive call, what is the running time
of HEAPIFY()?
• O(1)
• How many times can HEAPIFY() recursively call itself
in the worst-case?
• O(height) = O(log n)
• What is the worst-case running time of HEAPIFY() on a
heap of size n?
• O(log n)

19
HEAP OPERATIONS: BUILDHEAP()
• Build a heap in a bottom-up manner by running
HEAPIFY() on successive sub-trees
• For an array of length n, all elements in the range
A[ ⌊n/2⌋ + 1 ... n ] are already heaps (Why?)
All leaves are heaps by default
Let d be the depth of the tree, m = 2^(d-1) the #nodes at level d-1, and f the #nodes at the last level d
Total #nodes: n = 2^(d+1) − 1 − (2m − f)
= 4m − 1 − 2m + f
= 2m + f − 1
#leaves = (m − ⌈f/2⌉) + f = ⌈n/2⌉
20
HEAP OPERATIONS: BUILDHEAP()
• Walk backwards through the array from ⌊n/2⌋ to 1, calling
HEAPIFY() on each node.
• Order of processing guarantees that the children of
node i are already heaps when i is processed during
HEAPIFY(i)

BUILDHEAP (A, n)
for (i = n/2 downto 1)
HEAPIFY(A, i)

21
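A runnable sketch of BUILDHEAP, assuming a heapify routine like the one on the HEAPIFY() slide (the Python rendering and names are mine; A[0] is unused padding for 1-based indexing):

```python
def heapify(A, i, n):
    # Float A[i] down; children beyond the heap size n are ignored.
    l, r = 2 * i, 2 * i + 1
    largest = i
    if l <= n and A[l] > A[largest]:
        largest = l
    if r <= n and A[r] > A[largest]:
        largest = r
    if largest != i:
        A[i], A[largest] = A[largest], A[i]
        heapify(A, largest, n)

def build_heap(A, n):
    # Nodes n//2 + 1 .. n are leaves, hence already heaps.
    for i in range(n // 2, 0, -1):
        heapify(A, i, n)

# The example from the next slide:
A = [None, 4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
build_heap(A, 10)
assert A[1:] == [16, 14, 10, 8, 7, 9, 3, 2, 4, 1]
```

Running it on the next slide's input A = {4, 1, 3, 2, 16, 9, 10, 14, 8, 7} yields exactly the heap used in the earlier HEAPIFY() example.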
BUILDHEAP() EXAMPLE
• Work through the example on the board
A = {4, 1, 3, 2, 16, 9, 10, 14, 8, 7}

4

1 3

2 16 9 10

14 8 7

22
BUILDHEAP() RUNNING TIME
• Each call to HEAPIFY() takes O(log n) time
• There are O(n) such calls (⌊n/2⌋ calls, to be exact)
• Thus the running time is O(n log n)
• Is this a correct asymptotic upper bound?
• Is this an asymptotically tight bound?
• A tighter bound is actually O(n)
• How can this be? Is there a flaw in the above reasoning?

23
BUILDHEAP() : TIGHTER
RUNNING TIME ANALYSIS
Let h_L denote the height of a node at level L, for a tree of depth d:

L = 0:       h_0 = d
L = 1:       d−2 ≤ h_1 ≤ d−1
...
level L:     d−1−L ≤ h_L ≤ d−L
...
L = d−1:     0 ≤ h_(d−1) ≤ 1
L = d:       h_d = 0

In general we have d−1−L ≤ h_L ≤ d−L
24
BUILDHEAP() : TIGHTER
RUNNING TIME ANALYSIS
• Assume that all nodes at the last complete level (L = d − 1) are
processed (upper bound)

T(n) ≤ Σ_{l=0}^{d−1} n_l · O(h_l) = O( Σ_{l=0}^{d−1} n_l h_l )

where n_l = #nodes at level l ≤ 2^l and h_l = height of nodes at level l ≤ d − l, so

T(n) ≤ O( Σ_{l=0}^{d−1} 2^l (d − l) )

Let h = d − l (change of variables):

T(n) ≤ O( Σ_{h=1}^{d} h · 2^(d−h) ) = O( 2^d Σ_{h=1}^{d} h (1/2)^h )

But 2^d = Θ(n) ⇒ T(n) ≤ O( n Σ_{h=1}^{d} h (1/2)^h )

25
BUILDHEAP() : TIGHTER
RUNNING TIME ANALYSIS
• Σ_{h=1}^{d} h (1/2)^h ≤ Σ_{h=0}^{d} h (1/2)^h ≤ Σ_{h=0}^{∞} h (1/2)^h

• Recall the infinite decreasing geometric series

Σ_{k=0}^{∞} x^k = 1/(1 − x)   where |x| < 1

• Differentiate both sides

Σ_{k=0}^{∞} k x^(k−1) = 1/(1 − x)^2

• Then multiply both sides by x

Σ_{k=0}^{∞} k x^k = x/(1 − x)^2

26
BUILD-HEAP: TIGHTER RUNNING
TIME ANALYSIS
• Σ_{k=0}^{∞} k x^k = x/(1 − x)^2

• In our case x = 1/2 and k = h:

Σ_{h=0}^{∞} h (1/2)^h = (1/2) / (1 − 1/2)^2 = 2

• T(n) ≤ O( n Σ_{h=1}^{d} h (1/2)^h ) ≤ O(2n) = O(n)

27
BUILD-HEAP: TIGHTER RUNNING
TIME ANALYSIS

• Intuition:
• Most HEAPIFY() calls occur at lower levels, since most of the nodes in a tree
are at lower levels.
• Those calls are very fast, O(1) for the lowest levels that contain most of the
nodes.
• Only relatively few nodes at upper levels require O(log n) HEAPIFY() cost.

28
HEAPSORT
HEAPSORT (A, n)
BUILDHEAP(A, n)
while n > 1
// The largest element is the root; it belongs at the end of the sorted array
swap A[1] ↔ A[n]
// Discard node n from the heap (reduce heap size)
set n = n - 1
// Sub-trees rooted at the children of the root are still heaps, but the new root may violate the heap property
HEAPIFY(A, 1)

29
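Putting BUILDHEAP and HEAPIFY together, a self-contained Python sketch of the whole algorithm (my rendering of the slide's pseudocode, not the authors' code; the input is copied so the sort returns a new list):

```python
def heapsort(xs):
    A = [None] + list(xs)           # shift to 1-based indexing; A[0] unused
    n = len(xs)

    def heapify(i, size):
        # Float A[i] down within the first `size` elements.
        l, r = 2 * i, 2 * i + 1
        largest = i
        if l <= size and A[l] > A[largest]:
            largest = l
        if r <= size and A[r] > A[largest]:
            largest = r
        if largest != i:
            A[i], A[largest] = A[largest], A[i]
            heapify(largest, size)

    for i in range(n // 2, 0, -1):  # BUILDHEAP: O(n)
        heapify(i, n)
    while n > 1:                    # n - 1 extract-max steps, O(log n) each
        A[1], A[n] = A[n], A[1]     # move the maximum to its final slot
        n -= 1                      # discard it from the heap
        heapify(1, n)               # restore the heap property at the root
    return A[1:]

assert heapsort([16, 14, 10, 8, 7, 9, 3, 2, 4, 1]) == [1, 2, 3, 4, 7, 8, 9, 10, 14, 16]
```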
HEAPSORT EXAMPLE

16

14 10

8 7 9 3

2 4 1

A= 16 14 10 8 7 9 3 2 4 1

30
HEAPSORT EXAMPLE

14 10

8 7 9 3

2 4 16

A= 1 14 10 8 7 9 3 2 4 16

31
HEAPSORT EXAMPLE

14 10

8 7 9 3

2 4

A= 1 14 10 8 7 9 3 2 4 16

32
HEAPSORT EXAMPLE

14

1 10

8 7 9 3

2 4

A= 14 1 10 8 7 9 3 2 4 16

33
HEAPSORT EXAMPLE

14

8 10

1 7 9 3

2 4

A= 14 8 10 1 7 9 3 2 4 16

35
HEAPSORT EXAMPLE

14

8 10

4 7 9 3

2 1

A= 14 8 10 4 7 9 3 2 1 16

37
HEAPSORT EXAMPLE

8 10

4 7 9 3

2 14

A= 1 8 10 4 7 9 3 2 14 16

40
HEAPSORT EXAMPLE

8 10

4 7 9 3

A= 1 8 10 4 7 9 3 2 14 16

41
HEAPSORT EXAMPLE

10

8 1

4 7 9 3

A= 10 8 1 4 7 9 3 2 14 16

42
HEAPSORT EXAMPLE

10

8 9

4 7 1 3

A= 10 8 9 4 7 1 3 2 14 16

44
HEAPSORT EXAMPLE

8 9

4 7 1 3

10

A= 2 8 9 4 7 1 3 10 14 16

47
HEAPSORT EXAMPLE

8 9

4 7 1 3

A= 2 8 9 4 7 1 3 10 14 16

48
HEAPSORT EXAMPLE

8 2

4 7 1 3

A= 9 8 2 4 7 1 3 10 14 16

49
HEAPSORT EXAMPLE

8 3

4 7 1 2

A= 9 8 3 4 7 1 2 10 14 16

51
HEAPSORT EXAMPLE

8 3

4 7 1 9

A= 2 8 3 4 7 1 9 10 14 16

54
HEAPSORT EXAMPLE

8 3

4 7 1

A= 2 8 3 4 7 1 9 10 14 16

55
HEAPSORT EXAMPLE

2 3

4 7 1

A= 8 2 3 4 7 1 9 10 14 16

56
HEAPSORT EXAMPLE

7 3

4 2 1

A= 8 7 3 4 2 1 9 10 14 16

58
HEAPSORT EXAMPLE

7 3

4 2 8

A= 1 7 3 4 2 8 9 10 14 16

61
HEAPSORT EXAMPLE

7 3

4 2

A= 1 7 3 4 2 8 9 10 14 16

62
HEAPSORT EXAMPLE

1 3

4 2

A= 7 1 3 4 2 8 9 10 14 16

63
HEAPSORT EXAMPLE

4 3

1 2

A= 7 4 3 1 2 8 9 10 14 16

65
HEAPSORT EXAMPLE

4 3

1 7

A= 2 4 3 1 7 8 9 10 14 16

68
HEAPSORT EXAMPLE

4 3

A= 2 4 3 1 7 8 9 10 14 16

69
HEAPSORT EXAMPLE

2 3

A= 4 2 3 1 7 8 9 10 14 16

70
HEAPSORT EXAMPLE

2 3

A= 1 2 3 4 7 8 9 10 14 16

73
HEAPSORT EXAMPLE

2 1

A= 3 2 1 4 7 8 9 10 14 16

75
HEAPSORT EXAMPLE

2 3

A= 1 2 3 4 7 8 9 10 14 16

78
HEAPSORT EXAMPLE

A= 1 2 3 4 7 8 9 10 14 16

79
HEAPSORT EXAMPLE

A= 2 1 3 4 7 8 9 10 14 16

80
HEAPSORT EXAMPLE

A= 1 2 3 4 7 8 9 10 14 16

83
CONCLUSION
• Heapsort is a very neat and clean algorithm
• Quicksort is faster in practice
• Why deal with Heapsort?
• Shows how to employ data structures to achieve more
complicated functionality.
• The BUILDHEAP() analysis is really important, since it
does not result in the obvious guess!
• Heaps are used in implementing Priority Queues
• Which are used in game engines and operating systems for
scheduling purposes. Read your book for more.

86
SORTING SO FAR
• Insertion sort:
• Easy to code
• Fast on small inputs (less than ~30 elements)
• In-place
• O(n) best case (nearly-sorted inputs)
• O(n²) worst case (reverse-sorted inputs)
• O(n²) average case (assuming all inputs are equally likely)

87
SORTING SO FAR
• Merge sort:
• Divide-and-conquer:
• Split array in half
• Recursively sort sub-arrays
• Linear-time merge step
• O(n log n) worst case, best case, and average case
• Not in-place

88
SORTING SO FAR
• Heap sort:
• Uses the very useful heap data structure
• Nearly-complete binary tree
• Heap property:
• parent key ≥ children’s keys (max-heap)
• parent key ≤ children’s keys (min-heap)
• O(n log n) worst case, best case, average case
• In-place
• Many swap operations

89
SORTING SO FAR
• Quick sort:
• Divide-and-conquer:
• Partition array into two sub-arrays, recursively sort both
• All of first sub-array ≤ all of second subarray
• No merge step needed!
• Fast in practice
• O(n log n) average case
• O(n²) worst case (on sorted or reverse-sorted input)
• Randomized Quicksort:
• O(n²) worst case (on no particular input)
• O(n log n) expected running time
90
HOW FAST CAN WE SORT?
• We will provide a lower bound, then beat it
• How do you suppose we can beat impossibility?
• Observation: All sorting algorithms so far are
comparison sorts
• The only operation used to gain ordering information
about a sequence is the pairwise comparison of two
elements
• Theorem: All comparison sorts are Ω(n log n)

91
DECISION TREE
• A decision tree represents the comparisons made by a
comparison sort. Everything else is ignored.
first comparison:
check if ai ≤ aj

YES NO

second comparison second comparison


check if ak ≤ al check if am ≤ ap

YES NO YES NO
third comparison
check if ax ≤ ay ...
92
DECISION TREE FOR INSERTION
SORT OF 3 ITEMS
a1 ≤ a2 ?
YES NO

a2 ≤ a3 ? a1 ≤ a3 ?
YES NO YES NO

a1 a2 a3 a1 ≤ a3 ? a2 a1 a3 a2 ≤ a3 ?
YES NO YES NO
a1 a3 a2 a3 a1 a2 a2 a3 a1 a3 a2 a1
What do the leaves represent?
How many leaves are there? Why?
93
DECISION TREE
• Decision trees can model comparison sorts.
• For a given algorithm (e.g., Insertion Sort):
• One decision tree for each n
• Tree paths are all possible execution traces
• What’s the longest path in a decision tree for
insertion sort? For merge sort?
• What is the asymptotic height of any decision
tree for sorting n elements?
• Answer: (n log n) (let’s prove it…)

94
DECISION TREE:
HOW MANY LEAVES?
• Must be at least one leaf for each permutation
of the input (Why?)
• otherwise there would be a situation that was not correctly
sorted
• Number of permutations of n keys is n!
• Decision trees are binary trees.
• Minimum depth of a binary tree with n! leaves?
• Maximum #leaves of a binary tree of height h?

95
COMPARISON SORTING LOWER
BOUND
Theorem: Any decision tree that sorts n elements has height Ω(n log n)
Proof: The maximum number of leaves in a binary tree of height h is 2^h, so
2^h ≥ n!
h ≥ log(n!)
= log( n(n−1)(n−2)…(2)(1) )
≥ (n/2) log(n/2) (WHY??)
= Ω(n log n)

h = 1: 2^1 leaves;  h = 2: 2^2 leaves;  h = 3: 2^3 leaves
96
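The "(WHY??)" step follows by dropping the n/2 smallest factors of n! — a standard argument, sketched here rather than taken from the slides:

```latex
\log(n!) = \sum_{i=1}^{n} \log i
         \;\ge\; \sum_{i=n/2}^{n} \log i      % keep only the largest n/2 factors
         \;\ge\; \frac{n}{2}\log\frac{n}{2}   % each remaining factor is at least n/2
         \;=\; \Omega(n \log n)
```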
COMPARISON SORTING LOWER
BOUND
• Time to comparison sort n elements is Ω(n log n)
• Corollary: Heapsort and Mergesort are
asymptotically optimal comparison sorts
• Quicksort is not asymptotically-optimal. Yet, it is fast
in practice.
• But the name of this lecture is “Linear-Time
Sorting”!
• How can we do better than Ω(n log n)?

97
LINEAR-TIME SORTING:
COUNTING SORT
• No comparisons between elements!
• But…depends on assumption that the numbers being
sorted are in the range 1..k
• where k must be O(n) for it to take linear time
• Input: A[1..n], where A[j] ∈ {1, 2, 3, …, k}
• Output: sorted array B[1..n] (not in-place)
• Uses an array C[1..k] for auxiliary storage
• Space Complexity??

98
LINEAR-TIME SORTING:
COUNTING SORT
COUNTINGSORT (A, B, k)
for i=1 to k              // takes time O(k)
C[i] = 0;
for j=1 to n              // takes time O(n)
C[A[j]] += 1;
for i=2 to k              // takes time O(k)
C[i] = C[i] + C[i-1];
for j=n downto 1          // takes time O(n)
B[C[A[j]]] = A[j];
C[A[j]] -= 1;
// TOTAL TIME: O(n+k)

99
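The pseudocode translates directly into Python; in this sketch (names mine) the 1-based count array is simulated with an unused C[0], while the output list B is ordinary 0-based:

```python
def counting_sort(A, k):
    """Stable sort of a list A whose keys are integers in 1..k."""
    n = len(A)
    C = [0] * (k + 1)           # C[0] unused, keeps indices 1-based
    for x in A:                 # loop 2: C[i] = #{keys == i}
        C[x] += 1
    for i in range(2, k + 1):   # loop 3: C[i] = #{keys <= i}
        C[i] += C[i - 1]
    B = [None] * n
    for x in reversed(A):       # loop 4: place each key at its final slot
        B[C[x] - 1] = x         # -1 converts the 1-based rank to a 0-based index
        C[x] -= 1
    return B

assert counting_sort([4, 1, 3, 4, 3], 4) == [1, 3, 3, 4, 4]
```

Walking backwards in the last loop is what makes the sort stable: equal keys are emitted in their original relative order, as the STABLE SORTING slide notes.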
COUNTING SORT EXAMPLE:
LOOP 1
Sort A={4 1 3 4 3} with k = 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 0 0 0 0

B:
for i ← 1 to k
do C[i] ← 0

100
COUNTING SORT EXAMPLE:
LOOP 2

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 0 0 0 1

B:
for j ← 1 to n
do C[A[j]] ← C[A[j]] + 1        C[i] = |{key = i}|

101
COUNTING SORT EXAMPLE:
LOOP 2

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 0 0 1

B:
for j ← 1 to n
do C[A[j]] ← C[A[j]] + 1        C[i] = |{key = i}|

102
COUNTING SORT EXAMPLE:
LOOP 2

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 0 1 1

B:
for j ← 1 to n
do C[A[j]] ← C[A[j]] + 1        C[i] = |{key = i}|

103
COUNTING SORT EXAMPLE:
LOOP 2

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 0 1 2

B:
for j ← 1 to n
do C[A[j]] ← C[A[j]] + 1        C[i] = |{key = i}|

104
COUNTING SORT EXAMPLE:
LOOP 2

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 0 2 2

B:
for j ← 1 to n
do C[A[j]] ← C[A[j]] + 1        C[i] = |{key = i}|

105
COUNTING SORT EXAMPLE:
LOOP 3

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 0 2 2

B:
for i ← 2 to k
do C[i] ← C[i] + C[i−1]        C[i] = |{key ≤ i}|

106
COUNTING SORT EXAMPLE:
LOOP 3

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 2 2

B:
for i ← 2 to k
do C[i] ← C[i] + C[i−1]        C[i] = |{key ≤ i}|

107
COUNTING SORT EXAMPLE:
LOOP 3

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 3 2

B:
for i ← 2 to k
do C[i] ← C[i] + C[i−1]        C[i] = |{key ≤ i}|

108
COUNTING SORT EXAMPLE:
LOOP 3

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 3 5

B:
for i ← 2 to k
do C[i] ← C[i] + C[i−1]        C[i] = |{key ≤ i}|

109
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 3 5

B:
for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

111
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 3 5

B: 3

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

113
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 2 5

B: 3

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

114
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 2 5

B: 3 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

117
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 2 4

B: 3 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

118
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 2 4

B: 3 3 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

121
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 1 4

B: 3 3 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

122
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 1 1 1 4

B: 1 3 3 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

125
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 0 1 1 4

B: 1 3 3 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

126
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 0 1 1 4

B: 1 3 3 4 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

129
COUNTING SORT EXAMPLE:
LOOP 4

1 2 3 4 5 1 2 3 4

A: 4 1 3 4 3 C: 0 1 1 3

B: 1 3 3 4 4

for j ← n downto 1
B[C[A[j]]] ← A[j]
C[A[j]] ← C[A[j]] − 1

130
STABLE SORTING
Counting sort is a stable sort: preserves the input order among
equal elements.

A: 4 1 3 4 3

B: 1 3 3 4 4

What other sorts have this property?

131
COUNTING SORT
• Cool! Why don’t we always use counting sort?
• Because it depends on range k of elements
• Can we use counting sort to sort 32-bit integers?
Why or why not?
• Answer: NO, k is too large (2^32 = 4,294,967,296)
• Affects both time and space complexity

132
RADIX SORT
• Sorting d-digit numbers:
• Sort on the most significant digit
• Then sort on the second-most significant digit, etc.
• Problem: lots of intermediate results to keep track of
• Key idea (from IBM's card-sorting machines): sort on the least significant digit first
• Sort a d-digit number:
RADIXSORT(A, d)
for i=1 to d
STABLESORT(A) on digit i
• What sort will we use to sort on digits?
133
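A sketch of RADIXSORT with a stable counting sort on each digit (the decimal-digit choice and helper names are mine; any stable per-digit sort works):

```python
def radix_sort(xs, d):
    """Sort non-negative integers with at most d decimal digits."""
    for p in range(d):                  # least significant digit first
        base = 10 ** p

        def digit(x):
            return (x // base) % 10     # the p-th decimal digit of x

        # Stable counting sort on digit p (digits range over 0..9).
        C = [0] * 10
        for x in xs:
            C[digit(x)] += 1
        for i in range(1, 10):
            C[i] += C[i - 1]            # C[i] = #{digits <= i}
        B = [None] * len(xs)
        for x in reversed(xs):          # backwards pass keeps the sort stable
            C[digit(x)] -= 1
            B[C[digit(x)]] = x
        xs = B
    return xs

assert radix_sort([329, 457, 657, 839, 436, 720, 355], 3) == [329, 355, 436, 457, 657, 720, 839]
```

Stability of the inner sort is essential: it preserves the order established by all previous (less significant) passes.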
RADIX SORT EXAMPLE: DECIMAL

134
RADIX SORT EXAMPLE: BINARY
• Sorting a sequence of 4-bit integers; each column below shows the sequence after one pass (leftmost column = input, then after sorting on bit 1 = LSB through bit 4 = MSB)

1001 0010 1001 1001 0001

0010 1110 1101 0001 0010

1101 1001 0001 0010 1001

0001 1101 0010 1101 1101

1110 0001 1110 1110 1110

http://www.cs.usfca.edu/~galles/visualization/RadixSort.html
135
RADIX SORT
• Each of the d passes over n numbers takes time
O(n+k), so total time O( d(n+k) )
• When d is constant (e.g., d = 32 for 32-bit integers) and
k=O(n), takes total O(n) time
• In practice
• Radix Sort is fast for large inputs.
• Radix Sort is simple to code and maintain.
• Problem: Radix Sort displays little locality of reference (same
problem as Heap Sort)
• A well-tuned quicksort is better, since it runs mostly on
consecutive memory locations.

136
CONCLUSIONS
• All theorems rely on assumptions to be true:
• Example: Sorting is Ω(n log n)
• Assumes comparison-based sorting
• When you come up with an impossibility result, try to think outside
of the box and find a completely different approach.
• Example: Assume different distribution on input, or impose a different
condition
• Counting Sort assumes all items are less than k = O(n)
• Randomized Quicksort makes sure average-case is like best-case
• Quicksort is quick!
• Asymptotic complexity matters, but algorithmic details and computer
architecture also matter!

137
