20 Merged
History
Al-Khwarizmi (a Persian scholar, around 800 CE) introduced the concept of the algorithm and also founded the discipline of algebra.
Analysis of Algorithms 2
What is an algorithm?
Definition: an algorithm is a finite, well-defined sequence of steps that takes some input and produces the desired output.
Analysis of Algorithms 3
We analyze an algorithm on the basis of the following resource requirements:
1. Time required.
2. Space required.
Analysis of Algorithms 4
Theoretical Analysis
Describe the algorithm in pseudocode.
Count the number of pseudocode steps.
Characterize the running time as a function of the input size, n.
Analysis of Algorithms 5
Example
The algorithm below finds the maximum element in an array of size n. Counting the pseudocode steps (the final step, return currentMax, costs 1) gives a total of 3n − 1 steps.
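A minimal Python rendering of arrayMax, assuming the usual textbook pseudocode (initialize currentMax with the first element, then scan the rest); the function name and the per-line step comments are illustrative only.

def array_max(A):
    """Return the maximum element of a non-empty array A."""
    current_max = A[0]              # 1 step
    for i in range(1, len(A)):      # loop runs n - 1 times
        if A[i] > current_max:      # 1 comparison per iteration
            current_max = A[i]      # possible update per iteration
    return current_max              # 1 step

print(array_max([3, 1, 7, 2, 5]))   # -> 7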
Analysis of Algorithms 6
Best case vs worst case
Which input array leads to best-case performance?
Which input array leads to worst-case performance?
Analysis of Algorithms 7
Big-Oh Notation
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are constants c > 0 and n0 ≥ 1 such that f(n) ≤ c·g(n) for all n ≥ n0.
Example: 2n + 10 is O(n). How?
Analysis of Algorithms 8
Big-Oh Notation
Example: 2n + 10 is O(n).
We need a constant c and a threshold n0 with 2n + 10 ≤ cn for all n ≥ n0:
2n + 10 ≤ cn
(c − 2)n ≥ 10
n ≥ 10/(c − 2)
Pick c = 3 and n0 = 10.
[Figure: log-log plot of n, 2n + 10 and 3n for n from 1 to 1,000; 3n dominates 2n + 10 beyond n = 10.]
Analysis of Algorithms 9
Big-Oh Example
Example: the function n² is not O(n). Why?
Analysis of Algorithms 10
Big-Oh Example
Example: the function n² is not O(n).
If it were, we would need n² ≤ cn for all n ≥ n0, that is, n ≤ c.
This inequality cannot be satisfied, since c must be a constant.
[Figure: log-log plot of n, 10n, 100n and n² for n from 1 to 1,000; n² eventually exceeds any fixed multiple of n.]
Analysis of Algorithms 11
More Big-Oh Examples
7n − 2 is O(n):
need c > 0 and n0 ≥ 1 such that 7n − 2 ≤ c·n for n ≥ n0; this is true for c = 7 and n0 = 1.
3n³ + 20n² + 5 is O(n³):
need c > 0 and n0 ≥ 1 such that 3n³ + 20n² + 5 ≤ c·n³ for n ≥ n0; this is true for c = 4 and n0 = 21.
Analysis of Algorithms 13
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d): drop the lower-order terms and the constant factors. Also use the smallest possible class of functions: say “2n is O(n)” rather than “2n is O(n²)”.
Analysis of Algorithms 14
Example revisited
We say that algorithm arrayMax “runs in O(n) time”, since its step count 3n − 1 is O(n).
Analysis of Algorithms 15
What constitutes a fast algorithm?
O(n^x) time, for some constant x > 0, is generally considered fast (polynomial time).
Analysis of Algorithms 18
Relatives of Big-Oh
big-Omega: f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0.
Analysis of Algorithms 19
Relatives of Big-Oh
big-Theta: f(n) is Θ(g(n)) if there are constants c′ > 0 and c″ > 0 and an integer constant n0 ≥ 1 such that c′·g(n) ≤ f(n) ≤ c″·g(n) for n ≥ n0.
Analysis of Algorithms 20
Relatives of Big-Oh
big-Theta
Limit test: if lim_{x→∞} f(x)/g(x) = c for some constant 0 < c < ∞, then f(x) is Θ(g(x)).
Analysis of Algorithms 22
Relatives of Big-Oh
little-oh: f(n) is o(g(n)) if, for any constant c > 0, there is an integer constant n0 ≥ 0 such that f(n) ≤ c·g(n) for n ≥ n0.
Analysis of Algorithms 23
Remember:
if lim_{x→∞} f(x)/g(x) = 0, then f(x) is o(g(x)).
Analysis of Algorithms 25
Relatives of Big-Oh
little-omega
f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n0 ≥ 0 such that f(n) ≥ c·g(n) for n ≥ n0.
Limit test: if lim_{x→∞} f(x)/g(x) = ∞, then f(x) is ω(g(x)).
Analysis of Algorithms 28
Example Uses of the
Relatives of Big-Oh
5n² is Ω(n²):
f(n) is Ω(g(n)) if there is a constant c > 0 and an integer constant n0 ≥ 1 such that f(n) ≥ c·g(n) for n ≥ n0; let c = 5 and n0 = 1.
5n² is Ω(n):
let c = 1 and n0 = 1.
5n² is ω(n):
f(n) is ω(g(n)) if, for any constant c > 0, there is an integer constant n0 ≥ 0 such that f(n) ≥ c·g(n) for n ≥ n0; we need 5n0² ≥ c·n0, and given c, any n0 ≥ c/5 satisfies this.
Analysis of Algorithms 29
Some Examples
Example 1
lim_{x→∞} e^x / (100·2^x) = (1/100)·lim_{x→∞} (e/2)^x = ∞, so e^x is ω(2^x).
Example 2
lim_{x→∞} log_e x / log_2 x = lim_{x→∞} (log_2 x / log_2 e) / log_2 x = 1 / log_2 e, a constant, so log_e x is Θ(log_2 x).
Example 3
Which of the two functions 10·e^(2·log_e x) and (x·log_e x)/10 grows faster?
Tower of Hanoi
The puzzle: transfer a stack of n disks from one peg to another using a third auxiliary peg, moving one disk at a time and never placing a larger disk on a smaller one.
Let T(n) denote the number of moves required to transfer a stack of size n.
Solution: move the top n − 1 disks to the auxiliary peg, move the largest disk to the destination peg, then move the n − 1 disks back on top of it. This gives T(n) ≤ 2T(n − 1) + 1.
What about the lower bound? T(n) ≥ ?? The largest disk cannot move until the n − 1 disks above it have been moved off, and they must all be moved back onto it afterwards, so T(n) ≥ 2T(n − 1) + 1 as well.
Hence T(n) = 2T(n − 1) + 1:
T(0) = 0
T(1) = 2·0 + 1 = 1
T(2) = 2·1 + 1 = 3
T(3) = 2·3 + 1 = 7
In general, T(n) = 2^n − 1. Verify it!
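A small Python sketch (function and variable names are illustrative) that generates the moves recursively and checks that the number of moves matches T(n) = 2^n − 1.

def hanoi(n, source="A", target="C", aux="B", moves=None):
    """Move n disks from source to target using aux; return the list of moves."""
    if moves is None:
        moves = []
    if n > 0:
        hanoi(n - 1, source, aux, target, moves)   # move n-1 disks out of the way
        moves.append((source, target))             # move the largest disk
        hanoi(n - 1, aux, target, source, moves)   # move the n-1 disks back on top
    return moves

for n in range(5):
    m = hanoi(n)
    assert len(m) == 2 ** n - 1    # T(n) = 2T(n-1) + 1 = 2^n - 1
    print(n, len(m))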
Pizza cutting problem
Aug-14 18
Problem Statement: what is the maximum number of pieces of a pizza that can be obtained with n straight cuts?
Aug-14 19
Let’s rephrase it mathematically !
Aug-14 20
Notation: Let Ln be the number of
regions obtained by drawing n lines.
Aug-14 21
What are L1, L2 and L3?
Case 1: We cut through the center each time
In Case 1, each cut adds two regions.
Aug-14 23
Case 2: You don’t need to cut through the
center each time
A Better Slicing Method ...
Aug-14 26
Upper bound
Ln <= Ln-1 + n: the n-th line crosses the previous n − 1 lines in at most n − 1 points, so it passes through at most n of the existing regions and splits each of them in two.
Is it possible that Ln = Ln-1 + n?
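A quick numeric sketch, assuming the equality Ln = Ln-1 + n can be achieved (the standard fact is that it can, when every new cut crosses all previous cuts and no three cuts meet in a point); it also checks the standard closed form Ln = 1 + n(n+1)/2. Both assumptions are stated here only as a check, not as something proved on the slides.

def regions(n):
    """Number of pizza regions with n straight cuts, assuming L(n) = L(n-1) + n."""
    L = 1                    # L0 = 1: no cuts, one region
    for k in range(1, n + 1):
        L += k               # the k-th cut adds k regions
    return L

for n in range(6):
    assert regions(n) == 1 + n * (n + 1) // 2   # closed form (standard fact)
    print(n, regions(n))     # 1, 2, 4, 7, 11, 16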
Aug-14 28
Pizza Cutting Problem: A Variation
What is the variation ?
• It is the same problem, except for the fact that
instead of straight lines we use bent lines.
Bent lines
• Each line has exactly one bend, as shown
below:
Statement
What is the number of regions determined by n bent lines?
Notation
• Let Zn be the number of regions determined
by the bent lines.
Z2 = 7
[Figure: two bent lines dividing the plane into 7 regions, numbered 1–7.]
What is Zn?
L3.5
Binary search
Find an element in a sorted array:
1. Divide: Check middle element.
2. Conquer: Recursively search 1 subarray.
3. Combine: Trivial.
Example: Find 9
3 5 7 8 9 12 15
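A runnable Python version of the binary search just described (iterative rather than recursive, but with the same divide/conquer/combine structure); names are illustrative.

def binary_search(a, target):
    """Return an index of target in the sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # Divide: check the middle element
        if a[mid] == target:
            return mid              # Combine: trivial
        elif a[mid] < target:
            lo = mid + 1            # Conquer: search the right subarray
        else:
            hi = mid - 1            # Conquer: search the left subarray
    return -1

print(binary_search([3, 5, 7, 8, 9, 12, 15], 9))   # -> 4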
L3.10
Recurrence for binary search
T(n) = 1·T(n/2) + Θ(1), which solves to T(n) = Θ(log n).
L3.11
Powering a number
Problem: Compute a^n, where n ∈ N.
Naive algorithm: Θ(n) multiplications.
Divide-and-conquer algorithm:
a^n = (a^(n/2)) · (a^(n/2))          if n is even;
a^n = (a^((n−1)/2)) · (a^((n−1)/2)) · a   if n is odd.
T(n) = T(n/2) + Θ(1), so T(n) = Θ(log n).
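A direct Python transcription of the divide-and-conquer powering rule (an illustrative sketch; Python's built-in pow already does this).

def power(a, n):
    """Compute a**n with O(log n) multiplications using divide and conquer."""
    if n == 0:
        return 1
    half = power(a, n // 2)         # a^(n//2), computed with one recursive call
    if n % 2 == 0:
        return half * half          # n even: a^n = (a^(n/2))^2
    return half * half * a          # n odd:  a^n = (a^((n-1)/2))^2 * a

print(power(2, 10))                  # -> 1024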
L3.12
Fibonacci numbers
Recursive definition:
F_n = 0 if n = 0;  1 if n = 1;  F_{n−1} + F_{n−2} if n ≥ 2.
0, 1, 1, 2, 3, 5, 8, 13, 21, 34, …
Naive recursive algorithm: Ω(φ^n) (exponential time), where φ = (1 + √5)/2 is the golden ratio.
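For contrast with the exponential naive recursion, a simple bottom-up Python version that computes F(n) in Θ(n) additions (a sketch; the lecture may instead use a different speed-up, such as matrix powering).

def fib(n):
    """Return F(n) using the recursive definition, computed bottom-up in Θ(n)."""
    if n < 2:
        return n                         # F(0) = 0, F(1) = 1
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr   # F(k) = F(k-1) + F(k-2)
    return curr

print([fib(i) for i in range(10)])       # -> [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]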
L3.13
Matrix multiplication
Input: A = [a_ij], B = [b_ij], with i, j = 1, 2, …, n.
Output: C = [c_ij] = A·B.
c_ij = Σ_{k=1}^{n} a_ik · b_kj
[The slide shows the full n×n matrices C, A and B written out entry by entry.]
L3.17
Standard algorithm
for i ← 1 to n
  do for j ← 1 to n
       do c_ij ← 0
          for k ← 1 to n
            do c_ij ← c_ij + a_ik · b_kj
Running time: Θ(n³).
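The standard Θ(n³) algorithm written out in Python (lists of lists; illustrative only).

def mat_mult(A, B):
    """Multiply two n x n matrices with the triple loop: c_ij = sum_k a_ik * b_kj."""
    n = len(A)
    C = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            for k in range(n):
                C[i][j] += A[i][k] * B[k][j]
    return C

print(mat_mult([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # -> [[19, 22], [43, 50]]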
L3.18
Divide-and-conquer algorithm
IDEA: view an n×n matrix as a 2×2 matrix of (n/2)×(n/2) submatrices:
[ r s ; t u ] = [ a b ; c d ] · [ e f ; g h ],   i.e.   C = A·B with
r = ae + bg
s = af + bh
t = ce + dg
u = cf + dh
8 multiplications of (n/2)×(n/2) submatrices, 4 additions of (n/2)×(n/2) submatrices.
L3.19
Analysis of D&C algorithm
T(n) = 8T(n/2) + Θ(n²), so T(n) = Θ(n³): no better than the ordinary algorithm.
L3.20
Strassen’s idea
• Multiply 2×2 matrices with only 7 recursive mults.
P1 = a·(f − h)          r = P5 + P4 − P2 + P6
P2 = (a + b)·h          s = P1 + P2
P3 = (c + d)·e          t = P3 + P4
P4 = d·(g − e)          u = P5 + P1 − P3 − P7
P5 = (a + d)·(e + h)
P6 = (b − d)·(g + h)    7 mults, 18 adds/subs.
P7 = (a − c)·(e + f)    Note: no reliance on commutativity of multiplication!
L3.21
Strassen’s idea
• Multiply 2×2 matrices with only 7 recursive mults.
r = P5 + P4 − P2 + P6
  = (a + d)(e + h) + d(g − e) − (a + b)h + (b − d)(g + h)
  = ae + ah + de + dh + dg − de − ah − bh + bg + bh − dg − dh
  = ae + bg
L3.22
Strassen’s algorithm
1. Divide: Partition A and B into (n/2)×(n/2) submatrices. Form terms to be multiplied using + and −.
2. Conquer: Perform 7 multiplications of (n/2)×(n/2) submatrices recursively.
3. Combine: Form C using + and − on (n/2)×(n/2) submatrices.
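A compact Python sketch of Strassen's algorithm for matrices whose size is a power of two, using exactly the seven products P1–P7 and the combinations r, s, t, u from the slides (helper names are illustrative; no cutoff to the ordinary algorithm is used).

def add(A, B):
    return [[x + y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def sub(A, B):
    return [[x - y for x, y in zip(ra, rb)] for ra, rb in zip(A, B)]

def strassen(A, B):
    """Multiply n x n matrices, n a power of two, with 7 recursive multiplications."""
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    half = n // 2
    # Partition A = [[a, b], [c, d]] and B = [[e, f], [g, h2]] into (n/2) x (n/2) blocks.
    a = [row[:half] for row in A[:half]]; b = [row[half:] for row in A[:half]]
    c = [row[:half] for row in A[half:]]; d = [row[half:] for row in A[half:]]
    e = [row[:half] for row in B[:half]]; f = [row[half:] for row in B[:half]]
    g = [row[:half] for row in B[half:]]; h2 = [row[half:] for row in B[half:]]
    P1 = strassen(a, sub(f, h2))
    P2 = strassen(add(a, b), h2)
    P3 = strassen(add(c, d), e)
    P4 = strassen(d, sub(g, e))
    P5 = strassen(add(a, d), add(e, h2))
    P6 = strassen(sub(b, d), add(g, h2))
    P7 = strassen(sub(a, c), add(e, f))
    r = add(sub(add(P5, P4), P2), P6)   # r = P5 + P4 - P2 + P6
    s = add(P1, P2)                     # s = P1 + P2
    t = add(P3, P4)                     # t = P3 + P4
    u = sub(sub(add(P5, P1), P3), P7)   # u = P5 + P1 - P3 - P7
    return [r[i] + s[i] for i in range(half)] + [t[i] + u[i] for i in range(half)]

print(strassen([[1, 2], [3, 4]], [[5, 6], [7, 8]]))   # -> [[19, 22], [43, 50]]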
L3.23
Analysis of Strassen
T(n) = 7T(n/2) + Θ(n²)
n^(log_b a) = n^(log_2 7) ≈ n^2.81 ⇒ CASE 1 ⇒ T(n) = Θ(n^(lg 7)).
The number 2.81 may not seem much smaller than 3, but because the difference is in the exponent, the impact on running time is significant. In fact, Strassen’s algorithm beats the ordinary algorithm on today’s machines for n ≥ 30 or so.
Best to date (of theoretical interest only): Θ(n^2.376…).
L3.24
Conclusion
L3.25
Divide-and-Conquer
[Figure: divide-and-conquer sorting tree: 7 2 9 4 is split into 7 2 and 9 4, which are sorted to 2 7 and 4 9 and merged into 2 4 7 9.]
Divide-and-Conquer 1
Outline and Reading
Divide-and-conquer paradigm
Review Merge-sort
Recurrence Equations
Iterative substitution
Recursion trees
Guess-and-test
The master method
Divide-and-Conquer 2
Divide-and-Conquer
Divide-and-conquer is a general algorithm design paradigm:
Divide: divide the input data S into two or more disjoint subsets S1, S2, …
Recur: solve the subproblems recursively
Conquer: combine the solutions for S1, S2, …, into a solution for S
The base cases for the recursion are subproblems of constant size.
Analysis can be done using recurrence equations.
Divide-and-Conquer 3
Merge-Sort Review
Merge-sort on an input sequence S with n elements consists of three steps:
Divide: partition S into two sequences S1 and S2 of about n/2 elements each
Recur: recursively sort S1 and S2
Conquer: merge S1 and S2 into a unique sorted sequence

Algorithm mergeSort(S)
  Input: sequence S with n elements
  Output: sequence S sorted
  if S.size() > 1
    (S1, S2) ← partition(S, n/2)
    mergeSort(S1)
    mergeSort(S2)
    S ← merge(S1, S2)
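A list-based Python sketch of the merge-sort just reviewed (the pseudocode above operates on sequences; here plain lists are used and the names are illustrative).

def merge_sort(S):
    """Return a sorted copy of S using merge-sort."""
    if len(S) <= 1:
        return S                       # base case: size 0 or 1
    mid = len(S) // 2
    S1 = merge_sort(S[:mid])           # Divide + Recur
    S2 = merge_sort(S[mid:])
    return merge(S1, S2)               # Conquer

def merge(S1, S2):
    """Merge two sorted lists into one sorted list in O(len(S1) + len(S2)) time."""
    out, i, j = [], 0, 0
    while i < len(S1) and j < len(S2):
        if S1[i] <= S2[j]:
            out.append(S1[i]); i += 1
        else:
            out.append(S2[j]); j += 1
    return out + S1[i:] + S2[j:]       # append whatever remains

print(merge_sort([7, 2, 9, 4, 3, 7, 6, 1]))   # -> [1, 2, 3, 4, 6, 7, 7, 9]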
Divide-and-Conquer 4
Recurrence Equation
Analysis
The conquer step of merge-sort, which merges two sorted sequences of n/2 elements each (implemented by means of a doubly linked list), takes at most bn steps, for some constant b.
Likewise, the base case (n < 2) takes at most b steps.
Therefore, if we let T(n) denote the running time of merge-sort:
T(n) = b              if n < 2
T(n) = 2T(n/2) + bn   if n ≥ 2
We can therefore analyze the running time of merge-sort by finding a closed-form solution to the above equation, that is, a solution that has T(n) only on the left-hand side.
Divide-and-Conquer 5
Iterative Substitution
In the iterative substitution, or “plug-and-chug,” technique, we iteratively apply the recurrence equation to itself and see if we can find a pattern:
T(n) = 2T(n/2) + bn
     = 2(2T(n/2²) + b(n/2)) + bn = 2²T(n/2²) + 2bn
     = 2³T(n/2³) + 3bn
     = 2⁴T(n/2⁴) + 4bn
     = …
     = 2^i T(n/2^i) + i·bn
Note that the base case, T(n) = b, occurs when 2^i = n, that is, i = log n.
So T(n) = bn + bn log n.
Thus, T(n) is O(n log n).
Divide-and-Conquer 6
The Recursion Tree
Draw the recursion tree for the recurrence relation and look for a pattern:
T(n) = b              if n < 2
T(n) = 2T(n/2) + bn   if n ≥ 2

depth   # of T’s   size      time per level
0       1          n         bn
1       2          n/2       bn
…       …          …         …
i       2^i        n/2^i     bn
There are about log n levels, each costing bn, so the total time is O(n log n).
Divide-and-Conquer 8
Guess-and-Test Method
In the guess-and-test method, we guess a closed form solution and then try to prove it is true by induction:
T(n) = b                      if n < 2
T(n) = 2T(n/2) + bn log n     if n ≥ 2
Guess: T(n) < cn log n.
T(n) = 2T(n/2) + bn log n
     ≤ 2(c(n/2) log(n/2)) + bn log n
     = cn(log n − log 2) + bn log n
     = cn log n − cn + bn log n
Wrong: we cannot make this last line be less than cn log n.
Divide-and-Conquer 9
Guess-and-Test Method, Part 2
Recall the recurrence equation:
T(n) = b                      if n < 2
T(n) = 2T(n/2) + bn log n     if n ≥ 2
Guess #2: T(n) < cn log² n.
T(n) = 2T(n/2) + bn log n
     ≤ 2(c(n/2) log²(n/2)) + bn log n
     = cn(log n − log 2)² + bn log n
     = cn log² n − 2cn log n + cn + bn log n
     ≤ cn log² n
Here the above guess is correct if c > b, which is very much possible.
So T(n) is O(n log² n).
Divide-and-Conquer 12
Master Method
Many divide-and-conquer recurrence equations have the form:
T(n) = c                if n < d
T(n) = aT(n/b) + f(n)   if n ≥ d
Divide-and-Conquer 13
Master Method, Example 1
The form:
T(n) = c                if n < d
T(n) = aT(n/b) + f(n)   if n ≥ d
The Master Theorem:
1. if f(n) is O(n^(log_b a − ε)) for some ε > 0, then T(n) is Θ(n^(log_b a))
2. if f(n) is Θ(n^(log_b a) · log^k n), then T(n) is Θ(n^(log_b a) · log^(k+1) n)
3. if f(n) is Ω(n^(log_b a + ε)) for some ε > 0, then T(n) is Θ(f(n)), provided a·f(n/b) ≤ δ·f(n) for some δ < 1.
Example: T(n) = 4T(n/2) + n
Solution: log_b a = 2, so case 1 says T(n) is Θ(n²).
Divide-and-Conquer 14
Master Method, Examples 2–7
(Each uses the recurrence form and the Master Theorem stated above.)
Example 2: T(n) = 2T(n/2) + n log n.  Solution: log_b a = 1, so case 2 says T(n) is Θ(n log² n).
Example 3: T(n) = T(n/3) + n log n.  Solution: log_b a = 0, so case 3 says T(n) is Θ(n log n).
Example 4: T(n) = 8T(n/2) + n².  Solution: log_b a = 3, so case 1 says T(n) is Θ(n³).
Example 5: T(n) = 9T(n/3) + n³.  Solution: log_b a = 2, so case 3 says T(n) is Θ(n³).
Example 6: T(n) = T(n/2) + 1 (binary search).  Solution: log_b a = 0, so case 2 says T(n) is Θ(log n).
Example 7: T(n) = 2T(n/2) + log n (heap construction).  Solution: log_b a = 1, so case 1 says T(n) is Θ(n).
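A small Python helper that applies the three cases above to recurrences of the restricted form T(n) = a·T(n/b) + Θ(n^k · log^p n); this restricted form of f(n) is an assumption (not part of the slides) that reduces the case test to comparing k with log_b a.

import math

def master(a, b, k, p=0):
    """Classify T(n) = a*T(n/b) + Theta(n^k * log^p n) by the master theorem."""
    e = math.log(a, b)                      # critical exponent log_b a
    if abs(k - e) < 1e-9:                   # case 2: f(n) matches n^(log_b a) * log^p n
        return f"Theta(n^{k} log^{p + 1} n)"
    if k < e:                               # case 1: f(n) is polynomially smaller
        return f"Theta(n^{e:g})"
    return f"Theta(n^{k} log^{p} n)"        # case 3: f(n) dominates (regularity holds for this form)

print(master(4, 2, 1))          # Example 1 -> Theta(n^2)
print(master(2, 2, 1, p=1))     # Example 2 -> Theta(n^1 log^2 n)
print(master(1, 2, 0))          # Example 6 -> Theta(n^0 log^1 n), i.e. Theta(log n)
print(master(2, 2, 0, p=1))     # Example 7 -> Theta(n^1)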
Divide-and-Conquer 20
Iterative “Proof” of the Master Theorem
Using iterative substitution, let us see if we can find a pattern:
T(n) = aT(n/b) + f(n)
     = a(aT(n/b²) + f(n/b)) + f(n)
     = a²T(n/b²) + a·f(n/b) + f(n)
     = a³T(n/b³) + a²·f(n/b²) + a·f(n/b) + f(n)
     = …
     = a^(log_b n)·T(1) + Σ_{i=0}^{(log_b n)−1} a^i·f(n/b^i)
     = n^(log_b a)·T(1) + Σ_{i=0}^{(log_b n)−1} a^i·f(n/b^i)
We then distinguish the three cases:
• the first term is dominant
• each part of the summation is equally dominant
• the summation is a geometric series
Divide-and-Conquer 21
Integer Multiplication
Algorithm: Multiply two n-bit integers I and J.
Divide step: Split I and J into high-order and low-order bits:
I = I_h·2^(n/2) + I_l
J = J_h·2^(n/2) + J_l
We can then define I·J by multiplying the parts and adding:
I·J = (I_h·2^(n/2) + I_l)·(J_h·2^(n/2) + J_l)
    = I_h·J_h·2^n + I_h·J_l·2^(n/2) + I_l·J_h·2^(n/2) + I_l·J_l
So T(n) = 4T(n/2) + n, which implies T(n) is O(n²).
But that is no better than the algorithm we learned in grade school.
Divide-and-Conquer 22
An Improved Integer Multiplication Algorithm
Algorithm: Multiply two n-bit integers I and J.
Divide step: Split I and J into high-order and low-order bits:
I = I_h·2^(n/2) + I_l
J = J_h·2^(n/2) + J_l
Observe that there is a different way to multiply parts:
I·J = I_h·J_h·2^n + (I_h·J_l + I_l·J_h)·2^(n/2) + I_l·J_l
    = I_h·J_h·2^n + [(I_h·J_l − I_l·J_l − I_h·J_h + I_l·J_h) + I_h·J_h + I_l·J_l]·2^(n/2) + I_l·J_l
    = I_h·J_h·2^n + [(I_h − I_l)·(J_l − J_h) + I_h·J_h + I_l·J_l]·2^(n/2) + I_l·J_l
So only 3 multiplications of n/2-bit integers are needed: T(n) = 3T(n/2) + n, which implies T(n) is O(n^(log_2 3)) ≈ O(n^1.585).
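A Python sketch of the three-multiplication scheme (essentially Karatsuba's algorithm) on non-negative integers, splitting on bit length; names are illustrative, and small or negative operands fall back to the built-in multiply.

def karatsuba(I, J):
    """Multiply integers using 3 recursive multiplications of half-size parts."""
    if I < 16 or J < 16:
        return I * J                        # small (or negative) operands: multiply directly
    n = max(I.bit_length(), J.bit_length())
    half = n // 2
    Ih, Il = I >> half, I & ((1 << half) - 1)   # I = Ih*2^half + Il
    Jh, Jl = J >> half, J & ((1 << half) - 1)   # J = Jh*2^half + Jl
    hh = karatsuba(Ih, Jh)                  # Ih*Jh
    ll = karatsuba(Il, Jl)                  # Il*Jl
    mid = karatsuba(Ih - Il, Jl - Jh)       # (Ih - Il)*(Jl - Jh)
    # I*J = hh*2^(2*half) + (mid + hh + ll)*2^half + ll
    return (hh << (2 * half)) + ((mid + hh + ll) << half) + ll

print(karatsuba(123456789, 987654321))      # -> 121932631112635269
print(123456789 * 987654321)                # same result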
Submitted By:
Arjun Saraswat
Nishant Kapoor
Problem Definition
Given n numbers, find the k-th smallest (in particular, the median) in worst-case linear time.
[Figure: the elements arranged in columns of 5; sample values are shown for a few columns, and dots stand for the rest.]
Find the median of each of the n/5 groups.
[Figure: each column of 5 is sorted so that its median sits in the middle row; the median of the medians, m, is marked, and roughly 3n/10 of the elements are guaranteed to lie on each side of it.]
Compare each of the remaining n − 1 elements with the median of medians m and find two sets L and R such that every element in L is smaller than m and every element in R is greater than m.
3n/10 < |L| < 7n/10 and 3n/10 < |R| < 7n/10
Description of the algorithm steps
If n is small, for example n < 6, just sort and return the k-th smallest number. (Bound time: 7)
If n > 5, then partition the numbers into groups of 5. (Bound time: n/5)
Sort the numbers within each group and select the middle elements (the medians). (Bound time: 7n/5)
Call your "Selection" routine recursively to find the median of the n/5 medians and call it m. (Bound time: T(n/5))
Compare all n − 1 elements with the median of medians m and determine the sets L and R, where L contains all elements < m and R contains all elements > m. Clearly, the rank of m is r = |L| + 1 (|L| is the size or cardinality of L). (Bound time: n)
A sketch of this selection routine follows below.
Contd….
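A Python sketch of the selection routine just outlined (groups of 5, median of medians as pivot); function names are illustrative, small inputs are simply sorted, and distinct elements are assumed for the rank bookkeeping.

def select(nums, k):
    """Return the k-th smallest element of nums (k = 1 is the minimum), worst-case O(n)."""
    if len(nums) < 6:
        return sorted(nums)[k - 1]            # small n: just sort
    groups = [nums[i:i + 5] for i in range(0, len(nums), 5)]
    medians = [sorted(g)[len(g) // 2] for g in groups]      # median of each group
    m = select(medians, (len(medians) + 1) // 2)            # median of medians
    L = [x for x in nums if x < m]            # elements smaller than m
    R = [x for x in nums if x > m]            # elements greater than m
    r = len(L) + 1                            # rank of m (assuming distinct elements)
    if k == r:
        return m
    if k < r:
        return select(L, k)
    return select(R, k - r)                   # skip L and m

data = [19, 3, 7, 25, 1, 12, 8, 30, 5, 21, 14]
print(select(data, 6), sorted(data)[5])       # both print the median, 12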
Quick-Sort 1
Outline and Reading
Quick-sort
Algorithm
Partition step
Quick-sort tree
Execution example
Analysis of quick-sort
In-place quick-sort
Summary of sorting algorithms
Quick-Sort 2
Quick-Sort
Quick-sort is a randomized sorting algorithm based on the divide-and-conquer paradigm:
Divide: pick a random element x (called the pivot) and partition S into
  L: elements less than x
  E: elements equal to x
  G: elements greater than x
Recur: sort L and G
Conquer: join L, E and G
[Figure: the sequence is split around the pivot x into L, E and G, and the sorted pieces are joined.]
Quick-Sort 3
Partition
We partition an input sequence as follows:
  We remove, in turn, each element y from S, and
  we insert y into L, E or G, depending on the result of the comparison with the pivot x.
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time.
Thus, the partition step of quick-sort takes O(n) time.

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
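A short Python version of randomized quick-sort using the same three-way split into L, E and G (a sketch on plain lists, not in place).

import random

def quick_sort(S):
    """Sort list S using randomized quick-sort with an L / E / G partition."""
    if len(S) <= 1:
        return S
    x = random.choice(S)                 # pick a random pivot
    L = [y for y in S if y < x]          # elements less than the pivot
    E = [y for y in S if y == x]         # elements equal to the pivot
    G = [y for y in S if y > x]          # elements greater than the pivot
    return quick_sort(L) + E + quick_sort(G)   # recur on L and G, then join

print(quick_sort([7, 2, 9, 4, 3, 7, 6, 1]))    # -> [1, 2, 3, 4, 6, 7, 7, 9]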
Quick-Sort 4
Quick-Sort Tree
An execution of quick-sort is depicted by a binary tree:
  Each node represents a recursive call of quick-sort and stores the unsorted sequence before the execution and its pivot, and the sorted sequence at the end of the execution.
  The root is the initial call; the leaves are calls on subsequences of size 0 or 1.
[Figure: quick-sort tree for the sequence 7 4 9 6 2, which is sorted to 2 4 6 7 9.]
Quick-Sort 5
Execution Example
[Figure sequence: quick-sort run on the sequence 7 2 9 4 3 7 6 1, producing 1 2 3 4 6 7 7 9. The snapshots show, in order: pivot selection; partition, recursive call, pivot selection; partition, recursive call, base case; recursive call, …, base case, join; recursive call, pivot selection; partition, …, recursive call, base case; join, join.]
Quick-Sort 12
Worst-case Running Time
The worst case for quick-sort occurs when the pivot is the unique minimum or maximum element.
One of L and G has size n − 1 and the other has size 0.
The running time is proportional to the sum n + (n − 1) + … + 2 + 1.
Thus, the worst-case running time of quick-sort is O(n²).
depth   time
0       n
1       n − 1
…       …
n − 1   1
Quick-Sort 13
Expected Running Time
Consider a recursive call of quick-sort on a sequence of size s:
  Good call: the sizes of L and G are each less than 3s/4.
  Bad call: one of L and G has size greater than 3s/4.
[Figure: a good call and a bad call on the sequence 7 2 9 4 3 7 6 1 9.]
Quick-Sort 15
Expected Running Time, Part 2
For a node of depth i, the size of the input sequence for the current call is at most (3/4)^i · n. Since a good call happens with probability at least 1/2, the expected height of the quick-sort tree is O(log n), the amount of work done at each depth is O(n), and so the expected running time of quick-sort is O(n log n).
Quick-Sort 16
In-Place Quick-Sort
Quick-sort can be implemented to run in place.
In the partition step, we use replace operations to rearrange the elements of the input sequence such that
  the elements less than the pivot have rank less than h,
  the elements equal to the pivot have rank between h and k,
  the elements greater than the pivot have rank greater than k.
The recursive calls consider
  elements with rank less than h,
  elements with rank greater than k.

Algorithm inPlaceQuickSort(S, l, r)
  Input: sequence S, ranks l and r
  Output: sequence S with the elements of rank between l and r rearranged in increasing order
  if l ≥ r
    return
  i ← a random integer between l and r
  x ← S.elemAtRank(i)
  (h, k) ← inPlacePartition(x)
  inPlaceQuickSort(S, l, h − 1)
  inPlaceQuickSort(S, k + 1, r)
Quick-Sort 17
Summary of Sorting Algorithms
Algorithm        Time     Notes
selection-sort   O(n²)    in-place; slow (good for small inputs)
insertion-sort   O(n²)    in-place; slow (good for small inputs)
Quick-Sort 18
Selection
Selection 1
The Selection Problem
Given an integer k and n elements x1, x2, …, xn, taken from a total order, find the k-th smallest element in this set.
Of course, we can sort the set in O(n log n) time and then index the k-th element.
Example: k = 3 on the sequence 7 4 9 6 2 (sorted: 2 4 6 7 9) gives 6.
Selection 2
Quick-Select
Quick-select is a randomized selection algorithm based on the prune-and-search paradigm:
Prune: pick a random element x (called the pivot) and partition S into
  L: elements less than x
  E: elements equal to x
  G: elements greater than x
Search: depending on k, either the answer is in E, or we need to recurse in either L or G:
  if k ≤ |L|, recurse on L with the same k
  if |L| < k ≤ |L| + |E|, the answer is x (done)
  if k > |L| + |E|, recurse on G with k′ = k − |L| − |E|
Selection 3
Partition
We partition an input sequence as in the quick-sort algorithm:
  We remove, in turn, each element y from S, and
  we insert y into L, E or G, depending on the result of the comparison with the pivot x.
Each insertion and removal is at the beginning or at the end of a sequence, and hence takes O(1) time.
Thus, the partition step of quick-select takes O(n) time.

Algorithm partition(S, p)
  Input: sequence S, position p of the pivot
  Output: subsequences L, E, G of the elements of S less than, equal to, or greater than the pivot, resp.
  L, E, G ← empty sequences
  x ← S.remove(p)
  while ¬S.isEmpty()
    y ← S.remove(S.first())
    if y < x
      L.insertLast(y)
    else if y = x
      E.insertLast(y)
    else { y > x }
      G.insertLast(y)
  return L, E, G
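A Python sketch of quick-select built on the same L / E / G partition (names illustrative; k = 1 selects the minimum).

import random

def quick_select(S, k):
    """Return the k-th smallest element of S (1 <= k <= len(S)) in O(n) expected time."""
    x = random.choice(S)                 # random pivot
    L = [y for y in S if y < x]
    E = [y for y in S if y == x]
    G = [y for y in S if y > x]
    if k <= len(L):
        return quick_select(L, k)        # answer lies among the smaller elements
    if k <= len(L) + len(E):
        return x                         # answer is the pivot itself
    return quick_select(G, k - len(L) - len(E))   # prune L and E, adjust k

print(quick_select([7, 4, 9, 3, 2, 6, 5, 1, 8], 5))   # -> 5 (the median)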
Selection 4
Quick-Select Visualization
An execution of quick-select can be visualized by a recursion path. Each node represents a recursive call of quick-select, and stores k and the remaining sequence.
k=5, S=(7 4 9 3 2 6 5 1 8)
k=2, S=(7 4 9 6 5 8)
k=2, S=(7 4 6 5)
k=1, S=(7 6 5)
The answer is 5.
Selection 5
Expected Running Time
Consider a recursive call of quick-select on a sequence of size s:
  Good call: the sizes of L and G are each less than 3s/4.
  Bad call: one of L and G has size greater than 3s/4.
[Figure: a good call and a bad call on the sequence 7 2 9 4 3 7 6 1 9.]
Selection 6
Expected Running Time,
Part 2
Probabilistic Fact #1: the expected number of coin tosses required in order to get one head is two.
Probabilistic Fact #2: expectation is a linear function:
E(X + Y) = E(X) + E(Y)
E(cX) = c·E(X)
Let T(n) denote the expected running time of quick-select.
By Fact #2, T(n) < T(3n/4) + bn·(expected # of calls before a good call).
By Fact #1, T(n) < T(3n/4) + 2bn.
So T(n) is O(n): we can solve the selection problem in O(n) expected time.
Selection 7
Deterministic Selection
We can do selection in O(n) worst-case time.
Main idea: recursively use the selection algorithm itself to find a good pivot for quick-select:
  divide S into n/5 sets of 5 elements each, find the median of each set, and use the median of these medians as the pivot (the median-of-medians routine sketched earlier).
Examples:
E[X] = Σ_x x·Pr(X = x)
Example
• Let X be the random variable that assigns to an outcome of the roll of two fair dice the sum of the numbers on the two dice. What is the expected value of X?
• E(X) = 7
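A brute-force check of E(X) = 7 by enumerating all 36 equally likely outcomes (a small sketch; it also illustrates linearity, since each die alone has expectation 3.5).

from itertools import product

outcomes = list(product(range(1, 7), repeat=2))           # all 36 rolls of two fair dice
E_X = sum(a + b for a, b in outcomes) / len(outcomes)     # E(X) = sum of x * Pr(X = x)
print(E_X)                                                # -> 7.0, i.e. 2 * 3.5 by linearity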
Linearity of Expectation
• Let X and Y be two random variables
• Answer is 2
The Greedy Method
Objective: maximize Σ_{i∈S} b_i·(x_i / w_i)
Constraint: Σ_{i∈S} x_i ≤ W
The Greedy Method 4
Example
Given: A set S of n items, with each item i having
  b_i - a positive benefit
  w_i - a positive weight
Goal: Choose items with maximum total benefit but with weight at most W (the “knapsack”, here 10 ml).

Item:              1      2      3      4      5
Weight:            4 ml   8 ml   2 ml   6 ml   1 ml
Benefit:           $12    $32    $40    $30    $50
Value ($ per ml):  3      4      20     5      50

Solution:
• 1 ml of item 5
• 2 ml of item 3
• 6 ml of item 4
• 1 ml of item 2
The Greedy Method 5
The Fractional Knapsack
Algorithm
Greedy choice: keep taking the item with the highest value (benefit-to-weight ratio), since Σ_{i∈S} b_i·(x_i/w_i) = Σ_{i∈S} (b_i/w_i)·x_i.
Run time: ?
Correctness: ?

Algorithm fractionalKnapsack(S, W)
  Input: set S of items with benefit bi and weight wi; max. weight W
  Output: amount xi of each item i to maximize benefit with weight at most W
  for each item i in S
    xi ← 0
    vi ← bi / wi          {value}
  w ← 0                   {total weight}
  while w < W
    remove item i with highest vi
    xi ← min{wi, W − w}
    w ← w + min{wi, W − w}
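A runnable Python version of fractionalKnapsack (a sketch; names are illustrative). The item with the highest value is taken from a heap, so under this assumption the running time works out to O(n log n). The data below is the example table above.

import heapq

def fractional_knapsack(items, W):
    """items: list of (benefit, weight); return dict item_index -> amount taken."""
    # Max-heap keyed on value v_i = b_i / w_i (negated because heapq is a min-heap).
    heap = [(-b / w, i, b, w) for i, (b, w) in enumerate(items)]
    heapq.heapify(heap)
    x = {i: 0 for i in range(len(items))}
    total = 0
    while total < W and heap:
        _, i, b, w = heapq.heappop(heap)     # item with the highest value
        take = min(w, W - total)             # x_i = min{w_i, W - w}
        x[i] = take
        total += take
    return x

items = [(12, 4), (32, 8), (40, 2), (30, 6), (50, 1)]    # (benefit $, weight ml)
print(fractional_knapsack(items, 10))   # -> {0: 0, 1: 1, 2: 2, 3: 6, 4: 1}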
[Figure: a schedule of tasks on Machines 1–3 over time slots 1–9.]
[Figure: a complete graph and all 16 of its spanning trees.]
Minimum Spanning Trees
The Minimum Spanning Tree for a given graph is the Spanning Tree of minimum cost for that graph.
[Figure: a small weighted graph (edge weights 1, 2, 3, 5) and its minimum spanning tree.]
Algorithms for Obtaining the Minimum Spanning Tree
• Kruskal's Algorithm
• Prim's Algorithm
• Boruvka's Algorithm
Kruskal's Algorithm
The steps are:
1. Sort the edges of the graph in increasing order of weight.
2. Start with a forest T containing every vertex and no edges.
3. Consider the edges in sorted order; add an edge to T if it connects two different trees of the forest (that is, it does not create a cycle), and skip it otherwise.
Every step will have joined two trees in the forest together, so that at the end, there will only be one tree in T.
Complete Graph
[Figure: the example graph on vertices A–J with weighted edges A-B 4, A-D 1, B-C 4, B-D 4, B-J 10, C-E 2, C-F 1, D-H 5, D-J 6, E-G 2, F-G 3, F-I 5, G-I 3, G-J 4, H-J 2, I-J 3.]
Sort Edges: A-D 1, C-F 1, C-E 2, E-G 2, H-J 2, F-G 3, G-I 3, I-J 3, A-B 4, B-D 4, B-C 4, G-J 4, F-I 5, D-H 5, D-J 6, B-J 10.
[Figure sequence: the edges are considered in this order. Each edge that joins two different trees is added ("Add Edge"); each edge whose endpoints already lie in the same tree is skipped ("Cycle"). The edges added are A-D, C-F, C-E, E-G, H-J, G-I, I-J, A-B and B-C; F-G and B-D are skipped.]
Minimum Spanning Tree
[Figure: the resulting minimum spanning tree, of total weight 22, shown next to the complete graph.]
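A Python sketch of Kruskal's algorithm with a simple union-find, run on the example graph above (the edge list is transcribed from the figure; names are illustrative).

def kruskal(vertices, edges):
    """edges: list of (weight, u, v). Return the list of MST edges."""
    parent = {v: v for v in vertices}

    def find(v):                         # root of v's tree, with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst = []
    for w, u, v in sorted(edges):        # consider edges in increasing order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                     # the edge joins two different trees: add it
            parent[ru] = rv
            mst.append((u, v, w))
    return mst

edges = [(4,'A','B'),(1,'A','D'),(4,'B','C'),(4,'B','D'),(10,'B','J'),(2,'C','E'),
         (1,'C','F'),(5,'D','H'),(6,'D','J'),(2,'E','G'),(3,'F','G'),(5,'F','I'),
         (3,'G','I'),(4,'G','J'),(2,'H','J'),(3,'I','J')]
mst = kruskal('ABCDEFGHIJ', edges)
print(mst, sum(w for _, _, w in mst))    # 9 edges, total weight 22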
Analysis of Kruskal's Algorithm
The dominant cost is sorting the m edges, so Kruskal's algorithm runs in O(m log m) (equivalently, O(m log n)) time. While scanning the sorted list, it usually only has to check a small fraction of the edges, but in some cases (for example, if there were a vertex connected to the graph by only one edge and that edge were the longest) it would have to check all the edges.
Prim's Algorithm
This algorithm starts with one node. It then, one by one, adds a node that is unconnected to the new graph, each time selecting the node whose connecting edge has the smallest weight out of the available nodes’ connecting edges.
The steps are:
1. The new graph is constructed with one node from the old graph.
2. While the new graph has fewer than n nodes:
   a. find the node from the old graph with the smallest connecting edge to the new graph, and
   b. add it to the new graph.
Every step will have joined one node, so that at the end we will have one graph with all the nodes, and it will be a minimum spanning tree of the original graph.
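A heap-based Python sketch of Prim's algorithm on the same example graph (adjacency list built from the edge list; names are illustrative and stale heap entries are simply skipped).

import heapq
from collections import defaultdict

def prim(vertices, edges, start):
    """edges: list of (weight, u, v) for an undirected graph. Return the MST edge list."""
    adj = defaultdict(list)
    for w, u, v in edges:
        adj[u].append((w, v))
        adj[v].append((w, u))
    in_tree = {start}
    heap = [(w, start, v) for w, v in adj[start]]   # (weight, tree vertex, outside vertex)
    heapq.heapify(heap)
    mst = []
    while heap and len(in_tree) < len(vertices):
        w, u, v = heapq.heappop(heap)        # smallest edge leaving the tree (may be stale)
        if v in in_tree:
            continue
        in_tree.add(v)                       # add the new vertex and its connecting edge
        mst.append((u, v, w))
        for w2, x in adj[v]:
            if x not in in_tree:
                heapq.heappush(heap, (w2, v, x))
    return mst

edges = [(4,'A','B'),(1,'A','D'),(4,'B','C'),(4,'B','D'),(10,'B','J'),(2,'C','E'),
         (1,'C','F'),(5,'D','H'),(6,'D','J'),(2,'E','G'),(3,'F','G'),(5,'F','I'),
         (3,'G','I'),(4,'G','J'),(2,'H','J'),(3,'I','J')]
mst = prim('ABCDEFGHIJ', edges, 'A')
print(mst, sum(w for _, _, w in mst))        # 9 edges, total weight 22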
Complete Graph
[Figure sequence: Prim's algorithm run on the same example graph (vertices A–J), starting from a single vertex. Each snapshot shows the old graph beside the new graph; at every step the smallest-weight edge connecting the new graph to a vertex not yet in it is added, until all ten vertices are included and the new graph is a minimum spanning tree of total weight 22.]
Analysis of Prim's Algorithm
If a heap is not used, the run time will be O(n²) instead of O(m + n log n). However, using a heap complicates the code, since it adds another data structure to maintain. A Fibonacci heap is the best kind of heap to use, but again, it complicates the code.
Unlike Kruskal's, Prim's algorithm does not need to see all of the graph at once; it can deal with it one piece at a time. It also does not need to check whether adding an edge will create a cycle, since the algorithm deals primarily with the nodes, not the edges.
Boruvka's Algorithm
This algorithm is similar to Prim’s, but nodes are added to the new graph in parallel all around the graph. It creates a list of trees, each containing one node from the original graph, and proceeds to merge them along the smallest-weight connecting edges until there is only one tree, which is, of course, the MST. It works rather like a merge sort.
The steps are:
1. Make a list of trees, each containing a single node of the original graph.
2. While there is more than one tree, for each tree find the smallest-weight edge joining it to another tree, and add all of these edges, merging the trees they connect.
Every step will have joined groups of trees, until only one tree remains.
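A compact Python sketch of Boruvka's algorithm (names illustrative). Ties between equal-weight edges are broken lexicographically so that the edge order is a strict total order, which is the standard assumption for Boruvka's correctness.

def boruvka(vertices, edges):
    """edges: list of (weight, u, v). Return the MST edge list."""
    comp = {v: v for v in vertices}          # component (tree) label of each vertex

    def find(v):
        while comp[v] != v:
            comp[v] = comp[comp[v]]
            v = comp[v]
        return v

    mst = []
    while len({find(v) for v in vertices}) > 1:
        cheapest = {}                        # component -> its lightest outgoing edge
        for w, u, v in edges:
            cu, cv = find(u), find(v)
            if cu == cv:
                continue
            for c in (cu, cv):
                if c not in cheapest or (w, u, v) < cheapest[c]:
                    cheapest[c] = (w, u, v)
        for w, u, v in set(cheapest.values()):   # add all selected edges, merging trees
            cu, cv = find(u), find(v)
            if cu != cv:
                comp[cu] = cv
                mst.append((u, v, w))
    return mst

edges = [(4,'A','B'),(1,'A','D'),(4,'B','C'),(4,'B','D'),(10,'B','J'),(2,'C','E'),
         (1,'C','F'),(5,'D','H'),(6,'D','J'),(2,'E','G'),(3,'F','G'),(5,'F','I'),
         (3,'G','I'),(4,'G','J'),(2,'H','J'),(3,'I','J')]
print(sum(w for _, _, w in boruvka('ABCDEFGHIJ', edges)))   # -> 22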
Complete Graph
[Figure: the same example graph on vertices A–J.]
Trees of the graph at the beginning of Round 1: A, B, C, D, E, F, G, H, I, J (each vertex is its own tree).
Round 1: each tree selects its smallest connecting edge. The edges chosen are A-D, B-A, C-F, D-A, E-C, F-C, G-E, H-J, I-G, J-H.
Round 1 ends; after adding these edges the trees are: D-A-B, F-C-E-G-I, H-J.
Round 2: the remaining trees select their smallest connecting edges: tree D-A-B chooses B-C, tree F-C-E-G-I chooses I-J, and tree H-J chooses J-I.
Round 2 ends; adding these edges merges everything into a single tree.
Minimum Spanning Tree
[Figure: the resulting minimum spanning tree shown next to the complete graph.]
Analysis of Boruvka's Algorithm
Like Prim’s, it does not need to worry about detecting cycles. It does,
however, need to see the whole graph, but it only examines pieces of it
at a time, not all of it at once.
Boruvka’s avoids the complicated data structures needed for the other
two algorithms.
So, of course, the best algorithm depends on the graph and if you want
to bear the cost of complex data structures.
Dijkstra's Shortest Path Algorithm
[Figure: a weighted example graph with source s, intermediate vertices 2–7 and target t; the edge weights shown include 24, 9, 18, 14, 2, 6, 30, 4, 19, 11, 15, 5, 6, 20, 16 and 44.]
The algorithm maintains a set C of finalized vertices and a priority queue PQ of the remaining vertices, keyed by their current distance labels. A delmin operation repeatedly removes the vertex with the smallest label from PQ and adds it to C, and decrease-key operations lower the labels of that vertex's neighbours.
[Figure sequence: trace of the algorithm on the example graph. The contents of C and PQ after each delmin:
C = { }                          PQ = { s, 2, 3, 4, 5, 6, 7, t }
C = { s }                        PQ = { 2, 3, 4, 5, 6, 7, t }
C = { s, 2 }                     PQ = { 3, 4, 5, 6, 7, t }
C = { s, 2, 6 }                  PQ = { 3, 4, 5, 7, t }
C = { s, 2, 6, 7 }               PQ = { 3, 4, 5, t }
C = { s, 2, 3, 6, 7 }            PQ = { 4, 5, t }
C = { s, 2, 3, 5, 6, 7 }         PQ = { 4, t }
C = { s, 2, 3, 4, 5, 6, 7 }      PQ = { t }
C = { s, 2, 3, 4, 5, 6, 7, t }   PQ = { }
So the vertices are finalized in the order s, 2, 6, 7, 3, 5, 4, t; the crossed-out values in the snapshots are distance labels being lowered by decrease-key, and the label on t settles at 50.]
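A heap-based Python sketch of Dijkstra's algorithm matching the trace above: a priority queue of tentative distances, one delmin per iteration, and lazy decrease-key by pushing updated entries. The adjacency list below is an attempted reconstruction of the example graph from the figure's weights and should be treated as an assumption, not as the exact slide data.

import heapq

def dijkstra(adj, s):
    """adj: dict vertex -> list of (neighbour, weight). Return shortest distances from s."""
    dist = {s: 0}                          # distance labels
    pq = [(0, s)]                          # priority queue keyed by tentative distance
    C = set()                              # set of finalized vertices
    while pq:
        d, u = heapq.heappop(pq)           # delmin
        if u in C:
            continue                       # stale entry (lazy decrease-key)
        C.add(u)
        for v, w in adj[u]:
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd               # decrease key: a shorter path to v was found
                heapq.heappush(pq, (nd, v))
    return dist

# Reconstructed example graph (an assumption; vertex names follow the figure).
adj = {
    's': [('2', 9), ('6', 14), ('7', 15)],
    '2': [('3', 24)],
    '3': [('5', 2), ('t', 19)],
    '6': [('3', 18), ('5', 30), ('7', 5)],
    '7': [('5', 20), ('t', 44)],
    '5': [('4', 11), ('t', 16)],
    '4': [('3', 6), ('t', 6)],
    't': [],
}
print(dijkstra(adj, 's'))   # for this reconstruction, the distance to t comes out to 50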