Lecture-3 (Divide and Conquer)
Merge Sort, Quick Sort
Sorting
• Arrange an unordered list of elements in some
order.
• Some common algorithms
• Bubble Sort
• Insertion Sort
• Merge Sort
• Quick Sort
Sorting
• Important primitive
• For today, we’ll pretend all elements are distinct.
• Example: 6 4 3 8 1 5 2 7  →  1 2 3 4 5 6 7 8
Insertion Sort example

Insertion-Sort(A, n)
  for i = 1 to n – 1
    key = A[i]
    j = i – 1
    while j >= 0 and A[j] > key
      A[j + 1] = A[j]
      j = j – 1
    A[j + 1] = key

Trace on A = 6 4 3 8 5:

Start by moving A[1] (key = 4) toward the beginning of the list until you find something smaller (or can't go any further):
  6 4 3 8 5  →  4 6 3 8 5

Then move A[2] (key = 3); the while loop shifts 6 and 4 one place right before inserting 3:
  4 6 3 8 5  →  4 6 6 8 5  →  4 4 6 8 5  →  3 4 6 8 5

Then move A[3] (key = 8); nothing to its left is larger, so nothing moves:
  3 4 6 8 5  →  3 4 6 8 5

Then move A[4] (key = 5); the while loop shifts 8 and 6 one place right before inserting 5:
  3 4 6 8 5  →  3 4 6 8 8  →  3 4 6 6 8  →  3 4 5 6 8

Then we are done!
Why does this work? Proof by induction!
Outline of a proof by
induction
Let A be a list of length n
• Base case:
• A[:1] is sorted at the end of the 0’th iteration. ✓
• Inductive Hypothesis:
• A[:i+1] is sorted at the end of the ith iteration (of the outer loop).
• Inductive step:
• For any 0 < k < n, if the inductive hypothesis holds for i=k-1, then it holds for
i=k.
• Aka, if A[:k] is sorted at step k-1, then A[:k+1] is sorted at step k (previous
slide)
• Conclusion:
• The inductive hypothesis holds for i = 0, 1, …, n-1.
• In particular, it holds for i=n-1.
• At the end of the n-1’st iteration (aka, at the end of the algorithm), A[:n] = A is
sorted.
• That’s what we wanted! ✓
Worst-case Analysis
• In this class we will use worst-case analysis:
• We assume that a “bad guy” produces a worst-case
input for our algorithm, and we measure performance
on that worst-case input.
By my count*…
• variable assignments
• increments/decrements
• comparisons
• …
*An exact count of the operations will not matter for the later discussion.
In this class we will use…
• Big-Oh notation!
• Gives us a meaningful way to talk about the
running time of an algorithm, independent of
programming language, computing platform, etc.,
without having to count all the operations.
Main idea:
Focus on how the runtime scales with n (the input size).
Asymptotic Running Time
Let T(n) = the number of operations performed on an input of size n.

O(…) means an upper bound
• We say "T(n) is O(g(n))" if: for all large enough n, T(n) is at most some constant multiple of g(n).
• Formally, T(n) = O(g(n)) means: there exist constants c > 0 and n₀ > 0 such that T(n) <= c·g(n) for all n >= n₀.

Ω(…) means a lower bound
• We say "T(n) is Ω(g(n))" if, for large enough n, T(n) is at least as big as a constant multiple of g(n).
• Formally, T(n) = Ω(g(n)) means: there exist constants c > 0 and n₀ > 0 such that T(n) >= c·g(n) for all n >= n₀.
• (Note that the only difference from the O(…) definition is that the inequality is switched.)

Θ(…) means both!
• We say "T(n) is Θ(g(n))" iff both: T(n) = O(g(n)) and T(n) = Ω(g(n)).
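As a worked example (mine, not from the slides), the definitions can be checked on a concrete function:

\[
3n^2 + 2n \;\le\; 3n^2 + 2n^2 \;=\; 5n^2 \quad\text{for all } n \ge 1,
\]

so $3n^2 + 2n = O(n^2)$ with $c = 5$ and $n_0 = 1$; since $3n^2 + 2n \ge 3n^2$, it is also $\Omega(n^2)$ (take $c = 3$), and hence $\Theta(n^2)$.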
Insertion Sort: running time

def InsertionSort(A):
    for i in range(1, len(A)):            # n-1 iterations of the outer loop
        current = A[i]
        j = i - 1
        while j >= 0 and A[j] > current:  # up to i iterations of the inner loop
            A[j+1] = A[j]
            j -= 1
        A[j+1] = current

The worst-case running time is O(n²) (see the calculation below).
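To make the worst-case bound concrete (a standard calculation, sketched here since the slide's operation counts did not survive extraction): on a reverse-sorted input, iteration i of the outer loop runs the inner loop i times, so the total work is proportional to

\[
\sum_{i=1}^{n-1} i \;=\; \frac{n(n-1)}{2} \;=\; \Theta(n^2).
\]

On an already-sorted input the inner loop never executes, so Insertion Sort runs in O(n) time in the best case.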
Can we do better?
• MergeSort: a divide-and-conquer approach

[Diagram: split the big problem into two smaller problems and recurse on each.]
  6 4 3 8 1 5 2 7
  → recursive magic on each half →  3 4 6 8  and  1 2 5 7
  → MERGE! →  1 2 3 4 5 6 7 8
MergeSort Pseudocode

MERGESORT(A):
  n = length(A)
  if n <= 1:                     # if A has length at most 1, it is already sorted!
    return A
  L = MERGESORT(A[0 : n/2])      # sort the left half
  R = MERGESORT(A[n/2 : n])      # sort the right half
  return MERGE(L, R)             # merge the two halves
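The pseudocode above translates directly into Python. Here is a minimal runnable sketch (the helper name merge and the slice-based splitting are my choices, not taken from the slides):

def merge(L, R):
    # Merge two sorted lists into one sorted list in O(len(L) + len(R)) time.
    out = []
    i = j = 0
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:
            out.append(L[i])
            i += 1
        else:
            out.append(R[j])
            j += 1
    out.extend(L[i:])   # at most one of these two extends adds anything
    out.extend(R[j:])
    return out

def merge_sort(A):
    n = len(A)
    if n <= 1:                      # length 0 or 1: already sorted
        return A
    L = merge_sort(A[0 : n // 2])   # sort the left half
    R = merge_sort(A[n // 2 : n])   # sort the right half
    return merge(L, R)              # merge the two sorted halves

# Example: merge_sort([6, 4, 3, 8, 1, 5, 2, 7]) == [1, 2, 3, 4, 5, 6, 7, 8]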
What actually happens?

First, recursively break up the array all the way down to the base cases:
  6 4 3 8 1 5 2 7
  6 4 3 8 | 1 5 2 7
  6 4 | 3 8 | 1 5 | 2 7
  6 | 4 | 3 | 8 | 1 | 5 | 2 | 7      (each array of length 1 is sorted!)

Then, merge them all back up!
  6 | 4 | 3 | 8 | 1 | 5 | 2 | 7      (a bunch of sorted lists of length 1, in the order of the original sequence)
  4 6 | 3 8 | 1 5 | 2 7              (Merge!)
  3 4 6 8 | 1 2 5 7                  (Merge!)
  1 2 3 4 5 6 7 8                    (Sorted sequence!)
Does it work?
• Yet another job for proof by induction!!!
• Try it yourself.
It's fast

Assume that n is a power of 2, for convenience.

CLAIM: MergeSort runs in time O(n log(n)).
Let's prove the claim

Consider the recursion tree: at level 0 there is one problem of size n, and each problem of size k at level t splits into two sub-problems of size k/2 at level t+1 (so k = n/2^t).

How much work is done in one sub-problem of size k?
  (time to MERGE the two sorted halves of length k/2) + (time spent within the two sub-problems)

How long does it take to MERGE? Answer: it takes time O(k), since we just walk across the lists once.
  Example: MERGE(3 4 6 8, 1 2 5 7) → 1 2 3 4 5 6 7 8
Recursion tree
  Level 0: one problem of size n
  Level 1: two problems of size n/2
  …
  Level t: 2^t problems of size n/2^t; there are O(k) = O(n/2^t) operations done at each node of this level
  …
  Last level: problems of size 1

How much work is done at each level? Work this out yourself!
Recursion tree

  Level     # problems    Size of each problem    Amount of work at this level
  0         1             n                       O(n)
  1         2             n/2                     O(n)
  2         4             n/4                     O(n)
  …         …             …                       …
  t         2^t           n/2^t                   O(n)
  …         …             …                       …
  log(n)    n             1                       O(n)
Total runtime…
• O(n) work per level
• log(n) + 1 levels
• O( n log(n) ) total!
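Summing the table row by row (a standard calculation): there is at most c·n work per level and log(n) + 1 levels, so

\[
\sum_{t=0}^{\log_2 n} 2^t \cdot c\,\frac{n}{2^t} \;=\; \sum_{t=0}^{\log_2 n} c\,n \;=\; c\,n\,(\log_2 n + 1) \;=\; O(n \log n).
\]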
QuickSort
• Divide
  • Partition the array A[1:n] into two (possibly empty) subarrays A[1:q-1] (the low side) and A[q+1:n] (the high side)
  • Each element in the low side of the partition is <= A[q]; each element in the high side of the partition is >= A[q]
  • Compute the index q of the pivot as part of this partitioning procedure
• Conquer
• Recursively sort the subarrays A[1:q-1] and A[q+1:n]
• Combine
• Already sorted
Pseudocode of QuickSort

QUICKSORT(A, p, r)
  if p < r
    q = PARTITION(A, p, r)
    QUICKSORT(A, p, q – 1)
    QUICKSORT(A, q + 1, r)

PARTITION

PARTITION(A, p, r)
  x = A[r]
  i = p – 1
  for j = p to r – 1
    if A[j] <= x
      i = i + 1
      exchange A[i] with A[j]
  exchange A[i + 1] with A[r]
  return i + 1
Example of PARTITION

Pick 4 (the last element, A[r]) as the pivot. Start with i = -1.

  7 6 3 5 1 2 4     j = 0, 1, 2:  7 and 6 are > 4; 3 <= 4, so i becomes 0 and A[0] swaps with A[2]
  3 6 7 5 1 2 4     j = 3, 4:  5 is > 4; 1 <= 4, so i becomes 1 and A[1] swaps with A[4]
  3 1 7 5 6 2 4     j = 5:  2 <= 4, so i becomes 2 and A[2] swaps with A[5]
  3 1 2 5 6 7 4     finally, exchange A[i + 1] = A[3] with the pivot A[r]
  3 1 2 4 6 7 5     the pivot 4 is now in its final position, and PARTITION returns q = 3

After recursively sorting the low side (3 1 2) and the high side (6 7 5):  1 2 3 4 5 6 7
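As a sketch, the PARTITION and QUICKSORT pseudocode translates to the following Python (0-indexed and in place; the function names are mine):

def partition(A, p, r):
    # Partition A[p..r] around the pivot A[r]; return the pivot's final index.
    x = A[r]                            # pivot
    i = p - 1                           # end of the low side seen so far
    for j in range(p, r):
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]     # put the pivot between the two sides
    return i + 1

def quicksort(A, p=0, r=None):
    if r is None:
        r = len(A) - 1
    if p < r:
        q = partition(A, p, r)
        quicksort(A, p, q - 1)          # recurse on the low side
        quicksort(A, q + 1, r)          # recurse on the high side

# Example: A = [7, 6, 3, 5, 1, 2, 4]; quicksort(A) leaves A == [1, 2, 3, 4, 5, 6, 7]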
QuickSort Runtime Analysis

Let T(n) = the worst-case running time on a problem of size n.
• Worst-case partitioning: the split is maximally unbalanced (sizes n-1 and 0), so
  T(n) = T(n-1) + T(0) + Θ(n) = T(n-1) + Θ(n), which solves to Θ(n²).
• Best-case partitioning: the split is (nearly) balanced, so
  T(n) = 2T(n/2) + Θ(n), which solves to Θ(n log n). (See the calculation below.)
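Unrolling the worst-case recurrence makes the quadratic bound explicit (a standard step, filled in here because the slide's formulas did not survive extraction):

\[
T(n) \;=\; T(n-1) + cn \;=\; c\big(n + (n-1) + \cdots + 1\big) + T(0) \;=\; c\,\frac{n(n+1)}{2} + \Theta(1) \;=\; \Theta(n^2),
\]

while the balanced recurrence $T(n) = 2T(n/2) + \Theta(n)$ solves to $\Theta(n \log n)$, exactly as for MergeSort.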
Recurrences
• An equation that describes a function in terms of its
value on other, typically smaller, arguments.
• Recursive Case
• Involves the recursive invocation of the function on different
(usually smaller) inputs
• Base Case
• Does not involve a recursive invocation
Algorithmic Recurrences
• A recurrence T(n) is algorithmic if, for every sufficiently large threshold constant n₀ > 0, the following two properties hold:
  1. For all n < n₀, we have T(n) = Θ(1).
  2. For all n >= n₀, every path of recursion terminates in a defined base case within a finite number of recursive invocations.
Solving Recurrences
• Substitution Method
• Guess a solution
• Use mathematical induction to prove the guess
Substitution Method
• Guess a solution, say T(n) = O(g(n)).
• We need to prove
  • T(n) <= c·g(n) for all n >= n₀,
  • for a specific choice of the constants c > 0 and n₀ > 0.
Substitution Method
• Inductive Hypothesis
  • T(m) <= c·g(m) for all m with n₀ <= m < n.
• Inductive Step
  • Substitute the hypothesis into the recurrence; the bound goes through if the c·g(n) term dominates the leftover lower-order terms (e.g., if cn dominates the Θ(n) cost of the non-recursive work).
Substitution Method
• Base Case
  • Assuming T(n) = Θ(1) for all n below the threshold n₀,
  • we get T(n) <= c·g(n) for the smallest values of n covered by the claim, by choosing c large enough.
Solving Recurrences
• Substitution Method
• Guess a solution
• Use mathematical induction to prove the guess
• Example
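As a worked illustration (assuming the intended example is the MergeSort-style recurrence $T(n) = 2T(n/2) + cn$, which the extracted slide does not show), the inductive step of the substitution method looks like this, using the hypothesis $T(m) \le d\,m\log_2 m$ for all smaller $m$:

\[
T(n) \;=\; 2T(n/2) + cn \;\le\; 2\,d\,\frac{n}{2}\log_2\frac{n}{2} + cn \;=\; d\,n\log_2 n - d\,n + cn \;\le\; d\,n\log_2 n \quad\text{whenever } d \ge c,
\]

and the base case holds for the finitely many $n$ near $n_0$ by choosing $d$ large enough.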
Master Theorem
• Let's consider the following recurrence relation,
  • T(n) = a·T(n/b) + f(n), where a >= 1 and b > 1 are constants and f(n) is the cost of the divide and combine work.
Master Theorem
• Case 1: if f(n) = O(n^(log_b(a) - ε)) for some constant ε > 0, then T(n) = Θ(n^(log_b(a))).
• Case 2: if f(n) = Θ(n^(log_b(a))), then T(n) = Θ(n^(log_b(a)) · log n).
• Case 3: if f(n) = Ω(n^(log_b(a) + ε)) for some constant ε > 0, and f(n) satisfies the regularity condition, then T(n) = Θ(f(n)).
Master Theorem
• Regularity condition: a·f(n/b) <= c·f(n) for some constant c < 1 and all sufficiently large n.
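Two applications that come up later in this lecture, worked out for reference:

\[
T(n) = 2T(n/2) + \Theta(n):\quad a = 2,\ b = 2,\ n^{\log_b a} = n,\ f(n) = \Theta(n) \;\Rightarrow\; \text{case 2} \;\Rightarrow\; T(n) = \Theta(n\log n).
\]

\[
T(n) = 8T(n/2) + \Theta(1):\quad a = 8,\ b = 2,\ n^{\log_b a} = n^{3},\ f(n) = O(n^{3-\varepsilon}) \;\Rightarrow\; \text{case 1} \;\Rightarrow\; T(n) = \Theta(n^{3}).
\]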
Master Theorem
• Worst-case partitioning: T(n) = T(n-1) + Θ(n)
• Running Time: Θ(n²)
Can we do better?
Multiplying Square Matrices
• Divide (assuming n is a power of 2): partition each n×n matrix into four n/2 × n/2 submatrices.
• Conquer: recursively multiply the n/2 × n/2 submatrices (eight recursive products) and add the results to form the four quadrants of the product.
Multiplying Square Matrices
• Recurrence: T(n) = 8T(n/2) + Θ(1)
  • Θ(1) work to partition (by index arithmetic), plus eight recursive multiplications of size n/2.

Multiplying Square Matrices
• Running Time: T(n) = 8T(n/2) + Θ(1) = Θ(n³), by case 1 of the master theorem (log₂ 8 = 3).
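For concreteness, here is a minimal sketch of the divide-and-conquer multiplication above, assuming square numpy arrays whose side length is a power of 2 (numpy and the function name are my choices, not from the slides). Note that np.block copies Θ(n²) entries per call, so this particular sketch recurs as T(n) = 8T(n/2) + Θ(n²); the Θ(1)-overhead version on the slide instead works in place using index arithmetic. Either way the solution is Θ(n³).

import numpy as np

def matmul_recursive(A, B):
    # Divide-and-conquer product of two n x n matrices, n a power of 2.
    n = A.shape[0]
    if n == 1:
        return A * B                      # base case: 1 x 1 product
    h = n // 2
    # Partition each matrix into four n/2 x n/2 submatrices (numpy views, no copying).
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # Eight recursive multiplications, combined by four additions.
    C11 = matmul_recursive(A11, B11) + matmul_recursive(A12, B21)
    C12 = matmul_recursive(A11, B12) + matmul_recursive(A12, B22)
    C21 = matmul_recursive(A21, B11) + matmul_recursive(A22, B21)
    C22 = matmul_recursive(A21, B12) + matmul_recursive(A22, B22)
    return np.block([[C11, C12], [C21, C22]])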
Can we do better?
Strassen's Algorithm
• Intuition:
  • Additions are faster than multiplications, so trade recursive multiplications for extra additions: use 7 recursive products of n/2 × n/2 matrices instead of 8, giving T(n) = 7T(n/2) + Θ(n²) = Θ(n^(log₂ 7)) ≈ Θ(n^2.81).
• Additional Resource:
• mastertheorem.pdf
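A minimal sketch of Strassen's seven-product scheme, under the same assumptions as above (square numpy arrays with power-of-2 side length); the seven products and their combinations are the standard ones, which the slides do not spell out:

import numpy as np

def strassen(A, B):
    # Strassen's algorithm: 7 recursive multiplications instead of 8,
    # so T(n) = 7T(n/2) + Theta(n^2) = Theta(n^(log2 7)) ~ Theta(n^2.81).
    n = A.shape[0]
    if n == 1:
        return A * B
    h = n // 2
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # Seven products of n/2 x n/2 matrices; the extra work is only additions.
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    C11 = M1 + M4 - M5 + M7
    C12 = M3 + M5
    C21 = M2 + M4
    C22 = M1 - M2 + M3 + M6
    return np.block([[C11, C12], [C21, C22]])

# Sanity check idea: compare strassen(A, B) against A @ B on small random matrices.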