DSA I Week 2 Lecture 2
Sherajul Arifin
Lecturer, Department of Computer Science and Engineering,
United International University
Counting Sort
Sorting in Linear Time
•Counting sort
o No comparisons between elements!
o But… it depends on an assumption about the numbers being sorted
o We assume the numbers are in the range 1, …, k
o The algorithm:
o Input: A[1..n], where A[i] ∈ {1, 2, 3, …, k}
o Output: B[1..n], sorted (notice: not sorting in place)
o Also: an array C[1..k] for auxiliary storage
Counting Sort
[Slide: counting sort pseudocode, with each loop annotated with its O(n) or O(k) running time.]
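As a concrete reference, here is a minimal counting-sort sketch in Python (ours, not the slide's exact pseudocode). The array names A, B, C mirror the slide's notation; the function name counting_sort and the 0-based indexing of the output are our own choices.

```python
def counting_sort(A, k):
    """Sort a list A whose values lie in {1, ..., k}; returns a new sorted list B."""
    n = len(A)
    C = [0] * (k + 1)          # C[v] will hold the count of value v (index 0 unused)
    B = [0] * n                # output array (not sorted in place)

    for v in A:                # count occurrences of each value: O(n)
        C[v] += 1
    for v in range(2, k + 1):  # prefix sums: C[v] = number of elements <= v: O(k)
        C[v] += C[v - 1]
    for v in reversed(A):      # place elements into B from right to left (stable): O(n)
        B[C[v] - 1] = v
        C[v] -= 1
    return B

# Example: counting_sort([3, 1, 2, 3, 1], k=3) -> [1, 1, 2, 3, 3]
```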
Counting Sort
What will be the running time?
•Each pass over the n input elements takes O(n), and each pass over the count array C takes O(k).
•Total time: O(n + k)
•Usually, k = O(n), so counting sort runs in linear time, O(n).
Simulation
[Figure: step-by-step simulation of counting sort — the initial arrays, the 1st, 2nd, and 3rd iterations, and the final result.]
Counting Sort
•Why don’t we always use counting sort?
Because it depends on the range k of the elements: when k is much larger than n, the O(n + k) running time and the size-k auxiliary array C make it impractical.
Divide-and-Conquer Technique
• Divide-and-Conquer is a general algorithm design paradigm (a toy sketch follows this list):
o Divide the problem into a number of smaller subproblems that are instances of the same problem
o Conquer the subproblems by solving them recursively
o Combine the solutions to the subproblems into the solution for the original problem
• The base cases for the recursion are subproblems of constant size
• Analysis can be done using recurrence equations
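As a toy illustration of the three steps (ours, not from the slides), the sketch below finds the maximum of a list by divide-and-conquer; the function name dc_max is our own.

```python
def dc_max(xs):
    """Find the maximum of a non-empty list by divide-and-conquer."""
    if len(xs) == 1:            # base case: constant-size subproblem
        return xs[0]
    mid = len(xs) // 2
    left = dc_max(xs[:mid])     # Divide + Conquer: solve each half recursively
    right = dc_max(xs[mid:])
    return max(left, right)     # Combine: the answer for the whole list

# The list slicing costs O(n) per call, so T(n) = 2T(n/2) + O(n),
# which is analyzed with the recurrence techniques later in this lecture.
# Example: dc_max([7, 2, 9, 4, 3, 8, 6, 1]) -> 9
```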
Merge Sort
Divide-and-Conquer
Merge Sort and Quicksort
•Two well-known sorting algorithms adopt this divide-and-conquer strategy
•Merge sort
o Divide step is trivial – just split the list into two equal parts.
o Work is carried out in the conquer step by merging two sorted lists.
•Quick sort
o Work is carried out in the divide step using a pivot element (a sketch follows below).
o Conquer step is trivial.
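To make the contrast concrete, here is a minimal quicksort sketch (ours, not from the slides), using the common Lomuto partition with the last element as pivot. The real work happens in the divide (partition) step; the conquer step is plain recursion and the combine step does nothing.

```python
def quick_sort(A, lo=0, hi=None):
    """Sort A[lo..hi] in place; the divide step (partition) does the real work."""
    if hi is None:
        hi = len(A) - 1
    if lo >= hi:                  # base case: zero or one element
        return
    pivot = A[hi]                 # divide: partition around the last element as pivot
    i = lo
    for j in range(lo, hi):
        if A[j] <= pivot:
            A[i], A[j] = A[j], A[i]
            i += 1
    A[i], A[hi] = A[hi], A[i]     # pivot lands in its final position i
    quick_sort(A, lo, i - 1)      # conquer: recursively sort the two sides
    quick_sort(A, i + 1, hi)      # combine step is trivial: nothing to do
```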
Merge Sort
[Pseudocode slide: MERGE-SORT(A, p, r) computes the midpoint q, recursively sorts A[p..q] and A[q+1..r], then merges the two halves.]
Merge Sort
[Figure: the MERGE(A, p, q, r) procedure — A[p..q] is copied into L[1..n1] and A[q+1..r] into R[1..n2], ∞ sentinels are placed at L[n1+1] and R[n2+1], and the two sorted runs are merged back into A[p..r].]
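A compact Python sketch of both routines (ours, not the slide's exact pseudocode): it follows the CLRS-style MERGE that copies the two runs into L and R with ∞ sentinels, but uses 0-based indices instead of the slide's 1-based ones.

```python
import math

def merge(A, p, q, r):
    """Merge the sorted runs A[p..q] and A[q+1..r] back into A[p..r]."""
    L = A[p:q + 1] + [math.inf]      # left run plus a sentinel (the slide's ∞)
    R = A[q + 1:r + 1] + [math.inf]  # right run plus a sentinel
    i = j = 0
    for k in range(p, r + 1):        # fill A[p..r] left to right: O(r - p + 1)
        if L[i] <= R[j]:
            A[k] = L[i]
            i += 1
        else:
            A[k] = R[j]
            j += 1

def merge_sort(A, p=0, r=None):
    """Recursively sort A[p..r] in place."""
    if r is None:
        r = len(A) - 1
    if p < r:                        # more than one element left
        q = (p + r) // 2             # divide: split into A[p..q] and A[q+1..r]
        merge_sort(A, p, q)          # conquer: sort the left half
        merge_sort(A, q + 1, r)      # conquer: sort the right half
        merge(A, p, q, r)            # combine: merge the two sorted halves

# Example matching the simulation below:
# A = [1, 6, 3, 7, 9, 2, 8, 4]; merge_sort(A) leaves A = [1, 2, 3, 4, 6, 7, 8, 9]
```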
Merge Sort – Example: sorting 1 6 3 7 9 2 8 4
Divide (split repeatedly into halves):
1 6 3 7 9 2 8 4
1 6 3 7 | 9 2 8 4
1 6 | 3 7 | 9 2 | 8 4
1 | 6 | 3 | 7 | 9 | 2 | 8 | 4
Merge (combine sorted runs back up):
1 6 | 3 7 | 2 9 | 4 8
1 3 6 7 | 2 4 8 9
1 2 3 4 6 7 8 9
Merge Sort – Worked Example
Divide Step (the left half shown in detail):
7 2 9 4 | 3 8 6 1
7 2 | 9 4
7 | 2
Conquer (base cases):
7 -> 7    2 -> 2
Merge:
7 | 2 -> 2 7        9 | 4 -> 4 9
7 2 | 9 4 -> 2 4 7 9
3 8 | 6 1 -> 1 3 6 8    (the right half is handled the same way)
7 2 9 4 | 3 8 6 1 -> 1 2 3 4 6 7 8 9
Merge Sort – Run Time
[Recursion diagram: T(n) splits into two subproblems of cost T(n/2) each, plus O(n) work to divide and merge.]
T(n) = 2T(n/2) + O(n)
Time Complexity Analysis
Let T(n) denote merge sort's worst-case running time on input instances of size n. Then T(n) satisfies the following recurrence relation:
T(n) ≤ 2T(n/2) + cn for n > 2,
T(2) ≤ c.
Solving Recurrence Relations
Method 1: Unrolling the Recursion Tree
Step 1: Analyze the first few levels:
o At the first level of recursion, we have a single problem of size n, which takes time at most cn plus the time spent in all subsequent recursive calls.
o At the next level, we have two problems each of size n/2. Each of these takes time at most cn/2, for a total of at most cn, again plus the time in subsequent recursive calls.
o At the third level, we have four problems each of size n/4, each taking time at most cn/4, for a total of at most cn.
Solving Recurrence Relations
[Figure slides: Step 2 — identifying the pattern in the recursion tree: at level j there are 2^j subproblems, each of size n/2^j, so every level contributes at most cn in total.]
Solving Recurrence Relations
Step 3: Summing over all levels of recursion:
The recursion ‘bottoms out’ when the input size reaches the base case. Suppose the recursion goes x levels deep; since the input size is halved at each level, after x levels it is n/2^x.
Setting n/2^x = 1 gives x = log2 n levels. Each level costs at most cn, so the total is cn · log2 n = O(n log n).
Master Theorem:
For T(n) = 2T(n/2) + O(n^k) we have a = 2, b = 2, k = 1, so a = b^k, and
T(n) = O(n^(log2 2) · log2 n) = O(n^1 log n) = O(n log n).
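For completeness, the unrolling argument can be written as a single summation (a standard derivation consistent with the steps above, not taken verbatim from the slides):

```latex
% Unrolling T(n) <= 2T(n/2) + cn over log2(n) levels:
T(n) \;\le\; \sum_{j=0}^{\log_2 n - 1} 2^{j} \cdot c\,\frac{n}{2^{j}}
     \;=\; \sum_{j=0}^{\log_2 n - 1} c\,n
     \;=\; c\,n \log_2 n
     \;=\; O(n \log n)
```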
Thank you!