DSA I Week 2 Lecture 2

The document discusses Counting Sort and its linear time complexity, which operates under the assumption that the input numbers are within a specific range. It also introduces the Divide-and-Conquer technique, exemplified by Merge Sort and Quick Sort, detailing their operational steps and time complexity analysis. The document concludes with methods for solving recurrence relations, particularly using the Master Theorem.


CSE 2215: Data Structures and Algorithms I

Sherajul Arifin
Lecturer, Department of Computer Science and Engineering,
United International University
Counting Sort

2
Sorting in Linear Time
• Counting sort
  o No comparisons between elements!
  o But it depends on an assumption about the numbers being sorted
  o We assume the numbers are in the range 0, …, k
  o The algorithm:
    o Input: A[1..n], where A[i] ∈ {0, 1, 2, …, k}
    o Output: B[1..n], sorted (notice: not sorting in place)
    o Also: array C[0..k] for auxiliary storage

3
Counting Sort

COUNTING-SORT assumes that each of the input elements is an integer in the range 0 to k, inclusive.

A[1..n] is the input array, B[1..n] is the output array, C[0..k] is a temporary working array.
4
Counting Sort
(Pseudocode figure: the loops over C[0..k] take O(k) time, and the loops over A[1..n] take O(n) time.)

11/21/21 5
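The procedure described above can be written out concretely. This is a minimal Python sketch of COUNTING-SORT, using 0-based lists instead of the slides' 1-based arrays:

```python
def counting_sort(A, k):
    """Sort a list A of integers in the range 0..k (stable, no comparisons).

    A sketch of the COUNTING-SORT procedure above, using 0-based Python
    lists instead of the slides' 1-based arrays A[1..n] and B[1..n].
    """
    n = len(A)
    B = [0] * n                  # output array (not sorted in place)
    C = [0] * (k + 1)            # C[v] = number of occurrences of value v
    for v in A:                  # count each value: O(n)
        C[v] += 1
    for v in range(1, k + 1):    # prefix sums: C[v] = # of elements <= v: O(k)
        C[v] += C[v - 1]
    for v in reversed(A):        # scan right-to-left to keep the sort stable: O(n)
        C[v] -= 1
        B[C[v]] = v              # C[v] now gives v's final 0-based position
    return B

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # [0, 0, 2, 2, 3, 3, 3, 5]
```

Note that only counting and index arithmetic are used; no element is ever compared with another.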
Counting Sort
What will be the running time?
• Total time: O(n + k)
  o Usually, k = O(n)
  o Thus counting sort runs in O(n) time

6
Simulation

7
Simulation

Initial

1st Iteration

2nd Iteration

3rd Iteration

Final Result
8
Counting Sort
• Why don’t we always use counting sort?
  o Because it depends on the range k of the elements

• Could we use counting sort to sort 32-bit integers? Why or why not?
  o Answer: No, k is too large (2^32 = 4,294,967,296)

9
Divide-and-Conquer Technique
• Divide-and-Conquer is a general algorithm design paradigm:
  o Divide the problem into a number of subproblems that are smaller instances of the same problem
  o Conquer the subproblems by solving them recursively
  o Combine the solutions to the subproblems into the solution for the original problem
• The base cases for the recursion are subproblems of constant size
• Analysis can be done using recurrence equations

10
Merge Sort

11
Divide-and-Conquer

12
Merge Sort and Quicksort
• Two well-known sorting algorithms adopt this divide-and-conquer strategy

• Merge sort
  o Divide step is trivial – just split the list into two equal parts.
  o Work is carried out in the conquer step by merging two sorted lists.

• Quick sort
  o Work is carried out in the divide step using a pivot element.
  o Conquer step is trivial.

13
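The quick sort strategy above can be sketched as follows. The Lomuto partition scheme with the last element as pivot is an assumption here; the slide does not fix a particular partitioning scheme:

```python
def quick_sort(A, lo, hi):
    """Quick sort sketch: the work happens in the divide step (partitioning
    around a pivot); the conquer step is trivial. The Lomuto scheme with the
    last element as pivot is an assumption -- the slide fixes no scheme."""
    if lo < hi:
        pivot = A[hi]
        i = lo
        for j in range(lo, hi):          # move elements <= pivot to the left
            if A[j] <= pivot:
                A[i], A[j] = A[j], A[i]
                i += 1
        A[i], A[hi] = A[hi], A[i]        # pivot lands at its final index i
        quick_sort(A, lo, i - 1)         # left part: elements <= pivot
        quick_sort(A, i + 1, hi)         # right part: elements > pivot

nums = [7, 2, 9, 4, 3, 8, 6, 1]
quick_sort(nums, 0, len(nums) - 1)
print(nums)  # [1, 2, 3, 4, 6, 7, 8, 9]
```

After partitioning, the pivot is already in its final position, which is why no combine step is needed.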
Merge Sort

(Figure: the input array A[p..r] is divided at index q into A[p..q] and A[q+1..r].)

14
Merge Sort

(Figure: MERGE copies A[p..q] into L[1..n1] and A[q+1..r] into R[1..n2], places an ∞ sentinel at L[n1+1] and R[n2+1], and merges the two back into A[p..r].)
15
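The MERGE scheme with ∞ sentinels, and the surrounding divide-and-conquer, can be sketched in Python (0-based indices rather than the slides' 1-based arrays):

```python
import math

def merge(A, p, q, r):
    """Merge the sorted runs A[p..q] and A[q+1..r] back into A[p..r],
    using infinity sentinels as in the MERGE figure above."""
    L = A[p:q + 1] + [math.inf]      # left run plus sentinel at the end
    R = A[q + 1:r + 1] + [math.inf]  # right run plus sentinel at the end
    i = j = 0
    for idx in range(p, r + 1):      # r - p + 1 steps in total: O(n)
        if L[i] <= R[j]:
            A[idx] = L[i]; i += 1
        else:
            A[idx] = R[j]; j += 1

def merge_sort(A, p, r):
    """Sort A[p..r]: divide at the midpoint q, conquer recursively, merge."""
    if p < r:
        q = (p + r) // 2
        merge_sort(A, p, q)
        merge_sort(A, q + 1, r)
        merge(A, p, q, r)

data = [1, 6, 3, 7, 9, 2, 8, 4]
merge_sort(data, 0, len(data) - 1)
print(data)  # [1, 2, 3, 4, 6, 7, 8, 9]
```

The sentinels ensure neither run is ever exhausted mid-loop, so no bounds checks are needed inside the merge.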
1 6 3 7 9 2 8 4

16
1 6 3 7 9 2 8 4

1 6 3 7 | 9 2 8 4

17
1 6 3 7 9 2 8 4

1 6 3 7 | 9 2 8 4

1 6 | 3 7 | 9 2 | 8 4

18
1 6 3 7 9 2 8 4

1 6 3 7 | 9 2 8 4

1 6 | 3 7 | 9 2 | 8 4

1 | 6 | 3 | 7 | 9 | 2 | 8 | 4

19
1 6 3 7 9 2 8 4

20
1 6 3 7 9 2 8 4

1 6 3 7 2 9 4 8

21
1 6 3 7 9 2 8 4

1 6 3 7 2 9 4 8

1 3 6 7 2 4 8 9

22
1 6 3 7 9 2 8 4

1 6 3 7 2 9 4 8

1 3 6 7 2 4 8 9

1 2 3 4 6 7 8 9

23
Merge Sort

Divide Step

7 2 9 4 | 3 8 6 1

24
Merge Sort

Recursive Call, Divide

7 2 9 4 | 3 8 6 1

7 2 | 9 4

25
Merge Sort

Recursive Call, Divide

7 2 9 4 | 3 8 6 1

7 2 | 9 4

7 | 2

26
Merge Sort

Recursive Call, Base Case

7 2 9 4 | 3 8 6 1

7 2 | 9 4

7 | 2

27
Merge Sort

Recursive Call, Base Case

7 2 9 4 | 3 8 6 1

7 2 | 9 4

7 | 2

7 -> 7 2 -> 2

28
Merge Sort

Merge

7 2 9 4 | 3 8 6 1

7 2 | 9 4

7 | 2 -> 2 7

7 -> 7 2 -> 2

29
Merge Sort

Recursive Call, Partition, Base Case, Merge

7 2 9 4 | 3 8 6 1

7 2 | 9 4

7 | 2 -> 2 7 9 | 4 -> 4 9

7 -> 7 2 -> 2 9 -> 9 4 -> 4

30
Merge Sort

Merge

7 2 9 4 | 3 8 6 1

7 2 | 9 4 -> 2 4 7 9

7 | 2 -> 2 7 9 | 4 -> 4 9

7 -> 7 2 -> 2 9 -> 9 4 -> 4

31
Merge Sort

Recursive Call, … , Merge, Merge

7 2 9 4 | 3 8 6 1

7 2 | 9 4 -> 2 4 7 9 3 8 | 6 1 -> 1 3 6 8

7 | 2 -> 2 7 9 | 4 -> 4 9 3 | 8 -> 3 8 6 | 1 -> 1 6

7 -> 7 2 -> 2 9 -> 9 4 -> 4 3 -> 3 8 -> 8 6 -> 6 1 -> 1

32
Merge Sort

Merge

7 2 9 4 | 3 8 6 1 -> 1 2 3 4 6 7 8 9

7 2 | 9 4 -> 2 4 7 9 3 8 | 6 1 -> 1 3 6 8

7 | 2 -> 2 7 9 | 4 -> 4 9 3 | 8 -> 3 8 6 | 1 -> 1 6

7 -> 7 2 -> 2 9 -> 9 4 -> 4 3 -> 3 8 -> 8 6 -> 6 1 -> 1

33
Merge Sort – Run Time

(Annotations on the MERGE pseudocode:)

Input size = n = r - p + 1

o For loops at lines 4 and 6: n1 = n2 = n/2, i.e. O(n)
o For loop at line 12: r - p = n - 1 iterations, i.e. O(n)

Total run time of MERGE: T(n) = O(n)
34
Merge Sort

Total Run Time T(n)

(Figure: T(n) splits into two recursive calls of cost T(n/2) each, plus O(n) for the merge: T(n) = 2T(n/2) + O(n).)

35
Time Complexity Analysis

Let T(n) denote its worst-case running time on input instances of size n. Then the running time T(n) satisfies the following recurrence relation.

T(n) ≤ 2T(n / 2) + O(n)


→ T(n) ≤ 2T(n / 2) + cn
Here, c is some fixed constant.

Now, what is the base case for this recurrence?

T(2) ≤ c.
36
Solving Recurrence Relations
Method 1: Unrolling the Recursion Tree
Step 1: Analyze the first few levels:
o At the first level of recursion, we have a single problem of
size n, which takes time at most cn plus the time spent in
all subsequent recursive calls.
o At the next level, we have two problems each of size n/2.
Each of these takes time at most cn/2, for a total of at
most cn, again plus the time in subsequent recursive calls.
o At the third level, we have four problems each of size n/4,
each taking time at most cn/4, for a total of at most cn.
37
Solving Recurrence Relations

Figure: Unrolling the recurrence T(n) ≤ 2T(n/2) + O(n).

38
Solving Recurrence Relations

Step 2: Identifying a pattern:

What’s going on in general?


o At level j of the recursion, the number of subproblems has doubled j times, so there are now a total of 2^j subproblems.
o Each has correspondingly shrunk in size by a factor of two j times, and so each has size n/2^j, and hence each takes time at most cn/2^j.
o Thus level j contributes a total of at most 2^j · (cn/2^j) = cn to the total running time.

39
Solving Recurrence Relations
Step 3: Summing over all levels of recursion:

We know the recursion will ‘bottom out’ when the input size reaches n = 1. Assume the recursion levels down x times; since the input size is halved each time, after x levels it is n/2^x.

n/2^x = 1
→ n = 2^x
→ x = log2 n

Each level contributes at most cn, so:

Total run time = (# of levels) × (contribution per level)
              = log2 n × cn
              = cn log2 n = O(n log n)


40
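The unrolling argument can be sanity-checked numerically by evaluating the recurrence directly for small powers of two. This sketch takes T(1) = c as the base case, an assumption for simplicity (the slides use T(2) ≤ c):

```python
import math

def T(n, c=1):
    """Evaluate T(n) = 2*T(n/2) + c*n directly, for n a power of two.
    Base case T(1) = c is an assumption; the slides use T(2) <= c."""
    if n == 1:
        return c
    return 2 * T(n // 2, c) + c * n

# With this base case, unrolling predicts T(n) = n*log2(n) + n for c = 1.
for n in [2, 4, 8, 16, 1024]:
    assert T(n) == n * math.log2(n) + n
print(T(16))  # 80
```

The extra +n term comes from the base-case level; asymptotically the cn·log2(n) term dominates, matching O(n log n).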
Solving Recurrences by Master Theorem
Method 2: Use the magic formula

Master Theorem:

Let us consider the following general recurrence

T(n) = aT(n/b) + f(n)

where f(n) is a function of the form f(n) = O(n^k).

1. If a > b^k, T(n) = O(n^(log_b a))
2. If a = b^k, T(n) = O(n^(log_b a) · log n)
3. If a < b^k, T(n) = O(n^k)
41
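The three cases can be mechanized in a few lines. This is a sketch, not part of the lecture: the helper name and string output are ours, and the exponent only prints cleanly when log_b(a) is (near-)integral:

```python
import math

def master_theorem(a, b, k):
    """Return the asymptotic bound for T(n) = a*T(n/b) + O(n^k)
    according to the three Master Theorem cases above."""
    log_ba = math.log(a, b)
    if a > b ** k:
        return f"O(n^{log_ba:g})"        # case 1: the leaves dominate
    elif a == b ** k:
        return f"O(n^{log_ba:g} log n)"  # case 2: every level contributes equally
    else:
        return f"O(n^{k})"               # case 3: the root dominates

print(master_theorem(2, 2, 1))  # merge sort: O(n^1 log n) = O(n log n)
```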
Solving Recurrences by Master Theorem
For Merge Sort –

T(n) = 2T(n / 2) + O(n)

a = 2, b = 2, k = 1, so a = b^k

Hence, T(n) = O(n^(log_b a) · log2 n)
            = O(n^(log_2 2) · log2 n)
            = O(n^1 · log n)
            = O(n log n)

42
Thank you!
