Handout Sorting

Sorting Algorithms

Outline

1 Introduction
   Sorting
   Classification of Sorting
   Applications of Sorting
2 Bubble Sort
   Algorithm
   Complexity
   Modified version
3 Selection Sort
   Algorithm
   Complexity
4 Insertion Sort
   Algorithm
   Complexity
5 Merge Sort
   Merging
   Complexity
6 Quicksort
   Algorithm
   Complexity
   Randomized Quick sort
7 Count Sort
   Algorithm
   Complexity
8 Radix Sort
   Algorithm
   Complexity

What is Sorting?

Sorting is an algorithm that arranges the elements of a list in a certain order (either ascending or descending). The output is a permutation (reordering) of the input.
• Let A be a list of n numbers.
• Sorting refers to the operation of rearranging the elements of A so that they are in increasing order.
• Suppose A is: 8, 4, 19, 2, 7, 13, 5, 16
• The sorted list would be: 2, 4, 5, 7, 8, 13, 16, 19
• The above definition refers to arranging numerical data in increasing order. This restriction is only for notational convenience; sorting may equally mean arranging numerical data in decreasing order.

Why is Sorting Necessary?

Sorting is one of the important categories of algorithms in computer science, and a lot of research has gone into this category. Sorting can significantly reduce the complexity of a problem, and is often used in database algorithms and searches.

Classification of Sorting Algorithms

Sorting algorithms are generally categorized based on the following parameters.
• By Number of Comparisons: Sorting algorithms are classified based on the number of comparisons they perform. For comparison-based sorting algorithms, best case behavior is O(n log n) and worst case behavior is O(n²). Comparison-based sorting algorithms evaluate the elements of the list by a key comparison operation and need at least O(n log n) comparisons for most inputs. Non-comparison (linear) sorting algorithms, such as Counting sort, Bucket sort, and Radix sort, impose a few restrictions on the inputs to improve the complexity.
• By Number of Swaps: Sorting algorithms are also categorized by the number of swaps (also called inversions) they perform.
• By Memory Usage: Some sorting algorithms are "in place"; they need only O(1) or O(log n) memory to create auxiliary locations for sorting the data temporarily.
• By Recursion: Sorting algorithms are either recursive (quick sort) or non-recursive (selection sort, insertion sort), and some algorithms use both (merge sort).
• By Stability: A sorting algorithm is stable if, for all indices i and j such that key A[i] equals key A[j], record R[i] preceding record R[j] in the original file implies that R[i] precedes R[j] in the sorted list. Few sorting algorithms maintain the relative order of elements with equal keys (equivalent elements retain their relative positions even after sorting).
• By Adaptability: For a few sorting algorithms, the complexity changes based on pre-sortedness (quick sort): pre-sortedness of the input affects the running time. Algorithms that take this into account are known to be adaptive.
• Other Classifications: Sorting algorithms can also be classified as:
   • Internal Sort: Sort algorithms that use main memory exclusively during the sort are called internal sorting algorithms. This kind of algorithm assumes high-speed random access to all memory.
   • External Sort: Sorting algorithms that use external memory, such as tape or disk, during the sort come under this category.
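The stability property can be illustrated with a short Python sketch (the records and key function here are illustrative, not from the handout; Python's built-in `sorted` is documented to be stable):

```python
# Records share keys; a stable sort keeps equal-keyed records in input order.
records = [("b", 1), ("a", 2), ("b", 3), ("a", 4)]

# Among equal keys, the original relative order is preserved:
# ("a", 2) stays before ("a", 4), and ("b", 1) before ("b", 3).
by_key = sorted(records, key=lambda r: r[0])
print(by_key)  # [('a', 2), ('a', 4), ('b', 1), ('b', 3)]
```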

Applications of Sorting

• Uniqueness testing
• Deleting duplicates
• Prioritizing events
• Frequency counting
• Reconstructing the original order
• Set intersection/union
• Finding a target pair x, y such that x + y = z
• Efficient searching
Bubble Sort

Bubble sort is the simplest sorting algorithm.
It works by iterating over the input array from the first element to the last, comparing each pair of adjacent elements and swapping them if needed.
Bubble sort continues its iterations until no more swaps are needed.
The algorithm gets its name from the way smaller elements "bubble" to the top of the list.

Generally, insertion sort has better performance than bubble sort.
Some researchers suggest that we should not teach bubble sort: despite its simplicity, it has high time complexity.
The only significant advantage that bubble sort has over some other implementations is that it can detect whether the input list is already sorted.

Suppose the list A[1], A[2], ..., A[N] is in memory. Then the bubble sort algorithm works as follows.
1. Step 1. Compare A[1] and A[2] and arrange them in the desired order, so that A[1] < A[2]. Then compare A[2] and A[3] and arrange them so that A[2] < A[3]. Then compare A[3] and A[4] and arrange them so that A[3] < A[4]. Continue until we compare A[N − 1] with A[N] and arrange them so that A[N − 1] < A[N].
   Observe that Step 1 involves N − 1 comparisons. (During Step 1, the largest element is "bubbled up" to the Nth position or "sinks" to the Nth position.) When Step 1 is completed, A[N] will contain the largest element.
2. Step 2. Repeat Step 1 with one fewer comparison; that is, now we stop after we compare and possibly rearrange A[N − 2] and A[N − 1]. (Step 2 involves N − 2 comparisons and, when Step 2 is completed, the second largest element will occupy A[N − 1].)
3. Step 3. Repeat Step 1 with two fewer comparisons; that is, we stop after we compare and possibly rearrange A[N − 3] and A[N − 2].
   ...
4. Step N − 1. Compare A[1] with A[2] and arrange them so that A[1] < A[2].
After N − 1 steps, the list will be sorted in increasing order.

Example. Suppose we want to sort the numbers 32, 51, 27, 85, 66, 23, 13, 57.
• Step 1 proceeds as follows:
   32, 51, 27, 85, 66, 23, 13, 57
   32, 27, 51, 85, 66, 23, 13, 57
   32, 27, 51, 85, 66, 23, 13, 57
   32, 27, 51, 66, 85, 23, 13, 57
   32, 27, 51, 66, 23, 85, 13, 57
   32, 27, 51, 66, 23, 13, 85, 57
   32, 27, 51, 66, 23, 13, 57, 85
• After Step 1 the largest number, 85, has moved to the last position.
• After Step 2:
   27, 32, 51, 23, 13, 57, 66, 85
• After Step N − 1:
   13, 23, 27, 32, 51, 57, 66, 85

Bubble Sort: Algorithm

BUBBLE(DATA, N)
1. Repeat Steps 2 and 3 for K = 1 to N − 1.
2. Set PTR := 1. [Initializes pass pointer PTR.]
3. Repeat while PTR ≤ N − K: [Executes pass.]
   (a) If DATA[PTR] > DATA[PTR+1], then:
           Interchange DATA[PTR] and DATA[PTR+1].
       [End of If structure.]
   (b) Set PTR := PTR + 1.
   [End of inner loop.]
   [End of Step 1 outer loop.]
4. Exit.

Bubble Sort: Complexity

Here Step 1 requires n − 1 comparisons, Step 2 requires n − 2 comparisons, and so on:
f(n) = (n − 1) + (n − 2) + ... + 2 + 1 = n(n − 1)/2 = O(n²)
Hence the complexity is of the order of n².
Performance:
• Worst case complexity: O(n²)
• Best case complexity (improved version): O(n)
• Average case complexity (basic version): O(n²)
• Worst case space complexity: O(1) auxiliary

Modified version

The basic algorithm takes O(n²) time even in the best case. We can improve it by using one extra flag: no swaps during a pass indicates the completion of sorting.
If the list is already sorted, we can use this flag to skip the remaining passes.
This modified version improves the best case of bubble sort to O(n).
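As an illustrative sketch (the function name and in-place style are choices made here, not part of the handout's pseudocode), the modified bubble sort with the early-exit flag might look like this in Python:

```python
def bubble_sort(data):
    """In-place bubble sort with an early-exit flag (modified version)."""
    n = len(data)
    for k in range(n - 1):
        swapped = False
        # One pass: the largest unsorted element bubbles to index n-1-k.
        for i in range(n - 1 - k):
            if data[i] > data[i + 1]:
                data[i], data[i + 1] = data[i + 1], data[i]
                swapped = True
        if not swapped:  # no swaps in this pass: the list is already sorted
            break
    return data

print(bubble_sort([32, 51, 27, 85, 66, 23, 13, 57]))
# [13, 23, 27, 32, 51, 57, 66, 85]
```

On an already-sorted input the first pass performs no swaps, so the loop exits after n − 1 comparisons, giving the O(n) best case.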

Selection Sort

Selection sort is an in-place sorting algorithm.
Selection sort works well for small files.
It is used for sorting files with very large values and small keys, because selection is made based on keys and swaps are made only when required.

Advantages
• Easy to implement
• In-place sort (requires no additional storage space)

Disadvantages
• Doesn't scale well: O(n²)

Suppose an array A with N elements A[1], A[2], ..., A[N] is in memory. The selection sort algorithm for sorting A works as follows.
First find the smallest element in the list and put it in the first position.
Then find the second smallest element in the list and put it in the second position.
And so on.

More precisely:
• Pass 1. Find the location LOC of the smallest among the N elements A[1], A[2], ..., A[N], and then interchange A[LOC] and A[1]. Then A[1] is sorted.
• Pass 2. Find the location LOC of the smallest in the sublist of N − 1 elements A[2], A[3], ..., A[N], and then interchange A[LOC] and A[2]. Then A[1], A[2] is sorted, since A[1] ≤ A[2].
• Pass 3. Find the location LOC of the smallest in the sublist of N − 2 elements A[3], A[4], ..., A[N], and then interchange A[LOC] and A[3]. Then A[1], A[2], A[3] is sorted, since A[2] ≤ A[3].
• ...
• Pass N − 1. Find the location LOC of the smaller of the elements A[N − 1], A[N], and then interchange A[LOC] and A[N − 1]. Then A[1], A[2], ..., A[N] is sorted, since A[N − 1] ≤ A[N].
Thus, A is sorted after N − 1 passes.

Selection Sort: Algorithm

Informally:
1. Find the minimum value in the list.
2. Swap it with the value in the current position.
3. Repeat this process for all the elements until the entire array is sorted.
This algorithm is called selection sort since it repeatedly selects the smallest element.

MIN(A, K, N, LOC): An array A is in memory. This procedure finds the location LOC of the smallest element among A[K], A[K+1], ..., A[N].
1. Set MIN := A[K] and LOC := K. [Initializes pointers.]
2. Repeat for J = K + 1, K + 2, ..., N:
      If MIN > A[J], then: Set MIN := A[J] and LOC := J.
   [End of loop.]
3. Return.

SELECTION(A, N): This algorithm sorts the array A with N elements.
1. Repeat Steps 2 and 3 for K = 1, 2, ..., N − 1:
2. Call MIN(A, K, N, LOC).
3. [Interchange A[K] and A[LOC].]
   Set TEMP := A[K], A[K] := A[LOC] and A[LOC] := TEMP.
   [End of Step 1 loop.]
4. Exit.

Selection Sort: Complexity

Observe that MIN(A, K, N, LOC) requires n − K comparisons. That is, there are n − 1 comparisons during Pass 1 to find the smallest element, n − 2 comparisons during Pass 2 to find the second smallest element, and so on.
f(n) = (n − 1) + (n − 2) + ... + 2 + 1 = n(n − 1)/2 = O(n²)
Hence the complexity is of the order of n².
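A minimal Python sketch of SELECTION with the MIN step inlined (the function name and zero-based indexing are choices made here):

```python
def selection_sort(a):
    """In-place selection sort: repeatedly select the minimum and swap."""
    n = len(a)
    for k in range(n - 1):
        loc = k
        # MIN step: find the location of the smallest element in a[k..n-1].
        for j in range(k + 1, n):
            if a[j] < a[loc]:
                loc = j
        a[k], a[loc] = a[loc], a[k]  # interchange A[K] and A[LOC]
    return a

print(selection_sort([8, 4, 19, 2, 7, 13, 5, 16]))
# [2, 4, 5, 7, 8, 13, 16, 19]
```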

Performance:
• Worst case complexity: O(n²)
• Best case complexity: O(n²)
• Average case complexity: O(n²)
• Worst case space complexity: O(1) auxiliary

Insertion Sort
Insertion sort is a simple and efficient comparison sort.
In this algorithm, each iteration removes an element from the input data and inserts it into the correct position in the list being sorted.
The choice of the element being removed from the input is random, and this process is repeated until all input elements have gone through.

Advantages
• Simple implementation
• Efficient for small data
• Adaptive: if the input list is presorted (perhaps not completely), then insertion sort takes O(n + d) time, where d is the number of inversions. It is practically more efficient than selection and bubble sorts, even though all of them have O(n²) worst case complexity.
• Stable: maintains the relative order of input data if the keys are the same
• In-place: requires only a constant amount, O(1), of additional memory space
• Online: insertion sort can sort the list as it receives it

Suppose an array A with N elements A[1], A[2], ..., A[N] is in memory.
The insertion sort algorithm scans A from A[1] to A[N], inserting each element A[K] into its proper position in the previously sorted subarray A[1], A[2], ..., A[K − 1].
That is:
1. Pass 1. A[1] by itself is trivially sorted.
2. Pass 2. A[2] is inserted either before or after A[1] so that A[1], A[2] is sorted.
3. Pass 3. A[3] is inserted into its proper place in A[1], A[2], that is, before A[1], between A[1] and A[2], or after A[2], so that A[1], A[2], A[3] is sorted.
4. Pass 4. A[4] is inserted into its proper place in A[1], A[2], A[3] so that A[1], A[2], A[3], A[4] is sorted.
5. ...
6. Pass N. A[N] is inserted into its proper place in A[1], A[2], ..., A[N − 1] so that A[1], A[2], ..., A[N] is sorted.

Example. Consider A with 8 elements: 77, 33, 44, 11, 88, 22, 66, 55.
1. In Pass 1, A[1] by itself is trivially sorted:
   77, 33, 44, 11, 88, 22, 66, 55
2. In Pass 2, A[2] is inserted either before or after A[1] so that A[1], A[2] is sorted:
   33, 77, 44, 11, 88, 22, 66, 55
3. In Pass 3, A[3] is inserted so that A[1], A[2], A[3] is sorted:
   33, 44, 77, 11, 88, 22, 66, 55
4. In Pass N, A[N] is inserted so that A[1], A[2], A[3], ..., A[N] is sorted:
   11, 22, 33, 44, 55, 66, 77, 88

Insertion Sort: Algorithm

INSERTION(A, N): This algorithm sorts the array A with N elements.
1. Set A[0] := −∞. [Initializes sentinel element.]
2. Repeat Steps 3 to 5 for K = 2, 3, ..., N:
3. Set TEMP := A[K] and PTR := K − 1.
4. Repeat while TEMP < A[PTR]:
   (a) Set A[PTR+1] := A[PTR]. [Moves element forward.]
   (b) Set PTR := PTR − 1.
   [End of loop.]
5. Set A[PTR+1] := TEMP. [Inserts element in proper place.]
   [End of Step 2 loop.]
6. Return.
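A Python sketch of the same procedure; instead of the −∞ sentinel at A[0], this version uses an explicit boundary check (a choice made here, not in the pseudocode):

```python
def insertion_sort(a):
    """In-place insertion sort; a bounds check replaces the -inf sentinel."""
    for k in range(1, len(a)):
        temp = a[k]
        ptr = k - 1
        # Shift larger elements forward to open the insertion point.
        while ptr >= 0 and temp < a[ptr]:
            a[ptr + 1] = a[ptr]
            ptr -= 1
        a[ptr + 1] = temp  # insert element in its proper place
    return a

print(insertion_sort([77, 33, 44, 11, 88, 22, 66, 55]))
# [11, 22, 33, 44, 55, 66, 77, 88]
```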
Insertion Sort: Complexity

1. The number of comparisons in insertion sort is at most
   1 + 2 + ... + (n − 1) = n(n − 1)/2
2. Hence the time complexity is of the order of n².
3. Hence insertion sort is a very slow algorithm when the value of n is very large.

If every element is greater than or equal to every element to its left, the running time of insertion sort is Θ(n). This situation occurs if the array starts out already sorted, and so an already-sorted array is the best case for insertion sort.

Performance:
• Worst case complexity: Θ(n²)
• Best case complexity: Θ(n)
• Average case complexity: Θ(n²)
• Worst case space complexity: O(n) total, O(1) auxiliary

Merge Sort

Merging

Suppose A is a sorted list with r elements and B is a sorted list with s elements.
The operation that combines the elements of A and B into a single sorted list with r + s elements is called merging.
The lists can be merged as follows:
• At each step, starting from the front, the two arrays' current elements are compared and the smaller one is placed in the combined list.
• This step is repeated until one of the lists becomes empty.
• When one list is empty, the remaining elements of the other list are placed in the combined list.
• Thus merging can be done in linear time, that is, in n = r + s steps.

MERGE

Assume A is sorted with r elements and lower bound LBA, B is sorted with s elements and lower bound LBB, and C has lower bound LBC. Then UBA = LBA + r − 1 and UBB = LBB + s − 1.

MERGE(A, R, LBA, S, LBB, C, LBC)
1. [Initialize.] Set NA := LBA, NB := LBB, PTR := LBC, UBA := LBA + R − 1, UBB := LBB + S − 1.
2. [Compare.] Repeat while NA ≤ UBA and NB ≤ UBB:
   If A[NA] < B[NB], then:
     (a) [Assign element from A to C.] Set C[PTR] := A[NA].
     (b) [Update pointers.] Set PTR := PTR + 1 and NA := NA + 1.
   Else:
     (a) [Assign element from B to C.] Set C[PTR] := B[NB].
     (b) [Update pointers.] Set PTR := PTR + 1 and NB := NB + 1.
   [End of If structure.]
   [End of loop.]
3. [Assign remaining elements to C.]
   If NA > UBA, then:
     Repeat for K = 0, 1, 2, ..., UBB − NB:
       Set C[PTR+K] := B[NB+K].
     [End of loop.]
   Else:
     Repeat for K = 0, 1, 2, ..., UBA − NA:
       Set C[PTR+K] := A[NA+K].
     [End of loop.]
   [End of If structure.]
4. Return.
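The merging procedure above can be sketched in Python as follows (zero-based indices and a result list in place of the output array C are choices made here):

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(r + s) time."""
    c = []
    na, nb = 0, 0
    # Compare the front elements and take the smaller one each step.
    while na < len(a) and nb < len(b):
        if a[na] < b[nb]:
            c.append(a[na])
            na += 1
        else:
            c.append(b[nb])
            nb += 1
    # One list is exhausted; copy the remainder of the other.
    c.extend(a[na:])
    c.extend(b[nb:])
    return c

print(merge([11, 33, 66], [22, 44]))
# [11, 22, 33, 44, 66]
```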
Binary Search and Insertion Algorithm

Suppose the number r of elements in a sorted array A is much smaller than the number s of elements in a sorted array B.
One can merge A with B as follows. For each element A[K] of A, use a binary search on B to find the proper location to insert A[K] into B. Each such search requires at most log s comparisons; hence this binary search and insertion algorithm to merge A and B requires at most r log s comparisons. We emphasize that this algorithm is more efficient than the usual merging algorithm only when r ≪ s, that is, when r is much less than s.

The binary search and insertion algorithm does not take into account the fact that A is sorted. Accordingly, the algorithm may be improved in two ways as follows. (Here we assume that A has 5 elements and B has 100 elements.)
Reducing the target set: Suppose after the first search we find that A[1] is to be inserted after B[16]. Then we need only use a binary search on B[17], ..., B[100] to find the proper location to insert A[2]. And so on.
Tabbing: The expected location for inserting A[1] in B is near B[20] (that is, B[s/r]), not near B[50]. Hence we first use a linear search on B[20], B[40], B[60], B[80] and B[100] to find B[K] such that A[1] ≤ B[K], and then we use a binary search on B[K − 20], B[K − 19], ..., B[K].
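A Python sketch of the binary search and insertion idea, using the standard-library `bisect` module; the `lo` argument implements the "reducing the target set" refinement (the function name is a choice made here):

```python
import bisect

def merge_small_into_big(a, b):
    """Merge short sorted list a into long sorted list b via binary searches.

    Since a is sorted, the search for each a[k+1] can start at the position
    where a[k] was inserted (the "reducing the target set" refinement).
    """
    out = list(b)
    lo = 0
    for x in a:
        lo = bisect.bisect_left(out, x, lo)  # at most log s comparisons
        out.insert(lo, x)
    return out

print(merge_small_into_big([2, 5], [1, 3, 4, 6]))
# [1, 2, 3, 4, 5, 6]
```

Note that only the comparison count matches the r log s bound: `list.insert` itself shifts O(s) elements, so the overall Python running time is not r log s.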

Merge sort: Example

Suppose array A has 14 elements:
(66, 33, 40, 22, 55, 88, 60, 11, 80, 20, 50, 44, 77, 30)
1. Merge each pair of elements to obtain the following sorted pairs:
   (33, 66) (22, 40) (55, 88) (11, 60) (20, 80) (44, 50) (30, 77)
2. Merge each pair of pairs to obtain the following list of quadruplets:
   (22, 33, 40, 66) (11, 55, 60, 88) (20, 44, 50, 80) (30, 77)
3. Merge each pair of sorted quadruplets to obtain the following two sorted subarrays:
   (11, 22, 33, 40, 55, 60, 66, 88) (20, 30, 44, 50, 77, 80)
4. Merge the two sorted subarrays to obtain the single sorted array:
   (11, 20, 22, 30, 33, 40, 44, 50, 55, 60, 66, 77, 80, 88)
5. The original array is now sorted.
6. The algorithm requires at most log₂ n steps to sort an n-element array.
MERGEPASS

The procedure merges pairs of subarrays of A and assigns them to B. The N-element array A is composed of sorted subarrays, where each subarray has L elements except possibly the last subarray, which may have fewer than L elements.

MERGEPASS(A, N, L, B)
1. Set Q := INT(N/(2*L)), S := 2*L*Q and R := N − S.
2. Repeat for J = 1, 2, ..., Q:
   (a) Set LB := 1 + (2*J − 2)*L.
   (b) Call MERGE(A, L, LB, A, L, LB+L, B, LB).
   [End of loop.]
3. [Only one subarray left?]
   If R ≤ L, then:
     Repeat for J = 1, 2, ..., R:
       Set B[S+J] := A[S+J].
     [End of loop.]
   Else:
     Call MERGE(A, L, S+1, A, R, L+S+1, B, S+1).
   [End of If structure.]
4. Return.

MERGESORT

MERGESORT(A, N): This algorithm sorts the N-element array A using an auxiliary array B.
1. Set L := 1.
2. Repeat Steps 3 to 5 while L < N:
3. Call MERGEPASS(A, N, L, B).
4. Call MERGEPASS(B, N, 2*L, A).
5. Set L := 4*L.
   [End of Step 2 loop.]
6. Exit.

Important Notes

Merge sort is an example of the divide and conquer strategy.
Merging is the process of combining two sorted files to make one bigger sorted file.
Selection is the process of dividing a file into two parts: the k smallest elements and the n − k largest elements.
Selection and merging are opposite operations:
• selection splits a list into two lists
• merging joins two files to make one file
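The MERGEPASS/MERGESORT pair amounts to a bottom-up merge sort. A self-contained Python sketch (the run width doubling plays the role of L; names and structure are choices made here):

```python
def merge_sort(a):
    """Bottom-up merge sort: merge runs of width L = 1, 2, 4, ... per pass."""
    a = list(a)
    n = len(a)
    width = 1
    while width < n:
        b = []
        # One MERGEPASS: merge adjacent runs a[i:i+width] and a[i+width:i+2*width].
        for i in range(0, n, 2 * width):
            left = a[i:i + width]
            right = a[i + width:i + 2 * width]
            merged, li, ri = [], 0, 0
            # Standard linear merge of the two runs.
            while li < len(left) and ri < len(right):
                if left[li] <= right[ri]:
                    merged.append(left[li])
                    li += 1
                else:
                    merged.append(right[ri])
                    ri += 1
            merged.extend(left[li:])
            merged.extend(right[ri:])
            b.extend(merged)
        a = b
        width *= 2
    return a

print(merge_sort([66, 33, 40, 22, 55, 88, 60, 11, 80, 20, 50, 44, 77, 30]))
# [11, 20, 22, 30, 33, 40, 44, 50, 55, 60, 66, 77, 80, 88]
```

Using `<=` in the run merge keeps equal elements in their original order, which is what makes merge sort stable.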
Merge sort is Quick sort's complement.
Merge sort accesses the data in a sequential manner.
This algorithm is used for sorting a linked list.
Merge sort is insensitive to the initial order of its input.
In Quick sort most of the work is done before the recursive calls: Quick sort starts with the largest subfile and finishes with the small ones, and as a result it needs a stack; moreover, that algorithm is not stable. Merge sort divides the list into two parts, then each part is conquered individually. Merge sort starts with the small subfiles and finishes with the largest one, so it doesn't need a stack, and the algorithm is stable.

Merge Sort: Complexity

The algorithm requires at most log₂ n passes.
Each pass merges n elements, that is, at most n comparisons per pass.
Hence the time complexity is at most n log₂ n.

Performance:
• Worst case complexity: Θ(n log n)
• Best case complexity: Θ(n log n)
• Average case complexity: Θ(n log n)
• Worst case space complexity: Θ(n) auxiliary

Quicksort
Quicksort is an application of stacks.
Quicksort is an algorithm of the divide-and-conquer type. That is, the problem of sorting a set is reduced to the problem of sorting two smaller sets.
It picks an element as pivot and partitions the given array around the picked pivot.
The reduction step of the quicksort algorithm finds the final position of the pivot.

Example. Suppose A is the following list of 12 numbers (the list itself appears as a figure in the original slides):
The reduction step of the quicksort algorithm finds the final position of one of the numbers; here we use the first number, 44.
Beginning with the last number, 66, scan the list from right to left, comparing each number with 44 and stopping at the first number less than 44. The number is 22. Interchange 44 and 22 to obtain the list.
Beginning with 22, next scan the list in the opposite direction, from left to right, comparing each number with 44 and stopping at the first number greater than 44. The number is 55. Interchange 44 and 55 to obtain the list.
(Observe that the numbers 22, 33 and 11 to the left of 44 are each less than 44.)
Beginning this time with 55, now scan the list in the original direction, from right to left, until meeting the first number less than 44. It is 40. Interchange 44 and 40 to obtain the list.
Beginning with 40, scan the list from left to right. The first number greater than 44 is 77. Interchange 44 and 77 to obtain the list.
Beginning with 77, scan the list from right to left seeking a number less than 44. We do not meet such a number before meeting 44. This means all numbers have been scanned and compared with 44. Furthermore, all numbers less than 44 now form the sublist of numbers to the left of 44, and all numbers greater than 44 now form the sublist of numbers to the right of 44.
Thus 44 is correctly placed in its final position, and the task of sorting the original list A has now been reduced to the task of sorting each of the two sublists.

The above reduction step is repeated with each sublist containing 2 or more elements.

Quicksort: Algorithm

Here A is an array with N elements. Parameters BEG and END contain the boundary values of the sublist of A to which this procedure applies. LOC keeps track of the position of the first element A[BEG] of the sublist during the procedure. The local variables LEFT and RIGHT contain the boundary values of the list of elements that have not been scanned.
QUICK(A, N, BEG, END, LOC)
1. [Initialize.] Set LEFT := BEG, RIGHT := END and LOC := BEG.
2. [Scan from right to left.]
   (a) Repeat while A[LOC] ≤ A[RIGHT] and LOC ≠ RIGHT:
          RIGHT := RIGHT − 1.
       [End of loop.]
   (b) If LOC = RIGHT, then: Return.
   (c) If A[LOC] > A[RIGHT], then:
       (i) [Interchange A[LOC] and A[RIGHT].]
           TEMP := A[LOC], A[LOC] := A[RIGHT], A[RIGHT] := TEMP.
       (ii) Set LOC := RIGHT.
       (iii) Go to Step 3.
       [End of If structure.]
3. [Scan from left to right.]
   (a) Repeat while A[LEFT] ≤ A[LOC] and LEFT ≠ LOC:
          LEFT := LEFT + 1.
       [End of loop.]
   (b) If LOC = LEFT, then: Return.
   (c) If A[LEFT] > A[LOC], then:
       (i) [Interchange A[LEFT] and A[LOC].]
           TEMP := A[LOC], A[LOC] := A[LEFT], A[LEFT] := TEMP.
       (ii) Set LOC := LEFT.
       (iii) Go to Step 2.
       [End of If structure.]

Algorithm: This algorithm sorts an array A with N elements.
1. [Initialize.] TOP := NULL.
2. [Push boundary values of A onto stacks when A has 2 or more elements.]
   If N > 1, then: TOP := TOP + 1, LOWER[1] := 1, UPPER[1] := N.
3. Repeat Steps 4 to 7 while TOP ≠ NULL.
4. [Pop sublist from stacks.]
   Set BEG := LOWER[TOP], END := UPPER[TOP], TOP := TOP − 1.
5. Call QUICK(A, N, BEG, END, LOC).
6. [Push left sublist onto stacks when it has 2 or more elements.]
   If BEG < LOC − 1, then:
     TOP := TOP + 1, LOWER[TOP] := BEG, UPPER[TOP] := LOC − 1.
   [End of If structure.]
7. [Push right sublist onto stacks when it has 2 or more elements.]
   If LOC + 1 < END, then:
     TOP := TOP + 1, LOWER[TOP] := LOC + 1, UPPER[TOP] := END.
   [End of If structure.]
   [End of Step 3 loop.]
8. Exit.

Quicksort: Complexity

The quick sort algorithm has a worst-case running time of order n²/2, but an average-case running time of order n log n.
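A Python sketch of the QUICK partition and the stack-driven driver above (zero-based indices; a Python list plays the role of the LOWER/UPPER stacks; function names are choices made here):

```python
def quick(a, beg, end):
    """Partition a[beg..end] around a[beg] by two-directional scans;
    return the pivot's final position (a sketch of the QUICK procedure)."""
    left, right, loc = beg, end, beg
    while True:
        # Scan from right to left for an element smaller than the pivot.
        while a[loc] <= a[right] and loc != right:
            right -= 1
        if loc == right:
            return loc
        a[loc], a[right] = a[right], a[loc]
        loc = right
        # Scan from left to right for an element larger than the pivot.
        while a[left] <= a[loc] and left != loc:
            left += 1
        if loc == left:
            return loc
        a[loc], a[left] = a[left], a[loc]
        loc = left

def quicksort(a):
    """Iterative quicksort driving quick() with an explicit stack of sublists."""
    stack = [(0, len(a) - 1)] if len(a) > 1 else []
    while stack:
        beg, end = stack.pop()
        loc = quick(a, beg, end)
        if beg < loc - 1:          # left sublist has 2 or more elements
            stack.append((beg, loc - 1))
        if loc + 1 < end:          # right sublist has 2 or more elements
            stack.append((loc + 1, end))
    return a

print(quicksort([32, 51, 27, 85, 66, 23, 13, 57]))
# [13, 23, 27, 32, 51, 57, 66, 85]
```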
Randomized Quick sort

In the average-case analysis of Quick sort, we assume that all permutations of the input numbers are equally likely. However, we cannot always expect this to hold. We can add randomization to an algorithm in order to reduce the probability of hitting the worst case in Quick sort.
There are two ways of adding randomization in Quick sort: either randomly permuting the input data in the array, or randomly choosing an element of the input data as the pivot. The second choice is easier to analyze and implement; the change only needs to be made in the partition algorithm.
In normal Quick sort, the pivot element was always the leftmost element of the list to be sorted. Instead of always using A[low] as pivot, the randomized version of Quick sort uses a randomly chosen element from the subarray A[low ... high]. It is done by exchanging element A[low] with an element chosen at random from A[low ... high]. This ensures that the pivot element is equally likely to be any of the high − low + 1 elements in the subarray.
Since the pivot element is randomly chosen, we can expect the split of the input array to be reasonably well balanced on average. This can help prevent the worst-case behavior of quick sort, which occurs with unbalanced partitioning.
Even though the randomized version makes the worst case unlikely, its worst case complexity is still O(n²).
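The pivot randomization step can be sketched as follows (the function name is illustrative; `random.randint` has inclusive bounds, matching the high − low + 1 choices):

```python
import random

def randomize_pivot(a, low, high):
    """Exchange a[low] with a randomly chosen element of a[low..high], so a
    partition routine that pivots on a[low] sees a uniformly random pivot."""
    r = random.randint(low, high)  # each of the high-low+1 positions equally likely
    a[low], a[r] = a[r], a[low]
    return a
```

Calling this immediately before the partition step turns a fixed-pivot quicksort into the randomized version; the array's contents are only permuted, never changed.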
One way to improve Randomized Quick sort is to choose the pivot for partitioning more carefully than by picking a random element from the array. One common approach is to choose the pivot as the median of a set of 3 elements randomly selected from the array.
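A sketch of this median-of-three pivot selection (sampling with replacement here, a simplification; the function name is a choice made for illustration):

```python
import random

def median_of_three_pivot(a, low, high):
    """Pick 3 random indices in [low, high] and return the index holding
    the median of the three sampled values."""
    i, j, k = (random.randint(low, high) for _ in range(3))
    # Order the sampled (value, index) pairs and take the middle one;
    # the index breaks ties when the same position is sampled twice.
    _, mid_index = sorted((a[x], x) for x in (i, j, k))[1]
    return mid_index
```

The returned index can then be exchanged with A[low] before partitioning, exactly as in the plain randomized version.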
Count Sort

Counting sort is not a comparison sort algorithm and gives O(n) complexity for sorting.
To achieve O(n) complexity, counting sort assumes that each of the elements is an integer in the range 1 to K, for some integer K.
The counting sort runs in O(n) time.
The basic idea of counting sort is to determine, for each input element X, the number of elements less than X. This information can be used to place X directly into its correct position.
Count sort II

Assume A[1 · · · n] is the input array with length n. In Counting sort we need two more arrays: array B[1 · · · n] contains the sorted output and array C[0 · · · K − 1] provides temporary storage.
Algorithm I

Complexity I

Total Complexity: O(K) + O(n) + O(K) + O(n) = O(n) if K = O(n).
Space Complexity: O(n) if K = O(n).
Note: Counting sort works well if K = O(n). Otherwise, the complexity will be greater.
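The counting sort procedure described above — count occurrences into C, take prefix sums, then place each element of A into its final position in B — can be sketched in Python as follows. This is a minimal sketch assuming integer keys in the range 1 to K.

```python
def counting_sort(A, K):
    # Stably sort A (integers in 1..K) and return the output array B.
    n = len(A)
    C = [0] * (K + 1)

    # O(n): count the occurrences of each key value.
    for x in A:
        C[x] += 1

    # O(K): prefix sums, so C[v] = number of elements <= v.
    for v in range(1, K + 1):
        C[v] += C[v - 1]

    # O(n): place each element directly at its final position.
    # Scanning right-to-left keeps equal keys in their input order,
    # which makes the sort stable.
    B = [0] * n
    for x in reversed(A):
        B[C[x] - 1] = x
        C[x] -= 1
    return B
```

The four loops account for the O(K) + O(n) + O(K) + O(n) terms in the total complexity above.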
Important Notes I

Counting sort is efficient if the range of input data is not significantly greater than the number of objects to be sorted. Consider the situation where the input sequence is in the range 1 to 10K but the data is only 10, 5, 10K, 5K.
It is not a comparison based sort. Its running time complexity is O(n), with space proportional to the range of the data.
It is often used as a sub-routine to another sorting algorithm, like radix sort.
Counting sort uses a partial hashing to count the occurrence of each data object in O(1).
Counting sort can be extended to work for negative inputs also.
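The extension to negative inputs mentioned above can be done by shifting every key by the minimum value so all indices into the count array become non-negative. A small illustrative sketch (the function name is hypothetical, and a non-empty input is assumed):

```python
def counting_sort_with_negatives(A):
    # Shift keys by min(A) so they index a count array starting at 0.
    lo, hi = min(A), max(A)
    C = [0] * (hi - lo + 1)
    for x in A:
        C[x - lo] += 1
    # Emit each value as many times as it was counted.
    out = []
    for v, count in enumerate(C):
        out.extend([v + lo] * count)
    return out
```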
Radix Sort

Radix sort I

Similar to Counting sort and Bucket sort, this sorting algorithm also assumes some kind of information about the input elements.
Suppose that the input values to be sorted are d-digit numbers.
In Radix sort, first sort the elements based on the last digit [the least significant digit].
These results are again sorted by the second digit [the next to least significant digit].
Continue this process for all digits until we reach the most significant digit.
Use some stable sort to sort them by the last digit.
Radix sort II

Then stable sort them by the second least significant digit, then by the third, etc.
If we use Counting sort as the stable sort, the total time is O(nd) ≈ O(n) when d is small.
Suppose 9 elements as follows: 348, 143, 361, 423, 538, 128, 321, 543, 366.
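Applied to the 9 elements above, an LSD Radix sort can be sketched as follows. This is an illustrative Python sketch assuming base-10 keys; the per-digit bucket lists play the role of the stable counting sort.

```python
def radix_sort(A, d):
    # LSD radix sort: d passes, one per decimal digit, least
    # significant digit first. Each pass must be stable.
    for p in range(d):
        # Distribute elements into 10 buckets by the p-th digit.
        # Appending in input order keeps each pass stable.
        buckets = [[] for _ in range(10)]
        for x in A:
            buckets[(x // 10 ** p) % 10].append(x)
        # Collect the buckets back into a single list.
        A = [x for b in buckets for x in b]
    return A
```

After the first pass the list is ordered by units digit, after the second by tens-then-units, and after the third pass the whole list is sorted.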
Radix sort IV, Radix sort V: [figures showing the pass-by-pass sorting of the example]
Algorithm I

1. Take the least significant digit of each element.
2. Sort the list of elements based on that digit, but keep the order of elements with the same digit (this is the definition of a stable sort).
3. Repeat the sort with each more significant digit.

Complexity I

Suppose a list A of n items A1, A2, · · · , An is given.
Let d denote the radix (e.g., d = 10 for decimal digits, d = 26 for letters and d = 2 for bits), and suppose each item Ai is represented by means of s of the digits: Ai = di1 di2 · · · dis.
The radix sort algorithm will require s passes, the number of digits in each item. Pass K will compare each diK with each of the d digits. Hence the number C(n) of comparisons for the algorithm is bounded as follows: C(n) ≤ d · s · n.
Although d is independent of n, the number s does depend on n. In the worst case, s = n, so C(n) = O(n²).
Complexity II

In the best case, s = log_d n, so C(n) = O(n log n).
In other words, radix sort performs well only when the number s of digits in the representation of the Ai's is small.
Another drawback of radix sort is that one may need d · n memory locations. This comes from the fact that all the items may be "sent to the same pocket" during a given pass.
This drawback may be minimized by using linked lists rather than arrays to store the items during a given pass. However, one will still require 2 · n memory locations.

Important Notes I

The speed of Radix sort depends on the inner basic operations. If the operations are not efficient enough, Radix sort can be slower than other algorithms such as Quick sort and Merge sort.
These operations include the insert and delete functions of the sub-lists and the process of isolating the digit we want.
If the numbers are not of equal length, then a test is needed to check for additional digits that need sorting. This can be one of the slowest parts of Radix sort and also one of the hardest to make efficient.
Since Radix sort depends on the digits or letters, it is less flexible than other sorts.
Important Notes II

For every different type of data, Radix sort needs to be rewritten, and if the sorting order changes, the sort needs to be rewritten again.
In short, Radix sort takes more time to write, and it is very difficult to write a general purpose Radix sort that can handle all kinds of data.
For many programs that need a fast sort, Radix sort is a good choice. Still, there are faster sorts, which is one reason why Radix sort is not used as much as some other sorts.
Time Complexity: O(nd) ≈ O(n), if d is small.