Analysis and Comparative of Sorting Algorithms

Published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3 | Issue-5, August 2019. PDF URL: http://www.ijtsrd.com/papers/ijtsrd26575.pdf. Paper URL: https://www.ijtsrd.com/computer-science/programming-language/26575/analysis-and-comparative-of-sorting-algorithms/htwe-htwe-aung


International Journal of Trend in Scientific Research and Development (IJTSRD)

Volume 3 Issue 5, August 2019 Available Online: www.ijtsrd.com e-ISSN: 2456 – 6470

Analysis and Comparative of Sorting Algorithms


Htwe Htwe Aung
Lecturer, Faculty of Computer Science, University of Computer Studies, Pathein, Myanmar

ABSTRACT
There are many popular problems in different practical fields of computer science, computer networks, database applications and artificial intelligence. One of these basic operations and problems is sorting. Sorting is also a fundamental problem from the point of view of algorithm analysis and design, and many computer scientists have worked extensively on sorting algorithms. Sorting is a key data structure operation, which makes arranging, searching, and finding information easy. Sorting of elements is an important task in computation and is used frequently in different processes. To accomplish the task in a reasonable amount of time, an efficient algorithm is needed, and different types of sorting algorithms have been devised for the purpose. Which sorting algorithm is best suited can only be decided by comparing the available algorithms in different respects. In this paper, a comparison is made between different sorting algorithms used in computation.

KEYWORDS: Bubble Sort, Selection Sort, Insertion Sort, Merge Sort, Quick Sort, Heap Sort, Time Complexity, Stability

How to cite this paper: Htwe Htwe Aung, "Analysis and Comparative of Sorting Algorithms", published in International Journal of Trend in Scientific Research and Development (ijtsrd), ISSN: 2456-6470, Volume-3 | Issue-5, August 2019, pp.1049-1053, https://doi.org/10.31142/ijtsrd26575 (Unique Paper ID: IJTSRD26575)

Copyright © 2019 by author(s) and International Journal of Trend in Scientific Research and Development Journal. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0) (http://creativecommons.org/licenses/by/4.0)

1. INTRODUCTION
Sorting is the process of rearranging a list of elements into a correct order, since handling elements in a certain order is more efficient than handling randomized elements [15].
Sorting is among the most common programming processes. Take database applications as an example: to maintain information and ease retrieval, you must keep the information in a sensible order, for example, alphabetical order, ascending or descending order, or order according to names, ids, years, departments, etc. Each sorting algorithm uses its own technique in execution, and a single problem can often be solved by more than one algorithm. Here the sorting algorithms are compared based on their best-case, average-case and worst-case efficiency with respect to the number n of elements.

The rapid growth of information in our world leads to increased development of sorting algorithms, whether by enhancing the performance and decreasing the complexity of current algorithms or by producing new algorithms that in turn help optimize others. A large number of algorithms have been developed to improve sorting, such as merge sort, bubble sort, insertion sort, quick sort, selection sort and others; each has a different mechanism for reordering elements, which increases the performance and efficiency of practical applications and reduces the time complexity of each one.

When comparing various sorting algorithms, several factors must be considered, such as time complexity, space complexity and stability. The time complexity of an algorithm determines the amount of time the algorithm takes to run [3][13][14]. Sorting algorithms are compared to one another according to the size of the data, efficiency and speed. The time complexity of an algorithm is generally written in big O notation, where O represents the complexity of the algorithm and n represents the number of elementary operations performed by the algorithm [19]. For typical sorting algorithms, the best behavior is O(n log n) and the worst behavior is O(n²).

Regarding space complexity, an algorithm that uses recursive techniques needs more copies of the sorting data, which affects memory space [1]. Some algorithms are recursive, others are non-recursive, and some may be both (e.g., merge sort). Much previous research has aimed to enhance sorting algorithms to save memory and improve efficiency, mostly by comparing an older algorithm against a newer one. In particular, some sorting algorithms are "in place": they need only O(1) memory beyond the items being sorted, and they do not need to create auxiliary locations where data is temporarily stored, as other sorting algorithms do.

The stability of an algorithm means that elements with equal values keep the same relative order in the output as they had in the input [20]. Some sorting algorithms are stable by nature, such as bubble sort, insertion sort and merge sort, while others are not, such as selection sort, quick sort and heap sort. Any given sorting algorithm that is not stable can be modified to be stable [13]. Stable sorting algorithms maintain the relative order of records with equal keys (i.e., values).

2. SORTING ALGORITHMS
Sorting algorithms are an important part of managing data. Most sorting algorithms work by comparing the data being sorted. In some cases, it may be desirable to sort a large volume of data based on only a portion of that data.
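Sorting records by one field of the data, together with the stability property discussed above, can be demonstrated with Python's built-in sort, which is stable by specification (the sample records below are invented for illustration):

```python
# Hypothetical records: (name, department); the department is the sort key.
records = [("Aye", "CS"), ("Mya", "EE"), ("Hla", "CS"), ("Zaw", "EE")]

# sorted() is a stable sort: records with equal keys keep their input order,
# so "Aye" stays before "Hla" and "Mya" stays before "Zaw".
by_dept = sorted(records, key=lambda r: r[1])

assert by_dept == [("Aye", "CS"), ("Hla", "CS"), ("Mya", "EE"), ("Zaw", "EE")]
```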

@ IJTSRD | Unique Paper ID – IJTSRD26575 | Volume – 3 | Issue – 5 | July - August 2019 Page 1049
The piece of data actually used to determine the sorted order is called the key. Sorting algorithms are usually judged by their efficiency [15]. Sorting is the process of arranging data in a specific order, which benefits searching for and locating information in an easy and efficient way. Sorting algorithms are developed to arrange data in various ways; for instance, an array of integers may be sorted from lowest to highest or from highest to lowest, and an array of string elements may be sorted in alphabetical order.

This paper describes a comparative study of bubble sort, selection sort, insertion sort, merge sort, quick sort and heap sort. It compares the six algorithms on their best-case, average-case and worst-case time complexity and also discusses their stability. It is a machine-independent analysis, which is a good approach.

Bubble Sort
Bubble sort is a simple sorting algorithm. Let A be a list of N numbers. Sorting A refers to the operation of rearranging the elements of A so that they are in increasing order, i.e., so that A[1] < A[2] < A[3] < ... < A[N].

The steps in the bubble sort algorithm work as follows:
• Compare A[1] and A[2]; if A[1] > A[2], swap them.
• Continue doing this for each pair of adjacent elements until the right end is reached.
• After N-1 comparisons, and swaps where needed, A[N] will contain the largest element.
• Start again with the first two elements, repeating until no swaps occur on the last pass [1][5][18].

Pseudo-code:
func bubblesort(var a as array)
  for j from 2 to N
    swaps = 0
    for k from 0 to N-2
      if (a[k] > a[k+1])
        swap(a[k], a[k+1])
        swaps = swaps + 1
    if swaps = 0
      break
end func

Given an already-sorted array as input, or one in which almost all elements are in the proper place, bubble sort has O(n) best-case performance, since it passes over the items only once; its worst-case and average-case performance is O(n²), because it may require up to N-1 passes through the data. Bubble sort has to perform a large number of comparisons when there are more elements in the list, and this grows as the number of items to be sorted increases. Although bubble sort is quite simple and easy to implement, it is inefficient. It is an in-place sorting algorithm, and it can be implemented as a stable sort.

Selection Sort
Selection sort is the simplest of sorting techniques. The main idea of the selection sort algorithm is given by:
• Find the smallest element in the data list.
• Put this element at the first position of the list.
• Find the next smallest element in the list.
• Place it at the second position of the list, and continue until the whole list of data items is sorted [11].

Pseudo-code:
for j ← 1 to n-1
  smallest ← j
  for k ← j+1 to n
    if (a[k] < a[smallest])
      smallest ← k
  exchange a[j] and a[smallest]
end func

Selection sort works very well for small files, and it has a quite important application because each item is actually moved at most once [17]. Its best-case and worst-case time complexity are both O(n²), making it inefficient on large lists. Selection sort has one advantage over other sort techniques [16][2]: although it does many comparisons, it does a reduced number of swaps. That means that if the input data has small keys but a large data area, then selection sort may be the quickest [19]. Selection sort is an in-place sorting algorithm, and it cannot be implemented as a stable sort.

Insertion Sort
Insertion sort is based on the idea that one element from the input is consumed in each iteration to find its correct position, i.e., the position to which it belongs in the sorted array. Insertion sort works as below:
• It compares the current element with the largest value in the sorted array.
• If the current element is greater, it leaves the element in its place and moves on to the next element; otherwise it finds its correct position in the sorted array and moves it to that position.
• This is done by shifting all the elements in the sorted array that are larger than the current element one position toward the front [10].

Pseudo-code:
for k ← 1 to n-1
  key ← A[k]
  i ← k
  while i > 0 and A[i-1] > key
    A[i] ← A[i-1]
    i ← i-1
  A[i] ← key
end func

Regarding the comparisons and copies this algorithm requires: on the first pass it compares a maximum of one item; on the second pass a maximum of two items, and so on, up to a maximum of N-1 comparisons on the last pass. This is
1 + 2 + 3 + … + N-1 = N*(N-1)/2
However, because on each pass only about half of the maximum number of items are actually compared on average before the insertion point is found, we can divide by 2, which gives N*(N-1)/4.

The number of copies is approximately the same as the number of comparisons. However, a copy is not as time-consuming as a swap, so for random data this algorithm runs faster than bubble sort and selection sort. Insertion sort runs in O(n²) time for random data in the worst and average cases, with a best case of O(n).
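The three quadratic sorts described above can be transcribed into runnable Python (a sketch that follows the pseudocode, with 0-based indices made concrete):

```python
def bubble_sort(a):
    n = len(a)
    for _ in range(n - 1):
        swapped = False
        for k in range(n - 1):          # compare each adjacent pair
            if a[k] > a[k + 1]:
                a[k], a[k + 1] = a[k + 1], a[k]
                swapped = True
        if not swapped:                 # early exit gives the O(n) best case
            break
    return a

def selection_sort(a):
    n = len(a)
    for j in range(n - 1):
        smallest = j
        for k in range(j + 1, n):       # find the minimum of the unsorted part
            if a[k] < a[smallest]:
                smallest = k
        a[j], a[smallest] = a[smallest], a[j]   # at most one swap per pass
    return a

def insertion_sort(a):
    for k in range(1, len(a)):
        key, i = a[k], k
        while i > 0 and a[i - 1] > key: # shift larger elements one step right
            a[i] = a[i - 1]
            i -= 1
        a[i] = key
    return a

for sort in (bubble_sort, selection_sort, insertion_sort):
    assert sort([5, 1, 4, 2, 8]) == [1, 2, 4, 5, 8]
    assert sort([]) == []
```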

The best case arises when the array is already sorted. Insertion sort is much less efficient on large lists than more advanced algorithms such as quicksort and merge sort. However, it provides several advantages: simple implementation and efficiency for small data sets [6][4]. It is an in-place sorting algorithm and a stable sort.

Merge Sort
Merge sort is based on the divide and conquer strategy, a popular problem-solving technique. The merge sort algorithm works as under:
• Split array A(x1, x2, x3, …, xn) from the middle into two parts of length n/2 (x1, x2, x3, …, xn/2 and x(n/2)+1, …, xn).
• Sort each part by calling the sort algorithm recursively.
• Merge the two sorted parts into a single sorted list [21].

Pseudo-code:
MERGE-SORT(A, left, right)
  if left < right
    mid ← left + (right - left)/2
    MERGE-SORT(A, left, mid)
    MERGE-SORT(A, mid+1, right)
    MERGE(A, left, mid, right)
end func

MERGE(A, left, mid, right)
  j ← 0
  l ← left
  h ← mid + 1
  n ← right - left + 1
  while (l <= mid and h <= right)
    if (A[l] < A[h])
      aux[j++] ← A[l++]
    else
      aux[j++] ← A[h++]
  while (l <= mid)
    aux[j++] ← A[l++]
  while (h <= right)
    aux[j++] ← A[h++]
  for (j = 0; j < n; j++)
    A[left + j] ← aux[j]
end func

Merge sort can be easily applied to lists as well as arrays, because it needs only sequential access rather than random access. It can handle very large lists, since its worst-case, best-case and average-case running times are all O(n log n). The O(n) additional space complexity and the large number of copies involved in a simple implementation make it a little inefficient. It is a stable sort, parallelizes well, and is more efficient at handling slow-to-access sequential media, but it is not in place. Merge sort is often the best choice for sorting a linked list [7][12].

Quick Sort
Quicksort also belongs to the divide and conquer category of algorithms. It depends on the partition operation: to partition an array, an element called a pivot is selected. All elements smaller than the pivot are moved before it and all greater elements are moved after it. The lesser and greater sub-lists are then recursively sorted. Efficient implementations of quicksort (with in-place partitioning) are typically unstable sorts and somewhat complex, but they are among the fastest sorting algorithms in practice [8].

Pseudo-code:
QUICKSORT(array A, int j, int k)
  if (k > j)
    i ← a random index from [j..k]
    swap A[i] and A[j]
    p ← PARTITION(A, j, k)
    QUICKSORT(A, j, p - 1)
    QUICKSORT(A, p + 1, k)
end func

The partition algorithm works as follows:
• A[j] = x is the pivot value.
• A[j…p-1] contains elements less than x.
• A[p+1…r-1] contains the elements which are greater than or equal to x.
• A[r…k] contains elements which are currently unknown.

PARTITION(array A, int j, int k)
  x ← A[j]
  p ← j
  for r ← j + 1 to k do
    if (A[r] < x) then
      p ← p + 1
      swap A[p] and A[r]
  swap A[j] and A[p]
  return p
end func

Quicksort is one of the fastest sorting algorithms and is part of many sorting libraries. The running time of quicksort depends heavily on the choice of the pivot element. Since the pivot element is selected randomly, the average-case and best-case running time is O(n log n). The worst-case running time is O(n²), but it happens rarely. Quicksort is not stable, but it is in place [20].

Heap Sort
Heapsort is a comparison-based sorting algorithm. It can be seen as the most efficient version of selection sort: it divides its input into a sorted and an unsorted region, and it iteratively shrinks the unsorted region by extracting the largest element and moving it to the sorted region. The improvement consists of using a heap data structure rather than a linear-time search to find the maximum [22]. Heaps can be used to sort an array: in a max-heap, the maximum element is always at the root, and heapsort uses this property of the heap to sort the array.

Consider an array A which is to be sorted using heap sort.
• Initially build a max-heap of the elements in array A.
• The root element, A[1], will contain the maximum element of A. Swap this element with the last element of array A, heapify the max-heap excluding the last element (which is already in its correct position), and then decrease the length of the heap by one.
• Repeat the above step until all the elements are in their correct positions.

Pseudo-code:
Heapsort(A)
  BuildHeap(A)
  for i ← length(A) step -1 until 2
    interchange A[1] and A[i]
    Heapify(A, 1)
end func

BuildHeap(A)

  heapsize ← length(A)
  for i ← floor(length(A)/2) step -1 until 1
    Heapify(A, i)
end func

Heapify(A, i)
  l ← left(i)
  r ← right(i)
  if (l <= heapsize) and (A[l] > A[i])
    largest ← l
  else
    largest ← i
  if (r <= heapsize) and (A[r] > A[largest])
    largest ← r
  if (largest != i)
    interchange A[i] and A[largest]
    Heapify(A, largest)
end func

Heap sort has O(n log n) time complexity in all cases (best case, average case and worst case). Although somewhat slower in practice on most machines than a well-implemented quicksort, it has the advantage of a more favorable worst-case O(n log n) runtime. Heapsort is an in-place algorithm, but it is not a stable sort [22].

3. COMPARATIVE STUDY AND DISCUSSION
In this paper, there are two classes of sorting algorithms:
O(n²):
• Bubble Sort
• Selection Sort
• Insertion Sort
O(n log n):
• Merge Sort
• Quick Sort
• Heap Sort

Under best-case conditions (the list is already sorted), bubble sort can approach a constant O(n) level of complexity. Its general case is abysmal; the insertion and selection sorts, although they also have O(n²) complexities, are significantly more efficient than bubble sort.

Heapsort is the slowest of the O(n log n) sorting algorithms, but unlike merge sort and quicksort it does not require massive recursion or multiple arrays to work. Merge sort is slightly faster than heap sort for larger sets, but it requires twice the memory of heap sort because of the second array. Quicksort is a massively recursive sort; it can be regarded as the faster version of merge sort.

The following figures show the efficiency of the different algorithms according to the above-stated criteria.

Figure.1 Efficiency for O(n²) Sorts [9]

Figure.2 Efficiency for O(n log n) Sorts [9]

The table below gives the comparison of the time complexity, or running time, of the different sorting algorithms in a short and precise manner.

Table 1: Comparison of sorting algorithms

Sort            Time (Avg)   Time (Best)  Time (Worst)  Stable  In place
Bubble sort     O(n²)        O(n)         O(n²)         Yes     Yes
Selection sort  O(n²)        O(n²)        O(n²)         No      Yes
Insertion sort  O(n²)        O(n)         O(n²)         Yes     Yes
Merge sort      O(n log n)   O(n log n)   O(n log n)    Yes     No
Quick sort      O(n log n)   O(n log n)   O(n²)         No      Yes
Heap sort       O(n log n)   O(n log n)   O(n log n)    No      Yes
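The three O(n log n) algorithms compared above can likewise be sketched in runnable Python. This is an illustrative transcription of the pseudocode, not the paper's reference implementation: the quicksort uses a randomized pivot as described earlier, and the heapsort follows the Heapify routine with 0-based indices:

```python
import random

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):      # merge the two sorted halves
        if left[i] <= right[j]:                  # <= keeps the sort stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

def quick_sort(a):
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                     # random pivot: expected O(n log n)
    return (quick_sort([x for x in a if x < pivot])
            + [x for x in a if x == pivot]
            + quick_sort([x for x in a if x > pivot]))

def heap_sort(a):
    a = list(a)
    n = len(a)
    def sift_down(i, size):                      # iterative Heapify
        while True:
            l, r, largest = 2 * i + 1, 2 * i + 2, i
            if l < size and a[l] > a[largest]:
                largest = l
            if r < size and a[r] > a[largest]:
                largest = r
            if largest == i:
                return
            a[i], a[largest] = a[largest], a[i]
            i = largest
    for i in range(n // 2 - 1, -1, -1):          # BuildHeap
        sift_down(i, n)
    for end in range(n - 1, 0, -1):              # extract max, shrink the heap
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a

for sort in (merge_sort, quick_sort, heap_sort):
    assert sort([9, 3, 7, 1, 7, 2]) == [1, 2, 3, 7, 7, 9]
```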

4. CONCLUSIONS
This paper discusses well-known sorting algorithms, their pseudo-code and their running time. In previous work, people have done comparative studies of sorting algorithms; some of them compared the running time of the algorithms on real computers over different numbers of inputs, which is of limited use because the diversity of computing devices is very high.

This paper instead compares the running time of the algorithms as a mathematical entity and tries to analyze them from an abstract point of view. It describes six well-known sorting algorithms and their running times, which are given in the above table. In determining a good sorting algorithm, time complexity is the main consideration, but other factors include the handling of various data types, consistency of performance and complexity of code, etc. From the above discussion, every sorting algorithm has some advantages and disadvantages, and the programmer must choose a sorting algorithm according to his or her requirements.

References
[1] A. D. Mishra and D. Garg, "Selection of Best Sorting Algorithm", International Journal of Intelligent Information Processing, Dec. 2008, pp.363-368.
[2] A. Levitin, "Introduction to the Design & Analysis of Algorithms", Addison-Wesley Longman, 2007, pp.98-100.
[3] C. Cook and D. Kim, "Best sorting algorithm for nearly sorted lists", Commun. ACM, 23(11), pp.620-624.
[4] http://corewar.co.uk/assembly/insertion.htm
[5] https://en.wikipedia.org/wiki/Seymour_Lipschutz
[6] http://en.wikipedia.org/wiki/Insertion_sort
[7] http://en.wikipedia.org/wiki/Merge_sort
[8] https://en.wikipedia.org/wiki/quick_sort
[9] http://linux.wku.edu/~lamonml/algor/sort/sort.html
[10] http://www.hackerearth.com/sorting
[11] Kazim Ali, "A Comparative Study of Well-Known Sorting Algorithms", International Journal of Advanced Research in Computer Science, Feb. 2017, pp.5-6.
[12] M. A. Kronrod, "Optimal ordering algorithm without operational field", Soviet Mathematics - Doklady (10), 1969, pp.744.
[13] M. Goodrich and R. Tamassia, "Data Structures and Algorithms in Java", John Wiley & Sons, 4th edition, 2010, pp.241-243.
[14] M. Sipser, "Introduction to the Theory of Computation", Thomson, 1996, pp.177-190.
[15] P. Adhikari, Review on Sorting Algorithms, "A comparative study on two sorting algorithms", Mississippi State University, 2007.
[16] R. Sedgewick, "Algorithms in C++", Addison-Wesley Longman, 1998, pp.273-274.
[17] R. Sedgewick and K. Wayne, "Algorithms", Pearson Education, 4th edition, 2011, pp.248-249.
[18] S. Lipschutz, "Data Structures", Schaum's Outline Series, pp.73-74.
[19] S. Jadoon, S. Solehria, S. Rehman and H. Jan, "Design and Analysis of Optimized Selection Sort Algorithm", Feb. 2011, pp.16-21.
[20] T. H. Cormen, C. E. Leiserson, R. L. Rivest and C. Stein, "Introduction to Algorithms", 3rd ed., The MIT Press, Cambridge, Massachusetts, 2009.
[21] A. V. Aho, J. E. Hopcroft and J. D. Ullman, "The Design and Analysis of Computer Algorithms", 1976, pp.66.
[22] J. W. J. Williams, "Algorithm 232 - Heapsort", Communications of the ACM, 7(6), 1964, pp.347-348.
