Insertion Sort: What Are The Boundary Cases of Insertion Sort Algorithm?
Insertion sort takes the least time (order of n) when the elements are already sorted, and the most time (order of n^2) when they are sorted in reverse order.
SELECTION SORT
• The selection sort algorithm sorts an array by repeatedly finding the minimum
element (for ascending order) from the unsorted part and putting it at the
beginning of the unsorted part (a code sketch follows this list)
• It follows a GREEDY METHOD: at each step it commits to the current minimum of the remaining elements
• It uses two nested loops
• The inner loop compares the current candidate minimum with every other element of the unsorted part of the array
• Selection sort makes O(n) swaps, which is the minimum among the sorting algorithms
covered in these notes (merge, heap, and insertion sort)
• Is Selection Sort Algorithm stable?
Stability: The default implementation is not stable. However, it can be made stable.
Please see stable selection sort for details.
• Is Selection Sort Algorithm in place?
Yes, it does not require extra space.
• Selection Sort is an in-place algorithm with the minimum number of swaps. It
works on a greedy approach and takes O(n) swaps to sort an array of n elements.
• RECURRENCE RELATION: T(N) = T(N-1) + N
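A minimal Python sketch of the idea above (function and variable names are illustrative, not from any particular source):

    def selection_sort(arr):
        n = len(arr)
        for i in range(n - 1):
            # find the index of the minimum element in the unsorted part arr[i..n-1]
            min_idx = i
            for j in range(i + 1, n):
                if arr[j] < arr[min_idx]:
                    min_idx = j
            # one swap per outer iteration, so at most n-1 = O(n) swaps in total
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
        return arr

    # example: selection_sort([64, 25, 12, 22, 11]) -> [11, 12, 22, 25, 64]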
BUBBLE SORT
• Bubble Sort is the simplest sorting algorithm that works by repeatedly swapping
the adjacent elements if they are in the wrong order.
• This algorithm is not suitable for large data sets as its average and worst-case
time complexity is quite high.
• Total number of swaps ≤ total number of comparisons (they are equal in the worst case)
• Total number of comparisons (worst case) = n(n-1)/2
• Total number of swaps (worst case) = n(n-1)/2
• What is the Boundary Case for Bubble sort?
Bubble sort takes minimum time (order of n) when the elements are already sorted.
Hence it is best to check whether the array is already sorted during each pass, to
avoid O(n^2) time complexity (as shown in the sketch after this list).
• Does sorting happen in place in Bubble sort?
Yes, Bubble sort performs swapping of adjacent pairs without the use of any major
data structure. Hence Bubble sort algorithm is an in-place algorithm.
• Is the Bubble sort algorithm stable?
Yes, the bubble sort algorithm is stable.
• Where is the Bubble sort algorithm used?
Due to its simplicity, bubble sort is often used to introduce the concept of a sorting
algorithm. In computer graphics, it is popular for its capability to detect a tiny error
(like a swap of just two elements) in almost-sorted arrays and fix it with just linear
complexity (2n).
• RECURRENCE RELATION: T(N) = T(N-1) + N
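A minimal Python sketch with the early-exit check for the already-sorted boundary case (names are illustrative):

    def bubble_sort(arr):
        n = len(arr)
        for i in range(n - 1):
            swapped = False
            # repeatedly swap adjacent out-of-order pairs; the largest remaining
            # element "bubbles" up to position n-1-i
            for j in range(n - 1 - i):
                if arr[j] > arr[j + 1]:
                    arr[j], arr[j + 1] = arr[j + 1], arr[j]
                    swapped = True
            # if no swap happened in a full pass, the array is sorted: O(n) best case
            if not swapped:
                break
        return arr

    # example: bubble_sort([5, 1, 4, 2, 8]) -> [1, 2, 4, 5, 8]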
MERGE SORT
• The Merge Sort algorithm is a sorting algorithm that is based on the Divide and
Conquer paradigm. In this algorithm, the array is initially divided into two equal
halves and then they are combined in a sorted manner.
• Is Merge sort In Place?
No. In merge sort, the merging step requires extra space to store the elements.
• Is Merge sort Stable?
Yes, merge sort is stable.
• How can we make Merge sort more efficient?
Merge sort can be made more efficient by replacing the recursive calls with Insertion
sort for small subarrays, for example when the remaining subarray has at most 43
elements, because for such small sizes Insertion sort needs fewer operations than the
recursive Merge sort machinery (see the sketch after this list).
• Analysis of Merge Sort:
A (bottom-up) merge sort consists of several passes over the input. The first pass
merges segments of size 1, the second merges segments of size 2, and the i-th pass
merges segments of size 2^(i-1). Thus, the total number of passes is ⌈log2 n⌉. As
the merge step shows, two sorted segments can be merged in linear time, which means
that each pass takes O(n) time. Since there are ⌈log2 n⌉ passes, the total computing
time is O(n log n).
• RECURRENCE RELATION: T(N) = 2*T(N/2) + N
Drawbacks of Merge Sort:
• Slower compared to the other sort algorithms for smaller tasks.
• The merge sort algorithm requires an additional memory space of O(n) for the
temporary array.
• It goes through the whole process even if the array is sorted.
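A minimal Python sketch of top-down merge sort with the small-subarray cutoff mentioned above; the function names and the default cutoff value are illustrative:

    def insertion_sort(arr, lo, hi):
        # sort arr[lo..hi] in place; efficient for small ranges
        for i in range(lo + 1, hi + 1):
            key = arr[i]
            j = i - 1
            while j >= lo and arr[j] > key:
                arr[j + 1] = arr[j]
                j -= 1
            arr[j + 1] = key

    def merge_sort(arr, lo=0, hi=None, cutoff=43):
        if hi is None:
            hi = len(arr) - 1
        if hi - lo + 1 <= cutoff:             # small subarray: hand off to insertion sort
            insertion_sort(arr, lo, hi)
            return
        mid = (lo + hi) // 2
        merge_sort(arr, lo, mid, cutoff)      # sort left half
        merge_sort(arr, mid + 1, hi, cutoff)  # sort right half
        # merge the two sorted halves using O(n) extra space
        merged = []
        i, j = lo, mid + 1
        while i <= mid and j <= hi:
            if arr[i] <= arr[j]:              # <= keeps equal keys in order, so the sort is stable
                merged.append(arr[i]); i += 1
            else:
                merged.append(arr[j]); j += 1
        merged.extend(arr[i:mid + 1])
        merged.extend(arr[j:hi + 1])
        arr[lo:hi + 1] = merged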
QUICK SORT
• QuickSort is a Divide and Conquer algorithm. It picks an element as a pivot
and partitions the given array around the picked pivot
• When the first or the last element is chosen as pivot, Quick Sort's worst
case occurs for already sorted arrays. In every step of quick sort, the numbers are
divided as per the recurrence T(n) = T(n-1) + O(n), which solves to O(n^2).
• In the worst case, the chosen pivot is always placed at a corner position and
recursive calls are made for: a) the subarray on the left of the pivot, which is of
size n-1 in the worst case, and b) the subarray on the right of the pivot, which is
of size 0 in the worst case.
• Is QuickSort stable?
The default implementation is not stable. However, any sorting algorithm can be
made stable by using the elements' original indexes as a tie-breaking comparison parameter.
• Is QuickSort In-place?
As per the broad definition of an in-place algorithm, it qualifies as an in-place
sorting algorithm: it uses extra space only for the recursive call stack, not
for manipulating the input.
• What is 3-Way QuickSort?
In simple QuickSort algorithm, we select an element as pivot, partition the array
around pivot and recur for subarrays on left and right of pivot.
Consider an array which has many repeated elements. For example, {1, 4, 2,
4, 2, 4, 1, 2, 4, 1, 2, 2, 2, 2, 4, 1, 4, 4, 4}. If 4 is picked as pivot in Simple
QuickSort, we fix only one 4 and recursively process the remaining occurrences. In
3-Way QuickSort, an array arr[l..r] is divided into 3 parts (see the sketch after this list):
• arr[l..i] elements less than pivot.
• arr[i+1..j-1] elements equal to pivot.
• arr[j..r] elements greater than pivot.
• WORST CASE RECURRENCE RELATION: T(N) = T(N-1) + N
• BEST CASE RECURRENCE RELATION: T(N) = 2*T(N/2) + N
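A minimal Python sketch of 3-Way QuickSort using a Dutch-national-flag style partition; the pivot choice and names are illustrative:

    def quick_sort_3way(arr, lo=0, hi=None):
        if hi is None:
            hi = len(arr) - 1
        if lo >= hi:
            return
        pivot = arr[lo]              # first element as pivot (worst case: sorted input)
        lt, i, gt = lo, lo, hi
        # invariant: arr[lo..lt-1] < pivot, arr[lt..i-1] == pivot, arr[gt+1..hi] > pivot
        while i <= gt:
            if arr[i] < pivot:
                arr[lt], arr[i] = arr[i], arr[lt]
                lt += 1; i += 1
            elif arr[i] > pivot:
                arr[i], arr[gt] = arr[gt], arr[i]
                gt -= 1
            else:
                i += 1
        quick_sort_3way(arr, lo, lt - 1)   # recur only on elements < pivot
        quick_sort_3way(arr, gt + 1, hi)   # recur only on elements > pivot

    # example: all duplicates of the pivot are placed in one pass
    # a = [1, 4, 2, 4, 2, 4, 1, 2, 4, 1]; quick_sort_3way(a) -> [1, 1, 1, 2, 2, 2, 4, 4, 4, 4]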
HEAP SORT