
Insertion Sort Algorithm

Insertion sort works similarly to the way playing cards are sorted in your hands. It is assumed that the first card is already sorted; then we select an unsorted card. If the selected unsorted card is greater than the first card, it is placed to the right; otherwise, it is placed to the left. Similarly, all the unsorted cards are taken one by one and put in their exact place.

The same approach is applied in insertion sort. The idea behind insertion sort is to take one element at a time and insert it into its correct position within the already-sorted part of the array. Although it is simple to use, it is not appropriate for large data sets, as the time complexity of insertion sort in the average and worst cases is O(n²), where n is the number of items. Insertion sort is therefore less efficient than other sorting algorithms like heap sort, quick sort, merge sort, etc.

Insertion sort has various advantages such as -

o Simple implementation
o Efficient for small data sets
o Adaptive, i.e., it is appropriate for data sets that are already substantially sorted.

Now, let's see the algorithm of insertion sort.

Algorithm

The simple steps of achieving the insertion sort are listed as follows -

Step 1 - If the element is the first element, assume that it is already sorted.

Step 2 - Pick the next element and store it separately in a key.

Step 3 - Now, compare the key with the elements in the sorted part of the array.

Step 4 - If the element in the sorted array is smaller than the key, move to the next element. Otherwise, shift the greater elements one position to the right.

Step 5 - Insert the key into the position created.

Step 6 - Repeat until the array is sorted.

Working of Insertion sort Algorithm
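
Assuming the same sample list used in the code further below, [10, 5, 13, 8, 2], the passes proceed as follows:

o Pass 1 - key = 5: 5 is inserted before 10, giving [5, 10, 13, 8, 2].
o Pass 2 - key = 13: 13 is already greater than 10, so the list stays [5, 10, 13, 8, 2].
o Pass 3 - key = 8: 8 is inserted between 5 and 10, giving [5, 8, 10, 13, 2].
o Pass 4 - key = 2: 2 is shifted all the way to the front, giving [2, 5, 8, 10, 13].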


Insertion sort complexity

Now, let's see the time complexity of insertion sort in best case, average case, and in worst
case. We will also see the space complexity of insertion sort.

1. Time Complexity

Case             Time Complexity

Best Case        O(n)
Average Case     O(n²)
Worst Case       O(n²)
o Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted. The best-case time complexity of insertion sort is O(n).
o Average Case Complexity - It occurs when the array elements are in a jumbled order, neither properly ascending nor properly descending. The average-case time complexity of insertion sort is O(n²).
o Worst Case Complexity - It occurs when the array elements have to be sorted in reverse order, i.e. you have to sort the elements in ascending order, but they are given in descending order. The worst-case time complexity of insertion sort is O(n²), as illustrated by the small counting sketch below.
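
To make these cases concrete, here is a small illustrative sketch (the helper count_shifts is a hypothetical addition, not part of the program below) that counts how many times the inner loop shifts an element:

def count_shifts(list1):
    shifts = 0
    for i in range(1, len(list1)):
        value = list1[i]
        j = i - 1
        # Each shift moves a larger element one position to the right
        while j >= 0 and value < list1[j]:
            list1[j + 1] = list1[j]
            shifts += 1
            j = j - 1
        list1[j + 1] = value
    return shifts

print(count_shifts([1, 2, 3, 4, 5]))   # already sorted: 0 shifts, only n-1 comparisons -> O(n)
print(count_shifts([5, 4, 3, 2, 1]))   # reverse sorted: 10 shifts = n(n-1)/2 -> O(n²)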

2. Space Complexity

Space Complexity    O(1)
Stable              YES

o The space complexity of insertion sort is O(1), because only a single extra variable (the key being inserted) is required.
o Insertion sort is also stable: elements with equal keys keep their original relative order, as the short check below shows.
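
A quick way to see the stability claim in the table is to sort a list of pairs by their first item only (a hypothetical example, not part of the original program) and check that pairs with equal keys keep their original relative order:

# Hypothetical stability check: sort pairs by their first item only
pairs = [(2, 'a'), (1, 'b'), (2, 'c'), (1, 'd')]
for i in range(1, len(pairs)):
    value = pairs[i]
    j = i - 1
    # Compare only the first item of each pair
    while j >= 0 and value[0] < pairs[j][0]:
        pairs[j + 1] = pairs[j]
        j = j - 1
    pairs[j + 1] = value
print(pairs)   # [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')] - equal keys keep their order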

# creating a function for insertion sort
def insertion_sort(list1):
    # Outer loop traverses indices 1 to len(list1) - 1
    for i in range(1, len(list1)):
        value = list1[i]
        # Move elements of list1[0..i-1] that are greater than value
        # one position ahead of their current position
        j = i - 1
        while j >= 0 and value < list1[j]:
            list1[j + 1] = list1[j]
            j = j - 1
        list1[j + 1] = value
    return list1

# Driver code to test above
list1 = [10, 5, 13, 8, 2]
print("The unsorted list is:", list1)
print("The sorted list1 is:", insertion_sort(list1))
Output
The unsorted list is: [10, 5, 13, 8, 2]
The sorted list1 is: [2, 5, 8, 10, 13]
Selection Sort Algorithm

Selection sort is a simple sorting algorithm. Like insertion sort, it is an in-place comparison-based algorithm in which the list is divided into two parts, the sorted part at the left end and the unsorted part at the right end. Initially, the sorted part is empty and the unsorted part is the entire list.

The smallest element is selected from the unsorted part and swapped with the leftmost unsorted element, and that element becomes part of the sorted part. This process continues, moving the boundary of the unsorted part one element to the right each time.

This algorithm is not suitable for large data sets as its average and worst case complexities are O(n²), where n is the number of items.


This type of sorting is called selection sort because it works by repeatedly selecting the smallest remaining element. That is: we first find the smallest value in the array and exchange it with the element in the first position, then find the second smallest element and exchange it with the element in the second position, and we continue in this way until the entire array is sorted.

1. Set MIN to location 0.


2. Search the minimum element in the list.
3. Swap with value at location MIN.
4. Increment MIN to point to next element.
5. Repeat until the list is sorted.

Selection sort spends most of its time trying to find the minimum element in the unsorted part of the array. This clearly shows the similarity between selection sort and bubble sort.

o Bubble sort effectively selects the maximum remaining element at each stage, but wastes some effort imparting some order to the unsorted part of the array.
o Selection sort is quadratic in both the worst and the average case, and requires no extra memory.

For each i from 1 to n - 1, there is one exchange and n - i comparisons, so there is a total of n - 1 exchanges and

(n − 1) + (n − 2) + ... + 2 + 1 = n(n − 1)/2 comparisons.

These observations hold no matter what the input data is.

The only quantity that depends on the input is the number of times the running minimum is updated during the scan: in the worst case this can be quadratic, but in the average case it is only O(n log n). This implies that the running time of selection sort is quite insensitive to the input.
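
As a quick check of the formula above, the comparison count can be measured directly; the helper below (count_comparisons is an illustrative name, not part of the program later in this section) reports n(n − 1)/2 comparisons regardless of how the input is ordered:

def count_comparisons(array):
    comparisons = 0
    for ind in range(len(array)):
        min_index = ind
        for j in range(ind + 1, len(array)):
            comparisons += 1          # one comparison per inner-loop step
            if array[j] < array[min_index]:
                min_index = j
        array[ind], array[min_index] = array[min_index], array[ind]
    return comparisons

print(count_comparisons([5, 4, 3, 2, 1]))   # 10 = 5 * 4 / 2
print(count_comparisons([1, 2, 3, 4, 5]))   # 10 as well - same count for sorted input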
Example

Consider the following depicted array as an example.

For the first position in the sorted list, the whole list is scanned sequentially. The first position is where 14 is stored presently; searching the whole list, we find that 10 is the lowest value.

So we swap 14 with 10. After one iteration 10, which happens to be the minimum value in the list, appears in the first position of the sorted list.

For the second position, where 33 is residing, we start scanning the rest of the list in a linear
manner.

We find that 14 is the second lowest value in the list and it should appear at the second place.
We swap these values.

After two iterations, the two smallest values are positioned at the beginning in sorted order.

The same process is applied to the rest of the items in the array; the trace below shows the list after each pass.
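
Since the depicted array itself is not reproduced here, the trace below assumes a small sample list built from the values mentioned in the walkthrough, [14, 33, 27, 10, 35], and prints the list after every pass:

def selection_sort_trace(array):
    # Same selection sort as below, with a print after each outer-loop pass
    for ind in range(len(array)):
        min_index = ind
        for j in range(ind + 1, len(array)):
            if array[j] < array[min_index]:
                min_index = j
        array[ind], array[min_index] = array[min_index], array[ind]
        print("After pass", ind + 1, ":", array)

selection_sort_trace([14, 33, 27, 10, 35])
# After pass 1 : [10, 33, 27, 14, 35]
# After pass 2 : [10, 14, 27, 33, 35]
# (the remaining passes leave the list unchanged)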
Implementation

The selection sort algorithm is implemented in Python below. The given program selects the minimum element of the array and swaps it with the element at the first index. The second minimum element is then swapped with the element at the second index. The process goes on until the end of the array is reached.

# Selection sort in Python

# time complexity O(n*n)
# sorting by finding min_index
def selectionSort(array, size):
    for ind in range(size):
        min_index = ind
        for j in range(ind + 1, size):
            # select the minimum element in every iteration
            if array[j] < array[min_index]:
                min_index = j
        # swapping the elements to sort the array
        (array[ind], array[min_index]) = (array[min_index], array[ind])

arr = [12, 45, 0, 11, 9, 88, 97, 202, 747]
size = len(arr)
selectionSort(arr, size)
print('The array after sorting in Ascending Order by selection sort is:')
print(arr)

Output
The array after sorting in Ascending Order by selection sort is:
[0, 9, 11, 12, 45, 88, 97, 202, 747]

Bubble Sort Algorithm


Bubble sort is a simple sorting algorithm. It is a comparison-based algorithm in which each pair of adjacent elements is compared and the elements are swapped if they are not in order. This algorithm is not suitable for large data sets as its average and worst case complexities are O(n²), where n is the number of items.

Bubble sort is an elementary sorting algorithm, which works by repeatedly exchanging adjacent elements, if necessary. When no exchanges are required, the list is sorted.

We assume list is an array of n elements. We further assume that the swap function swaps the values of the given array elements.

Step 1 − Check if the first element in the input array is greater than the next element in the
array.
Step 2 − If it is greater, swap the two elements; otherwise move the pointer forward in the
array.
Step 3 − Repeat Step 2 until we reach the end of the array.
Step 4 − Check if the elements are sorted; if not, repeat the same process (Step 1 to Step 3)
from the last element of the array to the first.
Step 5 − The final output achieved is the sorted array.

Algorithm: Sequential-Bubble-Sort (A)

for i ← 1 to length [A] do
   for j ← length [A] down-to i + 1 do
      if A[j] < A[j - 1] then
         Exchange A[j] ⟷ A[j - 1]

Pseudocode

We observe in the algorithm that bubble sort compares each pair of array elements until the whole array is completely sorted in ascending order. This may cause a few efficiency issues, such as: what if the array needs no more swapping because all the elements are already in ascending order?

To ease out this issue, we use one flag variable, swapped, which helps us see whether any swap has happened or not. If no swap has occurred, i.e. the array requires no more processing to be sorted, it will come out of the loop.

Here, the number of comparisons is

1 + 2 + 3 + ... + (n - 1) = n(n - 1)/2 = O(n²)

This clearly shows the quadratic (n²) nature of bubble sort.

In this algorithm, the number of comparisons is the same irrespective of the data set, i.e. whether the provided input elements are in sorted order, in reverse order, or at random.
Memory Requirement

From the algorithm stated above, it is clear that bubble sort does not require extra memory.

Example

We take a short unsorted array for our example, [14, 33, 27, 35, 10]. Bubble sort takes O(n²) time, so we're keeping it short and precise.

Bubble sort starts with the very first two elements, comparing them to check which one is greater.

In this case, value 33 is greater than 14, so these two are already in sorted order. Next, we compare 33 with 27.

We find that 27 is smaller than 33 and these two values must be swapped.

Next we compare 33 and 35. We find that both are in already sorted positions.

Then we move to the next two values, 35 and 10.


We know that 10 is smaller than 35; hence they are not sorted, so we swap these values. We then find that we have reached the end of the array. After one iteration, the array looks like this: [14, 27, 33, 10, 35].

To be precise, we are now showing how the array should look after each iteration. After the second iteration, it looks like this: [14, 27, 10, 33, 35].

Notice that after each iteration, at least one value moves to the end.

And when there's no swap required, bubble sort learns that the array is completely sorted. The short trace below recaps each pass of this example.
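
The same passes can be reproduced programmatically; the sketch below is a plain bubble sort with a print after each pass over the example array (it is an illustration, not the implementation described in the next section):

def bubble_sort_trace(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        print("After pass", i + 1, ":", arr)
        # Stop early once a full pass makes no swaps
        if not swapped:
            break

bubble_sort_trace([14, 33, 27, 35, 10])
# After pass 1 : [14, 27, 33, 10, 35]
# After pass 2 : [14, 27, 10, 33, 35]
# After pass 3 : [14, 10, 27, 33, 35]
# After pass 4 : [10, 14, 27, 33, 35]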
Implementation

One more issue we did not address in our original algorithm and its improved pseudocode is that, after every iteration, the highest value settles down at the end of the array. Hence, the next iteration need not include already sorted elements. For this purpose, in our implementation, we restrict the inner loop to avoid already sorted values.

# Python program for implementation of Bubble Sort

def bubbleSort(arr):
    n = len(arr)
    for i in range(n - 1):
        # Track whether any swap happened during this pass
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                swapped = True
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
        # If no two elements were swapped in this pass, the array is sorted
        if not swapped:
            return

arr = [64, 34, 25, 12, 22, 11, 90]
bubbleSort(arr)
print("Sorted array is:")
for i in range(len(arr)):
    print("%d" % arr[i], end=" ")

Output
Sorted array is:
11 12 22 25 34 64 90

Merge Sort

In merge sort, a problem is split into multiple sub-problems. Each sub-problem is solved separately, and the final result is formed by combining the sub-problems. The Merge_Sort function divides the array in half repeatedly until it reaches subarrays of size 1. The merge step then takes over, merging the sorted subarrays into larger sorted arrays until the entire array is merged.
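
To see the merge step in isolation, here is a minimal sketch (the helper name merge_two is an illustrative assumption, not part of the program later in this section) that combines two already-sorted lists into one sorted list:

def merge_two(left, right):
    merged = []
    i = j = 0
    # Repeatedly take the smaller front element of the two lists
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    # One list is exhausted; append whatever remains from the other
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_two([2, 7], [5, 6]))   # [2, 5, 6, 7]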

Let us have a rough understanding of merge sort:

1. Consider an array
2. Find the middle point to divide the array into two halves
3. Call merge sort for the first half
4. Call merge sort for the second half
5. Merge both halves
6. The result will be a sorted array


Algorithm
As of now, we have a rough understanding of how merge sort is performed. For better
understanding let's dive deep into the algorithm followed by the code:

1. Create a Merge_Sort() function
2. Initiate the array list and divide it into subarrays.
3. Create copies of the subarrays.
4. Create three pointers (i, j, k) that maintain the indexes.
5. Compare the front elements of the subarrays and place the smaller one in the correct position.
6. Pick up the remaining elements and place them accordingly.
7. The result will be a sorted array.
8. Print the sorted array.

Program for Merge Sort


def Merge_Sort(array):
    if len(array) > 1:
        # mid is the point where the array is divided into two subarrays
        mid = len(array) // 2
        Left = array[:mid]
        Right = array[mid:]

        Merge_Sort(Left)
        Merge_Sort(Right)

        i = j = k = 0
        # Show the two sorted halves that are about to be merged
        print(Left)
        print(Right)
        # Merge the two halves back into array until one of them is exhausted
        while i < len(Left) and j < len(Right):
            if Left[i] < Right[j]:
                array[k] = Left[i]
                i = i + 1
            else:
                array[k] = Right[j]
                j = j + 1
            k += 1

        # Copy any remaining elements of Left
        while i < len(Left):
            array[k] = Left[i]
            i = i + 1
            k = k + 1

        # Copy any remaining elements of Right
        while j < len(Right):
            array[k] = Right[j]
            j = j + 1
            k = k + 1

# Print the array
def printarray(array):
    for i in range(len(array)):
        print(array[i], end=" ")
    print()

# Driver program
if __name__ == '__main__':
    array = [7, 2, 5, 6, 3, 1, 8, 4]
    print("Original array is:", array)
    Merge_Sort(array)
    print("Sorted array is: ")
    printarray(array)

Output:
Original array is: [7, 2, 5, 6, 3, 1, 8, 4]
[7]
[2]
[5]
[6]
[2, 7]
[5, 6]
[3]
[1]
[8]
[4]
[1, 3]
[4, 8]
[2, 5, 6, 7]
[1, 3, 4, 8]
Sorted array is: 
1 2 3 4 5 6 7 8
