1. INSERTION SORT: -
Insertion sort is a simple sorting algorithm that works similarly to the way you sort playing cards in
your hands. The array is virtually split into a sorted and an unsorted part. Values from the unsorted
part are picked and placed at the correct position in the sorted part.
ALGORITHM:
The simple steps for achieving insertion sort are as follows -
Step 1 - If the element is the first element, assume that it is already sorted.
Step 2 - Pick the next element and store it separately as the key.
Step 3 - Now, compare the key with the elements in the sorted part of the array.
Step 4 - If the element in the sorted part is smaller than the key, move to the next element. Else,
shift the greater elements one position towards the right.
Step 5 - Insert the key into the vacated position, and repeat Steps 2-5 for the remaining elements.
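Below is a minimal C sketch of these steps; the function name insertionSort and the sample array are illustrative choices, not part of the original notes.
#include <stdio.h>

/* Insert each element into its correct position within the sorted left part */
void insertionSort(int a[], int n) {
  for (int i = 1; i < n; i++) {
    int key = a[i];  /* the element to be placed (Step 2) */
    int j = i - 1;
    /* shift elements greater than key one position to the right (Steps 3-4) */
    while (j >= 0 && a[j] > key) {
      a[j + 1] = a[j];
      j--;
    }
    a[j + 1] = key;  /* insert the key at its correct position (Step 5) */
  }
}

int main() {
  int a[] = {12, 11, 13, 5, 6};
  int n = sizeof(a) / sizeof(a[0]);
  insertionSort(a, n);
  for (int i = 0; i < n; i++)
    printf("%d ", a[i]);  /* prints: 5 6 11 12 13 */
  return 0;
}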
1. Time Complexity
o Best Case Complexity - It occurs when no sorting is required, i.e. the array is already
sorted. The best-case time complexity of insertion sort is O(n).
o Average Case Complexity - It occurs when the array elements are in jumbled order, neither
properly ascending nor properly descending. The average-case time complexity of
insertion sort is O(n²).
o Worst Case Complexity - It occurs when the array elements have to be sorted in reverse
order, i.e. you have to sort the elements in ascending order but they are given in descending
order. The worst-case time complexity of insertion sort is O(n²).
2. Space Complexity
o The space complexity of insertion sort is O(1), because only a single extra variable (the key)
is required.
2. SELECTION SORT: -
Selection sort is a simple sorting algorithm that works by repeatedly selecting the
smallest (or largest) element from the unsorted portion of the list and moving it to the sorted portion
of the list.
ALGORITHM:
selectionSort(array, size)
  for i = 0 to size - 2
    minIndex = i
    for j = i + 1 to size - 1
      if array[j] < array[minIndex]
        minIndex = j
    swap array[i] and array[minIndex]
end selectionSort
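A minimal C sketch of this pseudocode follows; the function name selectionSort and the sample array are illustrative choices.
#include <stdio.h>

/* Repeatedly select the minimum of the unsorted part and swap it to the front */
void selectionSort(int a[], int n) {
  for (int i = 0; i < n - 1; i++) {
    int minIndex = i;
    for (int j = i + 1; j < n; j++)
      if (a[j] < a[minIndex])
        minIndex = j;
    /* swap the smallest unsorted element into position i */
    int temp = a[i];
    a[i] = a[minIndex];
    a[minIndex] = temp;
  }
}

int main() {
  int a[] = {64, 25, 12, 22, 11};
  int n = sizeof(a) / sizeof(a[0]);
  selectionSort(a, n);
  for (int i = 0; i < n; i++)
    printf("%d ", a[i]);  /* prints: 11 12 22 25 64 */
  return 0;
}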
1. Time Complexity
o Best Case Complexity - It occurs when no sorting is required, i.e. the array is already sorted; even
then, selection sort still scans the whole unsorted part for each position. The best-case time
complexity of selection sort is O(n²).
o Average Case Complexity - It occurs when the array elements are in jumbled order, neither
properly ascending nor properly descending. The average-case time complexity of selection sort
is O(n²).
o Worst Case Complexity - It occurs when the array elements have to be sorted in reverse order,
i.e. you have to sort the elements in ascending order but they are given in descending order. The
worst-case time complexity of selection sort is O(n²).
2. Space Complexity
o The space complexity of selection sort is O(1), because only a single extra
variable is required for swapping.
3. MERGE SORT: -
Merge sort is defined as a sorting algorithm that works by dividing an array into
smaller subarrays, sorting each subarray, and then merging the sorted subarrays back
together to form the final sorted array.
ALGORITHM:
MERGE_SORT(arr, beg, end)
  if beg < end
    mid = (beg + end) / 2
    MERGE_SORT(arr, beg, mid)
    MERGE_SORT(arr, mid + 1, end)
    MERGE(arr, beg, mid, end)
end MERGE_SORT
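A minimal C sketch of this scheme is given below; the function names, the sample array, and the use of a C99 variable-length array for the temporary buffer are illustrative choices.
#include <stdio.h>

/* Merge the two sorted halves a[beg..mid] and a[mid+1..end] via a temp array */
void merge(int a[], int beg, int mid, int end) {
  int temp[end - beg + 1];
  int i = beg, j = mid + 1, k = 0;
  while (i <= mid && j <= end)
    temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
  while (i <= mid)
    temp[k++] = a[i++];
  while (j <= end)
    temp[k++] = a[j++];
  for (k = 0; k <= end - beg; k++)
    a[beg + k] = temp[k];  /* copy the merged result back */
}

void mergeSort(int a[], int beg, int end) {
  if (beg < end) {
    int mid = (beg + end) / 2;
    mergeSort(a, beg, mid);      /* sort the left half  */
    mergeSort(a, mid + 1, end);  /* sort the right half */
    merge(a, beg, mid, end);     /* merge the two halves */
  }
}

int main() {
  int a[] = {38, 27, 43, 3, 9, 82, 10};
  int n = sizeof(a) / sizeof(a[0]);
  mergeSort(a, 0, n - 1);
  for (int i = 0; i < n; i++)
    printf("%d ", a[i]);  /* prints: 3 9 10 27 38 43 82 */
  return 0;
}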
1. Time Complexity
o Best Case Complexity - It occurs when no sorting is required, i.e. the array is already
sorted. The best-case time complexity of merge sort is O(n log n).
o Average Case Complexity - It occurs when the array elements are in jumbled order, neither
properly ascending nor properly descending. The average-case time complexity of
merge sort is O(n log n).
o Worst Case Complexity - It occurs when the array elements have to be sorted in reverse
order, i.e. you have to sort the elements in ascending order but they are given in descending
order. The worst-case time complexity of merge sort is O(n log n).
2. Space Complexity
o The space complexity of merge sort is O(n), because the merge step requires an auxiliary
array of the same size as the input.
4. QUICK SORT: -
QuickSort is a sorting algorithm based on the Divide and Conquer paradigm: it
picks an element as a pivot and partitions the given array around the picked pivot,
placing the pivot at its correct position in the sorted array.
ALGORITHM:
QUICKSORT(array, low, high)
  if low < high
    pivotIndex = PARTITION(array, low, high)
    QUICKSORT(array, low, pivotIndex - 1)
    QUICKSORT(array, pivotIndex + 1, high)
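A minimal C sketch of this scheme is given below, using the Lomuto partition with the last element as the pivot; the function names and sample array are illustrative choices, and other pivot strategies are possible (see the note on the worst case that follows).
#include <stdio.h>

/* Lomuto partition: place the pivot (last element) at its correct position
   and return that position */
int partition(int a[], int low, int high) {
  int pivot = a[high];
  int i = low - 1;  /* boundary of the "smaller than pivot" region */
  for (int j = low; j < high; j++) {
    if (a[j] < pivot) {
      i++;
      int temp = a[i]; a[i] = a[j]; a[j] = temp;
    }
  }
  int temp = a[i + 1]; a[i + 1] = a[high]; a[high] = temp;
  return i + 1;
}

void quickSort(int a[], int low, int high) {
  if (low < high) {
    int p = partition(a, low, high);
    quickSort(a, low, p - 1);   /* sort the elements left of the pivot  */
    quickSort(a, p + 1, high);  /* sort the elements right of the pivot */
  }
}

int main() {
  int a[] = {10, 7, 8, 9, 1, 5};
  int n = sizeof(a) / sizeof(a[0]);
  quickSort(a, 0, n - 1);
  for (int i = 0; i < n; i++)
    printf("%d ", a[i]);  /* prints: 1 5 7 8 9 10 */
  return 0;
}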
1. Time Complexity
o Best Case Complexity - It occurs when the pivot divides the array into two roughly equal halves
at every step. The best-case time complexity of quick sort is O(n log n).
o Average Case Complexity - For random input, the partitions are balanced on average. The
average-case time complexity of quick sort is O(n log n).
o Worst Case Complexity - It occurs when the pivot is always the smallest or the largest element,
for example when the array is already sorted and the last element is always chosen as the pivot.
The worst-case time complexity of quick sort is O(n²).
Though the worst-case complexity of quicksort is higher than that of algorithms such as merge
sort and heap sort, quicksort is faster in practice. The worst case rarely occurs, and it can largely be
avoided by choosing a good pivot element (for example, a random element or the median of three).
2. Space Complexity
o The space complexity of quick sort is O(log n) on average, due to the recursion stack.
LINEAR SEARCH: -
Linear Search is defined as a sequential search algorithm that starts at one end of a list and
checks each element until the desired element is found; otherwise, the
search continues to the end of the data set.
ALGORITHM:
Linear_Search (a, n, val)
Step 1: set pos = -1
Step 2: set i = 1
Step 3: repeat step 4 while i <= n
Step 4: if a[i] == val
set pos = i
print pos
go to step 6
[end of if]
set i = i + 1
[end of loop]
Step 5: if pos = -1
print "value is not present in the array "
[end of if]
Step 6: exit
1. Time Complexity
o Best Case Complexity - In linear search, the best case occurs when the element we are searching
for is at the first position of the array. The best-case time complexity of linear search is O(1).
o Average Case Complexity - The average case time complexity of linear search is O(n).
o Worst Case Complexity - In linear search, the worst case occurs when the element we are
looking for is at the end of the array, or is not present in the array at all, so the entire array must
be traversed. The worst-case time complexity of linear search is O(n).
The time complexity of linear search is O(n) because, in the worst case, every element in the array
is compared exactly once.
2. Space Complexity
o The space complexity of linear search is O(1).
PROGRAM: -
#include <stdio.h>

/* Return the 1-based position of val in a[], or -1 if it is not found */
int linearSearch(int a[], int n, int val) {
  for (int i = 0; i < n; i++) {
    if (a[i] == val)
      return i + 1;
  }
  return -1;
}

int main() {
  int a[] = {70, 40, 30, 11, 57, 41, 25, 14, 52};
  int val = 41;
  int n = sizeof(a) / sizeof(a[0]);
  int res = linearSearch(a, n, val);
  printf("The elements of the array are - ");
  for (int i = 0; i < n; i++)
    printf("%d ", a[i]);
  printf("\nElement to be searched is - %d", val);
  if (res == -1)
    printf("\nElement is not present in the array");
  else
    printf("\nElement is present at position %d of the array", res);
  return 0;
}
Output
The elements of the array are - 70 40 30 11 57 41 25 14 52
Element to be searched is - 41
Element is present at position 6 of the array
BINARY SEARCH: -
Binary Search is defined as a searching algorithm used in a sorted array
by repeatedly dividing the search interval in half. The idea of binary search is to
use the information that the array is sorted and reduce the time complexity to O(log
N).
ALGORITHM:
binarySearch(a, beg, end, val)
  if beg > end, return -1
  mid = (beg + end) / 2
  if a[mid] == val, return mid
  else if a[mid] < val, search the right half with binarySearch(a, mid + 1, end, val)
  else, search the left half with binarySearch(a, beg, mid - 1, val)
1. Time Complexity
o Best Case Complexity - In binary search, the best case occurs when the element to search for is
found in the first comparison, i.e. when the first middle element itself is the element to be
searched. The best-case time complexity of binary search is O(1).
o Average Case Complexity - The average case time complexity of binary search is O(log n).
o Worst Case Complexity - In binary search, the worst case occurs when we have to keep
halving the search space until only one element remains. The worst-case time complexity of
binary search is O(log n).
2. Space Complexity
o The space complexity of the iterative binary search is O(1); the recursive version shown
below uses O(log n) space for the call stack.
PROGRAM: -
#include <stdio.h>

/* Return the 1-based position of val in the sorted array a[beg..end], or -1 */
int binarySearch(int a[], int beg, int end, int val) {
  if (end >= beg) {
    int mid = beg + (end - beg) / 2;  /* same as (beg + end) / 2, but avoids overflow */
    if (a[mid] == val)
      return mid + 1;
    else if (a[mid] < val)
      return binarySearch(a, mid + 1, end, val);  /* search the right half */
    else
      return binarySearch(a, beg, mid - 1, val);  /* search the left half */
  }
  return -1;
}

int main() {
  int a[] = {11, 14, 25, 30, 40, 41, 52, 57, 70};
  int val = 40;
  int n = sizeof(a) / sizeof(a[0]);
  int res = binarySearch(a, 0, n - 1, val);
  printf("The elements of the array are - ");
  for (int i = 0; i < n; i++)
    printf("%d ", a[i]);
  printf("\nElement to be searched is - %d", val);
  if (res == -1)
    printf("\nElement is not present in the array");
  else
    printf("\nElement is present at position %d of the array", res);
  return 0;
}
Output
The elements of the array are - 11 14 25 30 40 41 52 57 70
Element to be searched is - 40
Element is present at position 5 of the array
Q. WRITE A PROGRAM FOR KRUSKAL'S ALGORITHM
Kruskal's algorithm first sorts all edges of the given graph in increasing order of weight. It
then keeps adding edges (and the vertices they connect) to the MST, provided the newly added
edge does not form a cycle. It picks the minimum weighted edge first and the maximum
weighted edge last, so it makes a locally optimal choice at each
step in order to find the optimal solution. Hence it is a Greedy Algorithm.
ALGORITHM:
KRUSKAL(G):
A=∅
For each vertex v ∈ G.V:
MAKE-SET(v)
For each edge (u, v) ∈ G.E, taken in increasing order of weight(u, v):
if FIND-SET(u) ≠ FIND-SET(v):
A = A ∪ {(u, v)}
UNION(u, v)
return A
1. Time Complexity
o The time complexity of Kruskal's algorithm is O(E log E), which is equivalent to O(E log V),
where E is the number of edges and V is the number of vertices.
PROGRAM: -
#include <stdio.h>

#define MAX 30

typedef struct edge {
  int u, v, w;
} edge;

typedef struct edge_list {
  edge data[MAX];
  int n;
} edge_list;

edge_list elist;

int Graph[MAX][MAX], n;
edge_list spanlist;

void kruskalAlgo();
int find(int belongs[], int vertexno);
void applyUnion(int belongs[], int c1, int c2);
void sort();
void print();

/* Build the edge list from the lower triangle of the adjacency matrix,
   sort it by weight, then add every edge that joins two different components */
void kruskalAlgo() {
  int belongs[MAX], i, j, cno1, cno2;
  elist.n = 0;

  for (i = 1; i < n; i++)
    for (j = 0; j < i; j++) {
      if (Graph[i][j] != 0) {
        elist.data[elist.n].u = i;
        elist.data[elist.n].v = j;
        elist.data[elist.n].w = Graph[i][j];
        elist.n++;
      }
    }

  sort();

  for (i = 0; i < n; i++)
    belongs[i] = i;

  spanlist.n = 0;

  for (i = 0; i < elist.n; i++) {
    cno1 = find(belongs, elist.data[i].u);
    cno2 = find(belongs, elist.data[i].v);

    if (cno1 != cno2) {
      spanlist.data[spanlist.n] = elist.data[i];
      spanlist.n = spanlist.n + 1;
      applyUnion(belongs, cno1, cno2);
    }
  }
}

/* Return the component that a vertex currently belongs to */
int find(int belongs[], int vertexno) {
  return (belongs[vertexno]);
}

/* Merge component c2 into component c1 */
void applyUnion(int belongs[], int c1, int c2) {
  int i;
  for (i = 0; i < n; i++)
    if (belongs[i] == c2)
      belongs[i] = c1;
}

/* Bubble sort the edge list in increasing order of weight */
void sort() {
  int i, j;
  edge temp;
  for (i = 1; i < elist.n; i++)
    for (j = 0; j < elist.n - 1; j++)
      if (elist.data[j].w > elist.data[j + 1].w) {
        temp = elist.data[j];
        elist.data[j] = elist.data[j + 1];
        elist.data[j + 1] = temp;
      }
}

/* Print the edges of the spanning tree and its total cost */
void print() {
  int i, cost = 0;
  for (i = 0; i < spanlist.n; i++) {
    printf("%d-%d:%d\n", spanlist.data[i].u, spanlist.data[i].v, spanlist.data[i].w);
    cost = cost + spanlist.data[i].w;
  }
  printf("Spanning tree cost: %d\n", cost);
}

int main() {
  n = 6;

  Graph[0][0] = 0;
  Graph[0][1] = 4;
  Graph[0][2] = 4;
  Graph[0][3] = 0;
  Graph[0][4] = 0;
  Graph[0][5] = 0;
  Graph[1][0] = 4;
  Graph[1][1] = 0;
  Graph[1][2] = 2;
  Graph[1][3] = 0;
  Graph[1][4] = 0;
  Graph[1][5] = 0;
  Graph[2][0] = 4;
  Graph[2][1] = 2;
  Graph[2][2] = 0;
  Graph[2][3] = 3;
  Graph[2][4] = 4;
  Graph[2][5] = 0;
  Graph[3][0] = 0;
  Graph[3][1] = 0;
  Graph[3][2] = 3;
  Graph[3][3] = 0;
  Graph[3][4] = 3;
  Graph[3][5] = 0;
  Graph[4][0] = 0;
  Graph[4][1] = 0;
  Graph[4][2] = 4;
  Graph[4][3] = 3;
  Graph[4][4] = 0;
  Graph[4][5] = 0;
  Graph[5][0] = 0;
  Graph[5][1] = 0;
  Graph[5][2] = 2;
  Graph[5][3] = 0;
  Graph[5][4] = 3;
  Graph[5][5] = 0;

  kruskalAlgo();
  print();

  return 0;
}
OUTPUT:
2-1:2
5-2:2
3-2:3
4-3:3
1-0:4
Spanning tree cost: 14