Unit IV Searching & Sorting
SEARCHING AND SORTING: Need of searching and sorting, Concept of internal and
external sorting.
Searching methods: Linear and Binary search algorithms
Sorting Methods: Bubble, Selection, Insertion, Quick, Merge. Comparison of all sorting
methods. Analyze Bubble Sort, Insertion sort, Quick Sort for Best, Worst and Average case.
Introduction to Searching
Target Element/Key: It is the element or item that you want to find within the data
collection. This target could be a value, a record, a key, or any other data entity of interest.
Search Space: It refers to the entire collection of data within which you are looking for
the target element. Depending on the data structure used, the search space may vary in
size and organization.
Complexity: Searching can have different levels of complexity depending on the data
structure and the algorithm used. The complexity is often measured in terms of time and
space requirements.
Binary Search
1. Binary search is a search technique that works efficiently on sorted lists.
2. Binary search follows the divide and conquer approach in which the list is divided into
two halves, and the item is compared with the middle element of the list.
3. If the match is found then, the location of the middle element is returned.
4. Otherwise, we continue the search in either the left or the right half, depending on
whether the target is smaller or larger than the middle element.
Algorithm for binary search
Binary_Search(a, lower_bound, upper_bound, val)
Step 1: set beg = lower_bound, end = upper_bound, pos = -1
Step 2: repeat steps 3 and 4 while beg <= end
Step 3: set mid = (beg + end) / 2
Step 4: if a[mid] = val
            set pos = mid
            print pos
            go to step 6
        else if a[mid] > val
            set end = mid - 1
        else
            set beg = mid + 1
        [end of if]
[end of loop]
Step 5: if pos = -1
            print "Value is not present in the array"
        [end of if]
Step 6: exit
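The stepwise algorithm above can be written as a short C function. The following is a minimal sketch; the function name, the sample array, and the driver in main are illustrative assumptions, not part of the original notes.

#include <stdio.h>

/* Iterative binary search on a sorted array a[0..n-1].
   Returns the index of val, or -1 if val is not present. */
int binary_search(int a[], int n, int val)
{
    int beg = 0, end = n - 1;
    while (beg <= end) {
        int mid = (beg + end) / 2;
        if (a[mid] == val)
            return mid;            /* match found: mid is the position */
        else if (a[mid] > val)
            end = mid - 1;         /* continue in the left half */
        else
            beg = mid + 1;         /* continue in the right half */
    }
    return -1;                     /* val is not present in the array */
}

int main(void)
{
    int a[] = {11, 22, 30, 40, 55, 60};   /* must already be sorted */
    int pos = binary_search(a, 6, 40);
    if (pos == -1)
        printf("Value is not present in the array\n");
    else
        printf("Value found at index %d\n", pos);
    return 0;
}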
Sorting
Sorting of data involves arranging data elements in a specified order, typically
ascending or descending. This process is crucial for improving the efficiency of other
algorithms that require ordered data, and for organizing data to make it easier to
search and analyze.
Concept of Internal & External Sorting
Sorting techniques are classified into two types:
1. Internal Sorting
2. External Sorting
Internal Sorting:
Internal sorting takes place entirely within the primary memory of a computer, i.e. RAM.
It is used when the amount of data is small enough to fit in primary memory at once.
Examples of Internal sorting:
• Bubble sort
• Insertion sort
• Selection sort, etc
External Sorting:
If the input data cannot fit into main memory all at once, it must be kept on a hard disk or another
secondary storage device and sorted in portions. This is called external sorting.
Examples:
• Merge sort
Bubble Sort Algorithm
Bubble sort is a simple and popular algorithm used to arrange the elements of an array in a
particular order. It repeatedly compares pairs of adjacent elements and swaps them if they are
in the wrong order. After each pass, the largest element of the unsorted portion "bubbles up"
to its correct position, and the passes are repeated until no more swaps are needed.
How Bubble Sort Works
Step-by-Step Process:
• Start at the beginning of the list.
• Compare the first two elements.
• Swap them if they are in the wrong order.
• Move to the next pair and repeat.
• Repeat the process until no swaps are needed.
Example:
Let's sort the array: [5, 3, 8, 4, 2]
Pass 1:
Compare 5 and 3 → Swap → [3, 5, 8, 4, 2]
Compare 5 and 8 → No Swap → [3, 5, 8, 4, 2]
Compare 8 and 4 → Swap → [3, 5, 4, 8, 2]
Compare 8 and 2 → Swap → [3, 5, 4, 2, 8]
Pass 2:
Compare 3 and 5 → No Swap → [3, 5, 4, 2, 8]
Compare 5 and 4 → Swap → [3, 4, 5, 2, 8]
Compare 5 and 2 → Swap → [3, 4, 2, 5, 8]
Compare 5 and 8 → No Swap → [3, 4, 2, 5, 8]
Pass 3:
Compare 3 and 4 → No Swap → [3, 4, 2, 5, 8]
Compare 4 and 2 → Swap → [3, 2, 4, 5, 8]
Compare 4 and 5 → No Swap → [3, 2, 4, 5, 8]
Compare 5 and 8 → No Swap → [3, 2, 4, 5, 8]
Pass 4:
Compare 3 and 2 → Swap → [2, 3, 4, 5, 8]
Compare 3 and 4 → No Swap → [2, 3, 4, 5, 8]
Compare 4 and 5 → No Swap → [2, 3, 4, 5, 8]
Compare 5 and 8 → No Swap → [2, 3, 4, 5, 8]
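After Pass 4 the array is sorted: [2, 3, 4, 5, 8]. A compact C sketch of bubble sort is shown below; the early-exit swapped flag and the names used are illustrative assumptions rather than part of the original notes.

#include <stdio.h>

/* Bubble sort: after pass i, the i+1 largest elements are in their final places. */
void bubble_sort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int swapped = 0;
        for (int j = 0; j < n - 1 - i; j++) {
            if (a[j] > a[j + 1]) {          /* adjacent pair out of order */
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
                swapped = 1;
            }
        }
        if (!swapped)                        /* no swaps in this pass: already sorted */
            break;
    }
}

int main(void)
{
    int a[] = {5, 3, 8, 4, 2};
    bubble_sort(a, 5);
    for (int i = 0; i < 5; i++)
        printf("%d ", a[i]);                 /* prints: 2 3 4 5 8 */
    printf("\n");
    return 0;
}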
Selection Sort Algorithm
Selection Sort works by repeatedly selecting the minimum element from the unsorted portion of the list and placing it at the beginning of that portion.
Algorithm
1. Start with the first element as the minimum.
2. Iterate through the unsorted portion of the list to find the minimum element.
3. Swap the minimum element found with the first element of the unsorted portion.
4. Move the boundary between sorted and unsorted portions one element to the right.
5. Repeat until the entire list is sorted.
Pseudocode Selection Sort
function selectionSort(array):
    n = length(array)
    for i from 0 to n - 1:
        minIndex = i
        for j from i + 1 to n - 1:
            if array[j] < array[minIndex]:
                minIndex = j
        swap(array[i], array[minIndex])
    return array
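A runnable C version of the pseudocode above might look as follows; the function and variable names are assumptions for illustration, and it can be driven by a main like the one shown for bubble sort.

/* Selection sort: grow the sorted prefix by one element per pass. */
void selection_sort(int a[], int n)
{
    for (int i = 0; i < n - 1; i++) {
        int min_index = i;                   /* assume a[i] is the minimum */
        for (int j = i + 1; j < n; j++)      /* scan the unsorted portion */
            if (a[j] < a[min_index])
                min_index = j;
        int tmp = a[i];                      /* swap the minimum into position i */
        a[i] = a[min_index];
        a[min_index] = tmp;
    }
}

Calling selection_sort(a, 5) on the array {64, 25, 12, 22, 11} produces the trace shown in the example below.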
Example:
Let's sort the array [64, 25, 12, 22, 11] using Selection Sort.
1.Initial Array: [64, 25, 12, 22, 11]
•i = 0: Minimum is 64 (index 0)
•Compare with 25, 12, 22, 11 → found 11 (index 4)
•Swap: [11, 25, 12, 22, 64]
2.Next Iteration: [11, 25, 12, 22, 64]
•i = 1: Minimum is 25 (index 1)
•Compare with 12, 22, 64 → found 12 (index 2)
•Swap: [11, 12, 25, 22, 64]
3.Next Iteration: [11, 12, 25, 22, 64]
•i = 2: Minimum is 25 (index 2)
•Compare with 22, 64 → found 22 (index 3)
•Swap: [11, 12, 22, 25, 64]
4.Next Iteration: [11, 12, 22, 25, 64]
•i = 3: Minimum is 25 (index 3)
•Compare with 64 → no change
•Swap: [11, 12, 22, 25, 64] (no change)
5.Last Element: [11, 12, 22, 25, 64]
•i = 4: Already sorted
Final Sorted Array
The final sorted array is [11, 12, 22, 25, 64].
Summary
•Time Complexity: O(n²)
•Space Complexity: O(1) (in-place)
•Stable: No
Insertion Sort Algorithm
Insertion Sort is a simple and intuitive sorting algorithm that builds the final
sorted array (or list) one item at a time. It is much like sorting playing cards in
your hands. The algorithm is efficient for small data sets and has a time
complexity of O(n²) in the average and worst cases, but on average it
performs better than selection sort.
Algorithm
1. Start from the second element (index 1) of the array.
2. Compare the current element with the elements in the sorted portion (to the left).
3. Shift the larger elements in the sorted portion to the right to make space for the
current element.
4. Insert the current element into its correct position.
5. Repeat until the entire array is sorted.
Pseudocode for Insertion Sort
function insertionSort(array):
    n = length(array)
    for i from 1 to n - 1:
        key = array[i]
        j = i - 1
        while j >= 0 and array[j] > key:
            array[j + 1] = array[j]
            j = j - 1
        array[j + 1] = key
    return array
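A C sketch corresponding to this pseudocode is given below; the names are illustrative, and a driver like the one shown for bubble sort can be used to test it.

/* Insertion sort: insert a[i] into the already-sorted part a[0..i-1]. */
void insertion_sort(int a[], int n)
{
    for (int i = 1; i < n; i++) {
        int key = a[i];                      /* element to be inserted */
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];                 /* shift larger elements one place right */
            j = j - 1;
        }
        a[j + 1] = key;                      /* place key in its correct slot */
    }
}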
Example
Let's sort the array [64, 25, 12, 22, 11] using Insertion Sort.
1. i = 1, key = 25: 64 shifts right → [25, 64, 12, 22, 11]
2. i = 2, key = 12: 64 and 25 shift right → [12, 25, 64, 22, 11]
3. i = 3, key = 22: 64 and 25 shift right → [12, 22, 25, 64, 11]
4. i = 4, key = 11: 64, 25, 22 and 12 shift right → [11, 12, 22, 25, 64]
The final sorted array is [11, 12, 22, 25, 64].
Summary
•Time Complexity:
•Best Case: O(n) (when the array is already sorted)
•Average/Worst Case: O(n²)
•Space Complexity: O(1) (in-place)
•Stable: Yes
Insertion Sort is particularly useful for small arrays or when the array is already partially sorted, making it a
commonly used algorithm in practice.
Quick Sort Algorithm
Quick Sort is a highly efficient sorting algorithm and is based on the divide-and-conquer
principle. It works by selecting a "pivot" element from the array and partitioning the other
elements into two sub-arrays: those less than the pivot and those greater than the pivot. The sub-
arrays are then sorted recursively.
Algorithm
1. Choose a Pivot: Select an element from the array as the pivot. This can be done in various
ways (e.g., first element, last element, random element, median).
2. Partitioning: Rearrange the array so that all elements less than the pivot are on the left, and all
elements greater than the pivot are on the right. The pivot will be in its final position.
3. Recursion: Recursively apply the above steps to the sub-arrays formed on the left and right of
the pivot.
Pseudocode Quick Sort
function quickSort(array, low, high):
    if low < high:
        pivotIndex = partition(array, low, high)
        quickSort(array, low, pivotIndex - 1)   // Recursively sort the left sub-array
        quickSort(array, pivotIndex + 1, high)  // Recursively sort the right sub-array
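The pseudocode calls a partition function whose details are not shown in these notes. One common choice, assumed here purely for illustration, is the Lomuto scheme with the last element as the pivot; a C sketch of both functions follows.

/* Lomuto partition: place the pivot (a[high]) in its final position and
   return that position; smaller elements end up on its left. */
int partition(int a[], int low, int high)
{
    int pivot = a[high];
    int i = low - 1;                         /* end of the "less than pivot" region */
    for (int j = low; j < high; j++) {
        if (a[j] < pivot) {
            i++;
            int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
        }
    }
    int tmp = a[i + 1]; a[i + 1] = a[high]; a[high] = tmp;
    return i + 1;                            /* pivot's final position */
}

void quick_sort(int a[], int low, int high)
{
    if (low < high) {
        int p = partition(a, low, high);
        quick_sort(a, low, p - 1);           /* sort the left sub-array */
        quick_sort(a, p + 1, high);          /* sort the right sub-array */
    }
}

An initial call of quick_sort(a, 0, n - 1) sorts the whole array.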
Summary
•Time Complexity:
•Best/Average Case: O(n log n)
•Worst Case: O(n²) (occurs when the smallest or largest element is always chosen as the pivot)
•Space Complexity: O(log n) (due to the recursion stack)
•Stable: No (equal elements may not preserve their original order)
Quick Sort is often preferred because of its efficiency and good average-case performance.
Merge Sort Algorithm
Merge Sort is a classic divide-and-conquer sorting algorithm. It works by recursively dividing the
array into halves until each sub-array contains a single element, and then merging those sub-arrays
back together in sorted order.
Algorithm
1. Divide: Split the array into two halves.
2. Conquer: Recursively sort each half.
3. Merge: Combine the two sorted halves into a single sorted array.
Pseudocode for Merge Sort
function mergeSort(array):
    if length(array) <= 1:
        return array
    mid = length(array) / 2
    left = mergeSort(array[0:mid])
    right = mergeSort(array[mid:length(array)])
    return merge(left, right)

function merge(left, right):
    sortedArray = []
    i = 0
    j = 0
    while i < length(left) and j < length(right):
        if left[i] <= right[j]:
            append left[i] to sortedArray
            i = i + 1
        else:
            append right[j] to sortedArray
            j = j + 1
    while i < length(left):
        append left[i] to sortedArray
        i = i + 1
    while j < length(right):
        append right[j] to sortedArray
        j = j + 1
    return sortedArray
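The pseudocode above builds new arrays at every level; C implementations usually merge into a temporary buffer instead. The sketch below sorts a[left..right] in place; the names and the variable-length temporary array are illustrative assumptions.

/* Merge the sorted halves a[left..mid] and a[mid+1..right]. */
void merge(int a[], int left, int mid, int right)
{
    int n = right - left + 1;
    int temp[n];                             /* temporary buffer for the merge */
    int i = left, j = mid + 1, k = 0;
    while (i <= mid && j <= right)           /* take the smaller front element */
        temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid)  temp[k++] = a[i++];    /* copy leftovers of the left half */
    while (j <= right) temp[k++] = a[j++];   /* copy leftovers of the right half */
    for (k = 0; k < n; k++)
        a[left + k] = temp[k];               /* copy the merged result back */
}

void merge_sort(int a[], int left, int right)
{
    if (left < right) {
        int mid = (left + right) / 2;
        merge_sort(a, left, mid);            /* sort the left half */
        merge_sort(a, mid + 1, right);       /* sort the right half */
        merge(a, left, mid, right);          /* combine the two sorted halves */
    }
}

Using <= when comparing keeps the sort stable, since equal elements from the left half are taken first.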
Example
Let's sort the array [64, 25, 12, 22, 11] using Merge Sort.
1.Initial Array: [64, 25, 12, 22, 11]
•Divide:
•Split into two halves: [64, 25] and [12, 22, 11]
•Sorting the Left Half: [64, 25]
•Split further into: [64] and [25]
•Both are single elements, so they are considered sorted.
•Merge: [64] and [25]
•Compare: 25 < 64, so the merged array is [25, 64]
•Sorting the Right Half: [12, 22, 11]
•Split into: [12] and [22, 11]
•[12] is already sorted.
•Sort [22, 11]:
•Split into: [22] and [11]
•Merge: [22] and [11]
•Compare: 11 < 22, so the merged array is [11, 22]
•Now merge: [12] and [11, 22]
•Compare: 11 < 12, so merged array becomes [11], then 12, and finally 22.
•Final merged array is [11, 12, 22]
2.Final Merge: Merge [25, 64] and [11, 12, 22]
•Compare:
•11 < 25 → add 11
•12 < 25 → add 12
•22 < 25 → add 22
•Finally, add 25 and 64
•Resulting sorted array: [11, 12, 22, 25, 64]
Summary
•Time Complexity:
•Best/Average/Worst Case: O(n log n)
•Space Complexity: O(n) (due to the temporary arrays used for merging)
•Stable: Yes (preserves the order of equal elements)
Merge Sort is especially useful for large data sets and is often preferred in situations where stability is important.
Its predictable performance makes it a popular choice for sorting tasks.
Comparison of all sorting algorithms
The table below summarizes the behaviour of the sorting methods covered in this unit.
Algorithm        Best Case     Average Case   Worst Case    Space      Stable
Bubble Sort      O(n)          O(n²)          O(n²)         O(1)       Yes
Selection Sort   O(n²)         O(n²)          O(n²)         O(1)       No
Insertion Sort   O(n)          O(n²)          O(n²)         O(1)       Yes
Quick Sort       O(n log n)    O(n log n)     O(n²)         O(log n)   No
Merge Sort       O(n log n)    O(n log n)     O(n log n)    O(n)       Yes
THANK YOU