
19CCE202 Data Structures and Algorithms

LECTURE 6 – ALGORITHMS ON DATA STRUCTURES: SORTING


PART 1
Dr. R. Ramanathan
Associate Professor
Department of Electronics and Communication Engineering
Amrita School of Engineering, Coimbatore
[email protected]
Objective
To study and analyze algorithms on data structures, in particular sorting on arrays

Key concepts
• Concept of Sorting [CO04]
• Parameters [CO04]
• Bubble sort - concept, algorithm and illustration [CO02, CO04]
• Complexity and Stability [CO02, CO03]
Sorting

`Sorting' usually refers to bringing a set of items into some well-defined order. Sorting is important because having the items in order makes it much easier to find a given item.

Internal sorting - takes place in the main memory, where we can take advantage of the random access nature of the main memory.
External sorting - necessary when the number and size of objects are prohibitive to be accommodated in the main memory.

• enumeration sorting: If we know that there are N items which are smaller than the one we are currently considering, then its final position will be at number N + 1.
• exchange sorting: If two items are found to be out of order, exchange them. Repeat till all items are in order.
• selection sorting: Find the smallest item, put it in the first position, find the smallest of the remaining items, put it in the second position, and so on (a sketch of this approach follows the list).
• insertion sorting: Take the items one at a time and insert them into an initially empty data structure such that the data structure continues to be sorted at each stage.
• divide and conquer: Recursively split the problem into smaller sub-problems till you just have single items that are trivial to sort. Then put the sorted `parts' back together in a way that preserves the sorting.
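As an illustration of the selection-sorting approach described above, here is a minimal C sketch (the function name selection_sort and the sample array are illustrative assumptions, not part of the lecture):

#include <stdio.h>

/* Selection sort: find the smallest item, put it in the first position,
   find the smallest of the remaining items, put it in the second position, ... */
void selection_sort(int a[], int n) {
    for (int i = 0; i < n - 1; i++) {
        int min = i;                          /* index of smallest item in a[i..n-1] */
        for (int j = i + 1; j < n; j++) {
            if (a[j] < a[min]) {
                min = j;
            }
        }
        int tmp = a[i];                       /* place it at position i */
        a[i] = a[min];
        a[min] = tmp;
    }
}

int main(void) {
    int a[] = {29, 10, 14, 37, 13};
    int n = sizeof(a) / sizeof(a[0]);
    selection_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d ", a[i]);   /* prints: 10 13 14 29 37 */
    printf("\n");
    return 0;
}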
Illustration

Fig. Decision tree for the three item example


Parameters

Stability
A sorting algorithm is stable if it maintains the relative order of elements with equal keys (equivalent elements retain their relative positions even after sorting).

Adaptability
The complexity changes based on presortedness [quick sort] - presortedness of the input affects the running time. Algorithms that take this into account are known to be adaptive.

Number of Comparisons
• sorting algorithms are classified based on the number of comparisons.
• best case behavior is O(n log n) and worst case behavior is O(n²).
• they evaluate the elements of the list by a key comparison operation and need at least O(n log n) comparisons for most inputs.

Number of Swaps
• sorting algorithms are categorized by the number of swaps (also called inversions).

Memory Usage
• sorting algorithms are "in place" if they need only O(1) or O(log n) memory to create auxiliary locations for sorting the data temporarily.

Recursion
• sorting algorithms are either recursive [quick sort] or non-recursive [selection sort, insertion sort], and there are some algorithms which use both (merge sort).
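To make the stability parameter concrete, here is a small C sketch (the record type, tags, and the use of insertion sort are illustrative choices, not from the lecture). Insertion sort is stable because a record is never shifted past another record with an equal key, so equal-keyed records keep their original relative order:

#include <stdio.h>

/* A record with a sort key and a tag remembering its original order. */
struct rec { int key; char tag; };

/* Stable insertion sort: the while-condition uses a strict '>',
   so records with equal keys are never reordered. */
void insertion_sort(struct rec a[], int n) {
    for (int i = 1; i < n; i++) {
        struct rec cur = a[i];
        int j = i - 1;
        while (j >= 0 && a[j].key > cur.key) {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = cur;
    }
}

int main(void) {
    struct rec a[] = { {2, 'a'}, {1, 'b'}, {2, 'c'}, {1, 'd'} };
    int n = sizeof(a) / sizeof(a[0]);
    insertion_sort(a, n);
    for (int i = 0; i < n; i++) printf("%d%c ", a[i].key, a[i].tag);
    printf("\n");   /* prints: 1b 1d 2a 2c -- equal keys keep their input order */
    return 0;
}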
Bubble Sort

• simplest sorting algorithm, follows the exchange sort approach.
• works by iterating over the input array from the first element,
• comparing each pair of elements and swapping them if needed.
• continues its iterations until no more swaps are needed.
• gets its name from the way smaller elements "bubble" to the top of the list.

• Assume we have an array a of size n that we wish to sort.
• Bubble Sort starts by comparing a[n-1] with a[n-2] and swaps them if they are in the wrong order. (The reverse direction, bubbling the largest element to the back, also works - that is what Algorithm 10 below does.)
• It then compares a[n-2] and a[n-3] and swaps those if need be, and so on.
• This means that once it reaches a[0], the smallest entry will be in the correct place.
• It then starts from the back again, comparing pairs of `neighbours', but leaving the zeroth entry alone (which is known to be correct).
• After it has reached the front again, the second-smallest entry will be in place. It keeps making `passes' over the array until it is sorted.
• More generally, at the ith stage Bubble Sort compares neighbouring entries `from the back', swapping them as needed.
• The item with the lowest index that is compared to its right neighbour is a[i-1].
• After the ith stage, the entries a[0], ..., a[i-1] are in their final positions.

Bubble sort - Illustration


Bubble sort - Algorithm
Algorithm 10: BUBBLE(DATA, N) [Here DATA is an array with N elements. This
algorithm sorts elements in DATA.]
1. Repeat steps 2 to 5 for K = 1 to N-1
2. Set PTR = 1. [Initialize pass pointer PTR]
3. Repeat steps 4 and 5 while PTR <= N – K. [Execute pass]
4. If DATA[PTR] > DATA [PTR + 1], then
Interchange DATA[PTR] and DATA [PTR + 1]
[End of If structure]
5. Set PTR = PTR + 1
[End of inner loop]
[End of step 1 outer loop]
6. Exit
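A minimal C translation of Algorithm 10 (the int array, the function name bubble, and the driver code are assumptions for illustration; 0-based C indexing is used, so the pass compares data[ptr] with data[ptr+1] for ptr = 0 .. n-k-1):

#include <stdio.h>

/* Bubble sort following Algorithm 10. Pass k compares each pair of
   neighbouring entries and swaps them if they are out of order, so
   after pass k the k largest items sit in their final positions
   at the end of the array. */
void bubble(int data[], int n) {
    for (int k = 1; k <= n - 1; k++) {               /* step 1: outer loop over passes */
        for (int ptr = 0; ptr <= n - k - 1; ptr++) { /* steps 2, 3, 5: execute one pass */
            if (data[ptr] > data[ptr + 1]) {         /* step 4: out of order? */
                int tmp = data[ptr];                 /* interchange the pair */
                data[ptr] = data[ptr + 1];
                data[ptr + 1] = tmp;
            }
        }
    }
}

int main(void) {
    int data[] = {32, 51, 27, 85, 66, 23, 13, 57};
    int n = sizeof(data) / sizeof(data[0]);
    bubble(data, n);
    for (int i = 0; i < n; i++) printf("%d ", data[i]);  /* 13 23 27 32 51 57 66 85 */
    printf("\n");
    return 0;
}

The pseudocode's 1-based PTR = 1 .. N-K maps to ptr = 0 .. n-k-1 here; the number of comparisons per pass (n - k) is unchanged.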
Complexity and Stability

For comparison-based sorting algorithms, the time complexity will be measured by counting the number of comparisons that are being made.

The outer loop is carried out n-1 times. During pass i, the inner loop is carried out (n-1) - (i-1) = n - i times.

The worst case and average case number of comparisons are both proportional to n², and hence the average and worst case time complexities are O(n²).
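Summing the inner-loop counts over all n-1 passes makes the quadratic bound explicit:

    (n - 1) + (n - 2) + ... + 2 + 1 = n(n - 1)/2

comparisons in total, which grows proportionally to n² and is therefore O(n²).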

Stability: Bubble sort is stable because no item is swapped past another unless they are in the wrong order, so items with identical keys will have their original order preserved.
