
CSCI 270 - Algorithms

Department of Computer Science


Nazarbayev University
Week 3: Sorting Algorithms
The problem of sorting
Input: sequence ⟨a₁, a₂, …, aₙ⟩ of numbers.

Output: permutation ⟨a′₁, a′₂, …, a′ₙ⟩ of the input such that a′₁ ≤ a′₂ ≤ … ≤ a′ₙ.

Example:
Input: 8 2 4 9 3 6

Output: 2 3 4 6 8 9
The problem of sorting
Stability: A stable sorting algorithm maintains the relative order of records with equal keys (i.e., equal sort values).

● A sorting algorithm is stable if, whenever two records R and S have the same key and R appears before S in the input list, R also appears before S in the sorted list.

In-Place: An in-place sorting algorithm needs no extra array; it produces its output in the same memory that holds the input, transforming the data 'in place'.

● However, a small constant amount of extra space for variables is allowed.
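As a small illustration of stability, here is a short Python snippet (Python's built-in sorted is documented to be stable, so records with equal keys keep their original order):

records = [("R", 2), ("X", 1), ("S", 2)]          # R and S share the key 2; R comes first
by_key = sorted(records, key=lambda rec: rec[1])  # sort by the numeric key only
print(by_key)   # [('X', 1), ('R', 2), ('S', 2)] -- R still appears before S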


Sorting Algorithms
1. Selection Sort
2. Insertion Sort
3. Merge Sort
4. Quick Sort
5. Linear-time Sorting Algorithms (Radix Sort and Counting Sort)
Selection Sort
Algorithm

1. Find the minimum value in the list


2. Swap it with the value in the first position
3. Repeat the steps above for the remainder of the list (starting at the second position
and advancing each time)
Selection Sort (Iterative)
Pseudo-code:

for i = 1 to n do:
    minIndex = i
    for j = i + 1 to n do:
        if A[j] < A[minIndex] then:
            minIndex = j
    Swap A[i] with A[minIndex]
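For concreteness, a minimal Python version of this pseudo-code (0-indexed and sorting in place; the name selection_sort is ours):

def selection_sort(A):
    """In-place selection sort: repeatedly move the minimum of A[i:] to position i."""
    n = len(A)
    for i in range(n):
        min_index = i
        for j in range(i + 1, n):
            if A[j] < A[min_index]:
                min_index = j
        A[i], A[min_index] = A[min_index], A[i]   # swap the minimum into position i
    return A

# Example: selection_sort([64, 25, 12, 22, 11]) -> [11, 12, 22, 25, 64]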


Selection Sort (Recursive)
Sort(A, left, right):
    if left ≥ right return
    minIndex = left
    for j = left + 1 to right do:
        if A[j] < A[minIndex]:
            minIndex = j
    Swap A[left] with A[minIndex]
    Sort(A, left+1, right)
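The same algorithm as a recursive Python sketch (a direct translation; note the recursion depth is n, so the iterative version above is preferable for large inputs):

def selection_sort_rec(A, left=0):
    """Recursively place the minimum of A[left:] at position left."""
    if left >= len(A) - 1:
        return A
    min_index = left
    for j in range(left + 1, len(A)):
        if A[j] < A[min_index]:
            min_index = j
    A[left], A[min_index] = A[min_index], A[left]
    return selection_sort_rec(A, left + 1)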
Example
Input: 64 25 12 22 11

11 25 12 22 64

11 12 25 22 64

11 12 22 25 64

11 12 22 25 64
Running Time for Selection Sort
● T(n) = time to sort an interval of length n.
● How long does it take to find the minimum?
○ Θ(n).
● T(n) = n + T(n-1)
● T(1) = 1
● Solution?
○ T(n) = n + (n-1) + … + 1 = n(n+1)/2, so T(n) = Θ(n²).
● What if the input is already sorted?
○ Still Θ(n²): selection sort always scans the whole remaining interval to find the minimum, no matter how the input is arranged.
Insertion Sort
Can we do better?

● For example, if the input array is already sorted, the algorithm should run in time Θ(n).
● Yes! Insertion Sort.
Insertion Sort
Example of Insertion Sort
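A minimal Python sketch of insertion sort (0-indexed, in place), matching the correctness argument below: the inner while loop shifts larger elements of the sorted prefix one slot to the right until the correct spot for A[i] is found.

def insertion_sort(A):
    """In-place insertion sort: grow a sorted prefix A[0..i] one element at a time."""
    for i in range(1, len(A)):
        key = A[i]
        j = i - 1
        # Shift elements of the sorted prefix that are greater than key one slot right.
        while j >= 0 and A[j] > key:
            A[j + 1] = A[j]
            j -= 1
        A[j + 1] = key   # insert key into its correct position
    return A

# Example: insertion_sort([8, 2, 4, 9, 3, 6]) -> [2, 3, 4, 6, 8, 9]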
Correctness of Insertion Sort
We will show (by induction) that, for each i, the algorithm correctly sorts the subarray
A[1…i].

● For i = 1, this is obviously true.


● For larger i, we assume the claim holds for i – 1; i.e., A[1…i-1] is sorted at the beginning of the i-th iteration.
● Then, in the inner while loop, the algorithm finds the correct spot for A[i] and inserts A[i] there.
● Therefore A[1…i] is sorted.
● Hence by induction, A[1…n] is sorted.
Insertion sort analysis

For an array already sorted in ascending order (A = ⟨1, 2, 3, …, n⟩), how many steps does our algorithm take?

● Θ(n): each A[i] is compared only with its immediate predecessor, so the inner while loop exits immediately.
Better Idea: Divide and Conquer
● n² is unacceptable.
○ Sorting 10⁶ numbers will take about 10 minutes even on the fastest CPU.
○ However, with divide and conquer, sorting 10⁶ numbers takes under 1 millisecond even on an average laptop.
● Suppose SORT(l, r) sorts the interval
○ A[l], A[l+1], …, A[r].
● Let's sort the left and right halves recursively.
● How can we combine the two sorted halves?
○ Merge them: this is merge sort (see the sketch below).
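A minimal Python sketch of this divide-and-conquer approach (the names merge_sort and merge are ours; the merge step is what the next slides analyze):

def merge(L, R):
    """Merge two already-sorted lists into one sorted list in linear time."""
    out = []
    i = j = 0
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:              # <= keeps equal keys in order (stable)
            out.append(L[i]); i += 1
        else:
            out.append(R[j]); j += 1
    out.extend(L[i:])                 # at most one of these still has elements
    out.extend(R[j:])
    return out

def merge_sort(A):
    """Sort A by sorting each half recursively and merging the results."""
    if len(A) <= 1:
        return A
    mid = len(A) // 2
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))

# Example: merge_sort([8, 2, 4, 9, 3, 6]) -> [2, 3, 4, 6, 8, 9]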
Merge Sort
Merging Two Sorted Arrays
[Figure: merging the sorted arrays ⟨2, 7, 13, 20⟩ and ⟨1, 9, 11, 12⟩ by repeatedly removing the smaller of the two front elements.]

Time = Θ(n) to merge a total of n elements (linear time).


Time Complexity of Merge Sort

T(n) = 2T(n/2) + Θ(n): two recursive sorts of half the size, plus a linear-time merge.

This recurrence solves to T(n) = Θ(n lg n).
Sorting Overview
● Selection-Sort:
○ Very simple to implement.
○ In-place (does not use additional memory).
○ For all inputs, runs in time Θ(n²).
● Insertion-Sort:
○ Also easy to implement.
○ In-place.
○ Runs in time O(n²),
○ or, more precisely, O(n + I) where I is the number of inversions,
○ so it runs faster on nearly sorted arrays.
● Merge-Sort:
○ More difficult to implement.
○ Not in-place: needs extra memory (roughly 2× the input) for the merging step.
○ Always runs in time Θ(n lg n).
