
Lecture 5

QUICKSORT
Learning outcomes
At the end of this class, you should be able to:
 Implement the Quicksort algorithm
 Determine the running time of Quicksort

COMP 122
In-place and stability
 An in-place sorting algorithm is one that uses no additional array
storage.
 A sorting algorithm is stable if duplicate elements remain in the
same relative position after sorting.
QuickSort
Another Divide-and-Conquer sorting algorithm…
 As it turns out, MERGESORT and HEAPSORT, although O(n lg n) in
their time complexity, have fairly large constants and tend to move
data around more than desirable (e.g., equal-key items may not
maintain their relative position from input to output).
 We introduce another algorithm with better constants, but a flaw: its
worst case is O(n²). Fortunately, the worst case is “rare enough” that
the speed advantages win out an overwhelming amount of the
time… and it is O(n lg n) on average.
Introduction
As in MERGESORT, we use Divide-and-Conquer:
1. Divide: partition A[p..r] into two subarrays A[p..q-1] and A[q+1..r] such
that each element of A[p..q-1] is ≤ A[q], and each element of A[q+1..r] is ≥
A[q]. Compute q as part of this partitioning.
2. Conquer: sort the subarrays A[p..q-1] and A[q+1..r] by recursive calls to
QUICKSORT.
3. Combine: the partitioning and recursive sorting leave us with a sorted
A[p..r] – no work is needed here.
An obvious difference is that we do most of the work in the divide stage,
with none in the combine stage.
QuickSort
The Pseudo-Code
Partition
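The two procedures can be sketched in Python as follows (a Lomuto-style PARTITION with A[r] as the pivot is assumed, matching the description above; the function names are illustrative):

```python
def partition(A, p, r):
    """Rearrange A[p..r] so that A[p..q-1] <= x and A[q+1..r] > x,
    where x = A[r] is the pivot; return the pivot's final index q."""
    x = A[r]                      # pivot
    i = p - 1                     # boundary of the "<= x" region
    for j in range(p, r):         # the loop the invariant proof refers to
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]   # place the pivot between the regions
    return i + 1

def quicksort(A, p, r):
    if p < r:
        q = partition(A, p, r)    # Divide
        quicksort(A, p, q - 1)    # Conquer (left)
        quicksort(A, q + 1, r)    # Conquer (right); Combine is a no-op

data = [2, 8, 7, 1, 3, 5, 6, 4]
quicksort(data, 0, len(data) - 1)
print(data)   # [1, 2, 3, 4, 5, 6, 7, 8]
```

Note that all the work happens in `partition`; once both recursive calls return, A[p..r] is already sorted.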
Proof of Correctness: PARTITION
We look for a loop invariant, and we observe that at the
beginning of each iteration of the loop (lines 3-6), for any
array index k:

1. If p ≤ k ≤ i, then A[k] ≤ x;
2. If i+1 ≤ k ≤ j-1, then A[k] > x;
3. If k = r, then A[k] = x;
4. If j ≤ k ≤ r-1, then we don’t know anything about A[k].
The Invariant
Initialization. Before the first iteration: i = p-1, j = p. There are no values between p and i,
and no values between i+1 and j-1, so the first two conditions are trivially satisfied; the
initial assignment of the pivot satisfies 3.
Maintenance. Two cases:
◦ 1. A[j] > x: only j is incremented, which extends condition 2 to cover the new element.
◦ 2. A[j] ≤ x: i is incremented, A[i] and A[j] are swapped, and then j is incremented, so
conditions 1 and 2 both continue to hold.
The Invariant
Termination. j = r. Every entry in the array is in one of the three sets described by the
invariant: we have partitioned the values in the array into three sets – those less than or
equal to x, those greater than x, and a singleton containing x.

The running time of PARTITION on A[p..r] is Θ(n), where n = r – p + 1.

Performance – a quick look
We first look at (apparent) worst-case partitioning:
T(n) = T(n-1) + T(0) + Θ(n) = T(n-1) + Θ(n).
It is easy to show – using substitution – that T(n) = Θ(n²).
We next look at (apparent) best-case partitioning:
T(n) = 2T(n/2) + Θ(n).
It is also easy to show (case 2 of the Master Theorem) that T(n) = Θ(n lg n).
Since the disparity between the two is substantial, we need to look further…
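The gap between the two recurrences can be observed empirically. The sketch below (an illustrative helper, not part of the lecture) counts element comparisons: on an already-sorted array the last-element pivot gives maximally unbalanced splits and exactly n(n-1)/2 comparisons, while a random permutation stays close to the Θ(n lg n) behavior:

```python
import random

def quicksort_count(A, p, r):
    """Quicksort with a Lomuto partition; returns the number of
    element comparisons performed."""
    if p >= r:
        return 0
    x, i, comps = A[r], p - 1, 0
    for j in range(p, r):
        comps += 1
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]
    A[i + 1], A[r] = A[r], A[i + 1]
    q = i + 1
    return comps + quicksort_count(A, p, q - 1) + quicksort_count(A, q + 1, r)

n = 200
worst = quicksort_count(list(range(n)), 0, n - 1)   # sorted input: Θ(n²) splits
avg = quicksort_count(random.sample(range(n), n), 0, n - 1)  # random input
print(worst, avg)   # worst = n(n-1)/2 = 19900; avg is far smaller
```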
Review of sorting algorithms
 An in-place sorting algorithm is one that uses no additional array
storage (however, we allow Quicksort to be called in-place even
though it needs a stack of size O(log n) to keep track of the
recursion).
 A sorting algorithm is stable if duplicate elements remain in the
same relative position after sorting.
Slow algorithms
 BubbleSort, InsertionSort and SelectionSort.
 They are all Θ(n²) in-place sorting algorithms.
 BubbleSort and InsertionSort can be implemented as stable
algorithms, but SelectionSort cannot (without significant
modifications).
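As an illustration of stability in the slow algorithms, here is a sketch of a stable InsertionSort (the `key` parameter is an addition for the demonstration): an element shifts left only past strictly greater keys, so equal-key items keep their input order.

```python
def insertion_sort(items, key=lambda v: v):
    """Stable Θ(n²) in-place insertion sort."""
    for i in range(1, len(items)):
        v = items[i]
        j = i - 1
        while j >= 0 and key(items[j]) > key(v):   # strict > preserves stability
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = v

pairs = [(2, 'a'), (1, 'b'), (2, 'c'), (1, 'd')]
insertion_sort(pairs, key=lambda t: t[0])
print(pairs)   # [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')] – equal keys keep order
```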
Mergesort
 Mergesort is a stable Θ(n log n) sorting algorithm.
 The downside is that, of the fast algorithms reviewed here, MergeSort is the
only one that requires additional array storage, implying that it is not an
in-place algorithm.
Quicksort
 Widely regarded as the fastest of the fast algorithms.
 This algorithm is O(n log n) in the expected case, and O(n²) in the
worst case. The probability that the algorithm takes
asymptotically longer (assuming that the pivot is chosen
randomly) is extremely small for large n.
 It is an (almost) in-place sorting algorithm, but it is not stable.
Heapsort
 Heapsort is based on a nice data structure, called a heap, which is a fast
priority queue.
 Elements can be inserted into a heap in O(log n) time, and the largest item
can be extracted in O(log n) time. (It is also easy to set up a heap for
extracting the smallest item.) If you only want to extract the k largest values,
a heap allows you to do this in O(n + k log n) time.
 It is an in-place algorithm, but it is not stable.
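The O(n + k log n) extraction of the k largest values can be sketched with Python's standard-library `heapq` module (which provides a min-heap, so values are negated here to simulate a max-heap):

```python
import heapq
import random

data = random.sample(range(10_000), 1_000)

heap = [-v for v in data]   # negate so the min-heap behaves as a max-heap
heapq.heapify(heap)         # build the heap in O(n)

k = 5
largest = [-heapq.heappop(heap) for _ in range(k)]   # k pops, O(k log n)
print(largest)   # the 5 largest values, in decreasing order
```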