323 Lecture Notes 6

This lecture discusses the Quick Sort algorithm, detailing its worst-case, best-case, and average-case running times, as well as its in-place sorting mechanism through a divide-and-conquer approach. It emphasizes the importance of balanced partitioning for optimal performance and introduces a randomized version of Quick Sort to improve average-case efficiency. The correctness of the algorithm is supported by loop invariants and performance analysis, highlighting the implications of partitioning strategies on running time.


LECTURE 6:

Divide-and-Conquer: Quick Sort

CMPE 323 Algorithms

Cormen, Leiserson, Rivest, Stein, "Introduction to Algorithms", The MIT Press, 2009
Quick Sort
◼ Worst-case running time: Θ(n²); best-case running time: O(n lg n); average-case running time: Θ(n lg n)
◼ Sorts in place. To sort subarray A[p..r]:
1. Divide: Partition A[p..r] into two subarrays A[p..q−1] and A[q+1..r] such that every element of A[p..q−1] ≤ A[q] < every element of A[q+1..r]. Computing the index q is part of this step
2. Conquer: Sort A[p..q−1] and A[q+1..r] by recursive calls
3. Combine: No work is needed, since the subarrays are sorted in place (see the sketch below)
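The slide's pseudocode is not reproduced in this extracted text; as a minimal sketch in plain Python (0-based, inclusive bounds; partition is the procedure sketched together with the PARTITION slide further below):

def quicksort(A, p, r):
    # Sort the subarray A[p..r] in place (inclusive bounds, as in the lecture).
    if p < r:
        q = partition(A, p, r)    # divide: A[p..q-1] <= A[q] < A[q+1..r]
        quicksort(A, p, q - 1)    # conquer: sort the left subarray
        quicksort(A, q + 1, r)    # conquer: sort the right subarray
        # combine: nothing to do, the subarrays are sorted in place

For an n-element list A, the initial call is quicksort(A, 0, len(A) - 1); the lecture's 1-based pseudocode writes this as QUICKSORT(A, 1, length[A]).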

2
Quick Sort

3
Quick Sort
◼ Initial call: QUICKSORT(A, 1, length[A])
◼ Partitioning idea:
❑ Take the last element as the pivot
❑ Find the pivot element’s correct position where
All the elements before the pivot are ≤ pivot’s value
All the elements after the pivot are > pivot’s value

4
Quick Sort

5
PARTITION
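The PARTITION pseudocode shown on this slide is not in the extracted text. Below is a sketch in plain Python of the textbook-style partition that the lecture's line references (Line 1, Lines 3-6) point to; the comments mark the corresponding pseudocode lines:

def partition(A, p, r):
    x = A[r]                          # Line 1: the pivot is the last element
    i = p - 1                         # Line 2: A[p..i] will hold elements <= x
    for j in range(p, r):             # Lines 3-6: scan A[p..r-1]
        if A[j] <= x:
            i += 1
            A[i], A[j] = A[j], A[i]   # grow the "<= x" region
    A[i + 1], A[r] = A[r], A[i + 1]   # put the pivot into its final position q
    return i + 1

For example, on A = [2, 8, 7, 1, 3, 5, 6, 4], the call partition(A, 0, 7) returns 3 and rearranges A so that the pivot 4 sits at index 3, with {2, 1, 3} before it and {7, 5, 6, 8} after it.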

6
Correctness of Quick Sort
◼ We must prove its correctness

◼ Loop Invariant (for Lines 3-6):

For any array index k,

1. If p ≤ k ≤ i, then A[k] ≤ x
2. If i + 1 ≤ k ≤ j − 1, then A[k] > x
3. If k = r, then A[k] = x

7
Correctness of Quick Sort
◼ The loop invariant must hold before the first iteration and be maintained by every iteration; the loop must also terminate:
1. Initialization:
i = p − 1 and j = p, so the ranges in (1) and (2) are empty and both conditions hold vacuously
For k = r, (3) is satisfied since x ← A[r] at Line 1
2. Maintenance: see the figure on the next slide
3. Termination: the loop exits when j = r, where n = r − p + 1 is the size of the subarray
Running time of PARTITION: Θ(n)

8
Correctness of Quick Sort

9
Analysis of Quick Sort
◼ Performance of QuickSort:
❑ The balance of the partitioning determines the performance
❑ Which element is chosen as the pivot is therefore important
❑ Balanced partitioning: as fast as MergeSort
❑ Unbalanced partitioning: as slow as InsertionSort

10
Analysis of Quick Sort
◼ Worst-case partitioning: one subproblem gets n − 1 elements and the other gets 0

◼ Partitioning cost (i.e. the divide step): Θ(n), and T(0) = Θ(1)

◼ So, the recurrence equation is:

T(n) = T(n − 1) + T(0) + Θ(n) = T(n − 1) + Θ(n)

◼ Master's Theorem cannot be applied!

◼ Use the substitution method, or observe that the Θ(n) partitioning cost is incurred n times.
Running time is: Θ(n²)
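Unrolling the recurrence makes the quadratic bound explicit (each level of the recursion contributes a cost proportional to its subproblem size):
T(n) = Θ(n) + Θ(n − 1) + ⋯ + Θ(1) = Θ(n + (n − 1) + ⋯ + 1) = Θ(n²)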

11
Analysis of Quick Sort
◼ In the worst case, QuickSort is as slow as the worst case of InsertionSort!
◼ But for InsertionSort, if the array is already completely sorted, the running time is O(n).
◼ When does QuickSort's worst case occur? On an already sorted array.
◼ Best-case partitioning recurrence equation:
T(n) ≤ 2T(n/2) + Θ(n)
Master's Theorem can be applied (Case 2): T(n) = Θ(n lg n)
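As a quick check of Case 2: here a = 2, b = 2 and f(n) = Θ(n) = Θ(n^(log_2 2)), so the theorem gives T(n) = Θ(n^(log_2 2) · lg n) = Θ(n lg n).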

12
Analysis of Quick Sort
◼ Balanced partitioning:
The average-case running time of QuickSort is much closer to the best case than to the worst case
◼ Assume that the partitioning always produces a 9-to-1
proportional split
◼ The recurrence equation:
T(n) ≤ T(9n/10) + T(n/10) + cn
◼ log_10 n = Θ(lg n), log_{10/9} n = Θ(lg n)
◼ Each time the problem size is divided by 10/9
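A short calculation summarizes the recursion-tree argument (the tree itself is presumably the figure on the next slide): every level of the tree costs at most cn, and the deepest path, the one that always keeps the 9/10 fraction, has length log_{10/9} n = Θ(lg n), so
T(n) ≤ cn · log_{10/9} n + O(n) = O(n lg n)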
13
Analysis of Quick Sort

14
Analysis of Quick Sort
◼ Even though the split is unbalanced, QuickSort runs in O(n lg n)

◼ Any split of constant proportionality yields a recursion tree of depth Θ(lg n), with a cost of O(n) at each level

◼ Thus, the running time is O(n lg n)

15
Randomized Quick Sort
◼ Randomized version of QuickSort:

◼ Add randomization to the algorithm in order to obtain good average-case performance over all inputs

◼ Apply random sampling to decide on the pivot element

16
Randomized Quick Sort
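The pseudocode shown on this slide and the next is not in the extracted text. A minimal sketch in plain Python, assuming the usual random-sampling scheme described on the previous slide (the function names here are mine): pick a uniformly random index in A[p..r], swap that element into position r, and then reuse the ordinary partition from above.

import random

def randomized_partition(A, p, r):
    # Choose the pivot uniformly at random from A[p..r],
    # move it to the end, and partition as before.
    i = random.randint(p, r)
    A[i], A[r] = A[r], A[i]
    return partition(A, p, r)

def randomized_quicksort(A, p, r):
    if p < r:
        q = randomized_partition(A, p, r)
        randomized_quicksort(A, p, q - 1)
        randomized_quicksort(A, q + 1, r)

With this change no single input always triggers the worst case, and the expected running time is Θ(n lg n) for every input.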

17
Randomized Quick Sort

18
