
QuickSort

1
Merge Sort
• Merge sort:
– divide a list into two halves
– sort the halves
– recombine the sorted halves into a sorted whole

• Merge sort is an example of a “divide and conquer” algorithm

• divide and conquer algorithm: an algorithm that repeatedly divides the given
  problem into smaller pieces that can be solved more easily
  – it’s easier to sort the two small lists than the one big list

2
Merge Sort Picture
index:   0   1   2   3   4   5   6   7
        22  18  12  -4  58   7  31  42

split:  22 18 12 -4 | 58 7 31 42
        22 18 | 12 -4 | 58 7 | 31 42
        22 | 18 | 12 | -4 | 58 | 7 | 31 | 42

merge:  18 22 | -4 12 | 7 58 | 31 42
        -4 12 18 22 | 7 31 42 58

index:   0   1   2   3   4   5   6   7
        -4   7  12  18  22  31  42  58
3
def merge_sort(arr):
    # Base case: if the list has 1 or 0 elements, it's already sorted
    if len(arr) <= 1:
        return arr

    # Find the middle index to divide the array into two halves
    mid = len(arr) // 2

    # Recursively sort the left half
    left_half = merge_sort(arr[:mid])

    # Recursively sort the right half
    right_half = merge_sort(arr[mid:])

    # Merge the sorted halves
    return merge(left_half, right_half)


def merge(left, right):
    result = []
    i = j = 0

    # Merge the two sorted lists into result
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            result.append(left[i])   # Take from left
            i += 1
        else:
            result.append(right[j])  # Take from right
            j += 1

    # Add any remaining elements from left
    result.extend(left[i:])

    # Add any remaining elements from right
    result.extend(right[j:])

    return result
4
# Example usage
arr = [7, 2, 1, 6, 8, 5, 3, 4]

# Sort the array using merge sort
sorted_arr = merge_sort(arr)

# Print the sorted array
print("Sorted:", sorted_arr)

5
Complexity of Merge Sort
• To determine the time complexity, let’s break our merge sort into pieces
  and analyze the pieces

• Remember, merge sort consists of:
  – divide a list into two halves
  – sort the halves
  – recombine the sorted halves into a sorted whole

• Dividing the list in half and recombining the lists are pretty easy to analyze:
  – both have O(n) time complexity

• But what about sorting the halves?
6
Complexity of Merge Sort
• We can think of merge sort as occurring in levels
  – at the first level, we want to sort the whole list
  – at the second level, we want to sort the two half lists
  – at the third level, we want to sort the four quarter lists
  – ...

• We know there’s O(n) work at each level from dividing/recombining the lists

• But how many levels are there?
  – if we can figure this out, our time complexity is just O(n * num_levels)
7
Complexity of Merge Sort
• Because we divide the array in half each time, there are log(n) levels

  (figure: log(n) levels, with O(n) work at each level)

• So merge sort is an O(n log(n)) algorithm
  – this is a big improvement over the O(n²) sorting algorithms
8
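
As a quick cross-check (a sketch, not taken from the original slides), the same
bound follows from the standard merge sort recurrence, assuming the divide and
recombine work at each level costs about c·n:

  T(n) = 2·T(n/2) + c·n,   with T(1) = c
       = 2^k·T(n/2^k) + k·c·n         (unrolling k times)
       = n·T(1) + c·n·log2(n)         (at k = log2(n) the pieces have size 1)
       = O(n log n)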
Quick Sort

(figure: elements less than the pivot 28 go to its left, greater to its right;
 each side is then partitioned around its own pivot, 15 and 47)

1. Pick a “pivot”
2. Divide into less-than & greater-than pivot
3. Sort each side recursively
9
QuickSort is a sorting algorithm based on Divide and Conquer: it picks an
element as a pivot and partitions the given array around the picked pivot,
placing the pivot in its correct position in the sorted array.
It works on the principle of divide and conquer, breaking the problem down
into smaller sub-problems.
There are mainly three steps in the algorithm, plus a base case (a sketch of
all of them follows this list):

1. Choose a Pivot: Select an element from the array as the pivot. The
choice of pivot can vary (e.g., first element, last element, random
element, or median).

2. Partition the Array: Rearrange the array around the pivot. After
partitioning, all elements smaller than the pivot will be on its left, and all
elements greater than the pivot will be on its right. The pivot is then in its
correct position, and we obtain the index of the pivot.

3. Recursively Call: Recursively apply the same process to the two
partitioned sub-arrays (left and right of the pivot).

4. Base Case: The recursion stops when there is only one element (or none)
left in a sub-array.
10
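
A minimal in-place sketch of these steps (this exact code is not on the slides;
the names quick_sort_inplace and partition are chosen here, and the partition
uses the last element as the pivot, matching the worked example that follows):

def quick_sort_inplace(arr, low, high):
    # Base case: sub-arrays with 0 or 1 element are already sorted
    if low < high:
        # Steps 1 and 2: choose a pivot and partition around it
        p = partition(arr, low, high)
        # Step 3: recursively sort the two sides of the pivot
        quick_sort_inplace(arr, low, p - 1)
        quick_sort_inplace(arr, p + 1, high)


def partition(arr, low, high):
    # Last element is the pivot (Lomuto-style partition)
    pivot = arr[high]
    i = low - 1                          # boundary of the "smaller than pivot" region
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Place the pivot in its correct position and return its index
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1


# Example usage (same input as the worked example on the next slides)
data = [7, 2, 1, 6, 8, 5, 3, 4]
quick_sort_inplace(data, 0, len(data) - 1)
print(data)   # [1, 2, 3, 4, 5, 6, 7, 8]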
Choice of Pivot

There are many different choices for picking the pivot.

• Always pick the first (or last) element as the pivot. The implementation at
the end of these slides picks the last element as the pivot. The problem with
this approach is that it hits the worst case when the array is already sorted.

• Pick a random element as the pivot. This is a preferred approach because
there is no input pattern for which the worst case is guaranteed to happen
(see the sketch after this slide).

• Pick the median element as the pivot. This is an ideal approach in terms of
time complexity, since the median can be found in linear time and the
partition will then always divide the input array into two halves. But it is
slower on average, because median finding has high constant factors.
11
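
A minimal sketch of the random-pivot idea (not code from the slides; the name
partition_random is chosen here): swap a randomly chosen element to the end,
then partition exactly as with a last-element pivot.

import random

def partition_random(arr, low, high):
    # Pick a random index in [low, high] and move that element to the end,
    # so the usual last-element (Lomuto-style) partition can be reused
    r = random.randint(low, high)
    arr[r], arr[high] = arr[high], arr[r]

    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1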
Example: Sorting [7, 2, 1, 6, 8, 5, 3, 4]

We'll use the last element as the pivot in this example.

Step 1: Initial array: [7, 2, 1, 6, 8, 5, 3, 4]

Pivot = 4 (last element)

Step 2: Partition around the pivot

Rearrange the array so that:
  elements < 4 come before the pivot, and elements > 4 come after it.

Process: start from the left and compare each element to the pivot.
If it is smaller than the pivot, swap it into the left side.

After partition: [2, 1, 3, 4, 8, 5, 7, 6]
Pivot 4 is now at index 3
Sub-arrays to sort:
  Left: [2, 1, 3]    Right: [8, 5, 7, 6]
12
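
(Not stated on the slide, but this is exactly what a last-element, Lomuto-style
partition such as the earlier sketch produces: partitioning [7, 2, 1, 6, 8, 5, 3, 4]
around 4 gives [2, 1, 3, 4, 8, 5, 7, 6], with the partition returning pivot index 3.)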
Step 3: Recursively sort the left side [2, 1, 3]

Pivot = 3. Partition around 3 → [2, 1, 3] stays [2, 1, 3]
(2 and 1 are both less than 3, so they are already on the correct side)

Now recursively sort [2, 1]. Pivot = 1 → [2, 1] becomes [1, 2]

Left side is now sorted: [1, 2, 3]

Step 4: Recursively sort the right side [8, 5, 7, 6]

Pivot = 6. Partition around 6 → [8, 5, 7, 6] becomes [5, 6, 7, 8]

Now recursively sort [5] and [7, 8] → both are already sorted

Right side is now sorted: [5, 6, 7, 8]

Final sorted array: combine all parts: [1, 2, 3] + [4] + [5, 6, 7, 8]

Result: [1, 2, 3, 4, 5, 6, 7, 8]
13
The steps of QuickSort

S = {81, 43, 31, 57, 13, 75, 92, 0, 65, 26}           select pivot value (65)

partition S around the pivot:
S1 = {0, 31, 43, 13, 26, 57}      65      S2 = {75, 81, 92}

QuickSort(S1) and QuickSort(S2):
S1 = 0 13 26 31 43 57             65      S2 = 75 81 92

S = 0 13 26 31 43 57 65 75 81 92          Presto! S is sorted
[Weiss]

14
Quick sort vs. Merge sort
Quick sort:
– pick a pivot value from the array
– partition the list around the pivot value
– sort the left half
– sort the right half

Merge sort:
– divide a list into two identically sized halves
– sort the left half
– sort the right half
– recombine the sorted halves into a sorted whole

15
QuickSort Example

Pivot = 5 (the first element); i scans from the left, j scans from the right.

5 1 3 9 7 0 4 2 6 8      i and j start at the two ends
5 1 3 9 7 0 4 2 6 8      i stops at 9 (first element larger than the pivot)
5 1 3 9 7 0 4 2 6 8      j stops at 2 (first element smaller than the pivot)
5 1 3 2 7 0 4 9 6 8      swap 9 and 2

• Move i to the right until it reaches an element larger than the pivot.
• Move j to the left until it reaches an element smaller than the pivot.
• Swap them.
16
QuickSort Example

5 1 3 2 7 0 4 9 6 8      continue: i stops at 7, j stops at 4
5 1 3 2 4 0 7 9 6 8      swap 7 and 4
5 1 3 2 4 0 7 9 6 8      i stops at 7 again, j stops at 0
5 1 3 2 4 0 7 9 6 8      i and j have crossed (j is now left of i), so stop scanning
0 1 3 2 4 5 7 9 6 8      swap the pivot 5 with the element at j

S1 < pivot        pivot        S2 > pivot
17
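
A sketch of this two-pointer partition (this exact code is not on the slides;
the name partition_two_pointer is chosen here, the first element is assumed to
be the pivot, and the handling of elements equal to the pivot is one possible choice):

def partition_two_pointer(arr, low, high):
    # First element is the pivot
    pivot = arr[low]
    i, j = low + 1, high
    while True:
        # Move i right until it finds an element larger than the pivot
        while i <= j and arr[i] <= pivot:
            i += 1
        # Move j left until it finds an element smaller than the pivot
        while i <= j and arr[j] >= pivot:
            j -= 1
        if i > j:                        # pointers have crossed: partitioning is done
            break
        arr[i], arr[j] = arr[j], arr[i]  # swap the out-of-place pair

    # Put the pivot between the two regions and return its final index
    arr[low], arr[j] = arr[j], arr[low]
    return j


# Reproduces the trace above
data = [5, 1, 3, 9, 7, 0, 4, 2, 6, 8]
p = partition_two_pointer(data, 0, len(data) - 1)
print(data, p)   # [0, 1, 3, 2, 4, 5, 7, 9, 6, 8] 5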
Complexity Classes

Complexity Class   Name               Example
O(1)               constant time      popping a value off a stack
O(log n)           logarithmic time   binary search on an array
O(n)               linear time        scanning all elements of an array
O(n log n)         log-linear time    binary search on a linked list and good sorting algorithms
O(n²)              quadratic time     poor sorting algorithms (like inserting n items into SortedIntList)
O(n³)              cubic time         (example in lecture 11)
O(2ⁿ)              exponential time   really hard problems; these grow so fast that they're impractical
18
def quick_sort(arr):
    # Base case: if the array has 0 or 1 element, it's already sorted
    if len(arr) <= 1:
        return arr

    # Choose the last element as the pivot
    pivot = arr[-1]

    # All elements less than pivot go to the left
    left = [x for x in arr[:-1] if x < pivot]

    # All elements greater than or equal to pivot go to the right
    right = [x for x in arr[:-1] if x >= pivot]

    # Recursively sort the left and right subarrays, then combine with pivot
    return quick_sort(left) + [pivot] + quick_sort(right)


# Example usage
arr = [7, 2, 1, 6, 8, 5, 3, 4]

# Sort the array using quick sort
sorted_arr = quick_sort(arr)

# Print the sorted array
print("Sorted:", sorted_arr)
19
