Algo - Mod3 and Mod4 - Extended - Evon

The document discusses insertion sort, an algorithm that iterates through an unsorted array and inserts each element into its sorted position. It provides pseudocode for the insertion sort algorithm and walks through an example of sorting the array [30, 10, 40, 20]. The algorithm iterates from i=2 to the end of the array, sets the current element as the key, and shifts larger elements to the right until it finds the correct insertion point for the key.

Algorithms

Theory (1901715)
Evon Abu-Taieh, Associate Professor
[email protected]
Issam AlHadid, Assistant Professor
[email protected]

(Module 3)
Meeting

• Sunday 4-7
• Semester: Spring 2017/2018
• Ref.
• Cormen, Leiserson, Rivest, Stein, "Introduction to Algorithms", PHI.
• A. V. Aho, J. E. Hopcroft, J. D. Ullman, "The Design and Analysis of Computer Algorithms", Addison-Wesley Publishing.
Tentative Grading Scale
0-34: F   35-49: D-   50-54: D   55-59: D+   60-64: C-   65-69: C
70-74: C+   75-79: B-   80-84: B   85-89: B+   90-94: A-   95-100: A
Grades

• Mid 30% 25/3/2018


• Homework
• Presentation
• Quizzes
• Paper ?
• Final 40% 12/5/2018

Sorting
SMTW
11:00-12:30
In this file

1. Insertion sort (cards)
2. Merge sort
3. Quick sort (pivot)
4. Bubble sort (two x two)
5. Heap Sort (one rule)
6. Count Sort (money)
7. Radix sort (tree)
8. Spaghetti sort
9. Shell Sort
Introduction

• There are more than 43 sorting algorithms.
• For help, see:
  http://www.tomgsmith.com/quicksort/content/illustration/
Families of sort algorithms

• Exchange sorts: Bubble sort, Cocktail shaker sort, Odd–even sort, Comb sort, Gnome sort, Quicksort, Slowsort, Stooge sort, Bogosort
• Selection sorts: Selection sort, Heapsort, Smoothsort, Cartesian tree sort, Tournament sort, Cycle sort, Weak heapsort
• Insertion sorts: Insertion sort, Shellsort, Splaysort, Tree sort, Library sort, Patience sorting
• Merge sorts: Merge sort, Cascade merge sort, Oscillating merge sort, Polyphase merge sort
• Distribution sorts: American flag sort, Bead sort, Bucket sort, Burstsort, Counting sort, Pigeonhole sort, Proxmap sort, Radix sort, Flashsort
• Concurrent sorts: Bitonic sorter, Batcher odd–even mergesort, Pairwise sorting network
• Hybrid sorts: Block merge sort, Timsort, Introsort, Spreadsort
• Other: Topological sorting, Pancake sorting, Spaghetti sort
General Method

• In the sequential method, instructions are executed sequentially (in series). The order of the instructions matters, since there may be dependencies between instructions.
1. Insertion Sort
Another Example:
• 12, 11, 13, 5, 6
• Let us loop from i = 1 (second element of the array) to i = 4 (last index, for an input array of size 5)
• i = 1. Since 11 is smaller than 12, move 12 and insert 11 before 12
11, 12, 13, 5, 6
• i = 2. 13 will remain at its position as all elements in A[0..i-1] are smaller
than 13
11, 12, 13, 5, 6
• i = 3. 5 will move to the beginning and all other elements from 11 to 13
will move one position ahead of their current position.
5, 11, 12, 13, 6
• i = 4. 6 will move to position after 5, and elements from 11 to 13 will move
one position ahead of their current position.
5, 6, 11, 12, 13
Insertion Sort algorithm
InsertionSort(A, n) {
for i = 2 to n {
key = A[i]
j = i - 1;
while (j > 0) and (A[j] > key) {
A[j+1] = A[j]
j = j - 1
}
A[j+1] = key
}
}
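The 1-indexed pseudocode above translates directly to a 0-indexed language; as a minimal runnable sketch in Python:

```python
def insertion_sort(a):
    """Sort list a in place, mirroring the pseudocode (0-indexed)."""
    for i in range(1, len(a)):          # pseudocode: for i = 2 to n
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:    # pseudocode: while (j > 0) and (A[j] > key)
            a[j + 1] = a[j]             # shift the larger element one slot right
            j -= 1
        a[j + 1] = key                  # drop key into its sorted position
    return a

print(insertion_sort([30, 10, 40, 20]))  # [10, 20, 30, 40]
```

Running it on the slide's array [30, 10, 40, 20] reproduces the trace that follows.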
An Example: Insertion Sort

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1;
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}

Trace on A = [30, 10, 40, 20] (indices 1 to 4):

• i = 2, key = 10: j = 1, A[1] = 30 > 10, shift → [30, 30, 40, 20];
  j = 0 ends the while loop; insert key → [10, 30, 40, 20]
• i = 3, key = 40: j = 2, A[2] = 30 ≤ 40, no shifts → [10, 30, 40, 20]
• i = 4, key = 20: j = 3, A[3] = 40 > 20, shift → [10, 30, 40, 40];
  j = 2, A[2] = 30 > 20, shift → [10, 30, 30, 40];
  j = 1, A[1] = 10 ≤ 20, insert key → [10, 20, 30, 40]   Done!
Animating Insertion Sort

• Check out the Animator, a Java applet at:
  http://www.cs.hope.edu/~alganim/animator/Animator.html
• Try it out with random, ascending, and descending inputs.
Insertion Sort

What is the precondition for the outer for loop? (At the start of each iteration, A[1..i-1] consists of the original first i-1 elements, in sorted order.)

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1;
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
Insertion Sort

How many times will the inner while loop execute?

InsertionSort(A, n) {
    for i = 2 to n {
        key = A[i]
        j = i - 1;
        while (j > 0) and (A[j] > key) {
            A[j+1] = A[j]
            j = j - 1
        }
        A[j+1] = key
    }
}
Insertion Sort

Statement                                   Effort
InsertionSort(A, n) {
    for i = 2 to n {                        c1·n
        key = A[i]                          c2·(n-1)
        j = i - 1;                          c3·(n-1)
        while (j > 0) and (A[j] > key) {    c4·T
            A[j+1] = A[j]                   c5·(T-(n-1))
            j = j - 1                       c6·(T-(n-1))
        }                                   0
        A[j+1] = key                        c7·(n-1)
    }                                       0
}

T = t2 + t3 + … + tn, where ti is the number of while-condition evaluations in the
ith iteration of the for loop.
Analyzing Insertion Sort

• T(n) = c1·n + c2·(n-1) + c3·(n-1) + c4·T + c5·(T-(n-1)) + c6·(T-(n-1)) + c7·(n-1)
       = c8·T + c9·n + c10
• What can T be?
 Best case: the inner loop body is never executed
  o ti = 1, so T(n) is a linear function
  o If A is already sorted: O(n) comparisons
 Worst case: the inner loop body is executed for all previous elements
  o ti = i, so T(n) is a quadratic function
  o If A is reverse sorted: O(n²) comparisons
 Average case: if A is randomly ordered: O(n²) comparisons
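The best-case and worst-case claims can be checked empirically. This sketch (the counter helper is not part of the slides) counts while-condition evaluations for sorted and reverse-sorted input:

```python
def insertion_sort_comparisons(a):
    """Return the number of while-condition tests made while sorting a copy of a."""
    a = list(a)
    tests = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while True:
            tests += 1                       # one evaluation of the loop condition
            if j >= 0 and a[j] > key:
                a[j + 1] = a[j]              # shift and keep scanning left
                j -= 1
            else:
                break
        a[j + 1] = key
    return tests

n = 100
print(insertion_sort_comparisons(range(n)))         # sorted input: n - 1 tests (linear)
print(insertion_sort_comparisons(range(n, 0, -1)))  # reverse sorted: ~n²/2 tests (quadratic)
```

For n = 100 the sorted input needs 99 tests while the reverse-sorted input needs 5049, matching the linear vs. quadratic analysis above.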
Insertion Sort Space

• The insertion sort algorithm sorts the elements within the input array itself, so it sorts in-place: it needs no additional memory beyond the input array (only O(1) extra space for key, i, and j).
• Therefore it uses space for just the n input elements, which makes it efficient in terms of memory.
Advantages & Disadvantages

• Advantages:
 Efficient in terms of time if the input array is already sorted (best-case scenario).
 Efficient in terms of memory space (in-place).
 Easy to implement: only two nested loops.
• Disadvantages:
 Not efficient in terms of running time in the worst-case and average-case scenarios.
Divide-and-Conquer Method

• General Method
• Merge Sort Algorithm
• Quicksort Algorithm
General Method

• Recursive in structure:
• Divide the problem into several smaller sub-
problems that are similar to the original but
smaller in size.
• Conquer the sub-problems by solving them
recursively. If they are small enough, just solve
them in a straightforward manner.
• Combine the solutions to create a solution to
the original problem.
2. Merge Sort
Merge Sort Algorithm

• Divide the n-element sequence to be sorted


into two subsequences of n / 2 elements each.
• Conquer (sort) the two subsequences
recursively using merge sort.
• Combine (merge) the two sorted subsequences
to produce the sorted answer.
Merge Sort Animation

Example 2 (animation figures omitted; the list is repeatedly halved, each half sorted, and the halves merged).
Merge Sort algorithm

MergeSort(A, left, right) {
    if (left < right) {
        mid = floor((left + right) / 2);
        MergeSort(A, left, mid);
        MergeSort(A, mid+1, right);
        Merge(A, left, mid, right);
    }
}

// Merge() takes two sorted subarrays of A and
// merges them into a single sorted subarray of A.
// How long should this take? It requires O(n) time.
Merge Sort algorithm

• To sort n numbers:
 if n = 1: done!
 recursively sort 2 lists of ⎣n/2⎦ and ⎡n/2⎤ elements
 merge the 2 sorted lists in O(n) time
• Strategy:
 break the problem into similar (smaller) sub-problems
 recursively solve the sub-problems
 combine the solutions to answer the original problem
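The strategy above can be sketched as runnable code; here is a minimal version of MergeSort and its Merge helper (the copy-back list makes the extra space requirement visible):

```python
def merge(a, left, mid, right):
    """Merge the sorted runs a[left..mid] and a[mid+1..right] in O(n) time."""
    merged, i, j = [], left, mid + 1
    while i <= mid and j <= right:
        if a[i] <= a[j]:
            merged.append(a[i]); i += 1
        else:
            merged.append(a[j]); j += 1
    merged.extend(a[i:mid + 1])       # leftover from the left run (if any)
    merged.extend(a[j:right + 1])     # leftover from the right run (if any)
    a[left:right + 1] = merged        # copy back: this temporary list is why
                                      # merge sort is not an in-place algorithm

def merge_sort(a, left=0, right=None):
    """Recursively sort a[left..right], as in the pseudocode."""
    if right is None:
        right = len(a) - 1
    if left < right:
        mid = (left + right) // 2
        merge_sort(a, left, mid)
        merge_sort(a, mid + 1, right)
        merge(a, left, mid, right)
    return a

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```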
Analysis of Merge Sort

• Instead of the exact running time, we state Θ bounds; Θ-notation is called an asymptotically tight bound.
Review: Analysis of Merge Sort

• Divide: computing the middle takes O(1)
• Conquer: solving the 2 sub-problems takes 2T(n/2)
• Combine: merging n elements takes O(n)
• Total:
 T(n) = O(1)                    if n = 1
 T(n) = 2T(n/2) + O(n) + O(1)   if n > 1
• This expression is a recurrence equation; T(n) denotes the number of operations the algorithm requires on inputs of size n.
• Solving this recurrence (how?) gives T(n) = O(n lg n).
Merge Sort – Space Requirement

• Merge sort is an out-of-place algorithm: it needs additional memory space in order to sort n elements.
• Exercise: How much space does the merge sort algorithm need in order to sort an input array of size n?
Advantages & Disadvantages

• Advantages:
 Time is O(n log n).
 Easy to code, since it uses divide-and-conquer
method.
• Disadvantages:
 Needs more memory space.
3. Quick sort

Quick sort

Another divide-and-conquer algorithm:
• Divide: A[p…r] is partitioned (rearranged) into two nonempty subarrays A[p…q-1] and A[q+1…r] such that each element of A[p…q-1] is less than or equal to each element of A[q+1…r]. The index q is computed here; A[q] is called the pivot.
• Conquer: the two subarrays are sorted by recursive calls to quicksort.
• Combine: unlike merge sort, no work is needed, since the subarrays are already sorted in place.
Quick sort

• The basic algorithm to sort an array A consists of the following easy steps:
 A small instance has n ≤ 1; every small instance is already a sorted instance.
 To sort a large instance, select a pivot element from among the n elements.
 Partition the n elements into 3 groups: left, middle and right.
  o The middle group contains only the pivot element.
  o All elements in the left group are ≤ pivot.
  o All elements in the right group are > pivot.
 Sort the left and right groups recursively.
Quick sort
The steps are:
• Pick an element, called a pivot (leftmost), from the array.
• Partitioning: reorder the array so that all elements with values
less than the pivot come before the pivot, while all elements
with values greater than the pivot come after it (equal values
can go either way). After this partitioning, the pivot is in its
final position. This is called the partition operation.
• Recursively apply the above steps to the sub-array of elements
with smaller values and separately to the sub-array of elements
with greater values.
Example 1 (figures omitted; pivot = leftmost element)

Example 2 (figures omitted)

Quick sort Algorithm
HW: write the code
Partition Function

• Clearly, all the action takes place in the partition() function:
 It rearranges the subarray in place.
 End result: two subarrays, with all values in the first subarray ≤ all values in the second.
 It returns the index of the "pivot" element separating the two subarrays.
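The slides leave the full code as homework; as one possible sketch (not necessarily the textbook's exact version), here is quicksort with the common Lomuto partition scheme, using the rightmost element as pivot:

```python
def partition(a, p, r):
    """Lomuto partition: use a[r] as pivot and return its final index q.
    Afterwards every element of a[p..q-1] is <= a[q] <= every element of a[q+1..r]."""
    pivot = a[r]
    i = p - 1                           # boundary of the "<= pivot" region
    for j in range(p, r):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]     # grow the small-element region
    a[i + 1], a[r] = a[r], a[i + 1]     # place the pivot between the regions
    return i + 1

def quicksort(a, p=0, r=None):
    if r is None:
        r = len(a) - 1
    if p < r:
        q = partition(a, p, r)          # pivot lands in its final position
        quicksort(a, p, q - 1)
        quicksort(a, q + 1, r)
    return a

print(quicksort([7, 2, 9, 4, 1, 8]))  # [1, 2, 4, 7, 8, 9]
```

Note how no combine step is needed: once both recursive calls return, the whole subarray is sorted in place.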
Choice Of Pivot

• Pivot is the rightmost or leftmost element of the list to be sorted.
 Rightmost example: when sorting A[6:20], use A[20] as the pivot.
 The textbook implementation does this.
• Or randomly select one of the elements to be sorted as the pivot.
 Example: when sorting A[6:20], generate a random number r in the range [6, 20] and use A[r] as the pivot.
Choice Of Pivot

• Median-of-three rule: from the leftmost, middle, and rightmost elements of the list to be sorted, select the one with the median key as the pivot.
• When the pivot is picked at random or by the median-of-three rule, we can still use the textbook's quick sort algorithm, provided we first swap the rightmost element and the chosen pivot.
Partitioning Using Additional Array
(figures omitted)
Runtime of Quicksort

• Worst-case scenario.
• Best-case scenario.
• Average-case scenario.

Worst-Case Scenario

• Every partition moves nothing: the pivot is the left (or right) end of the subarray.
• Θ(n²).
• H.W.: What is the difference between O(n²) and Θ(n²)?
Worst-Case Partitioning

• The running time of quick sort depends on whether the partitioning is balanced or not.
• The most unbalanced partition occurs when one of the sublists returned by the partitioning routine has size n − 1.
• This may occur if the pivot happens to be the smallest or largest element in the list, or (in some implementations) when all the elements are equal.
• Each recursive call then processes a list one element shorter than the previous one, so we make n − 1 nested calls before reaching a list of size 1.
Worst-Case Partitioning

• Partitioning an array of n elements takes Θ(n) time.
• Let T(n) be the time needed to sort n elements; T(0) = T(1) = c, where c is a constant.
• When n > 1: T(n) = T(|left|) + T(|right|) + Θ(n).
• T(n) is maximum (worst case) when either |left| = 0 or |right| = 0 after each partitioning, giving T(n) = T(1) + T(n-1) + Θ(n).
• Summing the partitioning costs over all calls:
  Σ(k = 2 to n) Θ(k) = Θ(Σ(k = 2 to n) k) = Θ(n²)
Worst-Case Partitioning

• This means that the call tree is a linear chain of n − 1 nested partition calls.
• The ith call does O(n − i) work to do the partition, so the total is
  O(n + (n-1) + (n-2) + … + 3 + 2 + 1) = O(n²/2 + n/2) = O(n²)
• Note: for n = 5, 5+4+3+2+1 = 15; also (5² + 5)/2 = 15.
Worst-Case Partitioning

• This occurs when:
 The input is already completely sorted, or
 The pivot is always the smallest (or largest) element.
Best-Case Scenario

• When the partitioning procedure produces two regions of size n/2, we get a balanced partition, with best case
  T(n) = 2T(n/2) + Θ(n) = Θ(n lg n)
• Hint: compare with merge sort's time complexity.
• Average complexity is also Θ(n lg n).

Best-Case Partitioning
Average-Case Scenario

• Assuming random input, the average-case running time is much closer to Θ(n lg n) than to Θ(n²).
• First, a more intuitive explanation/example:
 Suppose that partition() always produces a 1-to-9 proportional split. This looks quite unbalanced!
 The recurrence is thus:
  o T(n) = T(n/10) + T(9n/10) + Θ(n) = Θ(n lg n)
 How deep will the recursion go?
Average-Case Example (PAPER)
Average-Case Analysis

• Intuitively, a real-life run of quick sort produces a mix of "bad" and "good" splits:
 They are randomly distributed over the recursion tree.
 For intuition, pretend they alternate between best case (n/2 : n/2) and worst case (n-1 : 1).

Average-Case Analysis

• What happens if we bad-split the root node, then good-split the resulting (n-1)-size node?
 We end up with three subarrays of sizes 1, (n-1)/2, and (n-1)/2.
 Combined cost of the two splits = n + (n-1) = 2n - 1 = Θ(n).
 No worse than if we had good-split the root node!
Intuition for the Average-Case

• Suppose we alternate lucky and unlucky cases to get an average behavior: the combination of good and bad splits still gives T(n) = Θ(n lg n), just with a slightly larger constant hidden by the Θ-notation.
Quick sort: Advantages & Disadvantages

• Quick sort advantages:
 Sorts in place.
 O(n lg n) in the average case.
 Very efficient in practice (it is quick).
• Quick sort disadvantages:
 O(n²) in the worst case.
 Not stable: it does not preserve the relative order of elements with equal values. (A sorting algorithm is stable if two records with the same key stay in their original order.)
 But the worst case doesn't happen often: typically only on already-sorted input.
Review: Analyzing Quicksort

• What is the worst case for the algorithm?
 The partition is always unbalanced.
• What is the best case for the algorithm?
 The partition is balanced.
• Which is more likely?
 The latter, by far, except...
• Will any particular input elicit the worst case?
 Yes: already-sorted input.
Analyzing Quick sort Space

• Quick sort is considered an in-place algorithm: it does not need additional memory space beyond the input array (apart from the recursion stack).
Asymptotic notation

(a) Θ-notation bounds a function to within constant factors.
(b) O-notation gives an upper bound for a function to within a constant factor.
(c) Ω-notation gives a lower bound for a function to within a constant factor.
Θ Notation

• 1) Θ Notation: the theta notation bounds a function from above and below, so it defines exact asymptotic behavior.
A simple way to get the Θ-notation of an expression is to drop low-order terms and ignore leading constants. For example, consider the following expression:
3n³ + 6n² + 6000 = Θ(n³)
Dropping lower-order terms is always fine, because there will always be an n0 after which n³ has higher values than n², irrespective of the constants involved.
For a given function g(n), Θ(g(n)) denotes a set of functions.
Big O Notation

• 2) Big O Notation: the Big O notation defines an upper bound of an algorithm; it bounds a function only from above. For example, consider insertion sort: it takes linear time in the best case and quadratic time in the worst case. We can safely say that the time complexity of insertion sort is O(n²); note that O(n²) also covers linear time.
If we used Θ-notation for the time complexity of insertion sort, we would need two statements, for the best and worst cases:
1. The worst-case time complexity of insertion sort is Θ(n²).
2. The best-case time complexity of insertion sort is Θ(n).
• Big O notation is useful when we only have an upper bound on the time complexity of an algorithm. Often we can easily find an upper bound by simply looking at the algorithm.
Ω Notation

• 3) Ω Notation: just as Big O notation provides an asymptotic upper bound on a function, Ω notation provides an asymptotic lower bound.
• Ω notation can be useful when we have a lower bound on the time complexity of an algorithm. Since the best-case performance of an algorithm is generally not very useful, Ω is the least used of the three notations.
Big-O Complexity chart
http://bigocheatsheet.com/

Common Data Structure operations
http://bigocheatsheet.com/

Array Sorting Algorithms
http://bigocheatsheet.com/
https://he-s3.s3.amazonaws.com/media/uploads/c950295.png
4. Bubble sort
Bubble sort animation
Example 2 – First Pass

( 5 1 4 2 8 ) → ( 1 5 4 2 8 )   The algorithm compares the first two elements and swaps, since 5 > 1.
( 1 5 4 2 8 ) → ( 1 4 5 2 8 )   Swap, since 5 > 4.
( 1 4 5 2 8 ) → ( 1 4 2 5 8 )   Swap, since 5 > 2.
( 1 4 2 5 8 ) → ( 1 4 2 5 8 )   These elements are already in order (8 > 5), so no swap.

Second Pass

( 1 4 2 5 8 ) → ( 1 4 2 5 8 )
( 1 4 2 5 8 ) → ( 1 2 4 5 8 )   Swap, since 4 > 2.
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )

Now the array is already sorted, but the algorithm does not know it is completed. The algorithm needs one whole pass without any swap to know it is sorted.

Third Pass

( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
( 1 2 4 5 8 ) → ( 1 2 4 5 8 )
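The passes above, including the final swap-free confirmation pass, can be sketched as:

```python
def bubble_sort(a):
    """Repeatedly swap adjacent out-of-order pairs; stop only after a
    whole pass makes no swaps (the confirmation pass the slides mention)."""
    n = len(a)
    while True:
        swapped = False
        for i in range(n - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]   # bubble the larger value right
                swapped = True
        if not swapped:                            # pass with no swap => sorted
            return a

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```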
5. Heap Sort

Heap Sort

• Keep larger numbers above smaller numbers (the max-heap property).
• You cannot remove the root directly; you replace it with a leaf and restore the heap.

Example of heapify

Input data: 4, 10, 3, 5, 1
(Tree diagrams omitted.) Applying the heapify procedure at index 0 moves the largest key to the root. The heapify procedure calls itself recursively to build the heap in a top-down manner.
Heapify and sort
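Assuming a 0-indexed array representation of the heap (children of i at 2i+1 and 2i+2), a minimal heapify-and-sort sketch looks like:

```python
def heapify(a, n, i):
    """Sift a[i] down within the max-heap a[0..n-1]: larger keys stay above."""
    largest, left, right = i, 2 * i + 1, 2 * i + 2
    if left < n and a[left] > a[largest]:
        largest = left
    if right < n and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        heapify(a, n, largest)            # recurse top-down, as on the slide

def heap_sort(a):
    n = len(a)
    for i in range(n // 2 - 1, -1, -1):   # build the max-heap bottom-up
        heapify(a, n, i)
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]       # swap the root (max) with a leaf,
        heapify(a, end, 0)                # shrink the heap, restore the property
    return a

print(heap_sort([4, 10, 3, 5, 1]))  # [1, 3, 4, 5, 10]
```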
6. Count Sort

Steps

• Initialize the counting array to all zeros.
• Count the number of times each value occurs in the input.
• Modify the counting array so that each entry gives the number of input values smaller than its index.
• Transfer numbers from the input to the output array at the locations given by the counting array.
Example

(Figure omitted: the slide shows an input array, its count array, the accumulative count array giving the number of values smaller than each index, and the resulting sorted output array.)
Count Sort algorithm

# count array (first pass):
for x = 1 to n:
    count[A(x)] = count[A(x)] + 1
next x

# accumulative array:
total = 0
for i = beginning of array C to end of array C:
    oldCount = count[i]
    count[i] = total
    total = oldCount + total
next i

# copy to output array, preserving order of inputs with equal keys:
for x in input:
    output[count[key(x)]] = x
    count[key(x)] = count[key(x)] + 1
next x
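The three passes of the pseudocode above translate to the following sketch, assuming integer keys in the range 0..k:

```python
def counting_sort(a, k):
    """Stable counting sort of a, whose elements are integers in 0..k."""
    count = [0] * (k + 1)
    for x in a:                       # pass 1: count occurrences of each value
        count[x] += 1
    total = 0
    for i in range(k + 1):            # pass 2: prefix sums; count[i] becomes
        count[i], total = total, total + count[i]   # the number of values < i
    output = [0] * len(a)
    for x in a:                       # pass 3: place each input at its slot,
        output[count[x]] = x          # preserving the order of equal keys
        count[x] += 1
    return output

print(counting_sort([2, 5, 3, 0, 2, 3, 0, 3], 5))  # [0, 0, 2, 2, 3, 3, 3, 5]
```

Because equal keys are placed in the order they appear in the input, the sort is stable, which is what radix sort relies on in the next section.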
End of lecture 1
4/2/2018
Radix Sort
(Radix = root, origin)

Radix sort steps:
• Take the least significant digit (or group of bits, both being examples of radices) of each key.
• Group the keys based on that digit, but otherwise keep the original order of keys. (This is what makes LSD radix sort a stable sort.)
• Repeat the grouping process with each more significant digit.
Same example
Original, unsorted list:
170, 45, 75, 90, 802, 2, 24, 66
Sorting by least significant digit (1s place) gives:
170, 90, 802, 2, 24, 45, 75, 66
Notice that we keep 802 before 2, because 802 occurred before 2 in the
original list, and similarly for pairs 170 & 90 and 45 & 75.
Sorting by next digit (10s place) gives:
802, 2, 24, 45, 66, 170, 75, 90
Notice that 802 again comes before 2 as 802 comes before 2 in the previous
list.
Sorting by most significant digit (100s place) gives:
2, 24, 45, 66, 75, 90, 170, 802
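The digit-by-digit passes above can be sketched with one stable bucket pass per decimal digit:

```python
def radix_sort(a):
    """LSD radix sort of non-negative integers: one stable pass per digit."""
    if not a:
        return a
    exp = 1                                   # 1s, then 10s, then 100s, ...
    while max(a) // exp > 0:
        buckets = [[] for _ in range(10)]     # one queue per digit 0..9
        for x in a:                           # stable: original order kept
            buckets[(x // exp) % 10].append(x)
        a = [x for b in buckets for x in b]   # dequeue back, digit order
        exp *= 10
    return a

print(radix_sort([170, 45, 75, 90, 802, 2, 24, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```

After the first pass this produces exactly the slide's intermediate list 170, 90, 802, 2, 24, 45, 75, 66, since each pass preserves the order of keys with equal digits.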
Another view

1. The integers are enqueued into an array of ten separate queues based on their digits from right to left. Computers often represent integers internally as fixed-length binary digits; here we do something analogous with fixed-length decimal digits. Using the numbers from the previous example, the queues for the 1st pass would be:
0: 170, 090
1: none
2: 802, 002
3: none
4: 024
5: 045, 075
6: 066
7–9: none
2. The queues are dequeued back into an array of integers, in increasing order. Using the same numbers, the array will look like this after the first pass:
170, 090, 802, 002, 024, 045, 075, 066
For the second pass
For the second pass:
Queues:
0: 802, 002
1: none
2: 024
3: none
4: 045
5: none
6: 066
7: 170, 075
8: none
9: 090
Array:
802, 002, 024, 045, 066, 170, 075, 090
(Note that at this point only 802 and 170 are out of order.)
Third Pass
For the third pass:
Queues:
0: 002, 024, 045, 066, 075, 090
1: 170
2–7: none
8: 802
9: none
Array:
002, 024, 045, 066, 075, 090, 170, 802 (sorted)
Spaghetti sorting
• For simplicity, assume we are sorting a list of natural numbers. The sorting method
is illustrated using uncooked rods of spaghetti:
• For each number x in the list, obtain a rod of length x. (One practical way of
choosing the unit is to let the largest number m in the list correspond to one full rod
of spaghetti. In this case, the full rod equals m spaghetti units. To get a rod of
length x, break a rod in two so that one piece is of length x units; discard the other
piece.)
• Once you have all your spaghetti rods, take them loosely in your fist and lower
them to the table, so that they all stand upright, resting on the table surface. Now,
for each rod, lower your other hand from above until it meets with a rod—this one
is clearly the longest. Remove this rod and insert it into the front of the (initially
empty) output list (or equivalently, place it in the last unused slot of the output
array). Repeat until all rods have been removed.
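The physical procedure behaves like repeated maximum selection: the lowered hand always meets the tallest remaining rod first. A small simulation of that idea (a sketch, not an efficient algorithm):

```python
def spaghetti_sort(rods):
    """Simulate spaghetti sort: repeatedly remove the tallest remaining rod
    and insert it at the front of the output list."""
    rods = list(rods)
    output = []
    while rods:
        tallest = max(rods)          # the rod the lowered hand meets first
        rods.remove(tallest)
        output.insert(0, tallest)    # front of the (initially empty) output
    return output

print(spaghetti_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

In the analog version each "find the maximum" step takes constant time (the hand touches the tallest rod immediately), which is where the method's appeal lies; the software simulation above is just ordinary O(n²) selection.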
Shell Sort

Example: sort the following

• Break the array into 4 sub-groups and sort each (figure omitted).
• Then we take an interval of 2; this gap generates two sub-lists: {14, 27, 35, 42} and {19, 10, 33, 44}.

Final step is insertion sort

• Finally, we sort the rest of the array using an interval of 1. Shell sort uses insertion sort for this final pass.
• We see that it required only four swaps to sort the rest of the array.
Shell Sort – a different view example

Consider the dataset:

16 4 3 13 5 6 8 9 10 11 12 17 15 18 19 7 1 2 14 20

We can divide this into three smaller slices:

16  4  3 13  5  6  8
 9 10 11 12 17 15 18
19  7  1  2 14 20

• Once we have performed this slicing we can sort each column:
• First we sort the first column (16, 9, 19) to give 9, 16, 19.
• Next the second column (4, 10, 7) to give 4, 7, 10.
• Column three (3, 11, 1) to give 1, 3, 11.
• Column four (13, 12, 2) to give 2, 12, 13.
• Column five (5, 17, 14) to give 5, 14, 17.
• Column six (6, 15, 20) to give 6, 15, 20.
• And column seven (8, 18) to give 8, 18.

Logically reassembled, the dataset is now:

9 4 1 2 5 6 8 16 7 3 12 14 15 18 19 10 11 13 17 20
Shell Sort Example: Part 2

• We now slice the array into a different number of slices. For this example we will use five slices, but other values are possible.

9 4 1 2 5 6 8 16 7 3 12 14 15 18 19 10 11 13 17 20

• Slicing this into five we get:

 9  4  1  2
 5  6  8 16
 7  3 12 14
15 18 19 10
11 13 17 20

• Again, we sort each column to give:

 5  3  1  2
 7  4  8 10
 9  6 12 14
11 13 17 16
15 18 19 20

• Logically reassembling, we now have the dataset:

5 3 1 2 7 4 8 10 9 6 12 14 11 13 17 16 15 18 19 20
Shell Sort Example: Part 3

• We can now slice the array into 10:

 5  3
 1  2
 7  4
 8 10
 9  6
12 14
11 13
17 16
15 18
19 20

• Sorting the columns gives us:

 1  2
 5  3
 7  4
 8  6
 9 10
11 13
12 14
15 16
17 18
19 20

1 2 5 3 7 4 8 6 9 10 11 13 12 14 15 16 17 18 19 20
Shell Sort algorithm

find k0 so that 2^k0 - 1 < size
for (k = k0; k > 0; --k) {   // from larger increments to smaller
    inc = 2^k - 1
    for (i = 0; i < inc; ++i) {
        insertionSort( [ a[i], a[i+inc], a[i+2*inc], ... ] )
    }
}
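The pseudocode above, with its 2^k − 1 gap sequence, can be sketched as follows; the inner gapped insertion sort handles all interleaved slices a[i::gap] at once:

```python
def shell_sort(a):
    """Shell sort with gaps of the form 2**k - 1, as in the pseudocode:
    each pass insertion-sorts the elements that are `gap` positions apart."""
    n = len(a)
    k = 1
    while (1 << (k + 1)) - 1 < n:   # find the largest k with 2**k - 1 < n
        k += 1
    while k > 0:                    # from larger increments to smaller
        gap = (1 << k) - 1          # inc = 2**k - 1
        for i in range(gap, n):     # gapped insertion sort over all slices
            key, j = a[i], i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        k -= 1                      # the final pass (gap = 1) is plain
    return a                        # insertion sort, so the result is sorted

data = [16, 4, 3, 13, 5, 6, 8, 9, 10, 11, 12, 17, 15, 18, 19, 7, 1, 2, 14, 20]
print(shell_sort(data))
```

The earlier passes leave the array "almost sorted", so the final insertion-sort pass needs only a few swaps, which is the whole point of the gap sequence.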
Quick review

1. Insertion sort (cards)                  day 1
2. Merge sort                              day 1
3. Quick sort (pivot)                      day 2
4. Bubble sort (two x two)                 day 2
5. Heap Sort (one rule)                    day 2+3
6. Count Sort (money)                      day 3
7. Radix sort (macaroni/spaghetti: see 8)  day 3
8. Spaghetti sort (macaroni)               day 3
9. Shell Sort (divide, then the deck of cards)
http://sorting.at/
More Sorts

Timsort

• A hybrid stable sorting algorithm, derived from merge sort and insertion sort, building on Peter McIlroy's work on optimistic sorting.
• In 2015, Dutch and German researchers in the EU FP7 ENVISAGE project found a bug in the standard implementation of Timsort.
Cubesort

• A parallel sorting algorithm that builds a self-balancing multi-dimensional array from the keys to be sorted. As the axes are of similar length, the structure resembles a cube. After each key is inserted, the cube can be rapidly converted to an array.
• Ref: Cypher, Robert; Sanz, Jorge L.C. (1992). "Cubesort: A parallel algorithm for sorting N data items with S-sorters".
• Advantages:
 Low memory overhead.
 Iterating a search cube is faster than a tree, making it an ideal data structure for programming languages.
 O(1) insertion of sorted data.
 Innate support for O(√n) index searches.
 Balancing is fast and infrequent, at regular intervals.
• Disadvantages:
 High memory overhead for small data sets.
 Complex data arrangement to increase cache utilization.
• Ref: https://sites.google.com/site/binarysearchcube/
