Sorting Final

The document discusses several sorting algorithms including bubble sort, selection sort, insertion sort, merge sort, and quick sort. It provides an overview of each algorithm, describes their time and space complexities, and includes pseudocode examples. Merge sort is highlighted as an example of a divide and conquer algorithm that runs in O(n log n) time, improving upon the O(n^2) time of previous algorithms discussed like bubble, selection, and insertion sort.


Sorting Algorithms

GROUP 20:
Do An Nam – 1624731
Ly Minh Hoang –
Nguyen Cong Bao -
Table of contents

01 Introduction to Sorting Algorithms
02 Bubble Sort
03 Selection Sort
04 Insertion Sort
05 Merge Sort
06 Quick Sort
Introduction to Sorting
Sorting refers to arranging a set of data in some logical order.
For example, a telephone directory can be considered a list in which
each record has three fields – name, address, and phone number.
Being unique, the phone number can serve as a key to locate any record
in the list.

Sorting is among the most basic problems in algorithm design.
We are given a sequence of items, each associated with a key value,
and the problem is to rearrange the items so that they are in
increasing (or decreasing) order by key.
Stable and unstable sorting

Stable Sorting
If a sorting algorithm preserves the relative order of elements with
equal keys, it is called a stable sort.

Unstable Sorting
If a sorting algorithm may change the relative order of elements with
equal keys, it is called an unstable sort.
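The distinction is easy to see with a runnable illustration (the records and names below are made up for the example; Python's built-in `sorted` is documented to be stable):

```python
# Python's built-in sorted() is a stable sort (Timsort).
# Records with equal keys keep their original relative order.
records = [("Alice", 3), ("Bob", 1), ("Carol", 3), ("Dave", 1)]

by_score = sorted(records, key=lambda r: r[1])
print(by_score)
# Alice and Carol (both key 3) keep their original order,
# as do Bob and Dave (both key 1).
```

An unstable sort would be free to emit Carol before Alice, since their keys compare equal.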
Efficiency of Sorting Algorithms

01 The complexity of a sorting algorithm measures its running time as a
function of the number n of items to be sorted. The choice of sorting
method depends on efficiency considerations for the problem at hand.
The three most important of these considerations are:
- the length of time spent by the programmer in coding a particular
sorting program;
- the amount of machine time necessary for running the program;
- the amount of memory necessary for running the program.

02 Sorting methods are analyzed for the best, worst, and average cases.
Most of the sort methods we consider have time requirements that range
from O(n log n) to O(n^2). A sort should not be selected only because
its sorting time is O(n log n); the relation between the file size n and
the other factors affecting the actual sorting time must be considered.

03 One way to determine the time requirement of a sorting technique is
to actually run the program and measure its efficiency. Once a
particular sorting technique is selected, the goal is to make the
program as efficient as possible: any improvement in sorting time
significantly affects overall efficiency and saves a great deal of
computer time.

04 Space constraints are usually less important than time
considerations. One reason is that, for most sorting programs, the
amount of space needed is closer to O(n) than to O(n^2). A second reason
is that, if more space is required, it can almost always be found in
auxiliary storage.
Bubble Sort

Introduction
The bubble sort is an easy way to arrange data in ascending or
descending order.

Operating principle
 Compare two adjacent elements and swap them if necessary.
 Repeat that step to the end of the list.
 Go through the list again and again until a full pass makes no
swaps.
Bubble sort pseudo code

Do
    Set swap flag to false.
    For count set to each subscript in array from 0 through the
    next-to-last subscript
        If array[count] is greater than array[count + 1]
            Swap the contents of array[count] and array[count + 1].
            Set swap flag to true.
        End If.
    End For.
While any elements have been swapped
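The pseudo code above can be sketched as a runnable Python function (an illustration, not the slides' original code):

```python
def bubble_sort(arr):
    """Sort arr in place in ascending order using bubble sort."""
    n = len(arr)
    swapped = True
    while swapped:           # keep passing over the list until no swap occurs
        swapped = False
        for i in range(n - 1):
            # Compare adjacent elements and swap if they are out of order.
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                swapped = True
    return arr

print(bubble_sort([5, 1, 4, 2, 8]))  # [1, 2, 4, 5, 8]
```

The swap flag is what gives bubble sort its O(n) best case: on an already sorted array the first pass makes no swaps and the loop exits.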
Bubble sort efficiency

Time complexity:
- Best case: O(n)
- Worst case: O(n^2)
- Average case: O(n^2)
Space complexity: O(1)
Selection Sort

01 About the selection sort
For large arrays, selection sort can do better than bubble sort
because it moves each element directly into its final place, using
far fewer swaps.

02 Operating principle
Repeatedly find the minimum (or maximum) element in the unsorted part
and put it at the beginning.
Selection sort pseudo code
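The pseudocode on this slide did not survive extraction as text; as a sketch of the operating principle described above, selection sort can be written in Python as:

```python
def selection_sort(arr):
    """Sort arr in place by repeatedly selecting the minimum of the unsorted part."""
    n = len(arr)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted part arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # One swap per pass moves that element directly into its final place.
        if min_idx != i:
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([64, 25, 12, 22, 11]))  # [11, 12, 22, 25, 64]
```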
Selection sort efficiency

• Time complexity:
 The time complexity of selection sort is O(n^2) in both the best
and the worst case.
 Therefore, the average case is also O(n^2).
 The best case does O(1) swaps; the worst case does O(n) swaps.
• Space complexity: O(1)
Insertion sort

• Insertion sort is a sorting algorithm that places an unsorted
element at its suitable place in each iteration.
• Operating principle:
 Insertion sort works the way we sort cards in our hand in a card
game. We assume that the first card is already sorted, then select an
unsorted card. If the unsorted card is greater than the card in hand,
it is placed on the right; otherwise, to the left. In the same way,
the remaining unsorted cards are taken and put in their right place.
Insertion sort algorithm
Step 1 − If it is the first element, it is already sorted.
Step 2 − Pick the next element.
Step 3 − Compare it with the elements in the sorted sub-list.
Step 4 − Shift all elements in the sorted sub-list that are greater
than the value to be inserted.
Step 5 − Insert the value.
Step 6 − Repeat until the list is sorted.
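The six steps above can be sketched as a runnable Python function (an illustration of the algorithm, not the slides' original code):

```python
def insertion_sort(arr):
    """Sort arr in place; each element is inserted into the sorted prefix before it."""
    for i in range(1, len(arr)):
        key = arr[i]          # pick the next element (Step 2)
        j = i - 1
        # Shift elements of the sorted prefix that are greater than key (Steps 3-4).
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key      # insert the value at its correct position (Step 5)
    return arr

print(insertion_sort([12, 11, 13, 5, 6]))  # [5, 6, 11, 12, 13]
```

On an already sorted array the inner `while` loop never runs, which is where the O(n) best case comes from.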
Insertion sort efficiency

 Time complexity
• Best case: O(n)
• The array is already sorted, so the inner loop never executes and
each pass does only constant work.
• Worst case: O(n^2)
• The array elements are in reverse sorted order.

 Space complexity
• Sorting is done in place with only a constant number of extra
variables, so the space complexity is O(1).
Divide and conquer algorithms

• The sorting algorithms we have discussed so far have a worst-case
running time of Θ(n^2).
• As the array grows longer and contains more elements, the time
needed for sorting increases quickly.
• Two algorithms have better running time:
1. Merge sort.
2. Quick sort.
• Merge sort and quick sort share a similar idea, built from three
phases:
• Divide
• Conquer
• Combine
Merge sort

• Divide the array into two smaller arrays.
• The algorithm keeps dividing these sub-arrays in two until each
contains only a single element.
• It then works on these elements, merging them back together in the
right order.
Merge sort pseudo code

Main function
1. Declare the unsorted array.
2. Call the mergeSort function with the array, 0, and
size_of_array – 1.
3. Print the sorted array.
4. End program.

 MergeSort function receives arr[], l, r from the main function as
arguments
1. If l is not less than r, return (the sub-array has at most one
element and is already sorted).
2. Find the middle point to divide the array into two halves:
m = l + (r – l) / 2.
3. Call mergeSort(arr, l, m) for the first half.
4. Call mergeSort(arr, m + 1, r) for the second half.
5. Call merge(arr, l, m, r) to merge the two halves sorted in steps
3 and 4.
 Merge(arr[], l, m, r)
1. Set i and j to 0, and k to l.
2. Set n1 to (m – l + 1) and n2 to (r – m).
3. Create array L with n1 elements and array R with n2 elements.
4. Copy the first half, arr[l..m], into L.
5. Copy the second half, arr[m+1..r], into R.
6. While i is smaller than n1 and j is smaller than n2, do steps 7
through 13:
7.     If the element at position i of L is not greater than the
element at position j of R,
8.         copy it to position k of the input array
9.         and increase i by 1;
10.    else
11.        copy the element at position j of R to position k of the
input array
12.        and increase j by 1.
13.    Increase k by 1.
14. Copy any remaining elements of L, then any remaining elements of
R, into the input array.
15. Stop.
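The two functions above can be sketched as runnable Python (an illustration following the pseudo code, not the slides' original source):

```python
def merge_sort(arr, l=0, r=None):
    """Recursively sort arr[l..r] in place using merge sort."""
    if r is None:
        r = len(arr) - 1
    if l >= r:                   # at most one element: already sorted
        return arr
    m = l + (r - l) // 2
    merge_sort(arr, l, m)        # sort the first half
    merge_sort(arr, m + 1, r)    # sort the second half
    merge(arr, l, m, r)          # merge the two sorted halves
    return arr

def merge(arr, l, m, r):
    """Merge the sorted runs arr[l..m] and arr[m+1..r]."""
    L, R = arr[l:m + 1], arr[m + 1:r + 1]
    i = j = 0
    k = l
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:         # <= keeps the sort stable
            arr[k] = L[i]
            i += 1
        else:
            arr[k] = R[j]
            j += 1
        k += 1
    # Copy whatever is left over from either run.
    arr[k:r + 1] = L[i:] + R[j:]

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```

Using `<=` in the comparison means that on equal keys the left run wins, which is what makes merge sort stable.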
Merge sort efficiency

 Time efficiency
• The recursion has about log n + 1 levels.
• Each level does O(n) work merging.
 Total time complexity: O(n log n).

 The time complexity of merge sort is O(n log n) in all three cases
(worst, average, and best):
• Worst case time complexity [Big-O]: O(n log n)
• Best case time complexity [Big-Omega]: Ω(n log n)
• Average time complexity [Big-Theta]: Θ(n log n)

 Space complexity: O(n)


Quick sort

 Quick sort is among the fastest known sorting algorithms in
practice, with excellent typical-case time performance.
 It partitions the array to be sorted, and each partition is in turn
sorted recursively.
 Hence it is also called partition-exchange sort.
Pseudo code

 Main function
1. Declare the unsorted array.
2. Set n to the total number of elements in the array.
3. Call the quickSort function.
4. Print the sorted array.
5. Stop.

 quickSort(arr[], low, high)
1. If low is less than high, do steps 2 through 5:
2. Set index to the return value of the partition function.
3. (index is the final position of the pivot element that divides the
array into two sub-arrays.)
4. Call the function recursively to sort the sub-arrays:
5. quickSort(arr, low, index – 1) and quickSort(arr, index + 1, high)
6. Stop.
 partition(arr[], low, high)
1. Set p to the element at position high of the array (the pivot).
2. Set i to low – 1 and j to low.
3. While j is less than high, do steps 4 through 7:
4.     If the element at position j of the array is less than p,
5.         increase i by 1
6.         and swap the elements at positions i and j.
7.     Increase j by 1.
8. Swap the elements at positions (i + 1) and high.
9. Return (i + 1).
10. Stop.
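The pseudo code above (a Lomuto-style partition with the last element as pivot) can be sketched as runnable Python, as an illustration rather than the slides' original source:

```python
def quick_sort(arr, low=0, high=None):
    """Sort arr[low..high] in place using quick sort."""
    if high is None:
        high = len(arr) - 1
    if low < high:
        index = partition(arr, low, high)   # pivot lands at `index`
        quick_sort(arr, low, index - 1)     # sort elements left of the pivot
        quick_sort(arr, index + 1, high)    # sort elements right of the pivot
    return arr

def partition(arr, low, high):
    """Partition arr[low..high] around the last element; return the pivot's index."""
    p = arr[high]        # pivot: last element of the range
    i = low - 1
    for j in range(low, high):
        if arr[j] < p:   # move elements smaller than the pivot to the left side
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]   # put the pivot in place
    return i + 1

print(quick_sort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
```

Note that the recursive calls exclude the pivot's own position: once the pivot is at `index`, only the ranges to its left and right still need sorting.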
Quick sort efficiency

 Time complexity
• Best case: O(n log n)
• Average case: O(n log n)
• Worst case: O(n^2)
• The worst case rarely occurs in practice, and it can be avoided by
choosing the pivot element more carefully (for example, a random or
median-of-three pivot).
THANK YOU
Do you have any questions?
