
Sorting

Insertion sort: works by taking elements from the list one by one and inserting
each into its correct position in a new sorted list. In arrays, the sorted list and
the remaining elements can share the array's space.
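A minimal in-place Python sketch of this idea (the function name and test values are illustrative, not from the original text):

```python
def insertion_sort(a):
    """Sort list a in place: a[:i] is the sorted prefix sharing the array's space."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements of the sorted prefix one slot to the right.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key  # insert the element into its correct position
    return a
```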

Selection sort: works as follows:

1. Find the minimum value in the list
2. Swap it with the value in the first position
3. Repeat the steps above for the remainder of the list (starting at the second position and advancing each time)
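The three steps above can be sketched in Python as follows (names are illustrative):

```python
def selection_sort(a):
    """Sort list a in place by repeated selection of the minimum."""
    n = len(a)
    for i in range(n - 1):
        # Step 1: find the index of the minimum value in the remainder a[i:].
        m = i
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        # Step 2: swap it with the value in the current first position.
        a[i], a[m] = a[m], a[i]
        # Step 3: the loop advances i, repeating on the rest of the list.
    return a
```

Note that the loop performs at most one swap per position, which is the Θ(n)-swaps property discussed next.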

Selection sort always performs Θ(n) swaps, while insertion sort performs Θ(n²)
swaps in the average and worst cases. Because swaps require writing to the
array, selection sort is preferable if writing to memory is expensive.

Bubble sort: The algorithm starts at the beginning of the data set. It compares
the first two elements, and if the first is greater than the second, it swaps
them. It continues doing this for each pair of adjacent elements to the end of
the data set. It then starts again with the first two elements, repeating until no
swaps have occurred on the last pass.
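The description above can be rendered as a short Python sketch; the `swapped` flag implements "repeating until no swaps have occurred on the last pass":

```python
def bubble_sort(a):
    """Sort list a in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(a)
    while True:
        swapped = False
        # One pass: compare each pair of adjacent elements to the end of the data.
        for i in range(n - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        # Stop once a full pass completes with no swaps.
        if not swapped:
            return a
```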

Shell Sort improves insertion sort by comparing elements separated by a gap
of several positions. This lets an element take "bigger steps" toward its
expected position. Multiple passes over the data are taken with smaller and
smaller gap sizes. The last step of Shell sort is a plain insertion sort, but by
then, the array of data is guaranteed to be almost sorted.
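A sketch of this gap-based scheme in Python, assuming the simple halving gap sequence (n/2, n/4, ..., 1); many other gap sequences exist:

```python
def shell_sort(a):
    """Sort list a in place using gapped insertion sort with halving gaps."""
    gap = len(a) // 2
    while gap > 0:
        # Gapped insertion sort: elements take "bigger steps" of size gap.
        for i in range(gap, len(a)):
            key = a[i]
            j = i
            while j >= gap and a[j - gap] > key:
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 2  # the final pass (gap == 1) is a plain insertion sort
    return a
```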

Merge Sort: starts by comparing every two elements (i.e., 1 with 2, then 3 with
4...) and swapping them if the first should come after the second. It then
merges each of the resulting lists of two into lists of four, then merges those
lists of four, and so on; until at last two lists are merged into the final sorted
list.
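This bottom-up merging can be sketched in Python as follows (function names are illustrative):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:  # <= keeps equal keys in order (stable)
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    return out + left[i:] + right[j:]

def merge_sort_bottom_up(a):
    """Merge runs of 1 into runs of 2, then 4, and so on, until one list remains."""
    runs = [[x] for x in a]
    while len(runs) > 1:
        merged = []
        for k in range(0, len(runs) - 1, 2):
            merged.append(merge(runs[k], runs[k + 1]))
        if len(runs) % 2:  # an odd run carries over to the next round
            merged.append(runs[-1])
        runs = merged
    return runs[0] if runs else []
```

Unlike the in-place sorts above, this builds new lists while merging, which is the extra space that question 3 below asks about.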

Heap Sort: works by determining the largest element of the list, placing that at
the end of the list, then continuing with the rest of the list, but accomplishes
this task efficiently by using a data structure called a heap, a special type of
binary tree.
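One common way to realize this in Python is with an array-based max-heap, a sketch of which is below (the heap lives inside the array itself, so no extra space is needed):

```python
def heap_sort(a):
    """Sort list a in place using an array-based max-heap."""
    n = len(a)

    def sift_down(root, end):
        # Push a[root] down until the max-heap property holds within a[:end].
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1  # pick the larger of the two children
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    # Build a max-heap: the largest element rises to index 0.
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)
    # Repeatedly place the largest element at the end of the unsorted region.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```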
Quick Sort is a divide and conquer algorithm which relies on
a partition operation: to partition an array, we choose an element, called
a pivot, move all smaller elements before the pivot, and move all greater
elements after it. This can be done efficiently in linear time and in place. We
then recursively sort the lesser and greater sublists. The most complex issue
in quicksort is choosing a good pivot element; consistently poor choices of
pivots can result in drastically slower O(n²) performance, but if at each step
we choose a random entry as the pivot, the runtime is O(n log n).
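A sketch of quicksort with a random pivot, using the Lomuto partition scheme (one of several common in-place partition strategies):

```python
import random

def quick_sort(a, lo=0, hi=None):
    """Sort list a in place; random pivots avoid consistently poor choices."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        # Choose a random entry as the pivot and move it to the end.
        p = random.randint(lo, hi)
        a[p], a[hi] = a[hi], a[p]
        pivot = a[hi]
        # Partition: smaller elements end up before index i, greater after.
        i = lo
        for j in range(lo, hi):
            if a[j] < pivot:
                a[i], a[j] = a[j], a[i]
                i += 1
        a[i], a[hi] = a[hi], a[i]  # pivot lands in its final position
        # Recursively sort the lesser and greater sublists.
        quick_sort(a, lo, i - 1)
        quick_sort(a, i + 1, hi)
    return a
```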

More information about these sorts and their code can be seen by clicking on the
algorithm name in the next activity.

Watching the algorithms work is good, but can you apply what you have seen?

For each of the following questions, first use your understanding of the
algorithm to form a conjecture about the answer. Second, run an experiment
to test your conjecture. Third, if your conjecture did not agree with the actual
results, explain the differences.

1. If the data is mostly in order, which algorithm is the fastest?

2. How is quicksort affected by the initial order of data? In particular, if we
have few unique values, does that make the quicksort faster or slower?

3. Which algorithms require extra space?

Understanding Sorting Methods

Sorts have been studied for decades. The ideal sort depends on the initial
order of the data and whether stability is required.

Stability: A sort is said to be stable if it maintains the relative order of records
with equal keys. If all keys are different, this distinction is unnecessary.
But if there are equal keys, then a sorting algorithm is stable if, whenever two
records (say, R and S) have the same key and R appears before
S in the original list, R will always appear before S in the sorted list.
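A small Python illustration of this definition, using Python's built-in `sorted`, which is guaranteed stable (the record values are made up for the example):

```python
# Two records R and S share the key 2; a distinguishing label is the second field.
records = [(2, 'R'), (1, 'x'), (2, 'S')]

# A stable sort keeps R before S, since R appeared before S in the original list.
by_key = sorted(records, key=lambda rec: rec[0])
```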

Non-oblivious (adaptable): A sort is said to be non-oblivious or adaptable if the
presortedness of the input reduces the running time.

Memory Usage: Some algorithms store the values in the original array while it
is being sorted and don't need any extra space. We call these algorithms in
place. Recursive methods need extra space for a runtime stack. Others
require working space in addition to the original storage space.

Complexity: Some sorts are O(n log n) while others are O(n²).

Some sorting methods are negatively impacted by data that is in reverse order
or data that has very few unique entries.

In the sort detective that follows, you are able to explore various properties
of sorting algorithms. Notice:

a. The specific data to be sorted affects running time. The same sort doesn't always
take the same time for a specific data size.

b. For large data sets, the difference in speed between algorithms is dramatic.
