Sorting algorithm
From Wikipedia, the free encyclopedia
A sorting algorithm is an algorithm that puts elements of a list in a certain order. The most-used orders are numerical order and
lexicographical order. Efficient sorting is important for optimizing the use of other algorithms (such as search and merge
algorithms) which require input data to be in sorted lists; it is also often useful for canonicalizing data and for producing human-
readable output. More formally, the output must satisfy two conditions:
1. The output is in nondecreasing order (each element is no smaller than the previous element according to the desired total
order);
2. The output is a permutation (reordering) of the input.
Further, the data is often taken to be in an array, which allows random access, rather than a list, which only allows sequential
access, though often algorithms can be applied with suitable modification to either type of data.
Since the dawn of computing, the sorting problem has attracted a great deal of research, perhaps due to the complexity of solving it
efficiently despite its simple, familiar statement. For example, bubble sort was analyzed as early as 1956.[1] A fundamental limit of
comparison sorting algorithms is that they require linearithmic time, O(n log n), in the worst case, though better performance is
possible on real-world data (such as almost-sorted data), and algorithms not based on comparison, such as counting sort, can have
better performance. Although many consider sorting a solved problem (asymptotically optimal algorithms have been known since
the mid-20th century), useful new algorithms are still being invented, with the now widely used Timsort dating to 2002, and the
library sort being first published in 2006.
Sorting algorithms are prevalent in introductory computer science classes, where the abundance of algorithms for the problem
provides a gentle introduction to a variety of core algorithm concepts, such as big O notation, divide and conquer algorithms, data
structures such as heaps and binary trees, randomized algorithms, best, worst and average case analysis, time-space tradeoffs, and
upper and lower bounds.
Contents
1 Classification
  1.1 Stability
2 Comparison of algorithms
3 Popular sorting algorithms
  3.1 Simple sorts
    3.1.1 Insertion sort
    3.1.2 Selection sort
  3.2 Efficient sorts
    3.2.1 Merge sort
    3.2.2 Heapsort
    3.2.3 Quicksort
  3.3 Bubble sort and variants
    3.3.1 Bubble sort
    3.3.2 Shell sort
    3.3.3 Comb sort
  3.4 Distribution sort
    3.4.1 Counting sort
    3.4.2 Bucket sort
    3.4.3 Radix sort
4 Memory usage patterns and index sorting
5 Inefficient sorts
6 Related algorithms
7 See also
8 References
9 Further reading
10 External links
Classification
Sorting algorithms are often classified by:
■ Computational complexity (worst, average and best behavior) of element comparisons in terms of the size of the list (n). For typical serial sorting algorithms good behavior is O(n log n), with parallel sort in O(log² n), and bad behavior is O(n²). (See Big O notation.) Ideal behavior for a serial sort is O(n), but this is not possible in the average case. Optimal parallel sorting is O(log n). Comparison-based sorting algorithms, which evaluate the elements of the list via an abstract key comparison operation, need at least O(n log n) comparisons for most inputs.
■ Computational complexity of swaps (for "in place" algorithms).
■ Memory usage (and use of other computer resources). In particular, some sorting algorithms are "in place". Strictly, an in-place sort needs only O(1) memory beyond the items being sorted; sometimes O(log n) additional memory is considered "in place".
■ Recursion: some algorithms are either recursive or non-recursive, while others may be both (e.g., merge sort).
■ Stability: stable sorting algorithms maintain the relative order of records with equal keys (i.e., values).
■ Whether or not they are a comparison sort. A comparison sort examines the data only by comparing two elements with a comparison operator.
■ General method: insertion, exchange, selection, merging, etc. Exchange sorts include bubble sort and quicksort. Selection sorts include shaker sort and heapsort. Also whether the algorithm is serial or parallel. The remainder of this discussion almost exclusively concentrates upon serial algorithms and assumes serial operation.
■ Adaptability: whether or not the presortedness of the input affects the running time. Algorithms that take this into account are known to be adaptive.
Stability
When sorting some kinds of data, only part of the data is examined when determining the
sort order. For example, in the card-sorting example to the right, the cards are being sorted
by their rank, and their suit is being ignored. This allows the possibility of multiple different
correctly sorted versions of the original list. Stable sorting algorithms choose one of these,
according to the following rule: if two items compare as equal, like the two 5 cards, then
their relative order will be preserved, so that if one came before the other in the input, it will
also come before the other in the output.

More formally, the data being sorted can be represented as a record or tuple of values, and
the part of the data that is used for sorting is called the key. In the card example, cards are
represented as a record (rank, suit), and the key is the rank. A sorting algorithm is stable if
whenever there are two records R and S with the same key, and R appears before S in the
original list, then R will always appear before S in the sorted list.

When equal elements are indistinguishable, such as with integers, or more generally, any
data where the entire element is the key, stability is not an issue. Stability is also not an
issue if all keys are different.

Unstable sorting algorithms can be specially implemented to be stable. One way of doing
this is to artificially extend the key comparison, so that comparisons between two objects
with otherwise equal keys are decided using the order of the entries in the original input list
as a tie-breaker. Remembering this order, however, may require additional time and space.

[Figure: An example of stable sorting on playing cards. When the cards are sorted by rank with a stable sort, the two 5s must remain in the same order in the sorted output that they were originally in. When they are sorted with a non-stable sort, the 5s may end up in the opposite order in the sorted output.]
One application for stable sorting algorithms is sorting a list using a primary and secondary
key. For example, suppose we wish to sort a hand of cards such that the suits are in the order
clubs (♣), diamonds (♦), hearts (♥), spades (♠), and within each suit, the cards are sorted by
rank. This can be done by first sorting the cards by rank (using any sort), and then doing a
stable sort by suit, as in the sketch below.
Within each suit, the stable sort preserves the ordering by rank that was already done. This idea can be extended to any number of
keys, and is leveraged by radix sort. The same effect can be achieved with an unstable sort by using a lexicographic key
comparison, which, e.g., compares first by suit, and then compares by rank if the suits are the same.

Comparison of algorithms
In this table, n is the number of records to be sorted. The columns "Average" and "Worst" give the time complexity in each case,
under the assumption that the length of each key is constant, and that therefore all comparisons, swaps, and other needed operations
can proceed in constant time. "Memory" denotes the amount of auxiliary storage needed beyond that used by the list itself, under
the same assumption. The run times and the memory requirements listed below should be understood to be inside big O notation.
Logarithms are of any base; the notation log² n means (log n)².

These are all comparison sorts, and so cannot perform better than O(n log n) in the average or worst case.

Comparison sorts
Name | Best | Average | Worst | Memory | Stable | Method | Other notes
Quicksort | n log n | n log n | n² | log n on average, n in the worst case; the Sedgewick variation is log n worst case | Typical in-place sort is not stable; stable versions exist | Partitioning | Quicksort is usually done in place with O(log n) stack space. Most implementations are unstable, as stable in-place partitioning is more complex. Naive variants use an O(n) space array to store the partition. A quicksort variant using three-way (fat) partitioning takes O(n) comparisons when sorting an array of equal keys.
Merge sort | n log n | n log n | n log n | n | Yes | Merging | Highly parallelizable (up to O(log n) using the Three Hungarians' Algorithm[2] or, more practically, Cole's parallel merge sort) for processing large amounts of data.
In-place merge sort | — | — | n log² n | 1 | Yes | Merging | Can be implemented as a stable sort based on stable in-place merging.[3]
Heapsort | n log n | n log n | n log n | 1 | No | Selection | —
Insertion sort | n | n² | n² | 1 | Yes | Insertion | O(n + d), in the worst case over sequences that have d inversions.
Introsort | n log n | n log n | n log n | log n | No | Partitioning & Selection | Used in several STL implementations.
Selection sort | n² | n² | n² | 1 | No | Selection | Stable with O(n) extra space, for example using lists.
Timsort | n | n log n | n log n | n | Yes | Insertion & Merging | Makes n comparisons when the data is already sorted or reverse sorted.
Cubesort | n | n log n | n log n | n | Yes | Insertion | Makes n comparisons when the data is already sorted or reverse sorted.
Shell sort | n | n log² n | Depends on gap sequence; best known is n log² n | 1 | No | Insertion | Small code size, no use of call stack, reasonably fast, useful where memory is at a premium such as embedded and older mainframe applications.
Bubble sort | n | n² | n² | 1 | Yes | Exchanging | Tiny code size.
Binary tree sort | n log n | n log n | n log n (balanced) | n | Yes | Insertion | When using a self-balancing binary search tree.
Cycle sort | — | n² | n² | 1 | No | Insertion | In-place with theoretically optimal number of writes.
Library sort | n | n log n | n² | n | Yes | Insertion | —
Patience sorting | n | — | n log n | n | No | Insertion & Selection | Finds all the longest increasing subsequences in O(n log n).
Smoothsort | n | n log n | n log n | 1 | No | Selection | An adaptive sort: n−1 comparisons when the data is already sorted, and 0 swaps.
Strand sort | n | n² | n² | n | Yes | Selection | —
Tournament sort | — | n log n | n log n | n | No | Selection | Variation of Heapsort.
Cocktail sort | n | n² | n² | 1 | Yes | Exchanging | —
Comb sort | n log n | n² | n² | 1 | No | Exchanging | Small code size.
Gnome sort | n | n² | n² | 1 | Yes | Exchanging | Tiny code size.
UnShuffle sort[6] | kN | kN | kN | In place for linked lists; N·sizeof(link) for array | — | Distribution and Merge | No exchanges are performed. The parameter k is proportional to the entropy in the input; k = 1 for ordered or reverse-ordered input.
Franceschini's method[7] | — | n log n | n log n | 1 | Yes | — | —
Block sort | n | n log n | n log n | 1 | Yes | Insertion & Merging | Combines a block-based O(n) in-place merge algorithm[8] with a bottom-up merge sort.
Odd-even sort | n | n² | n² | 1 | Yes | Exchanging | Can be run on parallel processors easily.
The following table describes integer sorting algorithms and other sorting algorithms that are not comparison sorts. As such, they
are not limited by an Ω(n log n) lower bound. Complexities below assume n items to be sorted, with keys of size k, digit size d,
and r the range of numbers to be sorted. Many of them are based on the assumption that the key size is large enough that all entries
have unique key values, and hence that n ≪ 2^k, where ≪ means "much less than."

Non-comparison sorts
Name | Best | Average | Worst | Memory | Stable | n ≪ 2^k | Notes
Pigeonhole sort | — | n + 2^k | n + 2^k | 2^k | Yes | Yes | —
Bucket sort (uniform keys) | — | n + k | n²·k | n·k | Yes | No | Assumes uniform distribution of elements from the domain in the array.[10]
Bucket sort (integer keys) | — | n + r | n + r | n + r | Yes | Yes | If r is O(n), then Average is O(n).[9]
Counting sort | — | n + r | n + r | n + r | Yes | Yes | If r is O(n), then Average is O(n).[9]
LSD Radix Sort | — | n·(k/d) | n·(k/d) | n + 2^d | Yes | No | [9][10]
MSD Radix Sort | — | n·(k/d) | n·(k/d) | n + 2^d | Yes | No | Stable version uses an external array of size n to hold all of the bins.
MSD Radix Sort (in-place) | — | n·(k/d) | n·(k/d) | 2^d | No | No | k/d recursion levels, 2^d for count array.
Spreadsort | — | n·(k/d) | n·((k/s) + d) | (k/d)·2^d | No | No | Asymptotics are based on the assumption that n ≪ 2^k, but the algorithm does not require this.
Burstsort | — | n·(k/d) | n·(k/d) | n·(k/d) | No | No | Has a better constant factor than radix sort for sorting strings, though it relies somewhat on specifics of commonly encountered strings.
Flashsort | n | n + r | n² | n | No; can be made stable with additional O(n) space | No | Requires uniform distribution of elements from the domain in the array to run in linear time. If the distribution is extremely skewed then it can go quadratic if the underlying sort is quadratic (it is usually an insertion sort). The in-place version is not stable.
Postman sort | — | n·(k/d) | n·(k/d) | n + 2^d | — | No | A variation of bucket sort, which works very similarly to MSD Radix Sort. Specific to post-service needs.
Samplesort can be used to parallelize any of the non-comparison sorts, by efficiently distributing data into several buckets and then
passing down sorting to several processors, with no need to merge as buckets are already sorted between each other.
The following table describes some sorting algorithms that are impractical for real-life use due to extremely poor performance or
specialized hardware requirements.

Name | Best | Average | Worst | Memory | Stable | Comparison | Other notes
Bead sort | — | S | S | n² | N/A | No | Works only with positive integers. Requires specialized hardware for it to run in guaranteed O(n) time. There is a possibility for a software implementation, but running time will be O(S), where S is the sum of all integers to be sorted; in the case of small integers it can be considered to be linear.
Simple pancake sort | — | n | n | log n | No | Yes | Count is number of flips.
Spaghetti (Poll) sort | n | n | n | n² | Yes | Polling | This is a linear-time, analog algorithm for sorting a sequence of items, requiring O(n) stack space, and the sort is stable. This requires n parallel processors. See Spaghetti sort#Analysis.
Sorting network | log² n | log² n | log² n | n·log² n | Varies (stable sorting networks require more comparisons) | Yes | Order of comparisons is set in advance based on a fixed network size. Impractical for more than 32 items.
Bitonic sorter | log² n | log² n | log² n | n·log² n | No | Yes | An effective variation of sorting networks.
Bogosort | n | n·n! | ∞ (unbounded) | 1 | No | Yes | Random shuffling. Used for example purposes only, as a sort with unbounded worst-case running time.
Stooge sort | n^(log 3 / log 1.5) | n^(log 3 / log 1.5) | n^(log 3 / log 1.5) | n | No | Yes | Slower than most of the sorting algorithms (even naive ones) with a time complexity of O(n^(log 3 / log 1.5)) = O(n^2.7095...).
Theoretical computer scientists have detailed other sorting algorithms that provide better than O(n log n) time complexity assuming
additional constraints, including:

■ Han's algorithm, a deterministic algorithm for sorting keys from a domain of finite size, taking O(n log log n) time and O(n) space.[11]
■ Thorup's algorithm, a randomized algorithm for sorting keys from a domain of finite size, taking O(n log log n) time and O(n) space.[12]
■ A randomized integer sorting algorithm taking O(n √(log log n)) expected time and O(n) space.[13]
Popular sorting algorithms
While there are a large number of sorting algorithms, in practical implementations a few algorithms predominate. Insertion sort is
widely used for small data sets, while for large data sets an asymptotically efficient sort is used, primarily heap sort, merge sort, or
quicksort. Efficient implementations generally use a hybrid algorithm, combining an asymptotically efficient algorithm for the
overall sort with insertion sort for small lists at the bottom of a recursion. Highly tuned implementations use more sophisticated
variants, such as Timsort (merge sort, insertion sort, and additional logic), used in Android, Java, and Python, and introsort
(quicksort and heap sort), used (in variant forms) in some C++ sort implementations and in .NET.

For more restricted data, such as numbers in a fixed interval, distribution sorts such as counting sort or radix sort are widely used.
Bubble sort and variants are rarely used in practice, but are commonly found in teaching and theoretical discussions.

When physically sorting objects, such as alphabetizing papers (such as tests or books), people intuitively generally use insertion
sorts for small sets. For larger sets, people often first bucket, such as by initial letter, and multiple bucketing allows practical sorting
of very large sets. Often space is relatively cheap, such as by spreading objects out on the floor or over a large area, but operations
are expensive, particularly moving an object a large distance; locality of reference is important. Merge sorts are also practical for
physical objects, particularly as two hands can be used, one for each list to merge, while other algorithms, such as heap sort or
quick sort, are poorly suited for human use. Other algorithms, such as library sort, a variant of insertion sort that leaves spaces, are
also practical for physical use.
Simple sorts
Two of the simplest sorts are insertion sort and selection sort, both of which are efficient on small data, due to low overhead, but
not efficient on large data. Insertion sort is generally faster than selection sort in practice, due to fewer comparisons and good
performance on almost-sorted data, and thus is preferred in practice, but selection sort uses fewer writes, and thus is used when
write performance is a limiting factor.
Insertion sort
Main article: Insertion sort
Insertion sort is a simple sorting algorithm that is relatively efficient for small lists and mostly sorted lists, and is often used as part
of more sophisticated algorithms. It works by taking elements from the list one by one and inserting them in their correct position
into a new sorted list. In arrays, the new list and the remaining elements can share the array's space, but insertion is expensive,
requiring shifting all following elements over by one. Shell sort (see below) is a variant of insertion sort that is more efficient for
larger lists.
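A minimal in-place Python sketch of the array variant described above (illustrative, not taken from the article's sources):

    def insertion_sort(a):
        """Sort list a in place; stable, O(n^2) worst case, O(n) on sorted input."""
        for i in range(1, len(a)):
            key = a[i]
            j = i - 1
            # Shift larger elements one slot right to open a gap for key.
            while j >= 0 and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
        return a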
Selection sort

Main article: Selection sort
Selection sort is an in-place comparison sort. It has O(n²) complexity, making it inefficient on large lists, and generally performs
worse than the similar insertion sort. Selection sort is noted for its simplicity, and also has performance advantages over more
complicated algorithms in certain situations.

The algorithm finds the minimum value, swaps it with the value in the first position, and repeats these steps for the remainder of the
list.[15] It does no more than n swaps, and thus is useful where swapping is very expensive.
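A minimal Python sketch, showing the one-swap-per-pass behavior noted above (illustrative only):

    def selection_sort(a):
        """Sort list a in place; O(n^2) comparisons but at most n-1 swaps."""
        for i in range(len(a) - 1):
            m = i
            for j in range(i + 1, len(a)):
                if a[j] < a[m]:
                    m = j                   # index of the minimum of a[i:]
            if m != i:
                a[i], a[m] = a[m], a[i]     # at most one swap per pass
        return a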
Efficient sorts
Practical general sorting algorithms are almost always based on an algorithm with average complexity (and generally worst-case
complexity) O(n log n), of which the most common are heap sort, merge sort, and quicksort. Each has advantages and drawbacks,
with the most significant being that simple implementation of merge sort uses O(n) additional space, and simple implementation of
quicksort has O(n²) worst-case complexity. These problems can be solved or ameliorated at the cost of a more complex algorithm.

While these algorithms are asymptotically efficient on random data, for practical efficiency on real-world data various
modifications are used. First, the overhead of these algorithms becomes significant on smaller data, so often a hybrid algorithm is
used, commonly switching to insertion sort once the data is small enough. Second, the algorithms often perform poorly on already
sorted data or almost sorted data; these are common in real-world data, and can be sorted in O(n) time by appropriate algorithms.
Finally, they may also be unstable, and stability is often a desirable property in a sort. Thus more sophisticated algorithms are often
employed, such as Timsort (based on merge sort) or introsort (based on quicksort, falling back to heap sort).
Merge sort
Main article: Merge sort
Merge sort takes advantage of the ease of merging already sorted lists into a new sorted list. It starts by comparing every two
elements (i.e., 1 with 2, then 3 with 4...) and swapping them if the first should come after the second. It then merges each of the
resulting lists of two into lists of four, then merges those lists of four, and so on; until at last two lists are merged into the final
sorted list.[16] Of the algorithms described here, this is the first that scales well to very large lists, because its worst-case running
time is O(n log n). It is also easily applied to lists, not only arrays, as it only requires sequential access, not random access.
However, it has additional O(n) space complexity, and involves a large number of copies in simple implementations.

Merge sort has seen a relatively recent surge in popularity for practical implementations, due to its use in the sophisticated
algorithm Timsort, which is used for the standard sort routine in the programming languages Python[17] and Java (as of JDK7[18]).
Merge sort itself is the standard routine in Perl,[19] among others, and has been used in Java at least since 2000 in JDK1.3.[20][21]
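A minimal top-down Python sketch (the article describes the bottom-up formulation; top-down recursion is an equivalent, common variant, shown here for brevity):

    def merge_sort(a):
        """Return a new sorted list; stable, O(n log n) worst case, O(n) extra space."""
        if len(a) <= 1:
            return a
        mid = len(a) // 2
        left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
        # Merge the two sorted halves, taking from left on ties to keep stability.
        out, i, j = [], 0, 0
        while i < len(left) and j < len(right):
            if left[i] <= right[j]:
                out.append(left[i]); i += 1
            else:
                out.append(right[j]); j += 1
        out.extend(left[i:])
        out.extend(right[j:])
        return out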
Heapsort
Main article: Heapsort
Heapsort is a much more efficient version of selection sort. It also works by determining the largest (or smallest) element of the list,
placing that at the end (or beginning) of the list, then continuing with the rest of the list, but accomplishes this task efficiently by
using a data structure called a heap, a special type of binary tree.[22] Once the data list has been made into a heap, the root node is
guaranteed to be the largest (or smallest) element. When it is removed and placed at the end of the list, the heap is rearranged so the
largest element remaining moves to the root. Using the heap, finding the next largest element takes O(log n) time, instead of O(n)
for a linear scan as in simple selection sort. This allows Heapsort to run in O(n log n) time, and this is also the worst-case
complexity.
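A minimal Python sketch using an array-based binary max-heap, as described above (illustrative only):

    def heapsort(a):
        """Sort list a in place in O(n log n) using a binary max-heap."""
        def sift_down(root, end):
            # Restore the heap property for the subtree rooted at 'root' (<= end).
            while 2 * root + 1 <= end:
                child = 2 * root + 1
                if child + 1 <= end and a[child] < a[child + 1]:
                    child += 1                         # pick the larger child
                if a[root] < a[child]:
                    a[root], a[child] = a[child], a[root]
                    root = child
                else:
                    return
        n = len(a)
        for start in range(n // 2 - 1, -1, -1):        # build the heap bottom-up
            sift_down(start, n - 1)
        for end in range(n - 1, 0, -1):
            a[0], a[end] = a[end], a[0]                # move the max to its final slot
            sift_down(0, end - 1)
        return a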
Quicksort
Main article: Quicksort
Quicksort is a divide and conquer algorithm which relies on a partition operation: to partition an array, an element called a pivot is
selected.[24] All elements smaller than the pivot are moved before it and all greater elements are moved after it. This can be done
efficiently in linear time and in-place. The lesser and greater sublists are then recursively sorted. This yields average time
complexity of O(n log n), with low overhead, and thus this is a popular algorithm. Efficient implementations of quicksort (with in-
place partitioning) are typically unstable sorts and somewhat complex, but are among the fastest sorting algorithms in practice.
Together with its modest O(log n) space usage, quicksort is one of the most popular sorting algorithms and is available in many
standard programming libraries.

The important caveat about quicksort is that its worst-case performance is O(n²); while this is rare, in naive implementations
(choosing the first or last element as pivot) this occurs for sorted data, which is a common case. The most complex issue in
quicksort is thus choosing a good pivot element, as consistently poor choices of pivots can result in drastically slower O(n²)
performance, but good choice of pivots yields O(n log n) performance, which is asymptotically optimal. For example, if at each
step the median is chosen as the pivot then the algorithm works in O(n log n). Finding the median, such as by the median of
medians selection algorithm, is however an O(n) operation on unsorted lists and therefore exacts significant overhead with sorting.
In practice choosing a random pivot almost certainly yields O(n log n) performance.
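A minimal in-place Python sketch with a random pivot, as suggested above (the Lomuto partition scheme used here is one common choice, not the only one):

    import random

    def quicksort(a, lo=0, hi=None):
        """In-place quicksort sketch: random pivot, Lomuto partition."""
        if hi is None:
            hi = len(a) - 1
        if lo < hi:
            p = random.randint(lo, hi)      # random pivot guards against sorted input
            a[p], a[hi] = a[hi], a[p]
            pivot = a[hi]
            i = lo
            for j in range(lo, hi):         # partition: smaller elements before the pivot
                if a[j] < pivot:
                    a[i], a[j] = a[j], a[i]
                    i += 1
            a[i], a[hi] = a[hi], a[i]       # pivot lands in its final position
            quicksort(a, lo, i - 1)         # recursively sort the lesser sublist
            quicksort(a, i + 1, hi)         # ... and the greater sublist
        return a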
Bubble sort and variants
Bubble sort, and variants such as the cocktail sort, are simple but highly inefficient sorts. They are thus frequently seen in
introductory texts, and are of some theoretical interest due to ease of analysis, but they are rarely used in practice, and are primarily
of recreational interest. Some variants, such as the Shell sort, have open questions about their behavior.
Bubble sort
Main article: Bubble sort
Bubble sort is a simple sorting algorithm. The algorithm starts at the beginning of the data
set. It compares the first two elements, and if the first is greater than the second, it swaps
them. It continues doing this for each pair of adjacent elements to the end of the data set. It
then starts again with the first two elements, repeating until no swaps have occurred on the
last pass.[25] This algorithm's average and worst-case performance is O(n²), so it is rarely
used to sort large, unordered data sets. Bubble sort can be used to sort a small number of
items (where its asymptotic inefficiency is not a high penalty). Bubble sort can also be used
efficiently on a list of any length that is nearly sorted (that is, the elements are not
significantly out of place). For example, if any number of elements are out of place by only
one position (e.g., 0123546789 and 1032547698), bubble sort's exchange will get them in
order on the first pass, the second pass will find all elements in order, so the sort will take
only 2n time.

[Figure: A bubble sort, a sorting algorithm that continuously steps through a list, swapping items until they appear in the correct order.]
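A minimal Python sketch with the early-exit pass described above (illustrative only):

    def bubble_sort(a):
        """Sort in place; stops as soon as a pass makes no swaps (O(n) if sorted)."""
        n = len(a)
        while True:
            swapped = False
            for i in range(1, n):
                if a[i - 1] > a[i]:
                    a[i - 1], a[i] = a[i], a[i - 1]
                    swapped = True
            n -= 1                  # the largest unsorted element is now in place
            if not swapped:
                return a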
Shell sort
Main article: Shell sort

Shell sort was invented by Donald Shell in 1959. It improves upon bubble sort and insertion sort by moving out-of-order elements
more than one position at a time. One implementation can be described as arranging the data sequence in a two-dimensional array
and then sorting the columns of the array using insertion sort.

[Figure: A Shell sort, different from bubble sort in that it moves elements to numerous swapping positions.]
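A minimal Python sketch; the halving gap sequence used here is Shell's original choice, and better sequences are known:

    def shell_sort(a):
        """Gapped insertion sort: sort elements 'gap' apart, shrinking the gap to 1."""
        gap = len(a) // 2
        while gap > 0:
            for i in range(gap, len(a)):
                key, j = a[i], i
                # Insertion sort restricted to elements gap positions apart.
                while j >= gap and a[j - gap] > key:
                    a[j] = a[j - gap]
                    j -= gap
                a[j] = key
            gap //= 2
        return a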
Comb sort
Main article: Comb sort
Comb sort is a relatively simple sorting algorithm originally designed by Wlodzimierz
Dobosiewicz in 1980.[26] It was later rediscovered and popularized by Stephen Lacey and
Richard Box with a Byte Magazine article published in April 1991. Comb sort improves on
bubble sort. The basic idea is to eliminate turtles, or small values near the end of the list,
since in a bubble sort these slow the sorting down tremendously. (Rabbits, large values
around the beginning of the list, do not pose a problem in bubble sort.)
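A minimal Python sketch; the shrink factor 1.3 is the value popularized by Lacey and Box:

    def comb_sort(a):
        """Bubble sort with a shrinking gap, so turtles move toward the front quickly."""
        gap, swapped = len(a), True
        while gap > 1 or swapped:
            gap = max(1, int(gap / 1.3))    # shrink the gap each pass
            swapped = False
            for i in range(len(a) - gap):
                if a[i] > a[i + gap]:
                    a[i], a[i + gap] = a[i + gap], a[i]
                    swapped = True
        return a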
Distribution sort
See also: External sorting
Distribution sort refers to any sorting algorithm where data are distributed from their input to multiple intermediate structures
which are then gathered and placed on the output. For example, both bucket sort and flashsort are distribution-based sorting
algorithms. Distribution sorting algorithms can be used on a single processor, or they can be a distributed algorithm, where
individual subsets are separately sorted on different processors, then combined. This allows external sorting of data too large to fit
into a single computer's memory.
Counting sort
Main article: Counting sort
Counting sort is applicable when each input is known to belong to a particular set, S, of possibilities. The algorithm runs in O(|S| +
n) time and O(|S|) memory where n is the length of the input. It works by creating an integer array of size |S| and using the ith bin to
count the occurrences of the ith member of S in the input. Each input is then counted by incrementing the value of its corresponding
bin. Afterward, the counting array is looped through to arrange all of the inputs in order. This sorting algorithm often cannot be
used because S needs to be reasonably small for the algorithm to be efficient, but it is extremely fast and demonstrates great
asymptotic behavior as n increases. It also can be modified to provide stable behavior.
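A minimal Python sketch for integer keys drawn from range(size); this bare form regenerates keys from the counts, so it is the simple (non-record, unstable) variant, whereas the stable variant uses prefix sums:

    def counting_sort(a, size):
        """Sort integers in range(size) in O(n + size) time and O(size) memory."""
        counts = [0] * size
        for x in a:
            counts[x] += 1              # count occurrences of each key
        out = []
        for value, c in enumerate(counts):
            out.extend([value] * c)     # emit each key as many times as it occurred
        return out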
Bucket sort
Main article: Bucket sort
Bucket sort is a divide and conquer sorting algorithm that generalizes counting sort by partitioning an array into a finite number of
buckets. Each bucket is then sorted individually, either using a different sorting algorithm, or by recursively applying the bucket
sorting algorithm.
A bucket sort works best when the elements of the data set are evenly distributed across all buckets.
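A minimal Python sketch assuming values uniformly distributed in the half-open interval [0, 1):

    def bucket_sort(a, n_buckets=10):
        """Distribute values in [0, 1) into buckets, sort each, then concatenate."""
        buckets = [[] for _ in range(n_buckets)]
        for x in a:
            buckets[int(x * n_buckets)].append(x)   # assumes 0 <= x < 1
        out = []
        for b in buckets:
            out.extend(sorted(b))                   # any per-bucket sort works
        return out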
Radix sort
Main article: Radix sort
Radix sort is an algorithm that sorts numbers by processing individual digits. n numbers consisting of k digits each are sorted in
O(n · k) time. Radix sort can process digits of each number either starting from the least significant digit (LSD) or starting from the
most significant digit (MSD). The LSD algorithm first sorts the list by the least significant digit while preserving their relative order
using a stable sort. Then it sorts them by the next digit, and so on from the least significant to the most significant, ending up with a
sorted list. While the LSD radix sort requires the use of a stable sort, the MSD radix sort algorithm does not (unless stable sorting is
desired). In-place MSD radix sort is not stable. It is common for the counting sort algorithm to be used internally by the radix sort.
A hybrid sorting approach, such as using insertion sort for small bins, improves performance of radix sort significantly.
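A minimal LSD Python sketch for non-negative integers, using a stable bucket pass per digit:

    def radix_sort(a, base=10):
        """LSD radix sort for non-negative integers."""
        if not a:
            return a
        digit = 1
        while digit <= max(a):
            buckets = [[] for _ in range(base)]
            for x in a:                          # stable: equal digits keep their order
                buckets[(x // digit) % base].append(x)
            a = [x for b in buckets for x in b]  # concatenate buckets
            digit *= base
        return a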
Memory usage patterns and index sorting
When the size of the array to be sorted approaches or exceeds the available primary memory, so that (much slower) disk or swap
space must be employed, the memory usage pattern of a sorting algorithm becomes important, and an algorithm that might have
been fairly efficient when the array fit easily in RAM may become impractical. In this scenario, the total number of comparisons
becomes (relatively) less important, and the number of times sections of memory must be copied or swapped to and from the disk
can dominate the performance characteristics of an algorithm. Thus, the number of passes and the localization of comparisons can
be more important than the raw number of comparisons, since comparisons of nearby elements to one another happen at system bus
speed (or, with caching, even at CPU speed), which, compared to disk speed, is virtually instantaneous.

For example, the popular recursive quicksort algorithm provides quite reasonable performance with adequate RAM, but due to the
recursive way that it copies portions of the array it becomes much less practical when the array does not fit in RAM, because it may
cause a number of slow copy or move operations to and from disk. In that scenario, another algorithm may be preferable even if it
requires more total comparisons.
One way to work around this problem, which works well when complex records (such as in a relational database) are being sorted
by a relatively small key field, is to create an index into the array and then sort the index, rather than the entire array. (A sorted
version of the entire array can then be produced with one pass, reading from the index, but often even that is unnecessary, as having
the sorted index is adequate.) Because the index is much smaller than the entire array, it may fit easily in memory where the entire
array would not, effectively eliminating the disk-swapping problem. This procedure is sometimes called "tag sort".[27]
Another technique for overcoming the memory-size problem is to combine two algorithms in a way that takes advantage of the
strength of each to improve overall performance. For instance, the array might be subdivided into chunks of a size that will fit in
RAM, the contents of each chunk sorted using an efficient algorithm (such as quicksort), and the results merged using a k-way
merge similar to that used in mergesort. This is faster than performing either mergesort or quicksort over the entire list.
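A minimal Python sketch of the chunk-merging step, using the standard library's heapq.merge for the k-way merge (the chunk contents are invented; in the external-sorting scenario each chunk would be a sorted run streamed from disk):

    import heapq

    chunks = [[1, 4, 9], [2, 3, 8], [5, 6, 7]]    # runs already sorted in RAM
    merged = list(heapq.merge(*chunks))            # lazy k-way merge of sorted runs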
Techniques can also be combined. For sorting very large sets of data that vastly exceed system memory, even the index may need
to be sorted using an algorithm or combination of algorithms designed to perform reasonably with virtual memory, i.e., to reduce
the amount of swapping required.
Inefficient sorts
Some algorithms are slow compared to those discussed above, such as the bogosort, with O(n·n!) average running time, and the stooge sort, with O(n^2.7) running time.
Related algorithms
Related problems include partial sorting (sorting only the k smallest elements of a list, or alternatively computing the k smallest
elements, but unordered) and selection (computing the kth smallest element). These can be solved inefficiently by a total sort, but
more efficient algorithms exist, often derived by generalizing a sorting algorithm. The most notable example is quickselect, which
is related to quicksort. Conversely, some sorting algorithms can be derived by repeated application of a selection algorithm;
quicksort and quickselect can be seen as the same pivoting move, differing only in whether one recurses on both sides (quicksort,
divide and conquer) or one side (quickselect, decrease and conquer), as the sketch below illustrates.
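A minimal Python sketch of quickselect, recursing into one side only (this out-of-place form is chosen for clarity; practical versions partition in place):

    import random

    def quickselect(a, k):
        """Return the k-th smallest element of a (k counted from 0)."""
        pivot = random.choice(a)                    # quicksort's pivoting move
        less = [x for x in a if x < pivot]
        equal = [x for x in a if x == pivot]
        if k < len(less):
            return quickselect(less, k)             # recurse into one side only
        if k < len(less) + len(equal):
            return pivot
        greater = [x for x in a if x > pivot]
        return quickselect(greater, k - len(less) - len(equal))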
A kind of opposite of a sorting algorithm is a shuffling algorithm. These are fundamentally different because they require a source
of random numbers. Interestingly, shuffling can also be implemented by a sorting algorithm, namely by a random sort: assigning a
random number to each element of the list and then sorting based on the random numbers. This is generally not done in practice,
however, and there is a well-known simple and efficient algorithm for shuffling: the Fisher-Yates shuffle, sketched below.
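A minimal Python sketch of the Fisher-Yates shuffle:

    import random

    def fisher_yates_shuffle(a):
        """Uniform in-place shuffle in O(n); every permutation is equally likely."""
        for i in range(len(a) - 1, 0, -1):
            j = random.randint(0, i)    # pick from the not-yet-fixed prefix
            a[i], a[j] = a[j], a[i]
        return a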
Sorting algorithms are also given for parallel computers. These algorithms can all be run on a single instruction stream, multiple
data stream computer. Habermann's Parallel Neighbourhood Sort[28] sorts k elements using k processors in k steps. This article[29]
introduces optimal algorithms for parallel computers, where r·k elements can be sorted using k processors in k steps.
See also
■ Collation
■ Sorting network (compare)
■ Schwartzian transform
■ Search algorithm
■ Quantum sort
References
1. Demuth, H. Electronic Data Sorting. PhD thesis, Stanford University, 1956.
2. Ajtai, M.; Komlós, J.; Szemerédi, E. (1983). An O(n log n) sorting network. STOC '83. Proceedings of the fifteenth annual ACM symposium on Theory of computing: 1-9. doi:10.1145/800061.808726. ISBN 0-89791-099-0.
3. Huang, B. C.; Langston, M. A. (December 1992). "Fast Stable Merging and Sorting in Constant Extra Space" (PDF). Comput. J. 35 (6): 643-650. doi:10.1093/comjnl/35.6.643. CiteSeerX: 10.1.1.54.8381.
4. http://www.algolist.net/Algorithms/Sorting/Selection_sort
5. http://dbs.uni-leipzig.de/skripte/ADS1/PDF4/kap4.pdf
6. Kagel, Art (November 1985). "Unshuffle, Not Quite a Sort". Computer Language 2 (11).
7. Franceschini, G. (June 2007). "Sorting Stably, in Place, with O(n log n) Comparisons and O(n) Moves". Theory of Computing Systems 40 (4): 327-353. doi:10.1007/s00224-006-1311-1.
8. Kim, P. S.; Kutzner, A. (2008). Ratio Based Stable In-Place Merging. TAMC 2008. Theory and Applications of Models of Computation. LNCS 4978: 246-257. doi:10.1007/978-3-540-79228-4_22. ISBN 978-3-540-79227-7.
9. Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L.; Stein, Clifford (2001) [1990]. Introduction to Algorithms (2nd ed.). MIT Press and McGraw-Hill. ISBN 0-262-03293-7.
10. Goodrich, Michael T.; Tamassia, Roberto (2002). "4.5 Bucket-Sort and Radix-Sort". Algorithm Design: Foundations, Analysis, and Internet Examples. John Wiley & Sons. pp. 241-243. ISBN 0-471-38365-1.
11. Han, Y. (January 2004). "Deterministic sorting in O(n log log n) time and linear space". Journal of Algorithms 50: 96-105. doi:10.1016/j.jalgor.2003.09.001.
12. Thorup, M. (February 2002). "Randomized Sorting in O(n log log n) Time and Linear Space Using Addition, Shift, and Bit-wise Boolean Operations". Journal of Algorithms 42 (2): 205-230. doi:10.1006/jagm.2002.1211.
13. Han, Yijie; Thorup, M. (2002). Integer sorting in O(n √(log log n)) expected time and linear space. FOCS 2002. The 43rd Annual IEEE Symposium on Foundations of Computer Science, 2002. Proceedings: 135-144. doi:10.1109/SFCS.2002.1181890. ISBN 0-7695-1822-2.
14. Wirth, Niklaus (1986). Algorithms & Data Structures. Upper Saddle River, NJ: Prentice-Hall. pp. 76-77. ISBN 0130220051.
15. Wirth 1986, pp. 79-80.
16. Wirth 1986, pp. 101-102.
17. Tim Peters's original description of timsort (http://svn.python.org/projects/python/trunk/Objects/listsort.txt)
18. OpenJDK's TimSort.java (http://cr.openjdk.java.net/~martin/webrevs/openjdk7/timsort/raw_files/new/src/share/classes/java/util/TimSort.java)
19. Perl sort documentation (http://perldoc.perl.org/functions/sort.html)
20. Merge Sort in Java 1.3 (http://java.sun.com/j2se/1.3/docs/api/java/util/Arrays.html#sort(java.lang.Object[])), Sun.
21. Java 1.3 live since 2000.
22. Wirth 1986, pp. 87-89.
23. Wirth 1986, p. 93.
24. Cormen, Thomas H.; Leiserson, Charles E.; Rivest, Ronald L.; Stein, Clifford (2009). Introduction to Algorithms (3rd ed.). Cambridge, MA: The MIT Press. pp. 171-172. ISBN 0262033844.
25. Wirth 1986, pp. 81-82.
26. Brejová, B. (15 September 2001). "Analyzing variants of Shellsort". Inform. Process. Lett. 79 (5): 223-227. doi:10.1016/S0020-0190(00)00223.
27. Definition of "tag sort" according to PC Magazine (http://www.pcmag.com/encyclopedia_term/0,2542,t=tag+sort&i=52532,00.asp)
28. http://repository.cmu.edu/cgi/viewcontent.cgi?article=3086&context=compsci
29. http://repository.cmu.edu/cgi/viewcontent.cgi?article=2876&context=compsci
Further reading
■ Knuth, Donald E. (1998), Sorting and Searching, The Art of Computer Programming 3 (2nd ed.), Boston: Addison-Wesley,
ISBN 0201896850
External links
■ Sorting Algorithm Animations (http://www.sorting-algorithms.com/) - Graphical illustration of how different algorithms handle different kinds of data sets.
■ Sequential and parallel sorting algorithms - Explanations and analyses of many sorting algorithms.
■ Dictionary of Algorithms, Data Structures, and Problems (http://www.nist.gov/dads/)
■ Slightly Skeptical View on Sorting Algorithms (http://www.softpanorama.org/Algorithms/sorting.shtml) - Discusses several classic algorithms and promotes alternatives to the quicksort algorithm.
■ 15 Sorting Algorithms in 6 Minutes (YouTube) - Visualization and "audibilization" of 15 sorting algorithms in 6 minutes.
■ A036604 sequence in OEIS database titled "Sorting numbers: minimal number of comparisons needed to sort n elements" (https://oeis.org/A036604), which is performed by the Ford-Johnson algorithm.
Retrieved from "https://en.wikipedia.org/w/index.php?title=Sorting_algorithm&oldid=684365641"
Categories: Sorting algorithms | Data processing

■ This page was last modified on 6 October 2015, at 05:32.
■ Text is available under the Creative Commons Attribution-ShareAlike License; additional terms may apply. By using this site, you agree to the Terms of Use and Privacy Policy. Wikipedia® is a registered trademark of the Wikimedia Foundation, Inc., a non-profit organization.