The document outlines the time complexity of various sorting algorithms under special cases, such as when all elements are the same or when the array is already sorted. QuickSort can degrade to O(n²) in these scenarios without a good pivot, while algorithms like Insertion Sort and Counting Sort perform efficiently at O(n). Merge Sort and Heap Sort maintain O(n log n) complexity regardless of the input condition.

## Time Complexity of Sorting Algorithms in Special Cases

| Sorting Algorithm | All Elements Are the Same | Already Sorted Array |
|---|---|---|
| Bubble Sort | O(n²) (still compares all elements) | O(n) (best case: one pass with no swaps) |
| Selection Sort | O(n²) (still makes all comparisons) | O(n²) (no improvement) |
| Insertion Sort | O(n) (no shifts needed) | O(n) (best case, no shifts needed) |
| Merge Sort | O(n log n) (still splits and merges) | O(n log n) (no advantage) |
| QuickSort (Lomuto / naïve partitioning) | O(n²) (bad pivot selection, unbalanced partitions) | O(n²) (if the worst pivot is always picked) |
| QuickSort (Hoare / randomized partitioning) | O(n log n) (if implemented well) | O(n log n) (a balanced pivot helps) |
| Heap Sort | O(n log n) (still builds the heap) | O(n log n) (no improvement) |
| Counting Sort | O(n) (efficient, works well) | O(n) (linear time) |
| Radix Sort | O(nk) (k = number of digits) | O(nk) (same as for identical elements) |
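
The contrast between the two QuickSort rows can be made concrete. Below is a minimal Python sketch (illustrative only, not taken from this document): `naive_quicksort` uses Lomuto partitioning with the last element as pivot, which produces maximally unbalanced partitions on sorted or all-equal input, while `randomized_quicksort` uses Hoare partitioning with a randomly chosen pivot, which keeps both special cases close to O(n log n). The function names and the small driver at the bottom are assumptions made for this example.

```python
import random

def naive_quicksort(arr, lo, hi):
    """Lomuto partition with the last element as pivot. On sorted or
    all-equal input every partition is maximally unbalanced: O(n^2) work."""
    while lo < hi:
        pivot, i = arr[hi], lo
        for j in range(lo, hi):
            if arr[j] < pivot:
                arr[i], arr[j] = arr[j], arr[i]
                i += 1
        arr[i], arr[hi] = arr[hi], arr[i]
        # Recurse on the smaller side and loop on the larger side,
        # so the recursion depth stays small even in the worst case.
        if i - lo < hi - i:
            naive_quicksort(arr, lo, i - 1)
            lo = i + 1
        else:
            naive_quicksort(arr, i + 1, hi)
            hi = i - 1

def randomized_quicksort(arr, lo, hi):
    """Hoare partition around a randomly chosen pivot. Sorted input gets
    balanced splits on average, and equal elements are split roughly in
    half, so both special cases stay around O(n log n)."""
    if lo >= hi:
        return
    k = random.randint(lo, hi)
    arr[lo], arr[k] = arr[k], arr[lo]   # move the random pivot to the front
    pivot = arr[lo]
    i, j = lo - 1, hi + 1
    while True:
        i += 1
        while arr[i] < pivot:
            i += 1
        j -= 1
        while arr[j] > pivot:
            j -= 1
        if i >= j:
            break
        arr[i], arr[j] = arr[j], arr[i]
    randomized_quicksort(arr, lo, j)
    randomized_quicksort(arr, j + 1, hi)

if __name__ == "__main__":
    for name, data in [("already sorted", list(range(2000))),
                       ("all elements same", [7] * 2000)]:
        a, b = data[:], data[:]
        naive_quicksort(a, 0, len(a) - 1)        # ~n^2/2 comparisons here
        randomized_quicksort(b, 0, len(b) - 1)   # ~n log n comparisons on average
        print(name, a == sorted(data), b == sorted(data))
```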

### Explanation of Worst-Case Behavior

1. **QuickSort:**
   - If all elements are the same, Lomuto partitioning cannot split the array into balanced halves (every element equals the pivot), so it degrades to O(n²).
   - If the array is already sorted and a bad pivot is chosen (such as the first or last element), the partitions are maximally unbalanced, again giving O(n²).
   - A randomized pivot combined with Hoare partitioning brings the expected cost down to O(n log n) in both cases (see the sketch after the table above).

2. **Insertion Sort:**
   - Runs in O(n) in both cases: each new element is already no smaller than the one before it, so the inner loop stops after a single comparison and no shifts are needed (see the insertion sort sketch after this list).

3. **Bubble Sort:**
   - The plain version still makes all O(n²) comparisons on identical elements; with the early-exit (no-swap flag) optimization, an already sorted array is detected after one pass in O(n), and since an all-equal array is also sorted, the optimized version handles it in O(n) as well.

4. **Merge Sort and Heap Sort:**
   - Always take O(n log n): merge sort performs every split and merge regardless of input order, and heap sort always builds the heap and extracts all n elements.

5. **Counting Sort:**
   - Runs in O(n + k), where k is the size of the value range, and never looks at the input order, so identical or already sorted arrays are handled in linear time (see the counting sort sketch after this list).
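
To see why insertion sort is linear in both special cases, here is a minimal sketch (illustrative; the comparison counter is added only to make the behaviour visible): when the array is already sorted, or all elements are equal, the inner loop stops after one comparison per element, giving n − 1 comparisons and no shifts in total.

```python
def insertion_sort(arr):
    """Standard insertion sort; returns the number of key comparisons
    so the best-case behaviour is easy to observe."""
    comparisons = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0:
            comparisons += 1
            if arr[j] <= key:          # already in place: stop immediately
                break
            arr[j + 1] = arr[j]        # shift the larger element right
            j -= 1
        arr[j + 1] = key
    return comparisons

print(insertion_sort(list(range(1000))))        # already sorted: 999 comparisons
print(insertion_sort([5] * 1000))               # all equal: 999 comparisons
print(insertion_sort(list(range(1000, 0, -1)))) # reverse sorted: ~n^2/2 comparisons
```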

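The counting sort row can be illustrated with a similar sketch (illustrative; it assumes non-negative integer keys): the cost is two linear passes over the input plus one pass over the value range, O(n + k), and the input order never enters into it.

```python
def counting_sort(arr):
    """Counting sort for non-negative integers: O(n + k) time,
    where k is the largest key value. Input order is irrelevant."""
    if not arr:
        return []
    counts = [0] * (max(arr) + 1)
    for x in arr:                        # O(n): tally each value
        counts[x] += 1
    out = []
    for value, c in enumerate(counts):   # O(k): emit each value in order
        out.extend([value] * c)
    return out

print(counting_sort([7, 7, 7, 7]))        # all elements the same -> [7, 7, 7, 7]
print(counting_sort([1, 2, 3, 4, 5]))     # already sorted -> unchanged
print(counting_sort([4, 1, 3, 1, 0, 2]))  # general case -> [0, 1, 1, 2, 3, 4]
```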