Any sorting algorithm that only uses comparisons between elements has a worst-case lower bound of Ω(n log n) comparisons. This is provable via decision-tree analysis.
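The decision-tree argument gives a concrete number: a comparison sort must distinguish all n! input orderings, so it needs at least ⌈log₂(n!)⌉ comparisons in the worst case. A quick check with the standard library (function name is illustrative):

```python
import math

def min_comparisons(n):
    # A binary decision tree needs at least n! leaves (one per input
    # ordering), so its depth is at least ceil(log2(n!)).
    return math.ceil(math.log2(math.factorial(n)))

for n in (4, 8, 1024):
    print(n, min_comparisons(n))
```

For n = 8 this gives 16, already close in growth to n log₂ n = 24; by Stirling's approximation, log₂(n!) grows as n log₂ n.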
| Algorithm      | Best       | Average    | Worst      | Space    | Stable? |
|----------------|------------|------------|------------|----------|---------|
| Merge sort     | O(n log n) | O(n log n) | O(n log n) | O(n)     | Yes     |
| Quick sort     | O(n log n) | O(n log n) | O(n²)      | O(log n) | No      |
| Heap sort      | O(n log n) | O(n log n) | O(n log n) | O(1)     | No      |
| Insertion sort | O(n)       | O(n²)      | O(n²)      | O(1)     | Yes     |
Merge Sort — Divide & Conquer
```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    L = merge_sort(arr[:mid])
    R = merge_sort(arr[mid:])
    # Merge the two sorted halves; <= takes from L on ties, keeping the sort stable.
    result, i, j = [], 0, 0
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:
            result.append(L[i])
            i += 1
        else:
            result.append(R[j])
            j += 1
    return result + L[i:] + R[j:]
```
Quick Sort & Pivot Choice
Quick sort’s worst case (O(n²)) occurs when the pivot is always the smallest or largest element (sorted input with first-element pivot). Fix: random pivot or median-of-three.
```python
import random

def quicksort(arr):
    if len(arr) <= 1:
        return arr
    # Random pivot: no fixed input can reliably trigger the O(n^2) case.
    pivot = arr[random.randint(0, len(arr) - 1)]
    lo = [x for x in arr if x < pivot]
    mid = [x for x in arr if x == pivot]
    hi = [x for x in arr if x > pivot]
    return quicksort(lo) + mid + quicksort(hi)
```
Non-Comparison: Counting Sort
Counting sort runs in O(n + k), where k is the size of the value range. It makes no comparisons, so the Ω(n log n) lower bound does not apply.
```python
def counting_sort(arr, max_val):
    count = [0] * (max_val + 1)
    for x in arr:
        count[x] += 1
    return [x for x, c in enumerate(count) for _ in range(c)]
```
Practice
1. Python's built-in sort uses which algorithm, and why?
Timsort — a hybrid of merge sort and insertion sort.
Insertion sort is used for small runs (very fast for nearly-sorted data
and small arrays due to low overhead). Merge sort merges runs together.
Result: O(n) best case (sorted input), O(n log n) worst, stable, O(n) space.
2. When is counting sort impractical?
When k (the value range) is very large relative to n.
E.g., sorting 100 integers with values up to 10^9 requires a 10^9-element
count array — 4 GB of memory. Use comparison sort instead.
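The memory arithmetic above can be sketched (helper name is illustrative; it assumes 4-byte counters, and a CPython list of ints actually costs considerably more, so this is a lower bound):

```python
def count_array_bytes(k, bytes_per_counter=4):
    # Counting sort needs one counter per possible value: k counters total.
    return k * bytes_per_counter

# k = 10^9 -> 4 * 10^9 bytes (~3.7 GiB) just to sort 100 numbers.
print(count_array_bytes(10**9))
```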
Knowledge Check
The Ω(n log n) lower bound for comparison-based sorting means
Code length and time complexity are unrelated.
Correct — decision-tree proof: distinguishing n! orderings requires at least log₂(n!) = Ω(n log n) comparisons.
Radix/counting sort are non-comparison sorts; they beat the bound.
Quick sort is O(n log n) on average but O(n²) worst case.
Recap: comparison-based lower bound = Ω(n log n). Non-comparison sorts (counting, radix, bucket) can run in O(n + k), but only under constraints on the input (e.g., integers in a bounded range).
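Radix sort, named in the recap but not shown above, beats the bound by running one stable counting pass per digit, for O(d · (n + b)) total with d digits in base b. A minimal LSD sketch for non-negative integers:

```python
def radix_sort(arr, base=10):
    """LSD radix sort for non-negative integers.

    Each pass buckets by one digit; appending in bucket order is a stable
    pass, so earlier-digit ordering is preserved across passes.
    """
    if not arr:
        return []
    max_val = max(arr)
    result = list(arr)
    exp = 1
    while max_val // exp > 0:
        buckets = [[] for _ in range(base)]
        for x in result:
            buckets[(x // exp) % base].append(x)
        result = [x for bucket in buckets for x in bucket]
        exp *= base
    return result

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```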
Merge sort is preferred over quick sort when
Merge sort uses O(n) space; heap sort is in-place.
Correct — merge sort is stable and always O(n log n).
Insertion sort is best for nearly-sorted; merge sort doesn't exploit it.
That's the scenario for counting sort.
Recap: merge sort = stable, O(n log n) guaranteed, O(n) space. Quick sort = faster in practice (cache-friendly), O(log n) stack space, but O(n²) worst case and not stable.
Quick sort's worst case O(n²) occurs when
Equal elements cause many comparisons but not necessarily O(n²) with good implementation.
Correct — one partition gets all elements; recursion depth = n. Sorted/reverse-sorted input with first-element pivot triggers this.
Array size doesn't cause worst case; pivot choice does.
Stack overflow is a practical concern for very large n, not the theoretical worst case.
Recap: worst case = pivot always splits into 0 and n-1 elements. T(n) = T(n-1) + O(n) = O(n²). Fix: random pivot or median-of-three.
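The median-of-three fix from the recap can be sketched as a pivot-selection helper (name is illustrative): it takes the median of the first, middle, and last elements, so an already-sorted range yields a balanced split instead of the 0/(n-1) worst case.

```python
def median_of_three(arr, lo, hi):
    """Return the index of the median of arr[lo], arr[mid], arr[hi]."""
    mid = (lo + hi) // 2
    a, b, c = arr[lo], arr[mid], arr[hi]
    if a <= b <= c or c <= b <= a:
        return mid
    if b <= a <= c or c <= a <= b:
        return lo
    return hi

# On sorted input the chosen pivot is the middle element, splitting evenly:
print(median_of_three([1, 2, 3, 4, 5], 0, 4))  # 2
```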
A stable sorting algorithm
Stability and complexity are unrelated. Insertion sort is stable AND O(n²).
Correct — if a=b and a came before b in input, a stays before b in output.
Stability is a property of any sort over comparable elements.
Space complexity is unrelated to stability.
Recap: stable sort = equal elements keep their original order. Matters when sorting objects by one key that were already sorted by another key (e.g., sort employees by dept, then by name).
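The recap's two-key scenario in code, using Python's built-in sort, which is stable (the data is made up for illustration):

```python
# Employees arrive sorted by name; a stable sort by department keeps
# names in order within each department.
employees = [("Ann", "Sales"), ("Bob", "IT"), ("Cy", "Sales"), ("Di", "IT")]
by_dept = sorted(employees, key=lambda e: e[1])
print(by_dept)
# [('Bob', 'IT'), ('Di', 'IT'), ('Ann', 'Sales'), ('Cy', 'Sales')]
```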
Counting sort is O(n + k). The k refers to
Counting sort makes no comparisons.
Correct — you need a count array of size k.
Counting sort is not recursive.
Equal elements affect individual counts but don't change the O(n+k) bound.
Recap: counting sort allocates an array of size k to count occurrences. O(n) to count + O(k) to output = O(n+k). Practical when k is small (e.g., ages 0–120, ASCII codes 0–127).