Sorting Algorithms
Sorting is one of the most studied problems in computer science. Choosing the right algorithm depends on your data size, memory budget, stability requirements, and value distribution. Click any card below to dive into its full interactive walkthrough.
Basic Sorts
Simple, educational — great starting point
Bubble Sort
Swap neighbors until sorted
Repeatedly compares adjacent pairs and swaps them if out of order. Larger elements 'bubble up' to the end each pass.
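A minimal Python sketch of the pass structure (the `swapped` flag gives the early exit, which is where the O(n) best case on already-sorted input comes from):

```python
def bubble_sort(a):
    """Sort a list in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):          # the last i elements are already in place
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                      # no swaps: already sorted, stop early
            break
    return a
```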
Selection Sort
Find min, place at front
Scans the unsorted portion, finds the minimum, and swaps it into place at the front. Makes at most n − 1 swaps.
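A sketch of the scan-and-swap loop in Python:

```python
def selection_sort(a):
    """Sort in place: each pass swaps the minimum of the unsorted suffix to the front."""
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):       # scan the unsorted suffix for the minimum
            if a[j] < a[m]:
                m = j
        if m != i:
            a[i], a[m] = a[m], a[i]     # one swap per pass, at most n - 1 total
    return a
```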
Insertion Sort
Build sorted list one by one
Takes each element and inserts it into its correct position in the already-sorted prefix. Excellent on nearly-sorted data.
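The insert-into-sorted-prefix step, sketched in Python (on nearly-sorted input the inner loop barely moves, hence the O(n) best case):

```python
def insertion_sort(a):
    """Sort in place by growing a sorted prefix one element at a time."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:    # shift larger elements one slot right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                  # drop key into its slot in the sorted prefix
    return a
```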
Efficient Sorts
Used in real applications — optimal complexity
Merge Sort
Divide, sort, merge
Recursively splits the array in half, sorts each half, then merges them back. Guaranteed O(n log n) in every case.
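A compact recursive sketch in Python (this version returns a new list rather than sorting in place; the `<=` in the merge is what keeps it stable):

```python
def merge_sort(a):
    """Split in half, sort each half recursively, merge the sorted halves."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge: take the smaller head each step
        if left[i] <= right[j]:               # <= keeps equal elements in order (stable)
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]         # one side is empty; append the rest
```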
Quick Sort
Pivot, partition, conquer
Picks a pivot, partitions elements into smaller and larger groups, then recursively sorts both sides. Often the fastest comparison sort in practice thanks to in-place, cache-friendly partitioning.
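A sketch of the idea in Python. Note this list-building version trades the in-place partition (the source of quicksort's cache efficiency) for readability; the random pivot makes the O(n²) worst case unlikely on any particular input:

```python
import random

def quick_sort(a):
    """Pivot, partition into smaller/equal/larger, recurse on both sides."""
    if len(a) <= 1:
        return a
    pivot = random.choice(a)                 # random pivot guards against bad inputs
    smaller = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    larger  = [x for x in a if x > pivot]
    return quick_sort(smaller) + equal + quick_sort(larger)
```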
Heap Sort
Heap structure, extract max
Builds a max-heap, then repeatedly extracts the maximum into sorted position. In-place with guaranteed O(n log n).
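The two phases (bottom-up heap construction, then repeated extract-max) sketched in Python:

```python
def heap_sort(a):
    """In-place heap sort: build a max-heap, then swap the max into the sorted suffix."""
    n = len(a)

    def sift_down(root, end):
        """Restore the max-heap property for the subtree at `root`, within a[:end]."""
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and a[child + 1] > a[child]:
                child += 1                          # pick the larger child
            if a[root] >= a[child]:
                return
            a[root], a[child] = a[child], a[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):             # build the max-heap bottom-up: O(n)
        sift_down(i, n)
    for end in range(n - 1, 0, -1):                 # move the max to the sorted suffix
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)
    return a
```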
Non-Comparison Sorts
Beat the O(n log n) barrier with special tricks
Counting Sort
Count occurrences, reconstruct
Counts how many times each value appears, then reconstructs the sorted array. Blazing fast when the value range k is small.
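A sketch for non-negative integers below `k`. This simple count-and-rebuild variant works for bare integers only; the prefix-sum variant (used inside radix sort) is the one that is stable and can carry whole records:

```python
def counting_sort(a, k):
    """Sort non-negative integers in [0, k). O(n + k) time, O(k) extra space."""
    counts = [0] * k
    for x in a:
        counts[x] += 1              # tally each value
    out = []
    for value, c in enumerate(counts):
        out.extend([value] * c)     # reconstruct in value order
    return out
```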
Radix Sort
Sort digit by digit
Sorts integers digit by digit from least to most significant, using a stable sort (like counting sort) for each digit.
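A least-significant-digit sketch in Python, using stable bucket distribution for each digit pass (appending to per-digit lists in input order is what makes each pass stable):

```python
def radix_sort(a, base=10):
    """LSD radix sort for non-negative integers: one stable pass per digit."""
    if not a:
        return a
    digit = 1
    while digit <= max(a):                       # one pass per digit position
        buckets = [[] for _ in range(base)]
        for x in a:                              # stable: input order kept within a bucket
            buckets[(x // digit) % base].append(x)
        a = [x for b in buckets for x in b]      # concatenate buckets, then next digit
        digit *= base
    return a
```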
Bucket Sort
Distribute into buckets, sort each
Scatters elements into buckets based on value range, sorts each bucket (often with insertion sort), then concatenates.
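A sketch assuming floats uniformly distributed in [0, 1), the textbook setting; each bucket is sorted with Python's built-in `sorted` here for brevity, where textbooks typically use insertion sort:

```python
def bucket_sort(a, n_buckets=10):
    """Scatter floats in [0, 1) into buckets by value, sort each, concatenate."""
    buckets = [[] for _ in range(n_buckets)]
    for x in a:
        buckets[int(x * n_buckets)].append(x)   # bucket index from the value itself
    out = []
    for b in buckets:
        out.extend(sorted(b))                   # each bucket is small on uniform data
    return out
```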
Special Mentions
Hybrid or niche — worth knowing
Tim Sort
Merge + Insertion hybrid
Detects naturally sorted 'runs', uses insertion sort on small runs, then merges with a highly optimized merge. The real-world king.
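A toy sketch of the hybrid idea only: insertion-sort fixed-size runs, then merge them bottom-up. The real Timsort additionally detects natural runs, maintains a run stack with merge invariants, and uses a galloping merge:

```python
def simplified_timsort(a, min_run=32):
    """Toy Timsort: insertion-sort fixed runs, then merge runs pairwise."""
    a = list(a)
    n = len(a)
    for start in range(0, n, min_run):            # phase 1: sort each small run
        end = min(start + min_run, n)
        for i in range(start + 1, end):
            key, j = a[i], i - 1
            while j >= start and a[j] > key:
                a[j + 1] = a[j]
                j -= 1
            a[j + 1] = key
    width = min_run
    while width < n:                              # phase 2: bottom-up stable merge
        for lo in range(0, n, 2 * width):
            mid, hi = min(lo + width, n), min(lo + 2 * width, n)
            merged, i, j = [], lo, mid
            while i < mid and j < hi:
                if a[i] <= a[j]:                  # <= keeps the merge stable
                    merged.append(a[i]); i += 1
                else:
                    merged.append(a[j]); j += 1
            merged += a[i:mid] + a[j:hi]
            a[lo:hi] = merged
        width *= 2
    return a
```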
Shell Sort
Gap-based insertion sort
Generalizes insertion sort by sorting elements far apart first, then reducing the gap. Bridges basic and efficient sorts.
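A sketch using Shell's original gap sequence (halving each round); better-studied sequences like Ciura's improve the constants:

```python
def shell_sort(a):
    """Gapped insertion sort with a shrinking gap; gap 1 is plain insertion sort."""
    gap = len(a) // 2
    while gap > 0:
        for i in range(gap, len(a)):
            key, j = a[i], i
            while j >= gap and a[j - gap] > key:   # insertion sort over gap-strided slots
                a[j] = a[j - gap]
                j -= gap
            a[j] = key
        gap //= 2                                  # Shell's original sequence: n/2, n/4, ..., 1
    return a
```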
Why can't comparison sorts beat O(n log n)?
Any algorithm that sorts by comparing pairs must make at least log₂(n!) comparisons in the worst case. By Stirling's approximation, log₂(n!) ≈ n log₂ n − n log₂ e, which is Θ(n log n). This is a proven theoretical lower bound: no comparison sort can do asymptotically better. Non-comparison sorts (Counting, Radix, Bucket) sidestep it by exploiting extra structure in the data, such as a bounded integer range.
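The counting argument behind the bound can be written out. A comparison sort is a binary decision tree: each internal node is one comparison, and each leaf must correspond to one of the n! possible input orderings.

```latex
% A binary decision tree of height h has at most 2^h leaves,
% and the tree must distinguish all n! permutations:
2^h \ge n! \quad\Longrightarrow\quad h \ge \log_2(n!)
% Stirling's approximation gives the asymptotics:
\log_2(n!) = n\log_2 n - n\log_2 e + O(\log n) = \Theta(n \log n)
```

The height h is exactly the worst-case number of comparisons, so every comparison sort needs Ω(n log n) comparisons on some input.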
Full Complexity Comparison
| Algorithm | Best | Average | Worst | Space | Stable | Fast? |
|---|---|---|---|---|---|---|
| 🫧Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | ✅ | ❌ |
| 🎯Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | ❌ | ❌ |
| 🃏Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | ✅ | ❌ |
| 🔀Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | ✅ | 🔥 |
| ⚡Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | ❌ | 🔥 |
| 🌲Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | ❌ | 🔥 |
| 🧮Counting Sort | O(n+k) | O(n+k) | O(n+k) | O(k) | ✅ | 🔥 |
| 🔢Radix Sort | O(nk) | O(nk) | O(nk) | O(n+k) | ✅ | 🔥 |
| 🪣Bucket Sort | O(n+k) | O(n+k) | O(n²) | O(n+k) | ✅ | 🔥 |
| 🐍Tim Sort | O(n) | O(n log n) | O(n log n) | O(n) | ✅ | 🔥 |
| 🐚Shell Sort | O(n log n) | depends on gaps | O(n²) | O(1) | ❌ | ❌ |
How to Pick the Right Sort
No single sort wins everything. Here's a practical decision guide:
📌 Small dataset (< 20 items)
🃏Insertion Sort
Low overhead, excellent on tiny or nearly-sorted data
📌 Need guaranteed O(n log n)
🔀Merge Sort
No worst-case degeneracy, always splits evenly
📌 In-place, fast in practice
⚡Quick Sort
Best cache performance, tiny constant factors
📌 Integers in small range
🧮Counting Sort
Beats comparison lower bound — O(n+k) is linear
📌 Need stability (e.g. sort by name then age)
🐍Tim Sort / Merge Sort
Equal elements preserve original relative order
📌 Streaming / online data
🃏Insertion Sort
Can insert new element into sorted prefix immediately
📌 Memory constrained
🌲Heap Sort
O(1) extra space + guaranteed O(n log n)
📌 Uniformly distributed floats
🪣Bucket Sort
Average O(n) when distribution is uniform
What is Stability?
A sort is stable if equal elements maintain their original relative order after sorting.
Example: You have a list of students sorted by name. You then sort by grade. A stable sort will keep students with the same grade in alphabetical order (their previous order). An unstable sort may scramble them.
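That scenario runs directly in Python, whose built-in `sorted` is stable (it is Timsort). The student records here are made up for illustration:

```python
# Hypothetical records: (name, grade), already in alphabetical order by name.
students = [("Ava", "B"), ("Ben", "A"), ("Cara", "B"), ("Dan", "A")]

# A stable sort by grade keeps ties (same grade) in their previous order,
# so within each grade the names stay alphabetical.
by_grade = sorted(students, key=lambda s: s[1])
# → [("Ben", "A"), ("Dan", "A"), ("Ava", "B"), ("Cara", "B")]
```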