Selection Sort
Selection sort repeatedly selects the minimum element from the unsorted portion and moves it to the front. Unlike bubble sort, it minimises swaps — but it's input-blind: it always does the same number of comparisons.
💡 The Core Idea
Scan the entire unsorted portion (arr[i..n-1]) to find the index of the smallest element.
Swap the found minimum with arr[i] — the first unsorted position. One swap per pass.
The sorted region grows by one element per pass. After n−1 passes, the entire array is sorted.
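The three steps above can be sketched in a few lines of Python. The `selection_sort_trace` helper is purely illustrative (it is not part of the implementation later in this page); it records a snapshot after each pass so the growing sorted region is visible:

```python
def selection_sort_trace(arr):
    """Return a snapshot of the array after each pass (illustrative helper)."""
    a = list(arr)
    n = len(a)
    snapshots = []
    for i in range(n - 1):
        min_idx = min(range(i, n), key=lambda j: a[j])  # step 1: scan a[i..n-1] for the min
        a[i], a[min_idx] = a[min_idx], a[i]             # step 2: one swap per pass
        snapshots.append(list(a))                       # sorted region is now a[:i+1]
    return snapshots

# Each snapshot shows the sorted prefix growing by one element per pass:
for snap in selection_sort_trace([5, 2, 4, 1, 3]):
    print(snap)
```

Note the second snapshot equals the first: when the minimum is already in place, the "swap" is with itself and nothing moves.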
All Cases — Step by Step
😰 Worst Case — Reversed Array
Worst case: Reversed array [5, 4, 3, 2, 1]
Selection sort's comparison count doesn't change between best and worst case: it always performs n(n−1)/2 comparisons. Among the inputs shown here, a reversed array forces the most swaps (⌊n/2⌋ of them, as opposite ends swap into place in pairs).
⚠️ Best Case — Already Sorted (still O(n²)!)
Best case: Already sorted [1, 2, 3, 4, 5]
This is NOT O(n) for selection sort! Unlike bubble sort, it has no early exit. It still does n(n−1)/2 comparisons.
⚡ Average Case — Nearly Sorted
Nearly sorted: [1, 2, 4, 3, 5]
Nearly sorted arrays give no advantage to selection sort. It still scans exhaustively each pass — only 1 swap needed, but still O(n²) comparisons.
🟰 Edge Case — All Equal
All equal: [3, 3, 3, 3, 3]
All equal elements mean no swap is ever needed (the minimum is always the current position). But comparisons still happen.
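All four cases can be verified with an instrumented version of the algorithm. The `selection_sort_counts` helper below is an illustrative sketch (the name is mine, not from the original) that counts comparisons and swaps for each input:

```python
def selection_sort_counts(arr):
    """Run selection sort on a copy of arr; return (comparisons, swaps)."""
    a = list(arr)
    n = len(a)
    comparisons = swaps = 0
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            comparisons += 1            # every unsorted element is checked, always
            if a[j] < a[min_idx]:
                min_idx = j
        if min_idx != i:                # swap only when the min is out of place
            a[i], a[min_idx] = a[min_idx], a[i]
            swaps += 1
    return comparisons, swaps

for label, data in [("reversed", [5, 4, 3, 2, 1]),
                    ("sorted", [1, 2, 3, 4, 5]),
                    ("nearly sorted", [1, 2, 4, 3, 5]),
                    ("all equal", [3, 3, 3, 3, 3])]:
    print(label, selection_sort_counts(data))
```

For n=5 every input reports exactly 10 comparisons (= n(n−1)/2); only the swap count varies: 2 for reversed, 1 for nearly sorted, 0 for sorted and all-equal.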
Instability Explained
Selection sort is NOT stable: non-adjacent swaps can flip the relative order of equal elements. Take an array with two elements of value 3, 3ᵃ (original index 0) and 3ᵇ (original index 2), e.g. [3ᵃ, 4, 3ᵇ, 1]. A stable sort preserves their order, but selection sort's first pass swaps 3ᵃ to the far end, past 3ᵇ, and the final result is [1, 3ᵇ, 3ᵃ, 4].
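This is easy to demonstrate by tagging equal keys with labels. The sketch below (helper name is mine) sorts (value, label) pairs but compares values only, so the labels reveal whether equal elements kept their order:

```python
def selection_sort_pairs(pairs):
    """Selection sort on (value, label) pairs, comparing by value only."""
    a = list(pairs)
    n = len(a)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            if a[j][0] < a[min_idx][0]:   # compare values; labels are just markers
                min_idx = j
        if min_idx != i:
            a[i], a[min_idx] = a[min_idx], a[i]
    return a

data = [(3, "a"), (4, "x"), (3, "b"), (1, "y")]  # 3a at index 0, 3b at index 2
print(selection_sort_pairs(data))                # 3b now precedes 3a: unstable
```

The very first pass swaps (3, "a") with (1, "y") at the far end, carrying it past (3, "b"). Python's built-in `sorted(data)` would keep (3, "a") first.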
Selection Sort's Key Property — Fewest Swaps
✅ At most n−1 swaps
Selection sort makes at most one swap per pass and zero swaps when the min is already in place. Total swaps ≤ n−1.
Pass 2: find min → 1 swap
Pass 3: already in place → 0 swaps
Max total: n−1 swaps
❌ Always O(n²) comparisons
There's no way to know the minimum without checking every unsorted element. No early exit. Input order doesn't matter.
Pass 2: n−2 comparisons
...
Total: n(n−1)/2 always
💡 When does this matter? When writes are extremely expensive (e.g. flash memory, EEPROM), selection sort's minimal swaps can be a practical advantage over algorithms with more writes — even though comparisons are still O(n²).
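To make the write-cost argument concrete, here is a sketch (helper names are mine) comparing swap counts for selection sort and bubble sort on the same reversed input:

```python
def selection_swaps(arr):
    """Count swaps selection sort performs (skipping no-op swaps)."""
    a = list(arr)
    n = len(a)
    swaps = 0
    for i in range(n - 1):
        m = min(range(i, n), key=lambda j: a[j])
        if m != i:
            a[i], a[m] = a[m], a[i]
            swaps += 1
    return swaps

def bubble_swaps(arr):
    """Count swaps bubble sort performs (one per inversion fixed)."""
    a = list(arr)
    n = len(a)
    swaps = 0
    for i in range(n - 1):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swaps += 1
    return swaps

data = list(range(100, 0, -1))    # reversed, n=100
print(selection_swaps(data))      # 50: opposite ends pair up, well under n-1 = 99
print(bubble_swaps(data))         # 4950 = n(n-1)/2, one swap per inversion
```

On write-limited hardware, 50 writes versus 4950 is the entire case for selection sort.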
Complexity Deep Dive
| Case | Input | Comparisons | Swaps | Time | Why |
|---|---|---|---|---|---|
| 😰 Worst | Reversed | n(n−1)/2 | ⌊n/2⌋ | O(n²) | Opposite ends swap into place in pairs; total swaps always ≤ n−1 |
| 📊 Average | Random | n(n−1)/2 | O(n) | O(n²) | Same comparisons; nearly every pass needs a swap |
| ⚠️ Best | Already sorted | n(n−1)/2 | 0 | O(n²) | No swaps, but comparisons are still exhaustive |
| 🟰 Equal | All equal | n(n−1)/2 | 0 | O(n²) | Strict < means no swaps; comparisons unavoidable |
| — | Space (all cases) | — | — | O(1) | In-place — no extra array needed |
⚠️ Key insight: Selection sort always does exactly n(n−1)/2 comparisons — no variation based on input. This makes it predictable but also means it can't exploit nearly-sorted data like bubble or insertion sort can.
Implementation
```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):              # n-1 passes
        min_idx = i                     # assume current position is min
        for j in range(i + 1, n):       # scan the unsorted portion
            if arr[j] < arr[min_idx]:
                min_idx = j             # found a smaller element
        if min_idx != i:                # only swap if needed
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

# Test all cases
print(selection_sort([64, 34, 25, 12]))  # Random → [12, 25, 34, 64]
print(selection_sort([1, 2, 3, 4]))      # Sorted → [1, 2, 3, 4] (still O(n²)!)
print(selection_sort([4, 3, 2, 1]))      # Reversed → [1, 2, 3, 4] (O(n²))
print(selection_sort([]))                # Empty → []
```

```typescript
function selectionSort(arr: number[]): number[] {
  const n = arr.length;
  for (let i = 0; i < n - 1; i++) {      // n-1 passes
    let minIdx = i;                      // assume position i holds the min
    for (let j = i + 1; j < n; j++) {    // scan unsorted portion
      if (arr[j] < arr[minIdx]) {
        minIdx = j;                      // update minimum tracker
      }
    }
    if (minIdx !== i) {                  // swap only if min isn't already in place
      [arr[i], arr[minIdx]] = [arr[minIdx], arr[i]];
    }
  }
  return arr;
}

// Edge cases
selectionSort([]);              // [] — 0 iterations
selectionSort([42]);            // [42] — single element, already sorted
selectionSort([3, 3, 3]);       // [3,3,3] — equal, 0 swaps (but still O(n²) comparisons!)
selectionSort([5, 4, 3, 2, 1]); // reversed — O(n²) comparisons, ⌊n/2⌋ swaps with the in-place check
```

Edge Cases to Handle
Empty array []
O(1): n=0, the outer loop never runs. Return immediately.
Single element [x]
O(1): n=1 means n−1=0 passes. Nothing to compare; already sorted.
Two elements [a, b]
O(1): one comparison, at most one swap. The simplest non-trivial case.
All equal [x, x, x]
O(n²): zero swaps, since the min is always already at position i, but comparisons still happen.
Already sorted
O(n²): zero swaps, but still O(n²) comparisons. No early exit possible!
Reversed array
O(n²): the most swaps of these cases (⌊n/2⌋), yet comparisons are the same O(n²) as sorted.
When to Use (and Not Use)
✅ Good for
- • Minimising writes — at most n−1 swaps; ideal when write cost is high (flash memory, EEPROM)
- • Teaching — simple to reason about: find min, place it, repeat
- • Tiny arrays (n < 10) — overhead of complex algorithms isn't worth it
- • Predictable performance — always exactly n(n−1)/2 comparisons, no surprises
❌ Avoid when
- • Stability is required — use Merge Sort or Insertion Sort instead
- • Nearly sorted data — no early exit means wasted work vs. bubble/insertion
- • Large datasets — O(n²) comparisons are catastrophic at n=100,000+
- • Production code — Tim Sort (Python/Java) is almost always faster in practice
- • Unknown or variable input — insertion sort adapts better to real-world data
Selection Sort vs Others
| Algorithm | Best | Average | Worst | Space | Stable | Notes |
|---|---|---|---|---|---|---|
| 🎯 Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | ❌ | Fewest swaps; input-blind |
| 🫧 Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | ✅ | Early exit on sorted data |
| 🃏 Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | ✅ | Best for nearly sorted data |
| 🔀 Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | ✅ | Best stable sort for large n |
| ⚡ Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | ❌ | Fastest in practice |
| 🐍 Tim Sort | O(n) | O(n log n) | O(n log n) | O(n) | ✅ | Python built-in; best all-rounder |
💡 Insertion Sort beats Selection Sort in almost every practical scenario — it's adaptive (O(n) best case) and stable. The one edge case where Selection Sort wins: when the cost of writes vastly exceeds the cost of reads, since insertion sort's shifting averages O(n²) writes versus selection sort's at most n−1 swaps.