More Sorting — Selection, Insertion & Merge Sort
Master different ways to put things in order and learn when to use each one!
Real-world analogy
Imagine you're a coach lining up kids by height for a class photo. With Selection Sort, you scan the whole line, find the shortest kid, and move them to the front — then find the next shortest, and so on. With Insertion Sort, it's like sorting playing cards: you pick up one card at a time and slide it into the right spot in your hand. Merge Sort is like splitting the class into two groups, having each group sort themselves, then carefully merging them back into one perfect line. Each method works, but some are faster when the line gets really long!
What is it?
This lesson covers four important sorting algorithms beyond Bubble Sort: Selection Sort (find the min, swap to front), Insertion Sort (insert each element in its place), Merge Sort (split, sort, merge), and Counting Sort (count occurrences, place in order). Each has different strengths depending on the data you're sorting.
Real-world relevance
Sorting algorithms are everywhere! Your music app uses efficient sorting to arrange songs by name or date. Libraries sort books by call number using methods similar to Insertion Sort. Even mail sorting machines use Counting Sort-like approaches to sort letters by zip code into bins.
Key points
- Selection Sort — Find the Smallest First — Selection Sort works by scanning through the entire list to find the smallest element, then swapping it to the front. Then it finds the second-smallest and puts it in position two, and so on. It's simple to understand but not the fastest — it always takes O(n²) time no matter what.
- Insertion Sort — Like Sorting Playing Cards — Insertion Sort picks up one element at a time and slides it into its correct position among the already-sorted elements. Think of it like sorting cards in your hand — you pick up a new card and insert it where it belongs. It's really fast when the list is almost sorted already!
- Merge Sort — Divide and Conquer Magic — Merge Sort splits the list in half, sorts each half separately, then merges them back together in order. This 'divide and conquer' approach is super efficient — it always runs in O(n log n) time. The trade-off is it needs extra memory to hold the temporary halves.
- Counting Sort — Count and Place — Counting Sort doesn't compare elements at all! Instead, it counts how many of each value you have, then places them in order based on those counts. It's blazing fast — O(n + k) where k is the range of values — but only works well when your numbers are in a small range.
- Comparing Time Complexities — Bubble Sort, Selection Sort, and Insertion Sort are all O(n²) — they get slow with big lists. Merge Sort is O(n log n) — much faster for large data. Counting Sort is O(n + k) — the fastest of all when the value range is small. Knowing these speeds helps you pick the right tool!
- When Selection Sort Shines — Selection Sort is great when you need the simplest possible algorithm and the list is small. It always does the same number of comparisons no matter what, and it performs very few swaps — at most n-1 in total. Use it when swapping is expensive but comparing is cheap.
- When Insertion Sort Shines — Insertion Sort is the champion for nearly-sorted data — it can run in almost O(n) time! It's also great for small lists and is used inside more complex algorithms as a base case. Many real-world sorting implementations switch to Insertion Sort for small sub-arrays.
- Stability in Sorting — A 'stable' sort keeps equal elements in their original order. Insertion Sort and Merge Sort are stable, but Selection Sort is NOT stable by default. In competitive programming, stability matters when you sort by one field but want to keep the order from a previous sort.
- Which Sort to Use in CP? — In competitive programming, you'll almost always use the built-in sort() which is O(n log n). But understanding these algorithms helps you solve problems that ASK about sorting, or when you need a custom approach. Sometimes Counting Sort is the secret weapon for problems with small value ranges!
Code example
#include <bits/stdc++.h>
using namespace std;

// Selection Sort: find minimum, swap to front
void selectionSort(vector<int>& a) {
    int n = a.size();
    for (int i = 0; i < n - 1; i++) {
        int minIdx = i; // assume current is smallest
        for (int j = i + 1; j < n; j++) {
            if (a[j] < a[minIdx]) minIdx = j; // found smaller
        }
        swap(a[i], a[minIdx]); // put smallest at position i
    }
}

// Insertion Sort: insert each element in correct spot
void insertionSort(vector<int>& a) {
    int n = a.size();
    for (int i = 1; i < n; i++) {
        int key = a[i]; // card to insert
        int j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j]; // shift bigger elements right
            j--;
        }
        a[j + 1] = key; // place card in correct spot
    }
}

int main() {
    vector<int> v1 = {64, 25, 12, 22, 11};
    vector<int> v2 = v1; // copy for second sort

    // Test Selection Sort
    selectionSort(v1);
    cout << "Selection Sort: ";
    for (int x : v1) cout << x << " ";
    cout << endl;

    // Test Insertion Sort
    insertionSort(v2);
    cout << "Insertion Sort: ";
    for (int x : v2) cout << x << " ";
    cout << endl;
    // Output: 11 12 22 25 64 (both give same result!)
    return 0;
}
Line-by-line walkthrough
- 1. We include bits/stdc++.h for all standard library tools and use namespace std for convenience.
- 2. The selectionSort function takes a vector by reference so it modifies the original array directly.
- 3. The outer loop (i) represents which position we're filling next — starting from position 0.
- 4. For each position i, we assume it holds the minimum (minIdx = i), then scan the rest of the array to find if there's anything smaller.
- 5. If we find a smaller element at position j, we update minIdx. After scanning, we swap the minimum into position i.
- 6. The insertionSort function also takes a vector by reference. It starts from index 1 (index 0 is already 'sorted' by itself).
- 7. For each element at position i, we save it as 'key' — this is the card we need to insert into the right place.
- 8. The while loop shifts all elements bigger than key one position to the right, making room for key.
- 9. Once we find the right spot (where a[j] is not bigger than key, or we reach the beginning), we place key there.
- 10. In main(), we test both sorts on the same data and print the results — both give the same sorted output!
Spot the bug
#include <bits/stdc++.h>
using namespace std;

void selectionSort(vector<int>& a) {
    int n = a.size();
    for (int i = 0; i < n; i++) {
        int minIdx = i;
        for (int j = i + 1; j < n; j++) {
            if (a[j] < a[minIdx]) minIdx = j;
        }
        swap(a[i], a[minIdx]);
    }
}

int main() {
    vector<int> v = {5, 3, 1, 4, 2};
    selectionSort(v);
    for (int x : v) cout << x << " ";
}
Need a hint?
This code actually works correctly, but it does one unnecessary iteration. Look at the outer loop's condition — when i reaches the last element, is there anything left to compare or swap?
Show answer
The outer loop should be 'i < n - 1' instead of 'i < n'. When i is at the last element (n-1), there are no elements after it to compare with, so that last iteration is wasted work. It still gives the correct answer, but in competitive programming, every tiny optimization counts! The fix: change 'i < n' to 'i < n - 1'.
Explain like I'm 5
Imagine you have a messy pile of toy blocks with numbers on them and you want to line them up from smallest to biggest. One way is to keep looking through the whole pile for the smallest block and put it at the front — that's Selection Sort. Another way is to pick up one block at a time and slide it into the right spot, just like when you sort cards in a game — that's Insertion Sort. The cleverest way is to split your pile into two smaller piles, sort each one, then carefully put them together like a zipper — that's Merge Sort, and it's the fastest for big piles!
Fun fact
The famous Timsort algorithm (used in Python and Java) is actually a hybrid of Merge Sort and Insertion Sort! It was invented by Tim Peters in 2002 and takes advantage of the fact that real-world data often has runs of already-sorted elements.
Hands-on challenge
Implement Counting Sort for an array where all values are between 0 and 100. Then test it with the array {4, 2, 2, 8, 3, 3, 1}. Bonus: Can you modify your Counting Sort to work with negative numbers by shifting all values up?
More resources
- Sorting Algorithms Visualized (YouTube)
- Sorting Algorithms - CP-Algorithms (CP-Algorithms)
- CSES Sorting and Searching (CSES)