Thoughts To Pen

Selection Sort & Insertion Sort

28 Jan 2026

In the previous post, we learned Bubble Sort — the simplest comparison sort. Now let’s look at two more $O(n^2)$ algorithms that each use a different strategy:

  • Selection Sort — finds the minimum element and places it at the front.
  • Insertion Sort — builds a sorted section one element at a time, inserting each new element into its correct position.

Both are simple to implement and excellent for understanding fundamental sorting mechanics.


Selection Sort

How It Works

Imagine you have a pile of coins of different sizes on a table. To sort them:

  1. Scan the entire pile and pick out the smallest coin. Place it in a new row as the first coin.
  2. Scan the remaining pile, find the next smallest coin, and place it second.
  3. Repeat until no coins remain in the pile.

In code, instead of using a separate row, we swap the smallest element with the element at the current starting position.

Algorithm Steps

  1. Start at index 0. Scan the rest of the array to find the minimum element.
  2. Swap the minimum with the element at index 0.
  3. Move to index 1. Scan from index 1 onward for the minimum.
  4. Swap it into index 1.
  5. Repeat for every index until the array is sorted.

Step-by-Step Walkthrough

Let’s sort [64, 25, 12, 22, 11].

Pass 1 — Find minimum in [64, 25, 12, 22, 11]

  • Minimum is 11 at index 4.
  • Swap arr[0] (64) with arr[4] (11).
  • Result: [11, 25, 12, 22, 64] ✅ Position 0 is set.

Pass 2 — Find minimum in [25, 12, 22, 64] (indices 1–4)

  • Minimum is 12 at index 2.
  • Swap arr[1] (25) with arr[2] (12).
  • Result: [11, 12, 25, 22, 64] ✅ Position 1 is set.

Pass 3 — Find minimum in [25, 22, 64] (indices 2–4)

  • Minimum is 22 at index 3.
  • Swap arr[2] (25) with arr[3] (22).
  • Result: [11, 12, 22, 25, 64] ✅ Position 2 is set.

Pass 4 — Find minimum in [25, 64] (indices 3–4)

  • Minimum is 25 at index 3 (already in place).
  • Result: [11, 12, 22, 25, 64] ✅ Positions 3 and 4 are set.

Final sorted array: [11, 12, 22, 25, 64] ✅
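The four passes above can be reproduced with a small trace program. This is a standalone sketch (the `SelectionSortTrace` class and `sortWithTrace` method are illustrative names of ours) that prints the array after each pass:

```java
import java.util.Arrays;

public class SelectionSortTrace {

    public static void sortWithTrace(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            // Scan the unsorted portion arr[i..n-1] for the minimum
            int minIndex = i;
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[j] < arr[minIndex]) minIndex = j;
            }
            // Swap the minimum into position i
            int temp = arr[i];
            arr[i] = arr[minIndex];
            arr[minIndex] = temp;
            System.out.println("Pass " + (i + 1) + ": " + Arrays.toString(arr));
        }
    }

    public static void main(String[] args) {
        sortWithTrace(new int[]{64, 25, 12, 22, 11});
        // Pass 1: [11, 25, 12, 22, 64]
        // Pass 2: [11, 12, 25, 22, 64]
        // Pass 3: [11, 12, 22, 25, 64]
        // Pass 4: [11, 12, 22, 25, 64]
    }
}
```

The printed lines match the walkthrough exactly; note that Pass 4 swaps an element with itself, which is why the array does not change.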


Java Implementation

```java
public class SelectionSort {

    /**
     * Sorts an integer array using Selection Sort.
     *
     * Strategy: in each pass, find the smallest element in the unsorted
     * portion and swap it into the next position of the sorted portion.
     */
    public static void selectionSort(int[] arr) {
        int n = arr.length;

        for (int i = 0; i < n - 1; i++) {
            // Assume the current index holds the minimum
            int minIndex = i;

            // Scan the unsorted portion to find the actual minimum
            for (int j = i + 1; j < n; j++) {
                if (arr[j] < arr[minIndex]) {
                    minIndex = j;
                }
            }

            // Swap the minimum into position i (only if it's not already there)
            if (minIndex != i) {
                int temp = arr[i];
                arr[i] = arr[minIndex];
                arr[minIndex] = temp;
            }
        }
    }

    public static void main(String[] args) {
        int[] data = {64, 25, 12, 22, 11};
        selectionSort(data);
        // Prints: 11 12 22 25 64
        for (int num : data) System.out.print(num + " ");
    }
}
```

Key Details

  • Number of swaps: Selection Sort makes at most $n - 1$ swaps — one per pass. This makes it efficient when writes are expensive (e.g., writing to flash memory).
  • Not stable: Equal elements may have their relative order changed because of long-range swaps.
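The instability is easy to demonstrate. In this sketch (the `StabilityDemo` class is ours), each string's first character is the sort key and its second character records the original order; the long-range swap in the first pass moves "4a" past "4b":

```java
import java.util.Arrays;

public class StabilityDemo {

    // Selection Sort comparing only the numeric key (first character)
    public static void selectionSortByKey(String[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[j].charAt(0) < arr[minIndex].charAt(0)) minIndex = j;
            }
            String tmp = arr[i];
            arr[i] = arr[minIndex];
            arr[minIndex] = tmp;
        }
    }

    public static void main(String[] args) {
        // "4a" and "4b" share the key 4 and start in that order
        String[] items = {"4a", "4b", "1x"};
        selectionSortByKey(items);
        // Swapping "4a" with "1x" jumped "4a" over its equal partner
        System.out.println(Arrays.toString(items)); // [1x, 4b, 4a]
    }
}
```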

Selection Sort — Complexity

| Case    | Time     | Explanation |
|:-------:|:--------:|-------------|
| Best    | $O(n^2)$ | Even if already sorted, we still scan for the minimum each pass. |
| Average | $O(n^2)$ | Always performs $\frac{n(n-1)}{2}$ comparisons. |
| Worst   | $O(n^2)$ | Same — no shortcut exists. |
| Space   | $O(1)$   | In-place. |

Key insight: Unlike Bubble Sort, Selection Sort has no best-case optimisation. It always does the same number of comparisons regardless of input order.
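You can verify this by instrumenting the inner loop with a counter. In this sketch (the `ComparisonCount` class and `countComparisons` method are names of ours), a sorted and a reverse-sorted array of 100 elements both cost exactly $\frac{100 \cdot 99}{2} = 4950$ comparisons:

```java
public class ComparisonCount {

    // Selection Sort that returns the number of comparisons performed
    public static long countComparisons(int[] arr) {
        long comparisons = 0;
        for (int i = 0; i < arr.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < arr.length; j++) {
                comparisons++;
                if (arr[j] < arr[minIndex]) minIndex = j;
            }
            int tmp = arr[i];
            arr[i] = arr[minIndex];
            arr[minIndex] = tmp;
        }
        return comparisons;
    }

    public static void main(String[] args) {
        int n = 100;
        int[] sorted = new int[n], reversed = new int[n];
        for (int i = 0; i < n; i++) {
            sorted[i] = i;
            reversed[i] = n - i;
        }
        System.out.println(countComparisons(sorted));   // 4950
        System.out.println(countComparisons(reversed)); // 4950
    }
}
```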


Insertion Sort

How It Works

Think of how you sort a hand of playing cards. You hold some cards in your left hand (already sorted), and you pick up one new card at a time from the table with your right hand. You slide the new card into the correct position among the cards in your left hand.

That is Insertion Sort:

  1. Start with the first element — a single element is trivially “sorted.”
  2. Pick the next element.
  3. Compare it with elements in the sorted portion (going right to left).
  4. Shift larger elements one position to the right to make room.
  5. Insert the element into its correct spot.
  6. Repeat until all elements are processed.

Step-by-Step Walkthrough

Let’s sort [8, 4, 6, 2, 9].

The bracketed section represents the sorted portion at each step.

Step 1 — Insert 4 into [8]

  • Compare 4 with 8. Since 4 < 8, shift 8 right.
  • Insert 4 at index 0.
  • Result: [4, 8], 6, 2, 9

Step 2 — Insert 6 into [4, 8]

  • Compare 6 with 8. Since 6 < 8, shift 8 right.
  • Compare 6 with 4. Since 6 > 4, stop.
  • Insert 6 at index 1.
  • Result: [4, 6, 8], 2, 9

Step 3 — Insert 2 into [4, 6, 8]

  • Compare 2 with 8 → shift 8 right.
  • Compare 2 with 6 → shift 6 right.
  • Compare 2 with 4 → shift 4 right.
  • Insert 2 at index 0.
  • Result: [2, 4, 6, 8], 9

Step 4 — Insert 9 into [2, 4, 6, 8]

  • Compare 9 with 8. Since 9 > 8, no shifting needed.
  • 9 stays in place.
  • Result: [2, 4, 6, 8, 9] ✅

Final sorted array: [2, 4, 6, 8, 9] ✅
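As with Selection Sort, the steps can be traced with a small standalone sketch (the `InsertionSortTrace` class and `sortWithTrace` method are illustrative names of ours) that prints the array after each insertion:

```java
import java.util.Arrays;

public class InsertionSortTrace {

    public static void sortWithTrace(int[] arr) {
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            // Shift larger sorted elements right to make room for key
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                j--;
            }
            arr[j + 1] = key;
            System.out.println("Step " + i + ": " + Arrays.toString(arr));
        }
    }

    public static void main(String[] args) {
        sortWithTrace(new int[]{8, 4, 6, 2, 9});
        // Step 1: [4, 8, 6, 2, 9]
        // Step 2: [4, 6, 8, 2, 9]
        // Step 3: [2, 4, 6, 8, 9]
        // Step 4: [2, 4, 6, 8, 9]
    }
}
```

Step 4 prints an unchanged array because 9 is already larger than everything in the sorted portion, mirroring the walkthrough.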


Java Implementation

```java
public class InsertionSort {

    /**
     * Sorts an integer array using Insertion Sort.
     *
     * Strategy: build a sorted sub-array from left to right. For each new
     * element, shift larger sorted elements to the right, then insert.
     */
    public static void insertionSort(int[] arr) {
        int n = arr.length;

        // Start from index 1 (index 0 is trivially sorted)
        for (int i = 1; i < n; i++) {
            int key = arr[i];  // the element to be inserted
            int j = i - 1;

            // Shift elements of the sorted portion that are greater than key
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j]; // move element one position to the right
                j--;
            }

            // Place the key in its correct position
            arr[j + 1] = key;
        }
    }

    public static void main(String[] args) {
        int[] hand = {8, 4, 6, 2, 9};
        insertionSort(hand);
        // Prints: 2 4 6 8 9
        for (int num : hand) System.out.print(num + " ");
    }
}
```

Key Lines Explained

| Line | Purpose |
|------|---------|
| `int key = arr[i];` | Save the element we are about to insert. The position `arr[i]` will be overwritten by shifting. |
| `while (j >= 0 && arr[j] > key)` | Walk backwards through the sorted portion. Keep shifting while we find elements larger than our key. |
| `arr[j + 1] = arr[j];` | Shift the larger element one position to the right to make room. |
| `arr[j + 1] = key;` | Drop the key into the gap we created. The `j + 1` is correct because `j` stopped one position before the insertion point. |

Insertion Sort — Complexity

| Case    | Time     | Explanation |
|:-------:|:--------:|-------------|
| Best    | $O(n)$   | Array is already sorted. The inner while loop never executes — we just compare each element once. |
| Average | $O(n^2)$ | Elements are in random order. On average, half the sorted portion needs shifting. |
| Worst   | $O(n^2)$ | Array is in reverse order. Every element must be shifted all the way to the front. |
| Space   | $O(1)$   | In-place. |

Key insight: Insertion Sort is stable (equal elements keep their relative order) and adaptive (it is fast on nearly-sorted data). This makes it the go-to $O(n^2)$ algorithm in practice.
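The adaptive behaviour shows up directly if you count shifts, which dominate the running time. In this sketch (the `AdaptivityDemo` class and `insertionSortShifts` method are names of ours), a nearly-sorted 8-element array needs a single shift while its reversal needs the full $\frac{8 \cdot 7}{2} = 28$:

```java
public class AdaptivityDemo {

    // Insertion Sort that returns the number of element shifts performed
    public static long insertionSortShifts(int[] arr) {
        long shifts = 0;
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                shifts++;
                j--;
            }
            arr[j + 1] = key;
        }
        return shifts;
    }

    public static void main(String[] args) {
        int[] nearlySorted = {1, 2, 4, 3, 5, 6, 7, 8}; // one pair out of order
        int[] reversed     = {8, 7, 6, 5, 4, 3, 2, 1};
        System.out.println(insertionSortShifts(nearlySorted)); // 1
        System.out.println(insertionSortShifts(reversed));     // 28
    }
}
```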


Head-to-Head Comparison

| Feature         | Selection Sort    | Insertion Sort     | Bubble Sort       |
|-----------------|:-----------------:|:------------------:|:-----------------:|
| Best-case time  | $O(n^2)$          | $O(n)$ ✅          | $O(n)$ ✅         |
| Worst-case time | $O(n^2)$          | $O(n^2)$           | $O(n^2)$          |
| Number of swaps | $O(n)$ ✅         | $O(n^2)$           | $O(n^2)$          |
| Stable?         | ❌ No             | ✅ Yes             | ✅ Yes            |
| Adaptive?       | ❌ No             | ✅ Yes             | ✅ Yes            |
| Best for        | Minimising writes | Nearly sorted data | Teaching concepts |
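The swap-count row is worth seeing in numbers. This sketch (the `WriteCountDemo` class and both method names are ours) counts Selection Sort swaps against total Insertion Sort array writes on the same reversed input; Selection Sort does 4 swaps where Insertion Sort performs 35 stores:

```java
public class WriteCountDemo {

    // Selection Sort, returning the number of swaps (at most n - 1)
    public static long selectionSortSwaps(int[] arr) {
        long swaps = 0;
        for (int i = 0; i < arr.length - 1; i++) {
            int minIndex = i;
            for (int j = i + 1; j < arr.length; j++) {
                if (arr[j] < arr[minIndex]) minIndex = j;
            }
            if (minIndex != i) {
                int tmp = arr[i];
                arr[i] = arr[minIndex];
                arr[minIndex] = tmp;
                swaps++;
            }
        }
        return swaps;
    }

    // Insertion Sort, counting every array store (shifts plus key placement)
    public static long insertionSortWrites(int[] arr) {
        long writes = 0;
        for (int i = 1; i < arr.length; i++) {
            int key = arr[i];
            int j = i - 1;
            while (j >= 0 && arr[j] > key) {
                arr[j + 1] = arr[j];
                writes++;
                j--;
            }
            arr[j + 1] = key;
            writes++;
        }
        return writes;
    }

    public static void main(String[] args) {
        int[] a = {8, 7, 6, 5, 4, 3, 2, 1};
        int[] b = a.clone();
        System.out.println(selectionSortSwaps(a));  // 4
        System.out.println(insertionSortWrites(b)); // 35
    }
}
```

This gap is exactly why Selection Sort is the pick when each write is expensive.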

When to Use / When NOT to Use

| ✅ Use When | ❌ Avoid When |
|-------------|---------------|
| Data is nearly sorted (Insertion Sort shines: $O(n)$) | Dataset is large (> 1,000 elements) — switch to Merge Sort or Quick Sort |
| You need a stable sort with minimal code (Insertion Sort) | You need guaranteed fast performance regardless of input order |
| Writes are very expensive, like flash memory (Selection Sort — fewest swaps) | You need $O(n \log n)$ or better |

Practice Problems

  1. Sort an Array – LeetCode #912 — Implement both Selection and Insertion Sort.
  2. Insertion Sort List – LeetCode #147 — Apply Insertion Sort to a Linked List.
  3. Challenge: Given a nearly-sorted array where each element is at most k positions away from its sorted position, which of the three $O(n^2)$ sorts would you choose and why?

What’s Next?

We have now covered three $O(n^2)$ sorting algorithms. They are great for learning, but for real-world performance, we need algorithms that can sort in $O(n \log n)$. Next up: Merge Sort — Divide and Conquer.



Tags: programming, algorithms, sorting