Whether you are preparing for a coding interview, studying computer science, or simply want to write faster and smarter programs, algorithms and data structures are the foundation you need. This guide organises every topic into a logical learning path — start from the top and work your way down.
Every post in this series includes step-by-step walkthroughs, Java code you can run, and complexity analysis so you truly understand why an algorithm behaves the way it does.
Phase 1 — Sorting Algorithms
Introduction to Algorithms & Big-O Notation
Before diving into any algorithm, you need to understand how we measure performance. Big-O notation gives us a language to describe how an algorithm’s running time or memory usage grows as the input gets larger.
The common growth rates, from fastest to slowest: O(1) < O(log n) < O(n) < O(n log n) < O(n²) < O(2ⁿ)
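To make these growth rates concrete, here is a small illustrative sketch (the class and method names are our own) that counts loop iterations for an O(n) pass versus an O(log n) halving loop:

```java
public class GrowthDemo {
    // O(n): one unit of work per element
    public static int linearSteps(int n) {
        int steps = 0;
        for (int i = 0; i < n; i++) steps++;
        return steps;
    }

    // O(log n): halve the problem size until it reaches 1
    public static int logSteps(int n) {
        int steps = 0;
        while (n > 1) {
            n /= 2;
            steps++;
        }
        return steps;
    }

    public static void main(String[] args) {
        // Doubling n doubles the O(n) count but adds only ONE O(log n) step
        System.out.println(linearSteps(1024)); // 1024
        System.out.println(logSteps(1024));    // 10
    }
}
```

Notice that doubling the input from 1024 to 2048 doubles the linear count but adds only a single step to the logarithmic one.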
Bubble Sort — The Simplest Sort
The most intuitive sorting algorithm — compare adjacent elements and swap them if they are in the wrong order. Repeat until the list is sorted.
// After each pass, the largest unsorted element "bubbles" to its correct position
if (arr[j] > arr[j + 1]) {
    swap(arr, j, j + 1);
}
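A complete, runnable version of this idea might look like the sketch below (the early-exit `swapped` flag is a common optimisation, not part of the minimal algorithm):

```java
import java.util.Arrays;

public class BubbleSort {
    // Repeatedly swap adjacent out-of-order pairs; stop early if a pass makes no swap
    public static void sort(int[] arr) {
        for (int i = 0; i < arr.length - 1; i++) {
            boolean swapped = false;
            for (int j = 0; j < arr.length - 1 - i; j++) {
                if (arr[j] > arr[j + 1]) {
                    int tmp = arr[j];
                    arr[j] = arr[j + 1];
                    arr[j + 1] = tmp;
                    swapped = true;
                }
            }
            if (!swapped) break; // no swaps means the array is already sorted
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 2, 9, 1};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 5, 9]
    }
}
```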
Selection Sort & Insertion Sort
Two fundamental $O(n^2)$ sorts that are surprisingly practical for small datasets. Selection Sort picks the minimum and places it; Insertion Sort builds a sorted portion one element at a time — like sorting playing cards in your hand.
Merge Sort — Divide and Conquer
Split the array in half, sort each half, then merge them back together. Guaranteed $O(n \log n)$ performance every single time.
mergeSort(left);
mergeSort(right);
merge(arr, left, right);
Quick Sort — The Versatile Performer
Choose a pivot, partition the array around it, then recursively sort each partition. On average $O(n \log n)$ and often the fastest in practice.
Counting Sort, Radix Sort & Bucket Sort
When your data has special properties (integers in a known range, fixed-length strings), you can sort in linear time $O(n)$ — breaking the $O(n \log n)$ comparison-sort barrier.
Phase 2 — Searching & Data Structures
Linear Search & Binary Search
From the simplest scan to the elegant “half the search space each time” technique. Binary Search is the first algorithm where you experience the power of $O(\log n)$.
Two Pointer Technique
A strategy where two indices walk through a sorted array from opposite ends (or the same end at different speeds), solving problems that would otherwise need nested loops.
Sliding Window Technique
Maintain a “window” of elements as you slide across an array. Perfect for problems involving contiguous sub-arrays or sub-strings.
Stacks & Queues — Core Operations
Two of the most fundamental data structures — LIFO (Last In, First Out) and FIFO (First In, First Out). Used everywhere from browser history to print queues.
Hash Tables & Hashing
How to achieve $O(1)$ average-case lookups by converting keys into array indices. Understanding hash collisions and resolution strategies.
Linked Lists — Singly, Doubly & Circular
Dynamic data structures where elements are connected via pointers. Learn insertions, deletions, and classic interview problems like detecting cycles.
Phase 3 — Trees & Graphs
Binary Trees — Traversals & Operations
Understand tree terminology and master the four traversal orders: In-order, Pre-order, Post-order, and Level-order (BFS).
Binary Search Trees (BST)
A tree where every left child is smaller and every right child is larger. This structure gives us $O(\log n)$ search, insert, and delete — when the tree is balanced.
Heaps & Priority Queues
A complete binary tree that satisfies the heap property. The backbone of priority queues and the secret behind Heap Sort.
Graph Representation & BFS
How to represent a graph (adjacency list vs. matrix) and explore it level-by-level using Breadth-First Search.
Depth First Search (DFS) & Applications
Go as deep as possible before backtracking. DFS powers cycle detection, topological sorting, and connected component discovery.
Dijkstra’s Shortest Path Algorithm
Find the shortest path from a source node to every other node in a weighted graph — the algorithm behind GPS navigation.
Phase 4 — Advanced Techniques
Dynamic Programming — Introduction & Memoization
When a problem has overlapping subproblems and optimal substructure, DP saves you from redundant computation. Start with the top-down (memoization) approach.
Dynamic Programming — Tabulation & Classic Problems
The bottom-up approach to DP. Solve the smallest subproblems first, then build up to the answer. Tackle classics like Knapsack and Longest Common Subsequence.
Greedy Algorithms
Make the locally optimal choice at every step and hope it leads to a globally optimal solution. Learn when greedy works and when it fails.
Backtracking — N-Queens, Sudoku Solver
Explore all candidates and abandon (“backtrack”) a path as soon as you determine it cannot lead to a valid solution.
Recursion Patterns & Tail Recursion
Understand the mental models behind recursive thinking, the call stack, and how tail recursion can optimise memory usage.
Bit Manipulation Tricks
Work directly with binary representations to solve problems with extraordinary speed and minimal memory.