Two Sum: The Classic Interview Starter
If you have done any LeetCode at all, you have seen Two Sum. It is problem number 1, literally. And for good reason: it is simple enough that anyone can understand the problem, but the optimal solution teaches you a technique that shows up in dozens of harder problems later.
Let's break it down.
The problem
Given an array of integers nums and an integer target, return the indices of the two numbers that add up to target.
You can assume each input has exactly one solution, and you cannot use the same element twice.
Example: nums = [2, 7, 11, 15], target = 9. The answer is [0, 1] because nums[0] + nums[1] = 2 + 7 = 9.
Seems straightforward. But how you solve it matters a lot.
The brute force approach
The most obvious way: check every pair of numbers. For each element, loop through the rest of the array and see if any other element adds up to the target.
def two_sum_brute(nums, target):
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return [i, j]
    return []
This works. But the nested loops make it O(n^2). For a small array, no big deal. For 10,000 elements, that is 50 million pair checks. For 100,000 elements, 5 billion. It gets ugly fast.
The inner loop is doing one thing: searching for a specific value. "I have 2, I need 7. Is 7 anywhere in the rest of this array?" That search is O(n) each time, and we do it n times.
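To see that search cost in isolation, here is a small sketch (the helper names are mine, not from any library) contrasting the two kinds of membership test. Python's in on a list walks every element, while in on a set or dict is a single hash lookup:

```python
def contains_linear(values, needle):
    # O(n): inspect each element until a match or the end of the list
    for v in values:
        if v == needle:
            return True
    return False

def contains_hashed(value_set, needle):
    # O(1) on average: one hash lookup, no scanning
    return needle in value_set

nums = [2, 7, 11, 15]
print(contains_linear(nums, 7))       # scans the list element by element
print(contains_hashed(set(nums), 7))  # hashes 7 once and checks the bucket
```

Both calls print True; the difference is how much work happens to get there.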
What if we could make that search O(1)?
The hash map solution
Here is the key insight. Instead of searching the array for the complement, store what you have already seen in a hash map. Then for each new number, just check: "Is my complement already in the map?"
The complement of a number is target - num. If the target is 9 and the current number is 2, the complement is 7. If 7 is already in the map, we have our answer. If not, we store 2 and its index and move on.
One pass through the array. One O(1) lookup per element. O(n) total.
Visual walkthrough
Let's walk through nums = [2, 7, 11, 15] with target = 9, step by step. Watch how the hash map builds up as we scan, and how the complement lookup finds the answer.
Step 1: Look at nums[0] = 2. Complement = 9 - 2 = 7.
7 is not in the map (it is empty). Store 2 with index 0.
Step 2: Look at nums[1] = 7. Complement = 9 - 7 = 2.
2 IS in the map at index 0. Return [0, 1]. Done!
Step 3: (If we had not found it) nums[2] = 11. Complement = -2.
-2 is not in the map. Store 11 with index 2 and keep going.
Step 4: (Continuing) nums[3] = 15. Complement = -6.
-6 is not in the map. Store 15 with index 3. No pair found (but we already found it at step 2).
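The walkthrough above can be reproduced with an instrumented sketch (the print statement is my addition for tracing; it is not part of the real solution) that shows the map state at every step:

```python
def two_sum_traced(nums, target):
    seen = {}
    for i, num in enumerate(nums):
        complement = target - num
        # Show the state before the lookup, matching the steps above
        print(f"step {i + 1}: num={num}, complement={complement}, seen={seen}")
        if complement in seen:
            return [seen[complement], i]
        seen[num] = i
    return []

print(two_sum_traced([2, 7, 11, 15], 9))  # finds the pair at step 2 -> [0, 1]
```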
That is the whole algorithm. At each step, we compute the complement and check the map. The moment we find a match, we return both indices. We never need to look backwards or use nested loops.
The code
Here is the clean Python Two Sum solution:
def two_sum(nums, target):
    seen = {}
    for i, num in enumerate(nums):
        complement = target - num
        if complement in seen:
            return [seen[complement], i]
        seen[num] = i
    return []
Let's walk through what each line does:
- seen = {} creates an empty hash map. Keys will be numbers we have visited, values will be their indices.
- complement = target - num computes the number we need to pair with the current one.
- if complement in seen is the O(1) lookup. If we have already encountered the complement, we found a valid pair.
- return [seen[complement], i] returns the index of the complement (stored earlier) and the current index.
- seen[num] = i stores the current number and its index for future lookups.
The order matters: check the map before adding the current number. If you add first, an element could match with itself. For example, with target = 6 and the number 3, you do not want to return [i, i].
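To make the bug concrete, here is a deliberately wrong variant (my own illustration, not code anyone should ship) that stores the current number before checking:

```python
def two_sum_wrong_order(nums, target):
    seen = {}
    for i, num in enumerate(nums):
        seen[num] = i           # BUG: stored before the check...
        complement = target - num
        if complement in seen:  # ...so a number can match itself
            return [seen[complement], i]
    return []

# With target 6, the lone 3 pairs with itself instead of with 1 + 5:
print(two_sum_wrong_order([1, 3, 5], 6))  # returns [1, 1], an invalid answer
```

The correct version, which checks before storing, returns [0, 2] for the same input because 1 + 5 = 6.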
Complexity analysis
Time: O(n). We iterate through the array once. Each hash map lookup and insertion is O(1) on average, so the total is O(n).
Space: O(n). In the worst case, we store every element in the hash map before finding the pair (or not finding one at all).
Compare that to the brute force O(n^2) time and O(1) space. We are trading a small amount of memory for a massive speedup. For n = 100,000, that is the difference between roughly 5 billion pair checks and 100,000 lookups. Easy tradeoff.
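A rough timing sketch makes the gap concrete. Exact numbers will vary by machine, so treat this as illustrative; both functions are the ones from this post, and the input is built so that only the two largest values form the answer, forcing the brute force into its worst case:

```python
import time

def two_sum_brute(nums, target):
    for i in range(len(nums)):
        for j in range(i + 1, len(nums)):
            if nums[i] + nums[j] == target:
                return [i, j]
    return []

def two_sum(nums, target):
    seen = {}
    for i, num in enumerate(nums):
        complement = target - num
        if complement in seen:
            return [seen[complement], i]
        seen[num] = i
    return []

# Only the two largest values reach the target sum, so the
# brute force has to check nearly every pair before it succeeds.
n = 2000
nums = list(range(n))
target = (n - 2) + (n - 1)

for fn in (two_sum_brute, two_sum):
    start = time.perf_counter()
    result = fn(nums, target)
    elapsed = time.perf_counter() - start
    print(f"{fn.__name__}: {elapsed:.4f}s -> {result}")
```

Even at n = 2,000 the difference is visible; at n = 100,000 the brute force becomes unusable.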
There is also an O(n log n) approach: sort the array and use two pointers converging from both ends. But sorting destroys the original indices, so you need extra bookkeeping to track them. For Two Sum specifically, the hash map approach is simpler and faster.
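For completeness, here is a sketch of that O(n log n) route (the function name and the exact bookkeeping are my own choices): sort (value, original index) pairs so the indices survive the sort, then converge two pointers from both ends:

```python
def two_sum_sorted(nums, target):
    # Pair each value with its original index before sorting,
    # otherwise the sort destroys the index information.
    indexed = sorted((num, i) for i, num in enumerate(nums))
    lo, hi = 0, len(indexed) - 1
    while lo < hi:
        s = indexed[lo][0] + indexed[hi][0]
        if s == target:
            return sorted([indexed[lo][1], indexed[hi][1]])
        if s < target:
            lo += 1   # sum too small: move the low pointer up
        else:
            hi -= 1   # sum too big: move the high pointer down
    return []

print(two_sum_sorted([2, 7, 11, 15], 9))  # -> [0, 1]
```

It works, but the extra sort and bookkeeping are exactly why the hash map wins for this problem.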
Edge cases to watch for
A few things that trip people up on this problem:
- Duplicate values. If the array has [3, 3] and the target is 6, the algorithm handles it correctly. The first 3 gets stored at index 0, and when we reach the second 3, we compute complement = 3, find it in the map, and return [0, 1].
- Negative numbers. Nothing changes. Complement = target - num works the same way with negatives.
- No solution. The problem guarantees exactly one solution exists, but in real code you should return an empty result or raise an error if no pair is found.
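All three edge cases are easy to verify with a few quick checks. The solution is reproduced here so the snippet runs standalone:

```python
def two_sum(nums, target):
    seen = {}
    for i, num in enumerate(nums):
        complement = target - num
        if complement in seen:
            return [seen[complement], i]
        seen[num] = i
    return []

assert two_sum([3, 3], 6) == [0, 1]         # duplicate values
assert two_sum([-1, -2, -3], -5) == [1, 2]  # negatives work unchanged
assert two_sum([1, 2, 3], 100) == []        # no solution: empty result
print("all edge cases pass")
```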
The Building Blocks
The Two Sum solution is built from two fundamental building blocks that appear in many other problems:
1. Complement lookup. The idea of computing target - current and checking whether you have seen it before. This same technique powers problems like pair with given difference, two-sum variants with sorted arrays, and three-sum (which reduces to two-sum internally). Any time you can frame a problem as "I have X, do I need Y?" you are using complement lookup.
2. Seen-set tracking. Storing previously visited elements in a hash map (or hash set) so you can reference them later in O(1) time. This shows up everywhere: detecting duplicates, finding the first repeated character, checking if an element was already processed, tracking prefix sums for subarray problems. The pattern is always the same: iterate, check, store.
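Both bricks appear together in the pair-with-given-difference variant mentioned above. Here is a sketch (my own, assuming the usual problem statement of "does any pair differ by exactly diff?"): the complements of x are x + diff and x - diff, and a seen-set answers the lookup in O(1):

```python
def has_pair_with_difference(nums, diff):
    # "I have x; have I seen x + diff or x - diff?" -- complement lookup
    seen = set()
    for num in nums:
        if (num + diff) in seen or (num - diff) in seen:
            return True
        seen.add(num)  # seen-set tracking: remember for later iterations
    return False

print(has_pair_with_difference([5, 20, 3, 2, 50, 80], 78))  # True: 80 - 2 = 78
```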
These two bricks combine to give you the LeetCode Two Sum solution, but they also combine with other bricks to solve much harder problems. Three-sum uses complement lookup inside a loop. Subarray sum equals K pairs complement lookup with prefix sums. Longest consecutive sequence uses a seen-set with clever boundary checks.
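To show the bricks recombining on a harder problem, here is a sketch of Subarray Sum Equals K. The "complement" is running_sum - k, and the map tracks how many times each prefix sum has appeared:

```python
def subarray_sum(nums, k):
    count = 0
    running = 0
    seen = {0: 1}  # the empty prefix has sum 0, seen once
    for num in nums:
        running += num
        # A previous prefix summing to (running - k) means the
        # subarray between that prefix and here sums to exactly k.
        count += seen.get(running - k, 0)
        seen[running] = seen.get(running, 0) + 1
    return count

print(subarray_sum([1, 1, 1], 2))  # 2: the subarrays [1,1] at both positions
```

Same shape as Two Sum: compute the complement, check the map, store the current value.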
That is the whole point. You are not memorizing "the Two Sum solution." You are learning building blocks that snap together across dozens of problems.
Problems that use the same bricks
Once you have complement lookup and seen-set tracking down, try these:
- Three Sum: sort + two pointers, or reduce to two-sum with a loop
- Two Sum II (sorted array): two pointers instead of a hash map since the data is sorted
- Subarray Sum Equals K: complement lookup on prefix sums
- Contains Duplicate: seen-set tracking at its simplest
- Two Sum III (data structure): design a class around the complement lookup pattern
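Contains Duplicate, the simplest of these, is seen-set tracking with nothing else attached. A sketch:

```python
def contains_duplicate(nums):
    seen = set()
    for num in nums:
        if num in seen:   # check: have we processed this value before?
            return True
        seen.add(num)     # store: remember it for later iterations
    return False

print(contains_duplicate([1, 2, 3, 1]))  # True
print(contains_duplicate([1, 2, 3, 4]))  # False
```

The iterate-check-store loop is identical to Two Sum's; only the question asked of the map changes.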
Why you will forget this (and how to fix it)
Two Sum feels obvious once you see the solution. "Of course you use a hash map. Of course you check the complement." But here is the thing: three weeks from now, if someone hands you a slightly different problem that uses the same complement lookup trick, there is a real chance you stare at it and do not make the connection.
That is not a you problem. That is just how memory works. You understood it, but understanding and recall are different skills. In an interview, nobody is testing whether you can follow along with a tutorial. They are testing whether you can produce the solution from scratch.
Spaced repetition is the fix. You practice typing the solution from memory at increasing intervals. First after a day, then three days, then a week. Each time you reconstruct it, not just re-read it. After a few cycles, the pattern is locked in permanently. You do not just recognize complement lookup when you see it. You reach for it automatically when a problem calls for it.
Related posts
- Hash Map Patterns - The three core hash map patterns, including complement lookup, frequency counting, and grouping
- The Two Pointers Pattern - The alternative approach for sorted arrays that solves Two Sum II in O(1) space
New to algorithms? Learn the fundamentals with our What is an Algorithm? visual guide, or understand time complexity with our Big O Notation Guide.