# The Sliding Window Pattern: A Complete Guide
If you have ever solved an array or string problem by checking every possible subarray, you have probably noticed how slow it gets. Nested loops, O(n^2) runtime, timeouts on large inputs. There is a better way.
The sliding window pattern lets you solve these problems in a single pass through the array. Once it clicks, you will start seeing it everywhere.
## The problem with brute force
Let's say you have an array of positive integers and you need to find the smallest subarray whose sum is at least 7.
The brute force approach checks every possible subarray. Start at index 0, try length 1, then 2, then 3... then start at index 1 and do the same thing. For an array of length n, that is roughly n^2 / 2 subarrays to check.
It works. But it is doing a ton of redundant work. When you move from checking [2, 3, 1] to checking [2, 3, 1, 2], you are re-adding 2 + 3 + 1 all over again even though you already know that sum.
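In code, the brute force might look something like this (a sketch with my own names; it re-sums every candidate window, which is exactly the redundant work described above):

```python
def min_subarray_len_brute_force(target, nums):
    """O(n^2) candidate windows, each re-summed from scratch."""
    n = len(nums)
    best = float('inf')
    for start in range(n):
        for end in range(start, n):
            # Re-sum the whole subarray every time: the redundant work.
            if sum(nums[start:end + 1]) >= target:
                best = min(best, end - start + 1)
                break  # with positive numbers, extending only makes it longer
    return best if best != float('inf') else 0
```

On the example array `[2, 3, 1, 2, 4, 3]` with target 7 this returns 2, the same answer the sliding window will find, just much more slowly.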
What if you could just... keep the running total and add one more element?
## The key insight
Instead of starting over each time, keep a running window. Add one element to the right side. If the window satisfies your condition, try shrinking it from the left to see if a shorter window also works.
Every element gets added to the window once and removed once. That is O(n) total work, regardless of how big the array is.
This is the sliding window.
## Walking through it step by step
Let's find the minimum length subarray with sum >= 7, using our array [2, 3, 1, 2, 4, 3].
We maintain two pointers: `left` and `right`. The window is everything between them, inclusive. We also track the running sum.
Step 1: Expand `right` to index 0. Window sum = 2. Not enough (best = none yet).

Step 2: Expand to index 1. Window sum = 5. Still not enough.

Step 3: Expand to index 2. Window sum = 6. Almost.

Step 4: Expand to index 3. Window sum = 8 >= 7. Valid! Record length 4 (best = 4).

Step 5: Shrink from the left. Remove the 2; sum = 6. Invalid again (best = 4).

Step 6: Expand to index 4. Sum = 10 >= 7. Valid! Length 4, so no improvement (best = 4).

Step 7: Shrink. Remove the 3; sum = 7 >= 7. Still valid! Record length 3 (best = 3).

Step 8: Keep shrinking and expanding. Eventually the window lands on [4, 3]: sum = 7 >= 7 with length 2 (best = 2).
The answer is 2. The subarray [4, 3] sums to 7 with the shortest possible length.
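If you want to replay the walkthrough mechanically, a few lines of instrumentation (my own sketch, not code from this guide) record every valid window the algorithm sees:

```python
def trace_min_window(target, nums):
    """Sliding window for minimum subarray sum, recording each valid window."""
    left, window_sum, best = 0, 0, float('inf')
    history = []
    for right in range(len(nums)):
        window_sum += nums[right]          # expand
        while window_sum >= target:       # shrink while valid
            best = min(best, right - left + 1)
            history.append((nums[left:right + 1], window_sum, best))
            window_sum -= nums[left]
            left += 1
    return best, history

best, history = trace_min_window(7, [2, 3, 1, 2, 4, 3])
# Every window recorded in history is valid; best shrinks from 4 down to 2,
# ending with the window [4, 3].
```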
Notice how each element was visited at most twice: once when right expanded over it, once when left shrank past it. That is O(n), not O(n^2). For an array with a million elements, that is the difference between one million operations and one trillion.
## The code
Here is what the pattern looks like in Python:
```python
def min_subarray_len(target, nums):
    left = 0
    window_sum = 0
    best = float('inf')
    for right in range(len(nums)):
        # Expand: add the right element to our window
        window_sum += nums[right]
        # Shrink: while our window is valid, try to make it smaller
        while window_sum >= target:
            best = min(best, right - left + 1)
            window_sum -= nums[left]
            left += 1
    return best if best != float('inf') else 0
```
Five lines of real logic. The outer for loop expands the window one element at a time. The inner while loop shrinks it as much as possible whenever the window is valid.
The inner while loop might look like it makes this O(n^2), but it does not. The left pointer only moves forward, and it moves at most n times total across the entire run. So the total work is O(n) + O(n) = O(n).
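One way to convince yourself of the amortized bound is to instrument the loop and count pointer movements (a sketch; the assertion just confirms the bound on this run):

```python
def min_subarray_len_counted(target, nums):
    """Same algorithm as min_subarray_len, but counting pointer moves."""
    left, window_sum, best = 0, 0, float('inf')
    right_moves = left_moves = 0
    for right in range(len(nums)):
        right_moves += 1
        window_sum += nums[right]
        while window_sum >= target:
            best = min(best, right - left + 1)
            window_sum -= nums[left]
            left += 1
            left_moves += 1
    # Each pointer only ever moves forward, so each moves at most n times:
    # total work is bounded by 2n, i.e. O(n).
    assert right_moves <= len(nums) and left_moves <= len(nums)
    return best if best != float('inf') else 0
```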
## Two types of sliding window
Almost every sliding window problem falls into one of two categories. Getting these mixed up is the number one source of bugs, so it is worth understanding the difference clearly.
### 1. Find the minimum (shrinking window)
You expand until the window is valid, then shrink to find the smallest valid window. That is what we just did above.
The pattern: expand, and whenever your window satisfies the condition, keep shrinking from the left and recording the best. Problems that look like this:
- Minimum size subarray with sum >= target
- Minimum window substring containing all required characters
- Shortest subarray with sum at least K
### 2. Find the maximum (growing window)
This is the flip side. You expand while the window is valid, recording the best size as you go. You only shrink when the window becomes invalid.
The classic example: longest substring without repeating characters.
```python
def length_of_longest_substring(s):
    seen = set()
    left = 0
    best = 0
    for right in range(len(s)):
        # Shrink until valid: drop characters from the left until the
        # duplicate is gone
        while s[right] in seen:
            seen.remove(s[left])
            left += 1
        # Expand: add the new character
        seen.add(s[right])
        best = max(best, right - left + 1)
    return best
```
Here is the crucial difference: in the minimum version, we shrink while the window is valid (because we want to make it smaller). In the maximum version, we shrink while the window is invalid (because we want to make it valid again so we can keep growing).
Getting these backwards will give you answers that are "close but wrong," which is the most frustrating kind of bug.
## Picking the right window state
The sliding window structure stays the same across problems. What changes is how you track the state of the window. Think of it as a plug-in: same frame, different engine.
| What you are tracking | Use this |
|---|---|
| Running sum | A single integer |
| Unique elements | A set |
| Character counts | A dict or Counter |
| At most K distinct | A dict + check len(dict) <= k |
| Fixed-size window | No shrink condition, just slide |
The state is whatever lets you quickly answer "is this window valid?" without re-scanning every element inside it. If you find yourself looping over the window contents to check validity, you are missing an opportunity to track that information incrementally.
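As one example of swapping in a different engine, here is a sketch of "longest substring with at most K distinct characters" using a character-count dict as the window state (my own implementation following the table above):

```python
from collections import Counter

def longest_k_distinct(s, k):
    counts = Counter()   # window state: character -> frequency in window
    left = 0
    best = 0
    for right in range(len(s)):
        counts[s[right]] += 1          # expand: add the new character
        while len(counts) > k:         # invalid: too many distinct characters
            counts[s[left]] -= 1
            if counts[s[left]] == 0:
                del counts[s[left]]    # drop zero counts so len(counts) stays honest
            left += 1
        best = max(best, right - left + 1)
    return best
```

The `len(counts)` check answers "is this window valid?" in O(1), with no re-scan of the window contents.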
## Common mistakes
If you are getting wrong answers on sliding window problems, check these first. They account for probably 90% of the bugs I see.
**1. Forgetting to update state when shrinking.**

Every time `left` moves forward, the element at the old `left` position leaves the window. You have to remove it from your state: subtract from your sum, remove from your set, decrement your frequency count. If you forget this, your state will be wrong and your window will "remember" elements that are no longer in it.
**2. Off-by-one on window size.**

The number of elements from index `left` to index `right` (inclusive) is `right - left + 1`, not `right - left`. This trips people up when comparing against a target length.
**3. Using `if` instead of `while` for shrinking.**

When you need to shrink, you might need to shrink multiple times before the window is the right size. Using `if` instead of `while` means you only shrink once per expansion. The code will look like it is working on small examples but fail on larger inputs.
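To see that failure concretely, here is a deliberately buggy variant of the earlier minimum-sum function, with `if` in place of `while`:

```python
def min_len_buggy(target, nums):
    """Like min_subarray_len, but shrinks with `if` instead of `while`."""
    left, window_sum, best = 0, 0, float('inf')
    for right in range(len(nums)):
        window_sum += nums[right]
        if window_sum >= target:   # BUG: shrinks at most once per expansion
            best = min(best, right - left + 1)
            window_sum -= nums[left]
            left += 1
    return best if best != float('inf') else 0

# On [2, 3, 1, 2, 4, 3] with target 7, this reports 4 instead of 2:
# it never shrinks far enough to discover the window [4, 3].
```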
**4. Mixing up the two types.**

Worth repeating: minimum problems shrink while the window is valid; maximum problems shrink while it is invalid. Write this on a sticky note if you have to.
## How to recognize it in interviews
Here are the signals that a problem is a sliding window problem:
- The input is a contiguous sequence (array or string)
- You need a subarray or substring (not a subsequence; elements must be adjacent)
- The problem asks for a minimum or maximum length, sum, or count
- There is a condition you can check incrementally
- The condition has monotonic behavior: adding elements can only make it "more valid" or "less valid," not both
If the problem asks about subsequences (elements do not have to be adjacent), sliding window will not work. That is usually dynamic programming territory.
The sliding window is a specific type of two-pointer technique where both pointers move in the same direction. If a problem needs pointers moving toward each other from opposite ends (like two-sum on a sorted array), that is a different pattern called converging pointers.
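For contrast, here is what converging pointers look like on two-sum over a sorted array (a sketch; note the pointers start at opposite ends and move toward each other, unlike a sliding window):

```python
def two_sum_sorted(nums, target):
    """Return indices of two values summing to target; assumes nums is sorted."""
    lo, hi = 0, len(nums) - 1
    while lo < hi:
        s = nums[lo] + nums[hi]
        if s == target:
            return lo, hi
        if s < target:
            lo += 1    # need a bigger sum: move the left pointer right
        else:
            hi -= 1    # need a smaller sum: move the right pointer left
    return None
```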
## Problems you should practice
Once you have the pattern down, you will recognize it instantly. These are the classics:
- Minimum Size Subarray Sum: shrinking window, sum tracking
- Longest Substring Without Repeating Characters: growing window, set tracking
- Minimum Window Substring: shrinking window, frequency counting
- Longest Substring with At Most K Distinct Characters: growing window, frequency + size check
- Maximum Average Subarray: fixed-size window (no shrinking needed)
- Permutation in String: fixed-size window, frequency matching
These problems look different on the surface, but they all follow the same expand-and-shrink structure. The only thing that changes is the window state and the validity condition.
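The fixed-size variant from the list is worth a quick sketch of its own, since it has no conditional shrinking at all. Using Maximum Average Subarray as the example (my own implementation; assumes `len(nums) >= k`):

```python
def max_average_subarray(nums, k):
    """Maximum average over all contiguous subarrays of length exactly k."""
    window_sum = sum(nums[:k])     # sum of the first window
    best = window_sum
    for right in range(k, len(nums)):
        # Slide: add the entering element, drop the leaving one.
        window_sum += nums[right] - nums[right - k]
        best = max(best, window_sum)
    return best / k
```

Because the window size is fixed, negative numbers are fine here: validity is just "the window has k elements," which sliding preserves automatically.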
## Reading about it is not enough
Here is the thing about patterns like sliding window: understanding the concept is the easy part. The hard part is writing the code from scratch, under time pressure, six months from now.
That is the gap between "I have seen this before" and "I can solve this." Reading through a tutorial gives you recognition. But recognition is not the same as recall. In an interview, nobody asks you to recognize a pattern from a multiple choice list. You have to produce the code from memory.
Spaced repetition is built to close exactly this gap. Instead of solving a sliding window problem once and moving on, you revisit the core pattern at increasing intervals. First after a day, then three days, then a week. Each time you write the code from scratch. After a few repetitions, the pattern is permanent.
The sliding window is one of about 60 fundamental patterns that show up across hundreds of algorithm problems. Each one is a small, reusable building block that you can learn once and apply everywhere.
The key is not to memorize solutions to individual problems. That does not scale, and you forget them anyway. The key is to memorize the building blocks and assemble them on the fly. That is what separates people who grind LeetCode from people who actually get better at problem-solving.
## Related posts
- The Two Pointers Pattern - The closely related technique for sorted arrays and converging searches
- Hash Map Patterns - Frequency counting with hash maps pairs well with sliding window state tracking
- Stack-Based Patterns - Another essential pattern family for interviews, covering matching and monotonic tricks
New to algorithms? Start with our visual What is an Algorithm? guide, or learn Big O Notation to understand why sliding window is O(n).