Queues: First In, First Out
If you have ever waited in line at a grocery store, you already understand queues. The person who gets in line first gets served first. Nobody cuts. Nobody reverses the order. That is it. A queue data structure works exactly the same way, and this one simple rule, FIFO (First In, First Out), turns out to be powerful enough to drive BFS, task scheduling, and a whole category of interview problems.
What is a queue?
A queue is a linear data structure where elements are added at one end (the back) and removed from the other end (the front). The element that has been waiting the longest is always the next one to come out. This is the FIFO principle: the first element enqueued is the first element dequeued.
Compare this to a stack, which follows LIFO (Last In, First Out). With a stack, the most recently added item comes out first. With a queue, the oldest item comes out first. That difference in ordering is what makes each structure suited to completely different problems.
The FIFO principle in action
The best way to internalize FIFO is to watch a queue process a sequence of operations step by step. Let's enqueue A, B, C, D, and dequeue them, and pay attention to the order things come out.
Step 1: Enqueue A
A enters the queue. It is both the front and the back.
Step 2: Enqueue B
B goes to the back. A is still at the front.
Step 3: Enqueue C
C joins the back. The queue now holds A, B, C in arrival order.
Step 4: Dequeue -> returns A
A was first in, so A is first out. B moves to the front.
Step 5: Enqueue D
D enters at the back. The order is B, C, D.
Step 6: Dequeue -> returns B
B was next in line, so B comes out. C is now the front.
Step 7: Dequeue -> returns C, then D
C and D leave in order. The queue is now empty.
Notice: every element left in exactly the order it arrived. A was first in and first out. D was last in and last out. There is no reversal, no reordering. That predictability is what makes queues useful whenever you need to process things in arrival order.
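The seven steps above map directly onto Python's `collections.deque`, using `append` to enqueue and `popleft` to dequeue:

```python
from collections import deque

q = deque()
q.append("A")            # Step 1: A is front and back
q.append("B")            # Step 2: B joins behind A
q.append("C")            # Step 3: queue holds A, B, C
first = q.popleft()      # Step 4: returns "A"
q.append("D")            # Step 5: queue holds B, C, D
second = q.popleft()     # Step 6: returns "B"
rest = [q.popleft(), q.popleft()]  # Step 7: returns "C", then "D"
# first == "A", second == "B", rest == ["C", "D"], q is now empty
```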
Queue operations and time complexity
A queue supports four core operations, and all of them run in O(1) time:
| Operation | What it does | Time |
|---|---|---|
| enqueue(x) | Add x to the back of the queue | O(1) |
| dequeue() | Remove and return the front element | O(1) |
| peek() / front() | Look at the front element without removing it | O(1) |
| isEmpty() | Check if the queue has no elements | O(1) |
Everything is constant time. You never need to search through the queue, shift elements around, or resize (with the right implementation). That efficiency is what makes queues practical for high-throughput systems like message brokers and task schedulers.
Implementing a queue
There are two main ways to build a queue, and each has different tradeoffs.
Array-based queue (circular buffer)
The naive approach of using a plain array is problematic. If you dequeue from the front by shifting every element left, that is O(n) per dequeue. Not great.
The solution is a circular buffer (also called a ring buffer). You maintain two indices, front and back, and let them wrap around the end of the array. When back reaches the end, it wraps to index 0. This gives you O(1) enqueue and dequeue without ever shifting elements.
```python
class CircularQueue:
    def __init__(self, capacity):
        self.data = [None] * capacity
        self.front = 0
        self.size = 0
        self.capacity = capacity

    def enqueue(self, val):
        if self.size == self.capacity:
            raise IndexError("queue is full")
        back = (self.front + self.size) % self.capacity
        self.data[back] = val
        self.size += 1

    def dequeue(self):
        if self.size == 0:
            raise IndexError("dequeue from empty queue")
        val = self.data[self.front]
        self.front = (self.front + 1) % self.capacity
        self.size -= 1
        return val

    def peek(self):
        if self.size == 0:
            raise IndexError("peek at empty queue")
        return self.data[self.front]
```
The % (modulo) operator is the trick. It wraps the index back to the start of the array when it overflows. No shifting needed.
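To see the wraparound in isolation, here is the same index arithmetic on its own (the capacity of 4 and starting index of 2 are arbitrary values for illustration):

```python
capacity = 4
front = 2
# With 3 elements starting at index 2, the occupied slots wrap
# past the end of the array back to index 0:
positions = [(front + i) % capacity for i in range(3)]
# positions == [2, 3, 0] -- the third element wraps to the start
```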
In Python, you almost never need to build your own circular buffer. Use collections.deque instead. It is implemented in C, gives you O(1) append() and popleft(), and handles resizing automatically.
```python
from collections import deque

q = deque()
q.append(10)       # enqueue 10
q.append(20)       # enqueue 20
q.append(30)       # enqueue 30
front = q[0]       # peek: 10
val = q.popleft()  # dequeue: 10
# q is now deque([20, 30])
```
A common interview mistake: using list.pop(0) as a queue. This is O(n) because it shifts every remaining element left. Interviewers will notice. Always use deque for BFS queue operations in Python.
Linked-list-based queue
You can also implement a queue with a linked list by keeping pointers to both the head (front) and the tail (back). Enqueue appends at the tail, dequeue removes from the head. Both are O(1) with no amortization.
```python
class Node:
    def __init__(self, val, next_node=None):
        self.val = val
        self.next = next_node

class LinkedQueue:
    def __init__(self):
        self.head = None
        self.tail = None

    def enqueue(self, val):
        node = Node(val)
        if self.tail:
            self.tail.next = node
        self.tail = node
        if not self.head:
            self.head = node

    def dequeue(self):
        if not self.head:
            raise IndexError("dequeue from empty queue")
        val = self.head.val
        self.head = self.head.next
        if not self.head:
            self.tail = None
        return val

    def peek(self):
        if not self.head:
            raise IndexError("peek at empty queue")
        return self.head.val
```
When would you choose a linked-list queue over deque? In practice, almost never in Python. But in languages without a built-in deque, or when you need guaranteed O(1) per operation without any amortized resizing, a linked-list queue is a solid choice.
Real-world analogies
Queues are everywhere once you start looking:
- Waiting in line. The classic. First person in line is the first person served. Every checkout counter, every coffee shop, every theme park ride operates as a queue.
- Print queue. When you send multiple documents to a printer, they print in the order you submitted them. FIFO.
- Task scheduling. Operating systems use queues to manage processes. CPU tasks get enqueued and processed in order (with priority variations). Web servers queue incoming HTTP requests during load spikes.
- Message queues. Systems like RabbitMQ, Kafka, and SQS are all built on the queue concept. Producers enqueue messages, consumers dequeue and process them in order.
- Breadth-first search. BFS uses a queue to explore nodes level by level. Every node at distance d is processed before any node at distance d + 1.
The common thread is fairness: whoever arrived first gets handled first.
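The BFS case is concrete enough to sketch. Here is a minimal version over an adjacency-list graph (the function name `bfs_distances` and the sample graph are illustrative, not from any library):

```python
from collections import deque

def bfs_distances(graph, start):
    """Return the shortest edge-count distance from start to every
    reachable node, expanding the frontier in FIFO order."""
    dist = {start: 0}
    q = deque([start])
    while q:
        node = q.popleft()          # oldest node out first (FIFO)
        for nbr in graph[node]:
            if nbr not in dist:     # first visit = shortest distance
                dist[nbr] = dist[node] + 1
                q.append(nbr)
    return dist

graph = {
    "A": ["B", "C"],
    "B": ["A", "D"],
    "C": ["A", "D"],
    "D": ["B", "C"],
}
dist = bfs_distances(graph, "A")
# dist == {"A": 0, "B": 1, "C": 1, "D": 2}
```

Because the queue is FIFO, every distance-1 node is dequeued before any distance-2 node, which is exactly why the first visit to a node gives its shortest distance.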
Deque: the double-ended queue
A deque (double-ended queue, pronounced "deck") generalizes the queue by allowing insertion and removal at both ends. You can push and pop from the front or the back, all in O(1).
```python
from collections import deque

d = deque()
d.append(1)      # add to back
d.appendleft(0)  # add to front
d.pop()          # remove from back -> 1
d.popleft()      # remove from front -> 0
```
A deque can act as both a queue and a stack, depending on which end you use. It is also the foundation of the sliding window maximum pattern, where you maintain a monotonic deque to track the current window's maximum in O(1) per step.
In Python, collections.deque is the standard implementation. In Java, ArrayDeque is preferred over LinkedList for both stack and queue operations.
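As a sketch of the sliding window maximum pattern mentioned above (the function name `sliding_window_max` is mine; the input is the standard example for this problem):

```python
from collections import deque

def sliding_window_max(nums, k):
    """Monotonic deque of indices, values decreasing from front to back.
    The front of the deque always holds the index of the window max."""
    dq, result = deque(), []
    for i, x in enumerate(nums):
        if dq and dq[0] <= i - k:        # drop the index that left the window
            dq.popleft()
        while dq and nums[dq[-1]] <= x:  # smaller values can never be the max
            dq.pop()
        dq.append(i)
        if i >= k - 1:                   # window is fully formed
            result.append(nums[dq[0]])
    return result

window_maxes = sliding_window_max([1, 3, -1, -3, 5, 3, 6, 7], 3)
# window_maxes == [3, 3, 5, 5, 6, 7]
```

Each index is appended and popped at most once, so the whole pass is O(n) even though individual steps may pop several elements.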
When to use a queue
Reach for a queue when:
- You need to process elements in the order they arrived (fairness)
- The problem involves levels or layers (level-order tree traversal, shortest path in unweighted graph)
- You are implementing breadth-first search on a graph or grid
- The problem asks for topological ordering of dependencies (Kahn's algorithm uses a queue)
- You need a buffer between a producer and a consumer operating at different speeds
Here is a handy mental model: if the problem says "process everything at this distance before going further," that is BFS, and you need a queue. If it says "go as deep as possible first, then backtrack," that is DFS, and you want a stack (or recursion).
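That "process everything at this distance before going further" model can be written as a level-by-level loop: drain exactly the nodes currently in the queue before touching anything enqueued during the pass (`bfs_levels` and the sample graph are illustrative):

```python
from collections import deque

def bfs_levels(graph, start):
    """Group reachable nodes by distance from start, one queue
    drain per level."""
    levels = []
    visited = {start}
    q = deque([start])
    while q:
        level = []
        for _ in range(len(q)):      # everything queued now = one level
            node = q.popleft()
            level.append(node)
            for nbr in graph[node]:
                if nbr not in visited:
                    visited.add(nbr)
                    q.append(nbr)
        levels.append(level)
    return levels

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
levels = bfs_levels(graph, "A")
# levels == [["A"], ["B", "C"], ["D"]]
```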
Queue interview problems
Queues are the backbone of breadth-first search, which makes them essential for graph, grid, and tree problems. Here are the key problems where the BFS queue pattern is central.
BFS on grids
- Number of Islands - Count connected components in a 2D grid. BFS with a queue explores each island level by level, marking visited cells as you go. This is the classic BFS queue problem.
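A minimal sketch of that flood fill (the function name `num_islands` and the sample grid are illustrative; cells are the usual "1" for land, "0" for water):

```python
from collections import deque

def num_islands(grid):
    """BFS from each unvisited land cell; each BFS flood-fills one island."""
    rows, cols = len(grid), len(grid[0])
    seen = set()
    count = 0
    for r in range(rows):
        for c in range(cols):
            if grid[r][c] == "1" and (r, c) not in seen:
                count += 1                      # new island found
                seen.add((r, c))
                q = deque([(r, c)])
                while q:
                    cr, cc = q.popleft()
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        nr, nc = cr + dr, cc + dc
                        if (0 <= nr < rows and 0 <= nc < cols
                                and grid[nr][nc] == "1"
                                and (nr, nc) not in seen):
                            seen.add((nr, nc))
                            q.append((nr, nc))
    return count

grid = [
    ["1", "1", "0", "0"],
    ["1", "0", "0", "1"],
    ["0", "0", "1", "1"],
]
count = num_islands(grid)
# count == 2
```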
Topological sort with BFS
- Course Schedule - Given prerequisite pairs, determine if you can finish all courses. The BFS approach (Kahn's algorithm) processes nodes with zero in-degree using a queue, peeling off one layer of the dependency graph at a time.
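A sketch of Kahn's algorithm, assuming the usual input convention where each pair is [course, prerequisite] (the function name `can_finish` is mine):

```python
from collections import deque

def can_finish(num_courses, prerequisites):
    """Kahn's algorithm: repeatedly dequeue zero in-degree courses.
    If every course gets processed, the dependency graph has no cycle."""
    graph = [[] for _ in range(num_courses)]
    indegree = [0] * num_courses
    for course, prereq in prerequisites:
        graph[prereq].append(course)     # edge: prereq -> course
        indegree[course] += 1
    q = deque(i for i in range(num_courses) if indegree[i] == 0)
    taken = 0
    while q:
        node = q.popleft()
        taken += 1
        for nxt in graph[node]:
            indegree[nxt] -= 1
            if indegree[nxt] == 0:       # all prerequisites satisfied
                q.append(nxt)
    return taken == num_courses

ok = can_finish(4, [[1, 0], [2, 0], [3, 1], [3, 2]])  # no cycle
bad = can_finish(2, [[0, 1], [1, 0]])                 # 0 and 1 depend on each other
```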
Level-order tree traversal
- Maximum Depth of Binary Tree - While this problem is commonly solved with recursion, the BFS approach uses a queue to traverse the tree level by level, counting the number of levels to determine the maximum depth.
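A sketch of the level-counting approach, assuming a simple binary tree node (`TreeNode` and `max_depth` are illustrative definitions):

```python
from collections import deque

class TreeNode:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def max_depth(root):
    """Count levels: each pass of the while loop drains one full level."""
    if not root:
        return 0
    depth = 0
    q = deque([root])
    while q:
        depth += 1
        for _ in range(len(q)):      # process exactly one level
            node = q.popleft()
            if node.left:
                q.append(node.left)
            if node.right:
                q.append(node.right)
    return depth

tree = TreeNode(1, TreeNode(2, TreeNode(4)), TreeNode(3))
depth = max_depth(tree)
# depth == 3
```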
Queue vs stack: a quick comparison
| | Queue | Stack |
|---|---|---|
| Order | FIFO (First In, First Out) | LIFO (Last In, First Out) |
| Add | enqueue (to back) | push (to top) |
| Remove | dequeue (from front) | pop (from top) |
| Python | collections.deque with append()/popleft() | list with append()/pop() |
| Use cases | BFS, scheduling, level-order | Matching, backtracking, DFS |
| Key pattern | "Oldest first" | "Most recent first" |
For a deeper comparison with side-by-side animations, see the Stacks and Queues post.
Building fluency with queues
Understanding what a queue does is the easy part. The hard part is reaching for deque automatically during BFS, recognizing when a problem needs level-by-level processing, and writing the enqueue/dequeue loop without hesitation under interview pressure.
That is where deliberate, spaced practice makes the difference. Instead of reading about FIFO once and moving on, you practice the core BFS queue pattern at increasing intervals until it becomes automatic. Each time you drill, the pattern embeds a little deeper.
CodeBricks breaks queue interview problems into their reusable building blocks and schedules them with spaced repetition. You type the code from scratch each session, and the spacing adapts to how well you actually remember each piece.
Related posts
- Stacks and Queues: LIFO vs FIFO - Side-by-side comparison of both structures
- Number of Islands - BFS grid traversal with a queue
- Course Schedule - Topological sort with Kahn's algorithm (queue-based BFS)
- Maximum Depth of Binary Tree - Level-order traversal approach using a queue