Introduction
Queue is abstract data type enforcing First-In-First-Out (FIFO) access: first element enqueued is first dequeued. Real-world analogy: line at store (customers served in arrival order). Applications: breadth-first search, task scheduling, printer queues, message passing systems.
Complementary to stack (LIFO). While stack processes newest element first (natural for recursion, parsing), queue processes oldest first (natural for fairness, breadth-first exploration). Both restrict access (only ends), enabling efficient implementations.
Implementation: array-based (circular to avoid shifting) or linked list. Circular array optimal (O(1) enqueue/dequeue, minimal memory waste). Linked list simpler conceptually, slightly less efficient in practice.
"Queues model fairness and order. FIFO discipline ensures no element starves, fundamental in operating systems, networks, and anywhere real-world sequencing matters." -- System design principles
FIFO Principle and Queue Semantics
FIFO Behavior
First element added first removed. Sequence: enqueue 1, enqueue 2, enqueue 3. Dequeue order: 1, 2, 3. Preserves sequence.
Queue visualization (front on left, rear on right):
Enqueue 1: [1]
Enqueue 2: [1, 2]
Enqueue 3: [1, 2, 3]
Dequeue: 1, queue becomes [2, 3]
Dequeue: 2, queue becomes [3]
Dequeue: 3, queue becomes []
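The trace above can be reproduced with Python's built-in collections.deque, where append enqueues at the rear and popleft dequeues from the front:

```python
from collections import deque

q = deque()
for x in (1, 2, 3):
    q.append(x)                           # enqueue at rear
order = [q.popleft() for _ in range(3)]   # dequeue from front
print(order)  # [1, 2, 3] -- FIFO order preserved
```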
Properties
- Insertion at rear: Always append to back
- Deletion at front: Always remove from front
- Order preservation: FIFO order strictly maintained
- Fairness: No element can "cut in line"
Front and Rear Pointers
Queue maintains two pointers: front (next to dequeue) and rear (next insertion point). Operations modify appropriate pointer. Circular queue: pointers wrap around (modulo capacity).
Queue Interface: Enqueue, Dequeue
Core Operations
| Operation | Description | Return Value | Time |
|---|---|---|---|
| enqueue(value) | Add element to rear | None | O(1) |
| dequeue() | Remove and return front element | Front element | O(1) |
| front() / peek() | Return front without removing | Front element | O(1) |
| is_empty() | Check if queue empty | Boolean | O(1) |
| size() | Return number of elements | Integer | O(1) |
Error Handling
Dequeue on an empty queue is underflow: raise an exception. Overflow is rarely an issue since modern queues resize dynamically. Defensive style: check is_empty() before calling dequeue().
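A minimal sketch of both styles using collections.deque, whose popleft raises IndexError on underflow:

```python
from collections import deque

q = deque()
try:
    q.popleft()            # underflow: empty deque raises IndexError
except IndexError:
    print("queue empty")

q.append(1)
if q:                      # defensive is_empty() check before dequeuing
    value = q.popleft()
```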
Array-Based Implementation
Naive Approach (Inefficient)
def dequeue(self):
    if self.front == self.rear:
        raise IndexError("Queue empty")
    value = self.items[self.front]
    self.front += 1
    return value
Problem: front pointer always increases. Never reuse space.
After n dequeues, items array [unused, unused, ..., valid elements].
Space waste: O(n)
Better: Shift on Dequeue
On dequeue, shift remaining elements left. Reclaims space. Problem: O(n) time per dequeue (shift cost). Inefficient.
Optimal: Dynamic Resizing
When front occupies too much space (front > capacity/2), compact: shift elements to front, reset pointers. Amortized O(1) per operation.
class ArrayQueue:
    def __init__(self):
        self.items = []
        self.front = 0
        self.rear = 0
    def is_empty(self):
        return self.front == self.rear
    def enqueue(self, value):
        self.items.append(value)
        self.rear += 1
    def dequeue(self):
        if self.is_empty():
            raise IndexError("Queue empty")
        value = self.items[self.front]
        self.front += 1
        # Compact once the dead prefix exceeds half the array
        if self.front > len(self.items) // 2:
            self.items = self.items[self.front:self.rear]
            self.front = 0
            self.rear = len(self.items)
        return value
Circular Queue Optimization
Circular Array Concept
Treat array as circular: after last element comes first. Wrap-around using modulo. Front and rear pointers advance, wrapping around capacity. Reuses space without shifting.
Capacity 5, enqueue 1,2,3:
[1, 2, 3, _, _] front=0, rear=3
Dequeue 1, enqueue 4:
[_, 2, 3, 4, _] front=1, rear=4
Dequeue 2, enqueue 5:
[_, _, 3, 4, 5] front=2, rear=0 (wrapped)
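The wrap-around in the trace above is plain modulo arithmetic on the rear index; a minimal sketch:

```python
capacity = 5
positions = []
rear = 3                           # next insertion index
for _ in range(4):                 # advance past the end of the array
    positions.append(rear)
    rear = (rear + 1) % capacity   # wraps from index 4 back to 0
print(positions)  # [3, 4, 0, 1]
```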
Implementation
class CircularQueue:
    def __init__(self, capacity):
        self.items = [None] * capacity
        self.front = 0
        self.rear = -1
        self.size = 0
    def enqueue(self, value):
        if self.size == len(self.items):
            self._resize()
        self.rear = (self.rear + 1) % len(self.items)
        self.items[self.rear] = value
        self.size += 1
    def dequeue(self):
        if self.size == 0:
            raise IndexError("Queue empty")
        value = self.items[self.front]
        self.front = (self.front + 1) % len(self.items)
        self.size -= 1
        return value
    def _resize(self):
        # Allocate larger array, copy elements in logical (front-to-rear) order
        new_capacity = len(self.items) * 2
        new_items = [None] * new_capacity
        for i in range(self.size):
            new_items[i] = self.items[(self.front + i) % len(self.items)]
        self.items = new_items
        self.front = 0
        self.rear = self.size - 1
Advantages
- O(1) enqueue, dequeue (no shifting)
- O(1) amortized with resizing
- Space-efficient (reuses array)
- No per-node pointer overhead (unlike linked list)
Linked List Implementation
Structure
class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

class Queue:
    def __init__(self):
        self.front = None
        self.rear = None
        self.size = 0
    def enqueue(self, value):
        new_node = Node(value)
        if self.rear:
            self.rear.next = new_node
        else:
            self.front = new_node
        self.rear = new_node
        self.size += 1
    def dequeue(self):
        if self.front is None:
            raise IndexError("Queue empty")
        value = self.front.value
        self.front = self.front.next
        if self.front is None:
            self.rear = None
        self.size -= 1
        return value
Analysis
Enqueue, dequeue: O(1) (add/remove at ends). Space: O(n) plus pointer overhead. No resizing needed (unbounded). Simpler than circular array conceptually, slightly slower in practice.
Comparison with Circular Array
Circular array: better cache locality, no pointer overhead. Linked list: unbounded, simpler code. Modern libraries favor contiguous blocks (C++ std::deque uses a segmented array of fixed-size blocks; Python collections.deque uses a linked list of blocks).
Time and Space Complexity
| Operation | Circular Array | Linked List |
|---|---|---|
| Enqueue | O(1) amortized | O(1) |
| Dequeue | O(1) | O(1) |
| Peek | O(1) | O(1) |
| Space | O(n) + capacity | O(n) + pointers |
Practical Performance
Circular array faster (cache). Both optimal theoretically. Implementation choice: language and library preference.
Priority Queues and Heaps
Priority Queue Extension
Each element has priority. Dequeue returns highest-priority element (regardless of insertion order). Different from FIFO queue.
Binary Heap Implementation
Priority queues efficiently implemented via binary heaps. Min-heap: parent <= children. Enqueue: insert at end, bubble up. Dequeue: remove root, move last to root, bubble down.
Enqueue (add, priority):
1. Insert at end of heap
2. Bubble up until heap property restored
Time: O(log n)
Dequeue:
1. Remove root (highest priority)
2. Move last element to root
3. Bubble down until heap property restored
Time: O(log n)
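Python's standard-library heapq module implements this min-heap discipline directly; a small sketch using (priority, task) tuples:

```python
import heapq

pq = []
for priority, task in [(3, "low"), (1, "urgent"), (2, "normal")]:
    heapq.heappush(pq, (priority, task))   # insert at end, bubble up: O(log n)

order = []
while pq:
    priority, task = heapq.heappop(pq)     # remove root, bubble down: O(log n)
    order.append(task)
print(order)  # ['urgent', 'normal', 'low']
```

Tuples compare by priority first, so the lowest-priority number is dequeued first (min-heap convention).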
Applications
- Dijkstra's algorithm: expand lowest-cost vertex
- Huffman coding: combine lowest-frequency nodes
- Operating system task scheduling: process highest-priority task
Double-Ended Queues (Deques)
Definition
Deque: insert and delete at both ends. More flexible than queue (FIFO only). Supports queue and stack operations.
Deque operations:
- push_front(value): add to front
- push_back(value): add to rear
- pop_front(): remove from front
- pop_back(): remove from rear
- peek_front(), peek_back()
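Python's collections.deque supports all six operations (method names differ from the generic interface above):

```python
from collections import deque

d = deque()
d.append(2)          # push_back
d.append(3)          # push_back
d.appendleft(1)      # push_front
print(list(d))       # [1, 2, 3]
print(d.popleft())   # pop_front  -> 1
print(d.pop())       # pop_back   -> 3
print(d[0], d[-1])   # peek_front, peek_back
```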
Implementation
Circular array or doubly-linked list naturally support deque: circular array via front/rear pointers that can each advance in either direction (modulo capacity); doubly-linked list via manipulation at both ends. Library deques (C++ std::deque, Python collections.deque) provide O(1) operations at both ends.
Applications
- Sliding window maximum: deque stores indices of useful elements
- Undo/Redo: deque maintains bounded action history (new actions push at the rear; oldest entries drop from the front when capacity is reached)
- Palindrome checking: compare elements from both ends
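A sketch of the sliding-window-maximum technique (function name illustrative): the deque holds indices of still-useful elements, with values decreasing from front to back, so the front index is always the current window maximum.

```python
from collections import deque

def sliding_window_max(nums, k):
    """Max of each length-k window in O(n) total using a monotonic deque."""
    dq = deque()   # indices; nums values decrease from front to back
    out = []
    for i, x in enumerate(nums):
        while dq and nums[dq[-1]] <= x:   # drop dominated candidates
            dq.pop()
        dq.append(i)
        if dq[0] <= i - k:                # front index slid out of window
            dq.popleft()
        if i >= k - 1:
            out.append(nums[dq[0]])
    return out

print(sliding_window_max([1, 3, -1, -3, 5, 3, 6, 7], 3))
# [3, 3, 5, 5, 6, 7]
```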
Breadth-First Search Applications
BFS Algorithm
def bfs(graph, start):
    visited = set()
    queue = Queue()
    queue.enqueue(start)
    visited.add(start)
    while not queue.is_empty():
        vertex = queue.dequeue()
        print(vertex)
        for neighbor in graph[vertex]:
            if neighbor not in visited:
                visited.add(neighbor)
                queue.enqueue(neighbor)
Time: O(V + E) (vertices + edges)
Shortest Path
BFS on unweighted graph finds shortest path. Track parent pointers, reconstruct path by backtracking.
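A minimal sketch of shortest-path BFS with parent pointers (graph and function names illustrative):

```python
from collections import deque

def shortest_path(graph, start, goal):
    """BFS on an unweighted graph; returns a shortest path or None."""
    parent = {start: None}        # doubles as the visited set
    q = deque([start])
    while q:
        v = q.popleft()
        if v == goal:             # backtrack through parents
            path = []
            while v is not None:
                path.append(v)
                v = parent[v]
            return path[::-1]
        for w in graph[v]:
            if w not in parent:
                parent[w] = v
                q.append(w)
    return None

g = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(shortest_path(g, "A", "D"))  # ['A', 'B', 'D']
```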
Level-Order Tree Traversal
Visit tree nodes level-by-level. Enqueue root, dequeue and enqueue children. Natural queue application.
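A minimal sketch, representing the tree as a dict mapping node to children:

```python
from collections import deque

def level_order(tree, root):
    """Visit nodes level by level: dequeue a node, enqueue its children."""
    out = []
    q = deque([root])
    while q:
        node = q.popleft()
        out.append(node)
        q.extend(tree.get(node, []))   # leaves have no entry
    return out

t = {1: [2, 3], 2: [4, 5], 3: [6]}
print(level_order(t, 1))  # [1, 2, 3, 4, 5, 6]
```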
Connected Components
BFS from unvisited vertex finds all reachable vertices (connected component). Repeat until all vertices visited.
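A sketch of repeated-BFS component finding (function name illustrative):

```python
from collections import deque

def connected_components(graph):
    """Run BFS from each unvisited vertex; each run yields one component."""
    seen, comps = set(), []
    for s in graph:
        if s in seen:
            continue
        comp, q = {s}, deque([s])
        seen.add(s)
        while q:
            v = q.popleft()
            for w in graph[v]:
                if w not in seen:
                    seen.add(w)
                    comp.add(w)
                    q.append(w)
        comps.append(comp)
    return comps

g = {0: [1], 1: [0], 2: [3], 3: [2], 4: []}
print(connected_components(g))  # [{0, 1}, {2, 3}, {4}]
```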
Real-World Applications
Operating System Task Scheduling
Ready queue: processes waiting for CPU. Scheduler picks front process (FIFO fairness). Or priority queue (prioritize interactive tasks).
Printer Queue
Documents queued for printing. Processes in order received (fairness). Can implement priority (print high-priority jobs first).
Message Passing Systems
Asynchronous message queues (Kafka, RabbitMQ). Producer enqueues messages, consumer dequeues and processes. Decouples producer/consumer.
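A toy in-process sketch of the pattern using Python's thread-safe queue.Queue; real brokers like Kafka and RabbitMQ apply the same decoupling across processes and machines:

```python
import queue
import threading

q = queue.Queue()          # thread-safe FIFO channel

def producer():
    for i in range(3):
        q.put(f"msg-{i}")  # enqueue messages
    q.put(None)            # sentinel: no more messages

results = []

def consumer():
    while True:
        msg = q.get()      # blocks until a message arrives
        if msg is None:
            break
        results.append(msg)

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()
print(results)  # ['msg-0', 'msg-1', 'msg-2']
```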
Traffic Light Simulation
Cars queue at intersection. Traffic light controls flow. Queue discipline determines fairness (FIFO vs. other).
Event-Driven Simulation
Event queue ordered by time. Dequeue next event, process, enqueue resulting events. Simulates system behavior over time.
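A toy sketch using heapq as the time-ordered event queue; events are processed in simulated-time order regardless of scheduling order:

```python
import heapq

events = []                            # min-heap of (time, name) tuples
heapq.heappush(events, (5.0, "arrive"))
heapq.heappush(events, (1.0, "start"))
heapq.heappush(events, (3.0, "depart"))

log = []
while events:
    t, name = heapq.heappop(events)    # next event in simulated time
    log.append((t, name))
print(log)  # [(1.0, 'start'), (3.0, 'depart'), (5.0, 'arrive')]
```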