Introduction
Jump search (also known as block search) is a searching algorithm for sorted arrays that works by checking elements at fixed intervals (blocks) rather than examining every element sequentially. It occupies a middle ground between linear search and binary search in terms of both complexity and implementation simplicity. The algorithm first jumps ahead in fixed-size blocks to identify a block that may contain the target, then performs a linear search within that block.
The optimal block size is the square root of the array length, sqrt(n), which yields a time complexity of O(sqrt(n)). This is better than linear search's O(n) but worse than binary search's O(log n). Despite this, jump search has practical advantages in certain scenarios, particularly when backward jumps are costly (as in systems that read data from tape or other sequential-access media).
"Jump search combines the simplicity of linear search with a block-skipping strategy that reduces the number of comparisons from n to approximately 2*sqrt(n)." -- Udi Manber, Introduction to Algorithms: A Creative Approach
The algorithm moves only forward during the jumping phase; the only backward step is a single drop back to the start of the identified block before the linear scan. This property makes it well-suited for scenarios where moving backward through the data is expensive, which is the key advantage jump search holds over binary search.
How It Works
Two-Phase Approach
Jump search operates in two distinct phases:
Phase 1: Block Jumping
- Choose a block size m (optimally sqrt(n)).
- Starting from index 0, compare the target with the last element of each block: A[m-1], A[2m-1], A[3m-1], and so on.
- Continue jumping forward until you find a block whose last element satisfies A[k*m - 1] >= target, or you reach the end of the array.
- At this point, the target (if present) must lie in the block spanning indices (k-1)*m through k*m - 1.
Phase 2: Linear Search
- Perform a linear search within the identified block, from index (k-1)*m to min(k*m, n) - 1.
- If the target is found, return its index.
- If the linear search reaches the end of the block without finding the target, return -1.
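The two phases can be sketched in Python as follows (a minimal illustration; the function name jump_search and the variables m, prev, and curr are our own):

```python
import math

def jump_search(arr, target):
    """Return the index of target in the sorted list arr, or -1 if absent."""
    n = len(arr)
    if n == 0:
        return -1
    m = math.isqrt(n)  # block size, floor(sqrt(n))
    # Phase 1: jump forward block by block, checking each block's last element
    prev, curr = 0, m
    while curr < n and arr[curr - 1] < target:
        prev, curr = curr, curr + m
    # Phase 2: linear scan inside the identified block
    for i in range(prev, min(curr, n)):
        if arr[i] == target:
            return i
    return -1
```

For example, `jump_search([1, 3, 5, 7, 9, 11, 13, 15, 17, 19, 21, 23, 25, 27, 29, 31], 19)` returns 9.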
Visual Illustration
For an array of 16 elements with block size m = 4:
Array:  [1, 3, 5, 7, | 9, 11, 13, 15, | 17, 19, 21, 23, | 25, 27, 29, 31]
Blocks:   Block 0       Block 1          Block 2           Block 3

To find 19 (checking the last element of each block, as the pseudocode below does):
  Check index 3  (A[3] = 7,   7 < 19,   jump forward)
  Check index 7  (A[7] = 15,  15 < 19,  jump forward)
  Check index 11 (A[11] = 23, 23 >= 19, stop jumping)
  Linear search from index 8 to 11: A[8] = 17 (no), A[9] = 19 (found!) -> return 9

Algorithm and Pseudocode
Standard Jump Search
procedure JumpSearch(A, n, target)
    Input:  A = sorted array of n elements, target = value to find
    Output: index of target in A, or -1 if not found

    m := floor(sqrt(n))    // Block size
    prev := 0
    curr := m

    // Phase 1: Jump forward in blocks
    while curr < n and A[min(curr, n) - 1] < target do
        prev := curr
        curr := curr + m
        if prev >= n then
            return -1
        end if
    end while

    // Phase 2: Linear search within the block
    while prev < min(curr, n) do
        if A[prev] = target then
            return prev
        end if
        prev := prev + 1
    end while

    return -1
end procedure

Jump Search with Custom Block Size
procedure JumpSearchCustom(A, n, target, blockSize)
    Input:  A = sorted array, n = length, target = value to find,
            blockSize = jump size
    Output: index of target in A, or -1 if not found

    step := blockSize
    prev := 0

    // Phase 1: Find the block
    while A[min(step, n) - 1] < target do
        prev := step
        step := step + blockSize
        if prev >= n then
            return -1
        end if
    end while

    // Phase 2: Linear search in the block
    for i := prev to min(step, n) - 1 do
        if A[i] = target then
            return i
        end if
    end for

    return -1
end procedure

Complexity Analysis
Time Complexity
| Case | Complexity | Explanation |
|---|---|---|
| Best Case | O(1) | Target is the first element checked (at the end of the first block). |
| Average Case | O(sqrt(n)) | At most n/m jumps plus m linear steps; minimized when m = sqrt(n). |
| Worst Case | O(sqrt(n)) | Target is at the end of a block, requiring maximum jumps and a full block scan. |
Total Comparisons
With block size m, the jumping phase performs at most n/m comparisons, and the linear phase performs at most m - 1 comparisons. The total number of comparisons is at most:
C(m) = n/m + m - 1

This is minimized when m = sqrt(n), giving approximately 2*sqrt(n) - 1 total comparisons.
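The trade-off can be checked numerically: a brute-force scan over every candidate block size confirms that C(m) bottoms out at m = sqrt(n). A small sketch (the helper name total_comparisons is our own):

```python
import math

def total_comparisons(n, m):
    # Worst case: n/m jump comparisons plus m - 1 linear comparisons
    return n / m + m - 1

n = 100
best_m = min(range(1, n + 1), key=lambda m: total_comparisons(n, m))
print(best_m)                        # 10, i.e. sqrt(100)
print(total_comparisons(n, best_m))  # 19.0, i.e. 2*sqrt(100) - 1
```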
Space Complexity
O(1): Jump search uses only a constant amount of auxiliary space (a few index variables), making it an in-place algorithm.
Comparison Count Examples
| Array Size (n) | Block Size (sqrt(n)) | Max Comparisons (Jump Search) | Max Comparisons (Linear) | Max Comparisons (Binary) |
|---|---|---|---|---|
| 100 | 10 | 19 | 100 | 7 |
| 10,000 | 100 | 199 | 10,000 | 14 |
| 1,000,000 | 1,000 | 1,999 | 1,000,000 | 20 |
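The table's jump-search column follows from the 2*sqrt(n) - 1 bound, and the binary-search column from floor(log2(n)) + 1 (computed below via Python's bit_length, our own shortcut):

```python
import math

for n in (100, 10_000, 1_000_000):
    jump = 2 * math.isqrt(n) - 1   # worst-case jump search: 2*sqrt(n) - 1
    binary = n.bit_length()        # worst-case binary search: floor(log2(n)) + 1
    print(n, jump, binary)
# 100 19 7
# 10000 199 14
# 1000000 1999 20
```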
Worked Example
Search for target = 55 in the sorted array:
A = [0, 1, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987, 1597] (n = 16)
Block size: m = floor(sqrt(16)) = 4
Phase 1: Block Jumping
| Jump | Index Checked | Value | Comparison | Action |
|---|---|---|---|---|
| 1 | 3 | 5 | 5 < 55 | Jump forward |
| 2 | 7 | 34 | 34 < 55 | Jump forward |
| 3 | 11 | 233 | 233 >= 55 | Stop jumping; block is [8..11] |
Phase 2: Linear Search in Block [8..11]
| Step | Index | Value | Match? |
|---|---|---|---|
| 1 | 8 | 55 | Yes |
Result: Target 55 found at index 8 after 3 jump comparisons + 1 linear comparison = 4 total comparisons. A linear search would have required 9 comparisons; binary search would have required 4 comparisons as well (by coincidence on this particular input).
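An instrumented version of the search (the comparison counter is our own addition, for illustration) reproduces the count above:

```python
import math

def jump_search_counting(arr, target):
    """Jump search that also reports how many array elements were compared."""
    n = len(arr)
    m = math.isqrt(n)
    comparisons = 0
    prev, curr = 0, m
    # Phase 1: check the last element of each block
    while curr < n:
        comparisons += 1
        if arr[curr - 1] >= target:
            break
        prev, curr = curr, curr + m
    # Phase 2: linear scan within the identified block
    for i in range(prev, min(curr, n)):
        comparisons += 1
        if arr[i] == target:
            return i, comparisons
    return -1, comparisons

A = [0, 1, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233, 377, 610, 987, 1597]
print(jump_search_counting(A, 55))  # (8, 4): index 8, 4 total comparisons
```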
Optimal Block Size Derivation
Mathematical Analysis
Let n be the array size and m be the block size. In the worst case:
- The jumping phase performs n/m comparisons (jumping through all blocks before finding the right one).
- The linear phase performs m - 1 comparisons (scanning the entire block without finding the target until the last element).
Total comparisons in the worst case:
C(m) = n/m + m - 1

To minimize C(m), take the derivative with respect to m and set it to zero:
dC/dm = -n/m^2 + 1 = 0
n/m^2 = 1
m^2 = n
m = sqrt(n)

Substituting m = sqrt(n) back into C(m):
C(sqrt(n)) = n/sqrt(n) + sqrt(n) - 1 = sqrt(n) + sqrt(n) - 1 = 2*sqrt(n) - 1

Therefore, the optimal block size is sqrt(n), yielding approximately 2*sqrt(n) total comparisons in the worst case.
"The square root block size balances the two phases of jump search: making blocks too small increases the number of jumps, while making blocks too large increases the cost of the linear scan within a block." -- Udi Manber, Introduction to Algorithms: A Creative Approach
Non-Optimal Block Sizes
Choosing a block size other than sqrt(n) increases the worst-case comparison count. For example, with n = 10,000:
- m = 10: C = 1000 + 9 = 1009 comparisons
- m = 100 (= sqrt(10000)): C = 100 + 99 = 199 comparisons (optimal)
- m = 1000: C = 10 + 999 = 1009 comparisons
Advantages and Disadvantages
Advantages
- Only Jumps Forward: During the block-jumping phase, the algorithm only moves forward through the array. This is advantageous on sequential-access media (like magnetic tape) or in systems where backward seeks are expensive.
- Simple Implementation: The algorithm is straightforward to implement and understand, with no recursion or complex index calculations.
- Better Than Linear Search: O(sqrt(n)) is a significant improvement over O(n), especially for moderately sized arrays.
- Constant Space: Uses O(1) auxiliary space.
- Predictable Performance: The worst case and average case are both O(sqrt(n)), providing consistent behavior regardless of input distribution.
Disadvantages
- Slower Than Binary Search: O(sqrt(n)) is significantly worse than O(log n) for large arrays. For n = 1,000,000, jump search needs up to 1,999 comparisons while binary search needs only 20.
- Requires Sorted Data: Like binary search, the array must be sorted.
- Not Adaptive: Unlike interpolation search, jump search does not use the actual data values to improve probe positions.
- Fixed Block Size: The block size is determined by the array length and does not adapt during the search.
Comparison with Other Search Algorithms
| Algorithm | Average Case | Worst Case | Forward Only? | Requires Sorted |
|---|---|---|---|---|
| Linear Search | O(n) | O(n) | Yes | No |
| Jump Search | O(sqrt(n)) | O(sqrt(n)) | Mostly (backward only within one block) | Yes |
| Binary Search | O(log n) | O(log n) | No (random access) | Yes |
| Interpolation Search | O(log log n) | O(n) | No (random access) | Yes (uniform) |
| Exponential Search | O(log n) | O(log n) | No | Yes |
Jump search is most useful when binary search's random access pattern is problematic. On modern hardware with RAM-based arrays, binary search is almost always preferred. However, for sequential-access storage or when the cost of a backward seek is high relative to a forward read, jump search provides a practical compromise.
Applications
Sequential-Access Media
Jump search was originally designed for scenarios where backward movement through data is expensive. On magnetic tape drives, seeking backward is much slower than reading forward, making jump search more practical than binary search in such environments.
Sorted Linked Lists
While binary search requires O(n) time on linked lists (since random access costs O(n)), jump search can be adapted for linked lists by maintaining skip pointers at every sqrt(n)-th node, achieving O(sqrt(n)) search time with O(sqrt(n)) additional space.
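The linked-list adaptation can be sketched as follows. The Node layout and the "express lane" of every sqrt(n)-th node are our own illustrative choices; in practice the skip pointers would be maintained alongside the list rather than rebuilt on every search, as done here for brevity:

```python
import math

class Node:
    def __init__(self, value, nxt=None):
        self.value = value
        self.next = nxt

def build_express_lane(head, n):
    """Collect a pointer to every floor(sqrt(n))-th node: O(sqrt(n)) extra space."""
    step = max(1, math.isqrt(n))
    express, node, i = [], head, 0
    while node is not None:
        if i % step == 0:
            express.append(node)
        node, i = node.next, i + 1
    return express, step

def jump_search_list(head, n, target):
    """Jump along the express lane, then walk at most one block of the list."""
    express, step = build_express_lane(head, n)
    # Phase 1: last express node whose value does not exceed the target
    start = express[0]
    for node in express:
        if node.value > target:
            break
        start = node
    # Phase 2: linear walk within one block
    node, steps = start, 0
    while node is not None and steps <= step:
        if node.value == target:
            return True
        node, steps = node.next, steps + 1
    return False
```

Each search touches O(sqrt(n)) express nodes plus at most one block of O(sqrt(n)) list nodes, matching the stated O(sqrt(n)) bound once the skip pointers exist.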
Teaching Algorithm Design
Jump search serves as an excellent pedagogical example of the block decomposition technique, where a problem is decomposed into sqrt(n) blocks of size sqrt(n). This same technique appears in more advanced data structures like sqrt-decomposition for range queries.
Hybrid Search Strategies
Jump search can be combined with other search strategies. For example, performing a jump search to narrow down the block and then using binary search within the block yields O(sqrt(n) + log(sqrt(n))) = O(sqrt(n)) time, but in practice the constant factor is smaller.
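Such a hybrid might look like the following sketch (using Python's bisect for the in-block binary search is our own implementation choice):

```python
import math
import bisect

def jump_then_binary(arr, target):
    """Jump forward to locate the block, then binary-search inside it."""
    n = len(arr)
    if n == 0:
        return -1
    m = math.isqrt(n)
    prev, curr = 0, m
    # Phase 1: same forward jumps as plain jump search
    while curr < n and arr[curr - 1] < target:
        prev, curr = curr, curr + m
    # Phase 2: binary search restricted to the block [prev, min(curr, n))
    hi = min(curr, n)
    i = bisect.bisect_left(arr, target, prev, hi)
    return i if i < hi and arr[i] == target else -1
```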
Embedded Systems
In embedded systems with limited computational resources, the simplicity and predictability of jump search make it an attractive option when the data is sorted and the array size is moderate.
References
- Manber, U. Introduction to Algorithms: A Creative Approach. Addison-Wesley, 1989. Section 4.1.3: Jump Search.
- Shneiderman, B. "Jump Searching: A Fast Sequential Search Technique." Communications of the ACM, 21(10):831-834, 1978.
- Cormen, T. H., Leiserson, C. E., Rivest, R. L., and Stein, C. Introduction to Algorithms (3rd ed.). MIT Press, 2009.
- Knuth, D. E. The Art of Computer Programming, Volume 3: Sorting and Searching (2nd ed.). Addison-Wesley, 1998.
- Sedgewick, R. and Wayne, K. Algorithms (4th ed.). Addison-Wesley, 2011.