Uninformed Search - Search Algorithms

Introduction

Uninformed search: algorithms exploring state space without domain knowledge. No heuristics guide search. Systematically expand nodes until goal found. Contrast informed search (uses heuristics). Trade-off: several uninformed methods guarantee completeness/optimality, but are often slow.

Fundamental search strategies: breadth-first (explore level-by-level), depth-first (follow paths deeply). Each has trade-offs. Analyzed via completeness (finds solution if exists), optimality (finds best solution), time/space complexity.

Uninformed search foundational: establishes baselines, provides guarantees. Practical systems use informed variants. Understanding uninformed builds intuition for search algorithms.

"Uninformed search provides guaranteed correctness without problem-specific knowledge. Though often inefficient, establishes foundations for all search algorithms and offers optimality guarantees when heuristics unavailable." -- Search algorithm fundamentals

Search Problem Basics

Problem Formulation

Search problem: initial state, actions (transitions), goal test, path cost. State: configuration of problem. Action: move to neighbor state. Goal: desired final state(s). Cost: value of path.
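This formulation can be written down directly; a minimal sketch for a toy grid world (the `GridProblem` class and its method names are illustrative, not a standard API):

```python
# Illustrative search-problem formulation for a small grid world.
# All names here are hypothetical, chosen to mirror the four components:
# initial state, actions, transition (result), goal test, and step cost.

class GridProblem:
    def __init__(self, start, goal, width, height):
        self.start = start                 # initial state: (x, y)
        self.goal = goal                   # desired final state
        self.width, self.height = width, height

    def actions(self, state):
        """Applicable actions: moves that stay on the grid."""
        x, y = state
        moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
        return [(dx, dy) for dx, dy in moves
                if 0 <= x + dx < self.width and 0 <= y + dy < self.height]

    def result(self, state, action):
        """Transition model: the state reached by taking `action`."""
        return (state[0] + action[0], state[1] + action[1])

    def goal_test(self, state):
        return state == self.goal

    def step_cost(self, state, action):
        return 1                           # uniform cost: every move costs 1
```

Any of the algorithms below can be driven by such an object: they only need the initial state, the successor function, and the goal test.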

Search Tree

Conceptual: tree structure from initial state. Nodes: states. Edges: actions. Path: sequence of states/actions. Goal node: satisfies goal test. Search: find path from root to goal node.

State Space

Graph of all reachable states from initial. Edges: possible transitions. Search explores state space. Same state may appear in multiple paths (cycles). Cycle detection important.

Frontier/Open List

Nodes waiting expansion. Initially contains start. Algorithm removes node from frontier, expands (generates children), adds to frontier. Repeat until goal found or frontier empty.

Visited/Closed List

Nodes already expanded. Avoid revisiting (prevent cycles). Trade-off: memory for cycle detection.

Optimality Metrics

Complete: finds solution if exists. Optimal: finds lowest-cost solution. Time complexity: nodes expanded. Space complexity: maximum frontier size.

Breadth-First Search (BFS)

Algorithm Idea

Explore all neighbors at depth d before exploring depth d+1. Uses queue (FIFO): frontier. Expand shallowest node first.

Pseudocode

def breadth_first_search(start, goal):
    queue = Queue()
    queue.enqueue(start)
    visited = {start}

    while queue not empty:
        node = queue.dequeue()
        if node == goal:
            return path_to(node)

        for neighbor in neighbors(node):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.enqueue(neighbor)

    return failure
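The pseudocode translates to runnable Python on an adjacency-dict graph; a sketch assuming a parent map doubles as the visited set and supports path reconstruction (`bfs` and the example `graph` are illustrative):

```python
from collections import deque

def bfs(graph, start, goal):
    """Breadth-first search; returns a shortest path (by edge count) or None."""
    frontier = deque([start])        # FIFO queue
    parent = {start: None}           # doubles as the visited set
    while frontier:
        node = frontier.popleft()    # shallowest node first
        if node == goal:
            path = []
            while node is not None:  # walk parent pointers back to start
                path.append(node)
                node = parent[node]
            return path[::-1]
        for neighbor in graph.get(node, []):
            if neighbor not in parent:
                parent[neighbor] = node
                frontier.append(neighbor)
    return None                      # frontier exhausted: no path exists

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['D'], 'D': ['E'], 'E': []}
```

For example, `bfs(graph, 'A', 'E')` returns `['A', 'B', 'D', 'E']`: the goal at depth 3 is found before any deeper path is considered.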

Completeness

Complete: finds solution if exists (finite spaces). Explores all paths level-by-level.

Optimality

Optimal for unit edge costs: the shallowest goal is also the cheapest, so BFS returns a shortest path. With varying edge costs, not optimal: the shallowest goal need not be the cheapest (UCS handles that case).

Complexity Analysis

Time: O(b^d), where b = branching factor, d = depth of shallowest goal. Expands all nodes up to depth d. Space: O(b^d) frontier size. Memory usually the limiting factor.

Advantages and Disadvantages

Advantage: complete, optimal (unweighted). Disadvantage: exponential space, infeasible for large d. Not practical for large problem spaces.

Practical Limitations

Example: branching 10, depth 12 = 10^12 nodes. Memory-prohibitive. Used for small problems or combined with other techniques.

Depth-First Search (DFS)

Algorithm Idea

Explore deeply before backtracking. Uses stack (LIFO): frontier. Expand deepest node first (most recently added).

Pseudocode

def depth_first_search(start, goal):
    stack = Stack()
    stack.push(start)
    visited = {start}

    while stack not empty:
        node = stack.pop()
        if node == goal:
            return path_to(node)

        for neighbor in neighbors(node):
            if neighbor not in visited:
                visited.add(neighbor)
                stack.push(neighbor)

    return failure
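A runnable version of the same idea; this sketch stores the path-so-far alongside each stack entry instead of a parent map, an illustrative choice that makes path return trivial:

```python
def dfs(graph, start, goal):
    """Iterative depth-first search; returns some path or None.
    The returned path is not guaranteed shortest."""
    stack = [(start, [start])]       # LIFO stack of (node, path-so-far)
    visited = {start}
    while stack:
        node, path = stack.pop()     # deepest (most recently pushed) first
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                stack.append((neighbor, path + [neighbor]))
    return None

graph = {'A': ['B', 'C'], 'B': ['D'], 'C': ['E'], 'D': [], 'E': []}
```

Note the asymmetry with BFS: `dfs` commits to one branch ('C' here, the last pushed) and only backtracks to 'B' after exhausting it.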

Completeness

Complete in finite spaces (cycle detection). Incomplete in infinite graphs (may follow infinite path). Risk: gets stuck in deep, unproductive branches.

Optimality

Not optimal: returns the first goal reached along its current path, not necessarily the shallowest or cheapest. May find a distant solution when a nearby one exists.

Complexity Analysis

Time: O(b^m) where m=maximum depth. Potentially very long (if m >> d). Space: O(b*m) backtrack stack. Much smaller than BFS!

Advantages and Disadvantages

Advantage: memory-efficient (only stores path). Disadvantage: not optimal, can waste time on deep branches. Good for deep solutions, bad for general use.

Infinite Graph Danger

Infinite graphs: DFS may never return (follow infinite path). Mitigation: depth limits, visited check (requires memory).

Uniform Cost Search (UCS)

Algorithm Idea

Generalization of BFS for weighted graphs. Expand lowest-cost node (by cumulative path cost g(n)). Frontier: priority queue ordered by cost.

Pseudocode

def uniform_cost_search(start, goal):
    frontier = PriorityQueue()
    frontier.add(start, priority=0)
    visited = set()

    while frontier not empty:
        node = frontier.pop()  # lowest cumulative cost g(n)
        if node == goal:
            return path_to(node)

        if node in visited:
            continue
        visited.add(node)

        for neighbor in neighbors(node):
            if neighbor not in visited:
                cost = path_cost(node) + edge_cost(node, neighbor)
                frontier.add(neighbor, priority=cost)

    return failure
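A runnable sketch using Python's `heapq` as the priority queue; the weighted `graph` format `{node: [(neighbor, cost), ...]}` is an assumption for illustration:

```python
import heapq

def uniform_cost_search(graph, start, goal):
    """UCS on a weighted graph {node: [(neighbor, cost), ...]}.
    Returns (total_cost, path) or None."""
    frontier = [(0, start, [start])]   # min-heap ordered by path cost g(n)
    expanded = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)  # cheapest entry first
        if node == goal:
            return cost, path          # first pop of the goal is optimal
        if node in expanded:
            continue                   # stale duplicate entry, skip it
        expanded.add(node)
        for neighbor, edge_cost in graph.get(node, []):
            if neighbor not in expanded:
                heapq.heappush(frontier,
                               (cost + edge_cost, neighbor, path + [neighbor]))
    return None

graph = {'A': [('B', 1), ('C', 5)], 'B': [('C', 1), ('D', 10)],
         'C': [('D', 2)], 'D': []}
```

On this graph, UCS returns the cost-4 route A-B-C-D rather than the shorter-by-edges but cost-11 route A-B-D, which is exactly where BFS would go wrong.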

Completeness

Complete when every edge cost is at least some ε > 0 (otherwise infinitely many near-zero-cost nodes could be expanded): always expanding the lowest-cost node eventually reaches any reachable goal.

Optimality

Optimal: finds lowest-cost path. Expands nodes in order of increasing cost. When goal found, it's optimal (no lower-cost paths possible).

Complexity Analysis

Time: O(b^(1+⌊C*/ε⌋)) where C*=optimal cost, ε=minimum edge cost. Depends on cost, not just depth. Can be inefficient if many similar-cost paths.

Comparison to BFS

BFS: special case of UCS (all edges cost 1). UCS: more general (weighted graphs). UCS reduces to BFS for uniform costs.

Practical Use

Guaranteed optimal for weighted graphs. Used in route planning (minimize distance/time). More efficient than BFS if edge costs vary significantly.

Depth-Limited Search (DLS)

Algorithm Idea

DFS with depth limit: don't explore beyond depth L. Prevents infinite paths in infinite graphs. Return "cutoff" if goal not found within limit.

Pseudocode

def depth_limited_search(node, goal, limit):
    if node == goal:
        return path_to(node)
    if limit == 0:
        return "cutoff"

    cutoff_occurred = False
    for neighbor in neighbors(node):
        result = depth_limited_search(neighbor, goal, limit - 1)
        if result == "cutoff":
            cutoff_occurred = True
        elif result != "failure":
            return result

    return "cutoff" if cutoff_occurred else "failure"

Completeness

Complete if solution within limit. Incomplete if solution beyond limit (returns "cutoff" false-negative).

Optimality

Not optimal: returns the first solution found within the limit, not necessarily the shallowest or cheapest.

Complexity Analysis

Time: O(b^L). Space: O(b*L). Better than DFS if L chosen well. Poor if L too small (misses solution) or too large (similar to DFS).

Practical Use

Safe DFS variant: prevents infinite loops. Requires estimating solution depth (risky). Basis for iterative deepening.

Iterative Deepening Search (IDS)

Algorithm Idea

Combine BFS completeness with DFS space efficiency. Run DLS with increasing limits: 0, 1, 2, 3, ... until goal found. Redoes work but manageable.

Pseudocode

def iterative_deepening_search(start, goal):
    for limit in range(0, infinity):  # limits 0, 1, 2, ...
        result = depth_limited_search(start, goal, limit)
        if result != "cutoff":
            return result

Completeness

Complete: tries all depths eventually, finds solution if exists.

Optimality

Optimal (unweighted graphs): finds shallowest goal (like BFS).

Complexity Analysis

Time: O(b^d) (same as BFS). Revisits nodes repeatedly but overhead manageable. Space: O(b*d) (like DFS). Best of both!

Why Optimal Complexity?

Intuition: each iteration depth-limited to d. Iteration d expands b^d nodes. Previous iterations sum: b^0 + b^1 + ... + b^(d-1) < b^d (for b > 1). Total O(b^d) dominated by final iteration.
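The intuition checks out numerically; a throwaway calculation (not part of any algorithm) comparing node counts:

```python
# Node counts for b = 10, d = 5. BFS visits every node to depth d once;
# IDS re-runs DLS for each limit 0..d, revisiting shallow nodes each time.
b, d = 10, 5

bfs_nodes = sum(b**i for i in range(d + 1))
ids_nodes = sum(sum(b**i for i in range(L + 1)) for L in range(d + 1))

overhead = ids_nodes / bfs_nodes
# The repeated work is a small constant factor (about b/(b-1) for large d),
# so IDS keeps BFS's O(b^d) time while needing only O(b*d) space.
```

Here `bfs_nodes` is 111,111 and `ids_nodes` is 123,456: the repetition costs only about 11% extra work.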

Practical Advantages

Combines BFS (complete, optimal) with DFS (memory-efficient). Often practical first choice for uninformed search. Used in game playing (chess).

Bidirectional Search

Algorithm Idea

Search from start toward goal AND from goal toward start simultaneously. Meet in middle. Reduces search space exponentially.

Pseudocode

def bidirectional_search(start, goal):
    forward_frontier = {start}
    backward_frontier = {goal}
    forward_visited = {start}
    backward_visited = {goal}

    while forward_frontier and backward_frontier:
        # Expand forward frontier; succeed if it meets the backward search
        if expand_forward(forward_frontier, forward_visited, backward_visited):
            return path

        # Expand backward frontier; succeed if it meets the forward search
        if expand_backward(backward_frontier, backward_visited, forward_visited):
            return path

    return failure
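A runnable sketch of bidirectional BFS on an undirected adjacency dict. It returns the number of edges on a path found when the frontiers meet; a production version would continue until the best meeting point is confirmed rather than stopping at the first intersection:

```python
from collections import deque

def bidirectional_bfs(graph, start, goal):
    """Level-synchronized BFS from both ends of an undirected graph.
    Returns the edge count of a path found when the frontiers meet,
    or None if no path exists."""
    if start == goal:
        return 0
    dist_f, dist_b = {start: 0}, {goal: 0}
    frontier_f, frontier_b = deque([start]), deque([goal])
    while frontier_f and frontier_b:
        # Expand the smaller frontier to keep both searches balanced.
        if len(frontier_f) <= len(frontier_b):
            frontier, dist, other = frontier_f, dist_f, dist_b
        else:
            frontier, dist, other = frontier_b, dist_b, dist_f
        for _ in range(len(frontier)):          # one full level at a time
            node = frontier.popleft()
            for neighbor in graph.get(node, []):
                if neighbor in other:           # the two searches have met
                    return dist[node] + 1 + other[neighbor]
                if neighbor not in dist:
                    dist[neighbor] = dist[node] + 1
                    frontier.append(neighbor)
    return None

# Example: undirected path A-B-C-D-E
path_graph = {'A': ['B'], 'B': ['A', 'C'], 'C': ['B', 'D'],
              'D': ['C', 'E'], 'E': ['D']}
```

Each side explores only to roughly half the solution depth before they meet, which is the source of the O(b^(d/2)) saving.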

Completeness

Complete: searches from both ends, will meet if solution exists.

Optimality

Optimal if both directions use BFS and the termination test confirms the best meeting point (stopping at the very first intersection can be slightly suboptimal).

Complexity Advantage

Time: O(b^(d/2) + b^(d/2)) = O(b^(d/2)). Exponential savings! Space: O(b^(d/2)). Two searches of half-depth < one search full-depth.

Example

BFS from both ends: b=10, d=12. BFS alone: 10^12 nodes. Bidirectional: ~2*10^6 (two searches meeting at depth 6). Roughly a 500,000-fold reduction.
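The arithmetic behind this saving is quick to verify (a throwaway calculation):

```python
# Worst-case node counts for branching factor b and solution depth d.
b, d = 10, 12

full = b ** d                 # one BFS all the way to depth d
bidi = 2 * b ** (d // 2)      # two BFS passes meeting at depth d/2
speedup = full / bidi
```

With these numbers `full` is 10^12, `bidi` is 2,000,000, and the ratio is 500,000: halving the exponent, not the work, is what makes bidirectional search pay off.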

Limitations

Requires reverse direction (actions reversible). Unsuitable if multiple goals (hard to search backward). Implementation complex (coordinate two searches).

Practical Use

Excellent when goal known, actions reversible. Route planning (start/destination known). Game solving (initial/goal known).

Comparative Analysis and Complexity

Complexity Comparison

Algorithm      Complete  Optimal  Time              Space
BFS            Yes       Yes*     O(b^d)            O(b^d)
DFS            No        No       O(b^m)            O(b*m)
UCS            Yes       Yes      O(b^(1+⌊C*/ε⌋))   O(b^(1+⌊C*/ε⌋))
IDS            Yes       Yes*     O(b^d)            O(b*d)
Bidirectional  Yes       Yes*     O(b^(d/2))        O(b^(d/2))

*Optimal for unweighted (unit-cost) graphs; bidirectional assumes BFS in both directions.

When to Use Each

BFS: small spaces, need shortest path. DFS: memory constrained, deep solutions. UCS: weighted graphs, need lowest cost. IDS: memory constrained, need optimality. Bidirectional: known goal, reversible actions.

Exponential Growth Challenge

All uninformed methods are O(b^d) or worse. For b=10, d=10: ~10 billion nodes, beyond practical time and memory budgets. Heuristics critical for practical problems.

Open and Closed Lists

Open List (Frontier)

Unexpanded nodes. Nodes waiting exploration. Algorithm selects one, expands. Data structure: queue (BFS), stack (DFS), priority queue (UCS). Order determines search strategy.
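The point that the frontier's data structure alone selects the strategy can be made concrete; a sketch where one skeleton becomes BFS or DFS depending only on how the frontier is popped (names are illustrative):

```python
from collections import deque

def expansion_order(graph, start, use_stack):
    """Expand every reachable node; the frontier discipline alone
    decides whether the traversal is breadth-first or depth-first."""
    frontier = deque([start])
    visited = {start}
    order = []                       # expansion order, for inspection
    while frontier:
        # LIFO pop -> depth-first; FIFO popleft -> breadth-first
        node = frontier.pop() if use_stack else frontier.popleft()
        order.append(node)
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                frontier.append(neighbor)
    return order

graph = {'A': ['B', 'C'], 'B': ['D', 'E'], 'C': ['F'],
         'D': [], 'E': [], 'F': []}
```

The same graph yields level order A, B, C, D, E, F with the queue discipline, but A, C, F, B, E, D with the stack discipline: no other code changed.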

Closed List (Visited)

Expanded nodes. Prevents revisiting. Hash table: fast membership test. Memory-space trade-off: closed list consumes memory, prevents cycles.

Memory Trade-Offs

No closed list: may explore same state infinitely (infinite cycles). Closed list: prevents cycles, memory required. Large problems: closed list memory-prohibitive.

Implicit vs. Explicit Graphs

Implicit: generate states on-demand (no full graph stored). Explicit: graph structure known. Implicit more memory-efficient but slower generation.

Practical Considerations

DFS: small open/closed (only path length). BFS: huge open list (exponential). Hybrid strategies balance memory/speed.

Cycle Detection and Looping

Problem: Infinite Loops

Infinite state spaces or cycles: algorithm loops forever. Example: an action that returns to the parent state. Without detection, DFS can bounce between two states indefinitely.

Solution: Visited Set

Track visited states. Don't re-explore. Prevents cycles. Cost: memory (must store all visited).

Efficient Cycle Detection

Hash table: O(1) membership check. Python set: efficient. Track: which states visited, not full paths (reduces memory).

Parent Pointer Alternative

Only track parent (path reconstruction). Check: is neighbor already on path? Only prevents immediate cycles, not distant ones.
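Both ideas, reconstructing a path from parent pointers (the `path_to` helper assumed in the pseudocode above) and the ancestor-only cycle check, can be sketched briefly (function names are illustrative):

```python
def reconstruct_path(parent, goal):
    """Walk parent pointers from goal back to start.
    `parent` maps each discovered node to its predecessor (start -> None)."""
    path = []
    node = goal
    while node is not None:
        path.append(node)
        node = parent[node]
    return path[::-1]                # reverse so the start comes first

def on_current_path(parent, node, candidate):
    """Immediate-cycle check: is `candidate` an ancestor of `node`?
    Cheaper than a full visited set, but misses cross-branch repeats."""
    while node is not None:
        if node == candidate:
            return True
        node = parent[node]
    return False

# Parent pointers as BFS/DFS would leave them for path A -> B -> D -> E
parent = {'A': None, 'B': 'A', 'D': 'B', 'E': 'D'}
```

The ancestor walk costs O(path length) per check versus O(1) for a hash-set lookup, which is the memory-versus-time trade-off described above.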

Trade-Offs

Full visited: guarantees no revisits, consumes memory. No tracking: fast but infinite loop risk. Practical: use visited set for safety.

Applications and Limitations

Where Uninformed Works

Small search spaces: puzzle solving, graph reachability. Domains with no usable heuristics. Baseline for measuring the gains of informed search.

Practical Limitations

Exponential complexity: infeasible for large spaces. Real-world: usually heuristics available. Uninformed search slow, impractical alone.

Hybrid Approaches

Combine uninformed + informed: use heuristics when available, fall back uninformed if not. Iterative deepening with heuristic limits.

Algorithm Selection

Unknown solution depth: iterative deepening. Known depth, enough memory: BFS. Weighted graph: UCS. Goal known, reversible: bidirectional. Memory-critical: DFS.

Modern Context

Heuristic/informed methods dominate practice. Uninformed foundational, rarely used alone. BFS/DFS building blocks for advanced algorithms.

References

  • Russell, S. J., and Norvig, P. "Artificial Intelligence: A Modern Approach." Prentice Hall, 4th edition, 2020.
  • Cormen, T. H., Leiserson, C. E., Rivest, R. L., and Stein, C. "Introduction to Algorithms." MIT Press, 3rd edition, 2009.
  • Pearl, J. "Heuristics: Intelligent Search Strategies for Computer Problem Solving." Addison-Wesley, 1984.
  • Korf, R. E. "Depth-First Iterative-Deepening: An Optimal Admissible Tree Search." Artificial Intelligence, vol. 27, 1985, pp. 97-109.
  • Pohl, I. "Bi-Directional Search." Machine Intelligence, vol. 6, 1971, pp. 127-140.