Definition and Purpose

Concept

Asymptotic analysis: evaluates algorithm behavior as input size approaches infinity. Abstracts away constant factors and lower-order terms. Focus: growth trends over exact timings.

Goal

Compare efficiency across algorithms independent of hardware and implementation details. Determine scalability, bottlenecks, and feasibility for large inputs.

Scope

Primarily applied to runtime (time complexity) and memory usage (space complexity). Facilitates theoretical and practical algorithm assessment.

Asymptotic Notations

Big O Notation (O)

Upper bound on growth rate. Guarantees algorithm does not exceed given complexity asymptotically. Common for worst-case analysis.

Big Omega Notation (Ω)

Lower bound on growth rate. The algorithm's cost grows at least as fast as the stated bound for large inputs. Common for proving minimum resource requirements.

Big Theta Notation (Θ)

Tight bound. Algorithm growth rate bounded above and below by same function asymptotically.

Little o and Little ω

Strict bounds indicating growth rate strictly less than (o) or strictly greater than (ω) another function.

Common Complexity Classes

Constant Time: O(1)

Execution time independent of input size. Example: accessing array element by index.

Logarithmic Time: O(log n)

Time grows proportionally to logarithm of input size. Example: binary search.

Linear Time: O(n)

Time grows linearly with input size. Example: single pass through array.

Polynomial Time: O(n^k)

Time grows as a fixed power of input size. Example: doubly nested loops yield O(n^2).

Exponential Time: O(2^n)

Time doubles with each additional input element. Common in brute-force combinatorial problems.
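The brute-force pattern can be sketched with a subset-sum check (a hypothetical illustration; the function name and inputs are not from the source). Enumerating every subset of an n-element list examines 2^n candidates:

```python
from itertools import combinations

def subset_sum_exists(nums, target):
    """Brute force: tests all 2^n subsets of nums for one summing to target."""
    n = len(nums)
    for r in range(n + 1):                      # subset sizes 0..n
        for combo in combinations(nums, r):     # all subsets of size r
            if sum(combo) == target:
                return True
    return False
```

Adding one element to `nums` doubles the number of subsets inspected in the worst case, which is the O(2^n) behavior described above.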

Time vs Space Complexity

Time Complexity

Measures number of operations or steps performed related to input size. Focuses on execution speed.

Space Complexity

Measures amount of memory consumed by algorithm relative to input. Includes variables, data structures, stack space.

Trade-offs

Many algorithms trade space for time (e.g., caching) or time for space (e.g., recomputation). Analyzed via asymptotic bounds.
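A minimal sketch of the space-for-time trade, using Fibonacci as a stand-in example (not from the source): the memoized version spends O(n) extra memory on a cache to replace exponential recomputation with linear time.

```python
from functools import lru_cache

def fib_naive(n):
    """Recomputes subproblems: exponential time, no extra memory beyond the stack."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)

@lru_cache(maxsize=None)
def fib_memo(n):
    """Caches results: O(n) time at the cost of O(n) cache memory."""
    if n < 2:
        return n
    return fib_memo(n - 1) + fib_memo(n - 2)
```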

Worst, Average, and Best Case Analysis

Worst Case

Maximum time or space consumption for any input of size n. Ensures upper bound on resource usage.

Average Case

Expected complexity averaged over all inputs of size n, assuming input distribution. More realistic but harder to compute.

Best Case

Minimum time or space usage possible for any input of size n. Less useful for guarantees.

Amortized Analysis

Definition

Average cost per operation over a worst-case sequence of operations; occasional expensive operations are smoothed out by many cheap ones.

Methods

Aggregate method: total cost of the sequence divided by number of operations. Accounting method: assigns prepaid credits to cheap operations to pay for later expensive ones. Potential method: uses a potential function on the data structure's state.
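The aggregate method can be illustrated on a doubling dynamic array. The sketch below (a hypothetical helper, not from the source) counts the total element copies caused by resizing during n appends; the total stays below 2n, so the amortized cost per append is O(1).

```python
def doubling_append_cost(n):
    """Aggregate method: total element copies incurred by n appends
    into an array that doubles its capacity when full."""
    capacity, size, copies = 1, 0, 0
    for _ in range(n):
        if size == capacity:    # array is full: resize by doubling
            copies += size      # copying every existing element
            capacity *= 2
        size += 1               # the append itself
    return copies
```

For n = 8 the resizes copy 1 + 2 + 4 = 7 elements in total, i.e. fewer than 2n, matching the O(1) amortized bound.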

Applications

Dynamic arrays resizing, splay trees, incrementing binary counters.

Techniques for Asymptotic Analysis

Recurrence Relations

Express complexity in recursive form. Solve using substitution, recursion trees, master theorem.

Summations

Convert loops and nested loops to summations. Simplify using formulas or approximations.
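For example, a doubly nested loop of the form "for i = 1..n: for j = 1..i" reduces to a triangular sum:

```latex
\sum_{i=1}^{n} \sum_{j=1}^{i} 1 \;=\; \sum_{i=1}^{n} i \;=\; \frac{n(n+1)}{2} \;=\; \Theta(n^2)
```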

Loop Analysis

Count iterations, multiply by cost per iteration. Consider nested loops and conditional branches.
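A sketch of iteration counting for a nested loop with dependent bounds (hypothetical example). Counting the inner-loop executions directly and comparing against the closed form confirms the Θ(n^2) total:

```python
def count_inner_iterations(n):
    """Counts how many times the inner body runs when j ranges over i+1..n-1."""
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count

# The count equals n(n-1)/2, a Theta(n^2) number of iterations.
```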

Probabilistic Analysis

Used for average case complexity. Model input distributions and expected values.

Limits and Limitations

Ignoring Constants

Asymptotic analysis abstracts away constants and lower-order terms, so it may mislead for small inputs, where those constants dominate actual cost.

Input Distribution Assumptions

Average case depends on accurate input distribution modeling, often unknown in practice.

Hardware and Implementation

Analysis ignores hardware specifics, compiler optimizations, and real-world factors affecting runtime.

Practical Importance in Algorithm Design

Algorithm Selection

Guides choice of algorithm based on input size and resource constraints.

Optimization Focus

Identifies bottlenecks and parts with highest growth rates for targeted improvement.

Scalability Prediction

Estimates feasibility of algorithms for very large inputs before implementation.

Examples of Asymptotic Analysis

Linear Search

Time complexity: O(n). Single pass checking each element.
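A minimal sketch in Python (hypothetical implementation, assuming a 0-indexed list):

```python
def linear_search(arr, target):
    """O(n): examines each element once until a match is found."""
    for i, value in enumerate(arr):
        if value == target:
            return i        # index of first match
    return -1               # not found after a full pass
```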

Binary Search

Time complexity: O(log n). Divides search space by half each step.
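A minimal sketch in Python (hypothetical implementation, assuming the input list is already sorted):

```python
def binary_search(sorted_arr, target):
    """O(log n): halves the search interval on each step."""
    lo, hi = 0, len(sorted_arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_arr[mid] == target:
            return mid
        elif sorted_arr[mid] < target:
            lo = mid + 1     # discard the lower half
        else:
            hi = mid - 1     # discard the upper half
    return -1
```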

Merge Sort

Time complexity: O(n log n). Divide and conquer with recursive merges.
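A minimal sketch in Python (hypothetical implementation): the recursion has about log n levels, and merging does O(n) work per level, giving O(n log n) overall.

```python
def merge_sort(arr):
    """O(n log n): divide in half, sort recursively, merge in linear time."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])                   # append any leftovers
    merged.extend(right[j:])
    return merged
```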

Matrix Multiplication

Naive algorithm: O(n^3). Optimized algorithms can reduce exponent.
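A sketch of the naive triple-loop algorithm (hypothetical implementation, assuming matrices as lists of lists with compatible dimensions):

```python
def matmul_naive(A, B):
    """O(n^3) for n x n matrices: three nested loops over i, k, j."""
    n, m, p = len(A), len(B), len(B[0])
    C = [[0] * p for _ in range(n)]
    for i in range(n):
        for k in range(m):          # i, k, j order improves cache locality
            for j in range(p):
                C[i][j] += A[i][k] * B[k][j]
    return C
```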

Summary Tables

Common Complexity Classes

Complexity Class | Typical Algorithms        | Example
O(1)             | Direct access             | Array indexing
O(log n)         | Divide-and-conquer search | Binary search
O(n)             | Linear traversal          | Linear search
O(n log n)       | Efficient sorting         | Merge sort
O(n^2)           | Nested loops              | Bubble sort

Comparison of Notations

Notation | Meaning          | Bound
O(f(n))  | Upper bound      | ≤ c·f(n) for some c > 0
Ω(f(n))  | Lower bound      | ≥ c·f(n) for some c > 0
Θ(f(n))  | Tight bound      | Both O(f(n)) and Ω(f(n))
o(f(n))  | Strictly smaller | Growth rate < f(n)
ω(f(n))  | Strictly larger  | Growth rate > f(n)

Key Formulas and Expressions

Definition of Big O

f(n) = O(g(n)) if ∃ c > 0, n₀ > 0 such that ∀ n ≥ n₀: f(n) ≤ c·g(n)
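A quick numeric sanity check of this definition, with hypothetical witnesses c = 4 and n₀ = 5 for f(n) = 3n + 5 and g(n) = n (these particular values are illustrative, not from the source):

```python
# f(n) = 3n + 5 is O(n): with c = 4 and n0 = 5, 3n + 5 <= 4n for all n >= 5.
def f(n):
    return 3 * n + 5

c, n0 = 4, 5
assert all(f(n) <= c * n for n in range(n0, 10_000))
```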

Definition of Big Ω

f(n) = Ω(g(n)) if ∃ c > 0, n₀ > 0 such that ∀ n ≥ n₀: f(n) ≥ c·g(n)

Definition of Big Θ

f(n) = Θ(g(n)) if f(n) = O(g(n)) and f(n) = Ω(g(n))

Master Theorem (for recurrences)

T(n) = a·T(n/b) + f(n), where a ≥ 1, b > 1

Case 1: If f(n) = O(n^{log_b a − ε}) for some ε > 0, then T(n) = Θ(n^{log_b a})
Case 2: If f(n) = Θ(n^{log_b a} log^k n) for some k ≥ 0, then T(n) = Θ(n^{log_b a} log^{k+1} n)
Case 3: If f(n) = Ω(n^{log_b a + ε}) for some ε > 0, and a·f(n/b) ≤ c·f(n) for some c < 1, then T(n) = Θ(f(n))
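Worked example (merge sort recurrence):

```latex
T(n) = 2T(n/2) + n, \quad a = 2,\; b = 2,\; \log_b a = 1.
% f(n) = n = \Theta(n^{\log_b a} \log^0 n), so Case 2 applies with k = 0:
T(n) = \Theta(n^{\log_b a} \log^{0+1} n) = \Theta(n \log n).
```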
