Orthogonal Projections - Linear Algebra

Definition and Basic Properties

Orthogonal Projection Concept

Orthogonal projection: mapping a vector onto a subspace along directions orthogonal to that subspace. Result: the closest vector in the subspace to the original vector. Mechanism: decomposition of the vector into components parallel and orthogonal to the subspace.

Mathematical Definition

For vector space V with inner product ⟨·,·⟩ and subspace W ⊆ V, orthogonal projection P: V → W satisfies:

P(v) ∈ W, v - P(v) ∈ W⊥

Uniqueness and Existence

Existence: guaranteed when W is a complete subspace of the inner product space V; in particular, when W is finite-dimensional, or when W is a closed subspace of a Hilbert space. Uniqueness: for every v ∈ V, unique decomposition v = w + w⊥, with w ∈ W, w⊥ ∈ W⊥.

Inner Product Spaces

Definition

Vector space V over ℝ or ℂ with inner product ⟨·,·⟩: V × V → ℝ or ℂ. Properties: linearity in first argument, conjugate symmetry, positive-definiteness.

Induced Norm and Orthogonality

Norm: ∥v∥ = √⟨v,v⟩. Orthogonality: vectors u, v satisfy ⟨u,v⟩ = 0. Orthogonal sets: collection of mutually orthogonal vectors.

Examples

Euclidean space ℝⁿ with dot product. Complex space ℂⁿ with Hermitian inner product. Function spaces with integral inner products.

Orthogonal Complements

Definition

Orthogonal complement W⊥ of subspace W defined as: W⊥ = {v ∈ V | ⟨v, w⟩ = 0 ∀ w ∈ W}.

Properties

W ∩ W⊥ = {0}. V = W ⊕ W⊥ if V is Hilbert space or finite-dimensional inner product space.

Examples

In ℝ³, if W is plane through origin, W⊥ is line perpendicular to that plane. Null space and row space in matrix theory are orthogonal complements.
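The ℝ³ example can be checked numerically. A minimal NumPy sketch (the spanning vectors are arbitrary illustrative choices; the cross product gives a vector spanning the perpendicular line):

```python
import numpy as np

# W = plane through the origin spanned by u1, u2 (arbitrary choices)
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([0.0, 1.0, 1.0])

# W-perp is the line spanned by the normal vector u1 x u2
n = np.cross(u1, u2)

# every vector along n is orthogonal to every vector in W
print(np.dot(n, u1), np.dot(n, u2))  # both 0
```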

Projection Operators

Definition

Projection operator P: V → V satisfies idempotency: P² = P. Orthogonal projection further satisfies self-adjointness: P = P*.

Orthogonality Condition

Operator P projects onto W orthogonally if for all v ∈ V, v - P(v) ∈ W⊥ and P = P*.

Examples

Projection onto coordinate axes in ℝⁿ. Projection onto span of single vector u: P(v) = (⟨v,u⟩/⟨u,u⟩) u.
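The single-vector formula translates directly into code. A sketch in NumPy (the vectors v and u are arbitrary examples):

```python
import numpy as np

def project_onto_vector(v, u):
    """Orthogonal projection of v onto span{u}: (<v,u>/<u,u>) u."""
    return (np.dot(v, u) / np.dot(u, u)) * u

v = np.array([3.0, 4.0])
u = np.array([1.0, 0.0])   # projecting onto the x-axis
p = project_onto_vector(v, u)

# the residual v - p lies in the orthogonal complement of span{u}
residual = v - p
```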

Matrix Representation

Orthogonal Projection Matrix

Given orthonormal basis {u₁, ..., u_k} of W ⊆ ℝⁿ, projection matrix P = U Uᵀ, where U is n×k matrix with columns u_i.

Formula for Single Vector Projection

P = (u uᵀ) / (uᵀ u)

General Subspace Projection

For basis vectors forming matrix A (full rank), projection matrix:

P = A (Aᵀ A)⁻¹ Aᵀ

Matrix Type        Projection Formula        Properties
Single Vector      P = (u uᵀ) / (uᵀ u)       Rank 1, symmetric, idempotent
General Subspace   P = A (Aᵀ A)⁻¹ Aᵀ         Symmetric, idempotent, rank = dim(W)
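The general-subspace formula and its stated properties can be verified numerically. A NumPy sketch, using an arbitrary full-rank basis matrix:

```python
import numpy as np

def projection_matrix(A):
    """P = A (A^T A)^{-1} A^T, projecting onto Col(A); A needs full column rank."""
    return A @ np.linalg.inv(A.T @ A) @ A.T

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])   # full column rank, so A^T A is invertible
P = projection_matrix(A)

symmetric  = np.allclose(P, P.T)        # P = P^T
idempotent = np.allclose(P @ P, P)      # P^2 = P
rank       = np.linalg.matrix_rank(P)   # equals dim(W) = 2
```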

Orthonormal Bases and Gram-Schmidt

Orthonormal Basis Definition

Set of vectors {u₁, ..., u_k} with ⟨u_i, u_j⟩ = δ_ij (Kronecker delta). Simplifies projection computations.

Gram-Schmidt Process

Algorithm to convert linearly independent set {v_i} into orthonormal set {u_i} spanning same subspace.

For i = 1 to k:
    w_i = v_i - ∑_{j=1}^{i-1} ⟨v_i, u_j⟩ u_j
    u_i = w_i / ∥w_i∥
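The Gram-Schmidt recursion can be sketched in a few lines of NumPy (the input vectors are arbitrary linearly independent choices):

```python
import numpy as np

def gram_schmidt(vectors):
    """Classical Gram-Schmidt: orthonormalize a linearly independent list."""
    basis = []
    for v in vectors:
        # subtract the projections onto the already-built orthonormal vectors
        w = v - sum(np.dot(v, u) * u for u in basis)
        basis.append(w / np.linalg.norm(w))   # normalize
    return basis

vs = [np.array([1.0, 1.0, 0.0]), np.array([1.0, 0.0, 1.0])]
u1, u2 = gram_schmidt(vs)   # orthonormal, spanning the same plane
```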

Impact on Projection Matrices

If U has orthonormal columns, projection P = U Uᵀ is simpler and numerically stable.

Least Squares Approximation

Problem Statement

Given inconsistent linear system Ax = b, find x minimizing ∥Ax - b∥². Solution found via orthogonal projection of b onto Col(A).

Normal Equations

Derived from projection condition: Aᵀ A x = Aᵀ b. Unique least squares solution x if Aᵀ A is invertible, i.e., if A has full column rank.

Projection Interpretation

Orthogonal projection P = A (Aᵀ A)⁻¹ Aᵀ projects b onto Col(A). Residual vector r = b - Ax orthogonal to Col(A).
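The normal equations can be solved directly. A NumPy sketch, fitting a line y = c₀ + c₁t through three non-collinear data points (the data are an arbitrary illustrative choice):

```python
import numpy as np

# Inconsistent system: no line passes exactly through (0,6), (1,0), (2,0)
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([6.0, 0.0, 0.0])

# Normal equations: A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)

# the residual b - Ax is orthogonal to Col(A), i.e. A^T r = 0
r = b - A @ x
```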

Key Properties

Idempotency

P² = P. Applying projection twice equals single application.

Self-Adjointness

P = P*. Projection is symmetric (real case) or Hermitian (complex case).

Norm Relations

∥P(v)∥ ≤ ∥v∥ for all v ∈ V, with equality exactly when v ∈ W. Projection never increases length.

Eigenvalues

Eigenvalues of P are 0 or 1 only. 1 corresponds to vectors in W, 0 to vectors in W⊥.
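A small numerical illustration of the eigenvalue claim (the vector u is an arbitrary choice):

```python
import numpy as np

# Rank-1 projection onto span{u} in R^2; eigenvalues should be exactly {0, 1}
u = np.array([[1.0], [2.0]])           # column vector
P = (u @ u.T) / (u.T @ u)              # P = u u^T / (u^T u)

# P is symmetric, so eigvalsh applies; sort ascending
eigvals = np.sort(np.linalg.eigvalsh(P))
```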

Examples of Orthogonal Projections

Projection onto a Line

Vector u ≠ 0 spans line. Projection of v onto line: P(v) = (⟨v,u⟩/⟨u,u⟩) u.

Projection onto a Plane in ℝ³

Given orthonormal basis {u₁,u₂} of plane W, projection: P(v) = ⟨v,u₁⟩ u₁ + ⟨v,u₂⟩ u₂.

Projection in Function Spaces

Projection of function f ∈ L² onto subspace spanned by orthonormal functions {φ_i}: P(f) = ∑ ⟨f, φ_i⟩ φ_i.

Space      Subspace                  Projection Formula
ℝ²         x-axis                    P(x, y) = (x, 0)
ℝ³         Plane spanned by u₁, u₂   P(v) = ⟨v,u₁⟩ u₁ + ⟨v,u₂⟩ u₂
L²[a,b]    Span{φ₁, ..., φ_k}        P(f) = ∑ ⟨f, φ_i⟩ φ_i
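The L² case can be sketched by discretization: projecting f(t) = t onto the first few Fourier sine modes on [0, π], with the inner product ⟨f,g⟩ = ∫ f g dt approximated by a Riemann sum (grid size and number of modes are illustrative choices):

```python
import numpy as np

t  = np.linspace(0.0, np.pi, 2001)
dt = t[1] - t[0]

def inner(f, g):
    """Discrete approximation of the L^2 inner product on [0, pi]."""
    return np.sum(f * g) * dt

f = t.copy()  # function to project

# orthonormal sine modes on [0, pi]: sqrt(2/pi) sin(k t)
phis = [np.sqrt(2.0 / np.pi) * np.sin(k * t) for k in range(1, 6)]

# P(f) = sum_k <f, phi_k> phi_k
Pf = sum(inner(f, phi) * phi for phi in phis)

# L^2 norm of the projection error; shrinks as more modes are added
err = np.sqrt(inner(f - Pf, f - Pf))
```

The residual f − P(f) is (up to discretization error) orthogonal to every mode, as the projection theorem requires.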

Applications

Data Science and Statistics

Linear regression: least squares solutions use orthogonal projections. Dimensionality reduction via PCA involves projections onto principal subspaces.

Signal Processing

Noise filtering: projection of signals onto subspaces spanned by noise-free components. Orthogonal projections used in Fourier analysis and filtering.

Computer Graphics

Projection of 3D points onto 2D planes for rendering. Orthogonal (parallel) projection simplifies calculations and preserves parallel lines and relative lengths along directions parallel to the image plane, unlike perspective projection.

Quantum Mechanics

Quantum states modeled in Hilbert spaces. Measurement operators correspond to orthogonal projections onto eigenspaces of observables.

Generalizations to Hilbert Spaces

Infinite-Dimensional Spaces

Hilbert spaces: complete inner product spaces. Orthogonal projections extend naturally with closed subspaces.

Projection Theorem

For closed subspace W of Hilbert space H, every v ∈ H decomposes uniquely as v = w + w⊥, with w ∈ W, w⊥ ∈ W⊥.

Bounded Linear Operators

Orthogonal projection P is bounded, linear, self-adjoint, idempotent operator on H. Plays fundamental role in spectral theory.

Computational Methods

QR Decomposition

Factorize A = Q R with Q having orthonormal columns. Projection: P = Q Qᵀ. Numerically stable, efficient for large systems.
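A NumPy sketch of the QR route, checked against the normal-equation formula (the matrix A is an arbitrary full-rank example):

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0]])

Q, R = np.linalg.qr(A)   # reduced QR: Q is 3x2 with orthonormal columns
P_qr = Q @ Q.T           # projection onto Col(A)

# agrees with P = A (A^T A)^{-1} A^T, but avoids forming A^T A
P_ne = A @ np.linalg.inv(A.T @ A) @ A.T
```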

SVD-Based Projection

Singular value decomposition A = U Σ Vᵀ. Projection onto Col(A) via U Uᵀ. Useful for rank-deficient matrices.
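A sketch of the SVD route for a rank-deficient matrix, where the normal-equation formula would fail because Aᵀ A is singular (the matrix and tolerance are illustrative choices):

```python
import numpy as np

# Rank-deficient A: second column is a multiple of the first
A = np.array([[1.0, 2.0],
              [2.0, 4.0],
              [3.0, 6.0]])

U, s, Vt = np.linalg.svd(A)
r  = int(np.sum(s > 1e-10))   # numerical rank (here 1)
Ur = U[:, :r]                 # orthonormal basis of Col(A)
P  = Ur @ Ur.T                # projection onto Col(A)

# sanity check: P fixes the columns of A
```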

Numerical Stability and Efficiency

Classical Gram-Schmidt is prone to loss of orthogonality in floating-point arithmetic; the modified Gram-Schmidt variant is preferred. QR and SVD offer better stability for computing projections.
