Definition of Orthogonality

Concept Overview

Orthogonality: two vectors are orthogonal if their inner product equals zero. Symbolically, for vectors u, v in an inner product space V, u ⊥ v if <u, v> = 0. Orthogonality generalizes perpendicularity from Euclidean geometry.

Mathematical Expression

The condition <u, v> = 0 defines orthogonality. A zero inner product means neither vector has a nonzero component along the other: each projects to zero on the other's direction.

Geometric Interpretation

Orthogonal vectors form a 90° angle in Euclidean space. In abstract inner product spaces, where no geometric angle is drawn, orthogonality still means the vectors carry no component along each other under the inner product.
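The angle interpretation follows from cos θ = <u, v> / (‖u‖ ‖v‖). A minimal sketch in plain Python (the helper name angle_deg is our own, not from any library):

```python
import math

def angle_deg(u, v):
    """Angle between two nonzero vectors in degrees, via cos θ = <u, v> / (||u|| ||v||)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (nu * nv)))

print(angle_deg((1, 0), (0, 1)))  # orthogonal pair -> 90.0
print(angle_deg((1, 0), (1, 1)))  # not orthogonal: approximately 45 degrees
```

A zero inner product forces cos θ = 0, hence θ = 90°, regardless of the vectors' lengths.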

Inner Product Spaces

Definition

Vector space V over field F (here F = ℝ or ℂ) with inner product <·,·> : V×V → F satisfying linearity in the first argument, conjugate symmetry, and positive-definiteness.

Properties

Linearity in the first argument; conjugate symmetry: <u, v> equals the complex conjugate of <v, u>; positive-definiteness: <v, v> ≥ 0, with equality iff v = 0.

Examples

Euclidean inner product on ℝⁿ: <x, y> = Σ xᵢyᵢ. Complex inner product on ℂⁿ: <x, y> = Σ xᵢ·conj(yᵢ).
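Both inner products above can be sketched directly in Python; the function names inner_real and inner_complex are illustrative, not library API:

```python
def inner_real(x, y):
    """Euclidean inner product on R^n: sum of x_i * y_i."""
    return sum(a * b for a, b in zip(x, y))

def inner_complex(x, y):
    """Standard inner product on C^n: sum of x_i * conj(y_i)."""
    return sum(a * b.conjugate() for a, b in zip(x, y))

print(inner_real((1, 2), (3, 4)))        # 1*3 + 2*4 = 11
print(inner_complex((1j, 1), (1j, 1)))   # |i|^2 + |1|^2 = (2+0j)
```

Note the conjugate on the second argument in the complex case; it is what makes <v, v> real and nonnegative.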

Orthogonal Vectors

Definition

Vectors u, v ∈ V are orthogonal if <u, v> = 0. Orthogonality is symmetric: if u ⊥ v, then v ⊥ u.

Zero Vector

The zero vector is orthogonal to every vector: <0, v> = 0 for all v ∈ V.

Examples in ℝ² and ℝ³

Standard basis vectors e₁ = (1, 0), e₂ = (0, 1) in ℝ² are orthogonal: <e₁, e₂> = 0. In ℝ³, u = (1, 0, 1) and v = (0, 1, 0) are orthogonal since <u, v> = 1·0 + 0·1 + 1·0 = 0.

Orthonormal Sets and Bases

Orthonormal Set

Set {v₁,..., vₖ} is orthonormal if each vector has norm one and vectors are mutually orthogonal: <vᵢ, vⱼ>=δᵢⱼ.

Orthonormal Basis

An orthonormal basis is a basis of V whose vectors form an orthonormal set. It simplifies computation: the coordinate of v along uᵢ is just the inner product <v, uᵢ>.
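The coordinate rule v = Σ <v, uᵢ> uᵢ can be checked numerically. A small sketch with a hand-picked orthonormal basis of ℝ² (the helper coords is our own name):

```python
import math

def coords(v, basis):
    """Coordinates of v in an orthonormal basis: c_i = <v, u_i>."""
    return [sum(a * b for a, b in zip(v, u)) for u in basis]

s = 1 / math.sqrt(2)
basis = [(s, s), (s, -s)]          # an orthonormal basis of R^2
v = (3.0, 1.0)
c = coords(v, basis)
# reconstruct v as sum_i c_i u_i
recon = [sum(ci * ui[k] for ci, ui in zip(c, basis)) for k in range(2)]
print(recon)  # approximately [3.0, 1.0]
```

No linear system has to be solved: orthonormality turns coordinate extraction into k inner products.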

Advantages

Numerical stability, simplification of linear transformations, easy computation of projections and decompositions.

Orthogonal Complement

Definition

Given subspace W ⊆ V, orthogonal complement W⊥ = {v ∈ V : <v, w> = 0 ∀ w ∈ W}.

Properties

W⊥ is a subspace of V; if V is a finite-dimensional inner product space, then V = W ⊕ W⊥.

Dimension Relation

dim(W) + dim(W⊥) = dim(V) for finite-dimensional V.

Projection Theorem

Orthogonal Projection

For v ∈ V and subspace W, unique decomposition v = w + w⊥ with w ∈ W, w⊥ ∈ W⊥.

Formula

Projection P_W(v) = Σ <v, uᵢ>uᵢ for orthonormal basis {uᵢ} of W.

Best Approximation

Projection minimizes distance ‖v - w‖ over all w ∈ W.

Given v ∈ V and an orthonormal basis {u₁, ..., uₖ} of W:

P_W(v) = Σ_{i=1}^k <v, uᵢ> uᵢ
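The projection formula translates directly into code. A minimal sketch, assuming the basis passed in is already orthonormal (the names dot and project are illustrative):

```python
def dot(x, y):
    return sum(a * b for a, b in zip(x, y))

def project(v, onb):
    """Orthogonal projection of v onto span(onb), where onb is an
    orthonormal basis of W: P_W(v) = sum_i <v, u_i> u_i."""
    p = [0.0] * len(v)
    for u in onb:
        c = dot(v, u)                       # coordinate <v, u_i>
        p = [pi + c * ui for pi, ui in zip(p, u)]
    return p

# project (1, 2, 3) onto the xy-plane, W = span{e1, e2}
print(project((1, 2, 3), [(1, 0, 0), (0, 1, 0)]))  # [1.0, 2.0, 0.0]
```

The residual v − P_W(v) = (0, 0, 3) lies in W⊥, matching the decomposition v = w + w⊥.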

Gram-Schmidt Process

Purpose

Construct orthonormal set from linearly independent vectors.

Algorithm Steps

Iteratively subtract projections on previous vectors, then normalize.

Formula

Given {v₁, ..., vₙ} linearly independent:

u₁ = v₁ / ‖v₁‖

For k = 2 to n:
  w_k = v_k - Σ_{j=1}^{k-1} <v_k, u_j> u_j
  u_k = w_k / ‖w_k‖
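The steps above can be sketched as a short function (classical Gram-Schmidt; for large or nearly dependent inputs the modified variant is numerically safer):

```python
import math

def gram_schmidt(vectors):
    """Orthonormalize linearly independent vectors via classical Gram-Schmidt."""
    onb = []
    for v in vectors:
        w = list(v)
        for u in onb:
            c = sum(a * b for a, b in zip(v, u))       # <v_k, u_j>
            w = [wi - c * ui for wi, ui in zip(w, u)]  # subtract projection
        norm = math.sqrt(sum(wi * wi for wi in w))
        onb.append([wi / norm for wi in w])
    return onb

u1, u2 = gram_schmidt([(1, 1), (1, 0)])
print(u1)  # approximately (1/sqrt 2, 1/sqrt 2)
print(u2)  # approximately (1/sqrt 2, -1/sqrt 2)
```

This reproduces the worked ℝ² example later in the section.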

Properties of Orthogonality

Pythagorean Theorem

If u ⊥ v, then ‖u + v‖² = ‖u‖² + ‖v‖².
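A quick numerical check of the Pythagorean identity for an orthogonal pair (the 3-4-5 example is our own choice):

```python
import math

u, v = (3.0, 0.0), (0.0, 4.0)            # orthogonal: <u, v> = 0
norm = lambda x: math.sqrt(sum(a * a for a in x))
s = [a + b for a, b in zip(u, v)]
print(norm(s) ** 2)                      # 25.0
print(norm(u) ** 2 + norm(v) ** 2)       # 9 + 16 = 25.0
```

The cross term 2<u, v> in the expansion of ‖u + v‖² vanishes exactly when u ⊥ v.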

Orthogonality and Linear Independence

Orthogonal nonzero vectors are linearly independent: if Σ cᵢvᵢ = 0, taking the inner product with vⱼ kills every term but one, giving cⱼ‖vⱼ‖² = 0 and hence cⱼ = 0 for all j.

Orthogonal Decomposition

Every vector can be decomposed uniquely into components in subspace and its orthogonal complement.

| Property | Statement |
|---|---|
| Symmetry | If u ⊥ v, then v ⊥ u |
| Linearity | u ⊥ (v + w) if u ⊥ v and u ⊥ w |
| Norm additivity | ‖u + v‖² = ‖u‖² + ‖v‖² if u ⊥ v |

Applications of Orthogonality

Signal Processing

Orthogonal signals reduce interference; basis for Fourier transforms.
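The discrete Fourier basis illustrates this: sampled complex exponentials at distinct frequencies are orthogonal under the complex inner product. A sketch with N = 8 samples (helper names are our own):

```python
import cmath

N = 8

def dft_basis(k):
    """Sampled complex exponential e_k[n] = exp(2*pi*i*k*n / N)."""
    return [cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)]

def inner(x, y):
    """Complex inner product: sum of x_n * conj(y_n)."""
    return sum(a * b.conjugate() for a, b in zip(x, y))

print(abs(inner(dft_basis(1), dft_basis(3))))  # ~0: distinct frequencies are orthogonal
print(abs(inner(dft_basis(2), dft_basis(2))))  # ~N = 8: <e_k, e_k> = N
```

Dividing each basis vector by √N would make the set orthonormal, which is why DFT coefficients can be read off as inner products.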

Data Compression

Orthonormal bases enable efficient representation (PCA, SVD).

Numerical Methods

Stability and accuracy in solving linear systems and eigenvalue problems.

Quantum Mechanics

Orthogonal state vectors represent perfectly distinguishable (mutually exclusive) measurement outcomes.

Orthogonal Matrices

Definition

Real square matrix Q with QᵀQ = QQᵀ = I, equivalently Q⁻¹ = Qᵀ.

Properties

Columns (and rows) form orthonormal sets. Orthogonal transformations preserve lengths and angles.

Examples

Rotation matrices in ℝ², reflection matrices.

| Matrix | Description |
|---|---|
| Q = [[cosθ, -sinθ], [sinθ, cosθ]] | Rotation matrix, orthogonal with determinant 1 |
| Q = [[1, 0], [0, -1]] | Reflection matrix, orthogonal with determinant -1 |

Principal Component Analysis (PCA)

Overview

Statistical technique: orthogonal transformation to convert correlated variables into uncorrelated principal components.

Role of Orthogonality

Principal components are orthogonal vectors maximizing variance.

Computation

Eigenvectors of covariance matrix form orthonormal basis for data representation.
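For 2-D data the covariance matrix is symmetric 2×2, so its eigenvectors can be written in closed form and their orthogonality checked directly. A sketch on a small made-up data set (values are hypothetical, for illustration only):

```python
import math

# toy 2-D data set (hypothetical values)
data = [(2.5, 2.4), (0.5, 0.7), (2.2, 2.9), (1.9, 2.2), (3.1, 3.0),
        (2.3, 2.7), (2.0, 1.6), (1.0, 1.1), (1.5, 1.6), (1.1, 0.9)]

n = len(data)
mx = sum(x for x, _ in data) / n
my = sum(y for _, y in data) / n
a = sum((x - mx) ** 2 for x, _ in data) / (n - 1)          # var(x)
c = sum((y - my) ** 2 for _, y in data) / (n - 1)          # var(y)
b = sum((x - mx) * (y - my) for x, y in data) / (n - 1)    # cov(x, y)

# eigen-decomposition of the symmetric covariance matrix [[a, b], [b, c]]
d = math.sqrt((a - c) ** 2 + 4 * b ** 2)
lam1, lam2 = (a + c + d) / 2, (a + c - d) / 2              # eigenvalues, lam1 >= lam2
v1, v2 = (b, lam1 - a), (b, lam2 - a)                      # unnormalized eigenvectors (b != 0)

print(sum(p * q for p, q in zip(v1, v2)))  # ~0: principal directions are orthogonal
```

The eigenvector for the larger eigenvalue is the first principal component; the zero inner product confirms the components are mutually orthogonal, which is what makes the transformed variables uncorrelated.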

Examples and Exercises

Example 1: Orthogonality in ℝ³

Check if u = (1, 2, 3) and v = (3, -6, 1) are orthogonal.

Compute <u, v> = 1·3 + 2·(-6) + 3·1 = 3 - 12 + 3 = -6 ≠ 0.

Conclusion: u and v are not orthogonal.

Example 2: Gram-Schmidt on ℝ²

Orthonormalize vectors v₁ = (1, 1), v₂ = (1, 0).

u₁ = v₁ / ‖v₁‖ = (1, 1)/√2 = (1/√2, 1/√2)

proj_{u₁}(v₂) = <v₂, u₁> u₁ = (1·1/√2 + 0·1/√2)(1/√2, 1/√2) = (1/√2)(1/√2, 1/√2) = (1/2, 1/2)

w = v₂ - proj_{u₁}(v₂) = (1, 0) - (1/2, 1/2) = (1/2, -1/2)

u₂ = w / ‖w‖ = (1/2, -1/2) / √((1/2)² + (-1/2)²) = (1/2, -1/2) / (1/√2) = (√2/2, -√2/2)

Exercise

Prove that the set { (1,0,0), (0,1,0), (0,0,1) } is orthonormal.
