Definition and Basic Properties

Definition

A symmetric matrix A ∈ ℝⁿˣⁿ satisfies A = Aᵀ; equivalently, aᵢⱼ = aⱼᵢ for all i, j. Symmetry is defined only for square matrices.

Examples

Typical symmetric matrices: covariance matrices, adjacency matrices of undirected graphs, Hessian matrices in optimization.

Basic Properties

Properties: entries mirror each other across the main diagonal; all eigenvalues are real; symmetric matrices are closed under addition and scalar multiplication (they form a subspace of ℝⁿˣⁿ); the product of two symmetric matrices is symmetric if and only if the matrices commute.
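The commutation caveat is easy to check numerically; a minimal pure-Python sketch (the helper names `matmul` and `transpose` are just for this illustration):

```python
def matmul(A, B):
    """Product of two square matrices given as lists of rows."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def transpose(A):
    return [list(row) for row in zip(*A)]

A = [[1, 2], [2, 3]]   # symmetric
B = [[0, 1], [1, 0]]   # symmetric
AB = matmul(A, B)
print(AB == transpose(AB))           # False: AB is not symmetric...
print(matmul(A, B) == matmul(B, A))  # False: ...because A and B do not commute
```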

Notation and Terminology

Notation: A = Aᵀ. Sym(ℝⁿˣⁿ) denotes the space of symmetric n×n matrices; its dimension is n(n+1)/2, since one triangle of entries plus the diagonal determines the matrix.

Eigenvalues of Symmetric Matrices

Real Eigenvalues

Theorem: all eigenvalues of a real symmetric matrix are real. Standard proof via self-adjointness: if Av = λv for a (possibly complex) v ≠ 0, then λ⟨v, v⟩ = ⟨Av, v⟩ = ⟨v, Av⟩ = λ̄⟨v, v⟩, so λ = λ̄.
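In the 2×2 case the theorem is visible directly: the characteristic polynomial of [[a, b], [b, c]] has discriminant (a − c)² + 4b², a sum of squares, so its roots are always real. A small sketch (the helper name `sym2_eigvals` is illustrative):

```python
import math

def sym2_eigvals(a, b, c):
    """Eigenvalues of [[a, b], [b, c]]: roots of λ² - (a+c)λ + (ac - b²)."""
    # The discriminant (a - c)² + 4b² is a sum of squares, hence never
    # negative, so both roots are real -- the 2×2 case of the theorem.
    d = math.sqrt((a - c) ** 2 + 4 * b ** 2)
    return ((a + c - d) / 2, (a + c + d) / 2)

print(sym2_eigvals(2, 1, 3))  # both roots real: (5 ± √5)/2
```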

Multiplicity

Algebraic multiplicity equals geometric multiplicity for symmetric matrices. No defective eigenvalues.

Bounds and Estimates

Gershgorin circle theorem: every eigenvalue lies in a disc centered at some diagonal entry aᵢᵢ with radius Rᵢ = Σⱼ≠ᵢ |aᵢⱼ|, the absolute off-diagonal row sum. The extreme eigenvalues are the min and max of the Rayleigh quotient xᵀAx / xᵀx.
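For real symmetric matrices the eigenvalues are real, so the Gershgorin discs collapse to intervals on the real line; a sketch (function name illustrative):

```python
def gershgorin_intervals(A):
    """For real symmetric A, each eigenvalue lies in some interval
    [a_ii - R_i, a_ii + R_i], with R_i the absolute off-diagonal row sum."""
    n = len(A)
    intervals = []
    for i in range(n):
        r = sum(abs(A[i][j]) for j in range(n) if j != i)
        intervals.append((A[i][i] - r, A[i][i] + r))
    return intervals

print(gershgorin_intervals([[4, 1], [1, 4]]))  # [(3, 5), (3, 5)]; the eigenvalues are 3 and 5
```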

Eigenvalue Distribution

The spectrum of a symmetric matrix need not be symmetric about zero; that property holds only in special cases, such as adjacency matrices of bipartite graphs (skew-symmetric matrices, by contrast, have purely imaginary eigenvalues). The eigenvalue distribution is relevant in stability and structural analysis.

Eigenvectors and Orthogonality

Orthogonality of Eigenvectors

Eigenvectors corresponding to distinct eigenvalues are orthogonal: if Au = λu and Av = μv with λ ≠ μ, then λ⟨u, v⟩ = ⟨Au, v⟩ = ⟨u, Av⟩ = μ⟨u, v⟩, forcing ⟨u, v⟩ = 0.

Orthonormal Bases

Existence of an orthonormal eigenbasis: symmetric matrices are diagonalizable by orthogonal matrices, and the basis spans ℝⁿ.
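For a 2×2 symmetric matrix the orthonormal eigenbasis can be written in closed form; a sketch assuming the off-diagonal entry b is nonzero (helper name illustrative):

```python
import math

def sym2_eigs(a, b, c):
    """Eigenpairs (λ, unit eigenvector) of [[a, b], [b, c]]; assumes b != 0."""
    d = math.sqrt((a - c) ** 2 + 4 * b ** 2)
    pairs = []
    for lam in ((a + c - d) / 2, (a + c + d) / 2):
        v = (b, lam - a)                      # spans the kernel of A - λI
        norm = math.hypot(v[0], v[1])
        pairs.append((lam, (v[0] / norm, v[1] / norm)))
    return pairs

(l1, v1), (l2, v2) = sym2_eigs(4, 1, 4)
print(v1[0] * v2[0] + v1[1] * v2[1])  # 0.0: distinct eigenvalues give orthogonal eigenvectors
```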

Degenerate Eigenspaces

For repeated eigenvalues, eigenvectors can be chosen orthonormal within each eigenspace via Gram-Schmidt. Because A is symmetric, the orthogonal complement of an eigenspace is again A-invariant, which is what drives the induction in the spectral theorem.

Geometric Interpretation

Eigenvectors define principal axes; important in quadratic forms and transformations preserving inner product structure.

Spectral Theorem

Statement

Every real symmetric matrix A can be decomposed as A = QΛQᵀ, where Q is orthogonal and Λ is diagonal with the eigenvalues of A on its diagonal.

Implications

Diagonalization by orthogonal similarity. Simplifies matrix functions and powers. Foundation for principal component analysis.
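As an illustration of how the decomposition simplifies powers: writing A = Σᵢ λᵢ vᵢvᵢᵀ gives A^k = Σᵢ λᵢ^k vᵢvᵢᵀ. A 2×2 sketch assuming the off-diagonal entry is nonzero (function name illustrative):

```python
import math

def spectral_power(a, b, c, k):
    """k-th power of [[a, b], [b, c]] via A = Σ λᵢ vᵢvᵢᵀ; assumes b != 0."""
    d = math.sqrt((a - c) ** 2 + 4 * b ** 2)
    result = [[0.0, 0.0], [0.0, 0.0]]
    for lam in ((a + c - d) / 2, (a + c + d) / 2):
        v = (b, lam - a)                        # eigenvector for λ (unnormalized)
        n2 = v[0] ** 2 + v[1] ** 2              # normalize the projector vvᵀ by ‖v‖²
        for i in range(2):
            for j in range(2):
                result[i][j] += lam ** k * v[i] * v[j] / n2
    return result

print(spectral_power(2, 1, 3, 2))  # ≈ [[5, 5], [5, 10]], i.e. A·A for A = [[2, 1], [1, 3]]
```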

Proof Outline

Construct an eigenvector basis by induction on n: a compactness argument (maximizing the Rayleigh quotient on the unit sphere) yields one unit eigenvector; restrict A to its orthogonal complement, which is invariant by self-adjointness, and repeat.

Extensions

Applies to complex Hermitian matrices with unitary diagonalization. Generalizes to infinite-dimensional Hilbert spaces.

Significance in Linear Algebra

Enables spectral decomposition, simplifies quadratic forms, key in numerical linear algebra.

Diagonalization and Orthogonal Diagonalization

Diagonalization Process

Procedure: find the eigenvalues and eigenvectors; form Q with orthonormal eigenvectors as columns; then QᵀAQ = Λ.

Orthogonal Diagonalization

Special case for symmetric matrices; diagonalization achieved by orthogonal similarity transform preserving norms.

Algorithmic Steps

1. Compute eigenvalues (solve characteristic polynomial). 2. Compute eigenvectors. 3. Orthonormalize eigenvectors. 4. Assemble Q and Λ.
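The four steps above, specialized to a 2×2 symmetric matrix (closed-form eigenvalues stand in for step 1; assumes b ≠ 0; names illustrative):

```python
import math

def diagonalize(a, b, c):
    """Steps 1-4 for A = [[a, b], [b, c]]; assumes b != 0. Returns (Q, eigenvalues)."""
    d = math.sqrt((a - c) ** 2 + 4 * b ** 2)
    lams = ((a + c - d) / 2, (a + c + d) / 2)   # step 1: eigenvalues
    cols = []
    for lam in lams:                            # steps 2-3: unit eigenvectors
        v = (b, lam - a)
        norm = math.hypot(v[0], v[1])
        cols.append((v[0] / norm, v[1] / norm))
    Q = [[cols[0][0], cols[1][0]], [cols[0][1], cols[1][1]]]  # step 4: eigenvectors as columns
    return Q, lams

A = [[2, 1], [1, 3]]
Q, lams = diagonalize(2, 1, 3)
# QᵀAQ should equal diag(λ₁, λ₂); check the (0, 0) entry:
e00 = sum(Q[i][0] * A[i][j] * Q[j][0] for i in range(2) for j in range(2))
print(abs(e00 - lams[0]) < 1e-12)  # True
```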

Limitations

Non-symmetric matrices may fail to be diagonalizable or may require complex similarity transforms. The symmetric case guarantees real orthogonal diagonalization, which simplifies computations.

Positive Definiteness and Semi-Definiteness

Definitions

Positive definite: xᵀAx > 0 for all x ≠ 0. Positive semi-definite: xᵀAx ≥ 0 for all x. Negative definite and indefinite matrices are defined analogously.

Characterizations

Eigenvalue criteria: all eigenvalues positive ⇔ positive definite; all nonnegative ⇔ positive semi-definite. A principal-minors test is also used; note that semi-definiteness requires checking all principal minors, not only the leading ones.

Applications

Used in optimization (convexity), statistics (covariance matrices), mechanics (strain energy).

Tests for Definiteness

Sylvester's criterion, the standard test: a symmetric matrix is positive definite ⇔ all of its leading principal minors are positive.
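A minimal sketch of Sylvester's criterion using Laplace-expansion determinants (fine for the small matrices in this section; helper names illustrative):

```python
def leading_minor(A, k):
    """Determinant of the top-left k×k block, by Laplace expansion (small matrices only)."""
    M = [row[:k] for row in A[:k]]
    if k == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * leading_minor([row[:j] + row[j + 1:] for row in M[1:]], k - 1)
               for j in range(k))

def is_positive_definite(A):
    """Sylvester's criterion for a symmetric matrix: all leading principal minors > 0."""
    return all(leading_minor(A, k) > 0 for k in range(1, len(A) + 1))

print(is_positive_definite([[6, 2], [2, 5]]))  # True: minors 6 and 26
print(is_positive_definite([[1, 2], [2, 1]]))  # False: det = -3
```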

Matrix Decompositions Related to Symmetric Matrices

Cholesky Decomposition

For positive definite symmetric matrices: A = LLᵀ, with L lower triangular. Efficient for numerical solution of linear systems.
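A minimal pure-Python Cholesky sketch (no pivoting, no failure handling for indefinite input):

```python
import math

def cholesky(A):
    """A = L·Lᵀ for symmetric positive definite A; returns lower-triangular L.
    Minimal sketch: assumes the input really is positive definite."""
    n = len(A)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                L[i][j] = math.sqrt(A[i][i] - s)   # fails (math domain) if not PD
            else:
                L[i][j] = (A[i][j] - s) / L[j][j]
    return L

L = cholesky([[6, 2], [2, 5]])
# Reconstruct: L·Lᵀ should give back [[6, 2], [2, 5]]
R = [[sum(L[i][k] * L[j][k] for k in range(2)) for j in range(2)] for i in range(2)]
print(all(abs(R[i][j] - [[6, 2], [2, 5]][i][j]) < 1e-9 for i in range(2) for j in range(2)))  # True
```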

Spectral Decomposition

A = QΛQᵀ, where Λ is the diagonal eigenvalue matrix and Q the orthogonal eigenvector matrix. The basis for defining matrix functions.

Singular Value Decomposition (SVD)

General decomposition: A = UΣVᵀ. For symmetric A, the singular values are the absolute values |λᵢ| of the eigenvalues, and U and V share the eigenvectors as columns up to a sign flip on columns belonging to negative eigenvalues (so U = V exactly when A is positive semi-definite).
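The relation σᵢ = |λᵢ| can be checked on a small indefinite example, using the fact that the singular values of A are the square roots of the eigenvalues of AᵀA = A² (helper name illustrative):

```python
import math

def sym2_eigvals(a, b, c):
    """Eigenvalues of [[a, b], [b, c]], ascending."""
    d = math.sqrt((a - c) ** 2 + 4 * b ** 2)
    return ((a + c - d) / 2, (a + c + d) / 2)

# A = [[1, 2], [2, 1]] is indefinite: eigenvalues -1 and 3.
lo, hi = sym2_eigvals(1, 2, 1)
# Its singular values are the square roots of the eigenvalues of
# AᵀA = A² = [[5, 4], [4, 5]], and they match |λᵢ| = 1 and 3.
s_lo, s_hi = sym2_eigvals(5, 4, 5)
print(sorted([abs(lo), abs(hi)]) == [math.sqrt(s_lo), math.sqrt(s_hi)])  # True
```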

Relation Between Decompositions

Cholesky requires positive definiteness; spectral and SVD apply more generally. Choice depends on application and matrix properties.

Applications of Symmetric Matrices

Principal Component Analysis (PCA)

Covariance matrices symmetric; eigenvectors define principal components. Dimensionality reduction, data compression.
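A minimal 2-D PCA sketch: build the sample covariance matrix and take its leading eigenvector as the first principal direction (function name illustrative; assumes the data are not all identical):

```python
import math

def principal_axis(points):
    """Leading eigenvector of the 2×2 sample covariance -- the first PCA direction."""
    n = len(points)
    mx = sum(x for x, _ in points) / n
    my = sum(y for _, y in points) / n
    sxx = sum((x - mx) ** 2 for x, _ in points) / (n - 1)
    syy = sum((y - my) ** 2 for _, y in points) / (n - 1)
    sxy = sum((x - mx) * (y - my) for x, y in points) / (n - 1)
    d = math.sqrt((sxx - syy) ** 2 + 4 * sxy ** 2)
    lam = (sxx + syy + d) / 2                  # largest eigenvalue of the covariance
    v = (sxy, lam - sxx) if sxy else (1.0, 0.0)
    norm = math.hypot(v[0], v[1])
    return (v[0] / norm, v[1] / norm)

# Points along the line y = x should give the direction (1/√2, 1/√2).
print(principal_axis([(0, 0), (1, 1), (2, 2), (3, 3)]))
```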

Physics and Engineering

Stress/strain tensors symmetric; modal analysis uses spectral theorem; vibrational modes from symmetric matrices.

Graph Theory

Adjacency and Laplacian matrices symmetric for undirected graphs; eigenvalues encode connectivity and clustering.
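A sketch constructing the (symmetric) graph Laplacian L = D − A and checking two of its signature properties (function name illustrative):

```python
def laplacian(n, edges):
    """Graph Laplacian L = D - A of an undirected graph on n vertices.
    Symmetric by construction, since each edge updates (u, v) and (v, u) alike."""
    L = [[0] * n for _ in range(n)]
    for u, v in edges:
        L[u][u] += 1
        L[v][v] += 1
        L[u][v] -= 1
        L[v][u] -= 1
    return L

# Path graph 0 -- 1 -- 2; its spectrum is {0, 1, 3}, and the eigenvalue 0
# (eigenvector (1, 1, 1)) reflects that the graph is connected.
L = laplacian(3, [(0, 1), (1, 2)])
print(all(L[i][j] == L[j][i] for i in range(3) for j in range(3)))  # True: symmetric
print([sum(row) for row in L])  # [0, 0, 0]: rows sum to zero, so (1, 1, 1) is in the kernel
```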

Optimization

Quadratic forms with symmetric Hessians; convexity determined by definiteness of symmetric matrices.

Numerical Methods

Efficient algorithms exploit symmetry; storage optimization; stability of numerical routines.

Computational Aspects

Numerical Stability

Symmetry is exploited to improve stability; orthogonal transformations preserve norms, which limits the growth of rounding errors.

Algorithms

QR algorithm adapted for symmetric matrices; divide-and-conquer; Lanczos method for large sparse matrices.
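Lanczos itself is more than a few lines, but power iteration, the simplest of these iterative eigen-solvers and the idea Lanczos builds on, fits in one function; a sketch:

```python
import math

def power_iteration(A, iters=100):
    """Dominant eigenpair of a symmetric matrix by repeated multiplication.
    A sketch only: Lanczos and QR refine this idea with far better convergence."""
    n = len(A)
    v = [1.0] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient of the converged vector estimates the eigenvalue.
    lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

lam, v = power_iteration([[4, 1], [1, 4]])
print(round(lam, 6))  # 5.0, the dominant eigenvalue
```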

Complexity

Diagonalization O(n³) for dense matrices; faster for structured/sparse symmetric matrices.

Software Implementations

Libraries: LAPACK, Eigen, MATLAB functions specialized for symmetric matrices.

Examples and Illustrations

Example 1: 2×2 Symmetric Matrix

A = [[2, 1], [1, 3]]

Characteristic polynomial: λ² − 5λ + 5. Eigenvalues: λ₁ = (5 − √5)/2 ≈ 1.38, λ₂ = (5 + √5)/2 ≈ 3.62. The eigenvectors are orthogonal and diagonalize A.

Example 2: Covariance Matrix

Data matrix X with column-centered observations; the covariance matrix Σ = (1/(n−1)) XᵀX is symmetric positive semi-definite. The PCA basis consists of the eigenvectors of Σ.

Matrix A          | Eigenvalues | Orthogonal eigenvectors
[[4, 1], [1, 4]]  | 5, 3        | [1/√2, 1/√2], [-1/√2, 1/√2]

Example 3: Positive Definite Matrix

A = [[6, 2], [2, 5]] is positive definite, verified both by its eigenvalues λ = (11 ± √17)/2 ≈ 3.44 and 7.56 (both positive) and by Sylvester's criterion (leading minors 6 and 26).

Common Misconceptions

Symmetry Implies Diagonal

False: symmetric matrices not necessarily diagonal; diagonalization requires eigenbasis.

All Matrices Have Real Eigenvalues

False: symmetry (or, in the complex case, the Hermitian property) guarantees real eigenvalues, but general real matrices may have complex-conjugate eigenvalue pairs, e.g. a rotation matrix.

Product of Symmetric Matrices is Symmetric

False unless the matrices commute: (AB)ᵀ = BᵀAᵀ = BA, so AB is symmetric ⇔ AB = BA.

Symmetry Implies Positive Definiteness

False: symmetric matrices can be indefinite or negative definite.

Summary and Key Takeaways

Symmetric matrices are characterized by A = Aᵀ. Real eigenvalues are guaranteed; orthogonal diagonalization is possible via the spectral theorem; eigenvectors corresponding to distinct eigenvalues are orthogonal; positive definiteness is equivalent to eigenvalue positivity. Symmetric matrices play a central role in theory and applications from PCA to physics, and computational algorithms efficiently exploit their symmetry.
