Introduction to Eigenvalues
Context
Eigenvalues arise in the study of linear transformations represented by matrices. They quantify how vectors transform under these mappings.
Historical Background
Eigenvalue concepts originated in the 19th century; they were developed by mathematicians such as Augustin-Louis Cauchy and James Joseph Sylvester in connection with quadratic forms and systems of linear equations.
Significance
Used extensively in physics, engineering, computer science, and applied mathematics to analyze stability, oscillations, and system behavior.
Formal Definition
Definition Statement
Given a square matrix \( A \in \mathbb{C}^{n \times n} \), a scalar \( \lambda \in \mathbb{C} \) is an eigenvalue of \( A \) if there exists a nonzero vector \( \mathbf{v} \in \mathbb{C}^n \) such that:
\[ A \mathbf{v} = \lambda \mathbf{v} \]
Eigenvector Explanation
The vector \( \mathbf{v} \) is called an eigenvector corresponding to eigenvalue \( \lambda \). It is nonzero and satisfies the above equation.
Linear Transformation Perspective
Interpreted as: \( \mathbf{v} \) is scaled by \( \lambda \) under transformation \( A \), direction preserved (except possibly sign or complex phase).
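This scaling behavior is easy to check numerically. A minimal NumPy sketch (the matrix here is an illustrative choice, not one from the text):

```python
import numpy as np

# Illustrative 2x2 matrix
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns eigenvalues and column eigenvectors
eigenvalues, eigenvectors = np.linalg.eig(A)

# Each eigenpair satisfies A v = lambda v
for i in range(len(eigenvalues)):
    lam = eigenvalues[i]
    v = eigenvectors[:, i]
    assert np.allclose(A @ v, lam * v)
```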
Characteristic Polynomial
Definition
The characteristic polynomial \( p(\lambda) \) of matrix \( A \) is:
\[ p(\lambda) = \det(A - \lambda I) \]
Role in Eigenvalue Computation
Eigenvalues are roots of \( p(\lambda) \): values of \( \lambda \) solving \( p(\lambda) = 0 \).
Properties
Degree equals \( n \), coefficients depend on traces and minors of \( A \), polynomial is monic.
| Term | Description |
|---|---|
| \( \lambda^n \) | Leading term, monic polynomial |
| Coefficients | Functions of matrix trace, determinant |
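For small matrices, the characteristic polynomial and its roots can be computed directly. A NumPy sketch (illustrative matrix; `np.poly` returns the monic coefficients, highest degree first):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Coefficients of the monic characteristic polynomial:
# lambda^2 - 4*lambda + 3 -> [1, -4, 3]
coeffs = np.poly(A)

# Eigenvalues are the roots of the characteristic polynomial
eigenvalues = np.roots(coeffs)
```

For larger matrices this route is numerically fragile; dedicated routines such as `np.linalg.eig` are preferred in practice.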
Eigenvector Relationship
Definition Recap
Nonzero vector \( \mathbf{v} \) satisfying \( A \mathbf{v} = \lambda \mathbf{v} \).
Geometric Interpretation
Eigenvectors define invariant directions under linear transformation \( A \).
Geometric Multiplicity
The dimension of the eigenspace associated with eigenvalue \( \lambda \) is its geometric multiplicity; it is always at most the algebraic multiplicity.
Linear Independence
Eigenvectors corresponding to distinct eigenvalues are linearly independent.
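The independence claim can be probed numerically: for a matrix with distinct eigenvalues, stacking the eigenvectors as columns yields a full-rank matrix. A sketch with an illustrative upper-triangular matrix (its eigenvalues are its diagonal entries):

```python
import numpy as np

# Upper-triangular matrix with distinct eigenvalues 1, 2, 3
A = np.diag([1.0, 2.0, 3.0]) + np.triu(np.ones((3, 3)), k=1)

eigenvalues, V = np.linalg.eig(A)

# Distinct eigenvalues -> linearly independent eigenvectors,
# so the eigenvector matrix V has full rank
rank = np.linalg.matrix_rank(V)
```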
Matrix Diagonalization
Definition
Matrix \( A \) is diagonalizable if there exists invertible \( P \) such that:
\[ P^{-1} A P = D \]
where \( D \) is diagonal with the eigenvalues of \( A \) on the diagonal.
Conditions
Diagonalizability requires \( n \) linearly independent eigenvectors.
Significance
Simplifies matrix functions and powers, facilitates spectral analysis.
| Matrix Type | Diagonalizable? |
|---|---|
| Symmetric | Always diagonalizable |
| Defective | Not diagonalizable |
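A NumPy sketch of diagonalization and its payoff for matrix powers (illustrative symmetric matrix, so a full eigenvector basis is guaranteed):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, P = np.linalg.eig(A)   # columns of P are eigenvectors
D = np.diag(eigenvalues)

# P^{-1} A P = D
assert np.allclose(np.linalg.inv(P) @ A @ P, D)

# Diagonalization makes powers cheap: A^5 = P D^5 P^{-1}
A5 = P @ np.diag(eigenvalues**5) @ np.linalg.inv(P)
assert np.allclose(A5, np.linalg.matrix_power(A, 5))
```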
Spectral Theorem
Statement
Every real symmetric matrix \( A \in \mathbb{R}^{n \times n} \) can be diagonalized by an orthogonal matrix \( Q \):
\[ Q^T A Q = D \]
Implications
Eigenvalues are real, eigenvectors form orthonormal basis.
Applications
Principal component analysis (PCA), quadratic forms, vibration analysis.
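A sketch of the spectral theorem in NumPy, using `np.linalg.eigh` for symmetric matrices (illustrative matrix):

```python
import numpy as np

# Symmetric matrix: eigh returns real eigenvalues (ascending)
# and orthonormal eigenvectors
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

eigenvalues, Q = np.linalg.eigh(A)

# Q is orthogonal: Q^T Q = I
assert np.allclose(Q.T @ Q, np.eye(2))

# Q^T A Q = D with the real eigenvalues on the diagonal
assert np.allclose(Q.T @ A @ Q, np.diag(eigenvalues))
```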
Properties of Eigenvalues
Dependence on Matrix
Eigenvalues depend on entries of \( A \), invariant under similarity transformations.
Sum and Product
The sum of the eigenvalues (counted with algebraic multiplicity) equals the trace of \( A \); their product equals the determinant.
Algebraic vs Geometric Multiplicity
Algebraic multiplicity: root multiplicity of characteristic polynomial. Geometric multiplicity: dimension of eigenspace.
Complex Eigenvalues
Real matrices may have complex eigenvalues; for a real matrix, complex eigenvalues occur in conjugate pairs.
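The trace and determinant identities are easy to verify numerically; a sketch on an illustrative random matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

eigenvalues = np.linalg.eigvals(A)

# Sum of eigenvalues = trace, product = determinant
# (eigenvalues may be complex, but for a real matrix the
# imaginary parts cancel in conjugate pairs)
assert np.isclose(eigenvalues.sum().real, np.trace(A))
assert np.isclose(np.prod(eigenvalues).real, np.linalg.det(A))
```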
Computational Methods
Characteristic Polynomial Roots
Direct solving for roots of \( \det(A - \lambda I) = 0 \) for small \( n \).
Power Iteration
Iterative method to approximate dominant eigenvalue and eigenvector.
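The method can be sketched in a few lines of Python (NumPy-based; the matrix, seed, and iteration count are illustrative, and a unique dominant eigenvalue is assumed):

```python
import numpy as np

def power_iteration(A, num_iters=100, seed=0):
    """Approximate the dominant eigenpair of A.

    Assumes A has a unique eigenvalue of largest magnitude.
    """
    rng = np.random.default_rng(seed)
    b = rng.standard_normal(A.shape[0])
    for _ in range(num_iters):
        b = A @ b                     # apply the matrix
        b = b / np.linalg.norm(b)     # renormalize
    # Rayleigh quotient (b is unit length) estimates the eigenvalue
    return b @ A @ b, b

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, v = power_iteration(A)   # dominant eigenvalue of A is 3
```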
QR Algorithm
Numerical method for all eigenvalues using QR decomposition iterations.
Jacobi Method
Specifically for symmetric matrices to find eigenvalues and eigenvectors.
Algorithm: Power Iteration
Input: matrix A, initial vector b_0, number of iterations k
For i = 1 to k:
    b_i = A b_{i-1}
    b_i = b_i / ||b_i||
Eigenvalue approximation: λ ≈ (b_k)^T A b_k
Eigenvector approximation: b_k
Applications of Eigenvalues
Stability Analysis
Eigenvalues determine stability of equilibrium points in differential equations.
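For a linear system \( \dot{\mathbf{x}} = A \mathbf{x} \), the origin is asymptotically stable exactly when every eigenvalue of \( A \) has negative real part. A sketch with an illustrative system matrix:

```python
import numpy as np

# Illustrative system matrix with eigenvalues -1 +/- 2i:
# negative real parts, so the origin is asymptotically stable
A = np.array([[-1.0,  2.0],
              [-2.0, -1.0]])

eigenvalues = np.linalg.eigvals(A)
is_stable = bool(np.all(eigenvalues.real < 0))
```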
Quantum Mechanics
Eigenvalues represent measurable physical quantities (energy levels).
Principal Component Analysis (PCA)
Eigenvalues quantify variance explained by components in data reduction.
Vibrations and Modal Analysis
Eigenvalues correspond to natural frequencies of mechanical systems.
Markov Chains
Eigenvalues characterize long-term behavior and convergence rates.
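A sketch of extracting the stationary distribution of an illustrative two-state chain: it is the eigenvector of the column-stochastic transition matrix for eigenvalue 1, normalized to sum to one:

```python
import numpy as np

# Illustrative column-stochastic transition matrix
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

eigenvalues, V = np.linalg.eig(P)

# The eigenvector for eigenvalue 1, rescaled to a probability
# vector, is the stationary distribution
i = np.argmin(np.abs(eigenvalues - 1.0))
pi = V[:, i].real
pi = pi / pi.sum()

assert np.allclose(P @ pi, pi)
```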
Examples
Example 1: 2x2 Matrix
\[ A = \begin{pmatrix} 2 & 1 \\ 1 & 2 \end{pmatrix} \]
Characteristic polynomial:
\[ p(\lambda) = \det(A - \lambda I) = (2-\lambda)^2 - 1 = \lambda^2 - 4\lambda + 3 \]
Eigenvalues: \( \lambda = 1, 3 \)
Eigenvectors
For \( \lambda = 3 \):
\[ (A - 3I)\mathbf{v} = \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \]
Eigenvector: \( \mathbf{v} = k \, [1, 1]^T \) for any \( k \neq 0 \)
Example 2: Complex Eigenvalues
\[ A = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix} \]
Characteristic polynomial:
\[ p(\lambda) = \lambda^2 + 1 \]
Eigenvalues: \( \lambda = i, -i \) (complex conjugates)
Common Misconceptions
Eigenvalue vs Eigenvector
Eigenvalue is scalar, eigenvector is vector; they are related but distinct.
All Matrices Have Real Eigenvalues
False: non-symmetric real matrices can have complex eigenvalues.
Every Matrix is Diagonalizable
No: defective matrices lack full eigenvector basis.
Eigenvectors Must Be Unit Vectors
Incorrect: eigenvectors can be any nonzero scalar multiple.
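This scaling freedom is easy to confirm: if \( \mathbf{v} \) is an eigenvector, so is \( k\mathbf{v} \) for any nonzero \( k \). A NumPy sketch using an illustrative 2x2 matrix whose eigenvalue 3 has eigenvector \( [1, 1]^T \):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# [1, 1] is an eigenvector for eigenvalue 3; so is any
# nonzero scalar multiple of it
v = np.array([1.0, 1.0])
for k in (0.5, -2.0, 10.0):
    assert np.allclose(A @ (k * v), 3.0 * (k * v))
```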
Summary
Eigenvalues: scalars \( \lambda \) satisfying \( A \mathbf{v} = \lambda \mathbf{v} \), where \( \mathbf{v} \neq 0 \). Computed as roots of characteristic polynomial. Essential in matrix diagonalization and spectral analysis. Applications span science and engineering domains. Understanding eigenvalues facilitates analysis of linear transformations, stability, and system behavior.