Introduction
Matrix operations constitute the core manipulations of linear algebra, enabling the solution of linear systems, geometric transformations, and data representation. They are essential to applied mathematics, physics, computer science, and engineering. Core operations include arithmetic, transposition, inversion, and determinant calculation.
"Matrices are the language of linear transformations. Mastering their operations unlocks the power of multidimensional analysis." -- Gilbert Strang
Matrix Addition and Subtraction
Definition
Element-wise operation: sum or difference of corresponding entries in matrices of identical dimensions (m × n).
Properties
Commutative: A + B = B + A. Associative: (A + B) + C = A + (B + C). Existence of zero matrix as additive identity.
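The element-wise rule can be sketched in pure Python with matrices as lists of rows (a minimal illustration; the name `mat_add` is chosen here):

```python
def mat_add(A, B):
    """Element-wise sum of two matrices of identical dimensions."""
    if len(A) != len(B) or len(A[0]) != len(B[0]):
        raise ValueError("matrices must have identical dimensions")
    # Sum corresponding entries row by row.
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]
```

Subtraction follows the same pattern with `a - b`, and commutativity holds because entry-wise addition of numbers commutes.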
Constraints
Both matrices must be of the same order; the operation is undefined for unequal dimensions.
Example
A = [1 2; 3 4], B = [5 6; 7 8]
A + B = [6 8; 10 12]
Scalar Multiplication
Definition
Multiplying each matrix element by a scalar value k ∈ ℝ or ℂ.
Properties
Distributive over addition: k(A + B) = kA + kB. Associative with scalar multiplication: (kl)A = k(lA).
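Scaling every entry by k is a one-line sketch in pure Python (the name `scal_mul` is chosen here for illustration):

```python
def scal_mul(k, A):
    """Multiply every entry of matrix A (list of rows) by scalar k."""
    return [[k * a for a in row] for row in A]
```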
Applications
Scaling transformations, normalization, adjusting matrix magnitude.
Example
k = 3, A = [1 2; 3 4]
kA = [3 6; 9 12]
Matrix Multiplication
Definition
Each entry of the product is the dot product of a row of the first matrix (m × p) with a column of the second matrix (p × n), producing an (m × n) matrix.
Properties
Associative: (AB)C = A(BC). Distributive: A(B + C) = AB + AC. Non-commutative generally: AB ≠ BA.
Existence Conditions
Number of columns in A equals number of rows in B.
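The row-by-column rule can be written directly as a triple loop over sums (a minimal pure-Python sketch; `mat_mul` is a name chosen here):

```python
def mat_mul(A, B):
    """Multiply an (m x p) matrix A by a (p x n) matrix B."""
    if len(A[0]) != len(B):
        raise ValueError("columns of A must equal rows of B")
    # Entry (i, j) is the dot product of row i of A and column j of B.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))]
            for i in range(len(A))]
```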
Example
A = [1 2; 3 4], B = [5 6; 7 8]
AB = [1×5+2×7 1×6+2×8; 3×5+4×7 3×6+4×8] = [19 22; 43 50]
Computational Complexity
Standard algorithm: O(n³). Optimized algorithms reduce the exponent: Strassen's algorithm runs in approximately O(n^2.81), and Coppersmith–Winograd-style algorithms in roughly O(n^2.37).
Transpose
Definition
Matrix obtained by swapping rows and columns: if A = [aᵢⱼ], then Aᵀ = [aⱼᵢ].
Properties
(Aᵀ)ᵀ = A. (A + B)ᵀ = Aᵀ + Bᵀ. (AB)ᵀ = BᵀAᵀ.
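Swapping rows and columns is concise in Python because `zip(*A)` yields the columns of A (a minimal sketch; `transpose` is a name chosen here):

```python
def transpose(A):
    """Return A^T: column j of A becomes row j of the result."""
    return [list(col) for col in zip(*A)]
```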
Types
Symmetric matrix: A = Aᵀ. Skew-symmetric: Aᵀ = -A.
Example
A = [1 2 3; 4 5 6]
Aᵀ = [1 4; 2 5; 3 6]
Determinant
Definition
Scalar value summarizing matrix properties: volume scaling, invertibility. Defined only for square matrices (n × n).
Properties
det(AB) = det(A) × det(B). det(Aᵀ) = det(A). det(I) = 1. If det(A) ≠ 0, A invertible.
Calculation Methods
Expansion by minors, row reduction, Laplace expansion, LU decomposition.
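Laplace expansion along the first row translates directly into a recursive function (a minimal sketch; it is O(n!) and is shown for clarity, not efficiency — row reduction or LU is used in practice):

```python
def det(A):
    """Determinant via Laplace expansion along the first row."""
    n = len(A)
    if n == 1:
        return A[0][0]
    total = 0
    for j in range(n):
        # Minor: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in A[1:]]
        total += (-1) ** j * A[0][j] * det(minor)
    return total
```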
Example
A = [1 2; 3 4]
det(A) = 1×4 − 2×3 = −2
| Matrix Order | Determinant Formula |
|---|---|
| 2 × 2 | ad - bc |
| 3 × 3 | a(ei − fh) − b(di − fg) + c(dh − eg) |
Inverse
Definition
Matrix A⁻¹ such that A × A⁻¹ = I, exists only if det(A) ≠ 0.
Properties
(A⁻¹)⁻¹ = A. (AB)⁻¹ = B⁻¹A⁻¹. (Aᵀ)⁻¹ = (A⁻¹)ᵀ.
Computation Methods
Gaussian elimination, adjoint and determinant, LU decomposition.
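For the 2 × 2 case the adjugate-and-determinant method gives a closed form, sketched here with exact `Fraction` arithmetic (the name `inv2` is chosen for illustration):

```python
from fractions import Fraction

def inv2(A):
    """Inverse of a 2x2 matrix via the adjugate formula:
    inv([[a, b], [c, d]]) = (1/det) * [[d, -b], [-c, a]]."""
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        raise ValueError("singular matrix: det(A) = 0")
    f = Fraction(1, det)
    return [[f * d, -f * b], [-f * c, f * a]]
```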
Example
A = [4 7; 2 6]
det(A) = 10 ≠ 0
A⁻¹ = (1/10) [6 -7; -2 4]
Identity Matrix
Definition
Square matrix with ones on diagonal, zeros elsewhere. Denoted Iₙ for order n.
Properties
Multiplicative identity: A × I = I × A = A. Invertible with I⁻¹ = I.
Role in Linear Algebra
Basis for inversion, eigenvalue problems, and system solutions.
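Constructing Iₙ is a one-liner in pure Python (a minimal sketch; `identity` is a name chosen here):

```python
def identity(n):
    """I_n: ones on the main diagonal, zeros elsewhere."""
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]
```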
Example
I₃ = [1 0 0; 0 1 0; 0 0 1]
Rank and Nullity
Rank
Maximum number of linearly independent rows or columns; equals the dimension of the matrix's column space.
Nullity
Dimension of the null space (solutions to Ax = 0). Satisfies the Rank-Nullity Theorem: rank(A) + nullity(A) = n, where n is the number of columns.
Computation
Row-echelon form, reduced row-echelon form, singular value decomposition.
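Counting pivots in a row-echelon form gives the rank; a sketch using exact `Fraction` arithmetic to avoid floating-point issues (the name `rank` is chosen here):

```python
from fractions import Fraction

def rank(A):
    """Rank = number of pivots after forward elimination."""
    M = [[Fraction(x) for x in row] for row in A]
    rows, cols = len(M), len(M[0])
    r = 0  # next pivot row
    for c in range(cols):
        # Find a non-zero pivot in column c at or below row r.
        pivot = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        # Eliminate entries below the pivot.
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[r])]
        r += 1
    return r
```

The nullity then follows from the Rank-Nullity Theorem as `n - rank(A)`.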
Example
A = [1 2 3; 2 4 6]
rank(A) = 1, nullity(A) = 2
Special Matrices
Diagonal Matrices
Non-zero elements only on the main diagonal. Simplifies multiplication and inversion.
Symmetric Matrices
A = Aᵀ. Real symmetric matrices have real eigenvalues; important in quadratic forms.
Orthogonal Matrices
A⁻¹ = Aᵀ. Preserve length and angles (rotations, reflections).
Triangular Matrices
Upper or lower triangular; key in LU decomposition.
| Matrix Type | Key Property |
|---|---|
| Diagonal | Only diagonal elements non-zero |
| Symmetric | Equal to its transpose |
| Orthogonal | Inverse equals transpose |
| Triangular | Zero entries above or below diagonal |
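The defining property of an orthogonal matrix, A⁻¹ = Aᵀ, is equivalent to AᵀA = I, which can be checked directly (a minimal sketch for square matrices with exact entries; `is_orthogonal` is a name chosen here):

```python
def is_orthogonal(A):
    """Check A^T A = I, the defining property of an orthogonal matrix."""
    n = len(A)
    At = [list(col) for col in zip(*A)]  # transpose
    prod = [[sum(At[i][k] * A[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]
    I = [[1 if i == j else 0 for j in range(n)] for i in range(n)]
    return prod == I
```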
Applications of Matrix Operations
Solving Linear Systems
Use inverses or row operations to solve Ax = b.
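The row-operations route can be sketched as Gaussian elimination with back substitution, using exact `Fraction` arithmetic (a minimal illustration assuming A is square and non-singular; the name `solve` is chosen here):

```python
from fractions import Fraction

def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    # Build the augmented matrix [A | b].
    M = [[Fraction(x) for x in row] + [Fraction(bi)]
         for row, bi in zip(A, b)]
    for c in range(n):
        # Partial pivoting: bring the largest entry in column c to row c.
        pivot = max(range(c, n), key=lambda i: abs(M[i][c]))
        if M[pivot][c] == 0:
            raise ValueError("matrix is singular")
        M[c], M[pivot] = M[pivot], M[c]
        for i in range(c + 1, n):
            f = M[i][c] / M[c][c]
            M[i] = [x - f * y for x, y in zip(M[i], M[c])]
    # Back substitution on the upper triangular system.
    x = [Fraction(0)] * n
    for i in range(n - 1, -1, -1):
        x[i] = (M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x
```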
Computer Graphics
Transformations: rotation, scaling, translation via matrix multiplication.
Data Science
Principal Component Analysis, covariance matrices, dimensionality reduction.
Engineering
Circuit analysis, structural mechanics, control systems represented by matrices.
Quantum Mechanics
State vectors and operators expressed as matrices; unitary transformations.
Computational Algorithms
Gaussian Elimination
Row operations to reduce matrix to row-echelon form for solving and inversion.
LU Decomposition
Factorizes matrix into lower and upper triangular matrices; efficient for multiple solves.
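The Doolittle variant (unit lower-triangular L, no pivoting) can be sketched with exact `Fraction` arithmetic, assuming no zero pivots arise (the name `lu` is chosen here for illustration):

```python
from fractions import Fraction

def lu(A):
    """Doolittle LU factorization: A = L U with unit diagonal in L."""
    n = len(A)
    L = [[Fraction(int(i == j)) for j in range(n)] for i in range(n)]
    U = [[Fraction(0)] * n for _ in range(n)]
    for i in range(n):
        # Row i of U from what elimination has not yet removed.
        for j in range(i, n):
            U[i][j] = Fraction(A[i][j]) - sum(L[i][k] * U[k][j] for k in range(i))
        # Column i of L: the multipliers used to eliminate below the pivot.
        for j in range(i + 1, n):
            L[j][i] = (Fraction(A[j][i]) - sum(L[j][k] * U[k][i] for k in range(i))) / U[i][i]
    return L, U
```

Once L and U are known, each new right-hand side costs only two triangular solves, which is why the factorization pays off for repeated solves.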
Strassen’s Algorithm
Divide-and-conquer approach reducing multiplication complexity below O(n³).
QR Decomposition
Orthogonal-triangular factorization, useful in least squares problems.
Singular Value Decomposition (SVD)
Decomposes a matrix into two orthogonal matrices and a diagonal matrix of singular values; important for computing rank, nullity, and the pseudoinverse.
Algorithm: Gaussian Elimination
Input: Matrix A (m × n)
Step 1: Use row swaps to position a non-zero pivot.
Step 2: Eliminate entries below the pivot via row operations.
Step 3: Repeat for each pivot column.
Output: Upper triangular matrix U (and lower triangular matrix L, if performing LU decomposition).
References
- Strang, G., Introduction to Linear Algebra, Wellesley-Cambridge Press, 2016, pp. 45-120.
- Horn, R. A., Johnson, C. R., Matrix Analysis, Cambridge University Press, 2013, pp. 1-450.
- Lay, D. C., Linear Algebra and Its Applications, Pearson, 2015, pp. 75-210.
- Trefethen, L. N., Bau III, D., Numerical Linear Algebra, SIAM, 1997, pp. 50-200.
- Axler, S., Linear Algebra Done Right, Springer, 2015, pp. 30-160.