Definition of Vector Spaces
Concept
Vector space: set V over a field F with two operations: vector addition and scalar multiplication. Elements: vectors. Scalars: elements of field F. Structure: algebraic system satisfying specific axioms.
Field
Field F: set with two operations (addition, multiplication), obeying commutativity, associativity, distributivity, identity elements, and inverses (additive for every element, multiplicative for every nonzero element). Common fields: real numbers ℝ, complex numbers ℂ, rational numbers ℚ.
Operations
Vector addition: binary operation V × V → V. Scalar multiplication: operation F × V → V. Both operations compatible with field structure.
Formal Definition
Vector space (V, +, ·): set V with addition + and scalar multiplication ·, where for all u, v, w ∈ V and a, b ∈ F, axioms hold (see next section).
Vector Addition
Operation
Addition: combines two vectors u, v ∈ V to produce another vector u + v. Closure: u + v ∈ V.
Properties
Commutativity: u + v = v + u. Associativity: (u + v) + w = u + (v + w). Existence of zero vector 0 ∈ V: u + 0 = u. Inverse: for u, ∃ -u ∈ V such that u + (-u) = 0.
Geometric Interpretation
Addition: vector displacement or translation. Resultant vector points from origin to combined effect.
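The addition properties above can be spot-checked numerically. A minimal pure-Python sketch in ℝ² (the helper name `vadd` is hypothetical, not from the text):

```python
# Componentwise vector addition in R^2, a minimal sketch.
def vadd(u, v):
    """Add two vectors represented as tuples, componentwise."""
    return tuple(a + b for a, b in zip(u, v))

u, v = (1.0, 2.0), (3.0, -1.0)
zero = (0.0, 0.0)
neg_u = tuple(-a for a in u)            # additive inverse of u

assert vadd(u, v) == vadd(v, u)         # commutativity: u + v = v + u
assert vadd(u, zero) == u               # zero vector is the identity
assert vadd(u, neg_u) == zero           # u + (-u) = 0
```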
Scalar Multiplication
Definition
Scalar multiplication: scalar a ∈ F multiplies vector v ∈ V, producing av ∈ V. Closed operation.
Properties
Distributivity over vector addition: a(u + v) = au + av. Distributivity over field addition: (a + b)v = av + bv. Associativity: a(bv) = (ab)v. Identity: 1v = v.
Geometric Interpretation
Scaling: changes vector magnitude, possibly reverses direction if scalar negative.
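The four scalar-multiplication properties can be verified the same way; a sketch in ℝ² with integral-valued floats so the equalities are exact (`smul`, `vadd` are hypothetical helper names):

```python
# Scalar multiplication in R^2 and its compatibility axioms.
def smul(a, v):
    """Scale each component of v by the scalar a."""
    return tuple(a * x for x in v)

def vadd(u, v):
    return tuple(x + y for x, y in zip(u, v))

u, v = (1.0, 2.0), (3.0, -1.0)
a, b = 2.0, -3.0

assert smul(a, vadd(u, v)) == vadd(smul(a, u), smul(a, v))  # a(u+v) = au+av
assert smul(a + b, v) == vadd(smul(a, v), smul(b, v))       # (a+b)v = av+bv
assert smul(a, smul(b, v)) == smul(a * b, v)                # a(bv) = (ab)v
assert smul(1.0, v) == v                                    # 1v = v
```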
Axioms of Vector Spaces
Addition Axioms
1. Closure: u + v ∈ V
2. Commutativity: u + v = v + u
3. Associativity: (u + v) + w = u + (v + w)
4. Identity: ∃ 0 ∈ V, u + 0 = u
5. Inverse: ∃ -u ∈ V, u + (-u) = 0
Scalar Multiplication Axioms
6. Closure: av ∈ V
7. Distributivity (vectors): a(u + v) = au + av
8. Distributivity (scalars): (a + b)v = av + bv
9. Associativity: a(bv) = (ab)v
10. Identity: 1v = v
Summary
These 10 axioms define vector spaces over field F. Any set with operations satisfying axioms is vector space.
Subspaces
Definition
Subspace W ⊆ V: nonempty subset closed under vector addition and scalar multiplication. W itself a vector space under inherited operations.
Conditions
1. 0 ∈ W. 2. u, v ∈ W ⇒ u + v ∈ W. 3. a ∈ F, u ∈ W ⇒ a u ∈ W.
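The three conditions can be checked on a concrete candidate subspace. A sketch for the (hypothetical example) line W = {(x, 2x)} ⊂ ℝ², which passes all three:

```python
# Checking the three subspace conditions for W = {(x, 2x)} in R^2.
def in_W(v):
    """Membership test for the line y = 2x through the origin."""
    x, y = v
    return y == 2 * x

assert in_W((0, 0))                      # 1. zero vector lies in W
u, v = (1, 2), (3, 6)                    # two vectors in W
assert in_W((u[0] + v[0], u[1] + v[1]))  # 2. closed under addition
a = 5
assert in_W((a * u[0], a * u[1]))        # 3. closed under scalar multiplication
```

A line not through the origin, e.g. y = 2x + 1, fails condition 1 and so is not a subspace.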
Examples
Zero subspace {0}, entire space V, solution sets of homogeneous linear systems.
Intersection
Intersection of subspaces is a subspace.
Linear Combinations and Span
Linear Combination
Expression: a1 v1 + a2 v2 + ... + an vn, ai ∈ F, vi ∈ V. Represents vector constructed by scaling and adding vectors.
Span
Span(S): set of all linear combinations of subset S ⊆ V. Span(S) is smallest subspace containing S.
Properties
Span is subspace. If span(S) = V, then S generates entire space.
Notation
Span(S) = { v ∈ V : v = ∑ ai vi, vi ∈ S, ai ∈ F }.
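Deciding whether a vector w lies in span{v1, v2} ⊂ ℝ² means solving a1·v1 + a2·v2 = w for the coefficients. A sketch via Cramer's rule for the invertible 2×2 case (function name `coeffs_in_span` is hypothetical; the general case uses Gaussian elimination):

```python
# Solve a1*v1 + a2*v2 = w in R^2 by Cramer's rule (exact arithmetic).
from fractions import Fraction

def coeffs_in_span(v1, v2, w):
    """Return (a1, a2) with a1*v1 + a2*v2 = w, assuming v1, v2 independent."""
    det = v1[0] * v2[1] - v2[0] * v1[1]
    if det == 0:
        raise ValueError("v1, v2 dependent; use Gaussian elimination instead")
    a1 = Fraction(w[0] * v2[1] - v2[0] * w[1], det)
    a2 = Fraction(v1[0] * w[1] - w[0] * v1[1], det)
    return a1, a2

a1, a2 = coeffs_in_span((1, 0), (1, 1), (3, 2))
# Verify the linear combination reproduces w = (3, 2):
assert (a1 * 1 + a2 * 1, a1 * 0 + a2 * 1) == (3, 2)
```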
Linear Independence
Definition
Set S = {v1, ..., vn} ⊆ V is linearly independent if ∑ ai vi = 0 ⇒ all ai = 0.
Dependence
If ∃ ai not all zero with ∑ ai vi = 0, then S is linearly dependent.
Significance
Independence: vectors contribute uniquely to span, no redundancy.
Testing
Place the vectors as rows (or columns) of a matrix and apply Gaussian elimination: the set is independent iff the rank equals the number of vectors, i.e. the homogeneous system ∑ ai vi = 0 has only the trivial solution.
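The elimination test can be sketched in pure Python with exact rational arithmetic (helper names `rank` and `independent` are hypothetical):

```python
# Independence test: row-reduce the matrix whose rows are the vectors;
# the set is independent iff rank == number of vectors.
from fractions import Fraction

def rank(rows):
    m = [[Fraction(x) for x in r] for r in rows]
    r = 0                                  # next pivot row
    for c in range(len(m[0])):             # sweep columns left to right
        piv = next((i for i in range(r, len(m)) if m[i][c] != 0), None)
        if piv is None:
            continue                       # no pivot in this column
        m[r], m[piv] = m[piv], m[r]        # swap pivot row into place
        for i in range(len(m)):
            if i != r and m[i][c] != 0:    # eliminate column c elsewhere
                f = m[i][c] / m[r][c]
                m[i] = [a - f * b for a, b in zip(m[i], m[r])]
        r += 1
    return r

def independent(vectors):
    return rank(vectors) == len(vectors)

assert independent([(1, 0, 0), (0, 1, 0), (0, 0, 1)])   # standard basis
assert not independent([(1, 2), (2, 4)])                # second = 2 * first
```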
Basis and Dimension
Basis
Basis B ⊆ V: linearly independent set spanning V. Every vector in V uniquely expressed as linear combination of basis vectors.
Dimension
Dimension dim(V): cardinality of any basis of V. Finite or infinite.
Properties
All bases have same cardinality. Dimension zero: only zero vector.
Examples
ℝ^n has standard basis {e1, ..., en}, dim = n.
| Vector Space | Basis | Dimension |
|---|---|---|
| ℝ² | {(1,0), (0,1)} | 2 |
| P₃(ℝ) (polynomials degree ≤ 3) | {1, x, x², x³} | 4 |
| Matrices M₂×₂(ℝ) | {E₁₁, E₁₂, E₂₁, E₂₂} | 4 |
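Since dimension is the number of vectors in any basis, the table's entries reduce to counting the listed bases; a small bookkeeping sketch (the dictionary keys are just labels for the spaces above):

```python
# Dimension = cardinality of a basis; spot-check the table by counting.
bases = {
    "R^2":      [(1, 0), (0, 1)],                 # standard basis
    "P_3(R)":   ["1", "x", "x^2", "x^3"],         # monomial basis
    "M_2x2(R)": ["E11", "E12", "E21", "E22"],     # matrix units
}
dims = {space: len(basis) for space, basis in bases.items()}
assert dims == {"R^2": 2, "P_3(R)": 4, "M_2x2(R)": 4}
```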
Linear Transformations
Definition
Linear transformation T: V → W satisfies T(u + v) = T(u) + T(v) and T(a v) = a T(v) for all u,v ∈ V, a ∈ F.
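Any matrix applied to coordinate vectors gives such a map; a sketch verifying both defining properties for one ℝ² → ℝ² example (helper name `apply` is hypothetical):

```python
# A matrix acting on R^2 is a linear map; check T(u+v) = T(u)+T(v), T(av) = aT(v).
def apply(M, v):
    """Matrix-vector product: each row of M dotted with v."""
    return tuple(sum(m_ij * x for m_ij, x in zip(row, v)) for row in M)

M = [[2, 1],
     [0, 3]]
u, v, a = (1, 2), (3, -1), 4

Tu, Tv = apply(M, u), apply(M, v)
assert apply(M, (u[0] + v[0], u[1] + v[1])) == (Tu[0] + Tv[0], Tu[1] + Tv[1])
assert apply(M, (a * u[0], a * u[1])) == tuple(a * x for x in Tu)
```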
Kernel and Image
Kernel ker(T): vectors mapped to zero in W. Image im(T): set of vectors T(v), v ∈ V.
Isomorphisms
Bijective linear transformations. Vector spaces V, W are isomorphic if ∃ isomorphism T: V → W.
Matrix Representation
T can be represented as matrix once bases fixed in V and W. Composition corresponds to matrix multiplication.
T(u + v) = T(u) + T(v)
T(av) = aT(v)
ker(T) = { v ∈ V | T(v) = 0 }
im(T) = { T(v) | v ∈ V }
Inner Product Spaces
Definition
Inner product space: vector space V with inner product ⟨·,·⟩: V × V → F satisfying positivity, linearity, symmetry/conjugate symmetry.
Properties
Positive definiteness: ⟨v,v⟩ ≥ 0, equality only if v = 0. Linearity in first argument. Symmetry: ⟨u,v⟩ = ⟨v,u⟩ (real), conjugate symmetry (complex).
Norm and Orthogonality
Norm: ||v|| = √⟨v,v⟩. Orthogonality: ⟨u,v⟩ = 0. Orthogonal sets simplify basis construction.
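With the standard dot product on ℝⁿ, norm and orthogonality are one-liners; a sketch (helper names `dot`, `norm` are hypothetical):

```python
# Dot product, norm, and orthogonality under the standard inner product on R^n.
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    """||v|| = sqrt(<v, v>)."""
    return math.sqrt(dot(v, v))

u, v = (3, 4), (-4, 3)
assert dot(u, v) == 0          # orthogonal: <u, v> = 0
assert norm(u) == 5.0          # ||(3, 4)|| = sqrt(25) = 5
assert dot(u, u) > 0           # positive definiteness for u != 0
```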
Examples
Euclidean space ℝⁿ with dot product. Complex vector spaces with Hermitian inner product.
Examples of Vector Spaces
Euclidean Space
ℝⁿ with standard operations. Dimension n, basis: standard unit vectors.
Polynomial Spaces
Set of polynomials of degree ≤ n with coefficientwise addition and scalar multiplication: dimension n + 1. The space of all polynomials (no degree bound) is infinite-dimensional.
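Identifying a polynomial of degree ≤ 3 with its coefficient tuple (a0, a1, a2, a3) makes the vector-space operations coefficientwise; a sketch (helper names `padd`, `psmul` are hypothetical):

```python
# Polynomials of degree <= 3 as coefficient tuples (a0, a1, a2, a3).
def padd(p, q):
    """Add polynomials coefficientwise."""
    return tuple(a + b for a, b in zip(p, q))

def psmul(c, p):
    """Scale every coefficient by the scalar c."""
    return tuple(c * a for a in p)

p = (1, 0, 2, 0)                     # 1 + 2x^2
q = (0, 3, 0, 1)                     # 3x + x^3
assert padd(p, q) == (1, 3, 2, 1)    # 1 + 3x + 2x^2 + x^3
assert psmul(2, p) == (2, 0, 4, 0)   # 2 + 4x^2
```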
Function Spaces
Set of functions from set X to field F. Addition and scalar multiplication defined pointwise.
Matrix Spaces
All m × n matrices over F. Addition and scalar multiplication element-wise.
Applications of Vector Spaces
Linear Systems
Solutions form subspaces. Vector space theory underpins solution structure.
Computer Graphics
Modeling position, transformations, rendering via vector operations.
Data Science
Feature vectors, dimensionality reduction, PCA based on vector space properties.
Quantum Mechanics
State spaces modeled as complex inner product spaces (Hilbert spaces).
Signal Processing
Function spaces and vector decompositions used in Fourier analysis.
| Application | Description |
|---|---|
| Linear Systems | Solution sets as subspaces, analysis via vector spaces. |
| Computer Graphics | Vectors represent points, transformations. |
| Quantum Mechanics | State vectors in complex inner product spaces. |
Vector spaces form the backbone of linear algebra. They provide an abstract framework for vectors, enabling analysis across mathematics, physics, engineering, and computer science. Core concepts include vector addition, scalar multiplication, subspaces, bases, dimension, and linear transformations. Mastery of vector spaces facilitates understanding of more advanced structures such as inner product spaces and function spaces.
"Vector spaces are fundamental structures that unify diverse mathematical objects under a common algebraic framework." -- Gilbert Strang