Introduction to Linear Algebra
Systems of Linear Equations
- Introduction
- Linear Systems
- Vectors
- Linear combinations
- Matrices
- Planes in ℝ³
- Row operations
- Gaussian elimination
- Reduced Row-Echelon Form
- Equation A x = b
- Sensitivity of solutions
- Iterative methods
- Linear independence
- Plane transformations
- Space transformations
- Rotations
- Linear transformations
- Affine maps
- Exercises
- Answers
Matrix Algebra
- Introduction
- Manipulation of matrices
- Partitioned matrices
- Block matrices
- Matrix operators
- Determinants
- Cofactors
- Cramer's rule
- Elementary matrices
- Inverse matrices
- Equivalent matrices
- Rank
- Elimination: A = L U
- PLU factorization
- Reflection
- Givens rotation
- Special matrices
- Exercises
- Answers
Vector Spaces
- Introduction
- Motivation
- Vector Spaces
- Bases
- Dimension
- Coordinate systems
- Change of basis
- Linear transformations
- Matrix transformations
- Compositions
- Isomorphisms
- Dual spaces
- Dual transformations
- Subspaces
- Direct sums
- Quotient spaces
- Vector products
- Cross products
- Matrix spaces
- Solving A x = b
- Exercises
- Answers
Eigenvalues, Eigenvectors
- Introduction
- Characteristic polynomials
- Companion matrix
- Algebraic and geometric multiplicities
- Minimal polynomials
- Eigenspaces
- Where are eigenvalues?
- Eigenvalues of A B and B A
- Generalized eigenvectors
- Similarity
- Diagonalizability
- Self-adjoint operators
Euclidean Spaces
- Introduction
- Dot product
- Bilinear transformations
- Inner product
- Norm and distance
- Matrix norms
- Dual norms
- Dual transformations
- Orthogonality
- Gram-Schmidt process
- Orthogonal sets
- Self-adjoint matrices
- Unitary matrices
- Projection operators
- QR-decomposition
- Least Squares Approximation
- Quadratic forms
- Exercises
- Answers
Matrix Decompositions
- Introduction
- Symmetric matrices
- LU-decomposition
- Sylvester formula
- Cholesky decomposition
- Schur decomposition
- Jordan decomposition
- Positive matrices
- Roots
- Polar factorization
- Spectral decomposition
- Singular values
- SVD
- Pseudoinverse
- Exercises
- Answers
Applications
- GPS problem
- Poisson equation
- Graph theory
- Error correcting codes
- Electric circuits
- Markov chains
- Cryptography
- Wave-length transfer matrix
- Computer graphics
- Linear Programming
- Hill's determinant
- Fibonacci matrices
- Discrete dynamical systems
- Discrete Fourier transform
- Fast Fourier transform
- Curve fitting
Functions of Matrices
- Introduction
- Diagonalization
- Sylvester formula
- The Resolvent Method
- Polynomial interpolation
- Positive matrices
- Roots
- Pseudoinverse
- Exercises
- Answers
Miscellany
- Circles along curves
- TNB frames
- Tensors
- Tensors in ℝ³
- Tensors & Mechanics
- Differential forms
- Calculus
- Vector Representations
- Matrix Representations
- Change of Basis
- Orthonormal Diagonalization
- Generalized Inverse
Preliminaries
- Complex Number Operations
- Sets
- Polynomials
- Polynomials and Matrices
- Computer solution of systems of linear equations
- Location of eigenvalues
- Power method
- Iterative method
Glossary
Reference
Rank 1 Matrices
If u ∈ 𝔽^{m×1} and v ∈ 𝔽^{n×1} are both nonzero column vectors, then the outer product matrix u vᵀ has rank 1. Indeed, the j-th column of u vᵀ is vⱼ u, so every column is a scalar multiple of u and the column space lies in span{u}; hence the rank is at most 1. Since v is nonzero, some entry vⱼ ≠ 0, and because u is nonzero the column vⱼ u is nonzero, so the rank is exactly 1.
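As a quick numerical illustration (a minimal NumPy sketch, not part of the original text; the vectors u and v are arbitrary choices), the outer product of two nonzero column vectors does come out with rank 1:

```python
import numpy as np

# Two nonzero column vectors (arbitrary example values).
u = np.array([[1.0], [2.0], [3.0]])   # u in R^(3x1)
v = np.array([[4.0], [5.0]])          # v in R^(2x1)

A = u @ v.T                            # 3x2 outer product u v^T
print(A)                               # column j equals v_j * u
print(np.linalg.matrix_rank(A))        # prints 1
```

Each column of the printed matrix A is a scalar multiple of u (here 4u and 5u), matching the argument above.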
- Axler, Sheldon Jay (2015). Linear Algebra Done Right (3rd ed.). Springer. ISBN 978-3-319-11079-0.