Notes and code experiments for linear algebra in code. The idea is to construct the SVD as soon as possible, then use it for everything else, from characterizing invertibility to parametrizing the loss surface of a linear regression model. Some of the interesting topics covered (a short SVD demo follows the list):
- Proof of the real spectral theorem, and a code demo
- Proof of the singular value decomposition (SVD)
- An extensive discussion of the Moore-Penrose pseudoinverse
- Stability of the Gram-Schmidt algorithm
- Characterizing the loss surface of a linear regression problem
- Characterizing quadratic forms using the principal axes theorem
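To give a flavor of the SVD-first approach, here is a minimal NumPy sketch (an illustration, not code from the notes) that uses the SVD of a tall matrix to check for full column rank, build the Moore-Penrose pseudoinverse, and solve a least-squares problem. The matrix sizes and the tolerance rule are assumed for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tall matrix: more rows than columns, so there is no two-sided inverse.
A = rng.normal(size=(10, 4))
b = rng.normal(size=10)

# Thin SVD: A = U @ diag(s) @ Vt, singular values sorted in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rank / invertibility: count singular values above a numerical tolerance.
tol = max(A.shape) * np.finfo(A.dtype).eps * s[0]
rank = int(np.sum(s > tol))
print("rank:", rank, "| full column rank:", rank == A.shape[1])

# Moore-Penrose pseudoinverse: invert only the nonzero singular values.
s_inv = np.where(s > tol, 1.0 / s, 0.0)
A_pinv = Vt.T @ np.diag(s_inv) @ U.T
print("matches np.linalg.pinv:", np.allclose(A_pinv, np.linalg.pinv(A)))

# Least-squares solution of A x ~ b via the pseudoinverse.
x = A_pinv @ b
print("matches lstsq:", np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0]))
```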
Figure. SVD of a sum of Gaussians. Only the first few singular vectors are meaningful; the rest model noise.
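A figure like this can be approximated with a short NumPy sketch (a rough reproduction under assumed parameters, not the exact code behind the figure): build a matrix as a sum of a few separable Gaussian bumps plus noise, and look at how quickly the singular values decay.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 200)
X, Y = np.meshgrid(x, x)

# Sum of a few 2D Gaussian bumps (each is rank one), plus small noise.
Z = np.zeros_like(X)
for _ in range(4):
    cx, cy = rng.uniform(-2, 2, size=2)
    Z += np.exp(-((X - cx) ** 2 + (Y - cy) ** 2))
Z += 0.05 * rng.normal(size=Z.shape)

# Singular values drop sharply after the first few; the tail captures the noise.
s = np.linalg.svd(Z, compute_uv=False)
print(np.round(s[:8], 2))
```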
Figure. Energy surface of an indefinite matrix: the quadratic form attains both negative and positive values (a saddle).
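A hedged sketch of the same idea in NumPy (the 2x2 matrix below is an illustrative choice, not the one used for the figure): pick a symmetric matrix with eigenvalues of mixed signs and evaluate the quadratic form w^T A w on a grid; the resulting energy surface is a saddle with both negative and positive values.

```python
import numpy as np

# A symmetric indefinite matrix: one positive and one negative eigenvalue.
A = np.array([[1.0,  2.0],
              [2.0, -1.0]])
print("eigenvalues:", np.linalg.eigvalsh(A))  # mixed signs => indefinite

# Energy (quadratic form) f(w) = w^T A w evaluated on a grid of points.
t = np.linspace(-2, 2, 101)
W1, W2 = np.meshgrid(t, t)
energy = A[0, 0] * W1**2 + (A[0, 1] + A[1, 0]) * W1 * W2 + A[1, 1] * W2**2

# Over the plotted region the surface dips below zero and rises above it.
print("min:", energy.min(), "max:", energy.max())
```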
- Vectors and matrices
- Singular value decomposition
- Matrix multiplication and norms
- Rank and dimension
- Four fundamental subspaces
- Determinant
- Matrix inverse and pseudoinverse
- Projection and orthogonalization
- Least squares for model fitting
- Eigendecomposition
- Quadratic form and definiteness
- Mike X Cohen. Complete linear algebra: theory and implementation in code. Udemy. (2021)
- Sheldon Axler. Down with Determinants! The American Mathematical Monthly. (1995)
- Leslie Hogben (editor). Handbook of Linear Algebra. CRC Press. (2014)
- Cleve Moler. Numerical Computing with MATLAB. The MathWorks / SIAM. (2013)
- Peter Olver and Chehrzad Shakiban. Applied Linear Algebra. Springer UTM. (2018)
- Petersen & Pedersen. The Matrix Cookbook. v. Nov. 15, 2012. (2012)