# Mathematics for Machine Learning (MML)
Notebook series following the *Mathematics for Machine Learning* textbook by Deisenroth, Faisal, and Ong.
Source PDF: mml-book.pdf
## Course Notebooks
| # | Notebook | Book Chapter | Topics |
|---|---|---|---|
| 01 | | Ch 2 | Systems of equations, vector spaces, basis, rank, linear mappings |
| 02 | | Ch 3 | Norms, inner products, projections, rotations |
| 03 | | Ch 4 | Eigenvalues, Cholesky, SVD, low-rank approximation |
| 04 | | Ch 5 | Gradients, Jacobians, backpropagation, Taylor series |
| 05 | | Ch 6 | Distributions, Bayes' theorem, Gaussian, exponential family |
| 06 | | Ch 7 | Gradient descent, Lagrange multipliers, convexity |
| 07 | | Ch 8-9 | MLE, MAP, Bayesian linear regression |
| 08 | | Ch 10 | Maximum variance, projection, dimensionality reduction |
| 09 | | Ch 11 | GMM, EM algorithm, latent variables |
| 10 | | Ch 12 | Separating hyperplanes, kernels, dual formulation |
## Exercises
| Notebook | Content |
|---|---|
| | Practice problems for Ch 2-7 |
| | Practice problems for Ch 8-12 |
| | Solutions for Part 1 |
| | Solutions for Part 2 |
## Prerequisites

- foundational/ notebooks 01-04
- Python 3.8+, NumPy, Matplotlib
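A quick sanity check before starting: the snippet below is a minimal sketch that verifies the interpreter and library prerequisites listed above are available (the version threshold mirrors the Python 3.8+ requirement; nothing here is specific to these notebooks).

```python
# Verify the prerequisites for these notebooks: Python 3.8+, NumPy, Matplotlib.
import sys

import numpy as np
import matplotlib

# Fail early if the interpreter is too old for the notebooks.
assert sys.version_info >= (3, 8), "Python 3.8+ is required"

print("Python:", sys.version.split()[0])
print("NumPy:", np.__version__)
print("Matplotlib:", matplotlib.__version__)
```

Run it once in a fresh notebook cell; if an import fails, install the missing package (e.g. `pip install numpy matplotlib`).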
## Suggested Order
Follow the course notebooks 01-10 in order. The book has two parts:

- Part I (01-06): Mathematical foundations
- Part II (07-10): Central ML problems that apply those foundations
## Practice Labs

For hands-on implementations of each chapter, see `practice-labs/`.