# Advanced Mathematics for Machine Learning
Research-level mathematical topics and learning theory for understanding modern ML research.
Prerequisites: Complete foundational/ and mml-book/ sections first.
NotebooksΒΆ
Part I: Learning TheoryΒΆ
| #  | Notebook | Topics |
|----|----------|--------|
| 01 |          | Generalization, bias-variance tradeoff |
| 02 |          | Hoeffding, Bernstein, McDiarmid's inequality |
| 03 |          | Uniform convergence, capacity measures |
| 04 |          | PAC learning framework, Bayesian perspective |
| 05 |          | Infinite-width neural networks, kernel methods |
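As a taste of the concentration inequalities covered in notebook 02, the sketch below empirically checks Hoeffding's bound P(|X̄ − μ| ≥ ε) ≤ 2·exp(−2nε²) for bounded i.i.d. samples. This is not taken from the notebooks themselves; the sample size, ε, and Bernoulli(0.5) distribution are illustrative choices.

```python
import numpy as np

# Hoeffding: for n i.i.d. samples X_i in [0, 1] with mean mu,
# P(|X_bar - mu| >= eps) <= 2 * exp(-2 * n * eps**2).
rng = np.random.default_rng(0)
n, eps, trials = 200, 0.1, 10_000
mu = 0.5  # Bernoulli(0.5) samples lie in [0, 1]

samples = rng.binomial(1, mu, size=(trials, n))
deviations = np.abs(samples.mean(axis=1) - mu)
empirical = np.mean(deviations >= eps)       # Monte Carlo tail probability
bound = 2 * np.exp(-2 * n * eps**2)          # Hoeffding upper bound

print(f"empirical: {empirical:.4f}  Hoeffding bound: {bound:.4f}")
```

The empirical tail probability comes out well below the bound, as expected: Hoeffding is distribution-free and therefore loose for any particular distribution.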
### Part II: Advanced Optimization & Inference
| #  | Notebook | Topics |
|----|----------|--------|
| 06 |          | Mean-field approximation, ELBO |
| 07 |          | Dirichlet Process, Chinese Restaurant Process |
| 08 |          | EM algorithm, convergence proofs, GMM |
| 09 |          | Implicit bias, convergence analysis |
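To illustrate the EM material in notebook 08, here is a minimal 1-D two-component GMM fit by EM, tracking the observed-data log-likelihood to show the monotone increase that the convergence proofs establish. It is a sketch, not the notebook's implementation; the data, initialization, and iteration count are arbitrary.

```python
import numpy as np

# Two well-separated Gaussian clusters as synthetic data.
rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 1.0, 300)])

def log_gauss(x, mu, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mu) ** 2 / var)

# Initial mixing weights, means, variances.
pi, mu, var = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
lls = []
for _ in range(50):
    # E-step: log joint log pi_k + log N(x_n | mu_k, var_k), shape (n, 2).
    log_joint = np.log(pi) + log_gauss(x[:, None], mu, var)
    ll = np.logaddexp.reduce(log_joint, axis=1)  # per-point log-likelihood
    lls.append(ll.sum())
    r = np.exp(log_joint - ll[:, None])          # responsibilities
    # M-step: closed-form updates for weights, means, variances.
    nk = r.sum(axis=0)
    pi = nk / len(x)
    mu = (r * x[:, None]).sum(axis=0) / nk
    var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk

# Exact EM never decreases the log-likelihood (up to float rounding).
assert all(b >= a - 1e-9 for a, b in zip(lls, lls[1:]))
print(sorted(float(m) for m in mu))
```

The recovered means land near the true −2 and 3, and the assertion verifies the monotonicity property that the convergence-proof notebook derives from the ELBO decomposition.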
### Part III: Advanced Models & Theory
| #  | Notebook | Topics |
|----|----------|--------|
| 10 |          | Kalman Filters, Hidden Markov Models |
| 11 |          | Dependency modeling, multivariate distributions |
| 12 |          | Diversity modeling, sampling |
| 13 |          | Random projections, dimensionality reduction |
| 14 |          | Lagrangian duality, KKT conditions |
| 15 |          | Efficient second-order optimization |
| 16 |          | Matrix-valued concentration inequalities |
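For the random-projections topic in notebook 13, this sketch checks the Johnson–Lindenstrauss phenomenon: a random Gaussian map to k = O(log n / ε²) dimensions approximately preserves all pairwise Euclidean distances. The dimensions and ε below are illustrative assumptions, not values from the notebooks.

```python
import numpy as np

rng = np.random.default_rng(2)
n, d, k, eps = 50, 2000, 500, 0.25   # 50 points, project 2000 -> 500 dims
X = rng.normal(size=(n, d))
P = rng.normal(size=(d, k)) / np.sqrt(k)  # entries N(0, 1/k), so E||Pv||^2 = ||v||^2
Y = X @ P

def pdist(A):
    # Dense matrix of pairwise Euclidean distances.
    diff = A[:, None, :] - A[None, :, :]
    return np.sqrt((diff ** 2).sum(-1))

iu = np.triu_indices(n, 1)  # each unordered pair once
distortion = np.abs(pdist(Y)[iu] / pdist(X)[iu] - 1.0)
print(f"max relative distortion: {distortion.max():.3f}")
```

Even the worst pair is distorted by well under ε, despite the 4× reduction in dimension, and the guarantee is independent of the original dimension d.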
Learning PathsΒΆ
Theoretical ML Researcher: 01 -> 02 -> 03 -> 04 -> 05
Probabilistic ML: 06 -> 07 -> 08 -> 10
Optimization: 09 -> 14 -> 15
PrerequisitesΒΆ
Solid understanding of linear algebra, multivariable calculus, probability, and basic ML
Python 3.8+, NumPy, SciPy, Matplotlib