Advanced Mathematics for Machine Learning

Research-level mathematical topics and learning theory for understanding modern ML research.

Prerequisites: Complete foundational/ and mml-book/ sections first.

Notebooks

Part I: Learning Theory

#   Notebook                         Topics
01  Introduction to Learning Theory  Generalization, bias-variance tradeoff
02  Concentration Inequalities       Hoeffding, Bernstein, McDiarmid’s inequality
03  Rademacher Complexity            Uniform convergence, capacity measures
04  PAC-Bayes Theory                 PAC learning framework, Bayesian perspective
05  Neural Tangent Kernel            Infinite-width neural networks, kernel methods
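The concentration results covered in notebook 02 lend themselves to quick empirical checks. As a minimal sketch (generic NumPy code, not taken from the notebooks), here is Hoeffding's inequality for the sample mean of bounded i.i.d. variables, compared against simulated Bernoulli(0.5) draws:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials, t = 100, 10_000, 0.1

# Sample means of n i.i.d. Bernoulli(0.5) variables, each bounded in [0, 1]
samples = rng.integers(0, 2, size=(trials, n))
deviation_freq = (np.abs(samples.mean(axis=1) - 0.5) >= t).mean()

# Hoeffding: P(|mean - p| >= t) <= 2 * exp(-2 * n * t^2)
hoeffding_bound = 2 * np.exp(-2 * n * t**2)

print(f"empirical frequency: {deviation_freq:.4f}")
print(f"Hoeffding bound:     {hoeffding_bound:.4f}")
```

The bound is distribution-free and therefore loose: here it evaluates to 2e⁻² ≈ 0.27, while the simulated frequency is typically much smaller.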

Part II: Advanced Optimization & Inference

#   Notebook                      Topics
06  Variational Inference         Mean-field approximation, ELBO
07  Bayesian Nonparametrics       Dirichlet Process, Chinese Restaurant Process
08  Expectation Maximization      EM algorithm, convergence proofs, GMM
09  Gradient Descent Convergence  Implicit bias, convergence analysis
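The convergence analysis in notebook 09 can be previewed numerically. A minimal sketch (standard textbook setup, not the notebook's own code): gradient descent with step size 1/L on a strongly convex quadratic contracts the distance to the minimizer geometrically.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 20

# Strongly convex quadratic f(x) = 0.5 x^T A x - b^T x, A symmetric positive definite
M = rng.standard_normal((d, d))
A = M @ M.T + d * np.eye(d)
b = rng.standard_normal(d)
x_star = np.linalg.solve(A, b)        # exact minimizer

L = np.linalg.eigvalsh(A).max()       # smoothness constant = largest eigenvalue
x = np.zeros(d)
errors = []
for _ in range(200):
    x = x - (1.0 / L) * (A @ x - b)   # gradient step; grad f(x) = A x - b
    errors.append(np.linalg.norm(x - x_star))

print(f"error after 1 step:    {errors[0]:.3e}")
print(f"error after 200 steps: {errors[-1]:.3e}")
```

The per-step contraction factor is at most 1 − μ/L (μ the smallest eigenvalue of A), the standard linear rate for smooth, strongly convex objectives.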

Part III: Advanced Models & Theory

#   Notebook                       Topics
10  State Space Models             Kalman Filters, Hidden Markov Models
11  Copula Theory                  Dependency modeling, multivariate distributions
12  Determinantal Point Processes  Diversity modeling, sampling
13  Johnson-Lindenstrauss          Random projections, dimensionality reduction
14  Duality Theory                 Lagrangian duality, KKT conditions
15  Conjugate Gradients            Efficient second-order optimization
16  Matrix Concentration           Matrix-valued concentration inequalities
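The method of notebook 15 fits in a few lines. As a hedged sketch (a textbook implementation, not necessarily the notebook's): conjugate gradients solves Ax = b for symmetric positive definite A using only matrix-vector products, converging in at most n iterations in exact arithmetic.

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve Ax = b for symmetric positive definite A."""
    n = len(b)
    x = np.zeros(n)
    r = b - A @ x                      # residual
    p = r.copy()                       # search direction
    rs_old = r @ r
    for _ in range(max_iter or n):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # exact line search along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate direction
        rs_old = rs_new
    return x

rng = np.random.default_rng(2)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)          # well-conditioned SPD system
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print(f"residual norm: {np.linalg.norm(A @ x - b):.3e}")
```

In practice the iteration count is governed by the condition number, which is why CG shines on well-conditioned or preconditioned systems.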

Learning Paths

Theoretical ML Researcher: 01 -> 02 -> 03 -> 04 -> 05

Probabilistic ML: 06 -> 07 -> 08 -> 10

Optimization: 09 -> 14 -> 15

Prerequisites

  • Solid understanding of linear algebra, multivariable calculus, probability, and basic ML

  • Python 3.8+, NumPy, SciPy, Matplotlib