# Mathematics for ML
Mathematical foundations for the rest of the curriculum. The goal is enough fluency to understand optimization, probability, embeddings, attention, and evaluation without treating them as magic.
## Folder Map

| Folder | Notebooks | Level | What It Covers |
|---|---|---|---|
| `foundational/` | 13 | Beginner | Core math: linear algebra, calculus, probability, gradient descent, info theory, neural net math |
| `3blue1brown/` | 42 | Beginner | Visual intuition: calculus (12), linear algebra (13), differential equations (8), neural networks (9) |
| `mml-book/` | 24 | Intermediate | Mathematics for Machine Learning: course (10), exercises (4), practice labs (10) |
| `cs229-course/` | 18 | Intermediate | Stanford CS229: regression, classification, SVMs, learning theory, clustering, RL |
| `islp-book/` | 15 | Intermediate | Intro to Statistical Learning: 13 chapters + practice exercises |
| `mlpp-book/` | 13 | Intermediate | ML: A Probabilistic Perspective: Bayesian inference, graphical models, MCMC, EM |
| `dli-book/` | 6 | Intermediate | Deep Learning Interviews: practice labs for logistic regression, info theory, CNNs |
| `slp-book/` | 6 | Intermediate | Speech & Language Processing: NLP labs from tokenization to transformers |
| `advanced/` | 16 | Advanced | Research topics: learning theory, PAC-Bayes, NTK, variational inference, state space models |
| | — | Reference | ML problem-solving reference PDF |

Total: 153 notebooks across 10 folders.
## Quick Start

```shell
# Start here
jupyter notebook foundational/01_linear_algebra_fundamentals.ipynb
```
## Learning Paths

### Path 1: Beginner (start here)

Work through the `foundational/` notebooks first; they cover the essentials: linear algebra, calculus, probability, gradient descent, information theory, and neural network math.

Supplement with `3blue1brown/` notebooks for visual intuition on any topic that feels abstract.
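Gradient descent, which the foundational notebooks build toward, is worth sanity-checking by hand at least once. A minimal sketch (the function, starting point, and learning rate are invented for illustration) that minimizes f(x) = (x - 3)²:

```python
def grad(x):
    # df/dx for f(x) = (x - 3)^2
    return 2 * (x - 3)

x = 0.0    # starting point
lr = 0.1   # learning rate (step size)
for _ in range(100):
    x -= lr * grad(x)  # step downhill along the negative gradient

print(round(x, 4))  # converges toward the minimum at x = 3
```

The same update rule, applied to a loss over model parameters instead of a single scalar, is the core of neural network training.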
### Path 2: ML Engineer

After the foundational pass, build depth in ML theory and algorithms:

1. `mml-book/course/` — rigorous math foundations (linear algebra through optimization)
2. `cs229-course/` — Stanford ML algorithms (regression, SVMs, neural nets, RL)
3. `mml-book/practice-labs/` — hands-on implementation of MML concepts
4. `dli-book/` — deep learning interview math
### Path 3: Data Scientist

Statistical and probabilistic foundations:

1. `islp-book/` — statistical learning (regression, classification, resampling, trees, SVMs)
2. `mlpp-book/` — probabilistic perspective (Bayesian inference, graphical models, MCMC)
3. `slp-book/` — NLP and language model foundations
### Path 4: Researcher

Graduate-level theory (requires Paths 1 and 2 as prerequisites):

- `advanced/` — learning theory, concentration inequalities, PAC-Bayes, NTK
## Topic Cross-Reference

Find the same topic at different depths across folders ("—" means not covered at that depth):

| Topic | Beginner | Intermediate | Advanced | Practice |
|---|---|---|---|---|
| Linear Algebra | `foundational/`, `3blue1brown/` | `mml-book/` | — | `mml-book/practice-labs/` |
| Calculus | `foundational/`, `3blue1brown/` | `mml-book/` | — | `mml-book/practice-labs/` |
| Probability | `foundational/` | `mml-book/`, `mlpp-book/` | `advanced/` | `mml-book/practice-labs/` |
| Optimization | `foundational/` | `mml-book/` | — | `mml-book/practice-labs/` |
| Information Theory | `foundational/` | `dli-book/` | — | `dli-book/` |
| Regression | — | `cs229-course/`, `islp-book/` | — | `islp-book/`, `dli-book/` |
| Classification | — | `cs229-course/`, `islp-book/` | — | `islp-book/` |
| SVMs | — | `cs229-course/`, `islp-book/` | — | `islp-book/` |
| PCA | — | `mml-book/`, `islp-book/` | — | `mml-book/practice-labs/` |
| Neural Networks | `foundational/`, `3blue1brown/` | `cs229-course/`, `dli-book/` | `advanced/` | `dli-book/` |
| Transformers/LLMs | — | `slp-book/` | `advanced/` | `slp-book/` |
| Bayesian Methods | — | `mlpp-book/` | `advanced/` | — |
| Clustering/GMM | — | `cs229-course/`, `mlpp-book/` | — | — |
## Source PDFs

Each book folder contains its own PDF:

| Book | Location |
|---|---|
| Mathematics for Machine Learning | `mml-book/` |
| Stanford CS229 Notes | `cs229-course/` |
| Intro to Statistical Learning with Python | `islp-book/` |
| ML: A Probabilistic Perspective | `mlpp-book/ML-Machine-Learning-A-Probabilistic-Perspective.pdf` |
| Deep Learning Interviews | `dli-book/` |
| Speech & Language Processing | `slp-book/` |
| ML Problem Solving | |
## Practical Rules

- Learn the intuition before the notation.
- Re-derive small examples by hand when possible.
- If a symbol-heavy notebook feels abstract, reconnect it to one use case: gradient descent, cosine similarity, cross-entropy, PCA, or attention.
- Do not try to finish every notebook before continuing the curriculum.
- Do not spend weeks on theorem-level depth if your goal is applied AI engineering.
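Two of those use cases fit in a few lines each. A hedged sketch (the vectors and probability distributions below are invented for illustration) of cosine similarity and cross-entropy on concrete numbers:

```python
import math

def cosine_similarity(a, b):
    # cos(theta) = (a . b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def cross_entropy(p, q):
    # H(p, q) = -sum_i p_i * log(q_i); lower means q matches p better
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

print(cosine_similarity([1, 0], [1, 1]))          # ~0.7071 (45-degree angle)
print(cross_entropy([1, 0, 0], [0.7, 0.2, 0.1]))  # -log(0.7), ~0.357
```

The same two formulas reappear later in the curriculum as embedding similarity and as the classification loss, so a hand-computed example here pays off twice.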
## Next Step

After the foundational notebooks, continue into `05-embeddings/` and `06-neural-networks/`, then come back here as needed.