# Foundational Mathematics

Core mathematical building blocks for machine learning. Start here if you're new to the math side.
## Notebooks
| # | Topics |
|---|---|
| 00 | NumPy, Matplotlib, SciPy essentials for ML math |
| 01 | Vectors, matrices, operations, systems of equations |
| 02 | Derivatives, chain rule, partial derivatives, gradients |
| 03 | Distributions, Bayes' theorem, expectation, variance |
| 04 | Optimization basics, learning rate, convergence |
| 05 | Entropy, cross-entropy, KL divergence |
| 06 | Hypothesis testing, confidence intervals, MLE |
| 07 | Forward pass, backpropagation, loss functions |
| 08 | Eigendecomposition, SVD, PCA foundations |
| 09 | Closed-form vs iterative solutions, numerical stability |
| 10 | Control theory connections to RL and optimization |
| 11 | Markov chains, hidden Markov models, Viterbi |
| 12 | SGD, momentum, Adam optimizer implementation |
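To give a flavor of what these notebooks build toward, here is a minimal gradient-descent sketch in NumPy. The quadratic objective, starting point, and learning rate are illustrative choices, not taken from any specific notebook:

```python
import numpy as np

def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Repeatedly step against the gradient; returns the final iterate."""
    x = np.asarray(x0, dtype=float)
    for _ in range(steps):
        x = x - lr * grad(x)
    return x

# Minimize f(x) = ||x||^2, whose gradient is 2x; the minimum is at the origin.
x_min = gradient_descent(lambda x: 2 * x, x0=[5.0, -3.0])
```

Notebook 04 develops this idea (learning rate, convergence), and notebook 12 extends it to momentum and Adam.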
## Prerequisites

- Python 3.8+
- NumPy, Matplotlib
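A quick way to confirm your environment meets these requirements (a minimal check, assuming the packages are installed under their usual import names):

```python
import sys

# Verify the interpreter version first, then the two required libraries.
assert sys.version_info >= (3, 8), "Python 3.8+ required"

import numpy
import matplotlib

print("NumPy", numpy.__version__, "| Matplotlib", matplotlib.__version__)
```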
## Suggested Order

Essential first pass (covers what you need for 90% of ML):

01 Linear Algebra → 02 Calculus → 03 Probability → 04 Gradient Descent

Then pick based on need:

- Going into NLP? → 05 Information Theory
- Going into neural nets? → 07 Neural Network Math → 12 Optimization
- Going into Bayesian ML? → 06 Statistical Inference
- Going into sequence models? → 11 Markov Models
## Next Steps

After completing the essential pass, continue to:

- `mml-book/` for rigorous math depth
- `cs229-course/` for ML algorithms
- `mml-book/practice-labs/` for hands-on implementation