# Stanford CS229 Machine Learning
Notebooks covering the Stanford CS229 ML theory and algorithms course.
Source PDF: cs229.pdf
NotebooksΒΆ
| # | Topics |
|---|---|
| 01 | Normal equations, least squares, feature scaling |
| 02 | Batch, stochastic, mini-batch GD, convergence |
| 03 | Non-parametric regression, bandwidth selection |
| 04 | Binary classification, sigmoid, cross-entropy |
| 05 | Gaussian discriminant analysis, Naive Bayes |
| 06 | Margin maximization, kernel trick, SMO |
| 07 | L1/L2 penalties, bias-variance tradeoff |
| 08 | PAC learning, VC dimension, generalization bounds |
| 09 | CART, pruning, information gain, Gini impurity |
| 10 | Perceptron, feedforward nets, activation functions |
| 11 | Backprop, dropout, batch norm, architectures |
| 12 | Error analysis, dataset splits, debugging ML |
| 13 | K-Means, hierarchical, DBSCAN, evaluation |
| 14 | PCA, t-SNE, autoencoders |
| 15 | MDPs, Q-learning, policy gradient |
| X01 | Gaussian-based anomaly detection, isolation forest |
| X02 | Collaborative filtering, matrix factorization |
| – | Review exercises across all topics |
PrerequisitesΒΆ
foundational/ notebooks 01-04 (linear algebra, calculus, probability, gradient descent)
Python 3.8+, NumPy, Matplotlib, scikit-learn
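To confirm the environment is set up, a quick NumPy sketch of the normal equations (the starting point of notebook 01) can serve as a smoke test. The synthetic data below is illustrative only and is not part of the course materials:

```python
import numpy as np

# Smoke test: ordinary least squares via the normal equations.
# Synthetic noiseless data, so the fit recovers the true parameters.
rng = np.random.default_rng(0)
X = np.column_stack([np.ones(50), rng.uniform(-1, 1, 50)])  # bias + one feature
theta_true = np.array([2.0, -3.0])
y = X @ theta_true

# theta = (X^T X)^{-1} X^T y, computed with a stable linear solver
theta = np.linalg.solve(X.T @ X, X.T @ y)
print(theta)  # approximately [ 2. -3.]
```

If this prints the true parameters, NumPy is installed and working; the notebooks build the same ideas up from scratch.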
Suggested OrderΒΆ
Follow the numbered sequence (01-15). X01-X02 are supplementary.