Syllabus
Readings are primarily from the course textbook Machine Learning: A Probabilistic Perspective by Kevin Murphy. Readings from the textbook are prefixed with "M"; for example, M1 means Murphy chapter 1. Unless otherwise noted, skip sections that have a * in the title; these are optional.
For those who prefer Chris Bishop's Pattern Recognition and Machine Learning, I will also list the corresponding readings from that textbook, prefixed with "B". Optional readings will be written in italics.
Recitation sections are led by the TAs and are optional.
Date 
Topics 
Readings 

Supervised Learning: Classifiers 

M 8/31 
Machine Learning: Overview
Background of the field
M1

W 9/2 
Machine Learning: Foundations
Introduces key concepts


F 9/4 
Recitation: Probability/Linear Algebra
Events, random variables, probabilities, PDF, PMF, CDF, mean, mode, median, variance, multivariate distributions, marginals, conditionals, Bayes' theorem, independence

M 9/7 
NO CLASS 


W 9/9 
Regression 1
Linear regression
M7, excluding M7.4 and M7.6 

F 9/11 
Recitation: Math Review
Basic properties of matrices, eigenvalue decompositions, singular value decompositions
CS229's linear algebra notes


M 9/14 
Regression 2 

W 9/16 
Classification 1
Introduces logistic regression and classification
M8 (stop at M8.3.7), M13.3 

F 9/18 
Recitation: MLE 

M 9/21 
Data Geometry: Support Vector Machines
Max-margin classification and optimization
M14.5 

W 9/23 
Data Geometry: Kernel Methods
Dual optimization, kernel trick
M14.1, M14.2 

F 9/25 
Recitation: Convex Optimization 

M 9/28 
Perceptron
Online learning algorithms
M8.5

W 9/30 
Deep Learning: Shallow Learning 
M16.5, M27.7, M28 

F 10/2 
Recitation: PyTorch Intro 

M 10/5 
Deep Learning: Backprop 

W 10/7 
Deep Learning: The Details 

F 10/9 
Recitation: Midterm Review 

M 10/12 
Decision Trees
Construction, pruning, overfitting
M2.8, M16.2 

W 10/14 
Boosting
Ensemble methods
M16.4, M16.6 

F 10/16 
Recitation: Midterm (Required Attendance) 

Unsupervised Learning: Core Methods 

M 10/19 
Clustering
K-means
M25.1, M11 (stop at M11.4) 

W 10/21 
TBD 

F 10/23 
No Recitation: Fall Break 

M 10/26 
Tentative: Midterm 

W 10/28 
Expectation Maximization 1 
M11.4 (stop at M11.4.8) 

F 10/30 
Recitation: EM 

M 11/2 
Expectation Maximization 2 

W 11/4 
Graphical Models 1
Bayesian networks and conditional independence
M10 

F 11/6 
Recitation: Graphical Models 

M 11/9 
Graphical Models 2
MRFs and exact inference
M19.1 (stop at M19.4), M19.5 

W 11/11 
Graphical Models 3
Inference
M20 (stop at M20.3) 

F 11/13 
Recitation: Graphical Models 

M 11/16 
Graphical Models 4
Max-sum and max-product

W 11/18 
Structured Prediction 1
Margin-based methods, HMMs, CRFs
M17 (stop at M17.6), M19.6, M19.7 

F 11/20 
Recitation: Transformers 

M 11/30 
Structured Prediction 2
Recurrent neural networks

W 12/2 
Dimensionality Reduction
PCA
M12.2 

F 12/4 
Recitation: Dimensionality Reduction 

M 12/7 
Fairness, Accountability, Transparency and Ethics of ML 

W 12/9 
Practical Machine Learning 

F 12/11 
No Recitation 

TBD 
Final Exam TBD (75 minutes) 
