TAs: There are several TAs for the class. Amanda Kube (amanda.kube at wustl), the graduate assistant to the instructors, will be the head TA and will conduct recitation sessions as needed. The TAs will hold regular office hours (listed below), grade homework, and answer questions on Piazza. The complete roster is as follows:
TA | Email (at wustl.edu unless otherwise specified) | Office Hours |
Amanda Kube (Graduate Assistant to the Instructors) | amanda.kube | Mon 1-2 PM (Lopata 201) |
Blake Bordelon | blake.bordelon | Thu 9-10 AM (Lopata 103) |
Eric Cai | ecai | Tue 5-6 PM (Lopata 201) |
Sam Griesemer | samgriesemer at gmail dot com | Tue 1-2 PM (Cupples I 216) |
Jiaqi Hu | hu.jiaqi | Sun 3-4 PM (Lopata 103) |
Feiran Jia | feiran.jia | Thu 1-2 PM (Cupples I 216) |
Guancheng Jiang | guancheng | Wed 3-4 PM (Lopata 302) |
Adam Kern | adam.kern | Wed 6-7 PM (Cupples I 216) |
Jonathan Park | jongwhan | Thu 4-5 PM (Lopata 202) |
Lexie Sun | sunce | Fri 2-3 PM (Lopata 103) |
Cong Wang | cwang41 | Wed 11 AM-12 Noon (Lopata 202) |
Sijia Wang | sijiawang | Mon 6-7 PM (Lopata 302) |
Tong Wu | tongwu | Tue 3-4 PM (Lopata 103) |
Shaohua Zhang | shaohua | Fri 9-10 AM (Lopata 103) |
Louise Zhu | bingluzhu | Sat 1-2 PM (Sever 102) |
The complete TA office hour schedule is as follows:
Sundays | 3-4 PM (Lopata 103) |
Mondays | 1-2 PM (Lopata 201); 6-7 PM (Lopata 302) |
Tuesdays | 1-2 PM (Cupples I 216); 3-4 PM (Lopata 103); 5-6 PM (Lopata 201) |
Wednesdays | 11 AM-12 Noon (Lopata 202); 3-4 PM (Lopata 302); 6-7 PM (Cupples I 216) |
Thursdays | 9-10 AM (Lopata 103); 1-2 PM (Cupples I 216); 4-5 PM (Lopata 202) |
Fridays | 9-10 AM (Lopata 103); 2-3 PM (Lopata 103) |
Saturdays | 1-2 PM (Sever 102) |
Date | Topics | Readings | Assignments |
Jan 15 | Introduction. Course policies. Course overview. | Slides; AML 1.1, 1.2. | |
Jan 17 | The perceptron learning algorithm. Is learning feasible? | AML Section 1.1.2, Problem 1.3, Section 1.3.1 | |
Jan 22 | Generalizing outside the training set. Error and noise. | AML 1.3, 1.4 | |
Jan 24 | Infinite hypothesis spaces. VC dimension. | AML 2.1.1-2.1.3 | HW1 out (Gradescope instructions) |
Jan 29 | The VC generalization bound (Prof. Ho). | AML 2.1.4, 2.2 | |
Jan 31 | No in-class lecture; AI for Social Good talk by Prof. Milind Tambe at AAAI. | | |
Feb 5 | The bias-variance tradeoff. | AML 2.3.1 | |
Feb 7 | Bias-variance tradeoff, continued. Learning linear models with noisy data. | AML 2.3.2, 3.1, 3.2 | |
Feb 12 | Logistic regression and gradient descent. | AML 3.3 | HW2 out |
Feb 14 | Nonlinear transformations. Overfitting. Regularization. | AML 3.4, 4.1, 4.2 | |
Feb 19 | Regularization, continued. Validation. | AML 4.2, 4.3 | Malik Magdon-Ismail's slides on validation |
Feb 21 | Overfitting (reprise); Occam's razor, sample selection bias, and data snooping. | AML 4.1, Chapter 5 | |
Feb 26 | Exam review. | ||
Feb 28 | In-class exam #1 | ||
Mar 5 | Intro to decision trees. | Tom Mitchell, Machine Learning Ch3; CASI 8.4 | |
Mar 7 | Decision trees, contd. | Same as last lecture. | HW3 out |
Mar 19 | Midterm discussion. Pruning. Bagging. | CASI 17.1 | |
Mar 21 | Random forests. Boosting. | CASI 17.1, 17.5; AdaBoost training error theorem proof | |
Mar 27 | Lecture by Amanda Kube on causal inference. | Slides are in Piazza resources | |
Mar 29 | Final thoughts on boosting. Nearest neighbor methods. | AML eChapter 6.1-6.2 | HW4 out |
Apr 2 | Efficient nearest neighbor search (k-d trees and LSH). | Wikipedia articles on k-d trees and LSH; AML eChapter 6.3, 6.3.1. | |
Apr 4 | RBF Networks. Fairness in ML. | AML eChapter 6.3.1, 6.3.2; ProPublica article on risk assessments; The Measure and Mismeasure of Fairness: A Critical Review of Fair Machine Learning (Corbett-Davies and Goel) | |
Apr 9 | Fairness in ML, continued. | Readings for HW5 on Piazza resources | HW5 out |
Apr 11 | Support vector machines. | AML eChapter 8.1, 8.2, 8.4 | HW6 out (due Apr 20) |
Apr 16 | Multilayer neural networks and backpropagation. | AML eChapter 7.1, 7.2 | |
Apr 18 | Text analytics | Slides | |