Princeton University, Fall 2018

Prof. Ryan Adams (OH: Thursday 12:30-1:30pm in CS 411)

TA: Jad Rahme (OH: Tue 9-11am in Fine Hall 216)

TA: Farhan Damani (OH: Wed 9-11am outside CS 242)

TA: Fanghong Dong (OH: Wed 2-4pm in CS 2nd floor tea room)

Time: Tuesday and Thursday, 11:00am-12:20pm

Location: COS 104

Contact: cos324-f18@lists.cs.princeton.edu

- 7 January 2019: Special pre-final office hours:
  - Weds 16 Jan 9-11am, Farhan, outside COS 242
  - Thurs 17 Jan 10am-12pm, Jad, Fine Hall 216
  - Thurs 17 Jan 3-5pm, Fanghong, COS 2nd Floor Tea Room
  - Thurs 17 Jan 7-9pm, **Final Review Session**, CSML classroom
  - Fri 18 Jan 9-11am, Ryan, COS 411
  - Mon 21 Jan 9-11am, Farhan, outside COS 242
  - Mon 21 Jan 11am-1pm, Ryan, COS 411
  - Mon 21 Jan 3-5pm, Fanghong, COS 2nd Floor Tea Room
  - Mon 21 Jan 7-9pm, Jad, Fine Hall 216

- 7 January 2019: Final practice exam
- 4 December 2018: Assignment 6 posted
- 21 November 2018: Assignment 5 posted
- 8 November 2018: Midterm solutions
- 6 November 2018: Assignment 4 posted
- 20 October 2018: Midterm practice exam
- 16 October 2018: Assignment 3 posted
- 15 October 2018: Midterm review session will be led by the TAs on Monday 22 October at 5:30pm in the CSML classroom.
- 2 October 2018: Assignment 2 posted
- 25 September 2018: Minor wording change to problem 2 of Assignment 1
- 18 September 2018: Assignment 1 posted
- 13 September 2018: Sign up on the Piazza discussion site.

Thu 13 Sep 2018

Lecture 1: Introduction [slides]

- [optional] Book: Murphy -- Chapter 1 -- *Introduction*
- [optional] Book: Bishop -- Chapter 1 -- *Introduction*

Tue 18 Sep 2018

Lecture 2: Linear Regression I

Assignment 1 Out

- [required] Course Notes: Linear Regression
- [optional] Metacademy: Linear Regression
- [optional] ESL: Sections 2.1-2.3 -- Overview of Supervised Learning
- [optional] ESL: Sections 3.1-3.2 -- Linear Methods for Regression
- [optional] External Course Notes: Andrew Ng Notes Sections 1 and 2
- [optional] External Slides: Roger Grosse CSC321 Lecture 2
- [optional] ISL: Sections 3.1-3.2 -- Linear Regression
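As a quick preview of the lecture topic, here is a minimal sketch of one-dimensional least-squares linear regression (illustrative only; not taken from the course notes):

```python
# Fit y ≈ w*x + b by minimizing squared error (1-D closed form).
def fit_linear(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    # Slope is covariance(x, y) / variance(x); the intercept makes
    # the fitted line pass through the mean point (mx, my).
    w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    b = my - w * mx
    return w, b

# Noise-free data on the line y = 2x + 1 recovers w = 2, b = 1 exactly.
w, b = fit_linear([0, 1, 2, 3], [1, 3, 5, 7])
```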

Thu 20 Sep 2018

Lecture 3: Linear Regression II

- [required] Course Notes: Maximum Likelihood Linear Regression
- [optional] Metacademy: Linear Regression as Maximum Likelihood
- [optional] Mathematical Monk Video: MLE for Linear Regression Part 1, Part 2, Part 3
- [optional] External Course Notes: Andrew Ng Notes Section 3

Tue 25 Sep 2018

Lecture 4: Features and Basis Functions

- [required] Course Notes: Features and Basis Functions
- [optional] ESL: Sections 5.1-5.3 -- Basis Expansions and Regularization
- [optional] Mathematical Monk Video: Basis Functions for MLE
- [optional] ISL: Chapter 7 -- Moving Beyond Linearity

Thu 27 Sep 2018

Lecture 5: Overfitting and Regularization

- [required] Course Notes: Overfitting and Regularization
- [optional] ESL: Section 2.9 -- Model Selection and the Bias-Variance Tradeoff
- [optional] ESL: Section 3.4 -- Shrinkage Methods
- [optional] Mathematical Monk Video: Bias-Variance Decomposition
- [optional] Metacademy: Ridge Regression
- [optional] Metacademy: Regularization
- [optional] Metacademy: Bias-Variance Decomposition

Fri 28 Sep 2018

Assignment 1 Due at 23:55

Tue 2 Oct 2018

Lecture 6: Cross Validation

Assignment 2 Out

- [required] Course Notes: Model Selection
- [optional] ESL: Sections 7.1-7.3, 7.10 -- Model Assessment and Selection
- [optional] ISL: Section 5.1 -- Cross-Validation
- [optional] Mathematical Monk Video: Cross-validation part 1, Part 2, Part 3
- [optional] Metacademy: Cross-validation
- [optional] Wikipedia: Cross-validation
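To preview the idea, a minimal sketch of k-fold splitting (illustrative only; the names here are not from the course materials):

```python
# Split n example indices into k folds; each fold serves once as the
# validation set while the remaining indices form the training set.
def kfold_splits(n, k):
    indices = list(range(n))
    fold_size = n // k
    splits = []
    for i in range(k):
        start = i * fold_size
        # The last fold absorbs any remainder when k does not divide n.
        end = start + fold_size if i < k - 1 else n
        val = indices[start:end]
        train = indices[:start] + indices[end:]
        splits.append((train, val))
    return splits
```

Averaging a model's validation error across the k splits gives the cross-validation estimate used for model selection.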

Thu 4 Oct 2018

Lecture 7: Linear Classification I

- [required] Course Notes: Perceptrons
- [optional] ESL: Sections 4.1-4.2 -- Linear Methods for Classification
- [optional] External Course Notes: Andrew Ng Notes Section 6
- [optional] ISL: Sections 4.1-4.2 -- Classification
- [optional] MacKay: Chapter 39 -- The Single Neuron as a Classifier
- [optional] MacKay: Chapter 40 -- Capacity of a Single Neuron
- [optional] Metacademy: Perceptron Learning Algorithm
- [optional] Metacademy: Binary Linear Classification
- [optional] External Slides: Roger Grosse CSC321 Lecture 4
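As a preview, a minimal sketch of the perceptron learning algorithm (illustrative only, not the course's reference implementation):

```python
# Perceptron learning: on each mistake, nudge the weights toward
# (or away from) the misclassified example.
def perceptron(data, epochs=20):
    # data: list of (features, label) pairs with label in {-1, +1}.
    d = len(data[0][0])
    w = [0.0] * d
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            activation = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * activation <= 0:  # wrong side of the hyperplane
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
    return w, b
```

On linearly separable data the updates stop once every example is on the correct side of the learned hyperplane.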

Tue 9 Oct 2018

Lecture 8: Linear Classification II

- [required] Course Notes: Logistic Regression
- [optional] ESL: Section 4.4 -- Logistic Regression
- [optional] Mathematical Monk Video: Logistic Regression Part 1, Part 2, Part 3, Part 4
- [optional] External Course Notes: Andrew Ng Notes Section 5
- [optional] ISL: Section 4.3 -- Logistic Regression
- [optional] Metacademy: Logistic Regression
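A minimal sketch of logistic regression fit by gradient descent on the negative log-likelihood (illustrative only; a 1-D input with bias, labels in {0, 1}):

```python
import math

# Gradient descent on the logistic (cross-entropy) loss.
def fit_logistic(xs, ys, lr=0.5, steps=2000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        gw = gb = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))  # sigmoid
            # The gradient of the loss has the simple form (p - y) * input.
            gw += (p - y) * x / n
            gb += (p - y) / n
        w -= lr * gw
        b -= lr * gb
    return w, b
```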

Thu 11 Oct 2018

Lecture 9: Linear Classification III -- Support Vector Machines

- [required] Course Notes: Support Vector Machines
- [optional] ESL: Section 4.5 -- Separating Hyperplanes
- [optional] ESL: Chapter 12 -- Support Vector Machines and Flexible Discriminants
- [optional] ISL: Sections 9.1-9.2 -- Support Vector Machines
- [optional] Metacademy: Support Vector Machine
- [optional] External Course Notes: Andrew Ng Notes Part V

Fri 12 Oct 2018

Assignment 2 Due at 23:55

Tue 16 Oct 2018

Lecture 10: Kernel-based Classification

- [required] Course Notes: Kernels
- [optional] ISL: Sections 9.3-9.4 -- Support Vector Machines
- [optional] ESL: Section 5.8 -- Regularization and Reproducing Kernel Hilbert Spaces
- [optional] Metacademy: Kernel Trick

Thu 18 Oct 2018

Lecture 11: Neural Networks I

Assignment 3 Out

- [required] Course Notes: Backprop and Automatic Differentiation
- [optional] ESL: Chapter 11 -- Neural Networks
- [optional] Metacademy: Feed-Forward Neural Nets
- [optional] Metacademy: Backpropagation
- [optional] External Slides: Roger Grosse CSC321 Lecture 5: Learning in a Single Neuron
- [optional] External Slides: Roger Grosse CSC321 Lecture 6: Backpropagation

Tue 23 Oct 2018

Lecture 12: Neural Networks II

- [optional] Metacademy: Convolutional Neural Nets
- [optional] External Slides: Roger Grosse CSC321 Lecture 7: Neural Language Models
- [optional] External Slides: Roger Grosse CSC321 Lecture 9: Recurrent Neural Nets
- [optional] External Slides: Roger Grosse CSC321 Lecture 11: Convolutional Nets

Thu 25 Oct 2018

Midterm Exam

Assignment 3 Due at 23:55

Tue 6 Nov 2018

No Class

Assignment 4 Out

Thu 8 Nov 2018

Lecture 13: K-Means Clustering

- [required] Course Notes: K-Means Clustering
- [optional] Mathematical Monk Video: K-Means Clustering Part 1, Part 2
- [optional] ISL: Section 10.3 -- Clustering Methods
- [optional] Metacademy: K-Means Clustering
- [optional] Metacademy: K-Means++
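As a preview, a minimal 1-D sketch of Lloyd's algorithm for K-means (illustrative only; not from the course notes):

```python
# K-means (Lloyd's algorithm) on 1-D data: alternate between assigning
# each point to its nearest center and moving each center to the mean
# of its assigned points.
def kmeans_1d(points, centers, iters=10):
    for _ in range(iters):
        clusters = [[] for _ in centers]
        # Assignment step.
        for p in points:
            j = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[j].append(p)
        # Update step (an empty cluster keeps its old center).
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers
```

The same two alternating steps generalize directly to higher dimensions with Euclidean distances and coordinate-wise means.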

Tue 13 Nov 2018

Lecture 14: Hierarchical Clustering

- [required] Course Notes: Hierarchical Clustering
- [optional] External Slides: Ryan Tibshirani CMU Data Mining: Hierarchical Clustering (Part II)
- [optional] ESL: Section 14.3 -- Cluster Analysis
- [optional] ISL: Section 10.3 -- Clustering Methods

Thu 15 Nov 2018

Lecture 15: Principal Component Analysis

- [required] Course Notes: Principal Component Analysis
- [optional] ISL: Section 10.2 -- Principal Component Analysis
- [optional] Metacademy: Principal Component Analysis
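As a preview, a sketch of finding the first principal component of 2-D data by power iteration on the covariance matrix (illustrative only; PCA is usually computed with an eigendecomposition or SVD):

```python
# First principal component of 2-D data: center the data, form the
# 2x2 covariance matrix, and run power iteration to find its leading
# eigenvector.
def first_pc(data, iters=100):
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    centered = [(x - mx, y - my) for x, y in data]
    cxx = sum(x * x for x, _ in centered) / n
    cxy = sum(x * y for x, y in centered) / n
    cyy = sum(y * y for _, y in centered) / n
    v = (1.0, 0.0)
    for _ in range(iters):
        # Multiply v by the covariance matrix, then renormalize.
        w = (cxx * v[0] + cxy * v[1], cxy * v[0] + cyy * v[1])
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = (w[0] / norm, w[1] / norm)
    return v
```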

Fri 16 Nov 2018

Assignment 4 Due at 23:55

Tue 20 Nov 2018

Lecture 16: Latent Factor Models

Assignment 5 Out

- [optional] Metacademy: Latent Semantic Analysis

Tue 27 Nov 2018

Lecture 17: Markov Decision Processes

- [required] Course Notes: Markov Decision Processes
- [optional] Metacademy: Markov Decision Process

Thu 29 Nov 2018

Lecture 18: Value Iteration

- [required] Course Notes: Value and Policy Iteration
- [optional] Metacademy: Value Iteration
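As a preview, a minimal sketch of value iteration on a tiny deterministic chain MDP (illustrative only; the toy setup here is an assumption, not from the course notes):

```python
# Value iteration on a 1-D chain of states 0..n-1. Actions move left or
# right (bumping into a wall keeps you in place); entering the goal
# state yields reward 1, every other transition yields step_reward.
def value_iteration(n_states, step_reward, goal, gamma=0.9, iters=100):
    V = [0.0] * n_states
    for _ in range(iters):
        new_V = []
        for s in range(n_states):
            # Bellman backup: take the best action's one-step lookahead.
            candidates = []
            for ds in (-1, 1):
                s2 = min(max(s + ds, 0), n_states - 1)
                r = 1.0 if s2 == goal else step_reward
                candidates.append(r + gamma * V[s2])
            new_V.append(max(candidates))
        V = new_V
    return V
```

Because the backup is a contraction with factor gamma, the values converge geometrically to the optimal value function.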

Fri 30 Nov 2018

Assignment 5 Due at 23:55

Tue 4 Dec 2018

Lecture 19: Policy Iteration

Assignment 6 Out

- [required] Course Notes: Value and Policy Iteration
- [optional] Metacademy: Policy Iteration

Thu 6 Dec 2018

Lecture 20: Model-based Reinforcement Learning

- [required] Course Notes: Reinforcement Learning

Tue 11 Dec 2018

Lecture 21: Model-free Reinforcement Learning

- [required] Course Notes: Reinforcement Learning

- Assignment 1: Out Tues 18 Sept; Due Fri 28 Sept at 23:55 (hw1.tex) solutions
- Assignment 2: Out Tues 2 Oct; Due Fri 12 Oct at 23:55 (hw2.tex, spam.train.dat, spam.test.dat) solutions
- Assignment 3: Out Tues 16 Oct; Due Mon 5 Nov at 23:55 (hw3.tex, motorcycle.csv) solutions
- Assignment 4: Out Tues 6 Nov; Due Fri 16 Nov at 23:55 (hw4.tex, cities100.csv) solutions
- Assignment 5: Out Weds 21 Nov; Due Mon 3 Dec at 23:55 (hw5.tex) solutions
- Assignment 6: Out Tues 4 Dec; Due Weds 19 Dec at 23:55 (hw6.tex, hw6-code.tgz)

- Assignments: 60%
- Midterm: 20%
- Final: 20%

- [Murphy] Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press.
- [Bishop] Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer.
- [MacKay] David J.C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press. **Freely available online.**
- [ESL] Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning, Springer. **Freely available online.**
- [ISL] Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani, An Introduction to Statistical Learning, Springer. **Freely available online.**

**What is Metacademy?** Metacademy is an exciting online tool developed by Roger Grosse and Colorado Reed for helping you to develop personalized instruction. It's meant to help you manage what you know about different topics and develop an individualized curriculum to learn a new subject.

**I have an interview/sporting event/illness/computer crash. Can I have an extension?** No. You can turn your assignment in up to a week late for half credit.

**How should I format my assignment?** It is highly recommended to use the provided .tex and .cls files (see the examples) to produce a LaTeX document. Compile it to PDF and upload the result to the course dropbox.

**What if I don't know how to use LaTeX?** Everyone doing quantitative work should know how to use LaTeX, so consider this class an opportunity to learn it.

**Is attendance required at lecture/precept?** No, attendance is not required. However, you will be assessed on material that is presented in both lecture and precepts and may or may not be available from the readings alone.

**I can't make the precept to which I was assigned. Can I switch?** You are welcome to attend whatever precept you want, modulo space constraints.