Princeton University, Spring 2019

Prof. Ryan Adams (OH: Mon and Weds 3-4pm in CS 411)

TA: Jad Rahme (OH: Tue 6-8pm in Fine Hall 216)

TA: Farhan Damani (OH: Mon 7-9pm outside CS 242)

TA: Fanghong Dong (OH: Wed 4-6pm CS 2nd floor tea room)

Time: Monday and Wednesday, 1:30-2:50pm

Location: COS 104

Contact: cos324-s19@lists.cs.princeton.edu

- 15 April 2019: HW5 posted.
- 1 April 2019: HW4 posted.
- 27 March 2019: Midterm solutions posted.
- 11 March 2019: Midterm practice problems and solutions posted.
- 9 March 2019: HW3 posted.
- 9 March 2019: Reminder -- midterm review on Monday 11 March 5-7pm in CSML classroom
- 21 February 2019: HW2 posted.
- 11 February 2019: Extension for HW1 to Monday 18 February at 23:55.
- 6 February 2019: HW1 posted.
- 6 February 2019: Precepts are:
- Wednesday 7:30-8:20pm Friend Center 009
- Thursday 7:30-8:20pm Friend Center 112
- Friday 1:30-2:20pm Andlinger Center 017

- 1 February 2019: Sign up on the Piazza discussion site.

Mon 4 February 2019

Lecture 1: Introduction

- [optional] Book: Murphy -- Chapter 1 -- *Introduction*
- [optional] Book: Bishop -- Chapter 1 -- *Introduction*

Weds 6 February 2019

Lecture 2: Linear Regression I

Assignment 1 Out

- [required] Course Notes: Linear Regression
- [optional] Metacademy: Linear Regression
- [optional] ESL: Sections 2.1-2.3 -- Overview of Supervised Learning
- [optional] ESL: Sections 3.1-3.2 -- Linear Methods for Regression
- [optional] External Course Notes: Andrew Ng Notes Sections 1 and 2
- [optional] External Slides: Roger Grosse CSC321 Lecture 2
- [optional] ISL: Sections 3.1-3.2 -- Linear Regression
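To make the least-squares idea from these readings concrete, here is a minimal sketch (not from the course materials; the function name is ours) of the closed-form fit for a one-dimensional linear model:

```python
# Minimal sketch: ordinary least squares for y ≈ w*x + b, fit in closed form
# by minimizing the sum of squared errors.

def fit_line(xs, ys):
    """Return (w, b) minimizing sum((w*x + b - y)^2)."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope is cov(x, y) / var(x); intercept makes the line pass through the means.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    w = cov / var
    b = mean_y - w * mean_x
    return w, b

w, b = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
# Perfectly linear data: recovers w = 2, b = 1.
```

The course notes develop the general multivariate version; the scalar case above is just the simplest instance.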

Mon 11 February 2019

Lecture 3: Linear Regression II

- [required] Course Notes: Maximum Likelihood Linear Regression
- [required] Course Notes: Features and Basis Functions
- [optional] Metacademy: Linear Regression as Maximum Likelihood
- [optional] Mathematical Monk Video: MLE for Linear Regression Part 1, Part 2, Part 3
- [optional] External Course Notes: Andrew Ng Notes Section 3

Weds 13 February 2019

Lecture 4: Linear Regression III

- [required] Course Notes: Maximum Likelihood Linear Regression
- [optional] Metacademy: Linear Regression as Maximum Likelihood
- [optional] Mathematical Monk Video: MLE for Linear Regression Part 1, Part 2, Part 3
- [optional] External Course Notes: Andrew Ng Notes Section 3

Mon 18 February 2019

Lecture 5: Features and Basis Functions

Assignment 1 Due at 23:55

- [required] Course Notes: Features and Basis Functions
- [optional] ESL: Sections 5.1-5.3 -- Basis Expansions and Regularization
- [optional] Mathematical Monk Video: Basis Functions for MLE
- [optional] ISL: Chapter 7 -- Moving Beyond Linearity
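A tiny sketch of the basis-expansion idea (ours, not from the course notes): mapping inputs through fixed nonlinear features keeps the model linear in the weights, so the least-squares machinery applies unchanged.

```python
def poly_features(x, degree):
    """Map a scalar input x to the polynomial basis [1, x, x^2, ..., x^degree].

    A model y ≈ sum_d w_d * x^d is nonlinear in x but linear in the
    weights w, so it can still be fit by ordinary least squares.
    """
    return [x ** d for d in range(degree + 1)]

feats = poly_features(2.0, 3)  # [1.0, 2.0, 4.0, 8.0]
```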

Wed 20 February 2019

Lecture 6: Overfitting and Regularization

- [required] Course Notes: Overfitting and Regularization
- [optional] ESL: Section 2.9 -- Model Selection and the Bias-Variance Tradeoff
- [optional] ESL: Section 3.4 -- Shrinkage Methods
- [optional] Mathematical Monk Video: Bias-Variance Decomposition
- [optional] Metacademy: Ridge Regression
- [optional] Metacademy: Regularization
- [optional] Metacademy: Bias-Variance Decomposition
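As a one-line illustration of shrinkage (a sketch under our own notation, not the course's), ridge regression for a no-intercept scalar model has a closed form that makes the effect of the penalty obvious:

```python
def ridge_1d(xs, ys, lam):
    """Closed-form ridge estimate for y ≈ w*x (no intercept):
    minimizing sum_i (w*x_i - y_i)^2 + lam * w^2 gives
    w = (sum_i x_i*y_i) / (sum_i x_i^2 + lam)."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

# lam = 0 recovers ordinary least squares; lam > 0 shrinks w toward zero.
```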

Thu 21 February 2019

Assignment 2 Out

Mon 25 February 2019

Lecture 7: Cross Validation

- [required] Course Notes: Model Selection
- [optional] ESL: Sections 7.1-7.3, 7.10 -- Model Assessment and Selection
- [optional] ISL: Section 5.1 -- Cross-Validation
- [optional] Mathematical Monk Video: Cross-validation part 1, Part 2, Part 3
- [optional] Metacademy: Cross-validation
- [optional] Wikipedia: Cross-validation
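The mechanics of k-fold cross-validation can be sketched in a few lines (illustrative only; names are ours): partition the data into k disjoint validation folds, train on the rest, and average the validation error across folds.

```python
def kfold_indices(n, k):
    """Split indices 0..n-1 into k disjoint validation folds (round-robin);
    returns a list of (train_indices, val_indices) pairs."""
    folds = [list(range(i, n, k)) for i in range(k)]
    return [(sorted(set(range(n)) - set(f)), f) for f in folds]

splits = kfold_indices(10, 5)
# Every example lands in exactly one validation fold.
```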

Weds 27 February 2019

Lecture 8: Linear Classification I

- [required] Course Notes: Perceptrons
- [optional] ESL: Sections 4.1-4.2 -- Linear Methods for Classification
- [optional] External Course Notes: Andrew Ng Notes Section 6
- [optional] ISL: Sections 4.1-4.2 -- Classification
- [optional] MacKay: Chapter 39 -- The Single Neuron as a Classifier
- [optional] MacKay: Chapter 40 -- Capacity of a Single Neuron
- [optional] Metacademy: Perceptron Learning Algorithm
- [optional] Metacademy: Binary Linear Classification
- [optional] External Slides: Roger Grosse CSC321 Lecture 4
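The perceptron learning rule from these readings fits in a few lines; here is a hedged sketch (toy data and names are ours):

```python
def perceptron(data, epochs=10):
    """Perceptron learning rule. data: list of (x, y) with y in {-1, +1} and
    x a feature list (include a constant 1 as a bias feature). On each
    mistake, nudge the weights toward the misclassified example."""
    w = [0.0] * len(data[0][0])
    for _ in range(epochs):
        for x, y in data:
            if y * sum(wi * xi for wi, xi in zip(w, x)) <= 0:  # wrong side
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return w

data = [([1.0, 2.0], +1), ([1.0, -2.0], -1)]  # tiny separable toy set
w = perceptron(data)
```

For linearly separable data the algorithm converges to a separating hyperplane; otherwise it cycles, which motivates the logistic and SVM formulations in the next lectures.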

Mon 4 March 2019

Lecture 9: Linear Classification II

- [required] Course Notes: Logistic Regression
- [optional] ESL: Section 4.4 -- Logistic Regression
- [optional] Mathematical Monk Video: Logistic Regression Part 1, Part 2, Part 3, Part 4
- [optional] External Course Notes: Andrew Ng Notes Section 5
- [optional] ISL: Section 4.3 -- Logistic Regression
- [optional] Metacademy: Logistic Regression
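A minimal sketch of logistic regression trained by batch gradient descent (our own toy example, not the course's code); the key fact is that the gradient of the negative log-likelihood for each example is simply (sigmoid(w·x) − y)·x:

```python
import math

def train_logistic(data, lr=0.5, steps=200):
    """Batch gradient descent on the logistic negative log-likelihood.
    data: list of (x, y) with y in {0, 1} and x a feature list."""
    w = [0.0] * len(data[0][0])
    for _ in range(steps):
        grad = [0.0] * len(w)
        for x, y in data:
            # Predicted probability p = sigmoid(w . x).
            p = 1.0 / (1.0 + math.exp(-sum(wi * xi for wi, xi in zip(w, x))))
            for j, xj in enumerate(x):
                grad[j] += (p - y) * xj
        w = [wi - lr * g for wi, g in zip(w, grad)]
    return w

data = [([1.0, 1.0], 1), ([1.0, -1.0], 0)]  # bias feature + one input
w = train_logistic(data)
```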

Weds 6 March 2019

Lecture 10: Linear Classification III -- Support Vector Machines

Assignment 2 Due at 23:55

Assignment 3 Out

- [required] Course Notes: Support Vector Machines
- [optional] ESL: Section 4.5 -- Separating Hyperplanes
- [optional] ESL: Chapter 12 -- Support Vector Machines and Flexible Discriminants
- [optional] ISL: Sections 9.1-9.2 -- Support Vector Machines
- [optional] Metacademy: Support Vector Machine
- [optional] External Course Notes: Andrew Ng Notes Part V

Mon 11 March 2019

Lecture 11: Kernel-based Classification

- [required] Course Notes: Kernels
- [optional] ISL: Sections 9.3-9.4 -- Support Vector Machines
- [optional] ESL: Section 5.8 -- Regularization and Reproducing Kernel Hilbert Spaces
- [optional] Metacademy: Kernel Trick

Weds 13 March 2019

**MIDTERM EXAM** (Covers through Lecture 10 on 6 March)

Mon 25 March 2019

Lecture 12: Neural Networks I

- [required] Course Notes: Backprop and Automatic Differentiation
- [optional] ESL: Chapter 11 -- Neural Networks
- [optional] Metacademy: Feed-Forward Neural Nets
- [optional] Metacademy: Backpropagation
- [optional] External Slides: Roger Grosse CSC321 Lecture 5: Learning in a Single Neuron
- [optional] External Slides: Roger Grosse CSC321 Lecture 6: Backpropagation
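Backpropagation is just the chain rule applied systematically; here is a sketch for a single sigmoid neuron (names and toy numbers are ours), checked against a finite-difference estimate, which is a standard way to debug gradient code:

```python
import math

def neuron_grad(w, b, x, y):
    """Forward and backward pass for one sigmoid neuron with squared loss.
    Returns (loss, dloss/dw, dloss/db) via the chain rule."""
    z = w * x + b                       # affine pre-activation
    a = 1.0 / (1.0 + math.exp(-z))      # sigmoid activation
    loss = 0.5 * (a - y) ** 2
    da = a - y                          # dloss/da
    dz = da * a * (1.0 - a)             # sigmoid derivative is a * (1 - a)
    return loss, dz * x, dz             # dz/dw = x, dz/db = 1

# Sanity check: compare the analytic gradient to central differences.
loss, dw, db = neuron_grad(0.3, -0.1, 1.5, 1.0)
eps = 1e-6
numeric = (neuron_grad(0.3 + eps, -0.1, 1.5, 1.0)[0]
           - neuron_grad(0.3 - eps, -0.1, 1.5, 1.0)[0]) / (2 * eps)
```

Automatic differentiation generalizes this bookkeeping to arbitrary compositions of such operations.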

Weds 27 March 2019

Lecture 13: Neural Networks II

- [optional] Metacademy: Convolutional Neural Nets
- [optional] External Slides: Roger Grosse CSC321 Lecture 7: Neural Language Models
- [optional] External Slides: Roger Grosse CSC321 Lecture 9: Recurrent Neural Nets
- [optional] External Slides: Roger Grosse CSC321 Lecture 11: Convolutional Nets

Fri 29 March 2019

Assignment 3 Due at 23:55

Mon 1 April 2019

Lecture 14: K-Means Clustering

Assignment 4 Out

- [required] Course Notes: K-Means Clustering
- [optional] Mathematical Monk Video: K-Means Clustering Part 1, Part 2
- [optional] ISL: Sections 10.3 -- Clustering Methods
- [optional] Metacademy: K-Means Clustering
- [optional] Metacademy: K-Means++
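Lloyd's algorithm for k-means alternates two steps, sketched here for scalar data (illustrative only; function name and toy data are ours): assign each point to its nearest center, then move each center to the mean of its assigned points.

```python
def kmeans_1d(xs, centers, iters=20):
    """Lloyd's algorithm on scalars. Each iteration can only decrease the
    within-cluster sum of squares, so it converges (to a local optimum)."""
    for _ in range(iters):
        clusters = [[] for _ in centers]
        for x in xs:
            nearest = min(range(len(centers)),
                          key=lambda t: (x - centers[t]) ** 2)
            clusters[nearest].append(x)
        # Empty clusters keep their old center.
        centers = [sum(c) / len(c) if c else m
                   for c, m in zip(clusters, centers)]
    return centers

centers = kmeans_1d([0.0, 1.0, 10.0, 11.0], [0.0, 10.0])
# Two well-separated groups: centers settle at 0.5 and 10.5.
```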

Weds 3 April 2019

Lecture 15: Hierarchical Clustering

- [required] Course Notes: Hierarchical Clustering
- [optional] External Slides: Ryan Tibshirani CMU Data Mining: Hierarchical Clustering (Part II)
- [optional] ESL: Section 14.3: Cluster Analysis
- [optional] ISL: Sections 10.3 -- Clustering Methods

Mon 8 April 2019

Lecture 16: Principal Component Analysis

- [required] Course Notes: Principal Component Analysis
- [optional] ISL: Sections 10.2 -- Principal Component Analysis
- [optional] Metacademy: Principal Component Analysis
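One way to compute the leading principal component without an eigensolver is power iteration on the centered data's scatter matrix; this is a minimal sketch (ours, not the course's implementation), assuming a generic starting vector not orthogonal to the answer:

```python
def first_pc(data, iters=200):
    """Leading principal component via power iteration: center the data,
    then repeatedly apply X^T X to a vector and renormalize. Converges to
    the top eigenvector of the scatter matrix for generic starts."""
    n, d = len(data), len(data[0])
    means = [sum(col) / n for col in zip(*data)]
    X = [[a - m for a, m in zip(row, means)] for row in data]
    v = [1.0] * d
    for _ in range(iters):
        scores = [sum(a * b for a, b in zip(row, v)) for row in X]        # X v
        v = [sum(row[j] * s for row, s in zip(X, scores)) for j in range(d)]  # X^T (X v)
        norm = sum(c * c for c in v) ** 0.5
        v = [c / norm for c in v]
    return v

# Points on the line y = x: the top component is the (1,1) direction.
v = first_pc([[0.0, 0.0], [1.0, 1.0], [2.0, 2.0], [3.0, 3.0]])
```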

Weds 10 April 2019

Lecture 17: SVD and Latent Factor Models

- [optional] Metacademy: Latent Semantic Analysis

Fri 12 April 2019

Assignment 4 Due at 23:55

Mon 15 April 2019

Lecture 18: Markov Decision Processes

Assignment 5 Out

- [required] Course Notes: Markov Decision Processes
- [optional] Metacademy: Markov Decision Process

Weds 17 April 2019

Lecture 19: Value Iteration

- [required] Course Notes: Value and Policy Iteration
- [optional] Metacademy: Value Iteration
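Value iteration is a repeated Bellman backup; the whole algorithm is a few lines once the MDP is tabulated (a sketch with our own toy MDP, not from the course notes):

```python
def value_iteration(P, R, gamma=0.9, iters=200):
    """Bellman backup: V(s) <- R(s) + gamma * max_a sum_t P[a][s][t] * V(t).
    P[a][s][t] is the probability of moving s -> t under action a;
    R[s] is the reward for being in state s."""
    n = len(R)
    V = [0.0] * n
    for _ in range(iters):
        V = [R[s] + gamma * max(sum(P[a][s][t] * V[t] for t in range(n))
                                for a in range(len(P)))
             for s in range(n)]
    return V

# Toy MDP: two absorbing states, one action ("stay"). State 1 pays reward 1
# forever, so its value converges to 1 / (1 - gamma) = 10.
V = value_iteration(P=[[[1.0, 0.0], [0.0, 1.0]]], R=[0.0, 1.0])
```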

Mon 22 April 2019

Lecture 20: Policy Iteration

- [required] Course Notes: Value and Policy Iteration
- [optional] Metacademy: Policy Iteration

Mon 24 April 2019

Lecture 21: Reinforcement Learning I

- [required] Course Notes: Reinforcement Learning

Fri 26 April 2019

Assignment 5 Due at 23:55

Mon 29 April 2019

Lecture 22: Reinforcement Learning II

Assignment 6 Out

- [required] Course Notes: Reinforcement Learning

Weds 1 May 2019

Lecture 23: Wrap-up

Fri 10 May 2019

Assignment 6 Due at 23:55

LaTeX template and example: cos324.cls cos324-example.tgz

- Assignment 1 -- Out Weds 5 Feb, due ~~Fri 15 Feb~~ Mon 18 Feb at 23:55 -- hw1.pdf, hw1.tex (hw1-solutions.pdf)
- Assignment 2 -- Out Thu 21 Feb, due Wed 6 Mar at 23:55 -- hw2.pdf, hw2.tex, hmeq-train.csv, hmeq-test.csv (hw2-solutions.pdf)
- Assignment 3 -- Out Sat 8 Mar, due Fri 29 Mar at 23:55 -- hw3.pdf, hw3.tex, motorcycle.csv (hw3-solutions.pdf)
- Assignment 4 -- Out Mon 1 Apr, due Fri 12 Apr at 23:55 -- hw4.pdf, hw4.tex, cities100.csv
- Assignment 5 -- Out Mon 15 Apr, due Fri 26 Apr at 23:55 -- hw5.pdf, hw5.tex
- Assignment 6

- Assignments: 60%
- Midterm: 20%
- Final: 20%

- [Murphy] Kevin Murphy, Machine Learning: A Probabilistic Perspective, MIT Press.
- [Bishop] Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer.
- [MacKay] David J.C. MacKay, Information Theory, Inference, and Learning Algorithms, Cambridge University Press. **Freely available online.**
- [ESL] Trevor Hastie, Robert Tibshirani, and Jerome Friedman, The Elements of Statistical Learning, Springer. **Freely available online.**
- [ISL] Gareth James, Daniela Witten, Trevor Hastie, and Robert Tibshirani, An Introduction to Statistical Learning, Springer. **Freely available online.**

**What is Metacademy?** Metacademy is an online tool developed by Roger Grosse and Colorado Reed to support personalized instruction. It's meant to help you track what you know about different topics and develop an individualized curriculum for learning a new subject.

**I have an interview/sporting event/illness/computer crash. Can I have an extension?** No. You can turn your assignment in up to a week late for half credit.

**How should I format my assignment?** It is highly recommended to use the provided .tex and .cls files (see the examples) to produce a LaTeX document. Compile it to PDF and upload the result to the course dropbox.

**What if I don't know how to use LaTeX?** Everyone doing quantitative work should know how to use LaTeX, so consider this class an opportunity to learn it.

**Is attendance required at lecture/precept?** No, attendance is not required. However, you will be assessed on material presented in both lecture and precepts, which may not be available from the readings alone.

**I can't make the precept to which I was assigned. Can I switch?** You are welcome to attend whatever precept you want, modulo space constraints.