STA561: Probabilistic Machine Learning: Fall 2013

Prof: Barbara Engelhardt, barbara.engelhardt@duke.edu, OH: Fri 2-3, Gross 318
TAs: Kevin Luo, kevinluo@cs.duke.edu, OH: Friday 10-11, North 225
Jordan Hashemi, jordan.hashemi@duke.edu, OH: Thurs 3-4, Gross Hall, 318
Nick Jarrett, nicholas.jarrett@duke.edu, OH: Wed 7-9pm, SECC (Old Chemistry 211A)
Wenzhao Lian, wl89@duke.edu, OH: Thurs 1-2, Gross Hall, 351
Class: M/W 10:05-11:20am, Westbrook 0012, Divinity School

Description

Introduction to machine learning techniques. Graphical models, latent variable models, dimensionality reduction techniques, deep learning, regression, kernel methods, state space models, HMMs, MCMC, variational methods. Emphasis is on applying these techniques to real data in a variety of application areas.


News and information

CS students: If you are taking this course to fulfill your CS AI/ML Quals requirement, then you will be required, without exception, to take a written final exam, and this final will be worth 35% of your grade (the other grade proportions will be shuffled around to accommodate this change). You will not be exempt from the other course requirements.

All students: we will have two poster sessions: December 4th (Wednesday) from 2-5pm and December 14th (Saturday) from 2-5pm (both in Gross Hall 3rd floor East Meeting Space). Come to one and only one of these sessions. I highly recommend coming to the first. If you are in the above category of CS students taking this class for credit, you will be able to take the final exam during either time period. We will send around a survey so you can sign up for one of the two dates. If you are auditing the course, we'd love to have you at the first poster session (bring your research groups too!).


A background in statistics at the level of STA611 (Introduction to Statistical Methods) is encouraged, along with knowledge of linear algebra and multivariate calculus.

Course grade is based on homeworks (45%), a take-home midterm (15%), a final project (30%), and class participation and scribe notes (10%). Homeworks are due exactly one week after they are handed out, at the beginning of class. Solutions should be uploaded to Sakai before class on the due date; each should be a single PDF document, and additional files will not be considered. I recommend using LaTeX to write up your homeworks (here is a homework template for you to use); if you have never used LaTeX before, you should consider this course an excellent opportunity to learn it. Late homeworks will not be accepted, with one exception: you are allowed a single late homework (at most one week late) for the course. Students may (and should) discuss the homework assignments collaboratively; however, I expect each student to program and write up their own solutions. Please write the names of the students you discussed the assignment with at the top of your solutions, and cite any material used in preparing them.
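If you have never seen LaTeX, here is a rough sketch of what a homework write-up looks like. This is a generic skeleton, not the linked course template: the document class, packages, and problem content below are illustrative assumptions, and you should use the actual homework template for submissions.

    % A minimal homework skeleton (a generic sketch; the linked course
    % template defines the actual required format).
    \documentclass[11pt]{article}
    \usepackage{amsmath,amssymb} % math environments and symbols
    \usepackage{graphicx}        % for including figures

    \title{STA561 Homework 1}
    \author{Your Name \\ (discussed with: names of collaborators)}
    \date{\today}

    \begin{document}
    \maketitle

    \section*{Problem 1}
    For $n$ i.i.d. observations $x_1, \dots, x_n$ from a model
    $p(x \mid \theta)$, the log-likelihood is
    \[
      \ell(\theta) = \sum_{i=1}^{n} \log p(x_i \mid \theta).
    \]

    \end{document}

Compiling this with pdflatex produces a single PDF, which is the format Sakai submissions should take.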

There is a Piazza course discussion page. Please direct questions about homeworks and other matters to that page. Otherwise, you can email the instructors (TAs and professor) at sta561-ta@duke.edu. Note that we are more likely to respond to Piazza questions than to email, and your classmates may respond too, so Piazza is a good place to start.

Each lecture will have up to four scribes, who will type up notes in the LaTeX template. Within a week of the class, the scribes should send the TAs the LaTeX file, at which point we will edit the notes and post them to the website. It is best to discuss beforehand with your fellow scribes what role each of you will take in the write-up (careful note-taker, TeX writer, figure maker, or copy editor), or whether you will split these roles among the four of you. If you have never used LaTeX before, there are online tutorials, Mac GUIs, and even online compilers that might help you. Here is an example of well-done scribe notes and an associated figure.

The course project will include a project proposal due mid-semester, a four-page writeup of the project at the end of the semester, and an all-campus poster session where you will present your work. This is the most important part of the course; we strongly encourage you to come and discuss project ideas with us early and often throughout the semester. We expect some of these projects to become publications. You are absolutely permitted to use your current rotation or research project as your course project.

In addition to the course texts listed below, a second set of references, for R, may be useful. First, you can download R from the CRAN website. There are many resources, such as RStudio, that can help with the programming interface, and tutorials on R are easy to find online. If you are getting bored with the standard graphics package, I really like using ggplot2 for beautiful graphics and figures. Finally, you can integrate R code and output with plain text using knitr, but that might be going a bit too far if you are a beginner.
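As a small illustration of the ggplot2 style, the sketch below draws a labeled scatterplot. It assumes ggplot2 is installed and uses R's built-in mtcars data set; the axis labels and title are just illustrative choices.

    # A minimal ggplot2 sketch: scatterplot of R's built-in mtcars data.
    # Assumes ggplot2 is installed, e.g. via install.packages("ggplot2").
    library(ggplot2)
    ggplot(mtcars, aes(x = wt, y = mpg)) +
      geom_point() +                       # one point per car model
      labs(x = "Weight (1000 lbs)",
           y = "Miles per gallon",
           title = "Fuel efficiency vs. weight")

Note how the plot is built by adding layers (points, labels) to a base object; this layered grammar is what distinguishes ggplot2 from the standard graphics package.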

The course will follow Kevin Murphy's Machine Learning: A Probabilistic Perspective. I may include optional readings or videos as appropriate. Some other texts and notes that may be useful include:

  1. Michael Lavine, Introduction to Statistical Thought (an introductory statistics textbook with plenty of R examples; available online)
  2. Chris Bishop, Pattern Recognition and Machine Learning
  3. Daphne Koller & Nir Friedman, Probabilistic Graphical Models
  4. Trevor Hastie, Robert Tibshirani, & Jerome Friedman, The Elements of Statistical Learning (ESL) (PDF available online)
  5. David J.C. MacKay, Information Theory, Inference, and Learning Algorithms (PDF available online)

The final project TeX template and style file should be used in preparing your final project report. Please follow the instructions and let me know if you have questions. In lieu of a final exam, we will have a poster session where you present your research project.

This syllabus is tentative, and will almost surely be modified. Reload your browser for the current version.


Syllabus

  1. (August 26th) Introduction: concepts in probability and statistics [Scribe notes]
  2. (August 28th) Introduction: MLE, MAP, Bayesian reasoning [Scribe notes]
  3. (September 2nd) Introduction: exponential family, conjugacy, and sufficiency [Scribe notes]
  4. (September 4th) Simple discrete models: chains, trees, hierarchical models [Scribe notes]
  5. (September 9th) Gaussian models [Scribe notes]
  6. (September 11th) Linear Regression [Scribe notes]
  7. (September 16th) Generalized linear models [Scribe notes]
  8. (September 18th) Mixture models & K-means & Expectation Maximization [Scribe notes]
  9. (September 23rd) Hidden Markov models [Scribe notes]
  10. (September 25th) Hidden Markov models (continued) and State Space models [Scribe notes]
  11. (September 30th) Exact inference [Scribe notes]
  12. (October 2nd) Factor analysis [Scribe notes]
  13. (October 7th) Sparse linear models [Scribe notes]
  14. (October 9th) Kernels and kernel methods [Scribe notes]
  15. (October 16th) Gaussian processes [Scribe notes]
  16. (October 21st) Markov random fields
  17. (October 23rd) MCMC introduction: importance sampling, rejection sampling, simulated annealing, MCs
  18. (October 28th) MCMC: Metropolis hastings, Gibbs sampling
  19. (October 30th) Variational algorithms: duality, mean field
  20. (November 4th) Variational algorithms: BP, EP, convex relaxations, VB [Scribe notes]
  21. (November 6th) Latent Dirichlet allocation & topic models [no scribe notes; see slides in Sakai and David Blei's website for details]
  22. (November 11th) Dirichlet process mixture models
  23. (November 13th) Review class [Unedited lecture notes]
  24. (November 18th) Adaptive basis function learning (CART and Boosting)
  25. (November 20th) Deep belief learning
  26. (November 25th) Graphical model structure learning