COS513: Foundations of Probabilistic Modeling

1 Course Information

Lectures are on Mondays and Wednesdays from 11:00 AM to 12:20 PM in Computer Science room 105.

There is a weekly recitation session on Tuesdays at 4:25 PM in Friend 108.

The course staff are

  • David Blei (Professor); blei@cs.princeton.edu
  • David Mimno; mimno@cs.princeton.edu

Office hours are available by appointment.

The final project tex template is here.

2 Course Description

Probabilistic modeling is a mainstay of modern artificial intelligence research, providing essential tools for analyzing the vast amounts of data that have become available in science, scholarship, and everyday life. This course will cover the mathematical and algorithmic foundations of this field, as well as methods underlying the current state of the art.

Over the last century, problems that have been partially solved with probabilistic models include:

  • Automatically grouping genes into clusters
  • Identifying email that is likely to be spam
  • Transcribing speech from the recorded signal
  • Identifying recurring patterns in gene sequences
  • Predicting books or movies that a user will like based on his or her previous purchases
  • Tracking an object's position by radar
  • Determining the structure of the evolutionary tree of a set of species
  • Diagnosing a disease from its symptoms
  • Decoding the original message from a noisy transmission
  • Understanding the phase transitions in a physical system of electrons

Each of these applications of probabilistic modeling has involved specifying a statistical model, a method for fitting that model to observed data, and a method for using the fitted model to solve the task at hand. As the diversity of applications listed above suggests, each model has been developed and studied within a different intellectual community.
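As a minimal sketch of this three-step pattern (specify a model, fit it to data, use the fit), consider the spam example from the list above. The code below is an illustration only, not course material; the training messages are invented, and the model is a deliberately tiny naive Bayes classifier fit by maximum likelihood with add-one smoothing.

```python
import math
from collections import Counter

# Invented training data: (message, label) pairs.
train = [
    ("win money now", "spam"),
    ("free money offer", "spam"),
    ("meeting notes attached", "ham"),
    ("lunch meeting tomorrow", "ham"),
]

# Step 1, the model: a class prior p(c) and per-class word
# probabilities p(w | c), with add-one (Laplace) smoothing so
# unseen words still get nonzero probability.
labels = [c for _, c in train]
prior = {c: labels.count(c) / len(labels) for c in set(labels)}
counts = {c: Counter() for c in prior}
for text, c in train:
    counts[c].update(text.split())
vocab = {w for ctr in counts.values() for w in ctr}

def word_prob(w, c):
    return (counts[c][w] + 1) / (sum(counts[c].values()) + len(vocab))

# Step 2, fitting, happened above: the counts are the
# maximum-likelihood estimates (smoothed).

# Step 3, using the fitted model: score a new message under each
# class and return the higher-scoring one.
def classify(text):
    scores = {}
    for c in prior:
        logp = math.log(prior[c])
        for w in text.split():
            if w in vocab:
                logp += math.log(word_prob(w, c))
        scores[c] = logp
    return max(scores, key=scores.get)

print(classify("free money"))  # prints "spam"
```

Real spam filters are far richer, but the division of labor is the same: assumptions live in the model, the data determine its parameters, and predictions come from the fitted model.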

Over the past two decades, scholars working in the field of machine learning have sought to unify such data analysis activities. Their focus has been on developing general-purpose tools for devising, analyzing, and implementing probabilistic models. These efforts have led to the body of work on probabilistic graphical models, a marriage of graph theory and probability theory. Graphical models provide both a language for expressing assumptions about data and a suite of efficient algorithms for reasoning and computing under those assumptions.
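The two roles of a graphical model, a language for assumptions and efficient algorithms, can be seen even in a toy example. The sketch below (illustrative only; the conditional probability tables are invented) encodes a three-node chain x → y → z, whose graph asserts the factorization p(x, y, z) = p(x) p(y|x) p(z|y), and computes the marginal p(z) two ways: by brute-force summation over the joint, and by eliminating one variable at a time, the idea behind the elimination algorithm covered early in the syllabus.

```python
import itertools

# Invented conditional probability tables for binary variables on
# the chain x -> y -> z, so p(x, y, z) = p(x) p(y|x) p(z|y).
p_x = {0: 0.6, 1: 0.4}
p_y_given_x = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.3, 1: 0.7}}   # [x][y]
p_z_given_y = {0: {0: 0.8, 1: 0.2}, 1: {0: 0.25, 1: 0.75}}  # [y][z]

def joint(x, y, z):
    return p_x[x] * p_y_given_x[x][y] * p_z_given_y[y][z]

# Brute force: p(z) by summing the joint over all (x, y) settings.
p_z_brute = {z: sum(joint(x, y, z)
                    for x, y in itertools.product((0, 1), (0, 1)))
             for z in (0, 1)}

# Elimination: sum out x first, producing a "message" m(y), then
# sum out y. Same answer, but the work grows with the largest
# intermediate factor rather than with the full joint.
m_y = {y: sum(p_x[x] * p_y_given_x[x][y] for x in (0, 1))
       for y in (0, 1)}
p_z_elim = {z: sum(m_y[y] * p_z_given_y[y][z] for y in (0, 1))
            for z in (0, 1)}

assert all(abs(p_z_brute[z] - p_z_elim[z]) < 1e-12 for z in (0, 1))
print(p_z_elim)  # prints {0: 0.613, 1: 0.387}
```

With three binary variables the savings are invisible, but on large graphs this reordering of sums is the difference between exponential and tractable inference.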

As a consequence, graphical models research has forged connections between signal processing, coding theory, computational biology, natural language processing, computer vision, statistics, and many other fields. Knowledge of graphical models is essential to academics working in artificial intelligence and machine learning, and is of increasing importance to those in the other scientific and engineering fields to which these methods have been applied.

3 Syllabus and Readings

The syllabus is subject to change. Readings and dates will be added as the semester progresses.

| Date     | Lecture topic                                              | Reading                      | Scribe notes |
|----------+------------------------------------------------------------+------------------------------+--------------|
| 9/20/10  | Introduction / Graphical models                            | Jordan (2004)                | Feehan       |
| 9/22/10  | Conditional independence                                   | ITGM Ch 2                    | Asmuth       |
| 9/27/10  | d-Separation and elimination                               | ITGM Ch 3                    | Pereira      |
| 9/29/10  | Elimination (cont.)                                        |                              | Eldar        |
| 10/4/10  | Propagation on trees                                       | ITGM Ch 4                    | Lee          |
| 10/6/10  | Statistical concepts                                       | ITGM Ch 5, Freedman (1994)   | White        |
| 10/11/10 | Statistical concepts (cont.)                               |                              | Anuradha     |
| 10/13/10 | Linear regression                                          | EoSL Ch 3.1-3.2              | Wang         |
| 10/18/10 | Linear regression (cont.) and regularized linear regression | EoSL Ch 3.4                 | Nag          |
| 10/20/10 | Mixture models and expectation maximization (Mimno)        | ITGM Ch 11                   | Barut        |
| 10/25/10 | Exponential families                                       | Bishop, Ch 2.4               | Weinstein    |
| 10/27/10 | Exponential families (cont.)                               |                              | Bandeira     |
| 11/8/10  | Generalized linear models                                  | McCullagh and Nelder, Ch 1-2 | Bastian      |
| 11/10/10 | Generalized linear models (cont.)                          |                              | Lee          |
| 11/15/10 | Factor analysis, principal component analysis              | Bishop, Ch 12                | Partridge    |
| 11/17/10 | Expectation maximization, FA/PCA (cont.)                   | ITGM Ch 11                   | Acs          |
| 11/22/10 | Sequence models I                                          | ITGM Ch 12                   | Gopalan      |
| 11/24/10 | Sequence models II                                         |                              | Kao          |
| 11/29/10 | Markov chain Monte Carlo sampling I                        | Neal (1993)                  | Manning      |
| 12/1/10  | Markov chain Monte Carlo sampling II                       |                              | Jang         |
| 12/6/10  | Markov chain Monte Carlo sampling III                      |                              | Salesi       |
| 12/8/10  | Nonparametric regression (Guest: Prof. Philippe Rigolet)   |                              | Wang         |
| 12/13/10 | Variational methods I                                      | Blei (2004)                  | Lee and Zhao |
| 12/15/10 | Bayesian nonparametrics                                    |                              | Tank         |


4 Prerequisites

We require some exposure to probability, such as what is covered in COS341 or COS402, and comfort with computer programming and basic linear algebra. Contact Prof. Blei if you have concerns about your prerequisite coursework.

5 Grades and Workload

The course grade is based on four items:

  • A 5-page report that discusses a current area of research in probabilistic modeling, due on October 29, 2010.
  • A longer final report, which is expected to be a somewhat ambitious research project that explores a new application or theoretical issue in probabilistic modeling, due at the end of the semester (Dean's Date).
  • The preparation of a set of scribe notes during the semester.
  • Class attendance and active participation.

We emphasize that for the first three items, including the scribe notes, we expect polished, proofread writing. Writing quality will play a role in the final evaluation.

Author: David Blei

Date: 2011-01-18 15:01:44 EST
