Princeton COS 521
Advanced Algorithm Design

Capstone graduate algorithms course covering advanced topics such as randomness, optimization, and high-dimensional geometry. We also explore diverse applications of algorithmic tools and thinking.

Instructors: Pravesh Kothari, Christopher Musco


Course Summary

Material: COS 521 gives a broad yet deep exposure to algorithmic advances of the past few decades, preparing students to read and understand research papers in algorithms. The course is suitable for graduate students (including those not in CS) and advanced undergraduates.

Prerequisites: One undergraduate course in algorithms (e.g., COS 423), or equivalent mathematical maturity. Listeners and auditors are welcome with prior permission.

Coursework: Two lectures per week. Five homework assignments, including some simple programming-based exploration of the lecture ideas (60% of grade). Choice of a take-home final in January or a term project in groups of two (40% of grade). For specific policies on grading, late assignments, etc., please see the grading policy sheet.

Resources: There is no official text -- we will use our own lecture notes and assorted online resources. Please see course webpages from previous years for additional material.

Homework:
Homework 1 (due Monday, Oct. 8th, Blackboard submission link)

Homework 2 (due Friday, Oct. 26th, Blackboard submission link)

Homework 3 (due Monday, Nov. 19th, Blackboard submission link)


Administrative Information

Lectures: Tuesday & Thursday 3:00pm-4:20pm in Friend Center 004.

Teaching Assistants: Sixue Liu (Cliff) - sixuel@cs.princeton.edu, Seyed Sobhan Mir Yoosefi (Sobhan) - syoosefi@cs.princeton.edu.

Office Hours: Pravesh: Immediately after class, 194 Nassau St, Room 219.
Christopher: Immediately after class, Friend Center 004.
Sixue: Wed. 7:00-8:00pm, 194 Nassau St, Room 307.
Sobhan: Fri. 2:00-3:00pm, 35 Olden St, Room 431.

Piazza: Course discussion and questions will be managed through Piazza. Please sign up here.

Homework: We require students to prepare problem sets in LaTeX. You can use this template. Submission is managed through Princeton's Blackboard system. A compiled PDF of your homework should be uploaded by 11:59pm on the due date.

For regular homework problems collaboration is allowed, but solutions must be written up individually. Students must list collaborators for each problem separately, or write "No Collaborators" if they worked alone. Collaboration is not allowed on bonus problems (see grading policy).

Final Project: Read the Final Project Guidelines here.
Deadlines:
Preliminary Proposal: Dec 5
Final Proposal: Dec 13
Presentation: TBA
Final Reports: Jan 15


Final project reports from 2014 here.

Lecture # Topic Required Reading Optional Reading
Randomness and Hashing
1. 9/13 Hashing Lecture 1 notes.
  • Interesting paper and lecture on the practical importance of choosing a hash function with the right independence properties.
  • Blog post on attacks against hash-based data structures.
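As a toy companion to the universal hashing covered in lecture, here is a minimal Python sketch of a Carter-Wegman style 2-universal family (the prime, parameter names, and demo values below are our own choices, not from the notes):

```python
import random

# h(x) = ((a*x + b) mod p) mod m, with p prime and a, b chosen at random,
# is 2-universal: any two distinct keys collide with probability roughly 1/m.
P = 2_147_483_647  # a prime (2^31 - 1) larger than the key universe

def make_hash(m, seed=None):
    """Draw one hash function from the family, mapping ints to {0, ..., m-1}."""
    rng = random.Random(seed)
    a = rng.randrange(1, P)
    b = rng.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

h = make_hash(m=10, seed=0)
buckets = [h(x) for x in range(1000)]
```

Drawing a fresh (a, b) pair per table is what gives the worst-case collision guarantee, independent of the key distribution.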
2. 9/18 Randomized Minimum Cut Lecture 2 notes.
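Karger's contraction algorithm from this lecture is short enough to sketch directly; the following pure-Python version uses union-find for contractions (the example graph and repetition count are made up for the demo):

```python
import random

def karger_min_cut(edges, n, rng):
    """One run of Karger's contraction: repeatedly contract a random edge
    until 2 supernodes remain; the surviving crossing edges form a cut,
    which is a minimum cut with probability >= 2/(n*(n-1))."""
    parent = list(range(n))
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    components = n
    while components > 2:
        u, v = edges[rng.randrange(len(edges))]
        ru, rv = find(u), find(v)
        if ru != rv:  # skip self-loops inside a supernode
            parent[ru] = rv
            components -= 1
    return sum(1 for u, v in edges if find(u) != find(v))

# Two triangles joined by a single bridge edge: the true min cut is 1.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
rng = random.Random(0)
best = min(karger_min_cut(edges, 6, rng) for _ in range(50))
```

Repeating the contraction many times and taking the best cut is exactly how the success probability is amplified in the analysis.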
3. 9/20 Concentration Bounds Lecture 3 notes.
  • For a proof of the "power of two choices" result, see Section 2 in this survey.
  • Additional reading on concentration bounds can be found in Terry Tao's notes.
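The "power of two choices" phenomenon referenced above is easy to observe empirically; here is a small Python simulation (bin counts, ball counts, and the seed are our own illustrative choices):

```python
import random

def max_load(n, balls, d, rng):
    """Throw `balls` balls into n bins; each ball probes d random bins and
    goes to the least loaded. d=1 is plain random; d=2 is 'two choices'."""
    bins = [0] * n
    for _ in range(balls):
        choice = min(rng.sample(range(n), d), key=lambda i: bins[i])
        bins[choice] += 1
    return max(bins)

rng = random.Random(0)
one = max_load(1000, 1000, 1, rng)  # max load ~ log n / log log n
two = max_load(1000, 1000, 2, rng)  # max load ~ log log n: much smaller
```

With n = 1000 balls in 1000 bins, the two-choice max load is visibly smaller than the one-choice max load on essentially every run.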
4. 9/25 Hashing to Reals Lecture 4 notes.
Linear Thinking
5. 9/27 Linear Thinking Lecture 5 notes.
  • Fascinating historical article on linear programming by G. B. Dantzig, a key figure in its development. Many of the events happened at Princeton.
6. 10/2 LP Relaxations & Approximation Algorithms Lecture 6 notes.
7. 10/4 LP Relaxations Continued
  • Notes from Matt Weinberg's class from last year. Problem 3 gives yet another example of dependent randomized rounding.
8. 10/9 Linear Programming Duality Lecture 8 notes.
9. 10/11 Learning from Experts: Multiplicative Weights Algorithm Lecture 9 notes.
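The multiplicative weights (Hedge) update fits in a few lines; the Python sketch below uses a toy two-expert loss sequence and a step size of our own choosing:

```python
import math

def multiplicative_weights(losses, eta):
    """Hedge over n experts. losses[t][i] in [0,1] is expert i's loss in
    round t. Each round, play the distribution proportional to the weights,
    then multiply each weight by exp(-eta * loss)."""
    n = len(losses[0])
    w = [1.0] * n
    total = 0.0
    for round_losses in losses:
        s = sum(w)
        p = [wi / s for wi in w]
        total += sum(pi * li for pi, li in zip(p, round_losses))
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, round_losses)]
    return total

# Expert 0 is always right (loss 0); expert 1 is always wrong (loss 1).
T = 100
losses = [[0.0, 1.0]] * T
alg = multiplicative_weights(losses, eta=0.5)
```

Here the best expert incurs total loss 0, and the algorithm's total loss stays bounded (roughly ln(n)/eta plus lower-order terms) rather than growing with T, which is the content of the regret bound.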
Dimensionality Reduction
10. 10/16 The Johnson-Lindenstrauss Lemma Lecture 10 notes.
  • Recent paper that essentially resolves the optimality of the Johnson-Lindenstrauss lemma for preserving distances.
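A quick empirical check of the Johnson-Lindenstrauss lemma in pure Python: project with a random Gaussian matrix scaled by 1/sqrt(k) and compare a pairwise distance before and after (the dimensions, seed, and tolerance below are illustrative only):

```python
import math
import random

def jl_project(points, k, rng):
    """Project d-dimensional points to k dimensions via a random Gaussian
    matrix scaled by 1/sqrt(k). Pairwise distances are preserved up to
    (1 +/- eps) w.h.p. once k = O(log n / eps^2)."""
    d = len(points[0])
    G = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)] for _ in range(k)]
    return [[sum(g * x for g, x in zip(row, p)) for row in G] for p in points]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

rng = random.Random(0)
pts = [[rng.gauss(0, 1) for _ in range(200)] for _ in range(5)]
proj = jl_project(pts, k=100, rng=rng)
ratio = dist(proj[0], proj[1]) / dist(pts[0], pts[1])  # should be near 1
```

Note the target dimension depends only on the number of points and the accuracy, not on the original dimension d.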
11. 10/18 Approximate Regression, ε-nets, Fast JL Lecture 11 notes.
  • Simple MATLAB experiments exploring the pseudorandom properties of Hadamard matrices.
  • As an alternative to randomized Fourier constructions, JL dimension reduction can also be accelerated via sparse random matrices. See these lecture notes for more information.
12. 10/23 Nearest Neighbor Search Lecture 12 notes.
  • Alex Andoni's LSH webpage, which provides useful links to survey papers, implementations, and recent research progress.
13. 10/25 Low-rank Approximation and SVD Lecture 13 notes.
  • Cool application of the singular value decomposition to visualizing a genetic dataset.
  • Greg Valiant and Tim Roughgarden's lecture notes on the singular value decomposition and low-rank approximation. Lots of good links and an explanation of the connection with principal component analysis (PCA).
10/30 No Class, Fall Recess
11/1 No Class, Fall Recess
14. 11/6 Power Method and Spectral Clustering Lecture 14 notes.
  • Check out Chapter 10 in these notes for a discussion of methods faster than power method.
  • Daniel Spielman's lecture notes on spectral methods for the stochastic block model.
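The power method itself is only a few lines; here is a pure-Python sketch on a toy symmetric matrix (the starting vector and iteration count are loose, illustrative choices):

```python
def power_method(A, iters=100):
    """Estimate the top eigenvalue/eigenvector of a symmetric matrix A by
    repeated multiplication and normalization. Converges at a rate set by
    the ratio of the top two eigenvalues."""
    n = len(A)
    v = [1.0] + [0.0] * (n - 1)  # any vector not orthogonal to the top one
    lam = 0.0
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        lam = sum(x * x for x in w) ** 0.5  # ||A v|| -> top eigenvalue
        v = [x / lam for x in w]
    return lam, v

# Eigenvalues of [[2,1],[1,2]] are 3 and 1; top eigenvector is [1,1]/sqrt(2).
A = [[2.0, 1.0], [1.0, 2.0]]
lam, v = power_method(A)
```

Since the eigenvalue gap here is 3 vs 1, the error shrinks by roughly a factor of 3 per iteration, so 100 iterations is far more than needed.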
Optimization
15. 11/8 Gradient Descent Lecture 15 notes.
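A minimal gradient descent sketch in Python on a toy quadratic (the objective, step size, and iteration count are our own choices for the demo):

```python
def gradient_descent(grad, x0, step, iters):
    """Plain gradient descent: x_{t+1} = x_t - step * grad(x_t)."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - step * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x, y) = (x - 1)^2 + 2*(y + 2)^2; the minimizer is (1, -2).
grad = lambda v: [2 * (v[0] - 1), 4 * (v[1] + 2)]
x = gradient_descent(grad, [0.0, 0.0], step=0.1, iters=200)
```

For this smooth, strongly convex objective with a small enough fixed step, the error contracts geometrically per iteration, which is the rate proved in lecture for this setting.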
16. 11/13 Ellipsoid Method (+ online and stochastic GD) Lecture 16 notes.
  • Nisheeth Vishnoi's excellent lecture notes on convex optimization, including the ellipsoid method. These cover many details we don't have time for in this course!
  • New York Times article from 1979 on the discovery that the Ellipsoid method can solve linear programs in polynomial time.
17. 11/15 Interior Point Methods Lecture 17 notes.
  • New York Times article from 1984 on Karmarkar's discovery of a practical polynomial-time interior point method for linear programming.
18. 11/20 Semidefinite Programming
11/22 No Class, Thanksgiving
Select Topics
19. 11/27 TBA
20. 11/29 TBA
21. 12/4 TBA
22. 12/6 TBA
23. 12/11 TBA
24. 12/13 TBA