Princeton University
Computer Science Department

Computer Science 521
Advanced Algorithm Design

Sanjeev Arora

  Fall 2013

Course Summary

(Important: In light of the new grad course requirements, this course is changing as of Fall 2013 to make it more accessible to CS grads who are not specializing in theoretical CS.)
Design and analysis of algorithms is an important part of computer science today. This course gives a broad yet deep exposure to algorithmic advances of the past few decades, and brings students up to a level where they can read and understand research papers in algorithms. The course is suitable for advanced undergrads and non-CS grads as well, and they will be graded on a different curve.  Grads who intend to specialize in theoretical CS are invited to attend extra discussions (at Small World coffee on Friday afternoon) that explore some topics in greater depth.

Thematically, the biggest difference from undergrad algorithms (such as COS 423) is extensive use of ideas such as randomness, approximation, and high-dimensional geometry, which are increasingly important in applications. We will encounter notions such as algorithm design in the face of uncertainty, approaches to handling big data, coping with intractability, heuristic approaches, etc. We will develop all necessary mathematical tools.

Prerequisites: One undergraduate course in algorithms (e.g., COS 423), or equivalent mathematical maturity. Listeners and auditors are welcome with prior permission.

Coursework: Two lectures/week.  For motivated students, a 1-hour discussion of advanced ideas each week at Small World Coffee on Friday afternoon. There will be 4 homeworks over the semester, which may include some simple programming-based exploration of the lecture ideas using Matlab or other packages. (Collaboration OK on homeworks.) There will be a take-home final in January. Grads not specializing in theoretical CS will be allowed to substitute a course project (done in groups of 2) + one extra homework for the final.  There is no official text. Instead, we will use assorted online resources. Students will be expected to scribe lecture notes once or twice during the term.

Administrative Information

Lectures: Tues-Thurs 13:30-15:00, Room 402. First meeting: Sept 12.

Instructor: Sanjeev Arora, 307 CS Building, 609-258-3869, arora AT the domain name

Teaching assistant: Aman Dhesi, adhesi AT the domain name


Office hrs: Sanjeev: Monday 3:30-5pm in Room 307, and by appointment.
Aman: Wed 12-1:30pm

Term Project 

Handout on term project (note various deadlines).

Takehome Final Exam

Download here when you are ready to work on it. (Finish within 48 hrs.)

Lecture notes + readings

Lecture number + Title
Required reading
Further reading + links
1) (Sept 12) How is this course different from undergrad algorithms?
   Hashing Part 1.
Lecture 1 notes.
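For students who want to tinker alongside the hashing material, here is a minimal Python sketch (not taken from the lecture notes — the notes may use a different construction) of a 2-universal hash family and a chaining hash table built from it. The prime P, the table size, and the example keys are all illustrative choices.

```python
import random

# A classic 2-universal family: h(x) = ((a*x + b) mod P) mod m, with P
# prime and a, b random. Any two distinct keys collide with prob ~1/m.
P = 2_147_483_647            # Mersenne prime, larger than any key below

def make_hash(m, rng=random):
    """Draw one random function h: {0..P-1} -> {0..m-1} from the family."""
    a = rng.randrange(1, P)
    b = rng.randrange(0, P)
    return lambda x: ((a * x + b) % P) % m

# Hash table with chaining, built on one random member of the family.
m = 8
h = make_hash(m)
table = [[] for _ in range(m)]
for key in [15, 29, 101, 6]:
    table[h(key)].append(key)
```

Because the expected chain length under a 2-universal family is O(1 + n/m), lookups touch only the one bucket `h(key)` selects.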

2) Karger's min cut algorithm (and its extensions). A simple and gorgeous intro to randomized algorithms.
Lecture 2 notes
(includes extracts from lecture notes of S. Dasgupta and E. Vigoda)
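The contraction algorithm is short enough to sketch in full. The following Python version (an illustrative sketch, assuming a connected multigraph given as an edge list; the example graph is made up) contracts random edges via union-find until two super-nodes remain, and repeats the experiment to drive down the failure probability.

```python
import random

def karger_cut(edges, n, rng=random):
    """One run of Karger's contraction on a connected multigraph with
    vertices 0..n-1: contract uniformly random edges until two
    super-nodes remain; return the number of crossing edges."""
    parent = list(range(n))          # union-find over super-nodes
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    remaining = n
    while remaining > 2:
        u, v = rng.choice(edges)     # rejection-sample a non-self-loop edge
        ru, rv = find(u), find(v)
        if ru != rv:
            parent[ru] = rv
            remaining -= 1
    return sum(1 for u, v in edges if find(u) != find(v))

def min_cut(edges, n, trials=100):
    """Each run finds the min cut with prob >= 2/(n(n-1)), so repeating
    makes failure unlikely; 100 trials is plenty for tiny graphs."""
    return min(karger_cut(edges, n) for _ in range(trials))

# Two triangles joined by a single bridge edge: the min cut is 1.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
cut = min_cut(edges, 6)
```

Note that any single run outputs the size of *some* cut, so the minimum over trials can only overestimate the true min cut, never underestimate it.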

3) Deviation bounds and their applications.
Bounds by Markov, Chebyshev and Chernoff on how much and how often a random variable deviates from its expectation. Applications to Load Balancing and sampling.
Lecture 3 notes.

Survey of concentration inequalities by Chung and Lu
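As a quick empirical companion to the deviation bounds (an illustrative sketch, not part of the notes), the snippet below simulates the mean of n fair coin flips and compares the observed tail frequency with the Chernoff-Hoeffding bound 2·exp(-2ε²n).

```python
import math
import random

def sample_mean(n, rng=random):
    """Mean of n fair coin flips (each flip is 1 with probability 1/2)."""
    return sum(rng.random() < 0.5 for _ in range(n)) / n

# How often does the mean of n flips deviate from 1/2 by more than eps?
n, eps, trials = 1000, 0.05, 1000
deviations = sum(abs(sample_mean(n) - 0.5) > eps for _ in range(trials))
empirical = deviations / trials
# Chernoff-Hoeffding: Pr[|mean - 1/2| > eps] <= 2 exp(-2 eps^2 n) ~ 0.0135.
hoeffding = 2 * math.exp(-2 * eps ** 2 * n)
```

The empirical tail frequency comes out well below the bound — Chernoff-type bounds are valid but often far from tight on specific distributions.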
4) Hashing to real numbers and its big-data applications. Estimating the size of a set that's too large to write down. Estimating the similarity of two documents using their hashes.
Lecture 4 notes.
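A toy version of hashing-to-reals cardinality estimation (hypothetical code, with Python's built-in `hash` standing in for a real hash function): if n distinct elements each map to a uniform value in [0,1), the minimum has expectation 1/(n+1), so 1/min − 1 estimates n; averaging over several independent hashes tames the variance.

```python
import random

def estimate_distinct(stream, k=200, seed=0):
    """Estimate the number of distinct elements in `stream` from the
    average, over k hash functions, of the minimum hash value seen."""
    rng = random.Random(seed)
    salts = [rng.getrandbits(64) for _ in range(k)]
    mins = [1.0] * k
    for x in stream:
        for i, s in enumerate(salts):
            # Python's hash of (x, salt) stands in for a real hash
            # function mapping elements to [0, 1) -- purely illustrative.
            h = (hash((x, s)) & 0xFFFFFFFF) / 2 ** 32
            if h < mins[i]:
                mins[i] = h
    avg_min = sum(mins) / k
    return 1 / avg_min - 1

stream = list(range(1000)) * 3      # 1000 distinct items, each seen 3 times
est = estimate_distinct(stream)     # repeats do not change the minima
```

The key point for big data: the estimator keeps only k numbers no matter how long the stream is, and duplicates cost nothing because the minimum is unchanged.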

5) (Sept 24) Linear thinking. (Linear modeling, linear equations and inequalities, linear programming. Examples from econometrics, machine learning, etc.)
Lecture 5 notes.
Also see Section 7.1 of the relevant chapter from Dasgupta, Papadimitriou, Vazirani (ugrad text).

Analysis of Gaussian elimination (notes by Peter Gacs)
6) (Oct 1) Provable Approximation via Linear Programming.
(Min vertex cover, MAX-2SAT, Virtual Circuit routing)
Lecture 6 notes.

7) (Oct 3) Decision-making under uncertainty: Part 1.
Basics of rational choice theory. Optimal decision via dynamic programming. Markov Decision processes (MDPs) and stationary solutions via LP.
Lecture 7 notes.

This old article may still be useful if you want to read more.
Wikipedia page has many pointers.
Lots of other material exists on the web, but everything is very notation-heavy.
(Informal) Five commandments about decision theory.
For those with further interest in MDPs, there are Andrew Moore's notes and Jay Taylor's excellent survey.
8) (Oct 8) Decision making under total uncertainty.
(Hint: Minimize your regret!)
Lecture 8 notes.
Arora-Hazan-Kale survey.
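The multiplicative-weights (Hedge) rule from the Arora-Hazan-Kale survey fits in a few lines. Below is a Python sketch with made-up losses: each expert's weight decays exponentially in its accumulated loss, so the algorithm's loss tracks the best expert up to O(ηT + log(n)/η) regret.

```python
import math

def multiplicative_weights(loss_rows, eta=0.1):
    """Hedge: keep one weight per expert, play proportionally to the
    weights, then decay each weight by exp(-eta * loss) every round.
    Returns (algorithm's expected loss, best single expert's loss)."""
    n = len(loss_rows[0])
    w = [1.0] * n
    alg_loss = 0.0
    for losses in loss_rows:         # losses are assumed to lie in [0, 1]
        total = sum(w)
        probs = [wi / total for wi in w]
        alg_loss += sum(p * l for p, l in zip(probs, losses))
        w = [wi * math.exp(-eta * l) for wi, l in zip(w, losses)]
    best = min(sum(row[i] for row in loss_rows) for i in range(n))
    return alg_loss, best

# Two experts: expert 0 is always right (loss 0), expert 1 always wrong.
rows = [[0.0, 1.0]] * 100
alg_loss, best = multiplicative_weights(rows)
```

On this toy input the algorithm pays a constant total loss while shifting nearly all its weight to the good expert — far below the worst-case regret bound.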
9) (Oct 10) NO CLASS (makeup session during reading period)

10) (Oct 15) Using multiplicative weights for LP solving,
Game Playing, and Portfolio Management. (+ glimpses of duality)
Lecture 10 notes.
Papers cited in the notes.
Wikipedia entry on the traditional stock pricing theory.

11) (Oct 17) High-dimensional geometry. Curse (and blessing) of dimensionality. Dimension reduction.
Lecture 11 notes.
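A bare-bones illustration of dimension reduction (made-up dimensions, not from the notes): project a vector through a random Gaussian matrix scaled by 1/√k, the construction behind the Johnson-Lindenstrauss lemma, and check that the Euclidean norm barely moves.

```python
import math
import random

def random_projection(x, k, rng):
    """Multiply x in R^d by a k x d matrix of N(0,1) entries, scaled by
    1/sqrt(k); JL says Euclidean norms survive up to a (1 +- eps) factor."""
    return [sum(rng.gauss(0, 1) * xi for xi in x) / math.sqrt(k)
            for _ in range(k)]

rng = random.Random(42)
d, k = 1000, 400
x = [1.0] * d                                  # ||x||_2 = sqrt(1000)
y = random_projection(x, k, rng)
ratio = math.sqrt(sum(v * v for v in y)) / math.sqrt(d)   # ||y|| / ||x||
```

Here ratio² is a χ²_k/k random variable, concentrated around 1 with standard deviation √(2/k) — the same concentration phenomenon the JL proof exploits.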

12) (Oct 22) Random walks, Markov chains, and analysis of convergence. Also, Markovian models.
Lecture 12 notes.
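Convergence of a random walk can be watched directly. The sketch below (an illustrative chain chosen for this page, not from the notes) iterates the distribution of a lazy walk on a 3-vertex path; it mixes to the stationary distribution (1/4, 1/2, 1/4), which one can verify satisfies π = πP.

```python
def step(dist, P):
    """One step of the chain: (dist P)_j = sum_i dist_i * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# A small ergodic chain: a lazy random walk on the path 0 - 1 - 2.
P = [[0.50, 0.50, 0.00],
     [0.25, 0.50, 0.25],
     [0.00, 0.50, 0.50]]
dist = [1.0, 0.0, 0.0]          # start at state 0
for _ in range(100):            # run the walk until it mixes
    dist = step(dist, P)
# dist is now (very close to) the stationary distribution (1/4, 1/2, 1/4).
```

The second eigenvalue of this P is 1/2, so the distance to stationarity shrinks by a factor of 2 per step — the eigenvalue-gap picture from the lecture.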

13) (Oct 24) Finding the true dimensionality of datasets, low-rank matrix approximation, and SVD.
Lecture 13 notes.

14) (Nov 5) Computing SVD, Power Method, Recovering planted bisections. (Glimpses of eigenvalues of random matrices.)
Lecture 14 notes.
SVD chapter in Hopcroft-Kannan book.
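The power method itself is only a few lines. Here is a Python sketch on a made-up 2×2 symmetric matrix (for real SVD computations one would use a proper linear-algebra package): repeatedly multiply a random vector by A and renormalize; convergence is governed by the eigenvalue gap |λ₂/λ₁|.

```python
import math
import random

def power_method(A, iters=200, rng=random):
    """Estimate the top eigenvalue/eigenvector of a symmetric matrix by
    repeatedly multiplying a random vector by A and normalizing."""
    n = len(A)
    v = [rng.gauss(0, 1) for _ in range(n)]
    for _ in range(iters):
        v = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in v))
        v = [x / norm for x in v]
    # Rayleigh quotient v^T A v gives the eigenvalue estimate.
    lam = sum(v[i] * sum(A[i][j] * v[j] for j in range(n)) for i in range(n))
    return lam, v

A = [[2.0, 1.0],
     [1.0, 2.0]]        # eigenvalues 3 and 1; top eigenvector (1,1)/sqrt(2)
lam, v = power_method(A)
```

A random starting vector is (almost surely) not orthogonal to the top eigenvector, which is why no special initialization is needed.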
15) (Nov 7) Semidefinite Programming. 0.878-algorithms for MAX-CUT and MAX-2SAT, and the saga of the 0.878.
Notes delayed; see other profs' notes on the web.
Satish Rao's notes.
16) (Nov 12) Oracles, Ellipsoids, and their uses for convex optimization.
(including solving LPs too big to write down)
Lecture 16 notes.
Santosh Vempala's notes.
17) (Nov 14) Duality and the minmax theorem. (Also started gradient descent, which appears in lecture notes for next lecture.)
Lecture 17 notes.

18) (Nov 19) Guest lecture: Bernard Chazelle (intro to computational geometry)

19) (Nov 21) Following the slope: Gradient Descent.
Offline, Online, Stochastic.
Lecture 19 notes.
Chapter 9 of Convex Optimization by Boyd and Vandenberghe. (pdf is online).  Nesterov's notes are terser
but still very readable.
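Plain (offline) gradient descent can be sketched in a few lines. The example objective below is made up for illustration — a smooth, strongly convex quadratic, the friendliest case from the convergence analysis.

```python
def gradient_descent(grad, x0, eta=0.1, steps=100):
    """Offline gradient descent: repeat x <- x - eta * grad(x)."""
    x = list(x0)
    for _ in range(steps):
        x = [xi - eta * gi for xi, gi in zip(x, grad(x))]
    return x

# Minimize f(x) = (x[0] - 3)^2 + (x[1] + 1)^2; grad f = 2 (x - x*).
grad = lambda x: [2 * (x[0] - 3), 2 * (x[1] + 1)]
x = gradient_descent(grad, [0.0, 0.0])   # converges to the minimizer (3, -1)
```

With step size η = 0.1 the error contracts by a factor 1 − 2η = 0.8 per step, so 100 steps land within ~10⁻⁹ of the optimum; the online and stochastic variants from lecture replace `grad` with per-round or sampled gradients.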
20) (Nov 26) Counting and sampling problems and their close interrelationship.
Valiant's class #P. Monte Carlo method. Dyer's algorithm for counting knapsack solutions.
Lecture 20 notes.
Eric Vigoda's notes (Has FPRAS'es for three problems)
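To see why counting and sampling are intertwined, here is the naive Monte Carlo counter (an illustrative sketch, not Dyer's algorithm): sample subsets uniformly and scale the hit fraction by 2ⁿ. It works only when solutions are a decent fraction of all subsets — precisely the failure mode Dyer's algorithm is designed to avoid by sampling from a dominating, easy-to-count set.

```python
import random

def count_knapsack_mc(weights, W, samples=20000, rng=random):
    """Estimate |{S : w(S) <= W}| by sampling subsets uniformly and
    multiplying the observed hit fraction by 2^n. Accurate only when
    solutions are a non-negligible fraction of all 2^n subsets."""
    n = len(weights)
    hits = 0
    for _ in range(samples):
        # Include each item independently with probability 1/2.
        total = sum(w for w in weights if rng.random() < 0.5)
        hits += total <= W
    return hits / samples * 2 ** n

weights = [1] * 10
est = count_knapsack_mc(weights, W=5)   # true count: 638 of the 1024 subsets
```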
21) (Dec 2) Protecting against information loss: coding theory.
Lecture 21 notes.

22) (Dec 4) A taste of cryptography: secret sharing and secure multiparty computation. Lecture 22 notes.
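Shamir's secret sharing, one of the lecture's building blocks, is short enough to sketch in full (an illustrative Python version over a small prime field; the modulus and parameters are toy choices, and real systems would use a cryptographic field and randomness source).

```python
import random

P = 2_147_483_647            # prime modulus; all arithmetic is mod P

def share(secret, k, n, rng=random):
    """Hide `secret` as f(0) for a random degree-(k-1) polynomial f over
    GF(P); the shares are the points (1, f(1)), ..., (n, f(n)).
    Any k shares determine f; any k-1 reveal nothing about f(0)."""
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, P) for i, c in enumerate(coeffs)) % P
    return [(x, f(x)) for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange-interpolate f(0) from any k of the shares."""
    secret = 0
    for xi, yi in shares:
        num, den = 1, 1
        for xj, _ in shares:
            if xj != xi:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is den's inverse mod the prime P (Fermat).
        secret = (secret + yi * num * pow(den, P - 2, P)) % P
    return secret

shares = share(1234567, k=3, n=5)
recovered = reconstruct(shares[:3])     # any 3 of the 5 shares suffice
```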

23) (Dec 9) Heuristics: algorithms we don't know how to analyze.
Lecture 23 notes.


  1. Homework 1. Due Thurs Oct 3 in class.
  2. Homework 2. Due Thurs Oct 24 in class. Link to data file for Q6 (thanks to Elad Hazan), described in this README file.
    The README file also describes this other data file in CSV format, which you can also test on if you like.
  3. Homework 3. Due Nov 14 in class. Here is the image file for Q6.
  4. Homework 4. Due Dec 12 in class. (Or by Dec 13 in my office.)

Resources and Readings

Further reading (books)

This course presents "greatest hits" of algorithms research and/or "must-know foundational ideas."  Usually the topics are covered in greater detail in specific textbooks.  Here are some great resources for additional reading:
  1. Randomized Algorithms by Motwani and Raghavan.
  2. Online Computation and Competitive Analysis by Borodin and El-Yaniv.
  3. Probabilistic Method by Alon and Spencer.
  4. Approximation algorithms by Vijay Vazirani.
  5. Design of approximation algorithms (legal download) by Williamson and Shmoys
  6. Spectral graph theory by Fan Chung.
  7. Mining of Massive Datasets by Rajaraman, Leskovec, and Ullman.
  8. Algorithmic Game Theory (nonprintable legal version) by Nisan, Roughgarden, Tardos, Vazirani.

Readings for Friday Section

(for students who come to the Friday meetings)

Date and topic
Additional reading/links
1) Random graph theory, independent sets in random graphs, random 3SAT.

2) More on polytopes, simplex, LP.

3) Method of conditional expectations and pessimistic estimators. Intro to online algorithms.
Lecture notes by N. Harvey.
Buchbinder-Naor survey in Foundations and Trends in TCS.
4) More about eigenvalues, SVD

5) Cheeger inequality, log n-approximation to sparsest cut via SDP with triangle inequality.

6) More on LP relaxations for approximation. Subtour Elimination LP for TSP. Spreading metric constraints. Ellipsoid method.

7) Gradient descent flavors. "Primal-dual" view. Steepest descent (l_1, l_2, and Hessian norms). Intro to near-linear time algorithms for Laplacian solvers and flows.