Computer Science 521
(Important: In light of the new grad course requirements, this course is changing as of Fall 2013 to make it more accessible to CS grads who are not specializing in theoretical CS. )
Design and analysis of algorithms is an important part of computer science today. This course gives a broad yet deep exposure to algorithmic advances of the past few decades, and brings students up to a level where they can read and understand research papers in algorithms. The course is suitable for advanced undergrads and non-CS grads as well, and they will be graded on a different curve. Grads who intend to specialize in theoretical CS are invited to attend extra discussions (at Small World coffee on Friday afternoon) that explore some topics in greater depth.
Thematically, the biggest difference from undergrad algorithms (such as COS 423) is the extensive use of ideas such as randomness, approximation, and high-dimensional geometry, which are increasingly important in many applications. We will encounter notions such as algorithm design in the face of uncertainty, approaches to handling big data, coping with intractability, heuristic approaches, etc. We will develop all necessary mathematical tools.
Prerequisites: One undergraduate course in algorithms (e.g., COS 423), or equivalent mathematical maturity. Listeners and auditors are welcome with prior permission.
Coursework: Two lectures/week. For motivated students, a 1-hour discussion of advanced ideas each week at Small World Coffee on Friday afternoon. There will be 4 homeworks over the semester, which may include some simple programming-based exploration of the lecture ideas using Matlab or other packages. (Collaboration OK on homeworks.) There will be a take-home final in January. Grads not specializing in theoretical CS will be allowed to substitute a course project (done in groups of 2) + one extra homework for the final. There is no official text. Instead, we will use assorted online resources. Students will be expected to scribe lecture notes once or twice during the term.
Instructor: Sanjeev Arora - 307 CS Building - 609-258-3869 - arora AT the domain name cs.princeton.edu
Teaching assistant: Aman Dhesi adhesi AT the domain
ENROLLED STUDENTS SHOULD ADD THEMSELVES TO THE DISCUSSION LIST AT
Office hrs: Sanjeev: Monday 3:30-5pm in Room 307, and by appointment.
Aman: Wed 12-1:30pm.
Handout on term project (note various deadlines).
Takehome Final Exam
Download here when you are ready to work on it. (Finish within 48 hrs.)
|Lecture number + Title
||Further reading + links
|1) (Sept 12) How is this course different
from undergrad algorithms?
Hashing Part 1.
|2) Karger's min cut algorithm (and its extensions). A simple and gorgeous intro to randomized algorithms.
(includes extracts from lecture notes of S. Dasgupta and E. Vigoda)
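For concreteness, here is a minimal Python sketch of Karger's random-contraction algorithm, using the standard shuffle-then-contract implementation with union-find. (The small "barbell" graph at the bottom is a made-up example, not from the lecture.)

```python
import random

def karger_min_cut(edges, n, trials=200, seed=0):
    """Estimate the min cut of a graph by repeated random edge contraction.

    edges: list of (u, v) pairs; n: number of vertices labeled 0..n-1.
    Each trial contracts edges in random order until two super-vertices
    remain; the smallest crossing-edge count over all trials is returned.
    """
    rng = random.Random(seed)
    best = len(edges)
    for _ in range(trials):
        parent = list(range(n))          # union-find forest over vertices

        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]   # path halving
                x = parent[x]
            return x

        components = n
        pool = edges[:]
        rng.shuffle(pool)                # random order == random contractions
        for u, v in pool:
            if components == 2:
                break
            ru, rv = find(u), find(v)
            if ru != rv:
                parent[ru] = rv          # contract edge (u, v)
                components -= 1
        # Count edges crossing the two remaining super-vertices.
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best

# A "barbell": two triangles joined by a single bridge edge -> min cut 1.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
print(karger_min_cut(edges, 6))  # overwhelmingly likely to find the cut of size 1
```

A single trial succeeds with probability at least 2/(n(n-1)); repeating drives the failure probability down exponentially, which is why the sketch runs many trials.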
|3) Deviation bounds and their applications.
Bounds by Markov, Chebyshev and Chernoff on how much and how often a random variable deviates from its expectation. Applications to load balancing and sampling.
||Survey of concentration inequalities by Chung and Lu.
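As a quick empirical companion to the load-balancing application, this sketch throws n balls into n bins and reports the fullest bin; Chernoff-type bounds predict a maximum load of O(log n / log log n) with high probability, far below the trivial bound of n. (The choice n = 10000 is an arbitrary illustration.)

```python
import math
import random

def max_load(n, seed=0):
    """Throw n balls into n bins uniformly at random; return the fullest
    bin's load. The mean load is exactly 1 ball per bin."""
    rng = random.Random(seed)
    bins = [0] * n
    for _ in range(n):
        bins[rng.randrange(n)] += 1
    return max(bins)

n = 10000
load = max_load(n)
# Chernoff-style concentration says the max load is Theta(log n / log log n)
# with high probability -- tiny compared to n.
print(load, math.log(n) / math.log(math.log(n)))
```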
|4) Hashing to real numbers and its
big-data applications. Estimating the size of a set
that's too large to write down. Estimating the similarity of
two documents using their hashes.
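The document-similarity idea can be sketched with MinHash: hash every element of each set, keep only the minimum per hash function, and compare the resulting short signatures. The affine hash family and the two toy sets below are illustrative choices, not the lecture's exact construction.

```python
import random

def minhash_signature(items, num_hashes=100, seed=0):
    """Signature = per-hash-function minimum over the set's elements.

    Uses random affine maps modulo a prime, a simple stand-in for the
    pairwise-independent hash families discussed in lecture.
    """
    p = 2_147_483_647                     # Mersenne prime 2^31 - 1
    rng = random.Random(seed)
    funcs = [(rng.randrange(1, p), rng.randrange(p)) for _ in range(num_hashes)]
    return [min((a * x + b) % p for x in items) for a, b in funcs]

def estimated_jaccard(sig1, sig2):
    """The fraction of coordinates where two signatures agree is an
    unbiased estimate of the Jaccard similarity |A∩B| / |A∪B|."""
    return sum(s == t for s, t in zip(sig1, sig2)) / len(sig1)

A = set(range(0, 80))
B = set(range(20, 100))   # |A∩B| = 60, |A∪B| = 100, true Jaccard = 0.6
est = estimated_jaccard(minhash_signature(A), minhash_signature(B))
print(est)  # close to 0.6
```

With 100 hash functions the standard deviation of the estimate is about sqrt(0.6 * 0.4 / 100) ≈ 0.05, so the printed value lands near 0.6.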
|5) (Sept 24) Linear thinking. (Linear modeling, linear equations and inequalities, linear programming. Examples from econometrics, machine learning.)
||Lecture 5 notes.
Also see section 7.1 of the relevant chapter from Dasgupta, Papadimitriou, Vazirani (ugrad text).
Analysis of Gaussian elimination (notes by Peter Gacs).
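Since the reading covers the analysis of Gaussian elimination, here is a minimal sketch of the algorithm itself, with partial pivoting, solving a tiny made-up 2x2 system:

```python
def solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.

    A: n x n list of lists of floats; b: length-n list. Returns x.
    """
    n = len(A)
    # Augmented matrix so row operations carry b along.
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        # Partial pivoting: swap up the row with the largest entry in this
        # column, for numerical stability.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate the column below the pivot.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution on the upper-triangular system.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

# 2x + y = 3 and x + 3y = 5 have the solution x = 0.8, y = 1.4.
x = solve([[2.0, 1.0], [1.0, 3.0]], [3.0, 5.0])
print(x)  # [0.8, 1.4] up to floating-point error
```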
|6) (Oct 1) Provable Approximation via Linear Programming.
(Min vertex cover, MAX-2SAT, Virtual Circuit routing)
|Lecture 6 notes.
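For min vertex cover, the lecture's LP-rounding argument yields a factor-2 approximation; the sketch below gets the same factor via the simpler maximal-matching method (a deliberate substitution, since it needs no LP solver). The small graph is a made-up example.

```python
def vertex_cover_2approx(edges):
    """Maximal-matching 2-approximation for minimum vertex cover.

    Scan edges; whenever an edge is uncovered, add BOTH endpoints.
    The chosen edges form a matching, and any cover must contain at
    least one endpoint of each matched edge -- hence factor 2.
    """
    cover = set()
    for u, v in edges:
        if u not in cover and v not in cover:
            cover.update([u, v])
    return cover

# A star on {0,1,2,3} plus a pendant edge: optimal cover {0, 3} has size 2.
edges = [(0, 1), (0, 2), (0, 3), (3, 4)]
cover = vertex_cover_2approx(edges)
print(cover)  # a valid cover of size at most 4 (= 2 * optimum here)
```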
|7) (Oct 3) Decision-making under
uncertainty: Part 1.
Basics of rational choice theory. Optimal decision via dynamic programming. Markov Decision processes (MDPs) and stationary solutions via LP.
|Lecture 7 notes.
This old article may still be useful if you want to read more; this page has many pointers.
There is lots of other material on the web, but most of it is very notation-heavy.
(Informal) Five commandments about decision theory.
For those with further interest in MDPs, there are Andrew Moore's notes and Jay Taylor's excellent survey.
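The lecture's dynamic-programming approach to MDPs can be sketched with value iteration, which repeatedly applies the Bellman optimality update V(s) ← max_a [R(s,a) + γ Σ_s' P(s'|s,a) V(s')]. The two-state MDP and its rewards below are hypothetical numbers for illustration.

```python
def value_iteration(states, actions, P, R, gamma=0.9, iters=200):
    """Value iteration for a finite MDP.

    P[s][a]: list of (next_state, probability); R[s][a]: immediate reward.
    Repeated Bellman updates converge geometrically (rate gamma) to the
    optimal value function.
    """
    V = {s: 0.0 for s in states}
    for _ in range(iters):
        V = {s: max(R[s][a] + gamma * sum(p * V[t] for t, p in P[s][a])
                    for a in actions)
             for s in states}
    return V

# Toy 2-state MDP: in A, "jump" is rewarded and usually reaches B,
# where "stay" pays 2 forever.
states, actions = ["A", "B"], ["stay", "jump"]
P = {"A": {"stay": [("A", 1.0)], "jump": [("B", 0.8), ("A", 0.2)]},
     "B": {"stay": [("B", 1.0)], "jump": [("A", 1.0)]}}
R = {"A": {"stay": 0.0, "jump": 1.0},
     "B": {"stay": 2.0, "jump": 0.0}}
V = value_iteration(states, actions, P, R)
print(V)  # V["B"] -> 2 / (1 - 0.9) = 20; V["A"] ≈ 18.78
```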
|8) (Oct 8) Decision making under total uncertainty.
(Hint: Minimize your regret!)
|Lecture 8 notes.
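Regret minimization can be made concrete with the multiplicative weights (Hedge) rule: maintain a weight per expert and decay each weight exponentially in that expert's loss. The two-expert loss pattern below is a made-up example; Hedge's guarantee is that total loss exceeds the best expert's by only O(sqrt(T log n)).

```python
import math

def multiplicative_weights(loss_rounds, eta=0.1):
    """Hedge over n experts.

    loss_rounds: list of per-round loss vectors with entries in [0, 1].
    Each round we incur the weighted-average loss, then multiply each
    expert's weight by exp(-eta * its loss).
    Returns (our total loss, best single expert's total loss).
    """
    n = len(loss_rounds[0])
    w = [1.0] * n
    total = 0.0
    for losses in loss_rounds:
        W = sum(w)
        total += sum(wi * li for wi, li in zip(w, losses)) / W
        w = [wi * math.exp(-eta * li) for wi, li in zip(w, losses)]
    best = min(sum(r[i] for r in loss_rounds) for i in range(n))
    return total, best

# Expert 0 is wrong in 2 of every 5 rounds, expert 1 in the other 3.
rounds = [[1.0, 0.0] if t % 5 < 2 else [0.0, 1.0] for t in range(500)]
ours, best = multiplicative_weights(rounds)
print(ours, best)  # our loss tracks the best expert (200) up to small regret
```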
|9) (Oct 10) NO CLASS (Makeup session during
|10) (Oct 15) Using multiplicative weights for LP solving, Game Playing, and Portfolio Management. (+ glimpses of duality)
||Papers cited in the notes.
Wikipedia entry on the traditional stock
|11) (Oct 17) High dimensional geometry. Curse (and blessing) of Dimensionality. Dimension reduction.
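Dimension reduction in the Johnson-Lindenstrauss spirit can be sketched in a few lines: project with a random Gaussian matrix scaled by 1/sqrt(k), and pairwise distances are approximately preserved. The dimensions (1000 → 200) and the two points are arbitrary illustrative choices.

```python
import math
import random

def random_projection(points, k, seed=0):
    """Project points from R^d to R^k using one random Gaussian matrix,
    scaled by 1/sqrt(k) so squared distances are preserved in expectation."""
    rng = random.Random(seed)
    d = len(points[0])
    M = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)] for _ in range(k)]
    return [[sum(M[i][j] * p[j] for j in range(d)) for i in range(k)]
            for p in points]

def dist(p, q):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

d, k = 1000, 200
p = [0.0] * d
q = [1.0] * d                       # distance sqrt(1000) ≈ 31.6 in R^1000
pp, qq = random_projection([p, q], k)
print(dist(p, q), dist(pp, qq))     # the two distances should be close
```

Concentration of the chi-squared distribution makes the relative distortion roughly sqrt(2/k) ≈ 10% here, independent of the original dimension d.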
|12) (Oct 22) Random walks, Markov Chains, and Analysis of convergence. Also, Markovian models.
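Convergence of a random walk can be seen directly by iterating the transition matrix: the distribution of the walk approaches the chain's stationary distribution regardless of the start. The lazy 2-state chain below uses hypothetical transition numbers.

```python
def step(dist, P):
    """One step of a Markov chain: new_dist[j] = sum_i dist[i] * P[i][j],
    where P is row-stochastic."""
    n = len(dist)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

# Hypothetical lazy 2-state chain.
P = [[0.9, 0.1],
     [0.2, 0.8]]
dist = [1.0, 0.0]          # start surely in state 0
for _ in range(200):       # geometric convergence (second eigenvalue 0.7)
    dist = step(dist, P)
print(dist)  # approaches the stationary distribution (2/3, 1/3)
```

Solving pi = pi P by hand for this chain gives pi = (2/3, 1/3), matching the printed values.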
|13) (Oct 24) Finding true dimensionality
of datasets, low-rank matrix approximation, and SVD.
||Lecture 13 notes
|14) (Nov 5) Computing SVD, Power Method,
Recovering planted bisections. (Glimpses of
eigenvalues of random matrices.)
||SVD chapter in Hopcroft-Kannan book.
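The power method from lecture 14 is short enough to sketch in full: repeatedly multiply a vector by the matrix and normalize, then read off the top eigenvalue from the Rayleigh quotient. The 2x2 symmetric matrix below is a made-up example with known eigenvalues 3 and 1.

```python
import math

def power_method(A, iters=100):
    """Estimate the top eigenvalue/eigenvector of a symmetric matrix by
    repeated multiplication and normalization. Converges geometrically
    at the ratio of the second to the first eigenvalue."""
    n = len(A)
    v = [1.0] * n   # any start not orthogonal to the top eigenvector
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    # Rayleigh quotient v^T A v gives the eigenvalue estimate.
    Av = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
    lam = sum(v[i] * Av[i] for i in range(n))
    return lam, v

A = [[2.0, 1.0],
     [1.0, 2.0]]   # eigenvalues 3 (eigenvector (1,1)) and 1
lam, v = power_method(A)
print(lam)  # ~3.0
```

For a general matrix, running the same iteration on A^T A recovers the top singular vector, which is how the method connects to SVD computation.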
|15) (Nov 7) Semidefinite Programming. 0.878-algorithms for MAX-CUT and MAX-2SAT, and the saga of the 0.878.
||Notes delayed; see other profs' notes on the web.
|16) (Nov 12) Oracles, Ellipsoids, and
their uses for convex optimization.
(including solving LPs too big to write down)
|17) (Nov 14) Duality and the minmax
theorem. (Also started gradient descent, which appears
in lecture notes for next lecture.)
|18) (Nov 19) Guest lecture: Bernard
Chazelle (intro to computational geometry)
|19) (Nov 21) Following the slope: Gradient descent.
Offline, Online, Stochastic.
||Chapter 9 of Convex Optimization by Boyd and Vandenberghe (the pdf is available online). The lecture notes are terser but still very readable.
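The offline variant of gradient descent fits in a few lines: step against the gradient at a fixed learning rate. The quadratic objective below is a made-up example whose minimizer (3, -1) is known in closed form.

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Plain (offline) gradient descent: x <- x - lr * grad(x)."""
    x = x0[:]
    for _ in range(steps):
        g = grad(x)
        x = [xi - lr * gi for xi, gi in zip(x, g)]
    return x

# Minimize f(x, y) = (x - 3)^2 + 2 * (y + 1)^2; gradient (2(x-3), 4(y+1)).
grad = lambda v: [2 * (v[0] - 3), 4 * (v[1] + 1)]
x = gradient_descent(grad, [0.0, 0.0])
print(x)  # converges to the minimizer (3, -1)
```

For this smooth strongly convex function, each step contracts the error in every coordinate by a constant factor, so 100 steps reach the optimum to machine-level accuracy.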
|20) (Nov 26) Counting and sampling
problems and their close interrelationship.
Valiant's class #P. Monte Carlo method. Dyer's algorithm for counting knapsack solutions.
||Vigoda's notes (has FPRASes for three problems).
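Counting knapsack solutions exactly (for small capacities) takes a short dynamic program; Dyer's approximation algorithm builds a table of this kind on rounded weights. The weights and capacity below are a made-up example.

```python
def count_knapsack(weights, capacity):
    """Count subsets of `weights` with total weight <= capacity.

    dp[c] = number of subsets of the items seen so far with total weight
    exactly c; iterate c downward so each item is used at most once.
    """
    dp = [0] * (capacity + 1)
    dp[0] = 1                                   # the empty subset
    for w in weights:
        for c in range(capacity, w - 1, -1):
            dp[c] += dp[c - w]
    return sum(dp)

# Subsets of {1, 2, 3} with weight <= 3: {}, {1}, {2}, {3}, {1,2} -> 5.
print(count_knapsack([1, 2, 3], 3))  # 5
```

This exact DP takes time O(n * capacity), which is only pseudo-polynomial; the lecture's point is that clever rounding plus such a table yields an approximate count in genuinely polynomial time.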
|21) (Dec 2) Protecting against
information loss: coding theory.
||Lecture 21 notes
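The simplest error-correcting code, the repetition code, already shows the lecture's theme of protecting against information loss (it is only an illustrative warm-up, not necessarily the codes covered in lecture): repeat each bit r times and decode by majority vote, correcting up to (r-1)/2 flips per block.

```python
def encode(bits, r=3):
    """Repetition code: repeat every bit r times."""
    return [b for bit in bits for b in [bit] * r]

def decode(received, r=3):
    """Majority vote inside each block of r copies; corrects up to
    (r - 1) // 2 flipped bits per block."""
    return [int(sum(received[i:i + r]) > r // 2)
            for i in range(0, len(received), r)]

msg = [1, 0, 1, 1]
code = encode(msg)
code[1] ^= 1                 # flip one bit of the first block (channel noise)
print(decode(code))  # recovers [1, 0, 1, 1]
```

The price is a rate of only 1/r; the lecture's codes achieve constant rate while still correcting a constant fraction of errors.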
|22) (Dec 4) A taste of cryptography: secret sharing and secure multiparty computation.||Lecture 22 notes.
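Shamir's scheme makes the secret-sharing idea concrete: hide the secret as the constant term of a random degree-(k-1) polynomial over a finite field, hand out point evaluations as shares, and recover by Lagrange interpolation at 0. The field size and parameters below are illustrative choices.

```python
import random

P = 2_147_483_647  # prime field size (Mersenne prime 2^31 - 1)

def share(secret, k, n, seed=0):
    """Shamir secret sharing: f is a random degree-(k-1) polynomial over
    GF(P) with f(0) = secret; share i is the point (i, f(i)).
    Any k shares determine f; fewer than k reveal nothing."""
    rng = random.Random(seed)
    coeffs = [secret] + [rng.randrange(P) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, j, P) for j, c in enumerate(coeffs)) % P
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over GF(P)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % P
                den = den * (xi - xj) % P
        # pow(den, P - 2, P) is the modular inverse (Fermat's little theorem).
        total = (total + yi * num * pow(den, P - 2, P)) % P
    return total

shares = share(secret=42, k=3, n=5)
print(reconstruct(shares[:3]), reconstruct(shares[2:]))  # both print 42
```

Any 3 of the 5 shares recover the secret, and which 3 does not matter, as the two reconstructions above show.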
|23) (Dec 9) Heuristics: Algorithms we don't know how to analyze.
Advanced discussion sessions (for students who come to the Friday meetings)
|Date and topic
|1) Random graph theory, indep. set in random
graphs, random 3SAT.
|2) More on polytopes, simplex, LP.
|3) Method of conditional expectations and pessimistic estimators. Intro to online algorithms. Notes by N. Harvey.
|4) More about eigenvalues, SVD
|5) Cheeger inequality, logn-approximation to
sparsest cut via SDP with triangle inequality.
|6) More on LP relaxations for approximation.
Subtour Elimination LP for TSP. Spreading metric
constraints. Ellipsoid method.
|7) Gradient descent flavors. "Primal-dual" view. Steepest descent (l_1, l_2, and Hessian norms). Intro to near-linear time algorithms for Laplacian solvers.