Princeton University

Computer Science 521

Fall 2013

(Important: In light of the new grad course requirements, this course is changing as of Fall 2013 to make it more accessible to CS grads who are not specializing in theoretical CS.)
Design and analysis of algorithms is an important part of computer science today. This course gives a broad yet deep exposure to algorithmic advances of the past few decades, and brings students up to a level where they can read and understand research papers in algorithms. The course is suitable for advanced undergrads and non-CS grads as well, and they will be graded on a different curve. Grads who intend to specialize in theoretical CS are invited to attend extra discussions (at Small World coffee on Friday afternoon) that explore some topics in greater depth.
Thematically, the biggest difference from undergrad algorithms (such as COS 423) is extensive use of ideas such as randomness, approximation, and high-dimensional geometry, which are increasingly important in most applications. We will encounter notions such as algorithm design in the face of uncertainty, approaches to handling big data, handling intractability, heuristic approaches, etc. We will develop all necessary mathematical tools.
Prerequisites: One undergraduate course in algorithms (e.g., COS 423), or equivalent mathematical maturity. Listeners and auditors are welcome with prior permission.
Coursework: Two lectures/week. For motivated students, a 1-hour discussion of advanced ideas each week at Small World Coffee on Friday afternoon. There will be 4 homeworks over the semester, which may include some simple programming-based exploration of the lecture ideas using Matlab or other packages. (Collaboration OK on homeworks.) There will be a take-home final in January. Grads not specializing in theoretical CS will be allowed to substitute a course project (done in groups of 2) + one extra homework for the final. There is no official text. Instead, we will use assorted online resources. Students will be expected to scribe lecture notes once or twice during the term.
Instructor: Sanjeev Arora, 307 CS Building, 609-258-3869, arora AT the domain name cs.princeton.edu
Teaching assistant: Aman Dhesi, adhesi AT the domain name cs.princeton.edu
ENROLLED STUDENTS SHOULD ADD THEMSELVES TO THE DISCUSSION LIST AT PIAZZA.COM
Office hours: Sanjeev: Monday 3:30-5pm in Room 307 and by appointment. Aman: Wed 12-1:30pm.
Handout on term project (note various deadlines).
Take-home Final Exam: Download here when you are ready to work on it. (Finish within 48 hrs.)
Lecture number + Title | Required reading | Further reading + links
1) (Sept 12) How is this course different from undergrad algorithms? Hashing Part 1.
Reading: Lecture 1 notes.

2) Karger's min cut algorithm (and its extensions). A simple and gorgeous intro to randomized algorithms.
Reading: Lecture 2 notes (includes extracts from lecture notes of S. Dasgupta and E. Vigoda).

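A minimal sketch of the contraction idea (illustrative code, not the lecture's presentation; contracting edges in a uniformly random order, skipping self-loops, is equivalent to picking a uniformly random remaining edge at each step):

```python
import random

def karger_min_cut(edges, n, trials=200):
    """Randomized contraction: merge endpoints of random edges until two
    super-vertices remain; the edges crossing between them form a cut.
    Repeating the experiment makes the true min cut likely to appear."""
    best = len(edges)
    for _ in range(trials):
        parent = list(range(n))          # union-find over vertices
        def find(x):
            while parent[x] != x:
                parent[x] = parent[parent[x]]
                x = parent[x]
            return x
        components = n
        pool = edges[:]
        random.shuffle(pool)             # random contraction order
        for u, v in pool:
            if components == 2:
                break
            ru, rv = find(u), find(v)
            if ru != rv:                 # skip self-loops
                parent[ru] = rv
                components -= 1
        cut = sum(1 for u, v in edges if find(u) != find(v))
        best = min(best, cut)
    return best
```

Each single run succeeds with probability at least 2/(n(n-1)), which is why the outer loop repeats the experiment.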
3) Deviation bounds and their applications. Bounds by Markov, Chebyshev, and Chernoff on how much and how often a random variable deviates from its expectation. Applications to load balancing and sampling.
Reading: Lecture 3 notes.
Further reading: Survey of concentration inequalities by Chung and Lu.
4) Hashing to real numbers and its big-data applications. Estimating the size of a set that's too large to write down. Estimating the similarity of two documents using their hashes.
Reading: Lecture 4 notes.

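The document-similarity idea can be sketched with min-hashing (illustrative code; Python's built-in `hash` stands in for the lecture's random hash functions, with a per-seed salt):

```python
def minhash_signature(tokens, seeds):
    """One hash function per seed; keep the minimum hash value over the
    set's tokens. For two sets, Pr[their min-hashes agree] equals their
    Jaccard similarity |A∩B|/|A∪B|."""
    return [min(hash((seed, t)) for t in tokens) for seed in seeds]

def estimated_jaccard(sig_a, sig_b):
    """Fraction of coordinates where the two signatures agree."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)
```

With k seeds the estimate concentrates around the true similarity with standard deviation about 1/sqrt(k), so a short signature summarizes a huge document.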
5) (Sept 24) Linear thinking. (Linear modeling, linear equations and inequalities, linear programming. Examples from econometrics, machine learning, etc.)
Reading: Lecture 5 notes.
Further reading: Also see section 7.1 of the relevant chapter from Dasgupta, Papadimitriou, Vazirani (ugrad text). Analysis of Gaussian elimination (notes by Peter Gacs).
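As a minimal sketch of the Gaussian elimination analyzed in the further reading (illustrative code, not the notes' presentation):

```python
def solve(A, b):
    """Gaussian elimination with partial pivoting on a dense system Ax = b.
    A is a list of rows; inputs are copied, not modified."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix [A | b]
    for col in range(n):
        # pivot: move the largest-magnitude entry in this column up,
        # which keeps the multipliers bounded and the method stable
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    # back-substitution on the resulting upper-triangular system
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

The elimination phase dominates the cost at O(n^3) arithmetic operations.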
6) (Oct 1) Provable approximation via linear programming. (Min Vertex Cover, MAX-2SAT, virtual circuit routing.)
Reading: Lecture 6 notes.

7) (Oct 3) Decision-making under uncertainty: Part 1. Basics of rational choice theory. Optimal decisions via dynamic programming. Markov Decision Processes (MDPs) and stationary solutions via LP.
Reading: Lecture 7 notes. This old article may still be useful if you want to read more.
Further reading: The Wikipedia page has many pointers. Lots of other material on the web, but everything is very notation-heavy. (Informal) Five commandments about decision theory. For those with further interest in MDPs, there are Andrew Moore's notes and Jay Taylor's excellent survey.
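The dynamic-programming route to MDPs can be sketched with value iteration (illustrative code; the LP formulation mentioned in the lecture finds the same stationary solution):

```python
def value_iteration(P, R, gamma=0.9, iters=500):
    """P[s][a] = list of (probability, next_state); R[s][a] = reward.
    Repeatedly apply the Bellman update V(s) <- max_a [R(s,a) +
    gamma * E V(next)]; the update is a gamma-contraction, so V
    converges to the optimal value function. Returns V and a greedy
    policy."""
    n = len(P)
    V = [0.0] * n
    for _ in range(iters):
        V = [max(R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a])
                 for a in range(len(P[s])))
             for s in range(n)]
    policy = [max(range(len(P[s])),
                  key=lambda a: R[s][a] + gamma * sum(p * V[t] for p, t in P[s][a]))
              for s in range(n)]
    return V, policy
```

For example, in a two-state MDP where state 0 can stay (reward 0) or move to an absorbing state 1 (reward 1), the greedy policy moves.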
8) (Oct 8) Decision-making under total uncertainty. (Hint: minimize your regret!)
Reading: Lecture 8 notes.
Further reading: Arora-Hazan-Kale survey.
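The regret-minimizing multiplicative weights rule surveyed by Arora-Hazan-Kale can be sketched as (illustrative code):

```python
def multiplicative_weights(loss_rounds, eta=0.1):
    """Hedge over n experts: after each round, multiply each expert's
    weight by (1 - eta * loss), losses in [0, 1]. Playing weights
    proportionally guarantees total loss within roughly
    (ln n)/eta + eta*T of the best expert in hindsight.
    Returns final normalized weights and each expert's total loss."""
    n = len(loss_rounds[0])
    w = [1.0] * n
    total = [0.0] * n
    for losses in loss_rounds:
        for i, loss in enumerate(losses):
            total[i] += loss
            w[i] *= (1 - eta * loss)
    s = sum(w)
    return [wi / s for wi in w], total
```

After enough rounds, almost all weight concentrates on the expert with the smallest cumulative loss.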
9) (Oct 10) NO CLASS (makeup session during reading period).

10) (Oct 15) Using multiplicative weights for LP solving, game playing, and portfolio management. (+ glimpses of duality.)
Reading: Lecture 10 notes.
Further reading: Papers cited in the notes. Wikipedia entry on the traditional stock-pricing theory.
11) (Oct 17) High-dimensional geometry. Curse (and blessing) of dimensionality. Dimension reduction.
Reading: Lecture 11 notes.

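Dimension reduction via a random Gaussian projection, in the style of Johnson-Lindenstrauss, can be sketched as (illustrative code, not the lecture's notation):

```python
import math, random

def random_projection(points, k, rng=random.Random(0)):
    """Map d-dimensional points to k dimensions by multiplying with a
    k x d matrix of independent Gaussians scaled by 1/sqrt(k). Pairwise
    distances among n points are preserved up to (1 +/- eps) w.h.p.
    once k = O(log n / eps^2), independent of d."""
    d = len(points[0])
    G = [[rng.gauss(0, 1) / math.sqrt(k) for _ in range(d)] for _ in range(k)]
    return [[sum(G[i][j] * p[j] for j in range(d)) for i in range(k)]
            for p in points]
```

The striking part is that k depends only on the number of points and the accuracy, not on the ambient dimension d.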
12) (Oct 22) Random walks, Markov chains, and analysis of convergence. Also, Markovian models.
Reading: Lecture 12 notes.

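Convergence to the stationary distribution can be observed directly on a small example (illustrative code: a lazy random walk on the n-cycle, which mixes to uniform):

```python
def walk_distribution(n, steps):
    """Lazy random walk on the n-cycle: stay put w.p. 1/2, else move to
    a uniform neighbor. Laziness makes the chain aperiodic, so the
    distribution started at vertex 0 converges to uniform."""
    dist = [0.0] * n
    dist[0] = 1.0
    for _ in range(steps):
        dist = [0.5 * dist[v] + 0.25 * dist[(v - 1) % n] + 0.25 * dist[(v + 1) % n]
                for v in range(n)]
    return dist

def tv_distance(p, q):
    """Total variation distance, the standard measure of mixing."""
    return 0.5 * sum(abs(a - b) for a, b in zip(p, q))
```

On the n-cycle the walk needs on the order of n^2 steps to mix, a gap that the eigenvalue analysis in the lecture makes precise.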
13) (Oct 24) Finding the true dimensionality of datasets, low-rank matrix approximation, and SVD.
Reading: Lecture 13 notes.

14) (Nov 5) Computing SVD: power method, recovering planted bisections. (Glimpses of eigenvalues of random matrices.)
Reading: Lecture 14 notes.
Further reading: SVD chapter in the Hopcroft-Kannan book.
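The power method for the top eigenvector of a symmetric matrix can be sketched as (illustrative code; the lecture applies the same idea to singular vectors):

```python
import math

def power_method(A, iters=200):
    """Repeatedly multiply a vector by A and renormalize. The component
    along the top eigenvector grows fastest, so the iterate converges
    to it at rate (|lambda_2| / |lambda_1|)^t. Returns the Rayleigh
    quotient (the top eigenvalue) and the eigenvector estimate."""
    n = len(A)
    v = [1.0 / math.sqrt(n)] * n
    for _ in range(iters):
        w = [sum(A[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    eigenvalue = sum(v[i] * sum(A[i][j] * v[j] for j in range(n))
                     for i in range(n))
    return eigenvalue, v
```

In practice the starting vector is random so that it has a nonzero component along the top eigenvector.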
15) (Nov 7) Semidefinite programming. 0.878-algorithms for MAX-CUT and MAX-2SAT, and the saga of the 0.878.
Reading: Notes delayed; see other professors' notes on the web.
Further reading: Satish Rao's notes.
16) (Nov 12) Oracles, ellipsoids, and their uses for convex optimization (including solving LPs too big to write down).
Reading: Lecture 16 notes.
Further reading: Santosh Vempala's notes.
17) (Nov 14) Duality and the min-max theorem. (Also started gradient descent, which appears in the lecture notes for the next lecture.)
Reading: Lecture 17 notes.

18) (Nov 19) Guest lecture: Bernard Chazelle (intro to computational geometry).

19) (Nov 21) Following the slope: gradient descent. Offline, online, stochastic.
Reading: Lecture 19 notes.
Further reading: Chapter 9 of Convex Optimization by Boyd and Vandenberghe (pdf is online). Nesterov's notes are terser but still very readable.
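The basic offline variant can be sketched in a few lines (illustrative code; the online and stochastic variants swap in streaming or noisy gradients):

```python
def gradient_descent(grad, x0, eta=0.1, iters=1000):
    """Plain gradient descent: x <- x - eta * grad(x). For a smooth
    convex objective, a fixed small step size eta already guarantees
    convergence to the minimizer."""
    x = list(x0)
    for _ in range(iters):
        g = grad(x)
        x = [xi - eta * gi for xi, gi in zip(x, g)]
    return x
```

For instance, minimizing f(x) = (x_1 - 3)^2 + (x_2 + 1)^2 from any starting point converges to (3, -1), since each step contracts the distance to the minimizer by a constant factor.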
20) (Nov 26) Counting and sampling problems and their close interrelationship. Valiant's class #P. The Monte Carlo method. Dyer's algorithm for counting knapsack solutions.
Reading: Lecture 20 notes.
Further reading: Eric Vigoda's notes (has FPRASes for three problems).
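The naive Monte Carlo baseline for counting knapsack solutions can be sketched as (illustrative code; it is only reliable when feasible subsets are a non-negligible fraction of all subsets, which is the gap Dyer's algorithm closes):

```python
import random

def estimate_knapsack_count(weights, capacity, samples=20000,
                            rng=random.Random(0)):
    """Sample uniformly random subsets of the items and multiply the
    fraction that fit within the capacity by 2^n, the total number of
    subsets. An unbiased estimator of the number of feasible subsets."""
    n = len(weights)
    hits = 0
    for _ in range(samples):
        total = sum(w for w in weights if rng.random() < 0.5)
        hits += (total <= capacity)
    return hits / samples * 2 ** n
```

When feasible subsets are exponentially rare, this estimator needs exponentially many samples, which is why a clever sampling scheme is required in general.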
21) (Dec 2) Protecting against information loss: coding theory.
Reading: Lecture 21 notes.

22) (Dec 4) A taste of cryptography: secret sharing and secure multiparty computation.
Reading: Lecture 22 notes.

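Secret sharing can be sketched with Shamir's polynomial scheme (illustrative code; the prime and parameters are arbitrary choices, not from the notes):

```python
import random

PRIME = 2 ** 31 - 1   # a Mersenne prime; all arithmetic is in Z_PRIME

def share(secret, k, n, rng=random.Random(7)):
    """Shamir's scheme: hide the secret as f(0) of a random polynomial
    of degree k-1 over Z_PRIME. Any k of the n shares (i, f(i))
    determine f by interpolation; any k-1 reveal nothing."""
    coeffs = [secret] + [rng.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, j, PRIME) for j, c in enumerate(coeffs)) % PRIME
    return [(i, f(i)) for i in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 over the field Z_PRIME; modular
    inverses come from Fermat's little theorem."""
    secret = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if j != i:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        secret = (secret + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return secret
```

Any k shares suffice, and which k does not matter, since k points determine a degree-(k-1) polynomial.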
23) (Dec 9) Heuristics: algorithms we don't know how to analyse.
Reading: Lecture 23 notes.
(for students who come to the Friday meetings)
Date and topic | Reading | Additional reading/links
1) Random graph theory, independent sets in random graphs, random 3SAT.
Reading: Alon-Spencer.

2) More on polytopes, simplex, LP. 

3) Method of conditional expectations and pessimistic estimators. Intro to online algorithms.
Reading: Lecture notes by N. Harvey.
Additional reading: Buchbinder-Naor survey in Foundations and Trends in TCS.
4) More about eigenvalues, SVD 

5) Cheeger inequality, log n-approximation to sparsest cut via SDP with triangle inequality.

6) More on LP relaxations for approximation. Subtour elimination LP for TSP. Spreading metric constraints. Ellipsoid method.

7) Gradient descent flavors. "Primal-dual" view. Steepest descent (l_1, l_2, and Hessian norms). Intro to near-linear-time algorithms for Laplacian solvers and flows.