Flexible, Reliable, and Scalable Nonparametric Learning

Date and Time: Monday, November 24, 2014, 4:30pm to 5:30pm
Location: Computer Science Small Auditorium (Room 105)
Type: CS Department Colloquium Series
Host: Barbara Engelhardt

Erik Sudderth

Applications of statistical machine learning increasingly involve datasets with rich hierarchical, temporal, spatial, or relational structure.  Bayesian nonparametric models offer the promise of effective learning from big datasets, but standard inference algorithms often fail in subtle and hard-to-diagnose ways.  We explore this issue via variants of a popular and general model family, the hierarchical Dirichlet process.  We propose a framework for "memoized" online optimization of variational learning objectives, which achieves computational scalability by processing local batches of data, while simultaneously adapting the global model structure in a coherent fashion.  Using this approach, we build improved models of text, audio, image, and social network data.
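The "memoized" batch updates mentioned in the abstract can be sketched in a few lines. The toy Python example below is an illustrative assumption, not the speaker's actual code or API (the name memoized_em and all parameters are hypothetical): it caches each data batch's sufficient statistics, so that revisiting a batch swaps its old statistics for fresh ones and the global summary always covers the full dataset. The real method optimizes a variational objective for hierarchical Dirichlet process models and also adapts the number of components, both of which this sketch omits in favor of a fixed-size Gaussian mixture.

```python
import numpy as np

def memoized_em(X, n_components=3, batch_size=100, n_laps=5, seed=0):
    """Toy memoized batch updates for a unit-variance Gaussian mixture.

    Illustrative sketch only; the actual method in the talk optimizes a
    variational objective for hierarchical Dirichlet process models.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    batches = np.array_split(np.arange(n), max(1, n // batch_size))

    # Global sufficient statistics: a pseudo-count of 1 per component plus,
    # once visited, the memoized contribution of every batch.
    counts = np.ones(n_components)
    sums = X[rng.choice(n, n_components, replace=False)].astype(float)
    # Cached (memoized) per-batch statistics, initially empty.
    memo = {b: (np.zeros(n_components), np.zeros((n_components, d)))
            for b in range(len(batches))}

    for _ in range(n_laps):
        for b, idx in enumerate(batches):
            mu = sums / counts[:, None]  # current global means
            # Local step: soft responsibilities under unit-variance Gaussians.
            sq = ((X[idx, None, :] - mu[None, :, :]) ** 2).sum(axis=-1)
            logw = np.log(counts / counts.sum()) - 0.5 * sq
            r = np.exp(logw - logw.max(axis=1, keepdims=True))
            r /= r.sum(axis=1, keepdims=True)
            # Global step: swap this batch's old cached statistics for its
            # new ones, keeping the global summary consistent with all data.
            new_counts, new_sums = r.sum(axis=0), r.T @ X[idx]
            old_counts, old_sums = memo[b]
            counts += new_counts - old_counts
            sums += new_sums - old_sums
            memo[b] = (new_counts, new_sums)
    return sums / counts[:, None]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(c, 1.0, size=(200, 2)) for c in (-4.0, 0.0, 4.0)])
    print(memoized_em(X))  # means should land near (-4,-4), (0,0), (4,4)
```

One appeal of this style of update, in contrast to stochastic gradient approaches, is that it needs no learning-rate schedule: the cached batch statistics always sum to an exact summary of the whole dataset.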

Erik B. Sudderth is an Assistant Professor in the Brown University Department of Computer Science. He received his Bachelor's degree (summa cum laude, 1999) in Electrical Engineering from the University of California, San Diego, and his Master's and Ph.D. degrees (2006) in EECS from the Massachusetts Institute of Technology. His research interests include probabilistic graphical models; nonparametric Bayesian methods; and applications of statistical machine learning in computer vision and the sciences. He received an NSF CAREER award in 2014, and in 2008 was named one of "AI's 10 to Watch" by IEEE Intelligent Systems Magazine.
