
Some Algorithmic Challenges in Statistics

Date and Time
Monday, December 1, 2014 - 4:30pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
CS Department Colloquium Series

Sham Kakade

Machine learning is seeing tremendous progress in its impact on society. Along with this progress comes an increasing role for both scalable algorithms and theoretical foundations; the hope is that such progress can facilitate further breakthroughs on core AI problems. This talk will survey recent progress and future challenges at the intersection of computer science and statistics, with a focus on three areas:
 
The first is learning the interactions between observed variables when there are latent (or hidden) causes that help to explain the correlations in the observed data. Settings where latent variable models have seen success include document (or topic) modeling, hidden Markov models (say, for modeling time series of acoustic signals), and discovering communities of individuals in social networks.
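To make the setting concrete, here is a minimal Python sketch (not from the talk; the vocabulary, topic probabilities, and sizes are made-up toy values) of a latent variable model in the topic-modeling style: a hidden topic drawn for each document is the latent cause, and the co-occurrence statistics of the observed words reflect that hidden structure.

    import numpy as np

    # Toy topic model: each document has one hidden topic; words are drawn
    # independently given the topic, so any correlation among observed words
    # is explained entirely by the latent cause.
    rng = np.random.default_rng(0)

    n_topics, vocab_size, n_docs, words_per_doc = 2, 6, 2000, 20
    topic_word = np.array([
        [0.40, 0.30, 0.20, 0.05, 0.03, 0.02],   # topic 0 favors words 0-2
        [0.02, 0.03, 0.05, 0.20, 0.30, 0.40],   # topic 1 favors words 3-5
    ])
    topic_prior = np.array([0.5, 0.5])

    docs = []
    for _ in range(n_docs):
        z = rng.choice(n_topics, p=topic_prior)                      # latent topic
        words = rng.choice(vocab_size, size=words_per_doc, p=topic_word[z])
        docs.append(np.bincount(words, minlength=vocab_size))

    freq = np.array(docs, dtype=float)
    freq /= freq.sum(axis=1, keepdims=True)

    # Empirical word co-occurrence across documents: words favored by the same
    # hidden topic co-occur, which is the structure an estimator tries to recover.
    cooc = freq.T @ freq / n_docs
    print(np.round(cooc, 4))

The printed co-occurrence matrix has a visible block structure (words 0-2 with each other, words 3-5 with each other), which is the kind of observed-data signature that estimation procedures for latent variable models exploit.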
 
The second is stochastic optimization. Many problems that arise in science and engineering are ones in which we have only a stochastic approximation to the underlying problem at hand (e.g., linear regression or other problems where the objective function is a sample average). Such problems highlight some of the challenges we face at the interface of computer science and statistics: should we use a highly (numerically) accurate algorithm (with costly time and space requirements), or a crude stochastic approximation scheme like stochastic gradient descent (which is light on memory and simple to implement, yet has a poor convergence rate)?
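As a rough illustration of this trade-off, the following Python sketch (illustrative only; the problem sizes and step size are arbitrary assumptions, not choices from the talk) compares an exact least-squares solve with a single pass of stochastic gradient descent on a simulated linear-regression problem whose objective is a sample average.

    import numpy as np

    # Simulated linear regression: the objective is the average squared error
    # over n samples.
    rng = np.random.default_rng(1)
    n, d = 10_000, 20
    X = rng.normal(size=(n, d))
    w_true = rng.normal(size=d)
    y = X @ w_true + 0.1 * rng.normal(size=n)

    # "Accurate" route: a numerically careful least-squares solve over all the data.
    w_exact, *_ = np.linalg.lstsq(X, y, rcond=None)

    # "Crude" route: one pass of SGD, touching each example once,
    # using O(d) memory and a fixed step size.
    w_sgd = np.zeros(d)
    step = 0.01
    for i in rng.permutation(n):
        grad = (X[i] @ w_sgd - y[i]) * X[i]   # gradient of one squared-error term
        w_sgd -= step * grad

    print("exact solve error:", np.linalg.norm(w_exact - w_true))
    print("one-pass SGD error:", np.linalg.norm(w_sgd - w_true))

The exact solve is more accurate but pays for it in time and memory, while the single SGD pass gets within a modest statistical error at a fraction of the cost, which is precisely the tension the talk highlights.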

Finally, I will briefly discuss future challenges inspired by the impressive successes of deep learning.

 
A recurring theme is that algorithmic advances can provide new practical techniques for statistical estimation.
 
Sham Kakade is a principal research scientist at Microsoft Research, New England. His research focus is on designing scalable and efficient algorithms for machine learning and artificial intelligence; he has worked (and has continuing interests) in areas such as statistics, optimization, probability theory, algorithms, economics, and neuroscience. Previously, Dr. Kakade was an associate professor in the Department of Statistics at the Wharton School, University of Pennsylvania (from 2010 to 2012) and an assistant professor at the Toyota Technological Institute at Chicago. Before this, he did a postdoc in the Computer and Information Science department at the University of Pennsylvania under the supervision of Michael Kearns. Dr. Kakade completed his PhD at the Gatsby Unit, where his advisor was Peter Dayan. Before Gatsby, Dr. Kakade was an undergraduate at Caltech, where he earned his BS in physics.