Composing differentiable procedures for modeling, inference, and optimization

Date and Time: Thursday, February 18, 2016, 12:30pm to 1:30pm
Location: Computer Science Small Auditorium (Room 105)
Type: CS Department Colloquium Series
Speaker: David Duvenaud
Host: Barbara Engelhardt

Much recent success in machine learning has come from optimizing simple feedforward procedures, such as neural networks, using gradients. Surprisingly, many more complex procedures, such as message passing, filtering, inference, and even optimization itself, can also be meaningfully differentiated through. Composing these procedures lets us build sophisticated models that generalize existing methods while retaining their good properties. We'll show applications to chemical design, gradient-based tuning of optimization procedures, and training procedures that don't require cross-validation.
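To make the idea of differentiating through optimization itself concrete, here is a minimal, hypothetical sketch (written in JAX; not taken from the talk materials) that unrolls a few steps of gradient descent on a toy least-squares problem and then takes the gradient of the final training loss with respect to the learning rate, the basic ingredient of gradient-based tuning of optimization procedures.

```python
# Sketch: differentiate through an unrolled optimization procedure (assumes JAX).
import jax
import jax.numpy as jnp


def inner_loss(w, x, y):
    # Simple least-squares loss for the inner training problem.
    return jnp.mean((x @ w - y) ** 2)


def train(learning_rate, w0, x, y, steps=20):
    # Unrolled gradient descent: each step is differentiable, so the whole
    # training procedure is a differentiable function of the learning rate.
    w = w0
    grad_fn = jax.grad(inner_loss)
    for _ in range(steps):
        w = w - learning_rate * grad_fn(w, x, y)
    return inner_loss(w, x, y)


# Toy data (hypothetical example values).
key = jax.random.PRNGKey(0)
x = jax.random.normal(key, (50, 3))
w_true = jnp.array([1.0, -2.0, 0.5])
y = x @ w_true
w0 = jnp.zeros(3)

# Gradient of the post-training loss with respect to the learning rate:
# this gradient could itself be used to tune the optimizer's hyperparameters.
final_loss, dloss_dlr = jax.value_and_grad(train)(0.05, w0, x, y)
print(final_loss, dloss_dlr)
```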

David Duvenaud is a postdoc in the Harvard Intelligent Probabilistic Systems group, working with Prof. Ryan Adams on model-based optimization, synthetic chemistry, and neural networks. He did his Ph.D. at the University of Cambridge with Carl Rasmussen and Zoubin Ghahramani. Before that, he worked on machine vision with Kevin Murphy at the University of British Columbia and later at Google Research. David also co-founded Invenia, an energy forecasting and trading firm.
