Machine Learning: Higher, Faster, Stronger

Date and Time
Tuesday, February 21, 2012 - 4:30pm to 5:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
CS Department Colloquium Series
Speaker
Ohad Shamir, from Microsoft Research New England
Host
David Blei
Over the past decade, machine learning has emerged as a major and highly influential discipline of computer science and engineering. As the scope and variety of its applications increase, it faces novel and increasingly challenging settings that go beyond classical learning frameworks. In this talk, I will present two recent works in this category. The first introduces a new model of sequential decision making with partial information. The model interpolates between two well-known online learning settings ("experts" and multi-armed bandits), trading off the information obtained per round against the total number of rounds required to reach the same performance. The second work addresses the problem of parallelizing gradient-based learning algorithms, which is increasingly important for web-scale applications but highly non-trivial, since these algorithms are inherently sequential. We show how this can be done using a generic and simple protocol, prove its theoretical optimality, and substantiate its performance experimentally on large-scale data.
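
To make the parallelization theme concrete, the sketch below illustrates the general idea of distributed mini-batch gradient averaging: simulated workers each compute a gradient on their own mini-batch, the gradients are averaged, and a single synchronized update is applied. This is a minimal illustrative sketch only; the logistic loss, function names, and hyperparameters are assumptions for demonstration, not the speaker's actual protocol or results.

```python
import numpy as np

def logistic_grad(w, X, y):
    """Gradient of the average logistic loss over (X, y), with labels in {-1, +1}."""
    margins = y * (X @ w)
    coeff = -y / (1.0 + np.exp(margins))
    return (X.T @ coeff) / len(y)

def parallel_sgd(X, y, num_workers=4, batch_size=32, step_size=0.1,
                 num_rounds=200, seed=0):
    """Synchronized gradient averaging across simulated workers (illustrative)."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    for _ in range(num_rounds):
        # Each worker draws an independent mini-batch and computes a local gradient.
        grads = []
        for _ in range(num_workers):
            idx = rng.choice(len(y), size=batch_size, replace=False)
            grads.append(logistic_grad(w, X[idx], y[idx]))
        # Averaging the workers' gradients amounts to one large mini-batch step.
        w -= step_size * np.mean(grads, axis=0)
    return w

if __name__ == "__main__":
    # Synthetic linearly separable-ish data, purely for demonstration.
    rng = np.random.default_rng(1)
    w_true = rng.normal(size=10)
    X = rng.normal(size=(5000, 10))
    y = np.sign(X @ w_true + 0.1 * rng.normal(size=5000))
    w_hat = parallel_sgd(X, y)
    print(f"training accuracy: {np.mean(np.sign(X @ w_hat) == y):.3f}")
```

The design point the sketch tries to convey is that the per-round communication is a single averaged gradient, so the scheme stays generic across loss functions while behaving like serial SGD with a larger effective batch.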

Ohad Shamir is a postdoctoral researcher at Microsoft Research New England. He joined Microsoft in 2010 after receiving a Ph.D. in computer science from the Hebrew University, advised by Prof. Naftali Tishby. His research focuses on machine learning, with an emphasis on novel algorithms that combine practical applicability and theoretical insight. His work has been recognized by several awards, including the Hebrew University's Schlomiuk Ph.D. thesis prize, the COLT 2010 Best Paper Award, and a Wolf Foundation scholarship.
