
Matrix Completion and Matrix Concentration

Date and Time
Monday, December 14, 2015 - 12:30pm to 1:30pm
Location
Computer Science Small Auditorium (Room 105)
Type
CS Department Colloquium Series
Host
Computer Science Department and The Center for Statistics and Machine Learning

Speaker
Lester Mackey
The goal in matrix completion is to recover a matrix from a small subset of noisy entries. Web-scale instances of this problem include collaborative filtering for recommendation and link prediction in social networks. Many modern matrix completion algorithms provide strong recovery guarantees but scale poorly due to the repeated and costly computation of truncated SVDs. To address this deficiency, in the first part of this talk, I will introduce Divide-Factor-Combine (DFC), a parallel divide-and-conquer framework for boosting the scalability of a matrix completion algorithm while retaining its theoretical guarantees. Our experiments demonstrate the near-linear to super-linear speed-ups attainable with this approach, and our analysis shows that DFC enjoys high-probability recovery guarantees comparable to those of its base algorithm.
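To make the divide-and-conquer idea concrete, below is a minimal, hypothetical sketch of a DFC-style pipeline in Python/NumPy: the columns of the partially observed matrix are partitioned, each block is completed independently by a stand-in base solver, and the block estimates are combined by column projection. The function names and the toy hard-impute base solver are illustrative assumptions, not the speaker's implementation.

```python
# Illustrative sketch of a Divide-Factor-Combine (DFC) style pipeline.
# The base solver and the column-projection combine step are assumptions
# made for this example, not the algorithm presented in the talk.

import numpy as np


def base_complete(M_obs, mask, rank, n_iters=100):
    """Toy base completion solver: hard-impute via truncated SVD.

    M_obs holds observed entries (zeros elsewhere); mask marks observed
    positions. Any off-the-shelf completion algorithm could stand in here.
    """
    X = M_obs.copy()
    for _ in range(n_iters):
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        X = (U[:, :rank] * s[:rank]) @ Vt[:rank]  # project to rank r
        X[mask] = M_obs[mask]                     # re-impose observed entries
    return X


def dfc_proj(M_obs, mask, rank, n_blocks=4):
    """Divide-Factor-Combine with a column-projection combine step.

    Divide:  split the columns into n_blocks submatrices.
    Factor:  complete each submatrix independently (parallelizable).
    Combine: project the concatenated estimates onto the column space
             of the first completed block (one variant of the combine step).
    """
    col_blocks = np.array_split(np.arange(M_obs.shape[1]), n_blocks)
    completed = [base_complete(M_obs[:, c], mask[:, c], rank) for c in col_blocks]
    C = np.hstack(completed)          # blocks are contiguous, so column order is preserved
    C1 = completed[0]
    return C1 @ np.linalg.pinv(C1) @ C  # orthogonal projection onto col(C1)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    m, n, r = 60, 80, 3
    L = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))  # low-rank truth
    mask = rng.random((m, n)) < 0.5                                # 50% of entries observed
    M_obs = np.where(mask, L + 0.01 * rng.standard_normal((m, n)), 0.0)
    L_hat = dfc_proj(M_obs, mask, rank=r)
    print(f"relative recovery error: {np.linalg.norm(L_hat - L) / np.linalg.norm(L):.3f}")
```

Because the blocks are completed independently, the Factor step can be distributed across machines, which is where the near-linear speed-ups come from.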

Fundamental to our analysis – and to the analyses of many matrix completion procedures – are matrix concentration inequalities that characterize the fluctuations of a random matrix about its mean. In the second part of this talk, I will show how Stein’s method of exchangeable pairs can be used to derive concentration inequalities for matrix-valued random elements. When applied to a sum of independent random matrices, this approach yields matrix generalizations of the classical inequalities due to Hoeffding, Bernstein, Khintchine, and Rosenthal. The same technique delivers bounds for sums of dependent random matrices and more general matrix functionals of dependent random elements.
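For reference, one representative inequality of this kind is the matrix Bernstein bound for a sum of independent, centered, self-adjoint random matrices, stated below in a standard form; the exchangeable-pairs derivation discussed in the talk yields bounds of this flavor, possibly with different constants.

```latex
% Matrix Bernstein inequality (standard form).
% Let X_1, ..., X_n be independent, self-adjoint d x d random matrices with
% E[X_k] = 0 and ||X_k|| <= R almost surely. Then for all t >= 0,
\[
  \mathbb{P}\!\left( \Bigl\| \sum_{k=1}^{n} X_k \Bigr\| \ge t \right)
  \;\le\; 2d \,\exp\!\left( \frac{-t^2/2}{\sigma^2 + Rt/3} \right),
  \qquad
  \sigma^2 := \Bigl\| \sum_{k=1}^{n} \mathbb{E}\bigl[X_k^2\bigr] \Bigr\|,
\]
% where ||.|| denotes the spectral (operator) norm.
```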
