Robert E. Schapire is a Visiting Lecturer in Computer Science. A specialist in machine learning, he is best known as a primary developer of a technique called boosting, in which many weak and inaccurate prediction methods are combined to form a highly accurate predictor. Since Dr. Schapire and his colleague Yoav Freund, a computer science professor at the University of California, San Diego, first proposed the technique, boosting algorithms have been used in a wide variety of applications. Dr. Schapire is a graduate of Brown University and received his master's and doctorate from the Massachusetts Institute of Technology. He did postdoctoral work at Harvard University and was a researcher at AT&T Bell Laboratories before serving on the Princeton faculty from 2002 to 2015. A fellow of the Association for the Advancement of Artificial Intelligence, he and Freund received the 2003 Gödel Prize for the development of the AdaBoost algorithm and the 2004 Paris Kanellakis Theory and Practice Award for the development of the theory and practice of boosting.
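The combining of weak learners described above can be made concrete with a small sketch. The following is a minimal, illustrative implementation of the AdaBoost idea using threshold "decision stumps" as the weak learners; it is a toy version for intuition, not code from any of the publications below, and all function names and data are illustrative assumptions:

```python
import numpy as np

def train_stump(X, y, w):
    """Find the decision stump with the lowest weighted error under weights w."""
    n, d = X.shape
    best = (float("inf"), 0, 0.0, 1)  # (weighted error, feature, threshold, polarity)
    for j in range(d):
        for t in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - t) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, t, polarity)
    return best

def stump_predict(X, j, t, polarity):
    """Predict +1/-1 by thresholding feature j at t."""
    return np.where(polarity * (X[:, j] - t) >= 0, 1, -1)

def adaboost(X, y, rounds=20):
    """Boost decision stumps on labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)              # start with a uniform distribution
    ensemble = []
    for _ in range(rounds):
        err, j, t, pol = train_stump(X, y, w)
        err = max(err, 1e-12)            # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak learner
        pred = stump_predict(X, j, t, pol)
        w *= np.exp(-alpha * y * pred)   # up-weight the examples this stump got wrong
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    """Final prediction: sign of the weighted vote of all weak learners."""
    score = sum(a * stump_predict(X, j, t, p) for a, j, t, p in ensemble)
    return np.sign(score)
```

Each round re-weights the training examples so that the next weak learner concentrates on the points its predecessors misclassified; the final predictor is a weighted majority vote of all the stumps.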
- “Convergence and consistency of regularized boosting with weakly dependent observations.” Aurélie C. Lozano, Sanjeev R. Kulkarni and Robert E. Schapire. IEEE Transactions on Information Theory, 60(1):651-660, 2014.
- “Explaining AdaBoost.” Robert E. Schapire. In Bernhard Schölkopf, Zhiyuan Luo, and Vladimir Vovk, editors, “Empirical Inference: Festschrift in Honor of Vladimir N. Vapnik.” Springer, 2013.
- “A theory of multiclass boosting.” Indraneel Mukherjee and Robert E. Schapire. Journal of Machine Learning Research 14:437-497, 2013. (Preliminary version appeared in Advances in Neural Information Processing Systems 23, 2011.)
- “The rate of convergence of AdaBoost.” Indraneel Mukherjee, Cynthia Rudin and Robert E. Schapire. Journal of Machine Learning Research 14:2315-2347, 2013. (Preliminary version appeared in The 24th Conference on Learning Theory, 2011.)
- “Boosting: Foundations and Algorithms.” Robert E. Schapire and Yoav Freund. MIT Press, 2012.