Publication list

Aurélie C. Lozano, Sanjeev R. Kulkarni and Robert E. Schapire.
Convergence and consistency of regularized boosting with weakly
dependent observations.
IEEE Transactions on Information Theory, 60(1):651–660, 2014.
Pdf.

Robert E. Schapire.
Explaining AdaBoost.
In Bernhard Schölkopf, Zhiyuan Luo and Vladimir Vovk, editors,
Empirical Inference: Festschrift in Honor of
Vladimir N. Vapnik,
Springer, 2013.
Pdf.

Indraneel Mukherjee and Robert E. Schapire.
A theory of multiclass boosting.
Journal of Machine Learning Research, 14:437–497, 2013.
Preliminary version appeared in
Advances in Neural Information Processing Systems 23, 2011.
Pdf.

Indraneel Mukherjee, Cynthia Rudin and Robert E. Schapire.
The rate of convergence of AdaBoost.
Journal of Machine Learning Research, 14:2315–2347, 2013.
Preliminary version appeared in
The 24th Conference on Learning Theory, 2011.
Pdf.

Robert E. Schapire and Yoav Freund.
Boosting: Foundations and Algorithms.
MIT Press, 2012.
Publisher's site.

Alekh Agarwal, Miroslav Dudík, Satyen Kale, John Langford and
Robert E. Schapire.
Contextual bandit learning with predictable rewards.
In Proceedings of the Fifteenth International Conference on
Artificial Intelligence and Statistics, 2012.
Pdf.

Cynthia Rudin, Robert E. Schapire and Ingrid Daubechies.
Does AdaBoost always cycle? [open problem].
In Proceedings of the 25th Annual Conference on Learning
Theory, 2012.
Pdf.

Sina Jafarpour, Volkan Cevher and Robert E. Schapire.
A game theoretic approach to expander-based compressive
sensing.
In Proceedings, IEEE International Symposium on Information
Theory, 2011.
Pdf.

Alina Beygelzimer, John Langford, Lihong Li, Lev Reyzin
and Robert E. Schapire.
Contextual bandit algorithms with supervised learning
guarantees.
In Proceedings of the Fourteenth International Conference on
Artificial Intelligence and Statistics, 2011.
Pdf.

Wei Chu, Lihong Li, Lev Reyzin and Robert E. Schapire.
Contextual bandits with linear payoff functions.
In Proceedings of the Fourteenth International Conference on
Artificial Intelligence and Statistics, 2011.
Pdf.

Sina Jafarpour, Robert E. Schapire and Volkan Cevher.
Compressive sensing meets game theory.
In Proceedings of the IEEE International Conference on Acoustics,
Speech, and Signal Processing, 2011.
Pdf.

Umar Syed and Robert E. Schapire.
A reduction from apprenticeship learning to
classification.
In Advances in Neural Information Processing Systems
23, 2011.
Pdf.

Satyen Kale, Lev Reyzin and Robert E. Schapire.
Nonstochastic bandit slate problems.
In Advances in Neural Information Processing Systems
23, 2011.
Pdf.

Indraneel Mukherjee and Robert E. Schapire.
Learning with continuous experts using drifting games.
Theoretical Computer Science, 411:26702683, 2010.
Pdf.

Berk Kapicioglu, Robert E. Schapire, Martin Wikelski and Tamara Broderick.
Combining spatial and telemetric features for learning
animal movement models.
In Proceedings of the 26th Conference on Uncertainty in
Artificial Intelligence, 2010.
Pdf.

Lihong Li, Wei Chu, John Langford and Robert E. Schapire.
A contextual-bandit approach to personalized news article
recommendation.
In Proceedings of the 19th International Conference on World Wide
Web, 2010.
Pdf.

Robert E. Schapire.
The convergence rate of AdaBoost [open problem].
In The 23rd Conference on Learning Theory, 2010.
Pdf.

Cynthia Rudin and Robert E. Schapire.
Margin-based ranking and an equivalence between AdaBoost
and RankBoost.
Journal of Machine Learning Research, 10:2193–2232, 2009.
Pdf.

Yongxin Taylor Xi, Zhen James Xiang, Peter J. Ramadge and
Robert E. Schapire.
Speed and sparsity of regularized boosting.
In Proceedings of the Twelfth International Conference on
Artificial Intelligence and Statistics, 2009.
Pdf.

Zafer Barutcuoglu, Edoardo M. Airoldi, Vanessa Dumeaux, Robert
E. Schapire and Olga G. Troyanskaya.
Aneuploidy prediction and tumor classification with
heterogeneous hidden conditional random fields.
Bioinformatics, 25(10):1307–1313, 2009.
Pdf.

Ioannis Avramopoulos, Jennifer Rexford and Robert Schapire.
From optimization to regret minimization and back
again.
In Proceedings of the Third Workshop on Tackling Computer
System Problems with Machine Learning Techniques, 2008.
Pdf.

Umar Syed, Michael Bowling and Robert E. Schapire.
Apprenticeship learning using linear programming.
In Proceedings of the 25th International Conference on
Machine Learning, 2008.
Pdf.

Yoav Freund and Robert E. Schapire.
Response to Mease and Wyner, "Evidence contrary to the
statistical view of boosting," JMLR 9:131–156, 2008.
Journal of Machine Learning Research, 9:171174, 2008.
Pdf.

Chris Bourke, Kun Deng, Stephen D. Scott, Robert E. Schapire and
N.V. Vinodchandran.
On reoptimizing multiclass classifiers.
Machine Learning, 71:219–242, 2008.
Pdf.

Joseph K. Bradley and Robert E. Schapire.
FilterBoost: Regression and classification on large
datasets.
In Advances in Neural Information Processing Systems
20, 2008.
Pdf.

Cynthia Rudin, Robert E. Schapire and Ingrid Daubechies.
Analysis of boosting algorithms using the smooth margin
function.
The Annals of Statistics, 35(6):2723–2768, 2007.
Pdf.

Umar Syed and Robert E. Schapire.
A game-theoretic approach to apprenticeship learning.
In Advances in Neural Information Processing Systems
20, 2008.
Pdf.

Umar Syed and Robert E. Schapire.
Imitation learning with a value-based prior.
In Uncertainty in Artificial Intelligence: Proceedings of
the Twenty-Third Conference, 2007.
Pdf.

Miroslav Dudík, David M. Blei and Robert E. Schapire.
Hierarchical maximum entropy density estimation.
In Proceedings of the 24th International Conference
on Machine Learning, 2007.
Pdf.

Cynthia Rudin, Robert E. Schapire and Ingrid Daubechies.
Precise statements of convergence for AdaBoost and arc-gv.
AMS-IMS-SIAM Joint Summer Research Conference on Machine and
Statistical Learning, Prediction and Discovery,
pages 131–145, 2007.
Pdf.

Luis E. Ortiz, Robert E. Schapire and Sham M. Kakade.
Maximum entropy correlated equilibria.
In Eleventh International Conference on
Artificial Intelligence and Statistics, 2007.
Pdf.

Miroslav Dudík, Steven J. Phillips and Robert E. Schapire.
Maximum entropy density estimation with generalized
regularization and an application to species distribution modeling.
Journal of Machine Learning Research, 8(Jun):1217–1260, 2007.
Pdf.

Lev Reyzin and Robert E. Schapire.
How boosting the margin can also boost classifier
complexity.
In Proceedings of the 23rd International Conference
on Machine Learning, 2006.
Pdf.

Amit Agarwal, Elad Hazan, Satyen Kale and Robert E. Schapire.
Algorithms for portfolio management based on the Newton
method.
In Proceedings of the 23rd International Conference
on Machine Learning, 2006.
Pdf.

Miroslav Dudík and Robert E. Schapire.
Maximum entropy distribution estimation with generalized
regularization.
In 19th Annual Conference on Learning Theory, 2006.
Pdf.

Jane Elith, Catherine H. Graham, Robert P. Anderson, Miroslav Dudík,
Simon Ferrier, Antoine Guisan, Robert J. Hijmans, Falk Huettmann,
John R. Leathwick, Anthony Lehmann, Jin Li, Lucia G. Lohmann, Bette
A. Loiselle, Glenn Manion, Craig Moritz, Miguel Nakamura, Yoshinori
Nakazawa, Jacob McC. M. Overton, A. Townsend Peterson, Steven
J. Phillips, Karen Richardson, Ricardo Scachetti-Pereira, Robert
E. Schapire, Jorge Soberón, Stephen Williams, Mary S. Wisz and
Niklaus E. Zimmermann.
Novel methods improve prediction of species' distributions from
occurrence data.
Ecography, 29:129–151, 2006.
Pdf.

Zafer Barutcuoglu, Robert E. Schapire and Olga G. Troyanskaya.
Hierarchical multi-label prediction of gene function.
Bioinformatics, 22:830–836, 2006.
Pdf.

Jordan Boyd-Graber, Christiane Fellbaum, Daniel Osherson and Robert
Schapire.
Adding dense, weighted connections to WordNet.
In Proceedings of the Third International WordNet
Conference, 2006.
Pdf.

Miroslav Dudík, Robert E. Schapire and Steven J. Phillips.
Correcting sample selection bias in maximum entropy
density estimation.
In Advances in Neural Information Processing Systems
18, 2006.
Pdf.

Aurélie C. Lozano, Sanjeev R. Kulkarni and Robert E. Schapire.
Convergence and consistency of regularized boosting
algorithms with stationary beta-mixing observations.
In Advances in Neural Information Processing Systems
18, 2006.
Pdf.

Cynthia Rudin, Corinna Cortes, Mehryar Mohri and Robert E. Schapire.
Margin-based ranking meets boosting in the middle.
In 18th Annual Conference on Computational Learning
Theory, 2005.
Pdf.

Steven J. Phillips, Robert P. Anderson and Robert E. Schapire.
Maximum entropy modeling of species geographic
distributions.
Ecological Modelling, 190:231–259, 2006.
Pdf.

Gokhan Tur, Dilek Hakkani-Tür and Robert E. Schapire.
Combining active and semi-supervised learning for spoken
language understanding.
Speech Communication, 45(2):171–186, 2005.
Pdf.

Robert E. Schapire, Marie Rochery, Mazin Rahim and Narendra Gupta.
Boosting with prior knowledge for call classification.
IEEE Transactions on Speech and Audio Processing,
13(2), March, 2005.
Pdf.

Cynthia Rudin, Ingrid Daubechies and Robert E. Schapire.
The dynamics of AdaBoost: Cyclic behavior and convergence
of margins.
Journal of Machine Learning Research, 5:1557–1595, 2004.
Pdf.

Cynthia Rudin, Robert E. Schapire and Ingrid Daubechies.
Boosting based on a smooth margin.
In 17th Annual Conference on Computational Learning
Theory, 2004.
Postscript or
gzipped postscript.

Steven J. Phillips, Miroslav Dudík and Robert E. Schapire.
A maximum entropy approach to species distribution
modeling.
In Proceedings of the Twenty-First International Conference on
Machine Learning, pages 655–662, 2004.
Pdf.

Miroslav Dudík, Steven J. Phillips and Robert E. Schapire.
Performance guarantees for regularized maximum entropy
density estimation.
In 17th Annual Conference on Learning Theory, 2004.
Postscript or
gzipped postscript.

Cynthia Rudin, Ingrid Daubechies and Robert E. Schapire.
On the dynamics of boosting.
In Advances in Neural Information Processing Systems
16, 2004.
Postscript or
gzipped postscript.

Yoav Freund, Yishay Mansour and Robert E. Schapire.
Generalization bounds for averaged classifiers.
The Annals of Statistics, 32(4):1698–1722, 2004.
Pdf.

Peter Stone, Robert E. Schapire, Michael L. Littman, János A. Csirik
and David McAllester.
Decision-theoretic bidding based on learned density models in
simultaneous, interacting auctions.
Journal of Artificial Intelligence Research,
19:209–242, 2003.
Postscript or
gzipped postscript.

Gokhan Tur, Robert E. Schapire and Dilek Hakkani-Tür.
Active learning for spoken language understanding.
In IEEE International Conference on Acoustics, Speech and Signal
Processing, 2003.
Pdf.

Yoav Freund and Robert E. Schapire.
A discussion of
"Process consistency for AdaBoost" by Wenxin Jiang,
"On the Bayes-risk consistency of regularized boosting methods" by
Gábor Lugosi and Nicolas Vayatis, and
"Statistical behavior and consistency of classification methods based
on convex risk minimization" by Tong Zhang.
The Annals of Statistics, 32(1), 2004.
Postscript or
gzipped postscript.

Robert E. Schapire.
Advances in boosting.
In Uncertainty in Artificial Intelligence: Proceedings of the
Eighteenth Conference, 2002.
Postscript or
gzipped postscript.

Giuseppe Di Fabbrizio, Dawn Dutton, Narendra Gupta, Barbara Hollister,
Mazin Rahim, Giuseppe Riccardi, Robert Schapire and Juergen
Schroeter.
AT&T help desk.
In 7th International Conference on Spoken Language
Processing, 2002.
Pdf.

Robert E. Schapire, Peter Stone, David McAllester, Michael L. Littman
and János A. Csirik.
Modeling auction price uncertainty using boostingbased
conditional density estimation.
In Machine Learning: Proceedings of the Nineteenth
International Conference, 2002.
Postscript or
gzipped postscript.

Robert E. Schapire, Marie Rochery, Mazin Rahim and Narendra Gupta.
Incorporating prior knowledge into boosting.
In Machine Learning: Proceedings of the Nineteenth
International Conference, 2002.
Postscript or
gzipped postscript.

Peter Stone, Robert E. Schapire, János A. Csirik, Michael
L. Littman and David McAllester.
ATTac2001: A learning, autonomous bidding agent.
In Agent Mediated Electronic Commerce IV: Designing
Mechanisms and Systems. Springer Verlag, 2002.
Postscript or
gzipped postscript.

Robert E. Schapire.
The boosting approach to machine learning: An
overview.
In D. D. Denison, M. H. Hansen, C. Holmes, B. Mallick and B. Yu, editors,
Nonlinear Estimation and Classification.
Springer, 2003.
Postscript or
gzipped postscript.

M. Rochery, R. Schapire, M. Rahim, N. Gupta, G. Riccardi,
S. Bangalore, H. Alshawi and S. Douglas.
Combining prior knowledge and boosting for call
classification in spoken language dialogue.
In International Conference on Acoustics, Speech and Signal
Processing, 2002.
Postscript or
gzipped postscript.

Marie Rochery, Robert Schapire, Mazin Rahim and Narendra Gupta.
BoosTexter for text categorization in spoken language
dialogue.
Accepted to Automatic Speech Recognition and Understanding
Workshop, 2001 (but withdrawn due to travel restrictions
following September 11).
Postscript or
gzipped postscript.

Michael Collins, Sanjoy Dasgupta and Robert E. Schapire.
A generalization of principal component analysis to the
exponential family.
In Advances in Neural Information Processing Systems
14, 2002.
Postscript or
gzipped postscript.

Peter Auer, Nicolò Cesa-Bianchi, Yoav Freund, and Robert E. Schapire.
The nonstochastic multiarmed bandit problem.
SIAM Journal on Computing, 32(1):48–77, 2002.
Postscript or
gzipped postscript.

David McAllester and Robert E. Schapire.
Learning theory and language modeling.
In Gerhard Lakemeyer and Bernhard
Nebel, editors, Exploring Artificial Intelligence in the New
Millennium. Morgan Kaufmann, 2002.
Postscript or
gzipped postscript.

Raj D. Iyer, David D. Lewis, Robert E. Schapire, Yoram Singer and Amit Singhal.
Boosting for document routing.
In Proceedings of the Ninth International Conference on
Information and Knowledge Management, 2000.
Postscript or
gzipped postscript.

Michael Collins, Robert E. Schapire and Yoram Singer.
Logistic regression, AdaBoost and Bregman distances.
Machine Learning, 48(1/2/3), 2002.
Postscript or
gzipped postscript.

David McAllester and Robert E. Schapire.
On the convergence rate of Good-Turing estimators.
Preliminary version appeared in Proceedings of the
Thirteenth Annual Conference on Computational Learning
Theory, 2000.
Postscript or
gzipped postscript of journal submission (5/17/01).

Erin L. Allwein, Robert E. Schapire and Yoram Singer.
Reducing multiclass to binary: A unifying approach for
margin classifiers.
Journal of Machine Learning Research, 1:113–141, 2000.
Pdf.

Yoav Freund, Yishay Mansour and Robert E. Schapire.
Why averaging classifiers can protect against
overfitting.
In Proceedings of the Eighth
International Workshop on Artificial Intelligence and
Statistics, 2001.

Yoav Freund and Robert E. Schapire.
Discussion of the paper "Additive logistic regression: a
statistical view of boosting" by Jerome Friedman, Trevor Hastie and
Robert Tibshirani.
The Annals of Statistics, 28(2):391–393, April, 2000.
Postscript or
gzipped postscript.

Robert E. Schapire.
Theoretical views of boosting and applications.
In Tenth International Conference on Algorithmic Learning
Theory, 1999.
Postscript or
gzipped postscript.

Yoav Freund and Robert E. Schapire.
A short introduction to boosting.
Journal of Japanese Society for Artificial Intelligence,
14(5):771–780, September, 1999. (Appearing in Japanese, translation
by Naoki Abe.)
Postscript or
gzipped postscript.

Robert E. Schapire.
A brief introduction to boosting.
In Proceedings of the Sixteenth International Joint
Conference on Artificial Intelligence, 1999.
Postscript or
gzipped postscript.

Robert E. Schapire.
Drifting games.
Machine Learning, 43(3):265–291, 2001.
Postscript or
gzipped postscript.

Steven Abney, Robert E. Schapire and Yoram Singer.
Boosting applied to tagging and PP attachment.
In Proceedings of the Joint SIGDAT Conference on Empirical
Methods in Natural Language Processing and Very Large Corpora, 1999.
Postscript or
gzipped postscript.

Robert E. Schapire.
Theoretical views of boosting.
In Computational Learning Theory: Fourth European
Conference, EuroCOLT'99, pages 1–10, 1999.
Postscript or
gzipped postscript.

Yoav Freund, Raj Iyer, Robert E. Schapire and Yoram Singer.
An efficient boosting algorithm for combining
preferences.
Journal of Machine Learning Research, 4:933–969, 2003.
Postscript or
compressed postscript.

Robert E. Schapire and Yoram Singer.
BoosTexter: A boostingbased system for text
categorization.
Machine Learning, 39(2/3):135–168, 2000.
Postscript or
compressed postscript.

Yoav Freund and Robert E. Schapire.
Large margin classification using the perceptron
algorithm.
Machine Learning, 37(3):277–296, 1999.
Postscript or
compressed postscript.

Robert E. Schapire and Yoram Singer.
Improved boosting algorithms using confidence-rated
predictions.
Machine Learning, 37(3):297–336, 1999.
Postscript or
compressed postscript.

Robert E. Schapire, Yoram Singer and Amit Singhal.
Boosting and Rocchio applied to text filtering.
In SIGIR '98: Proceedings of the 21st Annual International
Conference on Research and Development in Information
Retrieval, pages 215–223, 1998.
Postscript or
compressed postscript.

Yoav Freund and Robert E. Schapire.
Adaptive game playing using multiplicative weights.
Games and Economic Behavior, 29:79–103, 1999.
Pdf.

William W. Cohen, Robert E. Schapire and Yoram Singer.
Learning to order things.
Journal of Artificial Intelligence Research,
10:243–270, 1999.
Postscript or
compressed postscript.

Robert E. Schapire, Yoav Freund, Peter Bartlett and Wee Sun Lee.
Boosting the margin: A new explanation for the
effectiveness of voting methods.
The Annals of Statistics, 26(5):1651–1686, 1998.
Postscript or
compressed postscript.

Yoav Freund and Robert E. Schapire.
Discussion of the paper "Arcing Classifiers" by Leo
Breiman.
The Annals of Statistics, 26(3):824–832, 1998.
Postscript or
compressed postscript.

Robert E. Schapire.
Using output codes to boost multiclass learning problems.
In Machine Learning: Proceedings of the Fourteenth International
Conference, pages 313–321, 1997.
Postscript or
compressed postscript.

Yoav Freund, Robert E. Schapire, Yoram Singer and Manfred K. Warmuth.
Using and combining predictors that specialize.
In Proceedings of the Twenty-Ninth Annual ACM Symposium on
the Theory of Computing, pages 334–343, 1997.
Postscript or
compressed postscript.

Yoav Freund and Robert E. Schapire.
Experiments with a new boosting algorithm.
In Machine Learning: Proceedings of the Thirteenth International
Conference, pages 148–156, 1996.
Postscript or
compressed postscript.

Yoav Freund and Robert E. Schapire.
Game theory, on-line prediction and boosting.
In Proceedings of the Ninth Annual Conference on Computational
Learning Theory, pages 325–332, 1996.
Pdf.

David D. Lewis, Robert E. Schapire, James P. Callan, and Ron Papka.
Training algorithms for linear text classifiers.
In SIGIR '96: Proceedings of the 19th Annual International
Conference on Research and Development in Information Retrieval, 1996.
Postscript or
compressed postscript.

David P. Helmbold, Robert E. Schapire, Yoram Singer, and Manfred K. Warmuth.
On-line portfolio selection using multiplicative updates.
Mathematical Finance, 8(4):325–347, 1998.
Postscript or
compressed postscript.

Peter Auer, Nicolò Cesa-Bianchi, Yoav Freund, and Robert E. Schapire.
Gambling in a rigged casino: The adversarial multi-armed bandit
problem.
Extended abstract appeared in 36th Annual Symposium on Foundations
of Computer Science, pages 322–331, 1995.
Postscript or
compressed postscript of significantly revised journal submission (6/8/98).

Yoav Freund, Michael Kearns, Yishay Mansour, Dana Ron, Ronitt Rubinfeld, and
Robert E. Schapire.
Efficient algorithms for learning to play repeated games against
computationally bounded adversaries.
In 36th Annual Symposium on Foundations of Computer Science,
pages 332–341, 1995.
Postscript or
compressed postscript.

David P. Helmbold and Robert E. Schapire.
Predicting nearly as well as the best pruning of a decision
tree.
Machine Learning, 27(1):51–68, 1997.
Postscript or
compressed postscript.

David P. Helmbold, Robert E. Schapire, Yoram Singer, and Manfred K. Warmuth.
A comparison of new and old algorithms for a mixture estimation
problem.
Machine Learning, 27(1):97–119, 1997.
Postscript or
compressed postscript.

Yoav Freund and Robert E. Schapire.
A decision-theoretic generalization of on-line learning and an
application to boosting.
Journal of Computer and System Sciences, 55(1):119–139, 1997.
Postscript or
compressed postscript.

Robert E. Schapire and Manfred K. Warmuth.
On the worst-case analysis of temporal-difference learning
algorithms.
Machine Learning, 22(1/2/3):95–121, 1996.
Postscript or
compressed postscript.

Michael Kearns, Yishay Mansour, Dana Ron, Ronitt Rubinfeld, Robert E. Schapire,
and Linda Sellie.
On the learnability of discrete distributions.
In Proceedings of the Twenty-Sixth Annual ACM Symposium on the
Theory of Computing, pages 273–282, 1994.
Postscript or
compressed postscript.

Robert E. Schapire and Linda M. Sellie.
Learning sparse multivariate polynomials over a field with queries
and counterexamples.
Journal of Computer and System Sciences,
52(2):201–213, April, 1996.
Postscript or
compressed postscript.

Yoav Freund, Michael Kearns, Dana Ron, Ronitt Rubinfeld, Robert E. Schapire,
and Linda Sellie.
Efficient learning of typical finite automata from random
walks.
Information and Computation,
138(1):23–48, 1997.
Pdf.

Nicolò CesaBianchi, Yoav Freund, David P. Helmbold, David Haussler,
Robert E. Schapire, and Manfred K. Warmuth.
How to use expert advice.
Journal of the Association for Computing Machinery,
44(3):427–485, 1997.
Postscript or
compressed postscript.

Harris Drucker, Robert Schapire, and Patrice Simard.
Boosting performance in neural networks.
International Journal of Pattern Recognition and Artificial
Intelligence, 7(4):705–719, 1993.

Harris Drucker, Robert Schapire, and Patrice Simard.
Improving performance in neural networks using a boosting
algorithm.
In Advances in Neural Information Processing Systems 5, pages
42–49. Morgan Kaufmann, 1993.

Michael J. Kearns, Robert E. Schapire, and Linda M. Sellie.
Toward efficient agnostic learning.
Machine Learning, 17:115–141, 1994.
Pdf.

David Haussler, Michael Kearns, Manfred Opper, and Robert Schapire.
Estimating average-case learning curves using Bayesian,
statistical physics and VC dimension methods.
In Advances in Neural Information Processing Systems 4, pages
855–862. Morgan Kaufmann, 1992.

Robert E. Schapire.
Learning probabilistic read-once formulas on product
distributions.
Machine Learning, 14(1):47–81, 1994.
Pdf.

David Haussler, Michael Kearns, and Robert E. Schapire.
Bounds on the sample complexity of Bayesian learning using
information theory and the VC dimension.
Machine Learning, 14:83–113, 1994.
Pdf.

Robert E. Schapire.
The Design and Analysis of Efficient Learning
Algorithms.
MIT Press, 1992.

Sally A. Goldman, Michael J. Kearns, and Robert E. Schapire.
Exact identification of read-once formulas using fixed points of
amplification functions.
SIAM Journal on Computing, 22(4):705–726, August 1993.

Michael J. Kearns and Robert E. Schapire.
Efficient distribution-free learning of probabilistic
concepts.
Journal of Computer and System Sciences, 48(3):464–497, 1994.
Pdf.

Sally A. Goldman, Michael J. Kearns, and Robert E. Schapire.
On the sample complexity of weak learning.
Information and Computation, 117(2):276–287, March 1995.
Pdf.

Robert E. Schapire.
The emerging theory of average-case complexity.
Technical Memo MIT/LCS/TM-431, MIT Laboratory for Computer
Science, June 1990.

Sally A. Goldman, Ronald L. Rivest, and Robert E. Schapire.
Learning binary relations and total orders.
SIAM Journal on Computing, 22(5):1006–1034, 1993.

Robert E. Schapire.
The strength of weak learnability.
Machine Learning, 5(2):197–227, 1990.
Pdf.

Robert E. Schapire.
Pattern languages are not learnable.
In Proceedings of the Third Annual Workshop on Computational
Learning Theory, pages 122–129, August 1990.

Ronald L. Rivest and Robert E. Schapire.
Diversity-based inference of finite automata.
Journal of the Association for Computing Machinery,
41(3):555–589, May 1994.
Pdf.

Ronald L. Rivest and Robert E. Schapire.
Inference of finite automata using homing sequences.
Information and Computation, 103(2):299–347, April 1993.
Pdf.

Ronald L. Rivest and Robert E. Schapire.
A new approach to unsupervised learning in deterministic
environments.
In Yves Kodratoff and Ryszard Michalski, editors, Machine Learning:
An Artificial Intelligence Approach, volume III, pages 670–684. Morgan
Kaufmann, 1990.
Free software for viewing postscript files is available
here.