Robust Ensemble Learning for Data Mining
We propose a new boosting algorithm which, similarly to ν-Support-Vector Classification, allows a pre-specified fraction ν of points to lie within the margin area or even on the wrong side of the decision boundary. This gives a nicely interpretable way of controlling the trade-off between minimizing training error and capacity. Furthermore, the algorithm can act as a filter for finding and selecting informative patterns in a database.
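The ν-property the abstract refers to can be illustrated with ν-SVC itself rather than the proposed boosting algorithm (which the paper develops): ν upper-bounds the fraction of margin errors and lower-bounds the fraction of support vectors. A minimal sketch using scikit-learn's `NuSVC` (an assumption of tooling, not part of the paper):

```python
# Sketch of the nu-property using nu-SVC (scikit-learn's NuSVC), not the
# paper's boosting algorithm: nu is an upper bound on the fraction of
# margin errors and a lower bound on the fraction of support vectors.
import numpy as np
from sklearn.svm import NuSVC

rng = np.random.default_rng(0)
# Two overlapping Gaussian classes, so some margin errors are unavoidable.
X = np.vstack([rng.normal(-1.0, 1.0, (100, 2)),
               rng.normal(1.0, 1.0, (100, 2))])
y = np.array([-1] * 100 + [1] * 100)

for nu in (0.1, 0.3, 0.5):
    clf = NuSVC(nu=nu, kernel="linear").fit(X, y)
    frac_sv = clf.support_.size / len(X)
    # The support-vector fraction is at least nu (small numerical slack).
    assert frac_sv >= nu - 0.05
    print(f"nu={nu:.1f}  support-vector fraction={frac_sv:.2f}")
```

Raising ν admits more points into the margin area, trading training error against capacity in the directly interpretable way the abstract describes.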