An asymptotic analysis of AdaBoost in the binary classification case

  • T. Onoda
  • G. Rätsch
  • K.-R. Müller
Conference paper
Part of the Perspectives in Neural Computing book series (PERSPECT.NEURAL)

Abstract

Recent work has shown that combining multiple weak classifiers, such as decision trees or neural networks, reduces the test set error. To study this effect in greater detail, we analyze the asymptotic behavior of AdaBoost-type algorithms. The theoretical analysis establishes the relation between the distribution of the margins of the training examples and the generated voting classification rule. Asymptotic experimental results for the binary classification case underline the theoretical findings. Finally, we discuss the relation between model complexity and noise in the training data, and how AdaBoost-type algorithms can be improved in practice.
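
To make the central quantity concrete, the following Python sketch (not taken from the paper; the decision-stump weak learner and all function names are assumptions made here for illustration) runs discrete AdaBoost on binary labels y in {-1, +1} and returns the normalized margin y·f(x)/Σ α of every training example, the quantity whose distribution the analysis relates to the generated voting classification rule.

    import numpy as np

    def train_stump(X, y, w):
        """Weighted decision stump: pick the feature, threshold and sign
        minimising the weighted training error under the distribution w."""
        best = (0, 0.0, 1, np.inf)                 # (feature, threshold, sign, error)
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                for s in (1, -1):
                    pred = np.where(X[:, j] <= t, s, -s)
                    err = w[pred != y].sum()
                    if err < best[3]:
                        best = (j, t, s, err)
        return best

    def stump_predict(stump, X):
        j, t, s, _ = stump
        return np.where(X[:, j] <= t, s, -s)

    def adaboost_margins(X, y, n_rounds=50):
        """Discrete AdaBoost; returns the normalised margins y*f(x)/sum(alpha)."""
        n = len(y)
        w = np.full(n, 1.0 / n)                    # weight distribution over examples
        stumps, alphas = [], []
        for _ in range(n_rounds):
            stump = train_stump(X, y, w)
            pred = stump_predict(stump, X)
            err = stump[3]
            if err >= 0.5:                         # weak learner no better than chance
                break
            alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
            w *= np.exp(-alpha * y * pred)         # emphasise misclassified examples
            w /= w.sum()
            stumps.append(stump)
            alphas.append(alpha)
        f = sum(a * stump_predict(s, X) for a, s in zip(alphas, stumps))
        return y * f / sum(alphas)                 # each margin lies in [-1, 1]

On separable data one can observe these margins being pushed towards large positive values as the number of boosting rounds grows, which is the kind of asymptotic behavior the paper studies.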

Copyright information

© Springer-Verlag London 1998

Authors and Affiliations

  1. GMD FIRST, Berlin, Germany
