An asymptotic analysis of AdaBoost in the binary classification case

Conference paper in ICANN 98 (ICANN 1998)

Part of the book series: Perspectives in Neural Computing

Abstract

Recent work has shown that combining multiple versions of weak classifiers, such as decision trees or neural networks, reduces test set error. To study this effect in greater detail, we analyze the asymptotic behavior of AdaBoost-type algorithms. The theoretical analysis establishes the relation between the distribution of margins of the training examples and the generated voting classification rule. Asymptotic experimental results for the binary classification case underline the theoretical findings. Finally, we discuss the relation between model complexity and noise in the training data, and how AdaBoost-type algorithms can be improved in practice.
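The margin quantity at the center of this analysis is easy to make concrete. Below is a minimal sketch of standard binary AdaBoost with decision stumps, a textbook formulation rather than the authors' own code: the toy data, function names, and the exhaustive stump learner are illustrative assumptions. It computes the normalized vote f(x) and the margins y_i f(x_i) of the training examples, whose distribution the abstract refers to.

```python
import numpy as np

def train_adaboost(X, y, n_rounds=50):
    """Binary AdaBoost with decision stumps; labels y must be in {-1, +1}.

    Illustrative sketch only -- not the authors' implementation.
    """
    n = len(y)
    w = np.full(n, 1.0 / n)          # example weights, updated each round
    stumps = []                      # list of (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        # Exhaustively pick the stump minimizing weighted training error.
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] > thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, pol, pred)
        err, j, thr, pol, pred = best
        err = min(max(err, 1e-10), 1 - 1e-10)    # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)    # weight of this hypothesis
        stumps.append((j, thr, pol, alpha))
        w *= np.exp(-alpha * y * pred)           # up-weight misclassified examples
        w /= w.sum()
    return stumps

def decision_function(stumps, X):
    """Normalized vote f(x) in [-1, 1]; the margin of example i is y_i * f(x_i)."""
    f = sum(a * p * np.where(X[:, j] > t, 1, -1) for j, t, p, a in stumps)
    return f / sum(a for *_, a in stumps)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)   # linearly separable toy data
    stumps = train_adaboost(X, y, n_rounds=30)
    margins = y * decision_function(stumps, X)
    print("smallest training margin:", margins.min())
```

On separable toy data, the smallest training margin typically grows with the number of boosting rounds; the paper's asymptotic analysis concerns the limit behavior of this margin distribution.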

Author information

Correspondence to T. Onoda.

Copyright information

© 1998 Springer-Verlag London

About this paper

Cite this paper

Onoda, T., Rätsch, G., Müller, KR. (1998). An asymptotic analysis of AdaBoost in the binary classification case. In: Niklasson, L., Bodén, M., Ziemke, T. (eds) ICANN 98. ICANN 1998. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-1599-1_26

  • DOI: https://doi.org/10.1007/978-1-4471-1599-1_26

  • Publisher Name: Springer, London

  • Print ISBN: 978-3-540-76263-8

  • Online ISBN: 978-1-4471-1599-1
