
AdaBoost

Synonyms

Adaptive boosting; Discrete AdaBoost

Definition

AdaBoost is an algorithm that builds a classifier as an additive combination of weak classifiers. The weak classifiers are incorporated sequentially, one at a time, so that each addition further reduces the empirical exponential loss.
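
To make the definition concrete, here is a minimal sketch of discrete AdaBoost using decision stumps as the weak classifiers. It is an illustration under assumptions made here, not pseudocode from this entry: the helper names fit_stump, adaboost, and predict, the use of NumPy, and the choice of stumps are all hypothetical. At each round t the algorithm fits a stump to the weighted examples, assigns it the weight alpha_t = (1/2) ln((1 - err_t) / err_t), and re-weights the examples so that misclassified ones count more in the next round; this greedy, stage-wise procedure is what drives down the empirical exponential loss (1/N) sum_i exp(-y_i F(x_i)) mentioned in the definition [1].

```python
import numpy as np

def fit_stump(X, y, w):
    """Exhaustive search for the decision stump (feature, threshold,
    polarity) with the lowest weighted classification error."""
    best = (np.inf, 0, 0.0, 1)  # (error, feature, threshold, polarity)
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[0]:
                    best = (err, j, thr, pol)
    return best

def adaboost(X, y, T=20):
    """Discrete AdaBoost for labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                    # start with uniform weights
    ensemble = []
    for _ in range(T):
        err, j, thr, pol = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)   # guard the logarithm
        alpha = 0.5 * np.log((1 - err) / err)  # weight of this weak classifier
        pred = np.where(pol * (X[:, j] - thr) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)         # up-weight the mistakes
        w /= w.sum()                           # renormalize to a distribution
        ensemble.append((alpha, j, thr, pol))
    return ensemble

def predict(ensemble, X):
    """Sign of the weighted vote sum_t alpha_t * h_t(x)."""
    score = np.zeros(len(X))
    for alpha, j, thr, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - thr) > 0, 1, -1)
    return np.sign(score)
```

The resulting strong classifier is H(x) = sign(sum_t alpha_t h_t(x)); any weak learner that does better than chance on the weighted sample can stand in for the stump.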

Background

Boosting is a procedure that combines several weakly performing classifiers into one with arbitrarily high performance [1, 2]; it was originally introduced by Robert Schapire in the machine learning community [3]. AdaBoost is a popular implementation of boosting for binary classification [4]. The enthusiasm that boosting, and AdaBoost in particular, generated in machine learning is captured by Breiman's remark, quoted in [1], that AdaBoost with trees is the "best off-the-shelf classifier in the world." In practice, much of AdaBoost's popularity is due both to its performance, which is in the same league as that of support vector machines [5], and to its algorithmic simplicity. In the computer...

References

  1. Friedman JH, Hastie T, Tibshirani R (2000) Additive logistic regression: a statistical view of boosting. Ann Stat 28(2):337–374

  2. Hastie T, Tibshirani R, Friedman J (2001) The elements of statistical learning. Springer, New York

  3. Schapire RE (1990) The strength of weak learnability. Mach Learn 5(2):197–227

  4. Freund Y, Schapire RE (1996) Experiments with a new boosting algorithm. In: Proceedings of the 13th international conference on machine learning, Bari, pp 148–156

  5. Vapnik V (1995) The nature of statistical learning theory. Springer, New York

  6. Viola P, Jones M (2001) Robust real-time object detection. In: Proceedings of the IEEE workshop on statistical and computational theories of vision, Vancouver

  7. Papageorgiou CP, Oren M, Poggio T (1998) A general framework for object detection. In: International conference on computer vision, Bombay, pp 555–562

  8. Sun Y, Li J, Hager W (2004) Two new regularized AdaBoost algorithms. In: Machine learning and applications, Louisville, pp 41–48

  9. Schapire RE, Singer Y (1998) Improved boosting algorithms using confidence-rated predictions. In: Computational learning theory. Springer, New York, pp 80–91

Copyright information

© 2014 Springer Science+Business Media New York

Cite this entry

Favaro, P., Vedaldi, A. (2014). AdaBoost. In: Ikeuchi, K. (eds) Computer Vision. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-31439-6_663
