Encyclopedia of Machine Learning

2010 Edition
| Editors: Claude Sammut, Geoffrey I. Webb


Reference work entry
DOI: https://doi.org/10.1007/978-0-387-30164-8_8

AdaBoost is an ensemble learning technique, and the best-known member of the Boosting family of algorithms. The algorithm trains models sequentially, with a new model trained at each round. At the end of each round, misclassified examples are identified and their weights are increased in the training set, which is then fed into the next round, where a new model is trained. The idea is that each subsequent model should compensate for the errors made by its predecessors. See ensemble learning for full details.
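The round-by-round reweighting described above can be sketched in plain Python, using one-dimensional threshold "decision stumps" as the weak models. This is a minimal illustration, not the entry's own presentation; all function names and the toy dataset below are illustrative assumptions.

```python
import math

def train_stump(X, y, w):
    # Exhaustively pick the (threshold, polarity) stump with lowest
    # weighted error on the current example weights w.
    best = None
    for thresh in sorted(set(X)):
        for polarity in (1, -1):
            preds = [polarity if x < thresh else -polarity for x in X]
            err = sum(wi for wi, p, yi in zip(w, preds, y) if p != yi)
            if best is None or err < best[0]:
                best = (err, thresh, polarity)
    return best  # (weighted error, threshold, polarity)

def stump_predict(x, thresh, polarity):
    return polarity if x < thresh else -polarity

def adaboost(X, y, rounds=3):
    n = len(X)
    w = [1.0 / n] * n          # start with uniform example weights
    ensemble = []              # one (alpha, thresh, polarity) per round
    for _ in range(rounds):
        err, thresh, polarity = train_stump(X, y, w)
        err = max(err, 1e-10)  # guard against a perfect stump
        alpha = 0.5 * math.log((1 - err) / err)  # model's vote weight
        ensemble.append((alpha, thresh, polarity))
        # Increase emphasis on misclassified examples,
        # decrease it on correctly classified ones.
        w = [wi * math.exp(-alpha * yi * stump_predict(xi, thresh, polarity))
             for wi, xi, yi in zip(w, X, y)]
        total = sum(w)
        w = [wi / total for wi in w]  # renormalise to a distribution
    return ensemble

def predict(ensemble, x):
    # Weighted vote of all models trained so far.
    score = sum(a * stump_predict(x, t, p) for a, t, p in ensemble)
    return 1 if score >= 0 else -1
```

On a toy labelling such as `X = [1, 2, 3, 4, 5, 6, 7, 8]` with `y = [1, 1, 1, -1, -1, -1, 1, 1]`, no single stump is error-free, but three boosting rounds yield an ensemble that classifies every point correctly: later stumps concentrate on the examples the first stump got wrong.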

Cross References

Ensemble Learning

Copyright information

© Springer Science+Business Media, LLC 2011