Encyclopedia of Machine Learning and Data Mining

2017 Edition
| Editors: Claude Sammut, Geoffrey I. Webb

AdaBoost

Reference work entry
DOI: https://doi.org/10.1007/978-1-4899-7687-1_917

AdaBoost is an ensemble learning technique and the best-known member of the boosting family of algorithms. It trains models sequentially, one new model per round. At the end of each round, the misclassified examples are identified and their weights in the training set are increased; this reweighted training set is then fed into the next round, so that the new model concentrates on the examples its predecessors got wrong. In this way, later models compensate for the errors of earlier ones, and the final prediction is formed by a weighted vote over all of the models. See ensemble learning for full details.
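
The reweighting loop described above can be made concrete with a short program. The following is a minimal sketch of discrete AdaBoost using decision stumps as the base models; the stump learner, the toy data, and names such as fit_stump and adaboost are illustrative assumptions rather than part of this entry, and a production implementation would differ.

import numpy as np


def fit_stump(X, y, w):
    # Find the decision stump (feature, threshold, polarity) with lowest weighted error.
    n, d = X.shape
    best = (np.inf, 0, 0.0, 1)  # (error, feature index, threshold, polarity)
    for j in range(d):
        for thr in np.unique(X[:, j]):
            for polarity in (1, -1):
                pred = np.where(polarity * (X[:, j] - thr) >= 0, 1, -1)
                err = np.sum(w[pred != y])
                if err < best[0]:
                    best = (err, j, thr, polarity)
    return best


def adaboost(X, y, n_rounds=10):
    # Train n_rounds stumps sequentially; labels y must be in {-1, +1}.
    n = len(y)
    w = np.full(n, 1.0 / n)  # start with uniform example weights
    ensemble = []
    for _ in range(n_rounds):
        err, j, thr, pol = fit_stump(X, y, w)
        err = max(err, 1e-10)  # guard against division by zero
        alpha = 0.5 * np.log((1 - err) / err)  # model weight: lower error, larger say
        pred = np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
        # Increase the weights of misclassified examples and decrease the rest,
        # so the next round's stump focuses on the current errors.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        ensemble.append((alpha, j, thr, pol))
    return ensemble


def predict(ensemble, X):
    # Final prediction is a weighted vote of all stumps.
    score = sum(alpha * np.where(pol * (X[:, j] - thr) >= 0, 1, -1)
                for alpha, j, thr, pol in ensemble)
    return np.sign(score)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 2))
    y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)  # toy labels for illustration only
    model = adaboost(X, y, n_rounds=20)
    print("training accuracy:", np.mean(predict(model, X) == y))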

Copyright information

© Springer Science+Business Media New York 2017