Abstract
Recent work has shown that combining multiple versions of weak classifiers, such as decision trees or neural networks, reduces test-set error. To study this in greater detail, we analyze the asymptotic behavior of AdaBoost-type algorithms. The theoretical analysis establishes the relation between the distribution of margins of the training examples and the generated voting classification rule. We present asymptotic experimental results for the binary classification case that underline the theoretical findings. Finally, we discuss the relation between model complexity and noise in the training data, and how AdaBoost-type algorithms can be improved in practice.
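The margin quantity the abstract refers to can be made concrete with a small sketch. The following is a minimal AdaBoost implementation on one-dimensional data with threshold stumps as weak learners; the stump form, toy data, and function names are illustrative assumptions, not the paper's experimental setup. For each training example it computes the normalized margin y f(x) / Σ_t |α_t| ∈ [−1, 1], whose distribution over the training set is the object the analysis studies.

```python
import numpy as np

def adaboost_stumps(X, y, T=10):
    """AdaBoost on 1-D inputs X with labels y in {-1, +1}.

    Weak learners are threshold stumps h(x) = s * sign(x - thr).
    Returns the chosen stumps and their weights alpha_t.
    """
    n = len(X)
    w = np.full(n, 1.0 / n)            # example weights, kept normalized
    stumps, alphas = [], []
    for _ in range(T):
        best = None
        # exhaustive search over thresholds (at data points) and polarities
        for thr in X:
            for s in (1, -1):
                pred = s * np.sign(X - thr + 1e-12)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, thr, s)
        err, thr, s = best
        err = min(max(err, 1e-10), 1 - 1e-10)          # avoid log(0)
        alpha = 0.5 * np.log((1 - err) / err)          # weak-learner weight
        pred = s * np.sign(X - thr + 1e-12)
        w *= np.exp(-alpha * y * pred)                 # reweight examples
        w /= w.sum()
        stumps.append((thr, s))
        alphas.append(alpha)
    return stumps, np.array(alphas)

def margins(X, y, stumps, alphas):
    """Normalized margins y*f(x)/sum|alpha|; positive means correct."""
    f = sum(a * s * np.sign(X - thr + 1e-12)
            for (thr, s), a in zip(stumps, alphas))
    return y * f / np.abs(alphas).sum()

# toy linearly separable data (hypothetical example)
X = np.array([-2.0, -1.0, -0.5, 0.5, 1.0, 2.0])
y = np.array([-1, -1, -1, 1, 1, 1])
stumps, alphas = adaboost_stumps(X, y, T=5)
m = margins(X, y, stumps, alphas)
```

On separable data such as this, all normalized margins become positive, and boosting tends to drive the minimum margin up over iterations, which is the connection between the margin distribution and the voting rule that the paper analyzes asymptotically.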
© 1998 Springer-Verlag London
Cite this paper
Onoda, T., Rätsch, G., Müller, KR. (1998). An asymptotic analysis of AdaBoost in the binary classification case. In: Niklasson, L., Bodén, M., Ziemke, T. (eds) ICANN 98. ICANN 1998. Perspectives in Neural Computing. Springer, London. https://doi.org/10.1007/978-1-4471-1599-1_26
Publisher Name: Springer, London
Print ISBN: 978-3-540-76263-8
Online ISBN: 978-1-4471-1599-1