Abstract
Bagging and boosting can be used to improve the performance of weak classifiers. Both techniques are based on combining classifiers; usually, a simple or weighted majority vote serves as the combining rule. However, other combining rules, such as the mean and the product, are possible. In this paper, we study bagging and boosting in Linear Discriminant Analysis (LDA) and the role the combining rule plays in each. Simulation studies, carried out on two artificial data sets and one real data set, show that bagging and boosting can be useful in LDA: bagging for critical training sample sizes and boosting for large training sample sizes. In contrast to a common opinion, we demonstrate that the usefulness of boosting does not directly depend on the instability of the classifier. It is also shown that the choice of the combining rule may affect the performance of bagging and boosting.
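The combining rules mentioned above differ in how they fuse the outputs of the individual classifiers in the ensemble. As a minimal sketch (not the paper's implementation), assuming each base classifier returns posterior-probability estimates per class, the vote, mean, and product rules can be written as:

```python
import numpy as np

def combine(posteriors, rule="vote"):
    """Combine per-classifier posterior estimates for one sample.

    posteriors: array of shape (n_classifiers, n_classes).
    Returns the index of the winning class under the chosen rule.
    """
    posteriors = np.asarray(posteriors, dtype=float)
    if rule == "vote":      # majority vote over each classifier's crisp decision
        votes = np.bincount(posteriors.argmax(axis=1),
                            minlength=posteriors.shape[1])
        return int(votes.argmax())
    if rule == "mean":      # average the posteriors, then decide
        return int(posteriors.mean(axis=0).argmax())
    if rule == "product":   # multiply the posteriors, then decide
        return int(posteriors.prod(axis=0).argmax())
    raise ValueError(f"unknown rule: {rule}")

# Three classifiers, two classes: two weakly prefer class 0,
# one is confident about class 1.
p = [[0.6, 0.4],
     [0.6, 0.4],
     [0.1, 0.9]]
print(combine(p, "vote"))     # majority vote picks class 0
print(combine(p, "mean"))     # mean posterior [0.43, 0.57] picks class 1
print(combine(p, "product"))  # product [0.036, 0.144] picks class 1
```

The example illustrates why the choice of rule matters: the same set of base-classifier outputs yields different ensemble decisions under the vote rule than under the mean or product rules, because the latter weight confident classifiers more heavily.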
Copyright information
© 2000 Springer-Verlag Berlin Heidelberg
Cite this paper
Skurichina, M., Duin, R.P.W. (2000). The Role of Combining Rules in Bagging and Boosting. In: Ferri, F.J., Iñesta, J.M., Amin, A., Pudil, P. (eds) Advances in Pattern Recognition. SSPR/SPR 2000. Lecture Notes in Computer Science, vol 1876. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-44522-6_65
Print ISBN: 978-3-540-67946-2
Online ISBN: 978-3-540-44522-7