Abstract
To improve the generalization ability of neural network ensembles, a selective ensemble method based on clustering is proposed. The method follows the overproduce-and-choose paradigm: it first produces a large number of individual networks and then clusters these networks according to their diversity. Networks with the highest classification accuracy in each cluster are selected for the final integration. Experiments on ten UCI data sets show that the proposed algorithm outperforms two similar ensemble learning algorithms.
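The abstract outlines an overproduce-and-choose pipeline: train many networks, measure their pairwise diversity, cluster them so that similar networks fall together, keep the most accurate network from each cluster, and combine the survivors. The preview gives no further detail, so the following Python sketch is only one plausible instantiation: bootstrap resampling for overproduction, pairwise disagreement as the diversity measure, agglomerative clustering with five clusters, and majority voting are all illustrative assumptions here, not the authors' confirmed choices.

import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

# Toy data standing in for a UCI classification task.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# 1. Overproduce: train many networks on bootstrap resamples (assumption;
#    the paper may instead vary initial weights or training parameters).
n_networks = 20
rng = np.random.default_rng(0)
nets = []
for i in range(n_networks):
    idx = rng.integers(0, len(X_train), len(X_train))
    net = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500, random_state=i)
    nets.append(net.fit(X_train[idx], y_train[idx]))

# 2. Diversity: pairwise disagreement on a validation set (assumption).
preds = np.array([net.predict(X_val) for net in nets])
disagreement = np.array([[np.mean(p != q) for q in preds] for p in preds])

# 3. Cluster the networks by diversity; low-disagreement pairs group together.
#    (On scikit-learn < 1.2 use affinity="precomputed" instead of metric=.)
labels = AgglomerativeClustering(
    n_clusters=5, metric="precomputed", linkage="average"
).fit_predict(disagreement)

# 4. Choose: keep the most accurate network from each cluster.
accs = np.array([np.mean(p == y_val) for p in preds])
selected = [
    int(np.flatnonzero(labels == c)[np.argmax(accs[labels == c])])
    for c in np.unique(labels)
]

# 5. Integrate the selected networks by majority vote (binary labels here).
votes = preds[selected]
ensemble_pred = (votes.mean(axis=0) > 0.5).astype(int)
print("selected networks:", selected)
print("ensemble accuracy: %.3f" % np.mean(ensemble_pred == y_val))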
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
Cite this paper
Chen, H., Yuan, S., Jiang, K.: Selective Neural Network Ensemble Based on Clustering. In: Wang, J., Yi, Z., Zurada, J.M., Lu, B.-L., Yin, H. (eds.) Advances in Neural Networks - ISNN 2006. Lecture Notes in Computer Science, vol. 3971. Springer, Berlin, Heidelberg (2006)
DOI: https://doi.org/10.1007/11759966_81
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-34439-1
Online ISBN: 978-3-540-34440-7