Abstract
Neural network ensembles are a recently developed technique in which several neural networks are trained and their predictions combined. An ensemble significantly improves the generalization ability of a neural network system and eases the trial-and-error process of tuning network architectures, but it is time-consuming to train. To overcome this disadvantage, this paper proposes a parallel-distributed neural network ensemble, PDNNE. The design and implementation of PDNNE are presented through a discussion of the main issues: partitioning, communication, and the component neural networks. Experiments show that both generalization ability and time efficiency are significantly improved.
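The PDNNE implementation itself is not reproduced on this page. As a rough illustration of the general idea the abstract describes (training several component networks in parallel on resampled data and combining their predictions), here is a minimal Python sketch. All names (train_component, ensemble_predict) and details (bagging-style bootstrap resampling, one process per component network, a simple averaging combiner) are assumptions made for illustration, not the authors' design.

# Illustrative sketch only: a process-parallel neural network ensemble in the
# spirit of PDNNE. Details here are assumptions, not the paper's code.
# Each worker process trains one small component network on a bootstrap
# resample of the training set; the ensemble averages the component outputs.
import numpy as np
from multiprocessing import Pool

def train_component(args):
    seed, X, y, hidden, epochs, lr = args
    rng = np.random.default_rng(seed)
    idx = rng.integers(0, len(X), len(X))        # bootstrap resample (bagging)
    Xb, yb = X[idx], y[idx]
    W1 = rng.normal(0, 0.5, (X.shape[1], hidden))
    W2 = rng.normal(0, 0.5, (hidden, 1))
    for _ in range(epochs):                      # full-batch gradient descent
        h = np.tanh(Xb @ W1)                     # hidden layer activations
        err = h @ W2 - yb                        # output error
        W2 -= lr * h.T @ err / len(Xb)
        dh = (err @ W2.T) * (1 - h ** 2)         # backprop through tanh
        W1 -= lr * Xb.T @ dh / len(Xb)
    return W1, W2

def ensemble_predict(members, X):
    preds = [np.tanh(X @ W1) @ W2 for W1, W2 in members]
    return np.mean(preds, axis=0)                # simple averaging combiner

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, (200, 2))
    y = np.sin(3 * X[:, :1]) + 0.1 * rng.normal(size=(200, 1))
    jobs = [(s, X, y, 8, 500, 0.1) for s in range(4)]  # 4 component networks
    with Pool(4) as pool:                        # one process per component
        members = pool.map(train_component, jobs)
    mse = float(np.mean((ensemble_predict(members, X) - y) ** 2))
    print("ensemble MSE:", mse)

Because the component networks train independently, the wall-clock cost of the ensemble approaches that of a single network as worker processes are added, which is the time-efficiency argument the abstract makes; the paper's actual partitioning and communication scheme is not captured by this sketch.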
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Liu, Y., Li, Y., Zhang, B., Wu, G. (2005). Design & Implementation of the Parallel-distributed Neural Network Ensemble. In: Zhang, W., Tong, W., Chen, Z., Glowinski, R. (eds) Current Trends in High Performance Computing and Its Applications. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-27912-1_10