Abstract
This paper describes two methods for generating diverse neural networks in an ensemble. The first is based on negative correlation learning. The second combines cross-validation-style resampling with negative correlation learning, i.e., bagging with negative correlation learning. In negative correlation learning, all individual networks are trained simultaneously on the same training set. In bagging with negative correlation learning, each individual network is trained on a different data set sampled with replacement from the training set. The performance and correct response sets of the two learning methods are compared. The purpose of this paper is to find out how to design more effective neural network ensembles.
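The core idea of negative correlation learning can be sketched in a few lines. The following is an illustrative toy implementation, not the paper's experimental setup: it uses fixed random-feature networks with a trainable linear output layer as a stand-in for the multilayer perceptrons trained by backpropagation in the paper, and a hypothetical 1-D regression task. The gradient for each member follows the standard NCL form, where the error term (F_i − y) is combined with a penalty gradient −λ(F_i − F̄) that pushes each member's output away from the ensemble mean, decorrelating the members' errors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy regression task: y = sin(x) + noise (not from the paper).
N = 200
X = rng.uniform(-3, 3, size=(N, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=N)

# Ensemble of M members. Each member is a random-feature network:
# fixed random tanh hidden layer, trainable linear output weights V[i].
M, H = 4, 30          # ensemble size, hidden units per member
lam, lr = 0.5, 0.02   # NCL penalty strength lambda, learning rate
W = rng.normal(size=(M, 1, H))   # fixed random input-to-hidden weights
b = rng.normal(size=(M, H))      # fixed random hidden biases
V = np.zeros((M, H))             # trainable output weights

def hidden(i):
    """Hidden-layer activations of member i on the whole training set."""
    return np.tanh(X @ W[i] + b[i])        # shape (N, H)

for _ in range(2000):
    # All members are trained simultaneously on the same training set.
    F = np.stack([hidden(i) @ V[i] for i in range(M)])   # member outputs, (M, N)
    Fbar = F.mean(axis=0)                                # ensemble mean output
    for i in range(M):
        # NCL gradient w.r.t. member i's output:
        # plain error (F_i - y) minus the correlation penalty lam * (F_i - Fbar).
        g = (F[i] - y) - lam * (F[i] - Fbar)
        V[i] -= lr * hidden(i).T @ g / N

F_final = np.stack([hidden(i) @ V[i] for i in range(M)])
ensemble_mse = np.mean((F_final.mean(axis=0) - y) ** 2)  # error of the combined ensemble
member_mse = np.mean((F_final - y) ** 2)                 # average error of individual members
```

For the bagging variant described above, each member i would instead be trained on its own bootstrap sample, e.g. indices drawn with `rng.choice(N, size=N, replace=True)`, while the penalty term is still computed against the ensemble mean. Setting `lam = 0` recovers independent training of the members.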
© 2005 Springer-Verlag Berlin Heidelberg
Liu, Y. (2005). Generate Different Neural Networks by Negative Correlation Learning. In: Wang, L., Chen, K., Ong, Y.S. (eds) Advances in Natural Computation. ICNC 2005. Lecture Notes in Computer Science, vol 3610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11539087_17
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-28323-2
Online ISBN: 978-3-540-31853-8