Abstract
To fit a given data set, a learning system often has to over-learn some data points in order to learn the rest of the data well. Such unnecessary learning can lead to both higher complexity and overfitting in the learning system. To control the complexity of neural network ensembles, difference learning is introduced into negative correlation learning. The idea of difference learning is to let each individual in an ensemble learn to differ from the ensemble on selected data points whenever the ensemble's outputs are already too close to the target values at those points. It has been found that such difference learning can control not only overfitting in an ensemble but also weakness among the individuals in the ensemble. Experiments were conducted to show how difference learning can create rather weak learners in negative correlation learning.
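The switching idea in the abstract can be sketched numerically. This is a minimal illustrative sketch, not the paper's implementation: it assumes linear individual models on a toy regression task, the simplified NCL gradient (F_i − y) − λ(F_i − F̄), and a closeness threshold ε for deciding when the ensemble has learned a point "too well"; the values of λ, ε, and the exact push-away rule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy regression data (illustrative only; the paper's experiments differ).
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

M, LAM, EPS, LR = 5, 0.5, 0.05, 0.05    # ensemble size, NCL strength, closeness threshold, step (assumed)
W = rng.normal(scale=0.1, size=(M, 3))  # one linear model per individual

def ensemble_mse(W):
    return float(np.mean(((X @ W.T).mean(axis=1) - y) ** 2))

mse_before = ensemble_mse(W)
for epoch in range(150):
    F = X @ W.T                        # (n, M): each individual's outputs
    F_bar = F.mean(axis=1)             # ensemble output
    close = np.abs(F_bar - y) < EPS    # points the ensemble has learned "too well"
    for i in range(M):
        # Simplified NCL gradient: pull toward the target, penalise agreement.
        g = (F[:, i] - y) - LAM * (F[:, i] - F_bar)
        # Difference learning (assumed rule): on well-learned points, the
        # individual instead learns to differ from the ensemble, i.e. it
        # ascends (F_i - F_bar)^2 rather than descending the error.
        g = np.where(close, -(F[:, i] - F_bar), g)
        W[i] -= LR * (g[:, None] * X).mean(axis=0)
mse_after = ensemble_mse(W)
```

Note that the repulsion term averages to zero across the ensemble, so pushing individuals apart on well-learned points leaves the ensemble output on those points essentially fixed while making each individual weaker on its own.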
Keywords
- Negative Correlation Learning (NCL)
- Learning Differences
- Individual Neural Network
- Weak Learners
- Training Error Rate
Copyright information
© 2016 Springer Science+Business Media Singapore
Cite this paper
Liu, Y. (2016). Negative Correlation Learning with Difference Learning. In: Li, K., Li, J., Liu, Y., Castiglione, A. (eds) Computational Intelligence and Intelligent Systems. ISICA 2015. Communications in Computer and Information Science, vol 575. Springer, Singapore. https://doi.org/10.1007/978-981-10-0356-1_27
Print ISBN: 978-981-10-0355-4
Online ISBN: 978-981-10-0356-1