Generate Different Neural Networks by Negative Correlation Learning

  • Conference paper
Advances in Natural Computation (ICNC 2005)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 3610)

Abstract

This paper describes two methods for generating different neural networks in an ensemble. One is based on negative correlation learning. The other is based on bootstrap sampling combined with negative correlation learning, i.e., bagging with negative correlation learning. In negative correlation learning, all individual networks are trained simultaneously on the same training set. In bagging with negative correlation learning, each individual network is trained on a different data set sampled with replacement from the training set. The performance and correct response sets of the two learning methods are compared. The purpose of this paper is to find out how to design more effective neural network ensembles.
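To make the two training schemes concrete, below is a minimal NumPy sketch; it is an illustration written for this summary, not the paper's code. `train_ncl` trains all networks simultaneously on the same data using the negative correlation penalty p_i = (F_i - F) * sum_{j != i} (F_j - F), whose gradient with respect to F_i (treating the ensemble mean F as constant) gives the per-sample error signal (F_i - d) - λ(F_i - F). `train_bagging_ncl` is one plausible reading of the bagging variant, in which each network owns its own bootstrap sample. The network architecture, λ, learning rate, and toy regression task are all assumptions made for illustration.

```python
# A minimal sketch of negative correlation learning (NCL) and a bagging
# variant, using small one-hidden-layer regression networks in NumPy.
# Illustrative only: the architecture and hyperparameters are assumptions,
# not settings from the paper.
import numpy as np

rng = np.random.default_rng(0)

class Net:
    """One-hidden-layer tanh network trained by gradient descent."""
    def __init__(self, n_in, n_hidden):
        self.W1 = rng.normal(0.0, 0.5, (n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        self.b2 = 0.0

    def forward(self, X):
        self.H = np.tanh(X @ self.W1 + self.b1)   # cache for backward
        return (self.H @ self.W2 + self.b2).ravel()

    def backward(self, X, err, lr):
        # err is dE/dF_i per sample; back-propagate and take one step.
        n = len(err)
        dW2 = self.H.T @ err[:, None] / n
        db2 = err.mean()
        dH = err[:, None] * self.W2.T * (1.0 - self.H ** 2)
        dW1 = X.T @ dH / n
        db1 = dH.mean(axis=0)
        self.W2 -= lr * dW2; self.b2 -= lr * db2
        self.W1 -= lr * dW1; self.b1 -= lr * db1

def train_ncl(nets, X, d, lam=0.5, lr=0.1, epochs=2000):
    # All networks are trained simultaneously on the SAME training set.
    for _ in range(epochs):
        outputs = np.stack([net.forward(X) for net in nets])   # (M, N)
        F = outputs.mean(axis=0)                               # ensemble mean
        for i, net in enumerate(nets):
            # NCL error signal: (F_i - d) - lam * (F_i - F)
            net.backward(X, (outputs[i] - d) - lam * (outputs[i] - F), lr)

def train_bagging_ncl(nets, X, d, lam=0.5, lr=0.1, epochs=2000):
    # One plausible reading of "bagging with NCL": each network owns a
    # bootstrap sample of the training set; the penalty term still uses
    # the whole ensemble's output on that sample.
    boot = [rng.integers(0, len(X), len(X)) for _ in nets]
    for _ in range(epochs):
        for i, net in enumerate(nets):
            Xi, di = X[boot[i]], d[boot[i]]
            outputs = np.stack([n.forward(Xi) for n in nets])
            F = outputs.mean(axis=0)
            net.backward(Xi, (outputs[i] - di) - lam * (outputs[i] - F), lr)

# Toy usage: fit a noisy sine wave with an ensemble of four networks.
X = rng.uniform(-3.0, 3.0, (200, 1))
d = np.sin(X).ravel() + rng.normal(0.0, 0.1, 200)
nets = [Net(1, 8) for _ in range(4)]
train_ncl(nets, X, d)
F = np.mean([net.forward(X) for net in nets], axis=0)
print("ensemble training MSE:", np.mean((F - d) ** 2))
```

With λ = 0 the first routine reduces to training the networks independently; a positive λ penalizes each network for agreeing with the ensemble mean, pushing the individual errors to be negatively correlated, which is the source of the ensemble's diversity.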




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Liu, Y. (2005). Generate Different Neural Networks by Negative Correlation Learning. In: Wang, L., Chen, K., Ong, Y.S. (eds) Advances in Natural Computation. ICNC 2005. Lecture Notes in Computer Science, vol 3610. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11539087_17

  • DOI: https://doi.org/10.1007/11539087_17

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28323-2

  • Online ISBN: 978-3-540-31853-8

  • eBook Packages: Computer Science (R0)
