
A New Adaptive Strategy for Pruning and Adding Hidden Neurons during Training Artificial Neural Networks

  • Md. Monirul Islam
  • Md. Abdus Sattar
  • Md. Faijul Amin
  • Kazuyuki Murase
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5326)

Abstract

This paper presents a new strategy for designing artificial neural networks (ANNs), which we call the adaptive merging and growing strategy (AMGS). Unlike most previous strategies for designing ANNs, AMGS emphasizes autonomous functioning in the design process. The new strategy reduces or increases an ANN's size during training based on the learning ability of hidden neurons and the training progress of the ANN, respectively. It merges correlated hidden neurons to reduce the network size, while it splits existing hidden neurons to increase the network size. AMGS has been tested on designing ANNs for five benchmark classification problems: Australian credit card assessment, diabetes, heart, iris, and thyroid. The experimental results show that the proposed strategy can design compact ANNs with good generalization ability.
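The abstract describes two structural operations: merging strongly correlated hidden neurons to shrink the network, and splitting an existing hidden neuron to grow it. The Python sketch below illustrates one plausible way to implement these operations for a single-hidden-layer network. The names (W1, b1, W2, merge_neurons, split_neuron), the correlation-based pair selection, and the averaging/perturbation rules are illustrative assumptions for this sketch only, not the exact AMGS equations from the paper.

```python
import numpy as np

# Illustrative sketch of merge/split operations on a single-hidden-layer network.
# W1: (n_in, n_hidden) input-to-hidden weights, b1: (n_hidden,) hidden biases,
# W2: (n_hidden, n_out) hidden-to-output weights.
# The merge/split rules below are common heuristics, not the paper's AMGS formulas.

def hidden_activations(X, W1, b1):
    """Hidden-layer outputs for a batch X of shape (n_samples, n_in)."""
    return np.tanh(X @ W1 + b1)

def most_correlated_pair(H):
    """Return indices (i, j) of the two most correlated hidden neurons."""
    C = np.corrcoef(H, rowvar=False)              # (n_hidden, n_hidden)
    np.fill_diagonal(C, 0.0)                      # ignore self-correlation
    i, j = np.unravel_index(np.argmax(np.abs(C)), C.shape)
    return (i, j) if i < j else (j, i)

def merge_neurons(W1, b1, W2, i, j):
    """Merge hidden neuron j into neuron i and delete j (network shrinks)."""
    W1, b1, W2 = W1.copy(), b1.copy(), W2.copy()
    W1[:, i] = 0.5 * (W1[:, i] + W1[:, j])        # average incoming weights
    b1[i] = 0.5 * (b1[i] + b1[j])
    W2[i, :] = W2[i, :] + W2[j, :]                # accumulate outgoing weights
    keep = [k for k in range(W1.shape[1]) if k != j]
    return W1[:, keep], b1[keep], W2[keep, :]

def split_neuron(W1, b1, W2, i, eps=0.05, rng=None):
    """Split hidden neuron i into two perturbed copies (network grows)."""
    rng = np.random.default_rng() if rng is None else rng
    w_new = W1[:, i] + eps * rng.standard_normal(W1.shape[0])
    W1 = np.column_stack([W1, w_new])             # perturbed incoming weights
    b1 = np.append(b1, b1[i])
    W2 = np.vstack([W2, 0.5 * W2[i, :]])          # share outgoing contribution
    W2[i, :] *= 0.5
    return W1, b1, W2

# Example: shrink then grow a tiny network on random data.
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 4))
W1, b1, W2 = rng.standard_normal((4, 6)), np.zeros(6), rng.standard_normal((6, 3))
i, j = most_correlated_pair(hidden_activations(X, W1, b1))
W1, b1, W2 = merge_neurons(W1, b1, W2, i, j)       # 6 -> 5 hidden neurons
W1, b1, W2 = split_neuron(W1, b1, W2, 0, rng=rng)  # 5 -> 6 hidden neurons
```

In this sketch the merged neuron keeps the summed outgoing weights so the layer's output is roughly preserved, and the split neuron's outgoing weights are halved and perturbed so the two copies can diverge during further training; the actual criteria AMGS uses to decide when to merge or split are described in the paper itself.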

Keywords

Artificial neural network design · merging neuron · splitting neuron · generalization ability



Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Md. Monirul Islam (1, 2)
  • Md. Abdus Sattar (1)
  • Md. Faijul Amin (2)
  • Kazuyuki Murase (2)
  1. Department of Computer Science and Engineering, Bangladesh University of Engineering and Technology, Dhaka, Bangladesh
  2. Department of Human and Artificial Intelligence Systems, Graduate School of Engineering, University of Fukui, Fukui, Japan
