
Training Neural Networks Using Multiobjective Particle Swarm Optimization

  • John Paul T. Yusiong
  • Prospero C. Naval Jr.
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4221)

Abstract

This paper proposes an approach to neural network training that simultaneously optimizes architectures and weights with a Particle Swarm Optimization (PSO)-based multiobjective algorithm. Most evolutionary computation-based training methods formulate the problem as a single objective by taking a weighted sum of the objectives, from which a single neural network model is generated. Our goal is to determine whether Multiobjective Particle Swarm Optimization can train neural networks with respect to two objectives: accuracy and complexity. We propose rules for the automatic deletion of unnecessary nodes from the network, based on the idea that a connection is pruned if its weight is smaller than the smallest bias in the entire network. Experiments on benchmark datasets from the UCI machine learning repository show that this approach is an effective means of training neural networks and is competitive with other evolutionary computation-based methods.
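The pruning rule described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes the comparison is between each weight's magnitude and the smallest bias magnitude in the network (the abstract does not spell out the exact comparison), and all network shapes and values here are hypothetical.

```python
import numpy as np

def prune_connections(weights, biases):
    """Zero out connections whose weight magnitude falls below the
    smallest bias magnitude in the network -- one plausible reading
    of the paper's pruning rule. A hidden node whose incoming or
    outgoing connections are all pruned can then be deleted."""
    # Threshold: smallest bias magnitude over all layers.
    threshold = min(abs(b) for layer in biases for b in layer)
    pruned = []
    for w in weights:
        mask = np.abs(w) >= threshold  # keep connections at or above the threshold
        pruned.append(w * mask)
    return pruned

# Tiny hypothetical 2-3-1 network.
weights = [np.array([[0.8, -0.05, 0.6],
                     [0.4,  0.9, -0.02]]),
           np.array([[0.7], [-0.03], [0.5]])]
biases = [np.array([0.1, -0.2, 0.3]), np.array([0.15])]

pruned = prune_connections(weights, biases)
# Connections -0.05, -0.02, and -0.03 fall below the 0.1 threshold and are zeroed.
```

In the multiobjective setting, such a rule lets the swarm trade the accuracy objective against the complexity objective (e.g. counting the remaining connections) without a hand-tuned weighting between the two.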

Keywords

Neural Network · Particle Swarm Optimization · Multiobjective Optimization · Connection Weight · Minimum Description Length
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • John Paul T. Yusiong (1)
  • Prospero C. Naval Jr. (2)
  1. Division of Natural Sciences and Mathematics, University of the Philippines-Visayas, Tacloban City, Leyte, Philippines
  2. Department of Computer Science, University of the Philippines-Diliman, Quezon City, Philippines
