The Influence of Gaussian, Uniform, and Cauchy Perturbation Functions in the Neural Network Evolution

  • Paulito P. Palmes
  • Shiro Usui
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3316)

Abstract

The majority of algorithms in the field of evolutionary artificial neural networks (EvoANN) rely on the proper choice and implementation of the perturbation function to maintain population diversity from generation to generation. Maintaining diversity is an important factor in the evolution process, since it helps the population of artificial neural networks (ANNs) escape local minima. To determine which of these perturbation functions is ideal for ANN evolution, this paper analyzed the influence of three commonly used functions: Gaussian, Cauchy, and Uniform. Statistical comparisons were conducted to examine their influence on the generalization and training performance of EvoANN. Our simulations using the glass classification problem indicated that for mutation-with-crossover-based EvoANN, generalization performance among the three perturbation functions was not significantly different. On the other hand, mutation-based EvoANN using Gaussian mutation performed as well as the crossover-based variant, but performed worst when it used either the Uniform or the Cauchy distribution function. These observations suggest that crossover is a significant operation in systems that employ strong perturbation functions, but matters less in systems that use weak or conservative perturbation functions.
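The three perturbation functions compared in the abstract can be illustrated as mutation operators applied to a network's weight vector. The sketch below is not the paper's implementation; the function name, `scale` parameter, and NumPy-based sampling are illustrative assumptions. It shows the key qualitative difference: Gaussian noise is conservative (thin-tailed), Uniform noise is bounded, and Cauchy noise is heavy-tailed, occasionally producing very large jumps.

```python
import numpy as np

def perturb(weights, kind="gaussian", scale=0.1, rng=None):
    """Return a mutated copy of `weights` using one of the three
    perturbation functions compared in the paper. The `kind` names
    and `scale` parameter are illustrative, not from the paper."""
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(weights, dtype=float)
    if kind == "gaussian":
        # Conservative: most perturbations stay within a few scales.
        noise = rng.normal(0.0, scale, w.shape)
    elif kind == "uniform":
        # Bounded: every weight moves by at most `scale`.
        noise = rng.uniform(-scale, scale, w.shape)
    elif kind == "cauchy":
        # Heavy-tailed: occasional very large jumps (strong perturbation).
        noise = scale * rng.standard_cauchy(w.shape)
    else:
        raise ValueError(f"unknown perturbation kind: {kind}")
    return w + noise
```

Under this reading of the results, the heavy-tailed Cauchy and the bounded-but-strong Uniform operators disrupt individuals enough that crossover is needed to preserve useful structure, whereas the conservative Gaussian operator works well even without crossover.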



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Paulito P. Palmes¹
  • Shiro Usui¹
  1. RIKEN Brain Science Institute, Saitama, Japan
