Differential Evolution Training Algorithm for Feed-Forward Neural Networks

  • Published in: Neural Processing Letters

Abstract

Differential evolution, an evolutionary optimization method over continuous search spaces, has recently been applied successfully to real-world and artificial optimization problems, and has also been proposed for neural network training. However, differential evolution has not been comprehensively studied in the context of training neural network weights, i.e., how useful it is in finding the global optimum at the expense of convergence speed. In this study, differential evolution is analyzed as a candidate global optimization method for feed-forward neural networks. Compared to gradient-based methods, differential evolution does not appear to offer a distinct advantage in terms of learning rate or solution quality. It can, however, be used to validate reached optima and to develop regularization terms and non-conventional transfer functions that do not necessarily provide gradient information.
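The kind of training scheme evaluated here can be sketched as follows. This is a minimal illustrative implementation of the classic DE/rand/1/bin variant of Storn and Price applied to a small feed-forward network on the XOR problem, not the authors' code; the network architecture (2-3-1), population size, scale factor F, and crossover rate CR are assumptions chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR data
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 1., 1., 0.])

# 2-3-1 network: all weights and biases flattened into one parameter vector
n_hidden = 3
dim = 2 * n_hidden + n_hidden + n_hidden + 1  # W1 (2x3), b1 (3), W2 (3), b2 (1)

def mse(w):
    """Mean squared error of the network encoded by parameter vector w."""
    W1 = w[:6].reshape(2, n_hidden)
    b1 = w[6:9]
    W2 = w[9:12]
    b2 = w[12]
    h = np.tanh(X @ W1 + b1)                 # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))  # sigmoid output
    return np.mean((out - y) ** 2)

NP, F, CR = 30, 0.7, 0.9  # population size, scale factor, crossover rate
pop = rng.uniform(-1.0, 1.0, (NP, dim))
cost = np.array([mse(ind) for ind in pop])

for gen in range(300):
    for i in range(NP):
        # mutation: v = a + F * (b - c), with a, b, c distinct and != i
        a, b, c = rng.choice([j for j in range(NP) if j != i], 3, replace=False)
        v = pop[a] + F * (pop[b] - pop[c])
        # binomial crossover, with one component guaranteed to come from v
        mask = rng.random(dim) < CR
        mask[rng.integers(dim)] = True
        trial = np.where(mask, v, pop[i])
        # greedy selection: keep the trial only if it is at least as good
        t_cost = mse(trial)
        if t_cost <= cost[i]:
            pop[i], cost[i] = trial, t_cost

best = pop[np.argmin(cost)]
print(f"best MSE after 300 generations: {cost.min():.4f}")
```

Note that the error function is evaluated as a black box: no gradient is computed anywhere, which is why, as the abstract notes, this approach accommodates transfer functions and regularization terms that provide no gradient information, at the cost of convergence speed.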




Cite this article

Ilonen, J., Kamarainen, JK. & Lampinen, J. Differential Evolution Training Algorithm for Feed-Forward Neural Networks. Neural Processing Letters 17, 93–105 (2003). https://doi.org/10.1023/A:1022995128597
