Training a Feed-Forward Neural Network Using Artificial Bee Colony with Back-Propagation Algorithm

  • Partha Pratim Sarangi
  • Abhimanyu Sahu
  • Madhumita Panda
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 243)

Abstract

Training a feed-forward neural network (FNN) is an optimization problem over a continuous space. The back-propagation (BP) algorithm is the conventional and most popular gradient-based local search technique for this task, but it frequently suffers from poor generalization performance because it gets stuck in local minima. The artificial bee colony (ABC) algorithm, a popular swarm-intelligence technique for global optimization, has also been used to train the weights of neural networks, but it suffers from slow convergence. In this paper, a hybrid algorithm combining the artificial bee colony with back-propagation (ABC-BP) is proposed to train the FNN. The proposed algorithm is compared with a hybrid of a real-coded genetic algorithm and back-propagation (GA-BP) for training the FNN on five benchmark datasets taken from the UCI machine learning repository. The simulation results indicate that the ABC-BP hybrid algorithm gives promising results, with significantly improved convergence rate and classification accuracy. Hence, the proposed algorithm can be used efficiently for training FNNs.
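The hybrid strategy summarised in the abstract, a global ABC search over the network's flattened weight vector followed by local refinement with gradient-descent back-propagation, can be illustrated with a minimal sketch for a one-hidden-layer sigmoid network trained on mean squared error. The colony size, abandonment limit, cycle count, learning rate, and the simple "ABC first, then BP" hand-off below are illustrative assumptions for this sketch, not the exact configuration used in the paper.

```python
# Minimal sketch (not the paper's exact method) of hybrid ABC-BP training for a
# one-hidden-layer sigmoid FNN.  ABC searches over flattened weight vectors,
# then plain back-propagation refines the best food source it finds.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def unpack(w, n_in, n_hid, n_out):
    """Split a flat weight vector into the two layer matrices (biases folded in)."""
    split = (n_in + 1) * n_hid
    return w[:split].reshape(n_in + 1, n_hid), w[split:].reshape(n_hid + 1, n_out)

def forward(w, X, n_in, n_hid, n_out):
    W1, W2 = unpack(w, n_in, n_hid, n_out)
    H = sigmoid(np.hstack([X, np.ones((len(X), 1))]) @ W1)   # hidden activations
    Y = sigmoid(np.hstack([H, np.ones((len(H), 1))]) @ W2)   # network outputs
    return H, Y

def mse(w, X, T, dims):
    _, Y = forward(w, X, *dims)
    return np.mean((Y - T) ** 2)

def bp_refine(w, X, T, dims, lr=0.5, epochs=500):
    """Gradient-descent back-propagation on the MSE, starting from weights w."""
    w = w.copy()
    for _ in range(epochs):
        W1, W2 = unpack(w, *dims)
        Xb = np.hstack([X, np.ones((len(X), 1))])
        H = sigmoid(Xb @ W1)
        Hb = np.hstack([H, np.ones((len(H), 1))])
        Y = sigmoid(Hb @ W2)
        dY = (Y - T) * Y * (1 - Y)              # output-layer delta
        dH = (dY @ W2[:-1].T) * H * (1 - H)     # hidden-layer delta
        grad = np.concatenate([(Xb.T @ dH).ravel(), (Hb.T @ dY).ravel()]) / len(X)
        w -= lr * grad
    return w

def abc_bp(X, T, n_hid=5, colony=20, limit=30, cycles=100):
    n_in, n_out = X.shape[1], T.shape[1]
    dims = (n_in, n_hid, n_out)
    D = (n_in + 1) * n_hid + (n_hid + 1) * n_out
    food = rng.uniform(-1, 1, (colony, D))       # candidate weight vectors
    cost = np.array([mse(w, X, T, dims) for w in food])
    trials = np.zeros(colony, dtype=int)
    for _ in range(cycles):
        # Employed-bee phase, then onlooker-bee phase (fitness-proportional choice).
        for phase in range(2):
            fit = 1.0 / (1.0 + cost)
            picks = range(colony) if phase == 0 else rng.choice(colony, colony, p=fit / fit.sum())
            for i in picks:
                k, j = rng.integers(colony), rng.integers(D)
                cand = food[i].copy()
                cand[j] += rng.uniform(-1, 1) * (food[i, j] - food[k, j])
                c = mse(cand, X, T, dims)
                if c < cost[i]:
                    food[i], cost[i], trials[i] = cand, c, 0
                else:
                    trials[i] += 1
        # Scout-bee phase: abandon sources that stopped improving.
        for i in np.where(trials > limit)[0]:
            food[i] = rng.uniform(-1, 1, D)
            cost[i], trials[i] = mse(food[i], X, T, dims), 0
    best = food[np.argmin(cost)]                 # hand the best source to BP
    return bp_refine(best, X, T, dims)

# Toy usage: learn XOR, then report the training error.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
w = abc_bp(X, T)
print("train MSE:", mse(w, X, T, (2, 5, 1)))
```

The XOR example at the end only shows the calling convention; the paper's experiments use five UCI classification datasets rather than this toy problem.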

Keywords

Artificial bee colony · Multilayer perceptron · Hybrid artificial bee colony with back-propagation algorithm

Notes

Acknowledgments

I would like to acknowledge Seemanta Engineering College, Mayurbhanj, Odisha, for providing financial assistance.


Copyright information

© Springer India 2014

Authors and Affiliations

  • Partha Pratim Sarangi¹
  • Abhimanyu Sahu¹
  • Madhumita Panda²
  1. Department of CSE, Seemanta Engineering College, Jharpokharia, India
  2. Department of MCA, Seemanta Engineering College, Jharpokharia, India
