An optimization algorithm guided by a machine learning approach

Abstract

Knowledge extraction is the multidisciplinary process of identifying novel, significant, potentially useful, and consistent information in data. Among the most interesting techniques in the fields of knowledge extraction and machine learning are self-organizing maps (SOMs), which can map complex high-dimensional relations onto a reduced lattice while preserving the topological organization of the initial data. Evolutionary approaches, on the other hand, provide an effective alternative for solving complex optimization problems in different application domains. One important characteristic of evolutionary methods applied to real-world problems is their high demand for function evaluations before a satisfying solution is obtained. In their operation, evolutionary techniques produce new solutions without extracting useful knowledge from the large number of solutions already generated. Using the knowledge acquired during the evolution process could significantly improve their performance, either by steering the search strategy toward promising regions or by increasing their convergence properties. This paper introduces an evolutionary optimization algorithm in which knowledge extracted during its operation is employed to guide its search strategy. In the approach, a SOM is used as a knowledge-extraction technique to identify promising areas through the reduction of the search space. In each generation, the proposed method trains the SOM with a subset of all solutions generated so far. Once the SOM is trained, the neural unit of its lattice that corresponds to the best solution is identified. Then, using local information from this neural unit, an entire population of candidate solutions is produced. Through the extracted knowledge, the new approach improves convergence toward difficult, highly multi-modal optima while using a reduced number of function evaluations.
The performance of our approach is compared to that of several state-of-the-art optimization techniques on a set of well-known benchmark functions and three real-world engineering problems. The results validate that the introduced method achieves the best balance between accuracy and computational cost among its counterparts.
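The mechanism outlined in the abstract can be illustrated with a minimal sketch (not the authors' implementation; every function name and parameter value below is an illustrative assumption): a small Kohonen SOM is trained on a sample of previously generated solutions, the lattice unit that best matches the best solution found so far is located, and a fresh population is drawn around that unit's weight vector.

```python
import math
import random

def train_som(samples, grid=(4, 4), iters=200, lr0=0.5, sigma0=1.5, seed=0):
    """Fit a small 2-D SOM lattice to the sample set with classic Kohonen updates."""
    rng = random.Random(seed)
    dim = len(samples[0])
    # Lattice weights, indexed by grid coordinates (i, j).
    w = {(i, j): [rng.uniform(-1.0, 1.0) for _ in range(dim)]
         for i in range(grid[0]) for j in range(grid[1])}
    for t in range(iters):
        x = rng.choice(samples)
        lr = lr0 * (1.0 - t / iters)               # decaying learning rate
        sigma = sigma0 * (1.0 - t / iters) + 1e-3  # shrinking neighborhood radius
        # Best-matching unit: the lattice node whose weight vector is closest to x.
        bmu = min(w, key=lambda u: sum((a - b) ** 2 for a, b in zip(w[u], x)))
        for u in w:
            d2 = (u[0] - bmu[0]) ** 2 + (u[1] - bmu[1]) ** 2
            h = math.exp(-d2 / (2.0 * sigma ** 2))  # Gaussian neighborhood function
            w[u] = [a + lr * h * (b - a) for a, b in zip(w[u], x)]
    return w

def next_population(w, best, pop_size, spread=0.1, seed=1):
    """Sample a new population around the unit that best matches the best solution."""
    rng = random.Random(seed)
    bmu = min(w, key=lambda u: sum((a - b) ** 2 for a, b in zip(w[u], best)))
    center = w[bmu]
    return [[c + rng.gauss(0.0, spread) for c in center] for _ in range(pop_size)]
```

In this sketch the SOM acts as the knowledge-extraction step: the trained lattice compresses the history of evaluated solutions, and sampling locally around the best-matching unit concentrates new candidates in a promising region of the search space without extra objective-function evaluations.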




Author information

Correspondence to Erik Cuevas.

Ethics declarations

Conflict of interest

The authors declare that there is no conflict of interest regarding the publication of this paper.


Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendices

Appendix A. List of benchmark functions.

See Table 19.

Table 19 Benchmark functions used in the experimental study

Appendix B.

See Table 20.

Table 20 List of CEC2017 benchmark functions


About this article


Cite this article

Cuevas, E., Galvez, J. An optimization algorithm guided by a machine learning approach. Int. J. Mach. Learn. & Cyber. 10, 2963–2991 (2019). https://doi.org/10.1007/s13042-018-00915-0


Keywords

  • Metaheuristics
  • Self-organizing maps
  • Knowledge extraction
  • Machine learning
  • Hybrid systems