
The Human Mental Search Algorithm for Solving Optimisation Problems

Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 911)

Abstract

The performance of most data science algorithms, and in particular machine learning algorithms, depends largely on the performance of their underlying optimisation algorithm; in other words, without an effective optimisation algorithm there is no effective data science algorithm. Conventional optimisation algorithms suffer from drawbacks such as a tendency to get stuck in local optima and sensitivity to initial conditions. To tackle these, population-based metaheuristic algorithms, which operate on a population of candidate solutions and incorporate stochastic behaviour, can be used. In this chapter, we present the Human Mental Search (HMS) algorithm, a population-based metaheuristic algorithm inspired by bid exploration in online auctions. HMS comprises three main operators: mental search, grouping, and movement. The mental search operator explores the vicinity of a candidate solution, the grouping operator employs an unsupervised clustering technique, k-means, to partition the candidate solutions, and the movement operator moves candidate solutions towards a promising area identified by the grouping operator. To evaluate the efficacy of the HMS algorithm, we conduct a set of experiments on benchmark functions with diverse characteristics, covering both standard and large-scale problems. The obtained results clearly show the merit of HMS compared to other algorithms.
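To make the interplay of the three operators concrete, the following is a minimal Python sketch of an HMS-style loop. It is an illustration under stated assumptions, not the authors' reference implementation: the Lévy-flight step is generated with Mantegna's algorithm, k-means is taken from scikit-learn, and all parameter names and values (pop_size, n_clusters, n_mental, beta, C) are assumed for the example.

```python
# Minimal, illustrative HMS-style sketch. Parameter choices are assumptions
# for this example, not the reference settings from the chapter.
import numpy as np
from numpy.random import default_rng
from scipy.special import gamma
from sklearn.cluster import KMeans

rng = default_rng(0)

def levy_step(dim, beta=1.5):
    """Levy-distributed step via Mantegna's algorithm (assumed variant)."""
    sigma = (gamma(1 + beta) * np.sin(np.pi * beta / 2)
             / (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

def hms(f, dim, lo, hi, pop_size=30, n_clusters=5, n_mental=5,
        iters=200, C=2.0):
    bids = rng.uniform(lo, hi, (pop_size, dim))   # candidate solutions ("bids")
    for _ in range(iters):
        # 1) Mental search: explore the vicinity of each bid via Levy flights
        #    and keep the best trial if it improves on the current bid.
        for i in range(pop_size):
            trials = bids[i] + np.array([levy_step(dim) for _ in range(n_mental)])
            trials = np.clip(trials, lo, hi)
            best = trials[np.argmin([f(t) for t in trials])]
            if f(best) < f(bids[i]):
                bids[i] = best
        # 2) Grouping: k-means partitions the bids; the cluster with the
        #    lowest mean objective value marks the promising region.
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit(bids).labels_
        cluster_cost = [np.mean([f(b) for b in bids[labels == k]])
                        for k in range(n_clusters)]
        promising = bids[labels == int(np.argmin(cluster_cost))]
        winner = promising[np.argmin([f(b) for b in promising])]
        # 3) Movement: pull all bids towards the winner of the best cluster.
        r = rng.uniform(size=(pop_size, dim))
        bids = np.clip(bids + C * r * (winner - bids), lo, hi)
    return bids[np.argmin([f(b) for b in bids])]

# Usage: minimise the sphere function in 10 dimensions.
best = hms(lambda x: np.sum(x ** 2), dim=10, lo=-5.0, hi=5.0)
print(best, np.sum(best ** 2))
```

Each iteration runs the three operators in order: Lévy-flight exploration around every bid (mental search), k-means clustering to locate the cluster with the lowest mean objective value (grouping), and an attraction step that pulls all bids towards the best bid of that cluster (movement). Clipping to the search bounds after each operator is one simple way to keep candidates feasible.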

Keywords

Human Mental Search Algorithm · Optimisation · Unsupervised clustering · K-means


Copyright information

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland AG 2021

Authors and Affiliations

  1. Faculty of Engineering, Sabzevar University of New Technology, Sabzevar, Iran
  2. Department of Computer Science, Loughborough University, Loughborough, UK
  3. Department of Electrical and Computer Engineering, University of Kashan, Kashan, Iran
