Binary multi-verse optimization algorithm for global optimization and discrete problems


The Multi-verse optimizer is one of the recently proposed nature-inspired algorithms, and it has proven its efficiency in solving challenging optimization problems. The original version of the Multi-verse optimizer solves problems with continuous variables. This paper proposes a binary version of this algorithm to solve problems with discrete variables, such as feature selection. The proposed Binary Multi-verse optimizer is equipped with a V-shaped transfer function to convert continuous values to binary and to update the solutions over the course of optimization. A comparative study is conducted to compare the Binary Multi-verse optimizer with other binary optimization algorithms: the Binary Bat Algorithm, Binary Particle Swarm Optimization, the Binary Dragonfly Algorithm, and the Binary Grey Wolf Optimizer. As case studies, a set of 13 benchmark functions, including unimodal and multimodal functions, is employed. In addition, the number of variables of these test functions is varied (5, 10, and 20) to test the proposed algorithm on problems with different numbers of parameters. The quantitative results show that the proposed algorithm significantly outperforms the others on the majority of the benchmark functions. The convergence curves qualitatively show that, for some functions, the proposed algorithm finds the best result in early iterations. To demonstrate the applicability of the proposed algorithm, the paper considers the feature selection and knapsack problems as challenging real-world problems in data mining. Experimental results on seven datasets for the feature selection problem show that the proposed algorithm tends to provide better accuracy and requires fewer features than the other algorithms on most of the datasets. For the knapsack problem, 17 benchmark datasets are used, and the results show that the proposed algorithm achieves higher profit and lower error than the other algorithms.
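As context for the transfer-function step described above: the abstract does not specify which V-shaped function the paper uses, but a common choice in the binary-metaheuristic literature is |tanh(v)|, which maps a continuous update value to a bit-flip probability. The sketch below (names and the specific function are illustrative assumptions, not the paper's implementation) shows how such a function can binarize and update a solution:

```python
import math
import random

def v_transfer(v):
    """A common V-shaped transfer function: |tanh(v)|, in [0, 1)."""
    return abs(math.tanh(v))

def update_binary(bits, deltas, rng):
    """Update a binary solution via a V-shaped transfer function.

    Each continuous update value is mapped to a flip probability;
    with that probability the corresponding bit is complemented,
    otherwise it is kept unchanged. This is the standard V-shaped
    update rule (as opposed to S-shaped functions, which set the
    bit directly rather than flipping it).
    """
    return [
        1 - b if rng.random() < v_transfer(v) else b
        for b, v in zip(bits, deltas)
    ]

rng = random.Random(42)
bits = [0, 1, 0, 1, 0]
# Large |v| -> high flip probability; v near 0 -> bit is likely kept.
new_bits = update_binary(bits, [3.0, -3.0, 0.01, 2.5, -0.02], rng)
```

A V-shaped function is symmetric around zero, so small continuous updates leave the solution mostly unchanged while large updates (of either sign) drive exploration by flipping bits.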





Author information



Corresponding author

Correspondence to Nailah Al-Madi.

Ethics declarations

Conflict of interest

All authors declare that there is no conflict of interest.

Ethical standard

This article does not contain any studies with human participants or animals performed by any of the authors.



About this article


Cite this article

Al-Madi, N., Faris, H. & Mirjalili, S. Binary multi-verse optimization algorithm for global optimization and discrete problems. Int. J. Mach. Learn. & Cyber. 10, 3445–3465 (2019).



Keywords

  • Feature selection
  • Optimization
  • Multi-verse optimization algorithm
  • Global optimization