
An advanced initialization technique for metaheuristic optimization: a fusion of Latin hypercube sampling and evolutionary behaviors

Published in Computational and Applied Mathematics.

Abstract

Many new metaheuristic algorithms prioritize their search-strategy phase while neglecting equally critical stages such as initialization. Latin hypercube sampling (LHS) stands out in this context: it selects representative samples through permutations in a multidimensional space, effectively preventing points from clustering in specific areas. Its limitations become apparent in high-dimensional problems, however, since it provides no information about the search space and struggles to determine samples across all dimensions. To address these challenges, this paper introduces an innovative population-initialization approach that combines the strengths of LHS with evolutionary behaviors. The technique is divided into two main sections: spatial and quality. The spatial section divides the search space into equal intervals along each dimension to establish initial solutions, while the quality section employs evolutionary strategies such as mutation and crossover. These strategies serve a dual purpose: they explore the search space thoroughly and refine solutions, bringing them closer to the optimum of the objective function. To validate the effectiveness of the method, we integrated these principles into the classic Differential Evolution algorithm and conducted extensive tests on 30 representative benchmark functions. The experimental results are encouraging: the methodology not only speeds up convergence but also enhances solution quality, outperforming other similar techniques.
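The two sections of the proposed initialization can be illustrated with a minimal Python sketch. The spatial part is standard Latin hypercube sampling (one individual per stratum in each dimension); the quality part is sketched here as a single DE-style mutation/crossover pass with greedy selection. The parameter values (F, CR) and the greedy acceptance step are illustrative assumptions, not the paper's exact operators.

```python
import numpy as np

def lhs_init(pop_size, dim, lower, upper, rng):
    """Spatial section: Latin hypercube sampling. Each dimension is cut
    into pop_size equal intervals, and exactly one individual falls in each."""
    perm = np.argsort(rng.random((pop_size, dim)), axis=0)  # random permutation per dimension
    strata = (perm + rng.random((pop_size, dim))) / pop_size  # one uniform draw inside each stratum
    return lower + strata * (upper - lower)

def refine(pop, f, F=0.5, CR=0.9, rng=None):
    """Quality section (a sketch): one DE/rand/1/bin mutation-crossover pass
    that keeps a trial point only when it improves the objective."""
    rng = np.random.default_rng(rng)
    n, d = pop.shape
    out = pop.copy()
    for i in range(n):
        a, b, c = rng.choice([j for j in range(n) if j != i], 3, replace=False)
        mutant = pop[a] + F * (pop[b] - pop[c])
        cross = rng.random(d) < CR
        cross[rng.integers(d)] = True          # guarantee at least one mutant gene
        trial = np.where(cross, mutant, pop[i])
        if f(trial) < f(out[i]):               # greedy selection
            out[i] = trial
    return out

rng = np.random.default_rng(0)
sphere = lambda x: float(np.sum(x**2))
pop0 = lhs_init(20, 5, -30.0, 30.0, rng)   # stratified initial population
pop1 = refine(pop0, sphere, rng=rng)       # refined toward the objective
```

By construction, every column of `pop0` contains exactly one sample per interval, and no individual in `pop1` is worse than its counterpart in `pop0`.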

[Figs. 1, 2 and 3 appear in the published article.]


Author information

Correspondence to Karla Avila.


Appendix


 

The following table lists the 30 benchmark functions used in the experiments. Each entry reports the function's name, its minimum \(f\left({\mathbf{x}}^{\mathbf{*}}\right)\) and minimizer \({\mathbf{x}}^{\mathbf{*}}\), the search domain S, the dimension D, and the function definition.

\({f(x)}_{1}\)

Ackley

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-30, 30]}^{d}\)

30

\(f\left(x\right)=-20{\text{exp}}\left(-0.2\sqrt{\frac{1}{d}\sum\limits_{i=1}^{d}{x}_{i}^{2}}\right)-{\text{exp}}\left(\frac{1}{d}\sum\limits_{i=1}^{d}{\text{cos}}\left(2\pi {x}_{i}\right)\right)+{\text{exp}}\left(1\right)+20\)

\({f(x)}_{2}\)

Dixon-Price

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}_{i}^{\mathbf{*}}={2}^{-\frac{{2}^{i}-2}{{2}^{i}}} \text{ for } i=1,\dots ,n\)

\({[-10, 10]}^{d}\)

30

\(f\left(x\right)={({x}_{1}-1)}^{2}+\sum\limits_{i=2}^{d}i{(2{x}_{i}^{2}-{x}_{i-1})}^{2}\)

\({f(x)}_{3}\)

Griewank

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-600, 600]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}\frac{{x}_{i}^{2}}{4000}-\prod_{i=1}^{d}{\text{cos}}\left(\frac{{x}_{i}}{\sqrt{i}}\right)+1\)

\({f(x)}_{4}\)

Infinity

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-1, 1]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}{x}_{i}^{6}\left({\text{sin}}\left({x}_{i}\right)+2\right)\)

\({f(x)}_{5}\)

Levy

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(1,\dots , 1\right)\)

\({[-10, 10]}^{d}\)

30

\(f\left(x\right)={\text{sin}}^{2}\left(\pi {\omega }_{1}\right)+\sum\limits_{i=1}^{d-1}{\left({\omega }_{i}-1\right)}^{2}\left[1+10{\text{sin}}^{2}\left(\pi {\omega }_{i}+1\right)\right]+{\left({\omega }_{d}-1\right)}^{2}\left[1+{\text{sin}}^{2}\left(2\pi {\omega }_{d}\right)\right], \text{ where } {\omega }_{i}=1+\frac{{x}_{i}-1}{4}\)

\({f(x)}_{6}\)

Mishra 1

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=2; {\mathbf{x}}^{\mathbf{*}}=\left(1,\dots , 1\right)\)

\({[0, 1]}^{d}\)

30

\(f\left(x\right)={(1+(d-\sum\limits_{i=1}^{d-1}{x}_{i}))}^{d-\sum\limits_{i=1}^{d-1}{x}_{i}}\)

\({f(x)}_{7}\)

Mishra 2

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=2; {\mathbf{x}}^{\mathbf{*}}=\left(1,\dots , 1\right)\)

\({[0, 1]}^{d}\)

30

\(f\left(x\right)={\left(1+\left(d-\sum\limits_{i=1}^{d-1}\frac{{x}_{i}+{x}_{i+1}}{2}\right)\right)}^{d-\sum\limits_{i=1}^{d-1}(\frac{{x}_{i}+{x}_{i+1}}{2})}\)

\({f(x)}_{8}\)

Mishra 11

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-10, 10]}^{d}\)

30

\(f\left(x\right)={\left[\frac{1}{d}\sum\limits_{i=1}^{d}\left|{x}_{i}\right|-{\left(\prod_{i=1}^{d}|{x}_{i}|\right)}^\frac{1}{d}\right]}^{2}\)

\({f(x)}_{9}\)

MultiModal

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-10, 10]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}|{x}_{i}|\prod_{i=1}^{d}|{x}_{i}|\)

\({f(x)}_{10}\)

Penalty 1

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(-1,\dots , -1\right)\)

\({[-50, 50]}^{d}\)

30

\(f\left(x\right)=\frac{\pi }{30}\left(10{\text{sin}}^{2}\left(\pi {y}_{1}\right)+\sum\limits_{i=1}^{d-1}{\left({y}_{i}-1\right)}^{2}\left[1+10{\text{sin}}^{2}\left(\pi {y}_{i+1}\right)\right]+{\left({y}_{d}-1\right)}^{2}\right)+\sum\limits_{i=1}^{d}u\left({x}_{i},10,100,4\right),\) where \({y}_{i}=1+\frac{{x}_{i}+1}{4}\) and \(u({x}_{i},a,k,m)=\left\{\begin{array}{ll}k{({x}_{i}-a)}^{m}, & {x}_{i}>a\\ 0, & -a\le {x}_{i}\le a\\ k{(-{x}_{i}-a)}^{m}, & {x}_{i}<-a\end{array}\right.\)

\({f(x)}_{11}\)

Penalty 2

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(1,\dots , 1\right)\)

\({[-50, 50]}^{d}\)

30

\(f\left(x\right)=0.1\left({\text{sin}}^{2}\left(3\pi {x}_{1}\right)+\sum\limits_{i=1}^{d-1}{\left({x}_{i}-1\right)}^{2}\left[1+{\text{sin}}^{2}\left(3\pi {x}_{i+1}\right)\right]+{\left({x}_{d}-1\right)}^{2}\left[1+{\text{sin}}^{2}\left(2\pi {x}_{d}\right)\right]\right)+\sum\limits_{i=1}^{d}u\left({x}_{i},5,100,4\right),\) where \(u({x}_{i},a,k,m)=\left\{\begin{array}{ll}k{({x}_{i}-a)}^{m}, & {x}_{i}>a\\ 0, & -a\le {x}_{i}\le a\\ k{(-{x}_{i}-a)}^{m}, & {x}_{i}<-a\end{array}\right.\)

\({f(x)}_{12}\)

Perm 1

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(1, 2,\dots , n\right)\)

\({[-d, d]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{k=1}^{d}{\left[\sum\limits_{i=1}^{d}({i}^{k}+50)({\left(\frac{{x}_{i}}{i}\right)}^{k}-1)\right]}^{2}\)

\({f(x)}_{13}\)

Perm 2

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(1, 1/2,\dots , 1/n\right)\)

\({[-d, d]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}{\left[\sum\limits_{j=1}^{d}\left({j}^{i}+10\right)\left({x}_{j}^{i}-\frac{1}{{j}^{i}}\right)\right]}^{2}\)

\({f(x)}_{14}\)

Plateau

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=30; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-5.12, 5.12]}^{d}\)

30

\(f\left(x\right)=30+\sum\limits_{i=1}^{d}|{x}_{i}|\)

\({f(x)}_{15}\)

Powell

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-4, 5]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^\frac{d}{4}\left[{({x}_{4i-3}+10{x}_{4i-2})}^{2}+5{({x}_{4i-1}-{x}_{4i})}^{2}+{({x}_{4i-2}-{x}_{4i-1})}^{4}+10{({x}_{4i-3}-{x}_{4i})}^{4}\right]\)

\({f(x)}_{16}\)

Qing

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}_{i}^{\mathbf{*}}=\pm \sqrt{i} \text{ for } i=1,\dots ,n\)

\({[-1.28, 1.28]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}{({x}_{i}^{2}-i)}^{2}\)

\({f(x)}_{17}\)

Quartic

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-10, 10]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}i{x}_{i}^{4}+\text{rand}[0,1)\)

\({f(x)}_{18}\)

Quintic

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(-1,\dots , -1\right)\)

\({[-5.12, 5.12]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}|{x}_{i}^{5}-3{x}_{i}^{4}+4{x}_{i}^{3}+2{x}_{i}^{2}-10{x}_{i}-4|\)

\({f(x)}_{19}\)

Rastrigin

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-5, 10]}^{d}\)

30

\(f\left(x\right)=10d+\sum\limits_{i=1}^{d}[{x}_{i}^{2}-10{\text{cos}}(2\pi {x}_{i})]\)

\({f(x)}_{20}\)

Rosenbrock

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(1,\dots , 1\right)\)

\({[-100, 100]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d-1}\left[100{({x}_{i+1}-{x}_{i}^{2})}^{2}+{({x}_{i}-1)}^{2}\right]\)

\({f(x)}_{21}\)

Schwefel 21

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-100, 100]}^{d}\)

30

\(f\left(x\right)=max\left\{\left|{x}_{i}\right|, 1\le i\le d\right\}\)

\({f(x)}_{22}\)

Schwefel 22

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-100, 100]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}\left|{x}_{i}\right|+\prod_{i=1}^{d}\left|{x}_{i}\right|\)

\({f(x)}_{23}\)

Step

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-100, 100]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}|{x}_{i}^{2}|\)

\({f(x)}_{24}\)

Stybtang

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=-39.1659n; {\mathbf{x}}^{\mathbf{*}}=\left(-2.90,\dots , -2.90\right)\)

\({[-5, 5]}^{d}\)

30

\(f\left(x\right)=\frac{1}{2}\sum\limits_{i=1}^{d}({x}_{i}^{4}-16{x}_{i}^{2}+5{x}_{i})\)

\({f(x)}_{25}\)

Trid

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=-n(n+4)(n-1)/6; {\mathbf{x}}_{i}^{\mathbf{*}}=i(n+1-i) \text{ for } i=1,\dots ,n\)

\({[{-d}^{2}, {d}^{2}]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}{({x}_{i}-1)}^{2}-\sum\limits_{i=2}^{d}{x}_{i}{x}_{i-1}\)

\({f(x)}_{26}\)

Vincent

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=-1; {\mathbf{x}}^{\mathbf{*}}=\left(7.70,\dots , 7.70\right)\)

\({[0.25, 10]}^{d}\)

30

\(f\left(x\right)=-\frac{1}{n}\sum\limits_{i=1}^{n}{\text{sin}}[10{\text{log}}({x}_{i})]\)

\({f(x)}_{27}\)

Zakharov

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-5, 10]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}{x}_{i}^{2}+{\left(\sum\limits_{i=1}^{d}0.5i{x}_{i}\right)}^{2}+{\left(\sum\limits_{i=1}^{d}0.5i{x}_{i}\right)}^{4}\)

\({f(x)}_{28}\)

Sphere

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-5, 5]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}{x}_{i}^{2}\)

\({f(x)}_{29}\)

Sumpow

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-1, 1]}^{d}\)

30

\(f\left(x\right)=\sum\limits_{i=1}^{d}{|{x}_{i}|}^{i+1}\)

\({f(x)}_{30}\)

Rastrigin + Schwefel22 + Sphere

\(f\left({\mathbf{x}}^{\mathbf{*}}\right)=0; {\mathbf{x}}^{\mathbf{*}}=\left(0,\dots , 0\right)\)

\({[-100, 100]}^{d}\)

30

\(f\left(x\right)=\left[10d+\sum\limits_{i=1}^{d}[{x}_{i}^{2}-10{\text{cos}}(2\pi {x}_{i})]\right]+\left[\sum\limits_{i=1}^{d}\left|{x}_{i}\right|+\prod_{i=1}^{d}\left|{x}_{i}\right|\right]+\left[\sum\limits_{i=1}^{d}{x}_{i}^{2}\right]\)
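As a sanity check on the table, a minimal Python sketch of three of the tabulated benchmarks (Sphere, Rastrigin, and Griewank), each evaluated at its listed optimum \({\mathbf{x}}^{\mathbf{*}}=\left(0,\dots ,0\right)\); the function bodies are straightforward transcriptions of the formulas above.

```python
import numpy as np

def sphere(x):
    # Sphere: sum of squares
    return float(np.sum(x**2))

def rastrigin(x):
    # Rastrigin: 10d + sum(x_i^2 - 10 cos(2*pi*x_i))
    d = x.size
    return float(10 * d + np.sum(x**2 - 10 * np.cos(2 * np.pi * x)))

def griewank(x):
    # Griewank: sum(x_i^2 / 4000) - prod(cos(x_i / sqrt(i))) + 1
    i = np.arange(1, x.size + 1)
    return float(np.sum(x**2) / 4000 - np.prod(np.cos(x / np.sqrt(i))) + 1)

x_star = np.zeros(30)  # reported minimizer for all three, with D = 30
values = [sphere(x_star), rastrigin(x_star), griewank(x_star)]  # all 0.0
```

Each call returns the reported minimum of 0 at the origin, matching the table entries.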

Rights and permissions

Springer Nature or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.

Reprints and permissions

About this article


Cite this article

Escobar-Cuevas, H., Cuevas, E., Avila, K. et al. An advanced initialization technique for metaheuristic optimization: a fusion of Latin hypercube sampling and evolutionary behaviors. Comp. Appl. Math. 43, 234 (2024). https://doi.org/10.1007/s40314-024-02744-0

