
A chaotic salp swarm algorithm based on quadratic integrate and fire neural model for function optimization

  • Regular Paper
  • Published in: Progress in Artificial Intelligence

Abstract

Real-world problems generally lack mathematical features such as differentiability and convexity and therefore require non-traditional approaches to find optimal solutions. The salp swarm algorithm (SSA) is a meta-heuristic optimization algorithm based on the swimming behaviour of salps. Though a novel idea, it suffers from a slow convergence rate towards the optimal solution, owing to a lack of diversity in the salp population. To improve its performance, chaotic oscillations generated from the quadratic integrate and fire neural model have been incorporated into SSA. This improves the balance between exploration and exploitation and generates diversity in the salp population, thus avoiding entrapment in local optima. The resulting chaotic salp swarm algorithm (CSSA) has been tested against twenty-two benchmark functions, and its performance has been compared with existing standard optimization algorithms, namely particle swarm optimization, ant lion optimization and the salp swarm algorithm. Statistical tests have been carried out to establish the superiority of CSSA over the other three algorithms. Finally, CSSA has been applied to three engineering problems to demonstrate its applicability to real-life problems.
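The quadratic integrate-and-fire (QIF) model with adaptation mentioned above can generate the oscillatory sequences used to perturb the salp population. A minimal Euler-integration sketch in Python follows; the dynamics use the common QIF-with-adaptation form, and all parameter values (`current`, `a`, `b`, `d`, the thresholds and the step size) are illustrative assumptions, not the paper's settings:

```python
def qif_sequence(n, v0=0.0, w0=0.0, current=0.5, a=0.02, b=1.0, d=1.0,
                 v_th=1.0, v_reset=-1.0, dt=0.01):
    """Generate n membrane-potential samples from a quadratic
    integrate-and-fire neuron with adaptation (Euler integration).
    All parameter values here are illustrative assumptions."""
    v, w = v0, w0
    out = []
    for _ in range(n):
        dv = v * v + current - w   # quadratic membrane dynamics
        dw = a * (b * v - w)       # slow adaptation variable
        v += dt * dv
        w += dt * dw
        if v >= v_th:              # spike-and-reset rule
            v = v_reset
            w += d                 # spike-triggered adaptation jump
        out.append(v)
    return out

seq = qif_sequence(1000)
lo, hi = min(seq), max(seq)
# rescale to [0, 1] so the sequence can perturb a search agent's position
chaos = [(s - lo) / (hi - lo + 1e-12) for s in seq]
```

Rescaling the trace to [0, 1] makes it usable as a multiplier on a search agent's step, which is one common way chaotic maps are injected into swarm algorithms.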



Author information

Correspondence to Santosh Kumar Majhi.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Benchmark functions

In this section, the benchmark functions listed in Table 2 are described.

  • Function F1

    • Mathematical expression: \(\sum _{i=1}^{n} x_i^2\)

    • Lower bound: \(-100\)

    • Upper bound: 100

    • Dimensions: 30
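As a quick sanity check, F1 (the sphere function) translates directly into code; a minimal Python sketch:

```python
def f1(x):
    # F1 (sphere): sum of squares; global minimum 0 at the origin
    return sum(xi * xi for xi in x)

print(f1([0.0] * 30))   # value at the global optimum
```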

  • Function F2

    • Mathematical expression: \(\sum _{i=1}^{n} \left| x_i \right| + \prod _{i=1}^{n} \left| x_i \right| \)

    • Lower bound: \(-10\)

    • Upper bound: 10

    • Dimensions: 10

  • Function F3

    • Mathematical expression: \(\sum _{i=1}^{n}\left( \sum _{j=1}^{i} x_j\right) ^2\)

    • Lower bound: \(-100\)

    • Upper bound: 100

    • Dimensions: 10

  • Function F4

    • Mathematical expression: \(\max _i\{ \left| x_i \right| , 1\le i \le n \}\)

    • Lower bound: \(-100\)

    • Upper bound: 100

    • Dimensions: 10

  • Function F5

    • Mathematical expression: \(\sum _{i=1}^{n-1}[100(x_{i+1}-x_{i}^2)^2 +(x_i -1)^2 ]\)

    • Lower bound: \(-30\)

    • Upper bound: 30

    • Dimensions: 10

  • Function F6

    • Mathematical expression: \(\sum _{i=1}^{n}([x_i +0.5])^2\)

    • Lower bound: \(-100\)

    • Upper bound: 100

    • Dimensions: 10

  • Function F7

    • Mathematical expression: \(\sum _{i=1}^{n} ix_i^4+\hbox {random}[0,1)\)

    • Lower bound: \(-1.28\)

    • Upper bound: 1.28

    • Dimensions: 10

  • Function F8

    • Mathematical expression: \(\sum _{i=1}^{n} -x_i \sin \left( \sqrt{\left| x_i \right| }\right) \)

    • Lower bound: \(-500\)

    • Upper bound: 500

    • Dimensions: 10

  • Function F9

    • Mathematical expression: \(\sum _{i=1}^{n} [x_i^2-10\cos (2\pi x_i)+10]\)

    • Lower bound: \(-5.12\)

    • Upper bound: 5.12

    • Dimensions: 10

  • Function F10

    • Mathematical expression: \(-20\exp \left( -0.2\sqrt{\frac{1}{n}\sum _{i=1}^{n}x_i^2} \right) -\exp \left( \frac{1}{n}\sum _{i=1}^{n}\cos (2\pi x_i) \right) +20 +e\)

    • Lower bound: \(-32\)

    • Upper bound: 32

    • Dimensions: 10
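F10 is the Ackley function, whose exponential and cosine terms cancel exactly at the origin; a minimal Python sketch of the expression above:

```python
import math

def f10(x):
    # F10 (Ackley): global minimum 0 at the origin
    n = len(x)
    s1 = sum(xi * xi for xi in x) / n
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / n
    return -20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2) + 20 + math.e
```

At the origin the two exponentials evaluate to 1 and e, so the constants 20 and e cancel them and the value is exactly 0.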

  • Function F11

    • Mathematical expression: \(\frac{1}{4000} \sum _{i=1}^{n}x_i^2 - \prod _{i=1}^{n}\cos \left( \frac{x_i}{\sqrt{i}} \right) +1\)

    • Lower bound: \(-600\)

    • Upper bound: 600

    • Dimensions: 10

  • Function F12

    • Mathematical expression: \(\frac{\pi }{n} \left\{ 10 \sin ^2 (\pi y_1) +\sum _{i=1}^{n-1}(y_i -1)^2 [1 + 10 \sin ^2 (\pi y_{i+1})]+(y_n-1)^2\right\} + \sum _{i=1}^{n} u(x_i,10,100,4)\)

      \(y_i=1+\frac{x_i+1}{4}\)

      \(u(x_i,a,k,m)=\left\{ \begin{array}{ll} k(x_i-a)^m &{} x_i >a\\ 0 &{} -a<x_i<a\\ k(-x_i-a)^m &{} x_i < -a \end{array}\right. \)

    • Lower bound: \(-50\)

    • Upper bound: 50

    • Dimensions: 10
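The piecewise penalty function \(u(x_i,a,k,m)\) shared by F12 and F13 translates directly into code; a minimal Python sketch:

```python
def u(x, a, k, m):
    # Boundary penalty used by F12 and F13: zero inside [-a, a],
    # polynomial penalty of degree m outside.
    if x > a:
        return k * (x - a) ** m
    if x < -a:
        return k * (-x - a) ** m
    return 0.0
```

The penalty grows as the m-th power of the distance past the box \([-a, a]\), which discourages but does not forbid out-of-range coordinates.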

  • Function F13

    • Mathematical expression:

      \(0.1 \left\{ \sin ^2 (3 \pi x_1) + \sum _{i=1}^{n-1} (x_i-1)^2[1+\sin ^2(3 \pi x_{i+1})]+(x_n -1)^2 [1+\sin ^2(2 \pi x_n)]\right\} + \sum _{i=1}^{n} u(x_i,5,100,4)\)

      \(y_i=1+ \frac{x_i + 1 }{4}\)

      \(u(x_i,a,k,m)=\left\{ \begin{array}{ll} k(x_i-a)^m &{} x_i >a\\ 0 &{} -a<x_i<a\\ k(-x_i-a)^m &{} x_i < -a \end{array}\right. \)

    • Lower bound: \(-50\)

    • Upper bound: 50

    • Dimensions: 10

  • Function F14

    • Pseudocode:

      aS = [-32 -16 0 16 32 -32 -16 0 16 32 -32 -16 0 16 32 -32 -16 0 16 32 -32 -16 0 16 32; ...
            -32 -32 -32 -32 -32 -16 -16 -16 -16 -16 0 0 0 0 0 16 16 16 16 16 32 32 32 32 32];
      for j = 1:25
          bS(j) = sum((x' - aS(:,j)).^6);
      end
      o = (1/500 + sum(1./([1:25] + bS)))^(-1);

    • Lower bound: \(-65.536\)

    • Upper bound: 65.536

    • Dimensions: 2

  • Function F15

    • Pseudocode:

      aK = [.1957 .1947 .1735 .16 .0844 .0627 .0456 .0342 .0323 .0235 .0246];
      bK = [.25 .5 1 2 4 6 8 10 12 14 16];
      bK = 1./bK;
      o = sum((aK - ((x(1).*(bK.^2 + x(2).*bK))./(bK.^2 + x(3).*bK + x(4)))).^2);

    • Lower bound: \(-5\)

    • Upper bound: 5

    • Dimensions: 4

  • Function F16

    • Pseudocode:

      o = 4*(x(1)^2) - 2.1*(x(1)^4) + (x(1)^6)/3 + x(1)*x(2) - 4*(x(2)^2) + 4*(x(2)^4);

    • Lower bound: \(-5\)

    • Upper bound: 5

    • Dimensions: 2

  • Function F17

    • Pseudocode:

      o = (1 + (x(1)+x(2)+1)^2 * (19 - 14*x(1) + 3*(x(1)^2) - 14*x(2) + 6*x(1)*x(2) + 3*(x(2)^2))) * ...
          (30 + (2*x(1)-3*x(2))^2 * (18 - 32*x(1) + 12*(x(1)^2) + 48*x(2) - 36*x(1)*x(2) + 27*(x(2)^2)));

    • Lower bound: \(-2\)

    • Upper bound: 2

    • Dimensions: 2

  • Function F18

    • Pseudocode:

      aH = [3 10 30; .1 10 35; 3 10 30; .1 10 35];
      cH = [1 1.2 3 3.2];
      pH = [.3689 .117 .2673; .4699 .4387 .747; .1091 .8732 .5547; .03815 .5743 .8828];
      o = 0;
      for i = 1:4
          o = o - cH(i)*exp(-(sum(aH(i,:).*((x - pH(i,:)).^2))));
      end

    • Lower bound: 0

    • Upper bound: 1

    • Dimensions: 3

  • Function F19

    • Pseudocode:

      aSH = [4 4 4 4; 1 1 1 1; 8 8 8 8; 6 6 6 6; 3 7 3 7; 2 9 2 9; 5 5 3 3; 8 1 8 1; 6 2 6 2; 7 3.6 7 3.6];
      cSH = [.1 .2 .2 .4 .4 .6 .3 .7 .5 .5];
      o = 0;
      for i = 1:5
          o = o - ((x - aSH(i,:))*(x - aSH(i,:))' + cSH(i))^(-1);
      end

    • Lower bound: 0

    • Upper bound: 10

    • Dimensions: 4

  • Function F20

    • Pseudocode:

      aH = [10 3 17 3.5 1.7 8; .05 10 17 .1 8 14; 3 3.5 1.7 10 17 8; 17 8 .05 10 .1 14];
      cH = [1 1.2 3 3.2];
      pH = [.1312 .1696 .5569 .0124 .8283 .5886; .2329 .4135 .8307 .3736 .1004 .9991; ...
            .2348 .1415 .3522 .2883 .3047 .6650; .4047 .8828 .8732 .5743 .1091 .0381];
      o = 0;
      for i = 1:4
          o = o - cH(i)*exp(-(sum(aH(i,:).*((x - pH(i,:)).^2))));
      end

    • Lower bound: 0

    • Upper bound: 1

    • Dimensions: 6

  • Function F21

    • Pseudocode:

      aSH = [4 4 4 4; 1 1 1 1; 8 8 8 8; 6 6 6 6; 3 7 3 7; 2 9 2 9; 5 5 3 3; 8 1 8 1; 6 2 6 2; 7 3.6 7 3.6];
      cSH = [.1 .2 .2 .4 .4 .6 .3 .7 .5 .5];
      o = 0;
      for i = 1:7
          o = o - ((x - aSH(i,:))*(x - aSH(i,:))' + cSH(i))^(-1);
      end

    • Lower bound: 0

    • Upper bound: 10

    • Dimensions: 4

  • Function F22

    • Pseudocode:

      aSH = [4 4 4 4; 1 1 1 1; 8 8 8 8; 6 6 6 6; 3 7 3 7; 2 9 2 9; 5 5 3 3; 8 1 8 1; 6 2 6 2; 7 3.6 7 3.6];
      cSH = [.1 .2 .2 .4 .4 .6 .3 .7 .5 .5];
      o = 0;
      for i = 1:10
          o = o - ((x - aSH(i,:))*(x - aSH(i,:))' + cSH(i))^(-1);
      end

    • Lower bound: 0

    • Upper bound: 10

    • Dimensions: 4
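The last three Shekel-type functions differ only in how many terms are summed (5, 7 and 10 in the loops above). A compact Python sketch of the shared evaluator, using the same `aSH`/`cSH` data as the pseudocode:

```python
# Shekel data, copied from the aSH and cSH arrays above
A = [[4, 4, 4, 4], [1, 1, 1, 1], [8, 8, 8, 8], [6, 6, 6, 6],
     [3, 7, 3, 7], [2, 9, 2, 9], [5, 5, 3, 3], [8, 1, 8, 1],
     [6, 2, 6, 2], [7, 3.6, 7, 3.6]]
C = [.1, .2, .2, .4, .4, .6, .3, .7, .5, .5]

def shekel(x, m):
    # Shekel function with m terms; m = 5, 7, 10 reproduces the
    # three variants defined above
    total = 0.0
    for i in range(m):
        dist = sum((xj - aj) ** 2 for xj, aj in zip(x, A[i]))
        total -= 1.0 / (dist + C[i])
    return total
```

Near \(x = (4, 4, 4, 4)\) the first term alone contributes about \(-1/0.1 = -10\), which is why the minima of all three variants sit close to that point.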


About this article


Cite this article

Majhi, S.K., Mishra, A. & Pradhan, R. A chaotic salp swarm algorithm based on quadratic integrate and fire neural model for function optimization. Prog Artif Intell 8, 343–358 (2019). https://doi.org/10.1007/s13748-019-00184-0

