Computational Optimization and Applications, Volume 56, Issue 2, pp 481–502

The continuous differential ant-stigmergy algorithm for numerical optimization

Abstract

Many promising algorithms for solving numerical optimization problems come from population-based metaheuristics. Some of these are swarm-intelligence algorithms, inspired by the collective behavior of social organisms. One of the most successful is the Differential Ant-Stigmergy Algorithm (DASA), which uses stigmergy, a method of communication in emergent systems in which the individual parts of the system (artificial ants) communicate with one another by modifying their local environment (pheromone intensity). The main characteristic of the DASA is its underlying structure (the pheromone graph), which uses discrete steps to move through a continuous search space. As a consequence, movement through the search space is restricted and the algorithm's time/space complexity is increased. To overcome this problem, an improved algorithm called the Continuous Differential Ant-Stigmergy Algorithm (CDASA) is proposed and benchmarked on standard benchmark functions. The benchmarking shows that the CDASA performs better than the DASA, especially at lower dimensionalities, that its time/space complexity is decreased, and that the algorithm code is simplified. As such, the CDASA is more suitable for parallel implementation on general-purpose graphics processing units. Compared to the swarm-intelligence algorithms presented in this paper, the CDASA is the best-performing algorithm, and it is competitive with state-of-the-art algorithms from other metaheuristic approaches.
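The general pattern described above, artificial ants proposing parameter differences whose step sizes are governed by a continuously updated pheromone model, can be illustrated with a minimal sketch. This is not the authors' CDASA: the Gaussian pheromone model, the reinforcement/evaporation rule, and all names and parameters below are assumptions chosen only to convey the idea of stigmergy-style search over a continuous space.

```python
# Minimal illustrative sketch of pheromone-guided continuous search.
# NOT the authors' CDASA; the update rules and parameters are assumptions
# used only to illustrate stigmergy-style sampling of parameter differences.
import numpy as np


def sphere(x):
    """Toy objective: sum of squares (global minimum 0 at the origin)."""
    return float(np.sum(x ** 2))


def pheromone_guided_search(f, dim, n_ants=10, max_evals=20000, rho=0.1, seed=0):
    rng = np.random.default_rng(seed)
    best_x = rng.uniform(-5.0, 5.0, dim)   # current best solution
    best_f = f(best_x)
    scale = np.full(dim, 1.0)              # continuous "pheromone": per-parameter step scale
    evals = 1
    while evals < max_evals:
        improved = False
        for _ in range(n_ants):
            # Each artificial ant samples a parameter-difference vector from the
            # pheromone-controlled Gaussian and adds it to the current best.
            delta = rng.normal(0.0, scale, dim)
            x = best_x + delta
            fx = f(x)
            evals += 1
            if fx < best_f:
                best_x, best_f = x, fx
                improved = True
        # Stigmergic update: reinforce (widen) the step scales after an
        # improvement, evaporate (shrink) them otherwise.
        scale *= (1.0 + rho) if improved else (1.0 - rho)
    return best_x, best_f


if __name__ == "__main__":
    x, fx = pheromone_guided_search(sphere, dim=10)
    print("best value:", fx)
```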

Keywords

Metaheuristics · Stigmergy · Ant-colony optimization · Swarm intelligence · Numerical optimization

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

Computer Systems Department, Jožef Stefan Institute, Ljubljana, Slovenia