Performance Analysis of Dynamic Optimization Algorithms

  • Amir Nakib
  • Patrick Siarry
Part of the Studies in Computational Intelligence book series (SCI, volume 433)

Abstract

In recent years, dynamic optimization problems have attracted growing interest from the stochastic optimization community, and several approaches have been developed to address them. The goal of this chapter is to present the tools and benchmarks available for evaluating the performance of the proposed algorithms. Indeed, testing a new algorithm and comparing its performance with that of competing algorithms is an important and difficult step in the development process. The existence of benchmarks facilitates this step; their success, however, depends on their adoption by the community. In this chapter, we cite many test problems (restricting ourselves to the continuous case) and present only the two most prominent: the most widely used, the Moving Peaks Benchmark, and the most recently proposed, the generalized approach to constructing benchmark problems for dynamic optimization (also called the GDBG benchmark).
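To make the idea of such benchmarks concrete, the sketch below illustrates a Moving Peaks-style landscape in Python: a set of cone-shaped peaks whose positions, heights, and widths are perturbed at each environment change, so that an optimizer must relocate the moving global optimum. The class name, parameter ranges, and severity values are illustrative assumptions and do not reproduce the exact scenarios of Branke's original benchmark.

```python
import numpy as np

class MovingPeaksSketch:
    """Simplified Moving Peaks-style dynamic landscape (illustrative only)."""

    def __init__(self, dim=5, n_peaks=10, bounds=(0.0, 100.0),
                 shift_severity=1.0, height_severity=7.0, width_severity=1.0,
                 seed=0):
        self.rng = np.random.default_rng(seed)
        self.bounds = bounds
        self.shift_severity = shift_severity
        self.height_severity = height_severity
        self.width_severity = width_severity
        # Random initial peak centres, heights, and widths (illustrative ranges).
        self.centres = self.rng.uniform(bounds[0], bounds[1], size=(n_peaks, dim))
        self.heights = self.rng.uniform(30.0, 70.0, size=n_peaks)
        self.widths = self.rng.uniform(1.0, 12.0, size=n_peaks)

    def evaluate(self, x):
        # Cone-shaped peaks: height minus width times the distance to the
        # peak centre; the landscape value at x is the highest peak.
        dist = np.linalg.norm(self.centres - np.asarray(x), axis=1)
        return float(np.max(self.heights - self.widths * dist))

    def change(self):
        # One environment change: shift each centre by a random vector of
        # length shift_severity and add Gaussian noise to heights and widths.
        step = self.rng.normal(size=self.centres.shape)
        step *= self.shift_severity / np.linalg.norm(step, axis=1, keepdims=True)
        self.centres = np.clip(self.centres + step, self.bounds[0], self.bounds[1])
        self.heights += self.height_severity * self.rng.normal(size=self.heights.shape)
        self.widths = np.abs(self.widths + self.width_severity * self.rng.normal(size=self.widths.shape))


# Usage: evaluate candidate solutions, then trigger an environment change.
landscape = MovingPeaksSketch()
print(landscape.evaluate(np.full(5, 50.0)))
landscape.change()
print(landscape.evaluate(np.full(5, 50.0)))
```

Note that in the original benchmark the shift direction is also correlated with the previous move through a parameter λ; that detail is omitted here for brevity.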

Keywords

Global Optimum; Benchmark Problem; Dynamic Optimization; Dynamic Optimization Problem; Test Problem Generator

References

  1. Bird, S., Li, X.: Using regression to improve local convergence. In: Proc. Congr. Evol. Comput., Singapore, pp. 592–599. IEEE (2007)
  2. Blackwell, T., Branke, J.: Multi-swarms, exclusion and anti-convergence in dynamic environments. IEEE Transactions on Evolutionary Computation 10(4), 459–472 (2006)
  3. Branke, J.: The Moving Peaks Benchmark website (1999), http://www.aifb.uni-karlsruhe.de/~jbr/MovPeaks
  4. Eberhart, R.C., Shi, Y.: Computational intelligence: concepts to implementation. Elsevier (2007)
  5. Farina, M., Deb, K., Amato, P.: Dynamic multiobjective optimization problems: test cases, approximations, and applications. IEEE Transactions on Evolutionary Computation 8(5), 425–442 (2004)
  6. Jin, Y., Sendhoff, B.: Constructing Dynamic Optimization Test Problems Using the Multi-objective Optimization Concept. In: Raidl, G.R., Cagnoni, S., Branke, J., Corne, D.W., Drechsler, R., Jin, Y., Johnson, C.G., Machado, P., Marchiori, E., Rothlauf, F., Smith, G.D., Squillero, G. (eds.) EvoWorkshops 2004. LNCS, vol. 3005, pp. 525–536. Springer, Heidelberg (2004)
  7. Lepagnot, J., et al.: Performance analysis of MADO dynamic optimization algorithm. In: Proc. IEEE Adaptive Computing in Design and Manufacturing. Int. Conf. on Intelligent Systems Design and Applications, Pisa, pp. 37–42. IEEE (2009)
  8. Lepagnot, J., Nakib, A., Oulhadj, H., Siarry, P.: A new multiagent algorithm for dynamic continuous optimization. International Journal of Applied Metaheuristic Computing 1(1), 16–38 (2010)
  9. Lepagnot, J., Nakib, A., Oulhadj, H., Siarry, P.: Brain cine-MRI registration using MLSDO dynamic optimization algorithm. In: IXth Metaheuristics International Conference, pp. S1-25-1–S1-25-9 (2011)
  10. Li, C., Yang, M., Kang, L.: A New Approach to Solving Dynamic Traveling Salesman Problems. In: Wang, T.-D., et al. (eds.) SEAL 2006. LNCS, vol. 4247, pp. 236–243. Springer, Heidelberg (2006)
  11. Li, C., Yang, S.: A Generalized Approach to Construct Benchmark Problems for Dynamic Optimization. In: Li, X., Kirley, M., Zhang, M., Green, D., Ciesielski, V., Abbass, H.A., Michalewicz, Z., Hendtlass, T., Deb, K., Tan, K.C., Branke, J., Shi, Y. (eds.) SEAL 2008. LNCS, vol. 5361, pp. 391–400. Springer, Heidelberg (2008)
  12. Li, C., Yang, S., Nguyen, T.T., Yu, E.L., Yao, X., Jin, Y., Beyer, H.-G., Suganthan, P.N.: Benchmark generator for CEC 2009 competition on dynamic optimization. Technical report, University of Leicester, University of Birmingham, Nanyang Technological University (2008)
  13. Li, X., Branke, J., Blackwell, T.: Particle swarm with speciation and adaptation in a dynamic environment. In: Proc. Genetic Evol. Comput. Conf., Seattle, Washington, USA, pp. 51–58. ACM (2006)
  14. Liu, L., Yang, S., Wang, D.: Particle swarm optimization with composite particles in dynamic environments. IEEE Trans. Syst. Man Cybern. Part B 40(10), 1634–1648 (2010)
  15. Lung, R.I., Dumitrescu, D.: Collaborative evolutionary swarm optimization with a Gauss chaotic sequence generator. Innovations in Hybrid Intelligent Systems 44, 207–214 (2007)
  16. Lung, R.I., Dumitrescu, D.: ESCA: A new evolutionary-swarm cooperative algorithm. SCI, vol. 129, pp. 105–114 (2008)
  17. Morrison, R.W., De Jong, K.A.: A test problem generator for non-stationary environments. In: Proc. Congr. Evol. Comput., pp. 2047–2053 (1999)
  18. Moser, I., Chiong, R.: Dynamic function optimisation with hybridised extremal dynamics. Memetic Computing 2(2), 137–148 (2010)
  19. Moser, I., Hendtlass, T.: A simple and efficient multi-component algorithm for solving dynamic function optimisation problems. In: Proc. Congr. Evol. Comput., Singapore, pp. 252–259. IEEE (2007)
  20. Parrott, D., Li, X.: Locating and tracking multiple dynamic optima by a particle swarm model using speciation. IEEE Transactions on Evolutionary Computation 10(4), 440–458 (2006)
  21. Talbi, E.-G.: Metaheuristics: from design to implementation. John Wiley and Sons Inc. (2009)
  22. Yang, S.: Non-stationary problem optimization using the primal-dual genetic algorithm. In: Proc. Congr. Evol. Comput., Canberra, pp. 2246–2253. IEEE (2003)
  23. Yang, S., Li, C.: A clustering particle swarm optimizer for locating and tracking multiple optima in dynamic environments. IEEE Transactions on Evolutionary Computation (2010)
  24. Yang, S., Yao, X.: Experimental study on population-based incremental learning algorithms for dynamic optimization problems. Soft Computing – A Fusion of Foundations, Methodologies and Applications 9(11), 815–834 (2005)
  25. Yang, S., Yao, X.: Population-based incremental learning with associative memory for dynamic environments. IEEE Transactions on Evolutionary Computation 12(5), 542–562 (2008)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  1. Laboratoire Images, Signaux et Systèmes Intelligents (LISSI, EA 3956), Université Paris Est Créteil, Créteil, France
