Memetic Computing, Volume 5, Issue 1, pp 3–18

A framework for finding robust optimal solutions over time

  • Yaochu Jin
  • Ke Tang
  • Xin Yu
  • Bernhard Sendhoff
  • Xin Yao
Regular research paper


Dynamic optimization problems (DOPs) are those whose specifications change over time, resulting in changing optima. Most research on DOPs has so far concentrated on tracking the moving optima (TMO) as closely as possible. In practice, however, it can be very costly, if not impossible, to keep changing the design whenever the environment changes. To address DOPs more practically, we recently introduced a conceptually new problem formulation, referred to as robust optimization over time (ROOT). Under ROOT, an optimization algorithm aims to find an acceptable (optimal or sub-optimal) solution that changes slowly over time, rather than to track the moving global optimum. In this paper, we propose a generic framework for solving DOPs using the ROOT concept, which searches for solutions that are robust over time by means of local fitness approximation and prediction. Empirical studies comparing several representative TMO approaches with an instantiation of the proposed framework are conducted on a number of test problems to demonstrate the advantage of the proposed framework in the ROOT context.
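The distinction between tracking the moving optimum and ROOT can be illustrated with a minimal sketch. The toy landscape, the drift model, and the window-based worst-case score below are illustrative assumptions, not the paper's actual framework (which uses local fitness approximation and prediction rather than the true future fitness):

```python
def dynamic_fitness(x, t):
    # Toy time-varying objective: a single peak whose optimum
    # drifts to the right as time t advances (illustrative only).
    return 1.0 - (x - 0.1 * t) ** 2

def root_fitness(x, t, window=5):
    # ROOT-style score for a fixed solution x: its worst fitness
    # over the next `window` environments. A high score means x
    # stays acceptable without being redesigned at every change.
    return min(dynamic_fitness(x, t + k) for k in range(window))

# A TMO approach would pick the current optimum (x = 0 at t = 0),
# which degrades as the peak moves; a ROOT approach prefers a point
# that remains good across the whole window.
tracking_choice = 0.0
robust_choice = 0.2
print(root_fitness(tracking_choice, 0))
print(root_fitness(robust_choice, 0))
```

In this sketch the robust point scores higher over the window than the instantaneously optimal one; in the actual framework, the future fitness values would have to be estimated from an approximation model and a time-series predictor, since they are unknown at decision time.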


Keywords: Robust optimisation over time (ROOT) · Dynamic optimisation · Evolutionary algorithms · Particle swarm optimisation · Fitness approximation





Copyright information

© Springer-Verlag 2012

Authors and Affiliations

  • Yaochu Jin (1, 2)
  • Ke Tang (1)
  • Xin Yu (1)
  • Bernhard Sendhoff (3)
  • Xin Yao (1, 4)

  1. Nature Inspired Computation and Applications Laboratory (NICAL), School of Computer Science and Technology, University of Science and Technology of China, Hefei, China
  2. Department of Computing, University of Surrey, Surrey, UK
  3. Honda Research Institute Europe, Offenbach, Germany
  4. Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA), School of Computer Science, University of Birmingham, Birmingham, UK
