Memetic Computing, Volume 3, Issue 2, pp 111–127

Genetic algorithm with automatic termination and search space rotation

Regular Research Paper

Abstract

In the last two decades, numerous evolutionary algorithms (EAs) have been developed for solving optimization problems. However, only a few works have focused on the question of termination criteria: EAs still rely on termination criteria prespecified by the user. In this paper, we develop a genetic algorithm (GA) with automatic termination and acceleration elements which allow the search to end without resorting to predefined conditions. We call this algorithm “Genetic Algorithm with Automatic Termination and Search Space Rotation”, abbreviated as GATR. The algorithm utilizes the so-called “Gene Matrix” (GM) to equip the search process with a self-check that judges how much exploration has been performed, while maintaining population diversity. It also implements a mutation operator called “mutagenesis” to achieve faster and more efficient exploration and exploitation. Moreover, GATR fully exploits the structure of the GM through a novel search space decomposition mechanism combined with a search space rotation procedure, so that the search operates strictly within two-dimensional subspaces irrespective of the dimension of the original problem. Computational experiments and comparisons with some state-of-the-art EAs demonstrate the effectiveness of the automatic termination criteria and the space decomposition mechanism of GATR.
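The abstract describes the Gene Matrix only at a high level. The following Python sketch illustrates one plausible reading of the idea, assuming the GM partitions each variable's range into m equal sub-ranges and flags those already visited; a full GM then signals automatic termination, and unvisited sub-ranges suggest targets for the mutagenesis operator. Class and method names (GeneMatrix, update, is_full, unexplored) are hypothetical illustrations, not the authors' implementation.

```python
import numpy as np

class GeneMatrix:
    """Sketch of a Gene Matrix: one row per variable, one column per sub-range."""

    def __init__(self, lower, upper, m=10):
        self.lower = np.asarray(lower, dtype=float)   # lower bound of each variable
        self.upper = np.asarray(upper, dtype=float)   # upper bound of each variable
        self.m = m                                    # number of sub-ranges per variable
        self.gm = np.zeros((len(self.lower), m), dtype=bool)

    def update(self, population):
        # Mark the sub-range hit by each gene of each individual.
        for x in population:
            x = np.asarray(x, dtype=float)
            idx = ((x - self.lower) / (self.upper - self.lower) * self.m).astype(int)
            idx = np.clip(idx, 0, self.m - 1)
            self.gm[np.arange(len(x)), idx] = True

    def is_full(self):
        # Automatic termination signal: every sub-range of every variable
        # has been visited at least once.
        return bool(self.gm.all())

    def unexplored(self, dim):
        # Sub-ranges of variable `dim` not yet visited; a mutagenesis-style
        # operator could mutate a gene into one of these.
        return np.flatnonzero(~self.gm[dim])
```

A GA loop would call update() on each new population and stop once is_full() returns True, rather than after a user-specified number of generations.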

Keywords

Genetic algorithms · Termination criteria · Gene matrix · Mutagenesis · Space rotation · Space decomposition

Copyright information

© Springer-Verlag 2011

Authors and Affiliations

  1. Department of Applied Mathematics and Physics, Graduate School of Informatics, Kyoto University, Kyoto, Japan