Memetic Computing (2009) 1:175

Lamarckian memetic algorithms: local optimum and connectivity structure analysis

  • Minh Nghia Le
  • Yew-Soon Ong
  • Yaochu Jin
  • Bernhard Sendhoff
Regular Research Paper

Abstract

Memetic algorithms (MAs) represent an emerging field that has attracted increasing research interest in recent years. Despite the popularity of the field, we still know rather little about the search mechanisms of MAs. Given the limited progress made on revealing the intrinsic properties of commonly used complex benchmark problems and the working mechanisms of Lamarckian memetic algorithms in general nonlinear programming, we introduce in this work, for the first time, the concept of local optimum structure and generalize the notion of neighborhood to a connectivity structure for the analysis of MAs. Based on these two proposed concepts, we analyze the solution quality and computational efficiency of the core search operators in Lamarckian memetic algorithms. Subsequently, the local optimum structure of several representative and complex benchmark problems is studied to reveal the effects of individual learning on the fitness landscape and to gain clues into the success or failure of MAs. The connectivity structure of the local optima induced by different memes, or individual learning procedures, in Lamarckian MAs is also investigated on the benchmark problems to understand the effects of the choice of meme in MA design.
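For readers less familiar with the Lamarckian variant analyzed in this paper, the sketch below shows one generation of a generic Lamarckian MA in Python. It is a minimal illustration, not the algorithm studied here: the sphere fitness function, the Gaussian-mutation reproduction operator, the simple stochastic hill climber standing in for the meme, and all parameter values are assumptions made for the example. The essential Lamarckian feature is that the genotype refined by individual learning, together with its fitness, is written back into the population.

```python
import numpy as np

def f(x):
    # Placeholder fitness (sphere, minimization); replace with the objective of interest.
    return np.sum(x ** 2)

def hill_climb(x, fx, step=0.1, iters=50, rng=None):
    """A simple meme: accept random Gaussian perturbations that improve fitness (illustrative only)."""
    rng = rng or np.random.default_rng()
    for _ in range(iters):
        y = x + rng.normal(0.0, step, size=x.shape)
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return x, fx

def lamarckian_generation(pop, fit, rng):
    n_pop = len(pop)
    new_pop, new_fit = [], []
    for _ in range(n_pop):
        # Selection: binary tournament on the current population.
        i, j = rng.integers(0, n_pop, size=2)
        parent = pop[i] if fit[i] < fit[j] else pop[j]
        # Reproduction: Gaussian mutation of the selected parent.
        child = parent + rng.normal(0.0, 0.3, size=parent.shape)
        # Individual learning with Lamarckian writeback: the learned genotype
        # AND its fitness replace those of the child.
        child, f_child = hill_climb(child, f(child), rng=rng)
        new_pop.append(child)
        new_fit.append(f_child)
    return np.array(new_pop), np.array(new_fit)

rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, size=(20, 10))        # N = 20 individuals, n = 10 dimensions
fit = np.array([f(x) for x in pop])
for gen in range(30):
    pop, fit = lamarckian_generation(pop, fit, rng)
print("best fitness:", fit.min())
```

In the notation used later in the nomenclature, the tournament plays the role of the selection operator, the Gaussian mutation that of the reproduction operator, and hill_climb that of the individual learning operator.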

Keywords

Memetic algorithms · Lamarckian evolution · Search dynamics · Fitness distance correlation · Experimental analysis · Numerical optimization

Nomenclature

f(x): Objective or fitness function
x*: Global optimum
x(i): ith element of vector x
r^(t)(y|x): Conditional probability density of producing offspring y from parent x at generation t
d(x, y): Euclidean distance \({\|{\bf x}-{\bf y}\|=\sqrt{\sum_{i=1}^{n}{(x_i - y_i)^2}}}\) between x and y
Ψ: Set of local optima
B_v: Basin of attraction of local optimum v
p_v(x): Probability of converging to local optimum v from x by means of individual learning
T(x, y): Probability of converging to local optimum y from x by means of reproduction and individual learning
C(x′, x″): Computational effort incurred to arrive at x″ from x′ by means of individual learning
E[·|P]: Expectation of a measure conditioned on population P
C_v: Maximum computational effort required to converge to the local optimum starting from any point within its basin of attraction B_v
n: Number of dimensions
N: Population size
S(·): Selection operator
R(·): Reproduction operator
IL(·): Individual learning operator
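
To make the nomenclature above more concrete, the following sketch gives one possible empirical estimate of p_v(x), the fraction of individual-learning runs started from x that converge to each local optimum v. Everything in it is an illustrative assumption (the Rastrigin objective, the stochastic hill climber used as the learning procedure, the distance tolerance for deciding that two converged points are the same optimum, and the number of trials); it is not the estimator used in the paper.

```python
import numpy as np

def f(x):
    # Illustrative multimodal objective (Rastrigin); local minima lie near integer lattice points.
    return 10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def individual_learning(x, step=0.05, iters=500, rng=None):
    """Stochastic hill climber standing in for the individual learning operator."""
    rng = rng or np.random.default_rng()
    fx = f(x)
    for _ in range(iters):
        y = x + rng.normal(0.0, step, size=x.shape)
        fy = f(y)
        if fy < fx:
            x, fx = y, fy
    return x

def estimate_p_v(x0, trials=100, tol=0.3, seed=0):
    """Estimate p_v(x0): fraction of learning runs from x0 that end at each local optimum v."""
    rng = np.random.default_rng(seed)
    optima, counts = [], []
    for _ in range(trials):
        x_end = individual_learning(x0.copy(), rng=rng)
        for k, v in enumerate(optima):
            if np.linalg.norm(x_end - v) < tol:   # same optimum as one seen before
                counts[k] += 1
                break
        else:
            optima.append(x_end)                  # newly discovered local optimum
            counts.append(1)
    return [(v, c / trials) for v, c in zip(optima, counts)]

for v, p in estimate_p_v(np.array([0.7, -0.4])):
    print(np.round(v, 2), "p_v(x) ≈", p)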


Copyright information

© Springer-Verlag 2009

Authors and Affiliations

  • Minh Nghia Le (1)
  • Yew-Soon Ong (1)
  • Yaochu Jin (2)
  • Bernhard Sendhoff (2)

  1. Centre for Computational Intelligence, Division of Information Systems, School of Computer Engineering, Nanyang Technological University, Singapore
  2. Honda Research Institute Europe GmbH, Offenbach/Main, Germany
