Towards an Adaptive Multimeme Algorithm for Parameter Optimisation Suiting the Engineers’ Needs

  • Wilfried Jakob
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4193)


Important factors for the easy use of an Evolutionary Algorithm (EA) are a number of fitness calculations that is as low as possible, robustness, and as few strategy parameters as possible. Multimeme Algorithms (MMA) are good candidates for the first two properties. In this paper a cost-benefit-based approach is introduced for the adaptive control of both meme selection and the ratio between local and global search. The latter is achieved by adaptively adjusting the search intensity of the memes and the frequency of their usage. It is shown how the proposed kind of adaptation fills the gap left by previous work. Detailed experiments in the field of continuous parameter optimisation demonstrate the superiority of the adaptive MMA over the simple MA and the pure EA.
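The cost-benefit-based control described above can be illustrated with a small sketch: each meme (local searcher) is credited with the fitness gain it produces and debited with the fitness evaluations it consumes, and its selection probability follows its benefit/cost ratio. This is an illustrative assumption of how such a scheme could look, not the paper's actual implementation; the class name `MemePool`, the neutral ratio for untried memes, and the probability floor are all invented for the example.

```python
import random

class MemePool:
    """Cost-benefit bookkeeping for adaptive meme selection (illustrative sketch)."""

    def __init__(self, meme_names, floor=0.05):
        # Per-meme totals: fitness gain achieved and evaluations spent.
        self.stats = {m: {"gain": 0.0, "evals": 0} for m in meme_names}
        self.floor = floor  # minimum selection probability, so no meme is starved

    def record(self, meme, fitness_gain, evaluations):
        """Book the outcome of one local-search run of `meme`."""
        s = self.stats[meme]
        s["gain"] += fitness_gain
        s["evals"] += evaluations

    def probabilities(self):
        """Selection probabilities proportional to gain per evaluation."""
        # Untried memes get a neutral ratio of 1.0 to encourage initial trials.
        ratios = {m: (s["gain"] / s["evals"] if s["evals"] else 1.0)
                  for m, s in self.stats.items()}
        total = sum(ratios.values()) or 1.0
        probs = {m: r / total for m, r in ratios.items()}
        # Enforce the floor, then renormalise to a proper distribution.
        probs = {m: max(p, self.floor) for m, p in probs.items()}
        norm = sum(probs.values())
        return {m: p / norm for m, p in probs.items()}

    def select(self, rng=random):
        """Draw the next meme according to the current cost-benefit ratios."""
        probs = self.probabilities()
        return rng.choices(list(probs), weights=list(probs.values()))[0]
```

A meme that achieves more improvement per evaluation is thereby applied more often, which is one way the balance between cheap and expensive local searchers (and hence between local and global search effort) can shift adaptively during the run.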


Keywords: Local Search, Evolutionary Algorithm, Strategy Parameter, Memetic Algorithm, Local Searcher




References

  1. Grefenstette, J.J.: Optimization of Control Parameters for Genetic Algorithms. IEEE Transactions on Systems, Man, and Cybernetics 16(1), 122–128 (1986)
  2. Srinivas, M., Patnaik, L.M.: Adaptive Probabilities of Crossover and Mutation in Genetic Algorithms. IEEE Trans. on Systems, Man, and Cybernetics 24(4), 17–26 (1994)
  3. Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter Control in Evolutionary Algorithms. IEEE Trans. on Evolutionary Computation 3(2), 124–141 (1999)
  4. Davis, L.L. (ed.): Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York (1991)
  5. Hart, W.E., Krasnogor, N., Smith, J.E. (eds.): Recent Advances in Memetic Algorithms. Studies in Fuzziness and Soft Computing, vol. 166. Springer, Berlin (2005)
  6. Jakob, W.: HyGLEAM – An Approach to Generally Applicable Hybridization of Evolutionary Algorithms. In: Guervós, J.J.M., Adamidis, P.A., Beyer, H.-G., Fernández-Villacañas, J.-L., Schwefel, H.-P. (eds.) PPSN 2002. LNCS, vol. 2439, pp. 527–536. Springer, Heidelberg (2002)
  7. Jakob, W.: A New Method for the Increased Performance of Evolutionary Algorithms by the Integration of Local Search Procedures. PhD thesis (in German), Univ. of Karlsruhe, FZKA 6965 (March 2004)
  8. Lienig, J., Brandt, H.: An Evolutionary Algorithm for the Routing of Multi Chip Modules. In: Davidor, Y., Männer, R., Schwefel, H.-P. (eds.) PPSN 1994. LNCS, vol. 866, pp. 588–597. Springer, Heidelberg (1994)
  9. Cox, J.L.A., Davis, L., Qiu, Y.: Dynamic Anticipatory Routing in Circuit-Switched Telecommunications Networks. In: [4], pp. 124–143 (1991)
  10. Krasnogor, N., Smith, J.E.: A Tutorial for Competent Memetic Algorithms: Model, Taxonomy, and Design Issues. IEEE Trans. on Evol. Comp. 9(5), 474–488 (2005)
  11. Moscato, P.: On Evolution, Search, Optimization, Genetic Algorithms and Martial Arts: Towards Memetic Algorithms. Tech. Rep. 826, Caltech Concurrent Computation Program, California Inst. of Technology, Pasadena, CA (1989)
  12. Hart, W.E.: Adaptive Global Optimization with Local Search. PhD thesis, University of California, San Diego, CA, USA (1994)
  13. Krasnogor, N.: Studies on the Theory and Design Space of Memetic Algorithms. PhD thesis, Faculty of Comput., Math. and Eng., Univ. of the West of England, Bristol, U.K. (2002)
  14. Ong, Y.S., Keane, A.J.: Meta-Lamarckian Learning in Memetic Algorithms. IEEE Trans. on Evolutionary Computation 8(2), 99–110 (2004)
  15. Krasnogor, N., Smith, J.E.: Emergence of Profitable Search Strategies Based on a Simple Inheritance Algorithm. In: Conf. Proc. GECCO 2001, pp. 432–439. Morgan Kaufmann, San Francisco (2001)
  16. Hinterding, R., Michalewicz, Z., Eiben, A.E.: Adaptation in Evolutionary Computation: A Survey. In: Conf. Proc. IEEE Conf. on Evol. Comp. (CEC 1997), pp. 65–69. IEEE Press, Los Alamitos (1997)
  17. Goldberg, D.E., Voessner, S.: Optimizing Global-Local Search Hybrids. In: Conf. Proc. GECCO 1999, pp. 220–228. Morgan Kaufmann, San Mateo (1999)
  18. Sinha, A., Chen, Y., Goldberg, D.E.: Designing Efficient Genetic and Evolutionary Algorithm Hybrids. In: [5], pp. 259–288 (2005)
  19. Lozano, M., Herrera, F., Krasnogor, N., Molina, D.: Real-Coded Memetic Algorithms with Crossover Hill-Climbing. Evolutionary Computation Journal 12(2), 273–302 (2004)
  20. Zitzler, E., Teich, J., Bhattacharyya, S.S.: Optimizing the Efficiency of Parameterized Local Search within Global Search: A Preliminary Study. In: Conf. Proc. CEC 2000, pp. 365–372. IEEE Press, Piscataway (2000)
  21. Bambha, N.K., Bhattacharyya, S.S., Zitzler, E., Teich, J.: Systematic Integration of Parameterized Local Search into Evolutionary Algorithms. IEEE Trans. on Evolutionary Computation 8(2), 137–155 (2004)
  22. Jakob, W., Blume, C., Bretthauer, G.: Towards a Generally Applicable Self-Adapting Hybridization of Evolutionary Algorithms. In: Deb, K., et al. (eds.) GECCO 2004. LNCS, vol. 3102, pp. 790–791. Springer, Heidelberg (2004)
  23. Smith, J.E.: Co-evolving Memetic Algorithms: A Learning Approach to Robust Scalable Optimisation. In: Conf. Proc. CEC 2003, pp. 498–505. IEEE Press, Piscataway (2003)
  24. Krasnogor, N., Blackburne, B.P., Burke, E.K., Hirst, J.D.: Multimeme Algorithms for Protein Structure Prediction. In: Guervós, J.J.M., Adamidis, P.A., Beyer, H.-G., Fernández-Villacañas, J.-L., Schwefel, H.-P. (eds.) PPSN 2002. LNCS, vol. 2439, pp. 769–778. Springer, Heidelberg (2002)
  25. Schwefel, H.-P.: Evolution and Optimum Seeking. John Wiley & Sons, New York (1995)
  26. Blume, C., Jakob, W.: GLEAM – An Evolutionary Algorithm for Planning and Control Based on Evolution Strategy. In: Cantú-Paz, E. (ed.) GECCO 2002, Late Breaking Papers, pp. 31–38 (2002)
  27. Gorges-Schleuter, M.: Genetic Algorithms and Population Structures – A Massively Parallel Algorithm. Dissertation, Dept. of Comp. Science, University of Dortmund (1990)
  28. Bäck, T.: GENEsYs 1.0 (1992)
  29. Sieber, I., Eggert, H., Guth, H., Jakob, W.: Design Simulation and Optimization of Microoptical Components. In: Bell, K.D., et al. (eds.) Proceedings of Novel Optical Systems and Large-Aperture Imaging. SPIE, vol. 3430, pp. 138–149 (1998)
  30. Blume, C., Gerbe, M.: Deutliche Senkung der Produktionskosten durch Optimierung des Ressourceneinsatzes (in German: Significant Reduction of Production Costs through Optimisation of Resource Deployment). atp 36, 5/94, Oldenbourg Verlag, München, pp. 25–29 (1994)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Wilfried Jakob, Institute for Applied Computer Science, Forschungszentrum Karlsruhe GmbH, Karlsruhe, Germany
