Raising the Dead: Extending Evolutionary Algorithms with a Case-Based Memory

  • Jeroen Eggermont
  • Tom Lenaerts
  • Sanna Pöyhönen
  • Alexandre Termier
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2038)


In dynamically changing environments, the performance of a standard evolutionary algorithm deteriorates. This is because the population, which is considered to contain the history of the evolutionary process, does not hold enough information for the algorithm to react adequately to changes in the fitness landscape. We therefore added a simple, global case-based memory to the process to keep track of interesting historical events. Through the introduction of this memory and a storage and replacement scheme, we were able to improve the reaction capabilities of an evolutionary algorithm with a periodically changing fitness function.
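The idea can be sketched in a few lines of code. The following is a minimal illustration, not the authors' implementation: the bit-string genome, the periodically flipping fitness, and all parameter values and function names are assumptions chosen only to make the mechanism concrete. A global memory stores the best individuals seen so far; when the landscape switches, the stored cases are re-inserted into the population in place of the worst current individuals.

```python
import random

# Illustrative sketch (not the paper's code): an EA with a global case-based
# memory on a bit-string problem whose fitness flips periodically.
GENOME_LEN = 20
POP_SIZE = 30
MEMORY_SIZE = 5
PERIOD = 25          # generations between fitness-landscape switches


def fitness(ind, phase):
    # Phase 0 rewards ones, phase 1 rewards zeros: a periodic landscape.
    ones = sum(ind)
    return ones if phase == 0 else GENOME_LEN - ones


def evolve(generations=100):
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
           for _ in range(POP_SIZE)]
    memory = []          # global case-based memory of historical individuals
    best_per_gen = []
    for gen in range(generations):
        phase = (gen // PERIOD) % 2
        if gen % PERIOD == 0 and memory:
            # Landscape change: re-insert stored cases ("raising the dead")
            # in place of the currently worst individuals.
            pop.sort(key=lambda i: fitness(i, phase))
            pop[:len(memory)] = [m[:] for m in memory]
        scored = sorted(pop, key=lambda i: fitness(i, phase), reverse=True)
        best = scored[0]
        best_per_gen.append(fitness(best, phase))
        # Storage/replacement scheme: store the current best; when the memory
        # is full, evict the case that is worst under the current fitness.
        if best not in memory:
            memory.append(best[:])
            if len(memory) > MEMORY_SIZE:
                memory.sort(key=lambda i: fitness(i, phase))
                memory.pop(0)
        # Truncation selection plus bit-flip mutation builds the next population.
        parents = scored[:POP_SIZE // 2]
        pop = []
        while len(pop) < POP_SIZE:
            child = random.choice(parents)[:]
            i = random.randrange(GENOME_LEN)
            child[i] ^= 1
            pop.append(child)
    return best_per_gen


random.seed(0)
history = evolve()
```

Without the memory, the population must rediscover each optimum from scratch after every switch; with it, good solutions from the previous occurrence of the same landscape are immediately available again.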


Keywords: Genetic Algorithm, Evolutionary Algorithm, Tracking Error, Knapsack Problem, Global Memory





Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Jeroen Eggermont (1)
  • Tom Lenaerts (2)
  • Sanna Pöyhönen (3)
  • Alexandre Termier (4)
  1. Leiden Institute of Advanced Computer Science, Leiden University, The Netherlands
  2. Computational Modeling Lab, Brussels Free University, Belgium
  3. Control Engineering Laboratory, Helsinki University of Technology, Finland
  4. Laboratoire de Recherche en Informatique, University of Paris XI, France
