Every niching method has its niche: Fitness sharing and implicit sharing compared

  • Paul Darwen
  • Xin Yao
Modifications and Extensions of Evolutionary Algorithms: Adaptation, Niching, and Isolation in Evolutionary Algorithms
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1141)


Various extensions of the genetic algorithm (GA) attempt to find all or most optima in a search space containing several optima, many of them by emulating natural speciation. For co-evolutionary learning to succeed in a range of management and control problems, such as learning game strategies, these methods must find all or most optima. However, suitable comparison studies are rare. We compare two similar GA speciation methods, fitness sharing and implicit sharing, on a realistic letter classification problem, and find that each has advantages under different circumstances. Implicit sharing covers optima more comprehensively when the population is large enough for a species to form at each optimum. With a population not large enough for this, fitness sharing can find the optima with larger basins of attraction and ignore the peaks with narrow bases, while implicit sharing is more easily distracted. This indicates that for a speciated GA trying to find as many near-global optima as possible, implicit sharing works well only if the population is large enough, which requires prior knowledge of how many peaks exist.
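The abstract names the two methods without defining them. For orientation, the following is a minimal sketch of each as commonly formulated in the literature: explicit fitness sharing after Goldberg and Richardson, and sample-based implicit sharing in the style of Smith, Forrest, and Perelson. All function names, parameter values, and the toy landscape here are illustrative, not taken from the paper's experiments.

```python
import math
import random

def sharing_kernel(distance, sigma_share, alpha=1.0):
    """Triangular sharing function: 1 at distance 0, falling to 0 at sigma_share."""
    if distance < sigma_share:
        return 1.0 - (distance / sigma_share) ** alpha
    return 0.0

def shared_fitness(population, raw_fitness, distance, sigma_share, alpha=1.0):
    """Explicit fitness sharing: divide each raw fitness by its niche count
    (the sum of kernel values against every individual, including itself)."""
    return [
        raw_fitness[i] / sum(sharing_kernel(distance(x, y), sigma_share, alpha)
                             for y in population)
        for i, x in enumerate(population)
    ]

def implicit_sharing_fitness(population, test_cases, payoff, sample_size, rounds, rng):
    """Implicit sharing: for each randomly drawn test case, a random sample of
    individuals competes, and only the best scorer receives the payoff."""
    totals = [0.0] * len(population)
    for _ in range(rounds):
        case = rng.choice(test_cases)
        sample = rng.sample(range(len(population)), sample_size)
        best = max(sample, key=lambda i: payoff(population[i], case))
        totals[best] += payoff(population[best], case)
    return [t / rounds for t in totals]

# Toy bimodal landscape on the real line: peaks near x = 1 and x = -1.
f = lambda x: math.exp(-(x - 1.0) ** 2) + 0.5 * math.exp(-(x + 1.0) ** 2)
pop = [-1.0, -1.0, 1.0]                     # two individuals crowd one peak
raw = [f(x) for x in pop]
shr = shared_fitness(pop, raw, lambda a, b: abs(a - b), sigma_share=0.5)
# The two individuals at -1.0 now split their niche's payoff, while the
# lone individual at 1.0 keeps its full raw fitness.

imp = implicit_sharing_fitness(
    pop, test_cases=[-1.0, 1.0],
    payoff=lambda ind, case: math.exp(-(ind - case) ** 2),
    sample_size=2, rounds=200, rng=random.Random(0))
```

Both schemes reward covering distinct optima: explicit sharing does so through a distance metric and a niche radius sigma_share, which must be chosen in advance, while implicit sharing does so through competition over sampled test cases, with no explicit distance metric.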




References

  1. R. Axelrod. The evolution of strategies in the iterated prisoner's dilemma. In Genetic Algorithms and Simulated Annealing, pages 32–41. Morgan Kaufmann, 1987.
  2. P. Darwen and X. Yao. A dilemma for fitness sharing with a scaling function. In 1995 IEEE Conference on Evolutionary Computation, pages 166–171, 1995.
  3. P. Darwen and X. Yao. On evolving robust strategies for iterated prisoner's dilemma. In Progress in Evolutionary Computation, pages 276–292. Springer, 1995.
  4. P. Darwen and X. Yao. Automatic modularization with speciation. In 1996 IEEE Conference on Evolutionary Computation, pages 88–93, 1996.
  5. K. Deb and D. E. Goldberg. An investigation of niche and species formation in genetic function optimization. In ICGA-3, pages 42–50, June 1989.
  6. L. J. Eshelman and J. D. Schaffer. Preventing premature convergence in genetic algorithms by preventing incest. In ICGA-4, pages 115–122, 1991.
  7. T. C. Fogarty. First nearest neighbor classification on Frey and Slate's letter recognition problem. Machine Learning, 9(4):387–388, Oct. 1992.
  8. S. Forrest, B. Javornik, R. E. Smith, and A. S. Perelson. Using genetic algorithms to explore pattern recognition in the immune system. Evolutionary Computation, 1(3):191–211, 1993.
  9. P. Frey and D. Slate. Letter recognition using Holland-style adaptive classifiers. Machine Learning, 6(2):161–182, Mar. 1991.
  10. D. E. Goldberg. Genetic Algorithms in Search, Optimization, and Machine Learning. Addison-Wesley, 1989.
  11. D. E. Goldberg, K. Deb, and J. Horn. Massive multimodality, deception, and genetic algorithms. In PPSN-2, pages 37–46. North-Holland, 1992.
  12. S. W. Mahfoud. Genetic drift in sharing methods. In First IEEE Conference on Evolutionary Computation, volume 1, pages 67–72, June 1994.
  13. S. W. Mahfoud. Niching Methods for Genetic Algorithms. PhD thesis, University of Illinois at Urbana-Champaign, 1995.
  14. D. Michie, D. J. Spiegelhalter, and C. C. Taylor. Machine Learning, Neural and Statistical Classification. Ellis Horwood, 1994.
  15. A. Pétrowski. A clearing procedure as a niching method for genetic algorithms. In 1996 IEEE Conference on Evolutionary Computation, pages 798–803, 1996.
  16. S. Ronald. Finding multiple solutions with an evolutionary algorithm. In 1995 IEEE Conference on Evolutionary Computation, pages 641–646, 1995.
  17. C. D. Rosin and R. K. Belew. Methods for competitive co-evolution: Finding opponents worth beating. In ICGA-6, pages 373–380, 1995.
  18. C. Ryan. Racial harmony and function optimization in genetic algorithms. In 1995 Evolutionary Programming Conference, pages 296–307, 1995.
  19. S. L. Salzberg. On comparing classifiers: A critique of current research and methods. Technical Report CS-1995-06, Johns Hopkins University, 1995.
  20. R. E. Smith, S. Forrest, and A. S. Perelson. Searching for diverse, cooperative populations with genetic algorithms. Evolutionary Computation, 1:127–149, 1992.
  21. R. E. Smith and B. Gray. Co-adaptive genetic algorithms: An example in Othello strategy. In 1994 Florida AI Research Symposium, pages 259–264, 1994.
  22. W. M. Spears. Simple subpopulation schemes. In 1994 Evolutionary Programming Conference, pages 296–307. World Scientific, 1994.

Copyright information

© Springer-Verlag Berlin Heidelberg 1996

Authors and Affiliations

  • Paul Darwen (1)
  • Xin Yao (1)

  1. School of Computer Science, University College, UNSW, Australian Defence Force Academy, Canberra, Australia
