A Synergistic Selection Strategy in the Genetic Algorithms

  • Ting Kuo
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4247)

Abstract

According to Neo-Darwinism, natural selection can be classified into three categories: directional selection, disruptive selection, and stabilizing selection. Traditional genetic algorithms can be viewed as a process of evolution based on directional selection, which gives superior individuals more chances to reproduce. However, this strategy is sometimes myopic and apt to trap the search in a local optimum. Should we restrict genetic algorithms to directional selection? No! First, we show that stabilizing selection and disruptive selection are complementary and that hybridizing them may supersede directional selection. Then, we adopt an island model of parallel genetic algorithms in which the two selection strategies are applied to two subpopulations that evolve independently, with migration allowed between them periodically. Experimental results show that the cooperation of disruptive selection and stabilizing selection is an effective and robust selection strategy for genetic algorithms.
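The scheme described in the abstract can be sketched in a minimal form. The following is an illustrative reconstruction, not the authors' implementation: it assumes disruptive selection weights individuals by their distance from the mean fitness (favoring both extremes), stabilizing selection weights them by closeness to the mean, and the two islands periodically exchange their best individuals. All function names, parameter values, and the one-max test function are hypothetical choices for the sketch.

```python
import random

def disruptive_weights(fits):
    # Disruptive selection: favor individuals far from the mean fitness.
    mean = sum(fits) / len(fits)
    return [abs(f - mean) + 1e-9 for f in fits]

def stabilizing_weights(fits):
    # Stabilizing selection: favor individuals close to the mean fitness.
    mean = sum(fits) / len(fits)
    spread = max(abs(f - mean) for f in fits) + 1e-9
    return [spread - abs(f - mean) + 1e-9 for f in fits]

def evolve(pop, fitness, weight_fn, rng):
    # One generation: weighted parent selection, one-point crossover,
    # single-bit mutation.
    fits = [fitness(x) for x in pop]
    weights = weight_fn(fits)
    next_pop = []
    for _ in range(len(pop)):
        a, b = rng.choices(pop, weights=weights, k=2)
        cut = rng.randrange(1, len(a))
        child = a[:cut] + b[cut:]
        i = rng.randrange(len(child))
        child = child[:i] + (1 - child[i],) + child[i + 1:]
        next_pop.append(child)
    return next_pop

def island_ga(fitness, n_bits=10, pop_size=20, gens=60, migrate_every=5, seed=0):
    # Two islands evolve independently under different selection
    # strategies; every `migrate_every` generations each island's best
    # individual migrates to the other island.
    rng = random.Random(seed)
    rand_ind = lambda: tuple(rng.randrange(2) for _ in range(n_bits))
    islands = [[rand_ind() for _ in range(pop_size)] for _ in range(2)]
    strategies = [disruptive_weights, stabilizing_weights]
    for g in range(gens):
        islands = [evolve(p, fitness, s, rng)
                   for p, s in zip(islands, strategies)]
        if (g + 1) % migrate_every == 0:
            b0 = max(islands[0], key=fitness)
            b1 = max(islands[1], key=fitness)
            islands[0][rng.randrange(pop_size)] = b1
            islands[1][rng.randrange(pop_size)] = b0
    return max((x for p in islands for x in p), key=fitness)

# Toy usage: maximize the one-max function (count of 1-bits).
best = island_ga(lambda x: sum(x))
```

The key design point, per the abstract, is that neither island's strategy is directional: one preserves extreme individuals and the other preserves typical ones, and migration lets the two searches cooperate.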

Keywords

Genetic Algorithm Selection Strategy Problem Instance Fitness Measure Island Model 


Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ting Kuo
  1. Takming College, Taipei, Taiwan, R.O.C.