A New Memory Based Variable-Length Encoding Genetic Algorithm for Multiobjective Optimization

  • Eduardo G. Carrano
  • Lívia A. Moreira
  • Ricardo H. C. Takahashi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6576)

Abstract

This paper presents a new memory-based variable-length encoding genetic algorithm for solving multiobjective optimization problems. The proposed method is a binary implementation of NSGA-II that uses a hash table to store all solutions visited during the evolution of the algorithm. This data structure makes it possible to avoid re-visiting solutions and provides storage and retrieval of data at low computational cost. The algorithm memory is used to build crossover, mutation and local search operators with a parameterless variable-length encoding. These operators control the neighborhood based on the density of already-visited points in the region of the new solution to be evaluated. Two classical multiobjective problems are used to compare two variations of the proposed algorithm with two variations of the binary NSGA-II. A statistical analysis of the results indicates that the memory-based adaptive neighborhood operators provide a significant improvement in the quality of the Pareto-set approximations.
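To illustrate the memory mechanism described in the abstract, the sketch below shows one possible way to cache visited solutions in a hash table keyed by the binary genotype, so that previously evaluated individuals are not re-evaluated, and to estimate the density of visited points around a candidate. It is a minimal sketch under assumed names (SolutionMemory, evaluate, the toy objective function), not the authors' implementation.

```python
# Minimal sketch (assumed names, not the authors' implementation) of a
# hash-table memory that stores every visited solution and avoids
# re-evaluating genotypes that have already been seen.

from typing import Dict, List, Optional, Tuple

Genotype = Tuple[int, ...]  # binary string represented as a tuple of 0/1


class SolutionMemory:
    """Hash table of visited solutions: genotype -> objective vector."""

    def __init__(self) -> None:
        self._table: Dict[Genotype, List[float]] = {}

    def lookup(self, genotype: Genotype) -> Optional[List[float]]:
        """Return cached objectives, or None if the solution is unvisited."""
        return self._table.get(genotype)

    def store(self, genotype: Genotype, objectives: List[float]) -> None:
        self._table[genotype] = objectives

    def visited_density(self, genotype: Genotype, radius: int) -> float:
        """Fraction of stored solutions within a given Hamming radius.

        A density estimate like this could drive adaptive neighborhood
        sizes for the crossover, mutation and local search operators,
        as suggested in the abstract.
        """
        if not self._table:
            return 0.0
        close = sum(
            1 for g in self._table
            if sum(a != b for a, b in zip(g, genotype)) <= radius
        )
        return close / len(self._table)


def evaluate(memory: SolutionMemory, genotype: Genotype) -> List[float]:
    """Evaluate a genotype only if it has not been visited before."""
    cached = memory.lookup(genotype)
    if cached is not None:  # re-visitation avoided: reuse stored objectives
        return cached
    # Toy bi-objective function, purely illustrative.
    ones = sum(genotype)
    objectives = [float(ones), float(len(genotype) - ones)]
    memory.store(genotype, objectives)
    return objectives
```

In a GA loop, every offspring would pass through evaluate before selection, so duplicate genotypes cost only a hash lookup rather than a full objective evaluation.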


References

  1. Glover, F.W., Laguna, M.: Tabu Search, 1st edn. Springer, Heidelberg (1998)
  2. Kratica, J.: Improving performances of the genetic algorithm by caching. Computers and Artificial Intelligence 18, 271–283 (1999)
  3. Povinelli, R.J., Feng, X.: Improving genetic algorithms performance by hashing fitness values. In: Proc. Artificial Neural Networks in Engineering (1999)
  4. Yuen, S.Y., Chow, C.K.: A genetic algorithm that adaptively mutates and never revisits. IEEE Transactions on Evolutionary Computation 13, 454–472 (2009)
  5. Mauldin, M.L.: Maintaining diversity in genetic search. In: Proc. National Conference on Artificial Intelligence (1984)
  6. Friedrich, T., Hebbinghaus, N., Neumann, F.: Rigorous analyses of simple diversity mechanisms. In: Proc. Genetic and Evolutionary Computation Conference (2007)
  7. Kirkpatrick, S., Gelatt Jr., C.D., Vecchi, M.P.: Optimization by simulated annealing. Science 220, 671–680 (1983)
  8. Beyer, H., Schwefel, H.: Evolution strategies - A comprehensive introduction. Natural Computing 1, 3–52 (2002)
  9. Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation 3, 124–141 (1999)
  10. Deb, K., Pratap, A., Agarwal, S., Meyarivan, T.: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6, 182–197 (2002)
  11. Ronald, S.: Duplicate genotypes in a genetic algorithm. In: Proc. IEEE International Conference on Evolutionary Computation (1998)
  12. Cormen, T.H., Leiserson, C.E., Rivest, R.L., Stein, C.: Introduction to Algorithms, 2nd edn. The MIT Press, Cambridge (2001)
  13. Pearson, P.K.: Fast hashing of variable-length text strings. Communications of the ACM 33, 677–680 (1990)
  14. Carrano, E.G., Takahashi, R.H.C., Wanner, E.F.: An enhanced statistical approach for evolutionary algorithm comparison. In: Proc. Genetic and Evolutionary Computation Conference (2008)
  15. Carrano, E.G., Wanner, E.F., Takahashi, R.H.C.: A multi-criteria statistical based comparison methodology for evaluating evolutionary algorithms. IEEE Transactions on Evolutionary Computation (to appear, 2011)
  16. Zitzler, E.: Evolutionary algorithms for multiobjective optimization: Methods and applications. Ph.D. dissertation, Computer Engineering and Networks Laboratory, Swiss Federal Institute of Technology Zurich (1999)
  17. Efron, B.: Bootstrap methods: Another look at the jackknife. The Annals of Statistics 7, 1–26 (1979)
  18. Lindman, H.R.: Analysis of Variance in Complex Experimental Designs. W. H. Freeman & Co, San Francisco (1974)
  19. Kursawe, F.: A variant of evolution strategies for vector optimization. In: Schwefel, H.-P., Männer, R. (eds.) PPSN 1990. LNCS, vol. 496, pp. 193–197. Springer, Heidelberg (1991)
  20. Fonseca, C.M., Fleming, P.J.: An overview of evolutionary algorithms in multiobjective optimization. Evolutionary Computation 3, 1–16 (1995)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Eduardo G. Carrano (1)
  • Lívia A. Moreira (2)
  • Ricardo H. C. Takahashi (3)
  1. Department of Computer Engineering, Centro Federal de Educação Tecnológica de Minas Gerais, Belo Horizonte, Brazil
  2. Department of Electrical Engineering, Centro Federal de Educação Tecnológica de Minas Gerais, Belo Horizonte, Brazil
  3. Department of Mathematics, Universidade Federal de Minas Gerais, Belo Horizonte, Brazil