A Study of the Combination of Variation Operators in the NSGA-II Algorithm

  • Antonio J. Nebro
  • Juan J. Durillo
  • Mirialys Machín
  • Carlos A. Coello Coello
  • Bernabé Dorronsoro
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8109)

Abstract

Multi-objective evolutionary algorithms rely on variation operators as their basic mechanism for carrying out the evolutionary process. These operators are usually fixed and applied in the same way throughout the algorithm's execution, e.g., the mutation probability in genetic algorithms. This paper analyses whether a more dynamic approach, in which different operators are combined and their application rates vary along the search, can improve on this classical static behavior. To this end, we explore the combined use of three different operators (simulated binary crossover, differential evolution's operator, and polynomial mutation) in the NSGA-II algorithm, considering two strategies for selecting among them: random and adaptive. The resulting variants have been tested on a set of 19 complex problems, and our results indicate that both schemes significantly improve the performance of the original NSGA-II algorithm, with the random and adaptive variants achieving the best overall results on the bi-objective and three-objective problems, respectively.
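To make the operator-selection idea in the abstract concrete, the following Python sketch contrasts a uniformly random choice among the three operators with a simple success-based adaptive choice. It is only illustrative and is not the paper's implementation: the class names, the sliding-window credit-assignment rule, and the feedback signal ("offspring survived into the next population") are assumptions made here for the example.

```python
import random

# Stand-in labels for the three variation operators used in the paper.
OPERATORS = ["SBX", "DE", "PolynomialMutation"]


class RandomOperatorSelector:
    """Random strategy: pick one operator uniformly at random each time."""

    def select(self):
        return random.choice(OPERATORS)

    def feedback(self, operator, success):
        pass  # the random scheme ignores any feedback


class AdaptiveOperatorSelector:
    """Adaptive strategy (hypothetical rule): bias the choice towards
    operators whose recent offspring were 'successful', e.g. entered the
    next population."""

    def __init__(self, window=50):
        self.window = window          # how many recent outcomes to consider
        self.history = []             # list of (operator, success) pairs

    def select(self):
        if not self.history:
            return random.choice(OPERATORS)   # uniform until feedback exists
        scores = {op: 1.0 for op in OPERATORS}  # additive smoothing
        for op, success in self.history[-self.window:]:
            if success:
                scores[op] += 1.0
        # Roulette-wheel selection proportional to recent success.
        total = sum(scores.values())
        r, acc = random.random() * total, 0.0
        for op, score in scores.items():
            acc += score
            if r <= acc:
                return op
        return OPERATORS[-1]

    def feedback(self, operator, success):
        self.history.append((operator, success))


if __name__ == "__main__":
    selector = AdaptiveOperatorSelector()
    # Simulated feedback: pretend DE's offspring succeed more often.
    for _ in range(200):
        op = selector.select()
        selector.feedback(op, random.random() < (0.6 if op == "DE" else 0.3))
    counts = {op: 0 for op in OPERATORS}
    for _ in range(1000):
        counts[selector.select()] += 1
    print(counts)  # DE should now be selected most often
```

In an NSGA-II loop, `select()` would be called once per offspring to decide which variation operator to apply, and `feedback()` would be called after environmental selection; the paper's actual adaptive scheme may use a different credit-assignment rule.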

Keywords

Multiobjective Optimization · Evolutionary Algorithms · Variation Operators · Adaptation



Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  1. Antonio J. Nebro, Department of Computer Science, University of Málaga, Spain
  2. Juan J. Durillo, Institute of Computer Science, University of Innsbruck, Austria
  3. Mirialys Machín, Departamento de Computación, University of Informatics Sciences, Cuba
  4. Carlos A. Coello Coello, CINVESTAV-IPN, Mexico
  5. Bernabé Dorronsoro, Computer Science Laboratory of Lille, Université Lille 1, France
