On the Effect of Applying a Steady-State Selection Scheme in the Multi-Objective Genetic Algorithm NSGA-II

  • Antonio J. Nebro
  • Juan J. Durillo
Part of the Studies in Computational Intelligence book series (SCI, volume 193)

Abstract

Genetic Algorithms (GAs) are among the most popular techniques for solving multi-objective optimization problems, and NSGA-II is the best-known algorithm in the field. Although most multi-objective GAs (MOGAs) use a generational scheme, some proposals based on a steady-state scheme have appeared in recent years. However, studies on the influence of these selection strategies in MOGAs are scarce. In this chapter we implement a steady-state version of NSGA-II, which is a generational MOGA, and compare the two versions against a set of four state-of-the-art multi-objective metaheuristics (SPEA2, OMOPSO, AbYSS, and MOCell) according to two criteria: the quality of the resulting approximations to the Pareto front and the convergence speed of the algorithms. The results show that the search capabilities of the steady-state version significantly improve on those of the original algorithm, yielding very competitive results in terms of both the quality of the obtained Pareto front approximations and the convergence speed.
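
To make the contrast between the two replacement schemes concrete, the following Java sketch (our own illustration, not the authors' jMetal code; all class and method names such as SteadyStateVsGenerational, offspring() and environmentalSelection() are hypothetical) shows the skeleton of a generational loop, where a whole offspring population is created before survivors are chosen from the parent+offspring union, next to a steady-state loop, where each new offspring immediately competes for a place in the population. The toy Schaffer-style problem, the Gaussian-mutation variation operator, and the domination-count ranking used in environmentalSelection() are simplifying assumptions; NSGA-II itself applies fast non-dominated sorting and crowding-distance truncation at that step.

    // Illustrative sketch only: a toy problem and a simplified ranking stand in for the
    // real NSGA-II machinery. Requires Java 16+ (records).
    import java.util.*;

    public class SteadyStateVsGenerational {

        static final Random RNG = new Random(0);
        static final int POP_SIZE = 20;

        // Toy bi-objective minimization problem: f1 = x^2, f2 = (x - 2)^2, x in [-10, 10].
        record Solution(double x, double f1, double f2) {
            static Solution of(double x) { return new Solution(x, x * x, (x - 2) * (x - 2)); }
        }

        // Variation: random parent plus Gaussian mutation (a stand-in for the binary
        // tournament, SBX crossover and polynomial mutation used by NSGA-II).
        static Solution offspring(List<Solution> pop) {
            Solution parent = pop.get(RNG.nextInt(pop.size()));
            double x = Math.max(-10, Math.min(10, parent.x() + 0.3 * RNG.nextGaussian()));
            return Solution.of(x);
        }

        static boolean dominates(Solution a, Solution b) {
            return a.f1() <= b.f1() && a.f2() <= b.f2()
                    && (a.f1() < b.f1() || a.f2() < b.f2());
        }

        // Placeholder environmental selection: keep the 'size' solutions dominated by the
        // fewest others. NSGA-II uses fast non-dominated sorting + crowding distance here.
        static List<Solution> environmentalSelection(List<Solution> union, int size) {
            Map<Solution, Long> dominatedBy = new HashMap<>();
            for (Solution s : union) {
                dominatedBy.put(s, union.stream().filter(o -> dominates(o, s)).count());
            }
            List<Solution> sorted = new ArrayList<>(union);
            sorted.sort(Comparator.comparingLong(s -> dominatedBy.get(s)));
            return new ArrayList<>(sorted.subList(0, size));
        }

        static List<Solution> initialPopulation() {
            List<Solution> pop = new ArrayList<>();
            for (int i = 0; i < POP_SIZE; i++) pop.add(Solution.of(-10 + 20 * RNG.nextDouble()));
            return pop;
        }

        // Generational scheme: a whole offspring population is created before the survivors
        // of the (parents + offspring) union are selected.
        static List<Solution> runGenerational(int maxEvaluations) {
            List<Solution> pop = initialPopulation();
            int evaluations = POP_SIZE;
            while (evaluations < maxEvaluations) {
                List<Solution> union = new ArrayList<>(pop);
                for (int i = 0; i < POP_SIZE; i++) {
                    union.add(offspring(pop));
                    evaluations++;
                }
                pop = environmentalSelection(union, POP_SIZE);
            }
            return pop;
        }

        // Steady-state scheme: a single offspring is created per iteration and competes
        // immediately with the current population for a slot.
        static List<Solution> runSteadyState(int maxEvaluations) {
            List<Solution> pop = initialPopulation();
            int evaluations = POP_SIZE;
            while (evaluations < maxEvaluations) {
                List<Solution> union = new ArrayList<>(pop);
                union.add(offspring(pop));
                evaluations++;
                pop = environmentalSelection(union, POP_SIZE);
            }
            return pop;
        }

        public static void main(String[] args) {
            System.out.println("Generational front approximation:");
            runGenerational(2000).forEach(s -> System.out.printf("  f1=%.4f f2=%.4f%n", s.f1(), s.f2()));
            System.out.println("Steady-state front approximation:");
            runSteadyState(2000).forEach(s -> System.out.printf("  f1=%.4f f2=%.4f%n", s.f1(), s.f2()));
        }
    }

Both loops stop on the same function-evaluation budget so that the two schemes can be compared on equal terms in this sketch; only the granularity of replacement differs, which is exactly the design variable studied in the chapter.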

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Antonio J. Nebro¹
  • Juan J. Durillo¹
  1. Dept. Lenguajes y Ciencias de la Computación, ETSI Informática, University of Málaga, Málaga, Spain
