On the Integrity of Performance Comparison for Evolutionary Multi-objective Optimisation Algorithms

  • Kevin Wilson
  • Shahin Rostami
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 840)


This paper proposes that the experimental results and performance analyses of newly developed algorithms in the field of multi-objective optimisation may not offer sufficient integrity for hypothesis testing. The reason for this is that many implementations exist of the same optimisation algorithms, and these may vary in behaviour due to the interpretation of the developer. This is demonstrated through the comparison of three implementations of the popular Non-dominated Sorting Genetic Algorithm II (NSGA-II) from well-regarded frameworks using the hypervolume indicator. The results show that of the thirty considered comparison cases, only four indicate no significant difference between the performance of the compared implementations.
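The comparison methodology described above can be sketched for the two-objective case: each implementation's approximation front is scored against a common reference point using the hypervolume indicator, and the resulting values from repeated runs are then compared with a non-parametric significance test such as the Wilcoxon rank-sum test. The point sets and reference point below are illustrative assumptions, not data from the paper.

```python
# A minimal sketch of hypervolume-based comparison, restricted to two
# objectives (minimisation). The fronts and reference point here are
# hypothetical, invented for illustration only.

def hypervolume_2d(front, ref):
    """Area, w.r.t. reference point `ref`, dominated by a 2-D
    minimisation front given as (f1, f2) tuples."""
    # Consider only points that strictly dominate the reference point.
    pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
    hv, prev_y = 0.0, ref[1]
    for x, y in pts:          # pts sorted by f1 ascending
        if y < prev_y:        # skip points dominated by an earlier one
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv

# Two hypothetical approximation fronts, standing in for the output of
# two different NSGA-II implementations on the same problem instance.
front_a = [(0.1, 0.9), (0.4, 0.5), (0.8, 0.2)]
front_b = [(0.2, 0.8), (0.5, 0.6), (0.9, 0.3)]
ref_point = (1.1, 1.1)

print(hypervolume_2d(front_a, ref_point))  # larger value = better front
print(hypervolume_2d(front_b, ref_point))
```

In practice this indicator value would be collected over many independent runs per implementation, and the two samples compared with a rank-based test to decide whether the observed performance difference is statistically significant.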


Keywords: Evolutionary algorithms · Genetic algorithms · Optimisation · Hypervolume indicator



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Computational Intelligence Research Initiative (CIRI), Bournemouth University, Bournemouth, UK
