OCD: Online Convergence Detection for Evolutionary Multi-Objective Algorithms Based on Statistical Testing

  • Tobias Wagner
  • Heike Trautmann
  • Boris Naujoks
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5467)


Over the last decades, evolutionary algorithms (EAs) have proven their applicability to hard and complex industrial optimization problems in many cases. However, especially when fitness evaluations (FE) are computationally expensive, the number of required FE is often seen as a drawback of these techniques. This is partly due to the lack of robust and reliable methods for determining convergence, which would stop the algorithm before useless evaluations are carried out. To overcome this drawback, we define a method for online convergence detection (OCD) based on statistical tests, which invokes a number of performance indicators and which can be applied on a stand-alone basis (no predefined Pareto fronts, ideal and reference points). Our experiments show the general applicability of OCD by analyzing its performance for different algorithmic setups and on different classes of test functions. Furthermore, we show that the number of FE can be reduced considerably – compared to common suggestions from the literature – without significantly deteriorating approximation accuracy.
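To illustrate the regression-based ingredient of such a stopping criterion, the sketch below fits a least-squares trend to a window of recent performance-indicator values and stops once the slope is not significantly different from zero. This is a simplified illustration of the general idea, not the authors' OCD implementation; all function names, the window size, and the critical value are illustrative choices.

```python
import math

def slope_t_statistic(values):
    """Least-squares slope of the series against its index (generation),
    together with the t-statistic for H0: slope == 0."""
    n = len(values)
    xs = range(n)
    xbar = (n - 1) / 2
    ybar = sum(values) / n
    sxx = sum((x - xbar) ** 2 for x in xs)
    sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, values))
    slope = sxy / sxx
    # residual sum of squares around the fitted line
    sse = sum((y - (ybar + slope * (x - xbar))) ** 2
              for x, y in zip(xs, values))
    se = math.sqrt(sse / (n - 2) / sxx)  # standard error of the slope
    if se == 0.0:
        # perfectly linear data: no evidence of a trend iff slope is zero
        return slope, (0.0 if slope == 0.0 else math.inf)
    return slope, slope / se

def has_converged(indicator_history, window=10, t_crit=1.86):
    """Signal convergence when the indicator trend over the last `window`
    generations is statistically indistinguishable from zero.
    t_crit = 1.86 is the upper 5% quantile of the t-distribution with
    window - 2 = 8 degrees of freedom (two-sided test at alpha = 0.10);
    adjust it for other window sizes."""
    if len(indicator_history) < window:
        return False
    _, t = slope_t_statistic(indicator_history[-window:])
    return abs(t) < t_crit
```

In practice one would track an indicator such as the dominated hypervolume per generation and call `has_converged` after each one; the actual OCD procedure additionally combines several indicators and tests before stopping.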


Keywords: Pareto front, multiobjective optimization, multiobjective evolutionary algorithm, regression criterion, true Pareto front





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Tobias Wagner, Institute of Machining Technology (ISF), TU Dortmund University, Germany
  • Heike Trautmann, Faculty of Statistics, TU Dortmund University, Germany
  • Boris Naujoks, Chair of Algorithm Engineering, TU Dortmund University, Germany
