Journal of Global Optimization, Volume 67, Issue 1–2, pp 263–282

Dynamic algorithm selection for Pareto optimal set approximation

  • Ingrida Steponavičė
  • Rob J. Hyndman
  • Kate Smith-Miles
  • Laura Villanova

Abstract

This paper presents a meta-algorithm for approximating the Pareto optimal set of costly black-box multiobjective optimization problems given a limited number of objective function evaluations. The key idea is to switch among different algorithms during the optimization search, based on the predicted performance of each algorithm at that stage of the search. Algorithm performance is modeled using a machine learning technique, based on the information available at that point in the search. The algorithm predicted to perform best is then run for a limited number of evaluations. The proposed approach is tested on several benchmark problems, and the results are compared against those obtained by running any one of the candidate algorithms alone.
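To make the described loop concrete, the following Python sketch illustrates one possible reading of the meta-algorithm: the evaluation budget is split into stages, and at each stage search-state features are passed to a predictor that chooses which candidate algorithm to run next, with hypervolume improvement used as the feedback signal. All names here (extract_features, predict_best, the toy stage optimizers, and the 2-objective hypervolume routine) are illustrative assumptions, not the authors' implementation.

```python
import random

def nondominated(points):
    # Keep only mutually non-dominated points (2-objective minimisation).
    return [p for p in points
            if not any(q[0] <= p[0] and q[1] <= p[1] and q != p for q in points)]

def hypervolume_2d(front, ref):
    # Hypervolume of a non-dominated 2-objective front w.r.t. reference point `ref`
    # (both objectives minimised).
    hv, right = 0.0, ref[0]
    for f1, f2 in reversed(sorted(front)):
        hv += max(right - f1, 0.0) * max(ref[1] - f2, 0.0)
        right = f1
    return hv

def toy_stage_optimizer(seed):
    # Stand-in for a real candidate algorithm (e.g. an evolutionary or direct-search
    # method): it spends `budget` evaluations and returns new objective vectors.
    rng = random.Random(seed)
    def run(archive, budget):
        return [(rng.random(), rng.random()) for _ in range(budget)]
    return run

CANDIDATES = {"algorithm_A": toy_stage_optimizer(1),
              "algorithm_B": toy_stage_optimizer(2)}

def extract_features(archive):
    # Placeholder for the search-state features a performance model would use.
    f1s = [p[0] for p in archive]
    return [min(f1s), max(f1s), len(archive)]

def predict_best(features, history):
    # Placeholder for the learned performance model (e.g. a random forest);
    # here: pick the candidate with the largest total past hypervolume gain.
    if not history:
        return random.choice(list(CANDIDATES))
    return max(CANDIDATES, key=lambda a: sum(g for alg, g in history if alg == a))

def meta_optimize(total_budget=200, stage_budget=20, ref=(1.1, 1.1)):
    archive = [(random.random(), random.random()) for _ in range(10)]  # initial design
    history, spent = [], len(archive)
    while spent < total_budget:
        name = predict_best(extract_features(archive), history)
        before = hypervolume_2d(nondominated(archive), ref)
        archive += CANDIDATES[name](archive, stage_budget)
        spent += stage_budget
        gain = hypervolume_2d(nondominated(archive), ref) - before
        history.append((name, gain))  # feedback used by the next selection step
    return nondominated(archive)

if __name__ == "__main__":
    print(len(meta_optimize()), "non-dominated points found")
```

In this reading, the selection step and the stage optimizers are interchangeable components; the paper's contribution lies in how the performance model is built and queried, which the placeholders above only gesture at.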

Keywords

Multiobjective optimization · Expensive black-box function · Machine learning · Classification · Algorithm selection · Hypervolume metric · Features

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  • Ingrida Steponavičė (1)
  • Rob J. Hyndman (2)
  • Kate Smith-Miles (1)
  • Laura Villanova (1)

  1. School of Mathematical Sciences, Monash University, Clayton, Australia
  2. Department of Econometrics and Business Statistics, Monash University, Clayton, Australia
