On Sampling Methods for Costly Multi-Objective Black-Box Optimization

  • Ingrida Steponavičė
  • Mojdeh Shirazi-Manesh
  • Rob J. Hyndman
  • Kate Smith-Miles
  • Laura Villanova
Chapter
Part of the Springer Optimization and Its Applications book series (SOIA, volume 107)

Abstract

We investigate the impact of different sampling techniques on the performance of multi-objective optimization methods applied to costly black-box optimization problems. Such problems are often solved using an algorithm in which a surrogate model approximates the true objective functions and provides predicted objective values at a much lower cost. Because the surrogate model is built from evaluations of only a small number of points, the quality of the initial sample can have a great impact on the overall effectiveness of the optimization. In this study, we demonstrate how various sampling techniques affect the results of applying different optimization algorithms to a set of benchmark problems. Additionally, we provide some recommendations on the use of sampling methods.
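As a concrete illustration of the kind of initial design being compared, the snippet below is a minimal sketch (not the chapter's own code) that generates a small sample with a pseudo-random generator, a Latin hypercube design, and the Sobol and Halton low-discrepancy sequences, then reports a simple space-filling diagnostic. It assumes SciPy ≥ 1.7 (scipy.stats.qmc); the problem dimension, box bounds, and sample size are illustrative assumptions only.

```python
# Minimal sketch: seeding a surrogate-based optimizer with different
# initial sampling techniques and comparing their space-filling quality.
import numpy as np
from scipy.stats import qmc

n_points, dim = 16, 2                      # small initial design (power of 2 suits Sobol)
lower, upper = [0.0, 0.0], [1.0, 2.0]      # hypothetical decision-variable bounds

samplers = {
    "random": None,                                        # pseudo-random baseline
    "latin hypercube": qmc.LatinHypercube(d=dim, seed=0),
    "sobol": qmc.Sobol(d=dim, scramble=True, seed=0),
    "halton": qmc.Halton(d=dim, scramble=True, seed=0),
}

for name, sampler in samplers.items():
    if sampler is None:
        unit = np.random.default_rng(0).random((n_points, dim))
    else:
        unit = sampler.random(n_points)                    # points in the unit hypercube
    design = qmc.scale(unit, lower, upper)                 # map to the problem's box bounds
    # Centred L2 discrepancy: a common measure of uniformity (lower is better).
    print(f"{name:16s} discrepancy = {qmc.discrepancy(unit):.4f}")
    # In a surrogate-based workflow, `design` would be evaluated on the costly
    # objectives and used to fit the first surrogate model (e.g. Kriging)
    # before any sequential, model-guided sampling begins.
```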

Keywords

Design of experiments · Space-filling · Low-discrepancy · Efficient global optimization

Acknowledgements

This research was partly financially supported by the Linkage project “Optimizing experimental design for robust product development: a case study for high-efficiency energy generation” funded by the Australian Research Council.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Ingrida Steponavičė (1)
  • Mojdeh Shirazi-Manesh (1)
  • Rob J. Hyndman (2)
  • Kate Smith-Miles (1)
  • Laura Villanova (1)

  1. School of Mathematical Sciences, Monash University, Clayton, Australia
  2. Department of Econometrics and Business Statistics, Monash University, Clayton, Australia
