
A First Analysis of Kernels for Kriging-Based Optimization in Hierarchical Search Spaces

  • Martin Zaefferer
  • Daniel Horn
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11102)

Abstract

Many real-world optimization problems require significant resources for objective function evaluations. This is a challenge for evolutionary algorithms, as it limits the number of available evaluations. One solution is surrogate models, which replace the expensive objective.

A particular issue in this context is hierarchical variables. Hierarchical variables only influence the objective function if other variables satisfy some condition. We study how this kind of hierarchical structure can be integrated into the model-based optimization framework. We discuss an existing kernel and propose alternatives. An artificial test function is used to investigate how different kernels and assumptions affect model quality and search performance.
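To make the notion of hierarchical (conditional) variables concrete, the following is a minimal sketch, not taken from the paper: a toy objective in which a variable x2 only matters when a condition c == 1 holds, together with one simple way a Gaussian kernel can respect that hierarchy by comparing x2 only when it is active in both points (loosely in the spirit of the Arc kernel of Hutter and Osborne). All names and the specific functional forms here are illustrative assumptions.

```python
import numpy as np

def hierarchical_objective(x1, c, x2=None):
    """Toy hierarchical test function (illustrative, not the paper's):
    x2 influences the result only when the condition c == 1 holds."""
    base = (x1 - 0.3) ** 2
    if c == 1:
        return base + (x2 - 0.7) ** 2
    return base

def conditional_gauss_kernel(p, q, theta=1.0):
    """Gaussian kernel on points p, q = (x1, c, x2).
    The conditional dimension x2 contributes to the distance only
    when it is active (c == 1) in both points -- one simple way to
    handle hierarchy inside a Kriging correlation function."""
    # continuous part + categorical mismatch part
    d2 = (p[0] - q[0]) ** 2 + float(p[1] != q[1])
    if p[1] == 1 and q[1] == 1:   # x2 active in both points
        d2 += (p[2] - q[2]) ** 2
    return np.exp(-theta * d2)
```

Whether an inactive variable should be ignored, imputed, or penalized is exactly the kind of modeling assumption whose effect on model quality and search performance the paper investigates.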

Keywords

Surrogate model based optimization · Hierarchical search spaces · Conditional variables · Kernel


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Institute of Data Science, Engineering, and Analytics, TH Köln, Gummersbach, Germany
  2. Faculty of Statistics, TU Dortmund University, Dortmund, Germany
