Surrogate Assisted Feature Computation for Continuous Problems

  • Nacim Belkhir
  • Johann Dréo
  • Pierre Savéant
  • Marc Schoenauer
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10079)

Abstract

A possible approach to algorithm selection and configuration for continuous black-box optimization problems relies on problem features computed from a set of evaluated sample points. However, computing these features requires a rather large number of samples, which is unlikely to be affordable for expensive real-world problems. On the other hand, surrogate models have been proposed to tackle the optimization of expensive objective functions. This paper proposes to use surrogate models to approximate the values of the features at a reasonable computational cost. Two experimental studies are conducted on a continuous-domain test bench. First, the effect of sub-sampling on the feature values is analyzed. Then, a methodology to compute approximate feature values with a surrogate model is proposed and validated from the point of view of the classification of the test functions. The results show that when only small computational budgets are available, using surrogate models as proxies to compute the features can be beneficial.
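To make the methodology concrete, here is a minimal sketch of the idea (not the paper's exact protocol): a simple Gaussian-RBF surrogate is fitted on a small budget of true evaluations of a toy objective, and a landscape feature — here the dispersion metric, i.e. the mean pairwise distance among the best points divided by that among all points — is then computed on a large sample evaluated for free on the surrogate. All function names, the choice of surrogate, and the sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    # Toy objective, standing in for an expensive black-box function.
    return np.sum((x - 0.5) ** 2, axis=-1)

def rbf_surrogate(X, y, eps=1.0):
    """Fit a simple Gaussian-RBF interpolant; return a prediction function."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    Phi = np.exp(-(eps * d) ** 2)
    # Tiny ridge term for numerical stability of the linear solve.
    w = np.linalg.solve(Phi + 1e-10 * np.eye(len(X)), y)
    def predict(Z):
        dz = np.linalg.norm(Z[:, None, :] - X[None, :, :], axis=-1)
        return np.exp(-(eps * dz) ** 2) @ w
    return predict

def dispersion_feature(X, y, frac=0.1):
    """Dispersion: mean pairwise distance of the best frac of points,
    normalized by the mean pairwise distance of the whole sample."""
    k = max(2, int(frac * len(X)))
    best = X[np.argsort(y)[:k]]
    def mean_dist(P):
        d = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
        return d[np.triu_indices(len(P), 1)].mean()
    return mean_dist(best) / mean_dist(X)

dim = 2
X_small = rng.random((50, dim))   # small budget: 50 true evaluations
model = rbf_surrogate(X_small, sphere(X_small))

X_large = rng.random((1000, dim))  # large sample, evaluated on the surrogate
feat_surr = dispersion_feature(X_large, model(X_large))
feat_true = dispersion_feature(X_large, sphere(X_large))  # expensive reference
print(feat_surr, feat_true)
```

On a unimodal function such as this one, the best points cluster near the optimum, so both feature values fall below 1; the surrogate-based value approximates the reference computed from true evaluations at a fraction of the evaluation cost.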


Keywords: Empirical study · Black-box continuous optimization · Surrogate modelling · Problem features



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Nacim Belkhir (1, 2)
  • Johann Dréo (1)
  • Pierre Savéant (1)
  • Marc Schoenauer (2)
  1. Thales Research & Technology, Palaiseau, France
  2. TAO, Inria Saclay Île-de-France, Orsay, France