Efficient Global Optimization with Indefinite Kernels

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9921)


Kernel-based surrogate models such as Kriging are a popular remedy for costly objective function evaluations in optimization. Kernels are usually required to be definite, but highly customized kernels, or kernels for combinatorial representations, may be indefinite. This study investigates this issue in the context of Kriging. It is shown that approaches from the field of Support Vector Machines are useful starting points, but require further modifications to work with Kriging. This study compares a broad selection of methods for dealing with indefinite kernels in Kriging and Kriging-based Efficient Global Optimization, including spectrum transformation, feature embedding, and computation of the nearest definite matrix. Model quality and optimization performance are tested. The standard approach, which does not explicitly correct indefinite matrices, yields functional results, which are further improved by spectrum transformations.
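The spectrum transformations mentioned in the abstract can be illustrated with a short sketch. The idea is to eigendecompose the symmetric kernel matrix and repair its negative eigenvalues by clipping them to zero, flipping their sign, or shifting the whole spectrum. The function below is a generic, minimal illustration of these three transformations, not the authors' implementation; the function name and interface are chosen here for exposition.

```python
import numpy as np

def spectrum_transform(K, method="clip"):
    """Repair an indefinite kernel matrix via its eigenspectrum.

    Illustrative sketch of common spectrum transformations (clip, flip,
    shift); not the exact procedure used in the paper.
    """
    K = (K + K.T) / 2.0              # symmetrize before eigendecomposition
    w, V = np.linalg.eigh(K)         # real eigenvalues/eigenvectors
    if method == "clip":
        w = np.maximum(w, 0.0)       # zero out negative eigenvalues
    elif method == "flip":
        w = np.abs(w)                # reflect negative eigenvalues
    elif method == "shift":
        w = w - min(w.min(), 0.0)    # shift spectrum so min eigenvalue >= 0
    else:
        raise ValueError(f"unknown method: {method}")
    return (V * w) @ V.T             # rebuild K = V diag(w) V^T

# Example: a symmetric similarity matrix with one negative eigenvalue.
K = np.array([[1.0, 0.9, 0.1],
              [0.9, 1.0, 0.9],
              [0.1, 0.9, 1.0]])
assert np.linalg.eigvalsh(K).min() < 0          # K is indefinite
K_psd = spectrum_transform(K, "clip")
assert np.linalg.eigvalsh(K_psd).min() >= -1e-10  # now positive semidefinite
```

Note that clipping and flipping change the off-diagonal entries of the matrix, so the transformed matrix is no longer exactly consistent with the original kernel values; shifting only alters the diagonal.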


Keywords: Recombination · Kriging



Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Faculty of Computer Science and Engineering Science, Cologne University of Applied Sciences (TH Köln), Gummersbach, Germany
