Optimization of Gaussian Process Models with Evolutionary Algorithms

  • Dejan Petelin
  • Bogdan Filipič
  • Juš Kocijan
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6593)

Abstract

Gaussian process (GP) models are non-parametric, black-box models that represent a relatively new approach to system identification. Owing to their probabilistic nature, the optimization of GP models amounts to maximizing the probability of the model, which is computed as the marginal likelihood. Deterministic optimization methods are commonly used to maximize the marginal likelihood of GP models, but their success critically depends on the initial values. Moreover, the marginal-likelihood function often has many local optima in which a deterministic method can become trapped. Stochastic optimization methods can therefore be considered as an alternative, and in this paper we test their applicability to GP model optimization. We performed a comparative study of three stochastic algorithms: the genetic algorithm, differential evolution, and particle swarm optimization. Empirical tests were carried out on a benchmark problem of modeling the concentration of CO2 in the atmosphere. The results indicate that, with proper tuning, differential evolution and particle swarm optimization significantly outperform the conjugate-gradient method.
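
To make the optimization setting concrete, the following Python sketch maximizes a GP's log marginal likelihood with differential evolution instead of conjugate gradients. It is illustrative only and is not the authors' implementation: the squared-exponential kernel, the synthetic data standing in for the CO2 series, and the search bounds are assumptions, and SciPy's differential_evolution is used as a stand-in for the DE variant studied in the paper.

    import numpy as np
    from scipy.linalg import cho_factor, cho_solve
    from scipy.optimize import differential_evolution

    def neg_log_marginal_likelihood(log_theta, X, y):
        # Hyperparameters (lengthscale, signal std, noise std), searched in log space.
        ell, sf, sn = np.exp(log_theta)
        # Squared-exponential covariance matrix plus noise variance on the diagonal
        # (an assumed kernel choice for this sketch).
        d2 = (X[:, None] - X[None, :]) ** 2
        K = sf ** 2 * np.exp(-0.5 * d2 / ell ** 2) + sn ** 2 * np.eye(len(X))
        # -log p(y | X, theta) = 0.5 y^T K^{-1} y + 0.5 log|K| + (n/2) log(2 pi)
        L, lower = cho_factor(K, lower=True)
        alpha = cho_solve((L, lower), y)
        return (0.5 * y @ alpha
                + np.sum(np.log(np.diag(L)))
                + 0.5 * len(X) * np.log(2.0 * np.pi))

    # Toy one-dimensional data standing in for the CO2 benchmark (hypothetical).
    rng = np.random.default_rng(0)
    X = np.linspace(0.0, 10.0, 60)
    y = np.sin(X) + 0.1 * rng.standard_normal(X.shape)

    # Global, stochastic search over the log-hyperparameters; bounds are illustrative.
    bounds = [(-3.0, 3.0)] * 3
    result = differential_evolution(neg_log_marginal_likelihood, bounds,
                                    args=(X, y), seed=0, tol=1e-7)
    print("hyperparameters:", np.exp(result.x))
    print("negative log marginal likelihood:", result.fun)

Because the population-based search does not depend on a single starting point, it is less sensitive to initialization than gradient-based optimization, at the cost of more likelihood evaluations.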

Keywords

Gaussian process models · hyperparameters · optimization · evolutionary algorithms



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Dejan Petelin (1)
  • Bogdan Filipič (1)
  • Juš Kocijan (1, 2)

  1. Jožef Stefan Institute, Ljubljana, Slovenia
  2. University of Nova Gorica, Nova Gorica, Slovenia
