Statistics and Computing, Volume 15, Issue 1, pp 31–41

Hierarchical Gaussian process mixtures for regression

  • J.Q. Shi
  • R. Murray-Smith
  • D.M. Titterington


Abstract

Because of their good performance in practice and their desirable analytical properties, Gaussian process regression models are of increasing interest in statistics, engineering and other fields. However, two major problems arise when the model is applied to a large data-set with repeated measurements. One stems from the systematic heterogeneity among the different replications, and the other is the need to invert a covariance matrix whose dimension equals the sample size of the training data-set. In this paper, a Gaussian process mixture model for regression is proposed for dealing with these two problems, and a hybrid Markov chain Monte Carlo (MCMC) algorithm is used for its implementation. An application to a real data-set is reported.
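To make the second problem concrete, the following is a minimal sketch (not the authors' code) of standard Gaussian process regression with an assumed squared-exponential kernel. Prediction requires factorising an n × n covariance matrix, where n is the training sample size, at O(n³) cost; this is the bottleneck that motivates decomposing a large data-set across mixture components.

```python
import numpy as np

def sq_exp_kernel(x1, x2, signal_var=1.0, length_scale=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = x1[:, None] - x2[None, :]
    return signal_var * np.exp(-0.5 * (d / length_scale) ** 2)

def gp_predict(x_train, y_train, x_test, noise_var=0.1):
    """GP posterior mean and variance; cost is O(n^3) in the training size n."""
    n = len(x_train)
    K = sq_exp_kernel(x_train, x_train) + noise_var * np.eye(n)
    K_star = sq_exp_kernel(x_test, x_train)
    # Cholesky factorisation of the n x n training covariance --
    # the step that becomes prohibitive for large n.
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_train))
    mean = K_star @ alpha
    v = np.linalg.solve(L, K_star.T)
    var = sq_exp_kernel(x_test, x_test).diagonal() - np.sum(v ** 2, axis=0)
    return mean, var

# Toy example: noisy observations of a sine curve.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * x) + 0.1 * rng.standard_normal(50)
mean, var = gp_predict(x, y, np.array([0.5]))
```

The kernel form and hyperparameter values here are illustrative assumptions; in a mixture model each component would carry out this computation on only its own subset of the data, reducing the size of each matrix to be inverted.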


Keywords: Gaussian process · heterogeneity · hybrid Markov chain Monte Carlo · mixture models · nonparametric curve fitting





Copyright information

© Springer Science + Business Media, Inc. 2005

Authors and Affiliations

  1. School of Mathematics and Statistics, University of Newcastle, UK
  2. Department of Computing Science, University of Glasgow, Glasgow, UK
  3. Department of Statistics, University of Glasgow, Glasgow, UK
  4. Hamilton Institute, National University of Ireland, Maynooth, Co. Kildare, Ireland
