The Hard-Cut EM Algorithm for Mixture of Sparse Gaussian Processes

  • Ziyi Chen
  • Jinwen Ma
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9227)

Abstract

The mixture of Gaussian processes (MGP) is a powerful and fast-developing machine learning framework. To make its learning more efficient, sparsity constraints have been adopted to form the mixture of sparse Gaussian processes (MSGP). However, the existing MGP and MSGP models are rather complicated, and their learning algorithms involve various approximation schemes. In this paper, we refine the MSGP model and develop a hard-cut EM algorithm for MSGP from its original version for MGP. Experiments on both synthetic and real datasets demonstrate that the refined MSGP model and the hard-cut EM algorithm are feasible and can outperform several typical regression algorithms in prediction. Moreover, owing to the sparse approximation, parameter learning of the proposed MSGP model is much more efficient than that of the MGP model.
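Since this page reproduces only the abstract and not the paper's algorithmic details, the following is a rough illustrative sketch of the hard-cut EM idea described above, written from scratch in Python. The RBF kernel, the Gaussian gating density over one-dimensional inputs, the subset-of-data sparsification with at most m pseudo-inputs per expert, and all function names (rbf, gp_fit_predict, hard_cut_em) are assumptions made for illustration, not the authors' exact model or code.

```python
# Illustrative sketch only: hard-cut EM for a mixture of sparse GP experts.
# Kernel hyperparameters are held fixed here; the paper learns its parameters.
import numpy as np

def rbf(a, b, ell=1.0, sf2=1.0):
    """Squared-exponential kernel matrix for 1-D inputs a, b."""
    return sf2 * np.exp(-0.5 * ((a[:, None] - b[None, :]) / ell) ** 2)

def gp_fit_predict(x_tr, y_tr, x_te, noise=0.1):
    """Exact GP regression on a small training subset; returns the
    predictive mean and variance at the test inputs x_te."""
    K = rbf(x_tr, x_tr) + noise ** 2 * np.eye(len(x_tr))
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y_tr))
    Ks = rbf(x_tr, x_te)
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = rbf(x_te, x_te).diagonal() - np.sum(v ** 2, axis=0) + noise ** 2
    return mean, var

def hard_cut_em(x, y, K=2, m=30, iters=10, noise=0.1, seed=0):
    """Alternate hard assignments (E-step) with per-expert refits (M-step).
    Sparsity here is the simple subset-of-data scheme: each expert keeps
    at most m randomly chosen pseudo-inputs from its assigned points."""
    rng = np.random.default_rng(seed)
    z = rng.integers(K, size=len(x))               # random initial labels
    for _ in range(iters):
        gates, experts = [], []
        for k in range(K):                         # M-step
            xk, yk = x[z == k], y[z == k]
            pi = max(len(xk), 1) / len(x)
            if len(xk) < 2:                        # guard an emptied expert
                xk, yk = x, y
            idx = rng.choice(len(xk), size=min(m, len(xk)), replace=False)
            gates.append((pi, xk.mean(), xk.var() + 1e-6))
            experts.append((xk[idx], yk[idx]))
        logp = np.empty((len(x), K))
        for k, ((pi, mu, s2), (xs, ys)) in enumerate(zip(gates, experts)):
            mean, var = gp_fit_predict(xs, ys, x, noise)
            # Hard cut: log pi_k + log N(x | mu_k, s2_k) + log N(y | GP)
            logp[:, k] = (np.log(pi)
                          - 0.5 * (np.log(s2) + (x - mu) ** 2 / s2)
                          - 0.5 * (np.log(var) + (y - mean) ** 2 / var))
        z = logp.argmax(axis=1)                    # E-step: hard assignment
    return z, experts
```

On a toy two-regime curve (for example, two sinusoids over disjoint input ranges), this loop separates the regimes after a few iterations, and each expert only ever factorizes an m-by-m kernel matrix rather than one over the full dataset, which is the kind of efficiency gain the abstract attributes to the sparse technique.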

Keywords

Mixture of Gaussian processes · Sparsity · Hard-cut EM algorithm · Big data

Notes

Acknowledgement

This work was supported by the Natural Science Foundation of China under Grant 61171138. The authors would like to thank Dr. E. Snelson and Dr. Z. Ghahramani for their valuable advice on the FITC model.

References

  1. Rasmussen, C.E.: Evaluation of Gaussian processes and other methods for non-linear regression. The University of Toronto (1996)
  2. Williams, C.K.I., Barber, D.: Bayesian classification with Gaussian processes. IEEE Trans. Pattern Anal. Mach. Intell. 20(12), 1342–1351 (1998)
  3. Gao, X.B., Wang, X.M., Tao, D.C.: Supervised Gaussian process latent variable model for dimensionality reduction. IEEE Trans. Syst. Man Cybern. B Cybern. 41(2), 425–434 (2011)
  4. Rasmussen, C.E., Kuss, M.: Gaussian processes in reinforcement learning. In: Thrun, S., Saul, L., Schölkopf, B. (eds.) Advances in Neural Information Processing Systems, vol. 16, pp. 751–759. MIT Press, Cambridge (2003)
  5. Yuan, C., Neubauer, C.: Variational mixture of Gaussian process experts. In: Advances in Neural Information Processing Systems, vol. 21, pp. 1897–1904 (2008)
  6. Stachniss, C., Plagemann, C., Lilienthal, A.J., et al.: Gas distribution modeling using sparse Gaussian process mixture models. In: Proceedings of Robotics: Science and Systems, pp. 310–317 (2008)
  7. Tresp, V.: Mixtures of Gaussian processes. In: Advances in Neural Information Processing Systems, vol. 13, pp. 654–660 (2000)
  8. Snelson, E., Ghahramani, Z.: Sparse Gaussian processes using pseudo-inputs. In: Advances in Neural Information Processing Systems, vol. 18, pp. 1257–1264 (2005)
  9. Nguyen, T., Bonilla, E.: Fast allocation of Gaussian process experts. In: Proceedings of the 31st International Conference on Machine Learning, pp. 145–153 (2014)
  10. Wang, Y., Khardon, R.: Sparse Gaussian processes for multi-task learning. In: The European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, pp. 711–727 (2012)
  11. Sun, S., Xu, X.: Variational inference for infinite mixtures of Gaussian processes with applications to traffic flow prediction. IEEE Trans. Intell. Transp. Syst. 12(2), 466–475 (2011)
  12. Meeds, E., Osindero, S.: An alternative infinite mixture of Gaussian process experts. In: Advances in Neural Information Processing Systems, vol. 18, pp. 883–890 (2005)
  13. Gramacy, R.B., Lee, H.K.H.: Bayesian treed Gaussian process models with an application to computer modeling. J. Am. Stat. Assoc. 103(483), 1119–1130 (2008)
  14. Shi, J.Q., Murray-Smith, R., Titterington, D.M.: Bayesian regression and classification using mixtures of Gaussian processes. Int. J. Adapt. Control Sig. Process. 17(2), 149–161 (2003)
  15. Shi, J.Q., Murray-Smith, R., Titterington, D.M.: Hierarchical Gaussian process mixtures for regression. Stat. Comput. 15(1), 31–41 (2005)
  16. Rasmussen, C.E., Ghahramani, Z.: Infinite mixtures of Gaussian process experts. In: Advances in Neural Information Processing Systems, vol. 14, pp. 881–888 (2001)
  17. Fergie, M.P.: Discriminative Pose Estimation Using Mixtures of Gaussian Processes. The University of Manchester (2013)
  18. Sun, S.: Infinite mixtures of multivariate Gaussian processes. In: Proceedings of the International Conference on Machine Learning and Cybernetics, pp. 1011–1016 (2013)
  19. Tayal, A., Poupart, P., Li, Y.: Hierarchical double Dirichlet process mixture of Gaussian processes. In: Proceedings of the 26th AAAI Conference on Artificial Intelligence, pp. 1126–1133 (2012)
  20. Ross, J., Dy, J.: Nonparametric mixture of Gaussian processes with constraints. In: Proceedings of the 30th International Conference on Machine Learning, pp. 1346–1354 (2013)
  21. Chatzis, S.P., Demiris, Y.: Nonparametric mixtures of Gaussian processes with power-law behavior. IEEE Trans. Neural Netw. Learn. Syst. 23(12), 1862–1871 (2012)
  22. Platanios, E.A., Chatzis, S.P.: Mixture Gaussian process conditional heteroscedasticity. IEEE Trans. Pattern Anal. Mach. Intell. 36(5), 888–900 (2014)
  23. Kapoor, A., Ahn, H., Picard, R.W.: Mixture of Gaussian processes for combining multiple modalities. In: Oza, N.C., Polikar, R., Kittler, J., Roli, F. (eds.) MCS 2005. LNCS, vol. 3541, pp. 86–96. Springer, Heidelberg (2005)
  24. Reece, S., Mann, R., Rezek, I., et al.: Gaussian process segmentation of co-moving animals. Proc. Am. Inst. Phys. 1305(1), 430–437 (2011)
  25. Lázaro-Gredilla, M., Van Vaerenbergh, S., Lawrence, N.D.: Overlapping mixtures of Gaussian processes for the data association problem. Pattern Recogn. 45(4), 1386–1395 (2012)
  26. Yang, Y., Ma, J.: An efficient EM approach to parameter learning of the mixture of Gaussian processes. In: Liu, D., Zhang, H., Polycarpou, M., Alippi, C., He, H. (eds.) ISNN 2011, Part II. LNCS, vol. 6676, pp. 165–174. Springer, Heidelberg (2011)
  27. Schiegg, M., Neumann, M., Kersting, K.: Markov logic mixtures of Gaussian processes: towards machines reading regression data. In: Proceedings of the 15th International Conference on Artificial Intelligence and Statistics, JMLR: W&CP, vol. 22, pp. 1002–1011 (2012)
  28. Yu, J., Chen, K., Rashid, M.M.: A Bayesian model averaging based multi-kernel Gaussian process regression framework for nonlinear state estimation and quality prediction of multiphase batch processes with transient dynamics and uncertainty. Chem. Eng. Sci. 93(19), 96–109 (2013)
  29. Chen, Z., Ma, J., Zhou, Y.: A precise hard-cut EM algorithm for mixtures of Gaussian processes. In: Huang, D.-S., Jo, K.-H., Wang, L. (eds.) ICIC 2014. LNCS, vol. 8589, pp. 68–75. Springer, Heidelberg (2014)
  30. Rasmussen, C.E., Williams, C.K.I.: Gaussian Processes for Machine Learning. MIT Press, Cambridge (2006)
  31. Sundararajan, S., Keerthi, S.: Predictive approaches for choosing hyperparameters in Gaussian processes. Neural Comput. 13(5), 1103–1118 (2001)
  32. Quiñonero-Candela, J., Rasmussen, C.E.: A unifying view of sparse approximate Gaussian process regression. J. Mach. Learn. Res. 6, 1935–1959 (2005)
  33. Csató, L., Opper, M.: Sparse online Gaussian processes. Neural Comput. 14(3), 641–669 (2002)
  34. Seeger, M., Williams, C.K.I., Lawrence, N.D.: Fast forward selection to speed up sparse Gaussian process regression. In: Bishop, C.M., Frey, B.J. (eds.) Proceedings of the 9th International Workshop on Artificial Intelligence and Statistics (2003)
  35. Keerthi, S.S., Chu, W.: A matching pursuit approach to sparse Gaussian process regression. In: Advances in Neural Information Processing Systems, vol. 18 (2005)
  36. Smola, A., Bartlett, P.: Sparse greedy Gaussian process regression. Adv. Neural Inf. Process. Syst. 13, 619–625 (2000)
  37. Murphy, K.P.: Machine Learning: A Probabilistic Perspective. MIT Press, Cambridge (2012)
  38. Smola, A.J., Schölkopf, B.: A tutorial on support vector regression. Stat. Comput. 14(3), 199–222 (2004)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Department of Information Science, School of Mathematical Sciences and LMAM, Peking University, Beijing, China