
Multi-view Deep Gaussian Processes

  • Shiliang Sun
  • Qiuyang Liu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11301)

Abstract

Deep Gaussian processes (DGPs) have shown their power in many machine learning tasks. However, when dealing with multi-view data, DGPs assume the same modeling depth for every view, which is often unreasonable because different views typically differ considerably in complexity. In this paper, we propose multi-view deep Gaussian processes (MvDGPs), which take full account of the characteristics of multi-view data. Combining the advantages of DGPs with multi-view learning, MvDGPs can determine the modeling depth for each view independently, which makes the model more flexible and powerful. In contrast with DGPs, MvDGPs support asymmetric modeling depths for different views of data, resulting in better characterizations of the discrepancies among views. Experimental results on multiple multi-view data sets verify the flexibility and effectiveness of the proposed model.
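For intuition, the generative structure described above can be sketched in a few lines of NumPy: a shared latent representation is propagated through a view-specific stack of GP layers, and the depth of each stack is chosen independently per view. This is a minimal illustrative sketch, not the authors' implementation or inference procedure (the paper learns these quantities, whereas here everything is simply sampled); the function names, dimensions, and depth choices are assumptions.

import numpy as np

def rbf_kernel(X, lengthscale=1.0, variance=1.0, jitter=1e-5):
    # Squared-exponential kernel with jitter for a stable Cholesky factorization.
    sq = np.sum(X**2, axis=1)
    d = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return variance * np.exp(-0.5 * d / lengthscale**2) + jitter * np.eye(X.shape[0])

def sample_gp_layer(X, output_dim, rng):
    # One GP layer: each output dimension is an independent draw from GP(0, K(X, X)).
    L = np.linalg.cholesky(rbf_kernel(X))
    return L @ rng.standard_normal((X.shape[0], output_dim))

def sample_view(Z, depth, output_dim, rng):
    # Propagate the shared latent Z through `depth` hidden GP layers,
    # then map to this view's observation space with a final GP layer.
    H = Z
    for _ in range(depth):
        H = sample_gp_layer(H, H.shape[1], rng)
    return sample_gp_layer(H, output_dim, rng)

rng = np.random.default_rng(0)
Z = rng.standard_normal((50, 2))                    # shared latent representation

# Asymmetric depths: view 1 is modeled shallowly, view 2 with a deeper stack.
Y1 = sample_view(Z, depth=1, output_dim=5, rng=rng)
Y2 = sample_view(Z, depth=3, output_dim=8, rng=rng)
print(Y1.shape, Y2.shape)                           # (50, 5) (50, 8)

The key point the sketch makes concrete is that the two views share the same latent Z but pass through stacks of different lengths, which is exactly the asymmetry that a standard DGP with a single fixed depth cannot express.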

Keywords

Multi-view learning · Deep learning · Gaussian process · Unsupervised learning

Acknowledgments

This work is supported by the National Natural Science Foundation of China under Project 61673179 and the Shanghai Knowledge Service Platform Project (No. ZF1213).

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

Department of Computer Science and Technology, East China Normal University, Shanghai, China
