
Indian Buffet Process for Model Selection in Latent Force Models

  • Cristian Guarnizo
  • Mauricio A. Álvarez
  • Alvaro A. Orozco
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9423)

Abstract

Latent force models (LFMs) are a hybrid approach that combines multiple-output Gaussian processes with differential equations, where the covariance functions encode the physical models given by the differential equations. LFMs require the specification of the number of latent functions used to build the covariance function for the outputs. Moreover, they assume that every output is explained by the entire set of latent functions, which is not the case in many real applications. In this paper we propose the use of an Indian Buffet Process (IBP) as a way to perform model selection over the number of latent Gaussian processes in LFM applications. The IBP also allows us to infer the interconnections between the latent functions and the outputs. We use variational inference to approximate the posterior distributions, and demonstrate the performance of the proposed model on artificial data and a motion capture dataset.
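The paper does not ship code, but the role of the IBP described above can be made concrete with a small sketch: the IBP places a prior over a binary matrix Z whose entry Z[d, q] indicates whether output d uses latent force q, while leaving the total number of active latent functions unbounded. The snippet below, a minimal illustration only, samples such a matrix via the sequential ("restaurant") construction of the IBP; the names (sample_ibp, num_outputs, alpha) are assumptions for the example and are not taken from the paper.

    import numpy as np

    def sample_ibp(num_outputs, alpha, rng=None):
        """Sample a binary connectivity matrix Z (outputs x latent forces)
        from an Indian Buffet Process prior with concentration alpha."""
        rng = np.random.default_rng() if rng is None else rng
        Z = np.zeros((num_outputs, 0), dtype=int)
        for d in range(num_outputs):
            # Previously used latent forces: output d reuses force q with
            # probability m_q / (d + 1), where m_q counts earlier outputs
            # already connected to force q.
            if Z.shape[1] > 0:
                m = Z[:d].sum(axis=0)
                Z[d] = rng.random(Z.shape[1]) < m / (d + 1)
            # New latent forces: output d additionally activates
            # Poisson(alpha / (d + 1)) forces that no earlier output used.
            k_new = rng.poisson(alpha / (d + 1))
            if k_new > 0:
                new_cols = np.zeros((num_outputs, k_new), dtype=int)
                new_cols[d, :] = 1
                Z = np.hstack([Z, new_cols])
        return Z

    # Example: connectivity between 5 outputs and an open-ended set of forces.
    Z = sample_ibp(num_outputs=5, alpha=2.0)
    print(Z)  # rows = outputs, columns = latent forces; 1 = connection used

In the paper itself this connectivity is not sampled forward as above: the posterior over the IBP matrix, together with the latent Gaussian processes, is approximated with variational inference.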

Keywords

Indian buffet process · Latent force models · Gaussian processes · Regression


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Cristian Guarnizo (1)
  • Mauricio A. Álvarez (1)
  • Alvaro A. Orozco (1)

  1. Engineering PhD Program, Universidad Tecnológica de Pereira, Pereira, Colombia
