Fast Variational Bayesian Linear State-Space Model

  • Jaakko Luttinen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8188)

Abstract

This paper presents a fast variational Bayesian method for linear state-space models. The standard variational Bayesian expectation-maximization (VB-EM) algorithm is improved by a parameter expansion which optimizes the rotation of the latent space. With this approach, inference is orders of magnitude faster than with the standard method. The speed of the proposed method is demonstrated on an artificial dataset and a large real-world dataset; the experiments show that the standard VB-EM algorithm is not suitable for large datasets because it converges extremely slowly. In addition, the paper estimates the temporal state variables using a smoothing algorithm based on the block LDL decomposition. This smoothing algorithm reduces the number of required matrix inversions and avoids the model augmentation used in previous approaches.
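As background for the rotation-based parameter expansion, the sketch below (an illustrative assumption, not the paper's code) checks numerically that an invertible transformation R of the latent space, combined with the compensating updates A → R A R⁻¹, C → C R⁻¹, and x₀ → R x₀, leaves the observations of a linear state-space model unchanged. This invariance is what lets the rotation be optimized freely to speed up VB-EM convergence without altering the modeled data distribution.

```python
import numpy as np

# Hypothetical small linear state-space model: x_n = A x_{n-1}, y_n = C x_n
# (noise omitted for the invariance check). Dimensions are illustrative.
rng = np.random.default_rng(0)
D, M, N = 3, 5, 10          # latent dim, observation dim, time steps
A = 0.9 * np.eye(D) + 0.05 * rng.standard_normal((D, D))
C = rng.standard_normal((M, D))
x0 = rng.standard_normal(D)

def simulate(A, C, x0, N):
    """Deterministic rollout of the state-space recursion."""
    x, ys = x0, []
    for _ in range(N):
        x = A @ x
        ys.append(C @ x)
    return np.array(ys)

# Apply an invertible transformation R to the latent space with the
# compensating updates A -> R A R^-1, C -> C R^-1, x0 -> R x0.
R = rng.standard_normal((D, D)) + 2.0 * np.eye(D)
Rinv = np.linalg.inv(R)

Y1 = simulate(A, C, x0, N)
Y2 = simulate(R @ A @ Rinv, C @ Rinv, R @ x0, N)

# The observation sequences coincide: the rotation only reparameterizes
# the latent space, so it can be chosen to accelerate convergence.
print(np.allclose(Y1, Y2))
```

In the paper's VB setting the same idea applies to posterior distributions rather than point estimates: the rotation is chosen at each iteration to maximize the variational lower bound.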

Keywords

variational Bayesian methods · linear state-space models · parameter expansion


References

  1. Bar-Shalom, Y., Li, X.R., Kirubarajan, T.: Estimation with Applications to Tracking and Navigation. Wiley-Interscience (2001)
  2. Shumway, R.H., Stoffer, D.S.: Time Series Analysis and its Applications. Springer (2000)
  3. Beal, M.J., Ghahramani, Z.: The variational Bayesian EM algorithm for incomplete data: with application to scoring graphical model structures. Bayesian Statistics 7, 453–464 (2003)
  4. Bishop, C.M.: Pattern Recognition and Machine Learning. Information Science and Statistics, 2nd edn. Springer, New York (2006)
  5. Liu, C., Rubin, D.B., Wu, Y.N.: Parameter expansion to accelerate EM: the PX-EM algorithm. Biometrika 85, 755–770 (1998)
  6. Qi, Y.A., Jaakkola, T.S.: Parameter expanded variational Bayesian methods. In: [16], pp. 1097–1104
  7. Luttinen, J., Ilin, A.: Transformations in variational Bayesian factor analysis to speed up learning. Neurocomputing 73, 1093–1102 (2010)
  8. Klami, A., Virtanen, S., Kaski, S.: Bayesian canonical correlation analysis. Journal of Machine Learning Research 14, 899–937 (2013)
  9. Kang, H., Choi, S.: Probabilistic models for common spatial patterns: Parameter-expanded EM and variational Bayes. In: Proceedings of the Twenty-Sixth AAAI Conference on Artificial Intelligence (2012)
  10. Rauch, H.E., Tung, F., Striebel, C.T.: Maximum likelihood estimates of linear dynamic systems. AIAA Journal 3(8), 1445–1450 (1965)
  11. Beal, M.J.: Variational algorithms for approximate Bayesian inference. PhD thesis, Gatsby Computational Neuroscience Unit, University College London (2003)
  12. Barber, D., Chiappa, S.: Unified inference for variational Bayesian linear Gaussian state-space models. In: [16]
  13. Eubank, R.L., Wang, S.: The equivalence between the Cholesky decomposition and the Kalman filter. The American Statistician 56(1), 39–43 (2002)
  14. Kalman, R.E., Bucy, R.S.: New results in linear filtering and prediction theory. Journal of Basic Engineering 85, 95–108 (1961)
  15. Luttinen, J., Ilin, A., Karhunen, J.: Bayesian robust PCA of incomplete data. Neural Processing Letters 36(2), 189–202 (2012)
  16. Schölkopf, B., Platt, J., Hoffman, T. (eds.): Advances in Neural Information Processing Systems 19. MIT Press, Cambridge (2007)

Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Jaakko Luttinen
  1. Aalto University, Espoo, Finland