Tensors and Latent Variable Models

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9237)

Abstract

In this paper we discuss existing and new connections between latent variable models from machine learning and tensors (multi-way arrays) from multilinear algebra. A few ideas have been developed independently in the two communities, but many useful links remain unexplored, and ideas from one community could still be borrowed and used in the other. We start our discussion with simple concepts, such as independent variables and rank-1 matrices, and gradually increase the difficulty. The final goal is to connect discrete latent tree graphical models to state-of-the-art tensor decompositions in order to find tractable representations of probability tables over many variables.
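The simplest of these connections can be made concrete with a small numerical illustration. The following sketch is not taken from the paper; it only uses NumPy to show the standard facts the abstract alludes to: the joint probability table of two independent discrete variables is a rank-1 matrix, a latent-class (mixture) model with K hidden states yields a table of rank at most K, and with three observed variables the same construction is a (nonnegative) CP decomposition of a third-order tensor.

import numpy as np

rng = np.random.default_rng(0)

def random_pmf(n):
    """Random probability vector of length n."""
    p = rng.random(n)
    return p / p.sum()

# Independence: P(x1, x2) = P(x1) P(x2), so the table is an outer product.
p, q = random_pmf(4), random_pmf(5)
P_indep = np.outer(p, q)
print(np.linalg.matrix_rank(P_indep))      # 1

# Latent class model: P(x1, x2) = sum_h P(h) P(x1 | h) P(x2 | h).
K = 3
prior = random_pmf(K)                                    # P(h)
A = np.column_stack([random_pmf(4) for _ in range(K)])   # A[:, h] = P(x1 | h)
B = np.column_stack([random_pmf(5) for _ in range(K)])   # B[:, h] = P(x2 | h)
P_mix = A @ np.diag(prior) @ B.T
print(np.linalg.matrix_rank(P_mix))        # at most K = 3

# Three observed variables give a third-order tensor of CP rank at most K:
# P(x1, x2, x3) = sum_h P(h) P(x1 | h) P(x2 | h) P(x3 | h).
C = np.column_stack([random_pmf(6) for _ in range(K)])   # C[:, h] = P(x3 | h)
T = np.einsum('h,ih,jh,kh->ijk', prior, A, B, C)
print(T.shape, np.isclose(T.sum(), 1.0))   # (4, 5, 6) True

Larger latent tree models generalize this picture: each edge of the tree contributes a factor, and the full probability table inherits a structured low-rank (hierarchical) tensor format rather than a dense exponential-size array.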

Keywords

Latent variable models · Tensor · Low rank

Acknowledgments

This work was supported in part by the Fund for Scientific Research (FWO-Vlaanderen), by FWO project G.0280.15N, by the Flemish Government (Methusalem), by the Belgian Government through the Inter-university Poles of Attraction (IAP VII) Program (DYSCO II, Dynamical systems, control and optimization, 2012–2017), by the ERC Advanced Grant SNLSID under contract 320378, and by the ERC Starting Grant SLRA under contract 258581. Mariya Ishteva was an FWO Pegasus Marie Curie Fellow.


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

Vrije Universiteit Brussel (VUB), Brussels, Belgium