Tensors and Latent Variable Models
In this paper we discuss existing and new connections between latent variable models from machine learning and tensors (multi-way arrays) from multilinear algebra. A few ideas have been developed independently in the two communities; however, many useful links remain unexplored, and ideas could still be borrowed from one community and applied in the other. We start our discussion with simple concepts, such as independent variables and rank-1 matrices, and gradually increase the difficulty. The final goal is to connect discrete latent tree graphical models to state-of-the-art tensor decompositions in order to find tractable representations of probability tables over many variables.
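The starting point mentioned above, the link between independent variables and rank-1 structure, can be illustrated with a minimal sketch (the marginal distributions below are hypothetical, chosen only for illustration): for two independent discrete random variables, the joint probability table is the outer product of the marginals, and hence a rank-1 matrix.

```python
import numpy as np

# Hypothetical marginals of two independent discrete variables X and Y.
p_x = np.array([0.2, 0.5, 0.3])
p_y = np.array([0.6, 0.4])

# Under independence, P[i, j] = p_X(i) * p_Y(j), i.e., the joint
# probability table is the outer product of the marginal vectors.
P = np.outer(p_x, p_y)

# The outer product of two nonzero vectors is a rank-1 matrix.
print(np.linalg.matrix_rank(P))  # 1

# It is still a valid probability table: entries sum to 1.
print(P.sum())  # 1.0
```

The same reasoning extends to more variables: the joint table of several mutually independent discrete variables is a rank-1 tensor, which is what makes low-rank tensor decompositions natural candidates for compactly representing probability tables.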
Keywords: Latent variable models · Tensor · Low rank
This work was supported in part by the Fund for Scientific Research (FWO-Vlaanderen), by FWO project G.0280.15N, by the Flemish Government (Methusalem), by the Belgian Government through the Inter-university Poles of Attraction (IAP VII) Program (DYSCO II, Dynamical systems, control and optimization, 2012–2017), by the ERC Advanced Grant SNLSID under contract 320378, and by the ERC Starting Grant SLRA under contract 258581. Mariya Ishteva was an FWO Pegasus Marie Curie Fellow.