Linked PARAFAC/CP Tensor Decomposition and Its Fast Implementation for Multi-block Tensor Analysis

  • Tatsuya Yokota
  • Andrzej Cichocki
  • Yukihiko Yamashita
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7665)

Abstract

In this paper, we propose a new flexible group tensor analysis model called the linked CP tensor decomposition (LCPTD). The LCPTD method simultaneously decomposes multiple given tensors into common factor matrices, individual factor matrices, and core tensors. We apply the Hierarchical Alternating Least Squares (HALS) algorithm to the LCPTD model and additionally impose constraints to obtain sparse and nonnegative factors. Furthermore, we conduct experiments that demonstrate the advantages of this model over existing models.
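
To make the model concrete, below is a minimal NumPy sketch (not the authors' implementation) of a linked CP decomposition with HALS-style column-wise updates: the first n_common columns of each block's mode-1 factor are shared across all blocks, the remaining columns and the other-mode factors are block-specific, and nonnegativity is enforced by projection. The weights/core tensors and the sparsity constraint mentioned above are omitted for brevity; the names linked_cp_hals, rank, and n_common are assumptions introduced for this example.

```python
import numpy as np

def unfold(X, mode):
    # Mode-n unfolding of a 3rd-order tensor (rows indexed by the chosen mode).
    return np.moveaxis(X, mode, 0).reshape(X.shape[mode], -1)

def khatri_rao(U, V):
    # Column-wise Khatri-Rao product of U (J x R) and V (M x R) -> (J*M x R).
    J, R = U.shape
    M = V.shape[0]
    return np.einsum('jr,mr->jmr', U, V).reshape(J * M, R)

def linked_cp_hals(tensors, rank, n_common, n_iter=100, seed=0):
    """Sketch of a linked CP model: the mode-1 factors A_k share their first
    `n_common` columns across blocks; remaining columns and B_k, C_k are individual."""
    rng = np.random.default_rng(seed)
    K = len(tensors)
    I = tensors[0].shape[0]
    A = [rng.random((I, rank)) for _ in range(K)]
    B = [rng.random((X.shape[1], rank)) for X in tensors]
    C = [rng.random((X.shape[2], rank)) for X in tensors]
    for k in range(1, K):                      # synchronize the shared columns
        A[k][:, :n_common] = A[0][:, :n_common]

    for _ in range(n_iter):
        # HALS updates for the (partially shared) mode-1 factors.
        T = [unfold(X, 0) @ khatri_rao(B[k], C[k]) for k, X in enumerate(tensors)]
        G = [(B[k].T @ B[k]) * (C[k].T @ C[k]) for k in range(K)]
        for r in range(rank):
            if r < n_common:
                # Common column: aggregate the column-wise update over all blocks.
                num = sum(T[k][:, r] - A[k] @ G[k][:, r] for k in range(K))
                den = sum(G[k][r, r] for k in range(K)) + 1e-12
                col = np.maximum(0.0, A[0][:, r] + num / den)
                for k in range(K):
                    A[k][:, r] = col
            else:
                # Individual column: update separately in each block.
                for k in range(K):
                    num = T[k][:, r] - A[k] @ G[k][:, r]
                    A[k][:, r] = np.maximum(0.0, A[k][:, r] + num / (G[k][r, r] + 1e-12))
        # Per-block HALS updates for the individual factors B_k and C_k.
        for k, X in enumerate(tensors):
            for F, U, V, mode in ((B[k], A[k], C[k], 1), (C[k], A[k], B[k], 2)):
                Tm = unfold(X, mode) @ khatri_rao(U, V)
                Gm = (U.T @ U) * (V.T @ V)
                for r in range(rank):
                    step = Tm[:, r] - F @ Gm[:, r]
                    F[:, r] = np.maximum(0.0, F[:, r] + step / (Gm[r, r] + 1e-12))
    return A, B, C

# Usage: three random nonnegative tensors sharing 2 of 4 components in mode 1.
if __name__ == "__main__":
    rng = np.random.default_rng(1)
    blocks = [rng.random((10, 8, 6)) for _ in range(3)]
    A, B, C = linked_cp_hals(blocks, rank=4, n_common=2)
    print(np.allclose(A[0][:, :2], A[1][:, :2]))   # shared columns are identical
```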

Keywords

Tensor decompositions of multi-block data · PARAFAC/CP model · Group analysis · Hierarchical Alternating Least Squares (HALS)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Tatsuya Yokota (1, 2)
  • Andrzej Cichocki (2)
  • Yukihiko Yamashita (1)
  1. Tokyo Institute of Technology, Japan
  2. RIKEN Brain Science Institute, Japan
