
Nonnegative Tensor Train Decompositions for Multi-domain Feature Extraction and Clustering

  • Conference paper
Neural Information Processing (ICONIP 2016)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 9949)


Abstract

Tensor train (TT) is a modern tensor decomposition model for low-rank approximation of high-order tensors. For nonnegative multiway array data analysis, we propose a nonnegative TT (NTT) decomposition algorithm for the NTT model and for a hybrid model called the NTT-Tucker model. By employing the hierarchical alternating least squares (HALS) approach, each fiber vector of the core tensors is optimized efficiently at each iteration. We compare the performance of the proposed method with a standard nonnegative Tucker decomposition (NTD) algorithm on benchmark data sets, including event-related potential data and facial image data, in multi-domain feature extraction and clustering tasks. The results illustrate that the proposed algorithm extracts physically meaningful features with relatively low storage and computational costs compared to the standard NTD model.
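
To make the decomposition concrete, below is a minimal Python/NumPy sketch of a nonnegative tensor-train factorization of a third-order tensor with TT ranks (1, r1, r2, 1). It is not the paper's fiber-wise HALS update: instead it uses a simpler projected alternating least-squares scheme (each core is solved by an unconstrained least-squares step and clipped at zero). The function name `nonneg_tt_als` and all parameter choices are hypothetical, for illustration only.

```python
import numpy as np

def nonneg_tt_als(X, r1, r2, n_iter=200, seed=0):
    """Illustrative projected-ALS sketch of a nonnegative tensor-train (NTT)
    decomposition of a 3rd-order tensor X with TT ranks (1, r1, r2, 1).
    Each core is updated by a least-squares solve followed by clipping at 0;
    this stands in for (and is simpler than) the paper's fiber-wise HALS."""
    I1, I2, I3 = X.shape
    rng = np.random.default_rng(seed)
    G1 = rng.random((I1, r1))          # first core, stored as an I1 x r1 matrix
    G2 = rng.random((r1, I2, r2))      # middle core (3rd-order core tensor)
    G3 = rng.random((r2, I3))          # last core, stored as an r2 x I3 matrix

    for _ in range(n_iter):
        # Update G1:  X_(1) (I1 x I2*I3)  ~=  G1 @ M,  M built from cores 2 and 3.
        M = np.einsum('rjs,sk->rjk', G2, G3).reshape(r1, I2 * I3)
        X1 = X.reshape(I1, I2 * I3)
        G1 = np.maximum(np.linalg.lstsq(M.T, X1.T, rcond=None)[0].T, 0.0)

        # Update G2:  X permuted to (I2, I1*I3)  ~=  mat(G2) @ kron(G1.T, G3).
        K = np.kron(G1.T, G3)                             # (r1*r2, I1*I3)
        X2 = X.transpose(1, 0, 2).reshape(I2, I1 * I3)
        G2_mat = np.maximum(np.linalg.lstsq(K.T, X2.T, rcond=None)[0].T, 0.0)
        G2 = G2_mat.reshape(I2, r1, r2).transpose(1, 0, 2)

        # Update G3:  X reshaped to (I1*I2, I3)  ~=  N @ G3,  N from cores 1 and 2.
        N = np.einsum('ir,rjs->ijs', G1, G2).reshape(I1 * I2, r2)
        X3 = X.reshape(I1 * I2, I3)
        G3 = np.maximum(np.linalg.lstsq(N, X3, rcond=None)[0], 0.0)

    X_hat = np.einsum('ir,rjs,sk->ijk', G1, G2, G3)       # TT reconstruction
    return G1, G2, G3, X_hat

# Usage: recover a nonnegative tensor that has an exact NTT structure.
rng = np.random.default_rng(1)
A, B, C = rng.random((10, 3)), rng.random((3, 12, 4)), rng.random((4, 8))
X = np.einsum('ir,rjs,sk->ijk', A, B, C)
G1, G2, G3, X_hat = nonneg_tt_als(X, r1=3, r2=4)
print("relative error:", np.linalg.norm(X - X_hat) / np.linalg.norm(X))
```

Note the storage contrast that motivates the paper: the TT cores grow linearly in the tensor order, whereas a Tucker core grows exponentially, which is why the NTT and NTT-Tucker models are attractive for higher-order data.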



Author information

Correspondence to Namgil Lee.


Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Lee, N., Phan, A.H., Cong, F., Cichocki, A. (2016). Nonnegative Tensor Train Decompositions for Multi-domain Feature Extraction and Clustering. In: Hirose, A., Ozawa, S., Doya, K., Ikeda, K., Lee, M., Liu, D. (eds) Neural Information Processing. ICONIP 2016. Lecture Notes in Computer Science, vol 9949. Springer, Cham. https://doi.org/10.1007/978-3-319-46675-0_10


  • DOI: https://doi.org/10.1007/978-3-319-46675-0_10


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-46674-3

  • Online ISBN: 978-3-319-46675-0

  • eBook Packages: Computer Science, Computer Science (R0)
