Geometrical Formulation of the Nonnegative Matrix Factorization

  • Shotaro Akaho
  • Hideitsu Hino
  • Neneka Nara
  • Noboru Murata
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11303)

Abstract

Nonnegative matrix factorization (NMF) has many applications as a tool for dimension reduction. In this paper, we reformulate NMF from an information-geometrical viewpoint. We show that a conventional optimization criterion is not geometrically natural, and we therefore propose a more natural criterion. With this formulation, we can apply a geometrical algorithm based on the Pythagorean theorem. We also show through numerical experiments that the proposed algorithm improves on the existing algorithm.
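The abstract refers to a "conventional optimization criterion" without specifying it; a common instance is the generalized Kullback-Leibler divergence minimized by Lee and Seung's multiplicative updates. As background only, and not the paper's geometric, Pythagorean-theorem-based algorithm, a minimal NumPy sketch of that conventional baseline might look as follows; the function name nmf_kl and all parameters are illustrative assumptions.

```python
# Minimal sketch of conventional NMF: Lee-Seung multiplicative updates for
# the generalized KL (I-) divergence D(X || WH). This is the standard
# baseline criterion, not the geometric algorithm proposed in the paper.
import numpy as np

def nmf_kl(X, rank, n_iter=200, eps=1e-12, seed=0):
    """Factorize a nonnegative m x n matrix X as W @ H,
    with W (m x rank) and H (rank x n), by minimizing D(X || WH)."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank)) + eps
    H = rng.random((rank, n)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        # H update: H <- H * (W^T (X / WH)) / (W^T 1)
        H *= (W.T @ (X / WH)) / (W.sum(axis=0, keepdims=True).T + eps)
        WH = W @ H + eps
        # W update: W <- W * ((X / WH) H^T) / (1 H^T)
        W *= ((X / WH) @ H.T) / (H.sum(axis=1, keepdims=True).T + eps)
    return W, H

if __name__ == "__main__":
    # Usage: recover a rank-3 structure from a synthetic nonnegative matrix.
    rng = np.random.default_rng(1)
    X = rng.random((20, 3)) @ rng.random((3, 30))
    W, H = nmf_kl(X, rank=3)
    WH = W @ H
    kl = np.sum(X * np.log((X + 1e-12) / (WH + 1e-12)) - X + WH)
    print(f"generalized KL divergence after fitting: {kl:.4f}")
```

Each multiplicative update keeps the factors nonnegative automatically, which is why this family of algorithms is the usual point of comparison for alternative NMF criteria.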

Keywords

Information geometry · Dimension reduction · Topic model


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. National Institute of Advanced Industrial Science and Technology, Tsukuba, Japan
  2. The Institute of Statistical Mathematics, Tachikawa, Japan
  3. Waseda University, Tokyo, Japan