Nonnegative Dictionary Learning by Nonnegative Matrix Factorization with a Sparsity Constraint

  • Zunyi Tang
  • Shuxue Ding
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7368)


In this paper, we propose an overcomplete nonnegative dictionary learning method for sparse representation of signals, posed as a problem of nonnegative matrix factorization (NMF) with a sparsity constraint. We show that, once the sparsity constraint is introduced, the problem can be cast as two sequential optimization problems over parabolic functions, although these parabolas differ from those arising in the unconstrained case [1,2]. The problems can therefore be solved efficiently by generalizing the hierarchical alternating least squares (HALS) algorithm, since the original HALS handles only the unconstrained case. The dictionary learning process converges quickly and its computational cost is low. Numerical experiments show that the algorithm outperforms nonnegative K-SVD (NN-KSVD) and two other compared algorithms, while also remarkably reducing the computational cost.
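To make the approach concrete, the following is a minimal sketch of NMF with an ℓ1 sparsity penalty on the coefficients, solved by HALS-style per-atom updates in the spirit described above. The function name, the penalty weight `lam`, and all other parameters are illustrative assumptions, not the paper's actual implementation; each update is the closed-form minimum of a one-dimensional parabola, shifted by the penalty and projected onto the nonnegative orthant.

```python
import numpy as np

def sparse_nmf_hals(Y, n_atoms, lam=0.1, n_iter=100, seed=0):
    """Sketch of HALS-style NMF with an l1 sparsity penalty on X,
    so that Y ~= A @ X with A >= 0, X >= 0.
    Names and parameters are illustrative, not from the paper."""
    rng = np.random.default_rng(seed)
    m, n = Y.shape
    A = rng.random((m, n_atoms))   # dictionary (atoms in columns)
    X = rng.random((n_atoms, n))   # sparse coefficients
    eps = 1e-12
    for _ in range(n_iter):
        for j in range(n_atoms):
            # Residual with atom j's contribution added back in.
            R = Y - A @ X + np.outer(A[:, j], X[j, :])
            # Coefficient row: parabola minimum shifted by the l1
            # penalty lam, then projected to be nonnegative.
            X[j, :] = np.maximum(A[:, j] @ R - lam, 0.0) / (A[:, j] @ A[:, j] + eps)
            # Dictionary atom: parabola minimum, projected and
            # renormalized to unit norm.
            a = np.maximum(R @ X[j, :], 0.0)
            A[:, j] = a / (np.linalg.norm(a) + eps)
    return A, X
```

Because each subproblem is a simple quadratic in one block of variables, every update is a cheap closed-form expression, which is what keeps the per-iteration cost of HALS-type methods low.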


Keywords: dictionary learning · sparse representation · nonnegative matrix factorization (NMF) · hierarchical alternating least squares (HALS) · overcomplete dictionary



References

  1. Cichocki, A., Zdunek, R., Amari, S.-I.: Hierarchical ALS Algorithms for Nonnegative Matrix and 3D Tensor Factorization. In: Davies, M.E., James, C.J., Abdallah, S.A., Plumbley, M.D. (eds.) ICA 2007. LNCS, vol. 4666, pp. 169–176. Springer, Heidelberg (2007)
  2. Cichocki, A., Phan, A.-H.: Fast local algorithms for large scale nonnegative matrix and tensor factorizations. IEICE Trans. on Fundamentals of Electronics E92-A(3), 708–721 (2009)
  3. Tošić, I., Frossard, P.: Dictionary learning. IEEE Signal Processing Magazine 28(2), 27–38 (2011)
  4. Olshausen, B.A., Field, D.J.: Emergence of simple-cell receptive field properties by learning a sparse code for natural images. Nature 381, 607–609 (1996)
  5. Aharon, M., Elad, M., Bruckstein, A.: K-SVD and its nonnegative variant for dictionary design. In: Proceedings of the SPIE Conference Wavelets, vol. 5914, pp. 327–339 (July 2005)
  6. Hoyer, P.O.: Non-negative matrix factorization with sparseness constraints. Journal of Machine Learning Research 5, 1457–1469 (2004)
  7. Lee, D.D., Seung, H.S.: Learning the parts of objects by nonnegative matrix factorization. Nature 401, 788–791 (1999)
  8. Lee, D.D., Seung, H.S.: Algorithms for non-negative matrix factorization. In: Advances in Neural Information Processing Systems, pp. 556–562 (2001)
  9. Peharz, R., Stark, M., Pernkopf, F.: Sparse nonnegative matrix factorization using ℓ0-constraints. In: 2010 IEEE International Workshop on Machine Learning for Signal Processing (MLSP), pp. 83–88 (September 2010)
  10. Gillis, N.: Nonnegative Matrix Factorization: Complexity, Algorithms and Applications. PhD thesis, Université catholique de Louvain (2011)
  11. Berry, M.W., Browne, M., Langville, A.N., Pauca, V.P., Plemmons, R.J.: Algorithms and applications for approximate nonnegative matrix factorization. Computational Statistics & Data Analysis 52(1), 155–173 (2007)
  12. Aharon, M., Elad, M., Bruckstein, A.: K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Trans. on Signal Processing 54(11), 4311–4322 (2006)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Zunyi Tang (1)
  • Shuxue Ding (1)

  1. Graduate School of Computer Science and Engineering, The University of Aizu, Aizu-Wakamatsu City, Japan