Structured Dictionaries

  • Bogdan Dumitrescu
  • Paul Irofti
Chapter

Abstract

Endowing the dictionary with a structure may be beneficial: it can better model certain classes of signals and speed up the representation and learning processes, at the cost of some of the freedom of a general dictionary. We study here several unrelated types of structures and present DL algorithms adapted to each of them. Sparse dictionaries assume that the atoms are sparse combinations of the columns of a matrix, usually a square transform. This is equivalent to factoring the dictionary as a product of a dense and a sparse matrix or, generalizing the concept, as a product of several sparse matrices; this structure can be seen as the ultimate approach to parsimony via sparsity. Dictionaries made of orthogonal blocks have several appealing properties, including better incoherence. Of particular interest is the case where a single block is used for the sparse representation, which makes sparse coding extremely fast owing to its simplicity and parallelism. Shift-invariant dictionaries have the advantage of being insensitive to the way a long signal is cut into smaller patches for processing; they also admit fast representation algorithms based on the FFT. Separable dictionaries work with 2D signals without vectorization: a pair of dictionaries is used instead of a single one. The representation is more economical and may be better suited to image processing. The concept can be generalized to more than two dimensions by working with tensors; we present a few theoretical notions that pave the way to tensor DL. Finally, composite dictionaries have two components: one is learned off-line, as usual, while the other is learned directly on the set of signals to be processed. This slows down processing but can bring extra quality.
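
For quick reference, the structural models surveyed above can be written schematically as factorizations of the dictionary D. This is only a sketch in generic notation: the symbols Φ, Z, S_k, Q_i, A, B, X, d_l and the subscripts "off"/"on" are placeholders introduced here and are not necessarily the chapter's notation.

\[
\begin{aligned}
&\text{sparse dictionary:} && D = \Phi Z, \quad \text{columns of } Z \text{ sparse, } \Phi \text{ usually a square transform}\\
&\text{multi-factor generalization:} && D = S_1 S_2 \cdots S_K, \quad \text{each factor } S_k \text{ sparse}\\
&\text{union of orthogonal blocks:} && D = [\, Q_1 \ \ Q_2 \ \cdots \ Q_L \,], \quad Q_i^\top Q_i = I\\
&\text{shift-invariant (convolutional):} && y \approx \textstyle\sum_{l} d_l * x_l \quad \text{(circular convolution, computable via the FFT)}\\
&\text{separable (2D):} && Y \approx A X B^\top, \quad X \text{ sparse}\\
&\text{composite:} && D = [\, D_{\mathrm{off}} \ \ D_{\mathrm{on}} \,], \quad D_{\mathrm{off}} \text{ learned off-line, } D_{\mathrm{on}} \text{ learned on the signals to be processed}
\end{aligned}
\]

In each case the sparse coding step exploits the structure (sparsity of the factors, orthogonality of a block, FFT-based convolution, or the two-sided product for 2D signals), which is the source of the speed-ups mentioned above.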

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Bogdan Dumitrescu (1)
  • Paul Irofti (2)
  1. Department of Automatic Control and Systems Engineering, Faculty of Automatic Control and Computers, University Politehnica of Bucharest, Bucharest, Romania
  2. Department of Computer Science, Faculty of Mathematics and Computer Science, University of Bucharest, Bucharest, Romania
