
Approximation Algorithms for Tensor Clustering

  • Stefanie Jegelka
  • Suvrit Sra
  • Arindam Banerjee
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5809)

Abstract

We present the first (to our knowledge) approximation algorithm for tensor clustering, a powerful generalization of basic 1D clustering. Tensors are increasingly common in modern applications dealing with complex heterogeneous data, and clustering them is a fundamental tool for data analysis and pattern discovery. Akin to their 1D cousins, common tensor clustering formulations are NP-hard to optimize. But, unlike the 1D case, no approximation algorithms seem to be known. We address this imbalance and build on recent co-clustering work to derive a tensor clustering algorithm with approximation guarantees, allowing metrics and divergences (e.g., Bregman) as objective functions. In doing so, we answer two open questions of Anagnostopoulos et al. (2008). Our analysis yields a constant approximation factor independent of the data size; a worst-case example shows this factor to be tight for Euclidean co-clustering. Empirically, however, the approximation factor is observed to be conservative, so our method is also usable in practice.
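To make the objective concrete, the sketch below evaluates the squared-Euclidean special case of a tensor co-clustering cost: each mode of a 3-way tensor is partitioned, and every entry is compared against the mean of its cluster block. This is only an illustration of the type of objective discussed in the abstract, not the authors' algorithm; the function name and toy data are hypothetical, and the paper additionally handles general Bregman divergences and metrics.

```python
import numpy as np

def tensor_coclustering_cost(X, row_labels, col_labels, depth_labels):
    """Squared-Euclidean tensor co-clustering objective (illustrative sketch):
    sum of squared deviations of each entry from its cluster-block mean."""
    cost = 0.0
    for r in np.unique(row_labels):
        for c in np.unique(col_labels):
            for d in np.unique(depth_labels):
                block = X[np.ix_(row_labels == r,
                                 col_labels == c,
                                 depth_labels == d)]
                if block.size:
                    cost += ((block - block.mean()) ** 2).sum()
    return cost

# Toy 3-way tensor with two clusters along each mode (hypothetical data).
X = np.random.rand(6, 4, 4)
cost = tensor_coclustering_cost(X,
                                row_labels=np.array([0, 0, 0, 1, 1, 1]),
                                col_labels=np.array([0, 0, 1, 1]),
                                depth_labels=np.array([0, 1, 0, 1]))
print(f"co-clustering objective: {cost:.3f}")
```

An approximation algorithm for this problem would choose the mode-wise partitions so that the returned cost is provably within a constant factor of the optimum.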

Keywords

Approximation Algorithm, Approximation Factor, Projection Matrices, Approximation Guarantee, Constant Approximation Factor


References

  1. Banerjee, A., Basu, S., Merugu, S.: Multi-way Clustering on Relation Graphs. In: SIAM Conf. on Data Mining (SDM) (2007)
  2. Shashua, A., Zass, R., Hazan, T.: Multi-way Clustering Using Super-Symmetric Non-negative Tensor Factorization. In: Leonardis, A., Bischof, H., Pinz, A. (eds.) ECCV 2006. LNCS, vol. 3954, pp. 595–608. Springer, Heidelberg (2006)
  3. Dhillon, I.S., Mallela, S., Modha, D.S.: Information-theoretic co-clustering. In: KDD, pp. 89–98 (2003)
  4. Banerjee, A., Dhillon, I.S., Ghosh, J., Merugu, S., Modha, D.S.: A Generalized Maximum Entropy Approach to Bregman Co-clustering and Matrix Approximation. JMLR 8, 1919–1986 (2007)
  5. Ackermann, M.R., Blömer, J.: Coresets and Approximate Clustering for Bregman Divergences. In: ACM-SIAM Symp. on Discrete Algorithms (SODA) (2009)
  6. Ackermann, M.R., Blömer, J., Sohler, C.: Clustering for metric and non-metric distance measures. In: ACM-SIAM Symp. on Discrete Algorithms (SODA) (April 2008)
  7. Arthur, D., Vassilvitskii, S.: k-means++: The Advantages of Careful Seeding. In: ACM-SIAM Symp. on Discrete Algorithms (SODA), pp. 1027–1035 (2007)
  8. Nock, R., Luosto, P., Kivinen, J.: Mixed Bregman clustering with approximation guarantees. In: Daelemans, W., Goethals, B., Morik, K. (eds.) ECML/PKDD 2008, Part II. LNCS (LNAI), vol. 5212, pp. 154–169. Springer, Heidelberg (2008)
  9. Sra, S., Jegelka, S., Banerjee, A.: Approximation algorithms for Bregman clustering, co-clustering and tensor clustering. Technical Report 177, MPI for Biological Cybernetics (2008)
  10. Ben-David, S.: A framework for statistical clustering with constant time approximation algorithms for K-median and K-means clustering. Mach. Learn. 66(2–3), 243–257 (2007)
  11. Puolamäki, K., Hanhijärvi, S., Garriga, G.C.: An approximation ratio for biclustering. Inf. Process. Lett. 108(2), 45–49 (2008)
  12. Anagnostopoulos, A., Dasgupta, A., Kumar, R.: Approximation algorithms for co-clustering. In: Symp. on Principles of Database Systems (PODS) (2008)
  13. Zha, H., Ding, C., Li, T., Zhu, S.: Workshop on Data Mining using Matrices and Tensors. In: KDD (2008)
  14. Hasan, M., Velazquez-Armendariz, E., Pellacini, F., Bala, K.: Tensor Clustering for Rendering Many-Light Animations. In: Eurographics Symp. on Rendering, vol. 27 (2008)
  15. Kolda, T.G., Bader, B.W.: Tensor Decompositions and Applications. SIAM Review 51(3) (to appear, 2009)
  16. Hartigan, J.A.: Direct clustering of a data matrix. J. Am. Stat. Assoc. 67(337), 123–129 (1972)
  17. Cheng, Y., Church, G.: Biclustering of expression data. In: Proc. ISMB, pp. 93–103. AAAI Press, Menlo Park (2000)
  18. Dhillon, I.S.: Co-clustering documents and words using bipartite spectral graph partitioning. In: KDD, pp. 269–274 (2001)
  19. Bekkerman, R., El-Yaniv, R., McCallum, A.: Multi-way distributional clustering via pairwise interactions. In: ICML (2005)
  20. Agarwal, S., Lim, J., Zelnik-Manor, L., Perona, P., Kriegman, D., Belongie, S.: Beyond pairwise clustering. In: IEEE CVPR (2005)
  21. Govindu, V.M.: A tensor decomposition for geometric grouping and segmentation. In: IEEE CVPR (2005)
  22. Schölkopf, B., Smola, A.: Learning with Kernels. MIT Press, Cambridge (2001)
  23. Hein, M., Bousquet, O.: Hilbertian metrics and positive definite kernels on probability measures. In: AISTATS (2005)
  24. Censor, Y., Zenios, S.A.: Parallel Optimization: Theory, Algorithms, and Applications. Oxford University Press, Oxford (1997)
  25. Banerjee, A., Merugu, S., Dhillon, I.S., Ghosh, J.: Clustering with Bregman Divergences. JMLR 6(6), 1705–1749 (2005)
  26. de Silva, V., Lim, L.H.: Tensor Rank and the Ill-Posedness of the Best Low-Rank Approximation Problem. SIAM J. Matrix Anal. Appl. 30(3), 1084–1127 (2008)
  27. Jegelka, S., Sra, S., Banerjee, A.: Approximation algorithms for Bregman co-clustering and tensor clustering (2009). arXiv:cs.DS/0812.0389v3
  28. Chaudhuri, K., McGregor, A.: Finding metric structure in information theoretic clustering. In: Conf. on Learning Theory (COLT) (July 2008)
  29. Cho, H., Dhillon, I.S., Guan, Y., Sra, S.: Minimum Sum Squared Residue based Co-clustering of Gene Expression data. In: SDM, pp. 114–125 (2004)
  30. Kluger, Y., Basri, R., Chang, J.T.: Spectral biclustering of microarray data: Coclustering genes and conditions. Genome Research 13, 703–716 (2003)
  31. Cho, H., Dhillon, I.: Coclustering of human cancer microarrays using minimum sum-squared residue coclustering. IEEE/ACM Trans. Comput. Biol. Bioinf. 5(3), 385–400 (2008)
  32. Baranzini, S.E., et al.: Transcription-based prediction of response to IFNβ using supervised computational methods. PLoS Biology 3(1) (2004)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Stefanie Jegelka (1)
  • Suvrit Sra (1)
  • Arindam Banerjee (2)
  1. Max Planck Institute for Biological Cybernetics, Tübingen, Germany
  2. University of Minnesota, Twin Cities, Minneapolis, USA
