Clustering and Metaclustering with Nonnegative Matrix Decompositions

  • Liviu Badea
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3720)


Abstract

Although widely used in unsupervised data mining, most clustering methods suffer from instability of the resulting clusters with respect to the initialization of the algorithm (as, e.g., in k-means). Here we show that this problem can be tackled elegantly and efficiently by meta-clustering the clusters produced in several different runs of the algorithm, especially if "soft" clustering algorithms (such as Nonnegative Matrix Factorization) are used at both the object level and the meta level. The essential difference with respect to other meta-clustering approaches is that our algorithm detects frequently occurring sub-clusters (rather than complete clusters) across the various runs, which allows it to outperform existing algorithms. Additionally, we show how to perform two-way meta-clustering, i.e. to take the object and sample dimensions of clusters into account simultaneously, a feature that is essential, e.g., for biclustering gene expression data, but has not been considered before.
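The abstract's scheme can be illustrated with a minimal numpy sketch: run NMF (Lee–Seung multiplicative updates) from several random initializations, pool the resulting cluster prototypes, and factorize the pooled prototype matrix a second time so that sub-clusters recurring across runs emerge as meta-cluster prototypes. This is an assumption-laden simplification, not the paper's implementation: the function names, the row normalization, and the use of plain NMF (rather than a positive tensor factorization) at the meta level are illustrative choices.

```python
import numpy as np

def nmf(V, k, n_iter=200, seed=0):
    """Lee-Seung multiplicative updates minimizing ||V - W H||_F^2.
    V: (m, n) nonnegative matrix; returns W (m, k), H (k, n)."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, k)) + 1e-3   # strictly positive init
    H = rng.random((k, n)) + 1e-3
    eps = 1e-9                      # guards against division by zero
    for _ in range(n_iter):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

def metacluster(V, k, n_runs=10, k_meta=None):
    """Illustrative meta-clustering: run NMF from several random
    initializations, then factorize the pooled cluster prototypes
    (rows of H) to find sub-clusters that recur across runs."""
    if k_meta is None:
        k_meta = k
    prototypes = []
    for seed in range(n_runs):
        _, H = nmf(V, k, seed=seed)
        # normalize each prototype so runs are comparable in scale
        prototypes.append(H / (np.linalg.norm(H, axis=1, keepdims=True) + 1e-9))
    P = np.vstack(prototypes)       # (n_runs*k, n) pooled prototype matrix
    M, S = nmf(P, k_meta, seed=0)   # second-level ("meta") factorization
    return P, M, S                  # rows of S are meta-cluster prototypes
```

Because both levels use soft (nonnegative, additive) factorizations, a prototype can contribute fractionally to several meta-clusters, which is what lets shared sub-clusters be detected rather than forcing whole-cluster matches.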


Keywords: Nonnegative Matrix Factorization · Nonnegativity Constraint · Average Match · Cluster Prototype



Copyright information

© Springer-Verlag Berlin Heidelberg 2005

Authors and Affiliations

  • Liviu Badea
  1. AI Lab, National Institute for Research and Development in Informatics, Bucharest, Romania
