C3E: A Framework for Combining Ensembles of Classifiers and Clusterers

  • A. Acharya
  • E. R. Hruschka
  • J. Ghosh
  • S. Acharyya
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6713)


The combination of multiple classifiers to generate a single classifier has been shown to be very useful in practice. Similarly, several efforts have shown that cluster ensembles can improve the quality of results as compared to a single clustering solution. These observations suggest that ensembles containing both classifiers and clusterers are potentially useful as well. Specifically, clusterers provide supplementary constraints that can improve the generalization capability of the resulting classifier. This paper introduces a new algorithm named C3E that combines ensembles of classifiers and clusterers. Our experimental evaluation of C3E shows that it provides good classification accuracies in eleven tasks derived from three real-world applications. In addition, C3E produces better results than the recently introduced Bipartite Graph-based Consensus Maximization (BGCM) Algorithm, which combines multiple supervised and unsupervised models and is the algorithm most closely related to C3E.
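To make the core idea concrete, here is a minimal illustrative sketch in Python. It is not the C3E algorithm itself (which the paper formulates as an optimization, related to Bregman divergences); it only shows the general principle the abstract describes: average the classifier ensemble's class-probability estimates, then use a cluster-ensemble co-association matrix as a similarity constraint that pulls each point's class distribution toward those of its frequent cluster-mates. All function and parameter names here are illustrative assumptions, not from the paper.

```python
import numpy as np

def combine_ensembles(class_probs, cluster_labels, alpha=0.5, n_iter=10):
    """Illustrative sketch only, NOT the exact C3E objective.

    class_probs:    list of (n, k) arrays, one per classifier
                    (rows are class-probability estimates).
    cluster_labels: list of length-n label arrays, one per clusterer.
    alpha:          weight of the cluster-based smoothing term.
    """
    n, k = class_probs[0].shape

    # Step 1: average the classifier ensemble's posteriors.
    pi = np.mean(class_probs, axis=0)

    # Step 2: co-association similarity from the cluster ensemble:
    # the fraction of clusterers that put points i and j together.
    S = np.zeros((n, n))
    for labels in cluster_labels:
        S += (labels[:, None] == labels[None, :])
    S /= len(cluster_labels)
    np.fill_diagonal(S, 0.0)
    row = S.sum(axis=1, keepdims=True)
    row[row == 0] = 1.0          # isolated points keep their own posterior
    W = S / row                  # row-normalised similarity weights

    # Step 3: iteratively smooth each point's class distribution toward
    # the similarity-weighted distributions of its cluster-mates.
    y = pi.copy()
    for _ in range(n_iter):
        y = (1 - alpha) * pi + alpha * (W @ y)
    return y / y.sum(axis=1, keepdims=True)
```

For example, a point whose classifiers are undecided (0.5/0.5) but whose cluster-mates are confidently labeled class 0 will be nudged toward class 0, which is the sense in which clusterers act as "supplementary constraints" on the classifier ensemble.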


Keywords: Ensembles, Classification, Clustering




References

  1. Banerjee, A., Merugu, S., Dhillon, I., Ghosh, J.: Clustering with Bregman divergences. J. Mach. Learn. Res. 6, 1705–1749 (2005)
  2. Chapelle, O., Schölkopf, B., Zien, A. (eds.): Semi-Supervised Learning. MIT Press, Cambridge (2006)
  3. Demšar, J.: Statistical comparisons of classifiers over multiple data sets. J. Mach. Learn. Res. 7, 1–30 (2006)
  4. Fern, X.Z., Brodley, C.E.: Solving cluster ensemble problems by bipartite graph partitioning. In: Proc. of the ICML, pp. 36–43 (2004)
  5. Gao, J., Liang, F., Fan, W., Sun, Y., Han, J.: Graph-based consensus maximization among multiple supervised and unsupervised models. In: Proc. of NIPS, pp. 1–9 (2009)
  6. Ghosh, J., Acharya, A.: Cluster ensembles. WIREs Data Mining and Knowledge Discovery 1, 1–12 (2011)
  7. Kittler, J., Roli, F. (eds.): Multiple Classifier Systems. LNCS. Springer, Heidelberg (2003)
  8. Kuncheva, L.I.: Combining Pattern Classifiers: Methods and Algorithms. Wiley, Chichester (2004)
  9. Oza, N., Tumer, K.: Classifier ensembles: Select real-world applications. Information Fusion 9(1), 4–20 (2008)
  10. Punera, K., Ghosh, J.: Consensus based ensembles of soft clusterings. Applied Artificial Intelligence 22, 109–117 (2008)
  11. Basu, S., Davidson, I., Wagstaff, K. (eds.): Constrained Clustering: Advances in Algorithms, Theory, and Applications. CRC Press, Boca Raton (2008)
  12. Strehl, A., Ghosh, J.: Cluster ensembles – a knowledge reuse framework for combining multiple partitions. J. Mach. Learn. Res. 3, 583–617 (2002)
  13. Tumer, K., Ghosh, J.: Analysis of decision boundaries in linearly combined neural classifiers. Pattern Recognition 29, 341–348 (1996)
  14. Zhu, X., Goldberg, A.: Introduction to Semi-Supervised Learning. Morgan and Claypool Publishers, San Rafael (2009)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • A. Acharya (1)
  • E. R. Hruschka (1, 2)
  • J. Ghosh (1)
  • S. Acharyya (1)

  1. University of Texas (UT), Austin, USA
  2. University of Sao Paulo (USP), Sao Carlos, Brazil
