
CLOOSTING: CLustering Data with bOOSTING

  • F. Smeraldi
  • M. Bicego
  • M. Cristani
  • V. Murino
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6713)

Abstract

We present a novel clustering approach that exploits boosting as the primary means of modelling clusters. Boosting is typically applied in a supervised classification context; here we move to the less explored unsupervised scenario. Starting from an initial partition, clusters are iteratively re-estimated using the responses of one-vs-all boosted classifiers. Within-cluster homogeneity and separation between clusters are obtained through a combination of three mechanisms: regularised AdaBoost to reject outliers, weak learners inspired by subtractive clustering, and smoothing of the decision functions with a Gaussian kernel. Experiments on public datasets validate our proposal, in some cases improving on the state of the art.
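The abstract outlines an iterative loop: fit a one-vs-all boosted classifier for each cluster, then reassign every point to the cluster whose classifier responds most strongly, and repeat until the partition stabilises. The following is a minimal sketch of that loop, not the authors' implementation: it assumes scikit-learn's stock AdaBoostClassifier (decision stumps) as a stand-in for the regularised AdaBoost with subtractive-clustering weak learners described in the paper, uses k-means for the initial partition, and omits the Gaussian-kernel smoothing of the decision functions.

```python
# Hypothetical sketch of the cluster re-estimation loop from the abstract.
# Stand-ins (not from the paper): plain AdaBoost instead of regularised
# AdaBoost with subtractive-clustering weak learners; k-means initialisation;
# no Gaussian smoothing of the decision functions.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import AdaBoostClassifier

def cloosting_sketch(X, k, n_iter=10, seed=0):
    # Initial partition (the paper starts from an initial partition;
    # k-means is one convenient choice).
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(X)
    for _ in range(n_iter):
        scores = np.full((X.shape[0], k), -np.inf)
        for c in range(k):
            y = (labels == c).astype(int)  # one-vs-all target for cluster c
            if 0 < y.sum() < len(y):       # skip empty/degenerate clusters
                clf = AdaBoostClassifier(n_estimators=50, random_state=seed)
                clf.fit(X, y)
                # Signed confidence that each point belongs to cluster c.
                scores[:, c] = clf.decision_function(X)
        new_labels = scores.argmax(axis=1)
        if np.array_equal(new_labels, labels):  # fixed point: stop early
            break
        labels = new_labels
    return labels

# Example on two well-separated Gaussian blobs:
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
print(cloosting_sketch(X, k=2))
```

In the paper, the outlier rejection of regularised AdaBoost and the subtractive-clustering weak learners would replace the stock stumps used above; the alternating fit-and-reassign structure is the part the abstract pins down.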

Keywords

Support Vector Machine · Feature Space · Gaussian Mixture Model · Outer Sphere · Weak Learner



Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • F. Smeraldi (1)
  • M. Bicego (2, 3)
  • M. Cristani (2, 3)
  • V. Murino (2, 3)
  1. School of Electronic Engineering and Computer Science, Queen Mary University of London, UK
  2. Computer Science Department, University of Verona, Italy
  3. Istituto Italiano di Tecnologia (IIT), Italy
