Unsupervised Learning of Image Recognition with Neural Society for Clustering

  • Marcin Wojnarski
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4259)


A new algorithm for partitional data clustering, Neural Society for Clustering (NSC), is presented. It was inspired by hierarchical image understanding, which requires unsupervised training to build a hierarchy of visual features. Existing clustering algorithms are not well-suited for this task, since they usually either split natural groups of patterns into several parts (like k-means) or give only crisp clusterings.
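The k-means failure mode mentioned above is easy to reproduce in a toy sketch (hypothetical illustration, not the paper's code): when k exceeds the number of natural groups, one group is inevitably split.

```python
# Plain 1-D Lloyd's k-means with fixed initial centers (deterministic).
def kmeans_1d(points, centers, iters=20):
    labels = [0] * len(points)
    for _ in range(iters):
        # Assign each point to its nearest center.
        labels = [min(range(len(centers)), key=lambda j: abs(p - centers[j]))
                  for p in points]
        # Move each center to the mean of its assigned points.
        centers = [sum(p for p, l in zip(points, labels) if l == j)
                   / max(1, sum(1 for l in labels if l == j))
                   for j in range(len(centers))]
    return labels

# Two well-separated natural groups, but k = 3: the left group gets split.
data = [0.0, 0.1, 0.2, 10.0, 10.1, 10.2]
labels = kmeans_1d(data, centers=[0.0, 0.1, 10.0])
```

Here the three left-hand points end up under two different labels, while the right-hand group stays intact.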

The neurons comprising NSC may be viewed as a society of autonomous individuals, each following the same simple algorithm based on four principles: locality, greediness, balance, and competition. The same principles govern large groups of entities in economics, sociology, biology, and physics. The advantages of NSC are demonstrated in an experiment with visual data. The paper also presents a new method for objective, quantitative comparison of clustering algorithms, based on the notions of entropy and mutual information.
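The abstract does not spell out the comparison formula, but one standard instantiation of an entropy- and mutual-information-based score is normalized mutual information (NMI) between two cluster assignments, sketched below (an assumption, not necessarily the paper's exact measure):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(X) of a clustering, in bits."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def mutual_information(labels_a, labels_b):
    """Mutual information I(A;B) between two clusterings, in bits."""
    n = len(labels_a)
    pa, pb = Counter(labels_a), Counter(labels_b)
    pab = Counter(zip(labels_a, labels_b))
    # I(A;B) = sum over joint cells of p(a,b) * log2(p(a,b) / (p(a)p(b)))
    return sum(c / n * log2(c * n / (pa[a] * pb[b]))
               for (a, b), c in pab.items())

def nmi(labels_a, labels_b):
    """Normalized mutual information in [0, 1]; 1 means identical partitions."""
    h_a, h_b = entropy(labels_a), entropy(labels_b)
    if h_a == 0 or h_b == 0:
        return 0.0
    return 2 * mutual_information(labels_a, labels_b) / (h_a + h_b)
```

For example, `nmi([0, 0, 1, 1], [1, 1, 0, 0])` is 1.0 (the partitions are identical up to relabeling), while `nmi([0, 0, 1, 1], [0, 1, 0, 1])` is 0.0 (the partitions are independent), which makes such a score an objective yardstick for comparing a produced clustering against reference classes.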






Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Marcin Wojnarski
    1. Faculty of Mathematics, Informatics and Mechanics, Warsaw University, Warszawa, Poland
