
Approximate Spectral Clustering Using Topology Preserving Methods and Local Scaling

  • Mashaan Alshammari
  • Masahiro Takatsuka
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11304)

Abstract

Spectral clustering is a type of unsupervised learning that separates data based on connectivity rather than convexity. However, its computational demands grow cubically with the number of points n, which has triggered a stream of studies aimed at easing these demands. An effective solution is to provide an approximated graph \( G^* = (V^*, E^*) \) for the input data, with a reduced set of vertices and edges. Recent similarity measures used to construct the approximated graph \( G^* = (V^*, E^*) \) have some deficiencies: (1) edge weights depend heavily on cluster density, and (2) they leave a larger memory footprint than conventional similarity measures. In this work, we employed topology preserving methods (e.g., neural gas) to obtain \( G^* = (V^*, E^*) \), owing to their ability to preserve the topology of the input data, and then used a conventional similarity measure to assign weights to the graph edges. The experiments reveal that graphs obtained through topology preserving methods and weighted with a locally scaled similarity measure deliver performance comparable to the recent measures, with a significantly smaller memory footprint.
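
The pipeline described above can be summarized in a short sketch. The code below is a minimal, illustrative implementation, not the authors' code: a basic neural gas with competitive Hebbian learning stands in for the topology preserving step, the edges of the reduced graph \( G^* = (V^*, E^*) \) are weighted with local scaling, \( w_{ij} = \exp(-d_{ij}^2 / \sigma_i \sigma_j) \) where \( \sigma_i \) is the distance from prototype i to its K-th nearest prototype, and the prototypes are clustered with a standard normalized spectral clustering step. All parameter values (number of prototypes, epochs, learning rates, K = 7) are illustrative assumptions.

```python
# Sketch of approximate spectral clustering on a topology-preserving graph.
# Assumptions: plain neural gas + competitive Hebbian edges as the topology
# preserving method; locally scaled affinities on the resulting edges.
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.cluster import KMeans


def neural_gas_graph(X, n_prototypes=50, epochs=20, eps0=0.5, eps_f=0.01,
                     lam0=10.0, lam_f=0.5, rng=None):
    """Fit neural gas prototypes and connect them by competitive Hebbian learning."""
    rng = np.random.default_rng(rng)
    W = X[rng.choice(len(X), n_prototypes, replace=False)].copy()
    edges = set()
    T = epochs * len(X)
    t = 0
    for _ in range(epochs):
        for x in X[rng.permutation(len(X))]:
            d = np.linalg.norm(W - x, axis=1)
            order = np.argsort(d)                      # rank prototypes by distance
            eps = eps0 * (eps_f / eps0) ** (t / T)     # decaying step size
            lam = lam0 * (lam_f / lam0) ** (t / T)     # decaying neighbourhood range
            ranks = np.empty(n_prototypes)
            ranks[order] = np.arange(n_prototypes)
            W += (eps * np.exp(-ranks / lam))[:, None] * (x - W)
            edges.add(tuple(sorted(order[:2])))        # Hebbian edge: two closest units
            t += 1
    return W, edges


def locally_scaled_affinity(W, edges, K=7):
    """Weight each edge by exp(-d_ij^2 / (sigma_i * sigma_j)), sigma_i = dist to K-th neighbour."""
    D = np.linalg.norm(W[:, None, :] - W[None, :, :], axis=2)
    sigma = np.sort(D, axis=1)[:, min(K, len(W) - 1)]
    A = np.zeros_like(D)
    for i, j in edges:
        A[i, j] = A[j, i] = np.exp(-D[i, j] ** 2 / (sigma[i] * sigma[j]))
    return A


def approximate_spectral_clustering(X, n_clusters, n_prototypes=50):
    W, edges = neural_gas_graph(X, n_prototypes)
    A = locally_scaled_affinity(W, edges)
    L = laplacian(A, normed=True)
    # Embed prototypes with the smallest eigenvectors of the normalized Laplacian.
    _, vecs = np.linalg.eigh(L)
    emb = vecs[:, :n_clusters]
    emb /= np.linalg.norm(emb, axis=1, keepdims=True) + 1e-12
    proto_labels = KMeans(n_clusters, n_init=10).fit_predict(emb)
    # Map every original point to the label of its nearest prototype.
    nearest = np.argmin(np.linalg.norm(X[:, None, :] - W[None, :, :], axis=2), axis=1)
    return proto_labels[nearest]
```

Because only the prototypes enter the eigendecomposition, the expensive step scales with the size of \( V^* \) rather than with n; the original points simply inherit the label of their nearest prototype.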

Keywords

Spectral clustering · Topology preserving methods · Neural gas


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. School of Information Technologies, The University of Sydney, Sydney, Australia
