Spectral Feature Vectors for Graph Clustering

  • Bin Luo
  • Richard C. Wilson
  • Edwin R. Hancock
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2396)


This paper investigates whether vectors of graph-spectral features can be used for graph clustering. We commence from the eigenvalues and eigenvectors of the adjacency matrix. Each of the leading eigenmodes represents a cluster of nodes and is mapped to a component of a feature vector. The spectral features used as vector components are the eigenvalues, the cluster volume, the cluster perimeter, the cluster Cheeger constant, the inter-cluster edge distance, and the shared perimeter length. We investigate the use of both central and pairwise clustering methods. On a database of view-graphs, the vectors of eigenvalues and shared perimeter lengths provide the best clusters.
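The pipeline the abstract describes can be sketched in a few lines: eigendecompose the adjacency matrix, treat each leading eigenmode as a cluster of nodes, and compute per-cluster features such as volume and perimeter. The sketch below is an illustrative reading of that recipe, not the authors' implementation; in particular, assigning each node to the eigenmode with the largest absolute eigenvector component is an assumption.

```python
import numpy as np

def spectral_feature_vector(A, k=3):
    """Illustrative sketch: map the k leading eigenmodes of the
    adjacency matrix A to per-cluster spectral features.
    The node-to-cluster assignment rule here is an assumption,
    not the paper's exact construction."""
    # Eigendecomposition of the (symmetric) adjacency matrix.
    w, V = np.linalg.eigh(A)
    order = np.argsort(-np.abs(w))[:k]   # k leading eigenmodes
    eigvals = w[order]
    # Assign each node to the eigenmode with the largest
    # absolute eigenvector component (assumed rule).
    labels = np.argmax(np.abs(V[:, order]), axis=1)
    deg = A.sum(axis=1)
    volumes, perimeters = [], []
    for c in range(k):
        mask = labels == c
        # Cluster volume: total degree of nodes in the cluster.
        volumes.append(deg[mask].sum())
        # Cluster perimeter: weight of edges crossing the boundary.
        perimeters.append(A[mask][:, ~mask].sum())
    # Concatenate the features into a single vector per graph.
    return np.concatenate([eigvals, volumes, perimeters])
```

Feature vectors produced this way can then be compared with central or pairwise clustering methods, e.g. by embedding their pairwise distances with multidimensional scaling.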





Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Bin Luo (1, 2)
  • Richard C. Wilson (1)
  • Edwin R. Hancock (1)
  1. Department of Computer Science, University of York, UK
  2. Anhui University, P.R. China
