Adaptive Support Vector Clustering for Multi-relational Data Mining

  • Ping Ling
  • Chun-Guang Zhou
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)


A novel self-Adaptive Support Vector Clustering algorithm (ASVC) is proposed in this paper to cluster datasets with diverse dispersions. A kernel function is defined to measure the affinity between multi-relational data. The task of clustering multi-relational data is addressed by integrating the designed kernel into ASVC. Experimental results indicate that the designed kernel captures structural features well and that ASVC delivers good performance.
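The abstract does not give ASVC's details, but it builds on the support vector clustering (SVC) family of Ben-Hur et al. [3]: map points into a kernel-induced feature space, find the smallest enclosing sphere there, and read clusters off the sphere. A minimal, illustrative sketch of that core step (not the authors' ASVC; the projected-gradient solver and all parameter values here are assumptions):

```python
import numpy as np

def rbf_kernel(X, q=1.0):
    # Gaussian kernel K(x, y) = exp(-q * ||x - y||^2), the kernel used in classic SVC.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-q * np.maximum(d2, 0.0))

def svc_dual(K, C=1.0, n_iter=2000, lr=0.01):
    """Approximately solve the SVC dual by projected gradient ascent:
       maximize  sum_i b_i K_ii - sum_ij b_i b_j K_ij
       s.t.      0 <= b_i <= C,  sum_i b_i = 1.
    A crude stand-in for a proper QP solver, for illustration only."""
    n = K.shape[0]
    b = np.full(n, 1.0 / n)                 # start at the simplex centre
    diag = np.diag(K)
    for _ in range(n_iter):
        grad = diag - 2.0 * K @ b           # gradient of the dual objective
        b = np.clip(b + lr * grad, 0.0, C)  # ascend, then enforce the box
        b /= b.sum()                        # crude renormalisation onto the simplex
    return b

def sphere_dist2(K, b, i):
    # Squared feature-space distance of point i from the sphere centre sum_j b_j phi(x_j).
    return K[i, i] - 2.0 * K[i] @ b + b @ K @ b
```

Points whose distance exceeds the sphere radius (the distance of any support vector with 0 < b_i < C) are outliers; cluster labels are then assigned by checking which pairs of points are connected by paths that stay inside the sphere.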


Spectral Cluster · Width Parameter · Inductive Logic Programming · Affinity Matrix · Main Table
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.





Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Ping Ling (1, 2)
  • Chun-Guang Zhou (1)
  1. College of Computer Science, Jilin University, Key Laboratory of Symbol Computation and Knowledge Engineering of the Ministry of Education, Changchun, China
  2. School of Computer Science, Xuzhou Normal University, Xuzhou, China
