Abstract
Several conventional clustering methods use the squared \(L_2\)-norm computed from object coordinates. To extract meaningful clusters from a massive set of objects, the dissimilarity must be calculated not only from object coordinates but also from other features, such as object distributions. In this paper, JS-divergence-based k-medoids (JSKMdd) is proposed as a novel method for clustering network data. In the proposed method, a dissimilarity based on both object coordinates and object distributions is considered. The effectiveness of the proposed method is verified through numerical experiments on artificial datasets consisting of non-linear clusters. The influence of the parameter in the proposed method is also described.
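The idea named in the title and abstract can be sketched as a three-step pipeline: estimate a per-object distribution via kernel density estimation, measure pairwise dissimilarity with the Jensen-Shannon divergence, and run k-medoids on the resulting matrix. The sketch below is a minimal illustration under assumptions of our own (1-D data, a per-object Gaussian KDE over the `m` nearest neighbours evaluated on a shared grid); the paper's exact JSKMdd formulation may differ.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    # Jensen-Shannon divergence between two discrete distributions
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: float(np.sum(a * np.log((a + eps) / (b + eps))))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def knn_kde(x, data, grid, m=3, h=0.4):
    # Gaussian KDE over the m nearest neighbours of x, evaluated on a
    # shared grid (a hypothetical per-object distribution for illustration)
    nbrs = data[np.argsort(np.abs(data - x))[:m]]
    dens = np.exp(-0.5 * ((grid[:, None] - nbrs[None, :]) / h) ** 2).sum(axis=1)
    return dens / dens.sum()

def k_medoids(D, k, n_iter=50, seed=0):
    # classical k-medoids on a precomputed dissimilarity matrix D
    rng = np.random.default_rng(seed)
    medoids = rng.choice(D.shape[0], size=k, replace=False)
    for _ in range(n_iter):
        labels = np.argmin(D[:, medoids], axis=1)
        new = medoids.copy()
        for c in range(k):
            members = np.where(labels == c)[0]
            if members.size:  # move medoid to the member with minimal total cost
                new[c] = members[np.argmin(D[np.ix_(members, members)].sum(axis=1))]
        if np.array_equal(new, medoids):
            break
        medoids = new
    return np.argmin(D[:, medoids], axis=1)

# toy 1-D data: two well-separated groups
data = np.array([0.0, 0.2, 0.4, 0.1, 0.3, 5.0, 5.2, 5.4, 5.1, 5.3])
grid = np.linspace(-2.0, 8.0, 200)
P = np.array([knn_kde(x, data, grid) for x in data])
D = np.array([[js_divergence(P[i], P[j]) for j in range(len(data))]
              for i in range(len(data))])
labels = k_medoids(D, k=2)
```

Because each object is represented by a distribution rather than a point, the JS divergence captures differences in local density as well as location, which is what lets this family of methods separate non-linear clusters.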
Acknowledgments
This work was partly supported by JSPS KAKENHI Grant Number JP19K12146 and by the Telecommunications Advancement Foundation.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Hamasuna, Y., Kingetsu, Y., Nakano, S. (2019). k-Medoids Clustering Based on Kernel Density Estimation and Jensen-Shannon Divergence. In: Torra, V., Narukawa, Y., Pasi, G., Viviani, M. (eds) Modeling Decisions for Artificial Intelligence. MDAI 2019. Lecture Notes in Computer Science(), vol 11676. Springer, Cham. https://doi.org/10.1007/978-3-030-26773-5_24
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-26772-8
Online ISBN: 978-3-030-26773-5
eBook Packages: Computer Science; Computer Science (R0)