A Multi-kernel Semi-supervised Metric Learning Using Multi-objective Optimization Approach

  • Rakesh Kumar Sanodiya
  • Sriparna Saha
  • Jimson Mathew
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11302)

Abstract

A kernel-matrix-based distance measure is utilized to compute similarities between data points. The few available labeled data points are used as constraints and projected onto the initial kernel matrix via Bregman projection. Since the projection of constraints onto the matrix is not orthogonal, an appropriate subset of constraints must be identified with respect to objective functions that measure the quality of the resulting partitioning of the data. As the kernel space is large, we divide the original kernel space into multiple kernel sub-spaces so that each kernel can be processed independently and in parallel on a modern GPU, and semi-supervised kernel metric learning with a multi-objective approach is applied to the individual kernels in parallel. The multi-objective framework selects the best subset of constraints to optimize multiple objective functions for grouping the available data. Our approach outperforms state-of-the-art algorithms on various datasets with respect to different validity indices.
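
To make the constraint-projection step concrete, the following is a minimal sketch (not the authors' implementation) of ITML-style LogDet/Bregman projections of must-link and cannot-link constraints onto a kernel matrix. The function name, the bounds `u` and `l`, and the omission of slack variables are illustrative assumptions; each constraint induces a rank-one update of the kernel matrix.

```python
import numpy as np

def bregman_project_kernel(K, constraints, u=1.0, l=4.0, n_sweeps=20):
    """Project an initial kernel matrix onto pairwise constraints using
    rank-one LogDet (Bregman) updates, in the style of kernelized ITML.
    Simplifying assumption: no slack variables; a constraint is enforced
    only when it is violated.

    K           : (n, n) positive-definite initial kernel (Gram) matrix
    constraints : iterable of (i, j, must_link) triples
    u, l        : upper bound for must-link / lower bound for cannot-link
                  squared distances in the learned feature space
    """
    K = K.astype(float).copy()
    n = K.shape[0]
    for _ in range(n_sweeps):                     # cycle over constraints
        for i, j, must_link in constraints:
            e = np.zeros(n)
            e[i], e[j] = 1.0, -1.0
            Ke = K @ e                            # K (e_i - e_j)
            p = float(e @ Ke)                     # current squared distance
            target = u if must_link else l
            violated = p > u if must_link else p < l
            if not violated or p <= 0.0:
                continue
            # Bregman projection onto p_new = target:
            # K <- K + beta * K (e_i - e_j)(e_i - e_j)^T K
            beta = (target - p) / (p * p)
            K += beta * np.outer(Ke, Ke)
    return K

# Usage: shrink distances within must-link pairs, stretch cannot-link pairs.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(6, 3))
    K0 = X @ X.T + 1e-3 * np.eye(6)               # linear kernel as a start
    K = bregman_project_kernel(K0, [(0, 1, True), (0, 5, False)])
```

With slack variables and the full ITML step sizes, the same rank-one update form applies; the sketch keeps only the core projection so the role of the kernel matrix is visible. Because these projections are not orthogonal, the order and choice of constraints matter, which is where the multi-objective selection of constraint subsets comes in.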

Keywords

Semi-supervised · Multi-objective optimization · Classification · Clustering · Graphics processing unit (GPU)

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Rakesh Kumar Sanodiya (1)
  • Sriparna Saha (1)
  • Jimson Mathew (1)
  1. Indian Institute of Technology Patna, Patna, India
