Semi-supervised Transfer Metric Learning with Relative Constraints

  • Rakesh Kumar Sanodiya (email author)
  • Sriparna Saha
  • Jimson Mathew
  • Prateek Bangwal
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11303)


Distance metric learning underpins the performance of numerous data mining algorithms. In this article, we propose a new method for transfer metric learning in a semi-supervised setting, using relative distance constraints to exploit the information contained in the unlabeled data of the target task. Extracting useful information from unlabeled data requires an appropriate distance function; for this purpose, we employ pairwise relative distance constraints. With the help of a few labeled instances, we express the pairwise similarities as inequality and equality constraints. We then apply Bregman projections to enforce these constraints on an initial distance matrix composed of both labeled and unlabeled data, and construct a k-nearest-neighbor graph from the resulting matrix, which yields good results regardless of the dimensionality of the data.
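The pipeline sketched in the abstract can be illustrated in a few lines of code. This is a minimal sketch and not the authors' exact algorithm: it uses the simple Frobenius-norm Bregman projection onto each pairwise-constraint hyperplane (rather than, e.g., a LogDet divergence), cycles over the constraints, and clips the metric back to the PSD cone. All function names, the similarity/dissimilarity thresholds `u` and `l`, and the constraint encoding are illustrative assumptions.

```python
import numpy as np

def project_onto_constraint(A, v, target):
    # Frobenius-norm Bregman projection of A onto the hyperplane
    # {A : v^T A v = target}; a rank-one update of A.
    p = v @ A @ v
    return A + ((target - p) / (v @ v) ** 2) * np.outer(v, v)

def clip_psd(A):
    # Project a symmetric matrix back onto the PSD cone by
    # zeroing its negative eigenvalues.
    w, V = np.linalg.eigh(A)
    return (V * np.clip(w, 0.0, None)) @ V.T

def learn_metric(X, constraints, u=1.0, l=4.0, n_sweeps=50):
    # Cyclic Bregman projections over pairwise constraints.
    # constraints: list of (i, j, 'sim' | 'dis'); similar pairs are
    # pulled within squared distance u, dissimilar pairs pushed beyond l.
    A = np.eye(X.shape[1])
    for _ in range(n_sweeps):
        for i, j, kind in constraints:
            v = X[i] - X[j]
            p = v @ A @ v
            if kind == 'sim' and p > u:      # constraint violated: pull closer
                A = project_onto_constraint(A, v, u)
            elif kind == 'dis' and p < l:    # constraint violated: push apart
                A = project_onto_constraint(A, v, l)
        A = clip_psd(A)
    return A

def knn_graph(X, A, k=3):
    # k-nearest-neighbor adjacency matrix under the learned
    # Mahalanobis distance d(x_i, x_j) = (x_i - x_j)^T A (x_i - x_j).
    diff = X[:, None, :] - X[None, :, :]
    D = np.einsum('ijk,kl,ijl->ij', diff, A, diff)
    np.fill_diagonal(D, np.inf)              # exclude self-neighbors
    W = np.zeros((len(X), len(X)))
    for row, idx in enumerate(np.argsort(D, axis=1)[:, :k]):
        W[row, idx] = 1.0
    return W
```

Each projection is only applied when its constraint is violated, so the loop implements projections onto half-spaces; with a nonempty feasible set, such cyclic projections converge toward a metric satisfying all constraints.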


Keywords: Metric learning · Transfer learning · Semi-supervised learning · Relative distance comparisons



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Rakesh Kumar Sanodiya (1) — email author
  • Sriparna Saha (1)
  • Jimson Mathew (1)
  • Prateek Bangwal (2)
  1. Indian Institute of Technology Patna, Patna, India
  2. University of Petroleum and Energy Studies, Dehradun, India
