
A multi-task framework for metric learning with common subspace

  • ICONIP 2011
  • Published in: Neural Computing and Applications

Abstract

Metric learning has been widely studied in machine learning owing to its ability to improve the performance of various algorithms. Meanwhile, multi-task learning usually leads to better performance by exploiting information shared across tasks. In this paper, we propose a novel framework that lets metric learning benefit from jointly training all tasks. Based on the assumption that the discriminative information of all tasks is retained in a common subspace, our framework can readily extend many existing metric learning methods. In particular, we apply the framework to the widely used Large Margin Component Analysis (LMCA), yielding a new model called multi-task LMCA. It performs remarkably well compared to many competitive methods. Moreover, it learns a low-rank metric directly, which acts as a form of feature reduction and enables noise compression and low storage. A series of experiments demonstrates the superiority of our method over three comparison algorithms on both synthetic and real data.
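The low-rank property mentioned above follows from LMCA optimizing a rectangular transformation matrix L directly: the induced Mahalanobis metric M = LᵀL has rank at most d, and distances can be computed in the d-dimensional projected space, which is what yields feature reduction and low storage. A minimal NumPy sketch of this relationship (the random L here merely stands in for a learned transformation; it is not the paper's training procedure):

```python
import numpy as np

rng = np.random.default_rng(0)
D, d = 10, 3                       # input dimension, reduced dimension (d < D)
L = rng.standard_normal((d, D))    # stand-in for a learned low-rank transformation

x = rng.standard_normal(D)
y = rng.standard_normal(D)

# Mahalanobis distance under the induced metric M = L^T L ...
M = L.T @ L
dist_metric = np.sqrt((x - y) @ M @ (x - y))

# ... equals the Euclidean distance after projecting with L, so it suffices
# to store the d x D matrix L and work with d-dimensional features.
dist_proj = np.linalg.norm(L @ (x - y))

assert np.isclose(dist_metric, dist_proj)
print(np.linalg.matrix_rank(M))    # rank of M is at most d = 3
```

Storing L instead of the full D × D metric reduces storage from O(D²) to O(dD), and projecting the data once amortizes the distance computations.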



Notes

  1. Note that our framework extends straightforwardly to other metric learning models whose objective function is optimized with respect to a transformation matrix.

  2. http://archive.ics.uci.edu/ml/datasets/Wine+Quality.

  3. http://multitask.cs.berkeley.edu/.

  4. http://www-i6.informatik.rwth-aachen.de/~keysers/usps.html.

  5. http://kdd.ics.uci.edu/databases/tic/tic.html.


Acknowledgments

This work was supported by National Basic Research Program of China (973 Program) grant 2012CB316301, National Natural Science Foundation of China (NSFC) under grants 61075052 and 60825301, and Tsinghua National Laboratory for Information Science and Technology (TNList) Cross-discipline Foundation.

Author information


Correspondence to Kaizhu Huang.

Cite this article

Yang, P., Huang, K. & Liu, CL. A multi-task framework for metric learning with common subspace. Neural Comput & Applic 22, 1337–1347 (2013). https://doi.org/10.1007/s00521-012-0956-8
