
Transfer subspace learning joint low-rank representation and feature selection

Published in: Multimedia Tools and Applications

Abstract

Transfer learning addresses a general problem that traditional machine learning methods face in practical applications: the training and test data follow different distributions. This paper proposes a novel transfer subspace learning method that combines low-rank representation (LRR) and feature selection for unsupervised domain adaptation. The core of the proposed method is to map both the source and target data into a latent subspace through a projection such that the discrepancy between the domains is reduced. Specifically, LRR imposes a low-rank constraint on the reconstruction coefficient matrix, so the global structure of the data is preserved. Moreover, a regularization term based on a structured sparsity-inducing norm is introduced into the domain adaptation model, which imposes a row-sparsity constraint on the projection matrix. This constraint forces the rows of the projection matrix that correspond to inessential feature attributes to be all zeros, and thus selects the features that are relevant across the two domains. As a result, the proposed method is interpretable and performs feature selection adaptively. Furthermore, since projected samples of the same class should lie close to each other in the shared subspace regardless of the domain they come from, we introduce graph embedding to characterize the local manifold structure of the data and preserve the relationships between examples in the subspace. Finally, we formulate the proposed method mathematically and derive an iterative algorithm to solve the resulting problem. Extensive experimental evaluations on public datasets confirm the effectiveness of the proposed method in comparison with several state-of-the-art methods.
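To make the three ingredients of the objective more concrete, the following is a minimal numerical sketch (not the authors' code) of the regularizers the abstract describes, written in Python/NumPy under common assumptions: the nuclear norm as the convex surrogate of the rank of the reconstruction coefficient matrix, the L2,1 norm as the structured sparsity-inducing norm on the projection matrix, and a k-NN heat-kernel graph Laplacian for the graph-embedding term. The variable names, the toy data, and the graph construction are illustrative choices, not the paper's exact model or optimization algorithm.

# Illustrative sketch only: the three regularization terms named in the abstract,
# written with common formulations. Symbols (P, Z, L) and all hyper-parameters
# are assumptions, not taken from the paper.
import numpy as np

def nuclear_norm(Z):
    """Sum of singular values: the convex surrogate of rank(Z) used in LRR."""
    return np.linalg.svd(Z, compute_uv=False).sum()

def l21_norm(P):
    """Sum of row-wise L2 norms: drives whole rows of P to zero, i.e. discards
    inessential feature attributes (row-sparsity / feature selection)."""
    return np.sqrt((P ** 2).sum(axis=1)).sum()

def knn_graph_laplacian(X, k=5):
    """Unnormalized Laplacian of a k-NN heat-kernel graph over the columns of X
    (each column is a sample); a standard graph-embedding choice."""
    n = X.shape[1]
    d2 = ((X[:, :, None] - X[:, None, :]) ** 2).sum(axis=0)  # pairwise squared distances
    W = np.zeros((n, n))
    for i in range(n):
        idx = np.argsort(d2[i])[1:k + 1]                      # k nearest neighbours (skip self)
        W[i, idx] = np.exp(-d2[i, idx] / d2[i, idx].mean())
    W = np.maximum(W, W.T)                                    # symmetrize
    return np.diag(W.sum(axis=1)) - W

# Toy data: d features, ns source and nt target samples, r-dimensional subspace.
rng = np.random.default_rng(0)
d, ns, nt, r = 20, 30, 25, 5
Xs, Xt = rng.standard_normal((d, ns)), rng.standard_normal((d, nt))
X = np.hstack([Xs, Xt])
P = rng.standard_normal((d, r))                   # projection matrix
Z = rng.standard_normal((ns, nt))                 # reconstruction coefficient matrix

low_rank_term = nuclear_norm(Z)                   # preserves global structure
row_sparsity_term = l21_norm(P)                   # selects features shared across domains
L = knn_graph_laplacian(X)
manifold_term = np.trace(P.T @ X @ L @ X.T @ P)   # keeps neighbours close after projection
print(low_rank_term, row_sparsity_term, manifold_term)

In the full method these terms would be weighted and minimized jointly, with the low-rank coefficient matrix coupling the projected source and target data; the iterative algorithm derived in the paper handles that coupled problem.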



Acknowledgments

The authors would like to thank the editors and reviewers for their constructive comments and suggestions, which helped improve the quality of the paper. This work was supported by the opening foundation of the Engineering Research Center of Intelligent Computing for Complex Energy Systems, Ministry of Education. It was also supported by the Natural Science Foundation of Guangdong Province (Grant No. 2018A0303130026) and the Teacher Research Capacity Promotion Program of Beijing Normal University, Zhuhai.

Author information

Corresponding author

Correspondence to Qinghua Zhou.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article

Cite this article

Yang, L., Zhou, Q. Transfer subspace learning joint low-rank representation and feature selection. Multimed Tools Appl 81, 38353–38373 (2022). https://doi.org/10.1007/s11042-022-12504-z
