Semi-supervised Feature Selection Using Sparse Laplacian Support Vector Machine

  • Conference paper
  • First Online:
Neural Computing for Advanced Applications (NCAA 2020)

Abstract

Semi-supervised feature selection is an active topic in machine learning and data mining. The Laplacian support vector machine (LapSVM) has been successfully applied to semi-supervised learning; however, LapSVM cannot be directly applied to feature selection. To remedy this, we propose a sparse Laplacian support vector machine (SLapSVM) and apply it to semi-supervised feature selection. Building on LapSVM, SLapSVM introduces an \(\ell _1\)-norm regularization term, which yields a sparse solution. In addition, training SLapSVM amounts to solving a quadratic programming problem, so its solution is unique and globally optimal. SLapSVM can perform feature selection and classification at the same time. Experimental results on semi-supervised classification problems show the feasibility and effectiveness of the proposed algorithm.
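The abstract combines three ingredients: an SVM-type loss on the labeled points, a graph-Laplacian regularizer that keeps predictions smooth over labeled and unlabeled points alike, and an \(\ell _1\) penalty whose sparse solution drives feature selection. The paper solves a quadratic program; the sketch below instead uses a simple proximal-gradient loop with a squared-hinge loss so it stays self-contained. All names, the ±1/0 label encoding, and the parameter values are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def knn_laplacian(X, k=5):
    # Pairwise squared distances -> symmetric k-NN adjacency -> unnormalized Laplacian L = D - W.
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    W = np.zeros_like(d2)
    for i in range(len(X)):
        nbrs = np.argsort(d2[i])[1:k + 1]      # skip index 0 (the point itself)
        W[i, nbrs] = np.exp(-d2[i, nbrs])
    W = np.maximum(W, W.T)                     # symmetrize
    return np.diag(W.sum(1)) - W

def sparse_lap_svm(X, y, lam_lap=0.1, lam_l1=0.05, lr=1e-2, iters=2000):
    """Illustrative solver. y holds +1/-1 for labeled points and 0 for unlabeled ones."""
    n, d = X.shape
    L = knn_laplacian(X)
    labeled = y != 0
    w = np.zeros(d)
    for _ in range(iters):
        f = X @ w
        # Squared-hinge loss on labeled points (a smooth surrogate for the SVM hinge).
        margin = 1 - y[labeled] * f[labeled]
        g_loss = -2.0 / labeled.sum() * X[labeled].T @ (y[labeled] * np.maximum(margin, 0))
        # Manifold term f^T L f: gradient pushes predictions to vary smoothly on the data graph.
        g_lap = 2 * lam_lap * X.T @ (L @ f) / n
        w -= lr * (g_loss + g_lap)
        # Proximal (soft-threshold) step for the l1 penalty -> exact zeros in w.
        w = np.sign(w) * np.maximum(np.abs(w) - lr * lam_l1, 0)
    return w
```

Features whose weight in `w` is driven to zero by the soft-threshold step are discarded, so feature selection and classification happen in one pass, mirroring the property claimed in the abstract.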

This work was supported in part by the Natural Science Foundation of the Jiangsu Higher Education Institutions of China under Grant No. 19KJA550002, by the Six Talent Peak Project of Jiangsu Province of China under Grant No. XYDXX-054, by the Priority Academic Program Development of Jiangsu Higher Education Institutions, and by the Collaborative Innovation Center of Novel Software Technology and Industrialization.



Author information

Corresponding author

Correspondence to Li Zhang.



Copyright information

© 2020 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Zhang, L., Zheng, X., Xu, Z. (2020). Semi-supervised Feature Selection Using Sparse Laplacian Support Vector Machine. In: Zhang, H., Zhang, Z., Wu, Z., Hao, T. (eds) Neural Computing for Advanced Applications. NCAA 2020. Communications in Computer and Information Science, vol 1265. Springer, Singapore. https://doi.org/10.1007/978-981-15-7670-6_10


  • DOI: https://doi.org/10.1007/978-981-15-7670-6_10

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-15-7669-0

  • Online ISBN: 978-981-15-7670-6

  • eBook Packages: Computer Science, Computer Science (R0)
