
Adaptive Graph Learning for Supervised Low-Rank Spectral Feature Selection

  • Conference paper
  • First Online:
Applications and Techniques in Information Security (ATIS 2018)

Part of the book series: Communications in Computer and Information Science (CCIS, volume 950)

Abstract

Spectral feature selection (SFS) has received increasing attention in recent years. However, conventional SFS has weaknesses that may degrade feature-selection performance: (1) it generally preserves either the global structure or the local structure of the data, which does not provide comprehensive information for the model; and (2) graph learning and feature selection are carried out as two separate processes, which makes global optimization hard to achieve. To address these issues, a novel SFS method is proposed that introduces a low-rank constraint to capture the inherent structure of the data and employs adaptive graph learning to couple graph learning and feature learning in an iterative framework, yielding a robust and accurate learning model. An optimization algorithm with fast convergence is proposed to solve the resulting problem. Compared with several classical and state-of-the-art feature selection methods, the proposed method exhibits competitive performance.
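To make the high-level idea concrete, the following is a minimal Python/NumPy sketch of one plausible reading of the abstract: alternate between (a) learning a similarity graph from the currently projected data, (b) solving a graph-regularized, row-sparse regression, and (c) projecting the coefficient matrix onto a low-rank set. This is not the paper's algorithm; the objective, the update rules, and all names (adaptive_lowrank_sfs, alpha, beta, rank_r, sigma) are assumptions introduced purely for illustration.

    # A minimal, illustrative sketch (not the authors' exact algorithm) of coupling
    # adaptive graph learning with low-rank, row-sparse regression for supervised
    # spectral feature selection. All parameter names and update rules here are
    # assumptions made for illustration.
    import numpy as np

    def adaptive_lowrank_sfs(X, Y, alpha=1.0, beta=1.0, rank_r=2, n_iter=20, sigma=1.0):
        """X: (n, d) data matrix; Y: (n, c) one-hot labels. Returns one score per feature."""
        d = X.shape[1]
        W = np.zeros((d, Y.shape[1]))
        D = np.eye(d)                                   # IRLS weights for the l2,1-norm term
        for _ in range(n_iter):
            # Adaptive graph step: rebuild the similarity graph from the projected data X @ W.
            Z = X @ W if np.any(W) else X
            dist2 = np.sum((Z[:, None, :] - Z[None, :, :]) ** 2, axis=2)
            S = np.exp(-dist2 / (2.0 * sigma ** 2))
            np.fill_diagonal(S, 0.0)
            S = (S + S.T) / 2.0
            L = np.diag(S.sum(axis=1)) - S              # graph Laplacian of the learned graph
            # Regression step: closed-form solution of the graph-regularized, reweighted problem.
            A = X.T @ X + alpha * X.T @ L @ X + beta * D
            W = np.linalg.solve(A, X.T @ Y)
            # Low-rank step: project W onto rank-r matrices via truncated SVD.
            U, s, Vt = np.linalg.svd(W, full_matrices=False)
            W = U[:, :rank_r] @ np.diag(s[:rank_r]) @ Vt[:rank_r, :]
            # Refresh the IRLS weights so that small rows of W are driven toward zero.
            row_norms = np.linalg.norm(W, axis=1)
            D = np.diag(1.0 / (2.0 * row_norms + 1e-8))
        return np.linalg.norm(W, axis=1)                # rank features by the row norms of W

    # Usage: score 50 features on synthetic data and keep the 10 highest-scoring ones.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((100, 50))
    Y = np.eye(3)[rng.integers(0, 3, size=100)]
    selected = np.argsort(adaptive_lowrank_sfs(X, Y))[::-1][:10]
    print(selected)

The alternating structure mirrors the coupling described in the abstract: the graph is re-estimated from the learned projection rather than fixed in advance, and the low-rank projection plus row-wise weighting stand in (as stated assumptions) for the paper's low-rank constraint and feature-selection regularizer.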





Author information


Corresponding author

Correspondence to Zhi Zhong.



Copyright information

© 2018 Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Zhong, Z. (2018). Adaptive Graph Learning for Supervised Low-Rank Spectral Feature Selection. In: Chen, Q., Wu, J., Zhang, S., Yuan, C., Batten, L., Li, G. (eds) Applications and Techniques in Information Security. ATIS 2018. Communications in Computer and Information Science, vol 950. Springer, Singapore. https://doi.org/10.1007/978-981-13-2907-4_14


  • DOI: https://doi.org/10.1007/978-981-13-2907-4_14

  • Published:

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-13-2906-7

  • Online ISBN: 978-981-13-2907-4

  • eBook Packages: Computer Science, Computer Science (R0)
