
Supervised Feature Space Reduction for Multi-Label Nearest Neighbors

  • Conference paper
  • First Online:
Advances in Artificial Intelligence: From Theory to Practice (IEA/AIE 2017)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 10350)

Abstract

Owing to its ability to handle many real-world problems, multi-label classification has received considerable attention in recent years, and the instance-based ML-kNN classifier is now regarded as one of the most efficient approaches. However, ML-kNN is sensitive to noisy and redundant features, and its performance degrades as data dimensionality increases. Dimensionality reduction is a natural remedy, but current methods optimize reduction objectives that ignore their impact on the ML-kNN classification. We propose ML-ARP, a novel dimensionality reduction algorithm that uses a variable neighborhood search metaheuristic to learn a linear projection of the feature space which specifically minimizes the ML-kNN classification loss. Numerical comparisons confirm that ML-ARP outperforms both ML-kNN applied to the original data and four standard multi-label dimensionality reduction algorithms.
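The abstract describes ML-ARP only at a high level. The sketch below is not the authors' algorithm; it is a hypothetical illustration of the general idea stated above: a variable neighborhood search over a linear projection matrix, with candidate projections scored by a simplified multi-label kNN loss. The function names, the leave-one-out Hamming-loss surrogate, the Gaussian "shaking" moves, and all parameter values are assumptions made only to make the mechanism concrete.

```python
import numpy as np

def loo_knn_hamming_loss(XP, Y, k=5):
    """Leave-one-out Hamming loss of a majority-vote multi-label kNN on the projected data XP."""
    # pairwise squared Euclidean distances between projected points
    d = ((XP[:, None, :] - XP[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d, np.inf)                 # a point is not its own neighbor
    idx = np.argsort(d, axis=1)[:, :k]          # indices of the k nearest neighbors
    Y_pred = (Y[idx].mean(axis=1) >= 0.5)       # per-label majority vote
    return float(np.mean(Y_pred != Y.astype(bool)))

def ml_arp_sketch(X, Y, dim=10, k=5, k_max=3, max_iter=100, step=0.1, seed=0):
    """Toy variable neighborhood search over a linear projection matrix P (d x dim)."""
    rng = np.random.default_rng(seed)
    P = rng.standard_normal((X.shape[1], dim))  # random initial projection
    best = loo_knn_hamming_loss(X @ P, Y, k)
    for _ in range(max_iter):
        k_nbh = 1
        while k_nbh <= k_max:
            # "shake": perturb P with noise whose amplitude grows with the neighborhood index
            cand = P + step * k_nbh * rng.standard_normal(P.shape)
            loss = loo_knn_hamming_loss(X @ cand, Y, k)
            if loss < best:
                P, best, k_nbh = cand, loss, 1  # accept the move, restart from the first neighborhood
            else:
                k_nbh += 1                      # otherwise explore a wider neighborhood
    return P, best

# Toy usage on synthetic data (purely illustrative).
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 50))              # 100 instances, 50 features
Y = (rng.random((100, 4)) < 0.3).astype(int)    # 4 binary labels
P, loss = ml_arp_sketch(X, Y, dim=5)
print("leave-one-out Hamming loss after projection:", loss)
```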



Author information


Corresponding author

Correspondence to Wissam Siblini.


Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Siblini, W., Alami, R., Meyer, F., Kuntz, P. (2017). Supervised Feature Space Reduction for Multi-Label Nearest Neighbors. In: Benferhat, S., Tabia, K., Ali, M. (eds) Advances in Artificial Intelligence: From Theory to Practice. IEA/AIE 2017. Lecture Notes in Computer Science, vol. 10350. Springer, Cham. https://doi.org/10.1007/978-3-319-60042-0_21

  • DOI: https://doi.org/10.1007/978-3-319-60042-0_21

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-60041-3

  • Online ISBN: 978-3-319-60042-0

  • eBook Packages: Computer Science, Computer Science (R0)
