
Incremental Input Variable Selection by Block Addition and Block Deletion

  • Conference paper
Artificial Neural Networks and Machine Learning – ICANN 2014 (ICANN 2014)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 8681)


Abstract

In input variable selection by block addition and block deletion (BABD), multiple input variables are added and then deleted while keeping the cross-validation error below that obtained using all the input variables. The major problem of this method is that selection time grows rapidly as the number of input variables increases. To alleviate this problem, in this paper we propose incremental block addition and block deletion of input variables. In this method, we first apply BABD to an initial subset of the input variables. Then, in each incremental step, we add some input variables that have not yet been considered to the currently selected set and rerun BABD. To guarantee that the cross-validation error decreases monotonically, we undo an incremental step if its cross-validation error rate is worse than that of the previous incremental step. We evaluate incremental BABD on several benchmark data sets and show that it speeds up input variable selection while keeping the approximation error comparable to that of batch BABD.
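The incremental procedure described in the abstract might be sketched as follows. This is a minimal illustration, not the authors' implementation: the function names, block size, initial-subset size, and the toy cross-validation error are all assumptions, and the inner BABD pass is simplified to single-variable backward deletion rather than the full block-wise addition and deletion search of the paper.

```python
def babd(vars_, cv_error):
    """Simplified stand-in for one BABD pass on a variable subset:
    greedily drop variables whose removal does not raise the
    cross-validation error (the real method adds and deletes blocks)."""
    selected = list(vars_)
    baseline = cv_error(selected)
    for v in list(selected):
        trial = [u for u in selected if u != v]
        if trial and cv_error(trial) <= baseline:
            selected = trial
            baseline = cv_error(selected)
    return selected

def incremental_babd(all_vars, cv_error, init_size=4, block_size=4):
    """Incremental BABD sketch: run BABD on an initial subset, then
    repeatedly add a block of not-yet-considered variables and rerun
    BABD, undoing any increment whose CV error is worse than before."""
    selected = babd(all_vars[:init_size], cv_error)
    best_err = cv_error(selected)
    remaining = list(all_vars[init_size:])
    while remaining:
        block, remaining = remaining[:block_size], remaining[block_size:]
        candidate = babd(selected + block, cv_error)
        err = cv_error(candidate)
        if err <= best_err:  # keep only if CV error does not worsen
            selected, best_err = candidate, err
        # otherwise undo: discard this incremental step entirely
    return selected, best_err

# Toy demonstration: variables 0 and 2 are "relevant"; the stand-in
# CV error penalizes missing relevant variables heavily and extra
# irrelevant ones slightly.
RELEVANT = {0, 2}

def toy_cv_error(feats):
    return len(RELEVANT - set(feats)) + 0.01 * len(set(feats) - RELEVANT)

selected, err = incremental_babd(list(range(8)), toy_cv_error)
```

On this toy problem the sketch recovers exactly the relevant variables {0, 2}; the undo step is what gives the monotone decrease of the cross-validation error claimed in the abstract.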




Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Abe, S. (2014). Incremental Input Variable Selection by Block Addition and Block Deletion. In: Wermter, S., et al. Artificial Neural Networks and Machine Learning – ICANN 2014. ICANN 2014. Lecture Notes in Computer Science, vol 8681. Springer, Cham. https://doi.org/10.1007/978-3-319-11179-7_69


  • DOI: https://doi.org/10.1007/978-3-319-11179-7_69

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-11178-0

  • Online ISBN: 978-3-319-11179-7

  • eBook Packages: Computer Science, Computer Science (R0)
