An Adaption of Relief for Redundant Feature Elimination

  • Conference paper

Part of the Lecture Notes in Computer Science book series (LNTCS, volume 7368)

Abstract

Feature selection is important for many learning problems, improving both speed and quality. The main approaches are individual evaluation and subset evaluation methods. Individual evaluation methods, such as Relief, are efficient but cannot detect redundant features, which limits their applicability. We propose a new feature selection algorithm, based on the idea of Relief, that removes both irrelevant and redundant features. For each feature, not only its effectiveness but also its informativeness is evaluated, taking into account the performance of the other features. Experiments on benchmark datasets show that the new algorithm removes both irrelevant and redundant features while retaining the efficiency of an individual evaluation method.
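The abstract builds on the classic Relief idea: repeatedly sample an instance, find its nearest neighbor of the same class (the "hit") and of a different class (the "miss"), and reward features that separate the classes. A minimal sketch of that basic Relief weighting step (the starting point the paper extends; this is not the paper's redundancy-aware algorithm, and all names here are illustrative) might look like this, assuming binary classes and features scaled to [0, 1]:

```python
import numpy as np

def relief(X, y, n_samples=None, rng=None):
    """Basic Relief relevance weighting (sketch).

    Returns one weight per feature; higher means the feature better
    separates the classes. Assumes features are scaled to [0, 1] so
    per-feature differences are comparable.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    m = n_samples or n
    w = np.zeros(d)
    for _ in range(m):
        i = rng.integers(n)
        # Manhattan distance from the sampled instance to all others.
        dists = np.abs(X - X[i]).sum(axis=1)
        dists[i] = np.inf                                 # exclude itself
        hit = np.where(y == y[i], dists, np.inf).argmin()   # nearest same-class
        miss = np.where(y != y[i], dists, np.inf).argmin()  # nearest other-class
        # Reward features that differ at the miss, penalize those that
        # differ at the hit; average over the m sampled instances.
        w += (np.abs(X[i] - X[miss]) - np.abs(X[i] - X[hit])) / m
    return w
```

On a toy dataset where only the first feature tracks the class label, the first weight comes out positive and the noise feature's weight negative. The limitation the paper targets is visible in this sketch: two identical (redundant) relevant features would both receive high weights, since each feature is scored independently.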

Keywords

  • Feature selection
  • Relief algorithm
  • Redundant features


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wu, T., Xie, K., Nie, C., Song, G. (2012). An Adaption of Relief for Redundant Feature Elimination. In: Wang, J., Yen, G.G., Polycarpou, M.M. (eds) Advances in Neural Networks – ISNN 2012. ISNN 2012. Lecture Notes in Computer Science, vol 7368. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-31362-2_9

  • DOI: https://doi.org/10.1007/978-3-642-31362-2_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-31361-5

  • Online ISBN: 978-3-642-31362-2

  • eBook Packages: Computer Science (R0)