
A Hypergraph-Based Approach to Feature Selection

  • Zhihong Zhang
  • Edwin R. Hancock
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6854)

Abstract

In many data analysis tasks, one is often confronted with the problem of selecting features from very high dimensional data. The feature selection problem is essentially a combinatorial optimization problem, which is computationally expensive. To render it tractable, it is frequently assumed that features either influence the class variable independently or interact only in a pairwise manner. To move beyond these assumptions, we draw on recent work on hypergraph clustering to extract maximally coherent feature groups from a set of objects using high-order (rather than pairwise) similarities. We propose a three-step algorithm that i) constructs a graph in which each node corresponds to a feature and each edge has a weight equal to the interaction information among the features it connects, ii) performs hypergraph clustering to select a highly coherent set of features, and iii) further selects features based on a new measure called the multidimensional interaction information (MII). The advantage of MII is that it incorporates third- or higher-order feature interactions. This is made tractable by the hypergraph clustering step, which separates features into clusters prior to selection and thereby limits the search space for higher-order interactions. Experimental results demonstrate the effectiveness of our feature selection method on a number of standard data sets.
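The three steps can be summarised in a short sketch. The following minimal, hypothetical Python illustration is not the authors' implementation: it estimates interaction information from discretised features, extracts one coherent feature group with a simple replicator-dynamics relaxation standing in for hypergraph clustering, and ranks features within that group with an MII-style greedy criterion. All function names and the clustering/selection heuristics are assumptions for illustration only.

```python
# Hypothetical sketch of the three-step pipeline, assuming discrete (binned) features.
import numpy as np
from itertools import combinations
from collections import Counter

def entropy(*columns):
    """Joint Shannon entropy (in nats) of one or more discrete columns."""
    joint = list(zip(*columns))
    counts = np.array(list(Counter(joint).values()), dtype=float)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def interaction_information(x_i, x_j, y):
    """Third-order interaction information among two features and the class."""
    return (entropy(x_i) + entropy(x_j) + entropy(y)
            - entropy(x_i, x_j) - entropy(x_i, y) - entropy(x_j, y)
            + entropy(x_i, x_j, y))

def build_weighted_graph(X, y):
    """Step i): one node per feature; edge weight = |interaction information| with the class."""
    d = X.shape[1]
    W = np.zeros((d, d))
    for i, j in combinations(range(d), 2):
        W[i, j] = W[j, i] = abs(interaction_information(X[:, i], X[:, j], y))
    return W

def coherent_cluster(W, n_iters=200):
    """Step ii): replicator-dynamics relaxation extracting one maximally coherent
    feature group (a simple stand-in for hypergraph clustering)."""
    d = W.shape[0]
    x = np.full(d, 1.0 / d)
    for _ in range(n_iters):
        x = x * (W @ x)
        s = x.sum()
        if s == 0:
            break
        x /= s
    return np.where(x > 1.0 / (2 * d))[0]   # features with non-trivial support

def greedy_mii_selection(X, y, candidates, k):
    """Step iii): greedily add the candidate whose total correlation with the
    already-selected features and the class is largest (an MII-style criterion)."""
    selected, remaining = [], list(candidates)
    while remaining and len(selected) < k:
        def score(f):
            cols = [X[:, g] for g in selected + [f]]
            return sum(entropy(c) for c in cols) + entropy(y) - entropy(*cols, y)
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.integers(0, 3, size=(200, 8))   # 8 discrete features
    y = (X[:, 0] + X[:, 1]) % 2             # class depends on features 0 and 1
    cluster = coherent_cluster(build_weighted_graph(X, y))
    print("selected:", greedy_mii_selection(X, y, cluster, k=2))
```

The replicator relaxation and the greedy total-correlation criterion are deliberately simple surrogates; the paper's contribution is the combination of hypergraph clustering with the MII measure, not these particular heuristics.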

Keywords

Hypergraph clustering · Multidimensional interaction information (MII)

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Zhihong Zhang (1)
  • Edwin R. Hancock (1)

  1. Department of Computer Science, University of York, UK