Comparison of Various Feature Selection Methods in Application to Prototype Best Rules

  • Marcin Blachnik
Part of the Advances in Intelligent and Soft Computing book series (AINSC, volume 57)


Prototype-based rules are an interesting tool for data analysis. However, most prototype selection methods, such as the CFCM+LVQ algorithm, do not have embedded feature selection and require feature selection as an initial preprocessing step. The question that arises is which feature selection method should be used with the CFCM+LVQ prototype selection method, and what the advantages and disadvantages of particular solutions are. The analysis of these problems is based on empirical data analysis.
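The preprocessing pipeline described above (rank or filter features first, then run prototype selection on the reduced data) can be illustrated with a minimal sketch. The example below uses plain Pearson correlation between each feature and a binary class label as the ranking criterion; this is one of the simplest filter-style methods and is only a stand-in for the ranking and filter algorithms the paper actually compares.

```python
# Illustrative sketch: a filter-style feature ranking used as a preprocessing
# step before prototype selection. Pearson correlation with the label is an
# assumed, simple criterion, not the paper's exact algorithms.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    if vx == 0 or vy == 0:
        return 0.0
    return cov / sqrt(vx * vy)

def rank_features(X, y):
    """Return feature indices sorted by |correlation with the label|, best first."""
    scores = [abs(pearson([row[j] for row in X], y)) for j in range(len(X[0]))]
    return sorted(range(len(scores)), key=lambda j: scores[j], reverse=True)

# Toy data: feature 0 tracks the binary label, feature 1 is noise.
X = [[1.0, 0.3], [0.9, 0.7], [0.1, 0.5], [0.2, 0.6]]
y = [1, 1, 0, 0]
print(rank_features(X, y))  # feature 0 should rank first
```

A ranking like this yields an ordered feature list; the top-k features would then be passed to the prototype selection stage, so the choice of ranking criterion directly shapes the prototypes found.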


Keywords: Feature Selection, Feature Subset, Feature Selection Method, Ranking Method, Feature Selection Algorithm





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Marcin Blachnik, Electrotechnology Department, Silesian University of Technology, Poland
