Rough Feature Selection for Intelligent Classifiers

  • Qiang Shen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4400)


The last two decades have seen many powerful classification systems built for large-scale real-world applications. For all their accuracy, however, one persistent obstacle facing these systems is data dimensionality. To be effective, such systems usually require a redundancy-removing step to pre-process the given data. Rough set theory offers a useful, and formal, methodology for reducing the dimensionality of datasets. It helps select the most information-rich features in a dataset without transforming the data, while attempting to minimise information loss during the selection process. Based on this observation, this paper discusses an approach for semantics-preserving dimensionality reduction, or feature selection, that simplifies domains to aid in developing fuzzy or neural classifiers. Computationally, the approach is highly efficient, relying on simple set operations only. The success of this work is illustrated by applying it to two real-world problems: industrial plant monitoring and medical image analysis.
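The selection process described above can be illustrated with a minimal, greedy rough-set reduct search in the style of the well-known QuickReduct algorithm. This is a hedged sketch, not the paper's exact method: it computes the rough dependency degree (the fraction of objects in the positive region) from indiscernibility classes using plain set operations, and greedily adds the feature that most increases it until the full-feature dependency is matched. The function names (`partition`, `dependency`, `quickreduct`) are illustrative choices.

```python
# Illustrative QuickReduct-style sketch of rough feature selection.
# Assumption: discrete-valued features; rows are tuples, labels are decision values.

def partition(rows, feats):
    """Group object indices into indiscernibility classes w.r.t. the given features."""
    classes = {}
    for i, row in enumerate(rows):
        key = tuple(row[f] for f in feats)
        classes.setdefault(key, []).append(i)
    return classes.values()

def dependency(rows, labels, feats):
    """Rough dependency degree: |positive region| / |U|.

    An indiscernibility class lies in the positive region when all of its
    objects carry the same decision label.
    """
    if not feats:
        return 0.0
    pos = 0
    for cls in partition(rows, feats):
        if len({labels[i] for i in cls}) == 1:
            pos += len(cls)
    return pos / len(rows)

def quickreduct(rows, labels):
    """Greedily grow a feature subset until it matches the full-set dependency."""
    all_feats = list(range(len(rows[0])))
    full = dependency(rows, labels, all_feats)
    reduct = []
    while dependency(rows, labels, reduct) < full:
        best = max((f for f in all_feats if f not in reduct),
                   key=lambda f: dependency(rows, labels, reduct + [f]))
        reduct.append(best)
    return reduct
```

For example, on a four-object dataset whose decision is determined entirely by the first feature, the search keeps that feature and discards the redundant second one, since only set membership tests and counts are involved, matching the paper's point about computational simplicity.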





Copyright information

© Springer Berlin Heidelberg 2007

Authors and Affiliations

  • Qiang Shen, Department of Computer Science, The University of Wales, Aberystwyth SY23 3DB, UK
