Rule Extraction and Reduction for Hyper Surface Classification

  • Qing He
  • Jincheng Li
  • Zhongzhi Shi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5552)

Abstract

Hyper Surface Classification (HSC), which is based on the Jordan Curve Theorem in topology, is an accurate and efficient classification algorithm. The hyper surface obtained by the training process generalizes well on datasets that are both large in size and high in dimensionality. The classification knowledge hidden in the classifier, however, is hard for humans to interpret, so obtaining explicit classification rules is an important problem. In this paper, we first extract rules directly from the samples. To avoid rule redundancy, two optimization policies, selecting a Minimal Consistent Subset (MCS) of the training set and merging neighboring cubes, are applied to reduce the rule set. Experimental results show that the two policies accurately acquire the knowledge implied by the hyper surface and preserve the good generalization performance of HSC. Moreover, the time needed to classify unlabeled samples with the rule set is correspondingly shortened.
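The MCS idea above can be illustrated with Hart's condensed nearest neighbor procedure, which greedily builds a consistent subset of the training set. This is a sketch of the general consistent-subset technique, not the paper's exact selection algorithm; the data and the 1-NN consistency criterion here are illustrative assumptions.

```python
import numpy as np

def condensed_nearest_neighbor(X, y):
    """Greedily select a consistent subset (Hart, 1968): every training
    sample is correctly classified by a 1-NN rule that uses only the
    selected subset. Such subsets approximate the Minimal Consistent
    Subset used to shrink a rule set without changing its decisions."""
    store = [0]  # seed the store with the first sample
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in store:
                continue
            # 1-NN prediction using only the stored samples
            dists = np.linalg.norm(X[store] - X[i], axis=1)
            nearest = store[int(np.argmin(dists))]
            if y[nearest] != y[i]:  # misclassified: add sample to store
                store.append(i)
                changed = True
    return sorted(store)

# toy example: two well-separated clusters need only a few prototypes
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
y = np.array([0, 0, 0, 1, 1, 1])
subset = condensed_nearest_neighbor(X, y)  # a small consistent subset
```

On this toy data the procedure keeps one prototype per cluster, so the reduced "rule set" classifies all six samples correctly while storing only two of them.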

Keywords

HSC · MCS · Rule Extraction · Rule Reduction


References

  1. He, Q., Shi, Z.Z., Ren, L.A.: The Classification Method Based on Hyper Surface. In: Proc. 2002 IEEE Int. Joint Conference on Neural Networks, pp. 1499–1503 (2002)
  2. He, Q., Shi, Z.Z., Ren, L.A., Lee, E.S.: A Novel Classification Method Based on Hyper Surface. Int. J. of Mathematical and Computer Modeling 38, 395–407 (2003)
  3. He, Q., Zhao, X.R., Shi, Z.Z.: Classification based on dimension transposition for high dimension data. Soft Computing 11, 329–334 (2006)
  4. Zhao, X.R., He, Q., Shi, Z.Z.: Hyper surface classifiers ensemble for high dimensional data sets. In: 3rd Int. Symp. Neural Networks, pp. 1299–1304 (2006)
  5. Gallant, S.I.: Connectionist expert systems. Communications of the ACM 31, 152–169 (1988)
  6. Andrews, R., Diederich, J., Tickle, A.B.: Survey and critique of techniques for extracting rules from trained artificial neural networks. Knowledge-Based Systems 8, 373–389 (1995)
  7. Núñez, H., Angulo, C., Català, A.: Rule extraction from support vector machines. In: Proc. 2002 European Symposium on Artificial Neural Networks, pp. 107–112 (2002)
  8. Fountoukis, S.G., Bekakos, M.P., Kontos, J.P.: Rule extraction from decision trees with complex nominal data. Parallel & Scientific Computations 9, 119–128 (2001)
  9. Hart, P.E.: The condensed nearest neighbor rule. IEEE Trans. Inform. Theory IT-14, 515–516 (1968)
  10. Gates, G.W.: The reduced nearest neighbor rule. IEEE Trans. Inform. Theory IT-18, 431–433 (1972)
  11. Cohen, W.W.: Fast effective rule induction. In: Machine Learning: Proc. of the Twelfth International Conference, pp. 115–123 (1995)
  12. Gao, B.J., Ester, M., Fraser, S., Schulte, O., Xiong, H.: The Minimum Consistent Subset Cover Problem and its Applications in Data Mining. In: Proc. the 13th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pp. 310–319 (2007)
  13. Dasarathy, B.V.: Minimal Consistent Set Identification for Optimal Nearest Neighbor Decision Systems Design. IEEE Trans. on Systems, Man, and Cybernetics 24, 511–517 (1994)

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Qing He (1, 2)
  • Jincheng Li (1, 2)
  • Zhongzhi Shi (1)
  1. Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
  2. Graduate School of the Chinese Academy of Sciences, Beijing, China