
HyperSurface Classifiers Ensemble for High Dimensional Data Sets

  • Xiu-Rong Zhao
  • Qing He
  • Zhong-Zhi Shi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)

Abstract

Based on the Jordan Curve Theorem, a universal classification method called the HyperSurface Classifier (HSC) has recently been proposed. Experimental results show that in three-dimensional space this method performs well in both accuracy and efficiency, even for datasets of up to 10^7 samples. However, what is really needed is an algorithm that can handle data that is not only massive in size but also high in dimensionality. In this paper, an ensemble approach for high-dimensional data is proposed in which the dimensions are divided among classifiers rather than reduced. The most important difference between the HSC ensemble and traditional ensembles is that the sub-datasets are obtained by dividing the features rather than by dividing the sample set. Experimental results show that this method performs well on high-dimensional datasets.
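
As a rough illustration of the feature-dividing idea (not the authors' implementation), the sketch below partitions the feature set into small groups, trains one base classifier per group on all samples, and combines the per-group predictions by majority vote. HSC itself is not assumed to be available here, so a scikit-learn decision tree stands in as the base learner; the group size of 3 (echoing HSC's three-dimensional working space) and all function names are illustrative assumptions.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier


def split_dimensions(n_features, group_size=3):
    # Divide the feature indices into contiguous low-dimensional groups
    # (group_size=3 mirrors HSC's 3-D working space; purely an assumption).
    return [list(range(i, min(i + group_size, n_features)))
            for i in range(0, n_features, group_size)]


def train_ensemble(X, y, groups):
    # One base learner per feature subset; every learner sees all samples,
    # but only its own slice of the dimensions.
    return [DecisionTreeClassifier(random_state=0).fit(X[:, g], y) for g in groups]


def predict_ensemble(models, groups, X):
    # Combine the per-subset predictions by majority vote.
    votes = np.stack([m.predict(X[:, g]) for m, g in zip(models, groups)])
    return np.apply_along_axis(lambda col: np.bincount(col).argmax(), 0, votes)


if __name__ == "__main__":
    X, y = make_classification(n_samples=2000, n_features=30,
                               n_informative=20, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    groups = split_dimensions(X.shape[1], group_size=3)
    models = train_ensemble(X_tr, y_tr, groups)
    acc = (predict_ensemble(models, groups, X_te) == y_te).mean()
    print(f"dimension-divided ensemble accuracy: {acc:.3f}")
```

The point of the sketch is only the data flow: the sub-datasets differ in which features they contain, not in which samples, so no dimension reduction is applied; the paper's actual base classifier and vote-combination scheme are those of HSC.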

Keywords

High Dimensional Data, Classifier Ensemble, Decision Attribute, Massive Size, Preferable Performance



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Xiu-Rong Zhao (1)
  • Qing He (1)
  • Zhong-Zhi Shi (1)
  1. The Key Laboratory of Intelligent Information Processing, Department of Intelligence Software, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, China
