Supervised Feature Extraction Algorithm Based on Continuous Divergence Criterion

  • Shifei Ding
  • Zhongzhi Shi
  • Fengxiang Jin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4114)


Feature extraction plays an important part in pattern recognition (PR), data mining, and machine learning, among other fields. In this paper, a novel supervised feature extraction algorithm based on the continuous divergence criterion (CDC) is proposed. First, the concept of the CDC is introduced and some of its properties are studied; it is proved that the CDC is a distance measure, i.e. it satisfies the three axioms of a distance, and can therefore be used to measure the degree of difference between the classes of a two-class problem. Second, based on the CDC, the basic principle of supervised feature extraction is studied, a new concept, the accumulated information rate (AIR), is introduced to measure the degree of feature compression for a two-class problem, and a new supervised feature extraction algorithm is constructed. Finally, experimental results demonstrate that the algorithm is valid and reliable.
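The abstract's pipeline, ranking features by a divergence-based class-separation measure and keeping the smallest subset whose accumulated information rate (AIR) passes a threshold, can be sketched as follows. This is only an illustration, not the paper's exact method: the CDC is approximated here by the symmetric Kullback-Leibler (J) divergence under a Gaussian class-conditional assumption, and the function names, the `air_threshold` parameter, and the closed-form divergence are all assumptions of this sketch.

```python
import numpy as np

def j_divergence_per_feature(X1, X2, eps=1e-12):
    """Per-feature symmetric KL (J) divergence between two classes.

    Assumes Gaussian class-conditional densities per feature; this is a
    stand-in for the paper's continuous divergence criterion (CDC).
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)
    v1 = X1.var(axis=0) + eps  # eps guards against zero variance
    v2 = X2.var(axis=0) + eps
    # Closed-form J-divergence between two 1-D Gaussians, per feature.
    return 0.5 * (v1 / v2 + v2 / v1 - 2.0) + 0.5 * (m1 - m2) ** 2 * (1.0 / v1 + 1.0 / v2)

def select_features(X1, X2, air_threshold=0.9):
    """Keep the top-ranked features whose accumulated information rate
    (cumulative divergence / total divergence) reaches air_threshold."""
    d = j_divergence_per_feature(X1, X2)
    order = np.argsort(d)[::-1]           # most discriminative first
    air = np.cumsum(d[order]) / d.sum()   # accumulated information rate
    k = int(np.searchsorted(air, air_threshold) + 1)
    return order[:k], air

# Usage on synthetic two-class data where only feature 0 separates the classes.
rng = np.random.default_rng(0)
X1 = rng.normal(0.0, 1.0, (200, 5))
X1[:, 0] += 5.0                           # shift feature 0 in class 1
X2 = rng.normal(0.0, 1.0, (200, 5))
selected, air = select_features(X1, X2)
```

With this data, nearly all of the between-class divergence is carried by feature 0, so the AIR threshold is met after selecting it alone, compressing five features to one.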




Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Shifei Ding 1, 2
  • Zhongzhi Shi 2
  • Fengxiang Jin 3
  1. College of Information Science and Engineering, Shandong Agricultural University, Taian, P.R. China
  2. Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, P.R. China
  3. College of Geo-Information Science and Engineering, Shandong University of Science and Technology, Qingdao, P.R. China