Divergence-Based Supervised Information Feature Compression Algorithm

  • Shi-Fei Ding
  • Zhong-Zhi Shi
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3971)


In this paper, a novel supervised information feature compression algorithm based on divergence is proposed. First, drawing on information theory, we study the concept and properties of a divergence measure, the average separability information (ASI), and propose a symmetric variant, the symmetric average separability information (SASI). We prove that the SASI is a distance measure, i.e. it satisfies the three axioms of a distance, and can therefore be used to measure the degree of difference between the classes of a two-class problem. Second, based on the SASI, we give a compression theorem that can be used to design information feature compression algorithms, and on this basis we design a novel supervised information feature compression algorithm. Finally, experimental results demonstrate that the proposed algorithm is valid and reliable.
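The abstract does not give the SASI formula or the compression theorem, so the following is only an illustrative sketch: it assumes SASI behaves like the symmetrized Kullback-Leibler (J-) divergence between the two class-conditional feature distributions, estimated from per-feature histograms, and that "compression" means keeping the k features with the largest SASI score. The function names `sasi_per_feature` and `compress` are hypothetical, not from the paper.

```python
# Hedged sketch: SASI approximated by the symmetrized KL (J-) divergence
# per feature; this is an assumption, not the paper's exact definition.
import numpy as np

def sasi_per_feature(x0, x1, bins=10, eps=1e-9):
    """Symmetric separability score for each feature column.

    x0, x1: (n_samples, n_features) arrays for class 0 and class 1.
    Returns a (n_features,) array of symmetric KL divergences.
    """
    n_features = x0.shape[1]
    scores = np.empty(n_features)
    for j in range(n_features):
        # Histogram both classes over a shared range so bins align.
        lo = min(x0[:, j].min(), x1[:, j].min())
        hi = max(x0[:, j].max(), x1[:, j].max())
        p, _ = np.histogram(x0[:, j], bins=bins, range=(lo, hi))
        q, _ = np.histogram(x1[:, j], bins=bins, range=(lo, hi))
        p = p / p.sum() + eps  # eps avoids log(0) on empty bins
        q = q / q.sum() + eps
        # J-divergence = KL(p||q) + KL(q||p) = sum((p - q) * log(p / q));
        # each term is non-negative, and the sum is symmetric in p and q.
        scores[j] = np.sum((p - q) * np.log(p / q))
    return scores

def compress(x0, x1, k):
    """Keep the indices of the k most class-separable features."""
    idx = np.argsort(sasi_per_feature(x0, x1))[::-1][:k]
    return np.sort(idx)

rng = np.random.default_rng(0)
# Synthetic two-class data: feature 0 separates the classes, feature 1 is noise.
x0 = np.column_stack([rng.normal(0, 1, 500), rng.normal(0, 1, 500)])
x1 = np.column_stack([rng.normal(3, 1, 500), rng.normal(0, 1, 500)])
print(compress(x0, x1, k=1))
```

Because (p - q) and log(p / q) always share a sign bin by bin, every term of the score is non-negative, which is why a symmetrized divergence of this kind can serve as a separability (distance-like) measure between the two classes.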


Keywords: Feature Vector · Pattern Vector · True Distance · Feature Compression · Information Compression




References

  1. Duda, R.O., Hart, P.E.: Pattern Classification and Scene Analysis. Wiley, New York (1973)
  2. Devroye, L., Gyorfi, L., Lugosi, G.: A Probabilistic Theory of Pattern Recognition. Springer, New York (1996)
  3. Fukunaga, K.: Introduction to Statistical Pattern Recognition, 2nd edn. Academic Press, New York (1990)
  4. Hand, D.J.: Discrimination and Classification. Wiley, New York (1981)
  5. Turk, M., Pentland, A.: Eigenfaces for Recognition. Journal of Cognitive Neuroscience 3(1), 71–86 (1991)
  6. Yang, J., Yang, J.Y.: A Generalized K-L Expansion Method That Can Deal with Small Sample Size and High-dimensional Problems. Pattern Analysis and Applications 6(6), 47–54 (2003)
  7. Cover, T.M., Thomas, J.A.: Elements of Information Theory. Wiley, New York (1991)
  8. Ding, S.F., Shi, Z.Z.: Symmetric Cross Entropy and Information Feature Compression Algorithm. Journal of Computational Information Systems 1(2), 247–252 (2005)
  9. Chen, S.H., Lu, C.L.: An Entropic Approach to Dimensionality Reduction in the Representation Space on Discrete Processes. Journal of Nanjing Institute of Meteorology 24(1), 74–82 (2001)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Shi-Fei Ding (1, 2)
  • Zhong-Zhi Shi (2)

  1. College of Information Science and Engineering, Shandong Agricultural University, Taian, P.R. China
  2. Key Laboratory of Intelligent Information Processing, Institute of Computing Technology, Chinese Academy of Sciences, Beijing, P.R. China
