On the application of probabilistic distance measures for the extraction of features from imperfectly labeled patterns

  • C. Chitti Babu


A commonly used approach to feature selection is to select those features that extremize certain probabilistic distance measures. Most such procedures assume that the labels of the patterns are perfect, yet there are many practical situations in which the labels are imperfect. This paper examines the applicability of extremizing the Bhattacharyya distance, the divergence, the equivocation, the Kolmogorov variational distance, and the Matusita distance as criteria for selecting effective features from imperfectly labeled patterns.
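To illustrate the criteria named above, the following sketch computes several of these distance measures between two discretized class-conditional feature distributions and ranks candidate features by Bhattacharyya distance. All distributions, the feature names, and the mislabeling rate are hypothetical, chosen only to show the mechanics; this is not the paper's own procedure.

```python
import numpy as np

def bhattacharyya(p, q):
    # B = -ln( sum_x sqrt(p(x) q(x)) ); larger B means better separability
    return -np.log(np.sum(np.sqrt(p * q)))

def divergence(p, q, eps=1e-12):
    # Symmetric (Jeffreys) divergence J = sum_x (p - q) ln(p / q)
    return np.sum((p - q) * np.log((p + eps) / (q + eps)))

def variational(p, q):
    # Kolmogorov variational distance: sum_x |p(x) - q(x)|
    return np.sum(np.abs(p - q))

def matusita(p, q):
    # Matusita distance: sqrt( sum_x (sqrt(p) - sqrt(q))^2 )
    return np.sqrt(np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

# Two hypothetical features: A separates the two classes well, B poorly.
p1_A = np.array([0.7, 0.2, 0.1]); p2_A = np.array([0.1, 0.2, 0.7])
p1_B = np.array([0.4, 0.3, 0.3]); p2_B = np.array([0.35, 0.3, 0.35])

# Rank features by Bhattacharyya distance (extremize = pick the largest).
scores = {"A": bhattacharyya(p1_A, p2_A), "B": bhattacharyya(p1_B, p2_B)}
best = max(scores, key=scores.get)
print(best)  # prints "A"

# Imperfect labels: with a mislabeling rate beta, the observed
# class-conditional densities become mixtures of the true ones.
beta = 0.2
q1_A = (1 - beta) * p1_A + beta * p2_A
q2_A = (1 - beta) * p2_A + beta * p1_A
# The mixing attenuates every distance between the observed densities,
# which is why the paper asks whether extremization remains a valid
# selection criterion under imperfect labels.
assert bhattacharyya(q1_A, q2_A) < bhattacharyya(p1_A, p2_A)
```

Note that all four measures are monotone in class separability on this example, so they pick the same feature; the paper's question is whether that agreement survives when only label-noise-corrupted densities are observable.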







Copyright information

© Plenum Publishing Corporation 1973

Authors and Affiliations

  • C. Chitti Babu
  1. School of Engineering, University of California, Irvine
