Non-iterative Heteroscedastic Linear Dimension Reduction for Two-Class Data

From Fisher to Chernoff
  • Marco Loog
  • Robert P. W. Duin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2396)

Abstract

Linear discriminant analysis (LDA) is a traditional solution to the linear dimension reduction (LDR) problem, based on maximizing the between-class scatter over the within-class scatter. This solution cannot deal with heteroscedastic data properly, because it implicitly assumes that the covariance matrices of all classes are equal. Hence, the discriminatory information contained in the differences between the class covariance matrices is not used, and, as a consequence, the data can be reduced to only a single dimension in the two-class case. We propose a fast, non-iterative, eigenvector-based LDR technique for heteroscedastic two-class data, which generalizes and improves upon LDA by dealing with the aforementioned problem. For this purpose, we use the concept of directed distance matrices, which generalizes the between-class covariance matrix such that it also captures the differences in (co)variances.
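To make the construction concrete: Fisher's criterion seeks directions a maximizing (aᵀ S_B a)/(aᵀ S_W a), where S_B and S_W are the between- and within-class scatter matrices; for two classes S_B has rank one, hence the single-dimension limit. Below is a minimal NumPy/SciPy sketch of a Chernoff-style directed distance matrix in the spirit described above. The function name chernoff_ldr and the exact weighting of the log-covariance term are illustrative assumptions made for this sketch, not the authors' verbatim algorithm, and should be checked against the paper.

    import numpy as np
    from scipy.linalg import eigh, fractional_matrix_power, logm

    def chernoff_ldr(X1, X2, d):
        """Project two-class data (n_i x D arrays) onto d dimensions."""
        n1, n2 = len(X1), len(X2)
        p1, p2 = n1 / (n1 + n2), n2 / (n1 + n2)     # class priors
        m = X1.mean(axis=0) - X2.mean(axis=0)       # mean difference
        S1 = np.cov(X1, rowvar=False)
        S2 = np.cov(X2, rowvar=False)
        SW = p1 * S1 + p2 * S2                      # within-class scatter
        W = fractional_matrix_power(SW, -0.5)       # whitening transform SW^(-1/2)
        S1w, S2w = W @ S1 @ W, W @ S2 @ W           # whitened class covariances
        mw = W @ m
        # Directed distance matrix in the whitened space: the outer product is
        # the whitened between-class scatter (Fisher's term); the matrix-log
        # term captures discriminatory information in the covariance
        # differences and vanishes when S1 == S2.
        SC = np.outer(mw, mw) - (p1 * logm(S1w) + p2 * logm(S2w)) / (p1 * p2)
        SC = (SC + SC.T) / 2                        # symmetrize against round-off
        vals, vecs = eigh(SC)
        top = np.argsort(vals)[::-1][:d]            # keep d largest eigenvalues
        return vecs[:, top].T @ W                   # rows project: z = A @ x

    # Usage: A = chernoff_ldr(X1, X2, d=2); Z1, Z2 = X1 @ A.T, X2 @ A.T

In the homoscedastic case the log terms vanish and the sketch reduces to classical Fisher LDA, recovering at most one discriminative direction for two classes; with unequal covariances the directed distance matrix generally has full rank, so reducing to d > 1 dimensions becomes meaningful.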

Keywords

Linear Discriminant Analysis, Covariance Matrices, Eigenvalue Decomposition, Discriminatory Information, Fisher Criterion


Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Marco Loog (1)
  • Robert P. W. Duin (2)
  1. Image Sciences Institute, University Medical Center Utrecht, Utrecht, The Netherlands
  2. Pattern Recognition Group, Department of Applied Physics, Delft University of Technology, Delft, The Netherlands
