Frontiers of Computer Science, Volume 8, Issue 5, pp 785–792

Linear discriminant analysis with worst between-class separation and average within-class compactness

Research Article

Abstract

Linear discriminant analysis (LDA) is one of the most popular supervised dimensionality reduction (DR) techniques; it obtains discriminant projections by maximizing the ratio of the average-case between-class scatter to the average-case within-class scatter. Two recent discriminant analysis (DA) algorithms, minimal distance maximization (MDM) and worst-case LDA (WLDA), instead obtain projections by optimizing worst-case scatters. In this paper, we develop a new LDA framework, LDA with worst between-class separation and average within-class compactness (WSAC), which maximizes the ratio of the worst-case between-class scatter to the average-case within-class scatter. This is achieved by relaxing the trace ratio optimization to a distance metric learning problem. Comparative experiments demonstrate the effectiveness of WSAC. In addition, DA counterparts that use the local geometry of data or the kernel trick can likewise be embedded into our framework and solved in the same way.
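Since the abstract defines WSAC as maximizing the ratio of the worst-case between-class scatter to the average-case within-class scatter, a minimal NumPy sketch of that criterion may help fix ideas. This is only an illustration of the objective, not the paper's method: WSAC actually relaxes the trace ratio to a distance metric learning problem, whereas the snippet below merely evaluates the criterion for a given projection, and the function names and toy data are our own assumptions.

```python
import numpy as np
from itertools import combinations

def scatter_matrices(X, y):
    """Pairwise between-class scatters S_b^{ij} and the pooled
    (average) within-class scatter S_w of labeled data (X, y)."""
    classes = np.unique(y)
    d = X.shape[1]
    means = {c: X[y == c].mean(axis=0) for c in classes}
    Sw = np.zeros((d, d))
    for c in classes:
        Xc = X[y == c] - means[c]        # center each class
        Sw += Xc.T @ Xc
    Sw /= len(X)
    # One rank-one between-class scatter per pair of classes.
    Sb_pairs = [np.outer(means[i] - means[j], means[i] - means[j])
                for i, j in combinations(classes, 2)]
    return Sb_pairs, Sw

def wsac_criterion(W, X, y):
    """Worst pairwise between-class scatter over the average
    within-class scatter, both measured after projecting by W (d x r)."""
    Sb_pairs, Sw = scatter_matrices(X, y)
    worst_between = min(np.trace(W.T @ Sb @ W) for Sb in Sb_pairs)
    avg_within = np.trace(W.T @ Sw @ W)
    return worst_between / avg_within

# Toy usage: three Gaussian classes in 3-D, scored under a random
# orthonormal 2-D projection (illustrative data, not from the paper).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(m, 0.5, size=(30, 3))
               for m in ([0, 0, 0], [3, 0, 0], [0, 3, 0])])
y = np.repeat([0, 1, 2], 30)
W, _ = np.linalg.qr(rng.normal(size=(3, 2)))
print(wsac_criterion(W, X, y))
```

A projection scoring high under this criterion keeps even the two closest classes apart while holding the average class spread down, which is the worst-case flavor that distinguishes WSAC (and MDM/WLDA) from classical average-case LDA.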

Keywords

dimensionality reduction; linear discriminant analysis; the worst separation; the average compactness


References

1. Fukunaga K. Introduction to Statistical Pattern Recognition. Academic Press, 1990
2. Duda R, Hart P, Stork D. Pattern Classification. 2nd ed. New York: John Wiley & Sons, 2001
3. Yang B, Chen S, Wu X. A structurally motivated framework for discriminant analysis. Pattern Analysis and Applications, 2011, 14(4): 349–367
4. Oh J H, Kwak N. Generalization of linear discriminant analysis using Lp-norm. Pattern Recognition Letters, 2013, 34(6): 679–685
5. Ching W, Chu D, Liao L, Wang X. Regularized orthogonal linear discriminant analysis. Pattern Recognition, 2012, 45(7): 2719–2732
6. Li H, Jiang T, Zhang K. Efficient and robust feature extraction by maximum margin criterion. IEEE Transactions on Neural Networks, 2006, 17(1): 157–165
7. Nenadic Z. Information discriminant analysis: feature extraction with an information-theoretic objective. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(8): 1394–1407
8. Zhu M, Martinez A. Subclass discriminant analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2006, 28(8): 1274–1286
9. Gao Q, Liu J, Zhang H, Hou J, Yang X. Enhanced Fisher discriminant criterion for image recognition. Pattern Recognition, 2012, 45(10): 3717–3724
10. Cai D, He X, Zhou K, Han J, Bao H. Locality sensitive discriminant analysis. In: Proceedings of the International Joint Conference on Artificial Intelligence. 2007, 141–146
11. Fan Z, Xu Y, Zhang D. Local linear discriminant analysis framework using sample neighbors. IEEE Transactions on Neural Networks, 2011, 22(7): 1119–1132
12. Xu B, Huang K, Liu C. Dimensionality reduction by minimal distance maximization. In: Proceedings of the 20th International Conference on Pattern Recognition. 2010, 569–572
13. Zhang Y, Yeung D. Worst-case linear discriminant analysis. In: Proceedings of Advances in Neural Information Processing Systems. 2010, 2568–2576
14. Ying Y, Li P. Distance metric learning with eigenvalue optimization. Journal of Machine Learning Research, 2012, 13: 1–26
15. Overton M, Womersley R. Optimality conditions and duality theory for minimizing sums of the largest eigenvalues of symmetric matrices. Mathematical Programming, 1993, 62(2): 321–357
16. Overton M. On minimizing the maximum eigenvalue of a symmetric matrix. SIAM Journal on Matrix Analysis and Applications, 1988, 9(2): 256–268
17. Zhang Y, Yeung D. Semi-supervised generalized discriminant analysis. IEEE Transactions on Neural Networks, 2011, 22(8): 1207–1217
18. Mika S, Ratsch G, Weston J, Scholkopf B, Smola A, Muller K. Constructing descriptive and discriminative nonlinear features: Rayleigh coefficients in kernel feature spaces. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2003, 25(5): 623–628
19. Duin R, Loog M. Linear dimensionality reduction via a heteroscedastic extension of LDA: the Chernoff criterion. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2004, 26(6): 732–739
20. Frank A, Asuncion A. UCI Machine Learning Repository. http://archive.ics.uci.edu/ml
21. Belhumeur P, Hespanha J, Kriegman D. Eigenfaces vs. Fisherfaces: recognition using class specific linear projection. IEEE Transactions on Pattern Analysis and Machine Intelligence, 1997, 19(7): 711–720
22. Martinez A, Benavente R. The AR face database. Technical Report 24, CVC, 1998. http://www2.ece.ohio-state.edu/~aleix/ARdatabase.html
23. Nene S, Nayar S, Murase H. Columbia Object Image Library (COIL-20). Technical Report 005, CUCS, 1996. http://www1.cs.columbia.edu/CAVE/software/softlib/coil-20.php

Copyright information

© Higher Education Press and Springer-Verlag Berlin Heidelberg 2014

Authors and Affiliations

1. College of Computer Science and Technology, Nanjing University of Aeronautics and Astronautics, Nanjing, China
