
A Gradient Linear Discriminant Analysis for Small Sample Sized Problem


Abstract

The purpose of conventional linear discriminant analysis (LDA) is to find an orientation that projects high-dimensional feature vectors of different classes onto a more manageable low-dimensional space in the most discriminative way for classification. The LDA technique uses an eigenvalue decomposition (EVD) to find such an orientation, and this computation is usually adversely affected by the small sample size problem. In this paper we present a new direct LDA method, called gradient LDA, for computing the orientation, especially for the small sample size problem. A gradient descent based method is used for this purpose. It also avoids discarding the null spaces of the within-class and between-class scatter matrices, which may contain discriminative information useful for classification.
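To make the general idea concrete, the following is a minimal sketch of maximizing the Fisher criterion J(w) = (wᵀ S_b w)/(wᵀ S_w w) by gradient ascent in NumPy, i.e. searching for a discriminative direction without an explicit eigenvalue decomposition. This is only an illustration of the approach described in the abstract, not the authors' exact gradient LDA algorithm; the function names, learning rate, and iteration count below are assumptions.

# Hypothetical sketch: gradient ascent on the Fisher criterion
# J(w) = (w' S_b w) / (w' S_w w) instead of an eigenvalue decomposition.
# Not the authors' exact gradient LDA procedure; names and parameters are assumptions.
import numpy as np

def scatter_matrices(X, y):
    """Within-class (S_w) and between-class (S_b) scatter matrices."""
    mean_all = X.mean(axis=0)
    d = X.shape[1]
    S_w = np.zeros((d, d))
    S_b = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mean_c = Xc.mean(axis=0)
        S_w += (Xc - mean_c).T @ (Xc - mean_c)
        diff = (mean_c - mean_all).reshape(-1, 1)
        S_b += len(Xc) * (diff @ diff.T)
    return S_w, S_b

def gradient_lda_direction(S_w, S_b, lr=0.01, n_iter=5000, eps=1e-12, seed=0):
    """One projection direction found by gradient ascent on the Fisher ratio."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=S_w.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        num = w @ S_b @ w            # between-class spread along w
        den = w @ S_w @ w + eps      # within-class spread along w
        J = num / den
        # Gradient of the Rayleigh-quotient-like criterion J(w)
        grad = (2.0 / den) * (S_b @ w - J * (S_w @ w))
        w += lr * grad
        w /= np.linalg.norm(w)       # keep the direction unit length
    return w

# Toy usage: two Gaussian classes in 5 dimensions with few samples per class
X = np.vstack([np.random.randn(6, 5) + 2, np.random.randn(6, 5) - 2])
y = np.array([0] * 6 + [1] * 6)
S_w, S_b = scatter_matrices(X, y)
w = gradient_lda_direction(S_w, S_b)
print("projected samples, class 0:", X[y == 0] @ w)
print("projected samples, class 1:", X[y == 1] @ w)

In the genuine small sample size setting the within-class scatter matrix is singular, which is precisely where a gradient search over the full space, rather than an EVD restricted to non-null subspaces, can retain the discriminative information that the abstract refers to.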



Author information


Correspondence to Alok Sharma.


Cite this article

Sharma, A., Paliwal, K.K. A Gradient Linear Discriminant Analysis for Small Sample Sized Problem. Neural Process Lett 27, 17–24 (2008). https://doi.org/10.1007/s11063-007-9056-7

