
Linear Discriminant Analysis

Robust Data Mining

Part of the book series: SpringerBriefs in Optimization ((BRIEFSOPTI))

Abstract

In this chapter we discuss another popular data mining algorithm that can be used for supervised or unsupervised learning. Linear Discriminant Analysis (LDA) was proposed by R. A. Fisher in 1936. It consists of finding the projection hyperplane that minimizes the within-class variance and maximizes the distance between the projected means of the classes. Similarly to PCA, these two objectives can be achieved by solving an eigenvalue problem, with the corresponding eigenvector defining the hyperplane of interest. This hyperplane can be used for classification, for dimensionality reduction, and for interpreting the importance of the given features. In the first part of the chapter we discuss the generic formulation of LDA, whereas in the second we present the robust counterpart scheme originally proposed by Kim and Boyd. We also discuss the nonlinear extension of LDA through the kernel transformation.
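The criterion behind this formulation is the Fisher ratio J(w) = (w^T S_b w) / (w^T S_w w), where S_b and S_w denote the between-class and within-class scatter matrices; for two classes its maximizer has the closed form w ∝ S_w^{-1}(m_1 − m_2), with m_1 and m_2 the class means. The sketch below is not taken from the chapter: it is a minimal NumPy illustration of this two-class case, with an illustrative function name and toy data.

```python
import numpy as np

def fisher_lda_direction(X1, X2):
    """Fisher discriminant direction for two classes.

    X1, X2: arrays of shape (n_i, n_features) holding the samples of each
    class. Returns the unit vector w maximizing the ratio of between-class
    to within-class scatter, J(w) = (w^T S_b w) / (w^T S_w w).
    """
    m1, m2 = X1.mean(axis=0), X2.mean(axis=0)

    # Within-class scatter: sum of the scatter matrices of the two classes.
    S_w = (X1 - m1).T @ (X1 - m1) + (X2 - m2).T @ (X2 - m2)

    # Two-class closed form of the eigenvalue problem: w ∝ S_w^{-1}(m1 - m2).
    # Solve the linear system rather than inverting S_w explicitly.
    w = np.linalg.solve(S_w, m1 - m2)
    return w / np.linalg.norm(w)

# Toy usage: two Gaussian clouds in two dimensions.
rng = np.random.default_rng(0)
X1 = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(50, 2))
X2 = rng.normal(loc=[3.0, 1.0], scale=1.0, size=(50, 2))
w = fisher_lda_direction(X1, X2)

# Project onto w and classify by the midpoint of the projected class means.
threshold = 0.5 * (X1.mean(axis=0) + X2.mean(axis=0)) @ w
print("direction:", w, "threshold:", float(threshold))
```

Projecting the data onto w reduces it to one dimension while preserving class separation, which is the dimensionality-reduction use of LDA mentioned above.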


References

  1. Baudat, G., Anouar, F.: Generalized discriminant analysis using a kernel approach. Neural Computation 12(10), 2385–2404 (2000)

  2. Boyd, S., Vandenberghe, L.: Convex Optimization. Cambridge University Press (2004)

  3. Fisher, R.: The use of multiple measurements in taxonomic problems. Annals of Eugenics 7(2), 179–188 (1936)

  4. Kim, S.J., Boyd, S.: A minimax theorem with applications to machine learning, signal processing, and finance. SIAM Journal on Optimization 19(3), 1344–1367 (2008)

  5. Kim, S.J., Magnani, A., Boyd, S.: Robust Fisher discriminant analysis. Advances in Neural Information Processing Systems 18, 659 (2006)

  6. Rao, C.: The utilization of multiple measurements in problems of biological classification. Journal of the Royal Statistical Society, Series B (Methodological) 10(2), 159–203 (1948)

  7. Shawe-Taylor, J., Cristianini, N.: Kernel Methods for Pattern Analysis. Cambridge University Press (2004)

  8. Sion, M.: On general minimax theorems. Pacific Journal of Mathematics 8(1), 171–176 (1958)


Copyright information

© 2013 Petros Xanthopoulos, Panos M. Pardalos, Theodore B. Trafalis

About this chapter

Cite this chapter

Xanthopoulos, P., Pardalos, P.M., Trafalis, T.B. (2013). Linear Discriminant Analysis. In: Robust Data Mining. SpringerBriefs in Optimization. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-9878-1_4
