Matrix Metric Adaptation Linear Discriminant Analysis of Biomedical Data

  • M. Strickert
  • J. Keilwagen
  • F.-M. Schleif
  • T. Villmann
  • M. Biehl
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5517)


A structurally simple yet powerful formalism is presented for adapting attribute combinations in high-dimensional data, given categorical class labels. A rank-1 Mahalanobis distance is optimized so as to maximize between-class variability while minimizing within-class variability. This optimization target resembles Fisher's linear discriminant analysis (LDA), but the proposed formulation is more general and yields improved class separation, as demonstrated on spectral data and gene expression data.
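The optimization target sketched above can be made concrete with a small example. The code below is a minimal illustration, not the authors' implementation: it assumes the rank-1 Mahalanobis distance takes the form d(x, y) = ((x − y)ᵀw)², with a single adaptive direction w, and scores w by a Fisher-style ratio of between-class to within-class variance of the one-dimensional projection z = Xw. The function names and the toy data are hypothetical.

```python
# Hedged sketch of a rank-1 Mahalanobis distance and a Fisher-style
# class-separation score; w is the single adaptive direction assumed here.
import numpy as np

def rank1_mahalanobis(x, y, w):
    """Rank-1 Mahalanobis distance: (x - y)^T w w^T (x - y) = (w . (x - y))^2."""
    d = np.dot(w, x - y)
    return d * d

def fisher_ratio(X, labels, w):
    """Between-class over within-class variance of the 1-D projection z = X w."""
    z = X @ w
    grand_mean = z.mean()
    between = within = 0.0
    for c in np.unique(labels):
        zc = z[labels == c]
        between += len(zc) * (zc.mean() - grand_mean) ** 2
        within += ((zc - zc.mean()) ** 2).sum()
    return between / within

# Toy data: two Gaussian classes separated along the first attribute only.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (50, 3)),
               rng.normal([4.0, 0.0, 0.0], 1.0, (50, 3))])
y = np.array([0] * 50 + [1] * 50)

w_good = np.array([1.0, 0.0, 0.0])  # aligned with the discriminative direction
w_bad  = np.array([0.0, 1.0, 0.0])  # orthogonal to it
print(fisher_ratio(X, y, w_good) > fisher_ratio(X, y, w_bad))  # True
```

A gradient-based optimizer would then adapt w (and, in the general matrix case, a full transformation) to maximize this ratio; classical LDA recovers the same direction in closed form for the two-class Gaussian case.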


Keywords: adaptive Mahalanobis distance, LDA





Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • M. Strickert (1)
  • J. Keilwagen (1)
  • F.-M. Schleif (2)
  • T. Villmann (3)
  • M. Biehl (4)
  1. Research Group Data Inspection, IPK Gatersleben, Germany
  2. Research Group Computational Intelligence, University of Leipzig, Germany
  3. Dept. of Computer Science, University of Applied Sciences Mittweida, Germany
  4. Intelligent Systems Group, University of Groningen, The Netherlands
