Local Metric Adaptation for Soft Nearest Prototype Classification to Classify Proteomic Data

  • F.-M. Schleif
  • T. Villmann
  • B. Hammer
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3849)

Abstract

We propose a new method for the construction of nearest prototype classifiers which is based on a Gaussian mixture approach interpreted as an annealed version of Learning Vector Quantization. Thereby we allow the adaptation of the underlying metric, which is useful in proteomic research. The algorithm performs a gradient descent on a cost function adapted from soft nearest prototype classification. We investigate the properties of the algorithm and assess its performance on two clinical cancer data sets. The results show that the algorithm performs reliably in comparison with alternative state-of-the-art classifiers.
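
The abstract outlines the main ingredients of the method: soft (Gaussian) assignments of data points to labeled prototypes, a misclassification cost in the style of soft nearest prototype classification, and gradient descent that also adapts the metric. The sketch below is a minimal illustration of this general idea only, not the published algorithm: it uses a per-prototype diagonal relevance weighting as the local metric, and the class name SoftNPCLocalMetric, the simplified cost, the update rules, and all parameter names are assumptions made for the example.

    # Minimal, assumption-laden sketch: a soft nearest prototype classifier with a
    # per-prototype (local) diagonal relevance metric, trained by stochastic
    # gradient descent on a soft misclassification cost. Not the authors' code.
    import numpy as np

    class SoftNPCLocalMetric:
        def __init__(self, protos_per_class=1, tau=1.0, lr_w=0.05, lr_lam=0.005,
                     epochs=50, seed=0):
            self.k = protos_per_class
            self.tau2 = tau ** 2          # softness of the Gaussian assignments
            self.lr_w = lr_w              # prototype learning rate
            self.lr_lam = lr_lam          # relevance (metric) learning rate
            self.epochs = epochs
            self.rng = np.random.default_rng(seed)

        def _dist(self, x):
            # Relevance-weighted squared distances from x to all prototypes.
            diff = x - self.W                                  # (P, D)
            return np.sum(self.Lam * diff ** 2, axis=1)        # (P,)

        def fit(self, X, y):
            classes = np.unique(y)
            D = X.shape[1]
            # k prototypes per class, initialised at the class mean plus noise.
            self.W = np.vstack([X[y == c].mean(axis=0)
                                + 0.01 * self.rng.standard_normal((self.k, D))
                                for c in classes])
            self.c = np.repeat(classes, self.k)                # prototype labels
            # One relevance vector per prototype (local diagonal metric), row sum 1.
            self.Lam = np.full((len(self.c), D), 1.0 / D)

            for _ in range(self.epochs):
                for i in self.rng.permutation(len(X)):
                    x, yi = X[i], y[i]
                    d = self._dist(x)
                    u = np.exp(-(d - d.min()) / (2 * self.tau2))
                    u /= u.sum()                               # soft assignments
                    wrong = (self.c != yi).astype(float)
                    lc = float(wrong @ u)   # local cost: mass on wrongly labeled prototypes
                    # g_m = u_m * (wrong_m - lc) / (2 tau^2) = -d(lc)/d(d_m)
                    g = u * (wrong - lc) / (2 * self.tau2)
                    diff = x - self.W
                    # Descent step: correct prototypes are attracted, wrong ones repelled.
                    self.W -= self.lr_w * (2 * g)[:, None] * self.Lam * diff
                    # Metric update, then projection back onto (>= 0, row sum 1).
                    self.Lam += self.lr_lam * g[:, None] * diff ** 2
                    self.Lam = np.clip(self.Lam, 1e-8, None)
                    self.Lam /= self.Lam.sum(axis=1, keepdims=True)
            return self

        def predict(self, X):
            return np.array([self.c[np.argmin(self._dist(x))] for x in X])

    # Hypothetical usage on an (n_samples, n_features) spectrum matrix X with labels y:
    # clf = SoftNPCLocalMetric(protos_per_class=2, tau=0.5).fit(X_train, y_train)
    # accuracy = np.mean(clf.predict(X_test) == y_test)

In this sketch, prototypes carrying the wrong label are pushed away from a training point in proportion to their soft assignment, correctly labeled prototypes are attracted, and each prototype's relevance vector is renormalised after every step so that it stays a valid weighting of the input dimensions.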

Keywords

Classification · Learning vector quantization · Metric adaptation · Mass spectrometry · Proteomic profiling



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • F.-M. Schleif (1, 3)
  • T. Villmann (2)
  • B. Hammer (4)

  1. Dept. of Mathematics and Computer Science, University of Leipzig, Leipzig, Germany
  2. Clinic for Psychotherapy, University of Leipzig, Leipzig, Germany
  3. Bruker Daltonik GmbH, Leipzig, Germany
  4. Dept. of Computer Science, Clausthal University of Technology, Germany
