Discriminant Projections Embedding for Nearest Neighbor Classification

  • Petia Radeva
  • Jordi Vitrià
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3287)

Abstract

In this paper we introduce a new embedding technique that linearly projects labeled data samples into a new space in which the performance of a Nearest Neighbor classifier is improved. The approach is based on generating a large set of simple discriminant projections and finding the subset with the highest classification performance. To implement the feature selection process, we propose the use of the AdaBoost algorithm. The performance of this technique is tested on a multiclass classification problem related to the production of cork stoppers for wine bottles.
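The pipeline described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the authors' exact algorithm: the candidate pool here consists of pairwise Fisher directions, and a simple greedy ranking by leave-one-out 1-NN accuracy stands in for the AdaBoost-based selection step; all data and dimensions are hypothetical stand-ins for the cork-stopper problem.

```python
# Sketch: build a pool of simple discriminant projections (one Fisher
# direction per class pair), score each by 1-NN performance on the
# projected data, and keep the top-scoring subset as the embedding.
# The paper selects projections with AdaBoost; greedy ranking is used
# here only to keep the example short.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Toy 3-class data in 5 dimensions (hypothetical, not the cork dataset).
n_per_class, dim, n_classes = 30, 5, 3
X = np.vstack([rng.normal(loc=3.0 * c, scale=1.0, size=(n_per_class, dim))
               for c in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

def fisher_direction(Xa, Xb):
    """1-D Fisher discriminant direction for a pair of classes."""
    Sw = np.cov(Xa, rowvar=False) + np.cov(Xb, rowvar=False)
    w = np.linalg.solve(Sw + 1e-6 * np.eye(Xa.shape[1]),
                        Xa.mean(0) - Xb.mean(0))
    return w / np.linalg.norm(w)

# Candidate pool: one simple discriminant projection per class pair.
pool = [fisher_direction(X[y == a], X[y == b])
        for a, b in combinations(range(n_classes), 2)]

def nn_accuracy(Z, labels):
    """Leave-one-out 1-NN accuracy in the embedded space."""
    D = np.linalg.norm(Z[:, None, :] - Z[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)          # exclude each point itself
    return float(np.mean(labels[D.argmin(axis=1)] == labels))

# Rank projections by how well 1-NN performs on each 1-D embedding,
# then stack the best ones into the final linear embedding W.
scores = [nn_accuracy(X @ w[:, None], y) for w in pool]
selected = np.argsort(scores)[::-1][:2]
W = np.stack([pool[i] for i in selected], axis=1)    # shape: dim x 2

print("1-NN accuracy in embedded space:", nn_accuracy(X @ W, y))
```

In the paper's formulation, AdaBoost would reweight the training samples between selection rounds so that later projections focus on samples the earlier ones misclassify; the greedy ranking above ignores that reweighting for brevity.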



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Petia Radeva (1)
  • Jordi Vitrià (1)
  1. Computer Vision Centre and Dept. Informàtica, Universitat Autònoma de Barcelona, Bellaterra (Barcelona), Spain
