Generalizing Dissimilarity Representations Using Feature Lines

  • Mauricio Orozco-Alzate
  • Robert P. W. Duin
  • César Germán Castellanos-Domínguez
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4756)

Abstract

A crucial issue in dissimilarity-based classification is the choice of the representation set. In the small sample case, classifiers capable of good generalization, together with the injection of extra information, make it possible to overcome the representational limitations. In this paper, we present a new approach for enriching dissimilarity representations. It is based on the concept of feature lines and consists of deriving a generalized version of the original dissimilarity representation by using feature lines as prototypes. We use a linear normal density-based classifier and the nearest neighbor rule, as well as two different methods for selecting prototypes: random choice and a length-based selection of the feature lines. An important observation is that just a few long feature lines are needed to obtain a significant improvement in performance over the other representation sets and classifiers. In general, the experiments show that this alternative representation is especially profitable for some correlated datasets.
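For illustration, the sketch below shows one way the core ingredients described above could be computed: the point-to-feature-line distance (Li and Lu's nearest feature line construction), a dissimilarity matrix whose columns correspond to feature lines rather than prototype points, and a length-based selection of lines. This is a minimal sketch under stated assumptions, not the authors' implementation (their experiments use PRTools); the function names and the explicit list of prototype index pairs are choices made only for this example.

```python
import numpy as np

def feature_line_distance(x, x1, x2):
    """Euclidean distance from point x to the feature line passing through
    the same-class prototypes x1 and x2 (extrapolation beyond the segment
    is allowed, as in the nearest feature line rule)."""
    d = x2 - x1
    denom = np.dot(d, d)
    if denom == 0.0:
        # Degenerate line (identical prototypes): fall back to point distance.
        return np.linalg.norm(x - x1)
    mu = np.dot(x - x1, d) / denom   # position of the projection on the line
    p = x1 + mu * d                  # projection of x onto the feature line
    return np.linalg.norm(x - p)

def feature_line_representation(X, prototypes, pairs):
    """Generalized dissimilarity representation: entry (i, j) is the distance
    from sample X[i] to the j-th feature line, defined by an index pair into
    the prototype set."""
    D = np.empty((len(X), len(pairs)))
    for j, (i1, i2) in enumerate(pairs):
        for i, x in enumerate(X):
            D[i, j] = feature_line_distance(x, prototypes[i1], prototypes[i2])
    return D

def longest_feature_lines(prototypes, pairs, k):
    """Length-based selection: keep only the k longest feature lines."""
    lengths = [np.linalg.norm(prototypes[i2] - prototypes[i1]) for i1, i2 in pairs]
    order = np.argsort(lengths)[::-1]
    return [pairs[i] for i in order[:k]]
```

The resulting matrix D can then be used like any dissimilarity representation, e.g. as input to a linear normal density-based classifier or a nearest neighbor rule built in the dissimilarity space.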

Keywords

Dissimilarity representation · Feature lines · Generalization


Copyright information

© Springer-Verlag Berlin Heidelberg 2007

Authors and Affiliations

  • Mauricio Orozco-Alzate (1, 2)
  • Robert P. W. Duin (1)
  • César Germán Castellanos-Domínguez (2)
  1. Information and Communication Theory Group, Faculty of Electrical Engineering, Mathematics and Computer Science, Delft University of Technology, P.O. Box 5031, 2600 GA Delft, The Netherlands
  2. Grupo de Control y Procesamiento Digital de Señales, Universidad Nacional de Colombia Sede Manizales, Carrera 27 # 64-60, Manizales (Caldas), Colombia