On Optimizing Dissimilarity-Based Classification Using Prototype Reduction Schemes

  • Sang-Woon Kim
  • B. John Oommen
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4141)


The aim of this paper is to present a strategy by which a new philosophy for pattern classification, namely that pertaining to Dissimilarity-Based Classifiers (DBCs), can be efficiently implemented. This methodology, proposed by Duin and his co-authors (see [3], [4], [5], [6], [8]), defines classifiers between the classes based not on the feature measurements of the individual patterns, but rather on a suitable dissimilarity measure between them. The problem with this strategy, however, is the need to compute, store and process the inter-pattern dissimilarities for all the training samples; thus, the accuracy of the classifier designed in the dissimilarity space depends on the methods used to achieve this. In this paper, we suggest a novel strategy to enhance the computation for all families of DBCs. Rather than compute, store and process the DBC based on the entire data set, we advocate that the training set first be reduced to a smaller representative subset. Also, rather than determine this subset by random selection, clustering, etc., we advocate the use of a Prototype Reduction Scheme (PRS), whose output yields the points to be utilized by the DBC. Apart from utilizing PRSs, we also propose simultaneously employing the Mahalanobis distance as the dissimilarity-measurement criterion to increase the DBC’s classification accuracy. Our experimental results demonstrate that the proposed mechanism increases the classification accuracy, when compared with “conventional” approaches, on both real-life and artificial data sets.
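The pipeline the abstract describes can be illustrated with a minimal, hypothetical sketch (this is not the authors' implementation): Hart's condensed nearest neighbor rule [11] stands in as the PRS, the Mahalanobis distance serves as the dissimilarity measure, each sample is then represented by its vector of dissimilarities to the retained prototypes, and a simple nearest-mean classifier is trained in that dissimilarity space. The two Gaussian blobs are illustrative toy data only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class data (assumption: synthetic blobs stand in for a real data set).
X = np.vstack([rng.normal([0, 0], 0.5, size=(40, 2)),
               rng.normal([2, 2], 0.5, size=(40, 2))])
y = np.array([0] * 40 + [1] * 40)

# Inverse covariance defines the Mahalanobis dissimilarity measure.
VI = np.linalg.inv(np.cov(X, rowvar=False))

def mahalanobis(A, B, VI):
    """Pairwise Mahalanobis distances between rows of A and rows of B."""
    D = np.empty((len(A), len(B)))
    for j, b in enumerate(B):
        d = A - b
        D[:, j] = np.sqrt(np.einsum('ij,jk,ik->i', d, VI, d))
    return D

def condense(X, y, VI):
    """Hart's CNN rule as a simple PRS: keep a subset of samples that
    classifies the whole training set consistently under the 1-NN rule."""
    keep = [0]
    changed = True
    while changed:
        changed = False
        for i in range(len(X)):
            if i in keep:
                continue
            nearest = keep[int(np.argmin(mahalanobis(X[i:i + 1], X[keep], VI)))]
            if y[nearest] != y[i]:   # misclassified -> must become a prototype
                keep.append(i)
                changed = True
    return np.array(keep)

keep = condense(X, y, VI)
P, yP = X[keep], y[keep]

# Dissimilarity representation: each sample becomes its distance vector
# to the (reduced) prototype set rather than its raw feature vector.
D_train = mahalanobis(X, P, VI)

# One conventional choice of classifier in the dissimilarity space:
# nearest class mean of the dissimilarity vectors.
means = np.array([D_train[y == c].mean(axis=0) for c in (0, 1)])

def predict(Xnew):
    D = mahalanobis(Xnew, P, VI)
    return np.argmin(np.linalg.norm(D[:, None, :] - means[None], axis=2), axis=1)

print("prototypes kept:", len(keep), "of", len(X))
print("training accuracy:", (predict(X) == y).mean())
```

The point of the sketch is the cost structure: the dissimilarity matrix shrinks from n × n to n × |P|, so any PRS that keeps |P| small while preserving the class structure makes every downstream DBC cheaper to build and evaluate.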


Keywords: Mahalanobis Distance, Near Neighbour, Dissimilarity Measure, Dissimilarity Matrix, Statistical Pattern Recognition



References

  1. Fukunaga, K.: Introduction to Statistical Pattern Recognition, 2nd edn. Academic Press, San Diego (1990)
  2. Jain, A.K., Duin, R.P.W., Mao, J.: Statistical pattern recognition: A review. IEEE Trans. Pattern Anal. and Machine Intell. PAMI-22(1), 4–37 (2000)
  3. Duin, R.P.W., de Ridder, D., Tax, D.M.J.: Experiments with a featureless approach to pattern recognition. Pattern Recognition Letters 18, 1159–1166 (1997)
  4. Duin, R.P.W., Pekalska, E., de Ridder, D.: Relational discriminant analysis. Pattern Recognition Letters 20, 1175–1181 (1999)
  5. Pekalska, E., Duin, R.P.W.: Dissimilarity representations allow for building good classifiers. Pattern Recognition Letters 23, 943–956 (2002)
  6. Pekalska, E.: Dissimilarity representations in pattern recognition. Concepts, theory and applications. Ph.D. thesis, Delft University of Technology, Delft, The Netherlands (2005)
  7. Horikawa, Y.: On properties of nearest neighbor classifiers for high-dimensional patterns in dissimilarity-based classification. IEICE Trans. Information & Systems J88-D-II(4), 813–817 (2005) (in Japanese)
  8. Pekalska, E., Duin, R.P.W., Paclik, P.: Prototype selection for dissimilarity-based classifiers. Pattern Recognition 39, 189–208 (2006)
  9. Bezdek, J.C., Kuncheva, L.I.: Nearest prototype classifier designs: An experimental study. International Journal of Intelligent Systems 16(12), 1445–1473 (2001)
  10. Dasarathy, B.V.: Nearest Neighbor (NN) Norms: NN Pattern Classification Techniques. IEEE Computer Society Press, Los Alamitos (1991)
  11. Hart, P.E.: The condensed nearest neighbor rule. IEEE Trans. Inform. Theory IT-14, 515–516 (1968)
  12. Fukunaga, K., Mantock, J.M.: Nonparametric data reduction. IEEE Trans. Pattern Anal. and Machine Intell. PAMI-6(1), 115–118 (1984)
  13. Chang, C.L.: Finding prototypes for nearest neighbor classifiers. IEEE Trans. Computers C-23(11), 1179–1184 (1974)
  14. Bezdek, J.C., Reichherzer, T.R., Lim, G.S., Attikiouzel, Y.: Multiple-prototype classifier design. IEEE Trans. Systems, Man, and Cybernetics - Part C SMC-28(1), 67–79 (1998)
  15. Burges, C.J.C.: A tutorial on support vector machines for pattern recognition. Data Mining and Knowledge Discovery 2(2), 121–167 (1998)
  16. Kohonen, T.: Self-Organizing Maps. Springer, Berlin (1995)
  17. Kim, S.-W., Oommen, B.J.: On Optimizing Dissimilarity-based Classification Using Prototype Reduction Schemes (unabridged version of this paper)
  18. Kim, S.-W., Oommen, B.J.: Enhancing prototype reduction schemes with LVQ3-type algorithms. Pattern Recognition 36(5), 1083–1093 (2003)
  19. Kim, S.-W., Oommen, B.J.: A brief taxonomy and ranking of creative prototype reduction schemes. Pattern Analysis and Applications 6(3), 232–244 (2003)
  20. Kim, S.-W., Oommen, B.J.: Enhancing prototype reduction schemes with recursion: A method applicable for “large” data sets. IEEE Trans. Systems, Man, and Cybernetics - Part B SMC-34(3), 1384–1397 (2004)
  21. Kim, S.-W., Oommen, B.J.: On using prototype reduction schemes to optimize kernel-based nonlinear subspace methods. Pattern Recognition 37(2), 227–239 (2004)
  22. Kim, S.-W., Oommen, B.J.: On using prototype reduction schemes and classifier fusion strategies to optimize kernel-based nonlinear subspace methods. IEEE Trans. Pattern Anal. and Machine Intell. 27(3), 455–460 (2005)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Sang-Woon Kim (1)
  • B. John Oommen (2)
  1. Dept. of Computer Science and Engineering, Myongji University, Yongin, Korea
  2. School of Computer Science, Carleton University, Ottawa, Canada
