
Reducing Training Sets by NCN-based Exploratory Procedures

  • M. Lozano
  • José S. Sánchez
  • Filiberto Pla
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2652)

Abstract

In this paper, a new approach to training set size reduction is presented. The scheme consists of defining a small number of prototypes that represent all the original instances. Although the ultimate aim of the proposed algorithm is to obtain a strongly reduced training set, its performance is empirically evaluated on nine real datasets by comparing not only the reduction rate but also the classification accuracy with those of other condensing techniques.
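
The "NCN-based" procedures referred to in the title build on the nearest centroid neighbourhood (NCN) of a point: its first nearest centroid neighbour is simply its nearest neighbour, and each further neighbour is the remaining instance whose inclusion brings the centroid of the selected set closest to the point. The sketch below illustrates only this NCN construction, not the authors' condensing algorithm itself; the function name, the NumPy dependency, and the Euclidean metric are assumptions made for the example.

```python
import numpy as np

def nearest_centroid_neighbours(p, X, k):
    """Return indices of the k nearest centroid neighbours (NCN) of p in X.

    Incremental construction: the first neighbour is the nearest neighbour
    of p; each further neighbour is the remaining point whose inclusion
    moves the centroid of the selected set closest to p.
    """
    p = np.asarray(p, dtype=float)
    X = np.asarray(X, dtype=float)
    remaining = list(range(len(X)))
    selected = []

    # The first NCN is simply the nearest neighbour of p.
    first = min(remaining, key=lambda i: np.linalg.norm(X[i] - p))
    selected.append(first)
    remaining.remove(first)

    while len(selected) < k and remaining:
        current_sum = X[selected].sum(axis=0)
        # Pick the candidate whose inclusion yields the centroid closest to p.
        best = min(
            remaining,
            key=lambda i: np.linalg.norm(
                (current_sum + X[i]) / (len(selected) + 1) - p
            ),
        )
        selected.append(best)
        remaining.remove(best)

    return selected

# Example: NCN of a query point in a toy 2-D training set.
if __name__ == "__main__":
    X = np.array([[0.0, 0.0], [1.0, 0.2], [0.9, 1.1], [-1.0, 0.1], [2.0, 2.0]])
    print(nearest_centroid_neighbours([0.5, 0.5], X, k=3))
```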

Keywords

Near Neighbour, Pattern Recognition Letter, Prototype Selection, Prototype Selection Method, Lower Reduction Percentage


Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • M. Lozano¹
  • José S. Sánchez¹
  • Filiberto Pla¹
  1. Dept. Lenguajes y Sistemas Informáticos, Universitat Jaume I, Castellón, Spain
