Neural Processing Letters, Volume 15, Issue 2, pp 147–156

Kernel Nearest-Neighbor Algorithm

  • Kai Yu
  • Liang Ji
  • Xuegong Zhang
Abstract

The 'kernel approach' has attracted great attention with the development of the support vector machine (SVM) and has been studied in a general way. It offers an alternative way to increase the computational power of linear learning machines by mapping the data into a high-dimensional feature space. In this paper, the kernel approach is extended to the well-known nearest-neighbor algorithm: the original distance metric is replaced by a kernel-induced distance in Hilbert space, and the resulting method is called the kernel nearest-neighbor algorithm. Three data sets were used for testing: an artificial data set, the BUPA liver disorders database, and the USPS database. The kernel nearest-neighbor algorithm was compared with the conventional nearest-neighbor algorithm and with SVM. Experiments show that the kernel nearest-neighbor algorithm is more powerful than the conventional nearest-neighbor algorithm and can compete with SVM.

Keywords: kernel, nearest-neighbor, nonlinear classification
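
Since the full text is not available here, the following is a minimal sketch of the idea the abstract describes: the Euclidean distance of the conventional nearest-neighbor rule is replaced by the distance induced by a kernel k in feature space, computed as ||phi(x) - phi(y)||^2 = k(x, x) - 2 k(x, y) + k(y, y). The polynomial kernel, the function names, and the toy data below are illustrative assumptions, not details taken from the paper.

    import numpy as np

    def polynomial_kernel(x, y, degree=2):
        # A common kernel choice (an illustrative assumption, not
        # necessarily the kernel used in the paper):
        # k(x, y) = (<x, y> + 1)**degree.
        return (np.dot(x, y) + 1.0) ** degree

    def kernel_distance_sq(x, y, kernel):
        # Squared distance between x and y in the kernel-induced
        # feature space:
        # ||phi(x) - phi(y)||^2 = k(x, x) - 2*k(x, y) + k(y, y).
        return kernel(x, x) - 2.0 * kernel(x, y) + kernel(y, y)

    def kernel_nn_classify(x, train_X, train_y, kernel=polynomial_kernel):
        # 1-NN rule: label x with the class of the training point
        # nearest to it under the kernel distance instead of the
        # Euclidean one.
        dists = [kernel_distance_sq(x, xi, kernel) for xi in train_X]
        return train_y[int(np.argmin(dists))]

    # Toy usage with made-up data.
    train_X = np.array([[0.0, 0.0], [0.1, -0.1], [1.0, 1.0], [-1.0, -1.0]])
    train_y = np.array([0, 0, 1, 1])
    print(kernel_nn_classify(np.array([0.9, 1.1]), train_X, train_y))  # -> 1

Note that with the linear kernel k(x, y) = <x, y>, this distance reduces to the ordinary Euclidean distance, so the conventional nearest-neighbor rule is recovered as a special case; only nonlinear kernels can change the decisions.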

References

  1. Duda, R. O. and Hart, P. E.: Pattern Classification and Scene Analysis, Wiley, New York, 1973.
  2. Hart, P. E.: The condensed nearest neighbor rule, IEEE Trans. Inf. Theory 14 (1968), 515–516.
  3. Wilson, D. L.: Asymptotic properties of nearest neighbor rules using edited data, IEEE Trans. Syst. Man Cybern. SMC-2 (1972), 408–421.
  4. Aizerman, M. A., Braverman, E. M. and Rozonoer, L. I.: Theoretical foundation of potential function method in pattern recognition learning, Automat. Remote Contr. 25 (1964), 821–837.
  5. Aizerman, M. A., Braverman, E. M. and Rozonoer, L. I.: The Robbins–Monro process and the method of potential functions, Automat. Remote Contr. 28 (1965), 1882–1885.
  6. Vapnik, V. N.: The Nature of Statistical Learning Theory, Springer-Verlag, New York, 1995.
  7. Schölkopf, B., Smola, A. and Müller, K. R.: Nonlinear component analysis as a kernel eigenvalue problem, Neural Comput. 10 (1998), 1299–1319.
  8. Courant, R. and Hilbert, D.: Methods of Mathematical Physics, Wiley, New York, 1953.
  9. Forsyth, R. S.: UCI Repository of machine learning databases, University of California, Department of Information and Computer Science, Irvine, CA, 1990.
  10. LeCun, Y. et al.: Backpropagation applied to handwritten zip code recognition, Neural Comput. 1 (1989), 541–551.
  11. Collobert, R. and Bengio, S.: Support Vector Machines for Large-Scale Regression Problems, IDIAP-RR-00-17, 2000.
  12. Schölkopf, B., Burges, C. and Vapnik, V.: Extracting support data for a given task, In: U. M. Fayyad and R. Uthurusamy (eds), Proc. 1st International Conference on Knowledge Discovery & Data Mining, AAAI Press, Menlo Park, 1995.

Copyright information

© Kluwer Academic Publishers 2002

Authors and Affiliations

  • Kai Yu (1)
  • Liang Ji (2)
  • Xuegong Zhang (1)

  1. State Key Laboratory of Intelligent Technology and Systems, Institute of Information Processing, Department of Automation, Tsinghua University, Beijing, P.R. China
  2. Tsinghua University, Beijing, P.R. China