
The Support Vector method

  • Part II: Cortical Maps and Receptive Fields
  • Conference paper
Artificial Neural Networks — ICANN'97 (ICANN 1997)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1327)

Abstract

The Support Vector (SV) method is a new general method of function estimation that does not depend explicitly on the dimensionality of the input space. It has been applied to pattern recognition, regression estimation, and density estimation problems, as well as to problems of solving linear operator equations. In this article we describe the general idea of the SV method and present theorems demonstrating that its generalization ability rests on factors which classical statistics does not take into account. We also describe the SV method for density estimation over a set of functions defined by a mixture of an infinite number of Gaussians.
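The margin-based estimation the abstract describes can be illustrated with a minimal sketch: a linear Support Vector classifier trained by full-batch subgradient descent on the regularized hinge loss. This is an illustrative simplification on made-up toy data, not the paper's own development, which proceeds through a dual quadratic-programming formulation and kernels rather than gradient descent.

```python
# Sketch of a linear Support Vector classifier (illustrative only;
# the paper derives the SV method via a dual QP, not gradient descent).
# Minimizes  lam/2 * |w|^2  +  mean hinge loss  by full-batch
# subgradient descent.  The toy data below are made up.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def train_svm(X, y, lam=0.01, lr=0.1, epochs=2000):
    """Return (w, b) for a linear SVM; labels y must be in {-1, +1}."""
    n, d = len(X), len(X[0])
    w, b = [0.0] * d, 0.0
    for _ in range(epochs):
        gw, gb = [lam * wj for wj in w], 0.0    # gradient of the regularizer
        for xi, yi in zip(X, y):
            if yi * (dot(w, xi) + b) < 1:       # point violates the margin
                for j in range(d):
                    gw[j] -= yi * xi[j] / n     # hinge-loss subgradient
                gb -= yi / n
        w = [wj - lr * gj for wj, gj in zip(w, gw)]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1 if dot(w, x) + b >= 0 else -1

# Linearly separable toy problem: class +1 lies above the line x1 + x2 = 3.
X = [[0, 1], [1, 0], [1, 1], [3, 3], [2, 3], [3, 2]]
y = [-1, -1, -1, 1, 1, 1]
w, b = train_svm(X, y)
preds = [predict(w, b, x) for x in X]
```

With these settings the learned boundary approaches the maximum-margin separator, and the training points nearest that boundary play the role the paper assigns to support vectors: they alone determine the solution.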




Editor information

Wulfram Gerstner, Alain Germond, Martin Hasler, Jean-Daniel Nicoud


Copyright information

© 1997 Springer-Verlag Berlin Heidelberg

Cite this paper

Vapnik, V.N. (1997). The Support Vector method. In: Gerstner, W., Germond, A., Hasler, M., Nicoud, J.-D. (eds) Artificial Neural Networks — ICANN'97. ICANN 1997. Lecture Notes in Computer Science, vol 1327. Springer, Berlin, Heidelberg. https://doi.org/10.1007/BFb0020166

  • DOI: https://doi.org/10.1007/BFb0020166

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-63631-1

  • Online ISBN: 978-3-540-69620-9
