Incorporating invariances in support vector learning machines

  • Bernhard Schölkopf
  • Chris Burges
  • Vladimir Vapnik
Oral Presentations: Theory II: Learning
Part of the Lecture Notes in Computer Science book series (LNCS, volume 1112)


Developed only recently, support vector learning machines achieve high generalization ability by minimizing a bound on the expected test error; however, until now there has been no way of incorporating knowledge about invariances of the classification problem at hand. We present a method for incorporating prior knowledge about transformation invariances by applying the transformations to the support vectors, the training examples most critical for determining the classification boundary.
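The abstract's idea can be sketched in a few lines: train a classifier, take its support vectors, apply small invariance transformations (e.g. one-pixel image translations) to generate "virtual" examples, and retrain on the enlarged set. The sketch below shows only the transformation step on a toy image; the SVM training itself is omitted, and the helper names (`shift`, `virtual_examples`) are illustrative, not taken from the paper.

```python
# Sketch of the virtual-support-vector idea: generate translated copies
# of the training examples closest to the decision boundary, keeping
# their labels, so the retrained classifier becomes (approximately)
# translation invariant.  Pure-Python; helper names are hypothetical.

def shift(image, dx, dy, fill=0.0):
    """Translate a 2-D image (list of rows) by (dx, dy), padding with `fill`."""
    h, w = len(image), len(image[0])
    out = [[fill] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y - dy, x - dx
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = image[sy][sx]
    return out

def virtual_examples(support_vectors, labels):
    """One-pixel translates of each support vector, each keeping its label."""
    virtual = []
    for img, lab in zip(support_vectors, labels):
        for dx, dy in [(1, 0), (-1, 0), (0, 1), (0, -1)]:
            virtual.append((shift(img, dx, dy), lab))
    return virtual

# Toy 3x3 "image" with a single bright pixel in the centre:
img = [[0, 0, 0],
       [0, 1, 0],
       [0, 0, 0]]
vs = virtual_examples([img], [+1])  # four translated copies, all labelled +1
```

In the paper's setting the inputs to `virtual_examples` would be the support vectors of an already-trained SVM rather than the full training set, which keeps the enlarged problem small since support vectors are typically a small fraction of the data.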





Copyright information

© Springer-Verlag Berlin Heidelberg 1996

Authors and Affiliations

  • Bernhard Schölkopf (1, 2)
  • Chris Burges (2)
  • Vladimir Vapnik (2)
  1. Max-Planck-Institut für biologische Kybernetik, Tübingen, Germany
  2. AT&T Bell Laboratories, Holmdel, USA
