Encyclopedia of Machine Learning and Data Mining

2017 Edition
| Editors: Claude Sammut, Geoffrey I. Webb

VC Dimension

  • Thomas Zeugmann
Reference work entry
DOI: https://doi.org/10.1007/978-1-4899-7687-1_881

Motivation and Background

We define an important combinatorial parameter that measures the complexity of a family of subsets taken from a given universe (learning domain) X. This parameter was originally defined by Vapnik and Chervonenkis (1971) and is thus commonly referred to as the Vapnik-Chervonenkis dimension, abbreviated as VC dimension. Subsequently, Dudley (1978, 1979) generalized Vapnik and Chervonenkis' (1971) results. The reader is also referred to Vapnik's (2000) book, in which he greatly extends the original ideas; this results in a theory called structural risk minimization.
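To make the parameter concrete: a set of points is said to be shattered by a family of concepts if every subset of the points arises as the intersection of the point set with some concept, and the VC dimension is the size of the largest shattered set. The following Python sketch checks this definition by brute force for a small finite class; the toy class of discrete intervals is a hypothetical example, not taken from this entry, and the approach is feasible only for small finite domains.

```python
from itertools import combinations

def shatters(concepts, points):
    """True iff the family `concepts` shatters `points`: every subset of
    `points` is realized as points ∩ C for some concept C."""
    pts = frozenset(points)
    traces = {pts & frozenset(c) for c in concepts}
    return len(traces) == 2 ** len(pts)

def vc_dimension(concepts, domain):
    """Largest d such that some d-element subset of `domain` is shattered
    (exhaustive search over subsets of the finite domain)."""
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(concepts, s) for s in combinations(domain, k)):
            d = k
        else:
            break  # every subset of a shattered set is shattered, so stop
    return d

# Toy class: discrete intervals {a, ..., b-1} over the domain {0, 1, 2, 3}.
domain = [0, 1, 2, 3]
concepts = [set(range(a, b)) for a in range(5) for b in range(a, 5)]
print(vc_dimension(concepts, domain))  # 2
```

Any two points are shattered by intervals, but no three points x < y < z are: no interval contains x and z while excluding y, so the VC dimension of this class is 2.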

The importance of the VC dimension for PAC learning was discovered by Blumer et al. (1989), who introduced the notion to computational learning theory.

As Anthony and Biggs (1992, p. 71) have put it, “The development of this notion is probably the most significant contribution that mathematics has made to Computational Learning Theory.”

Recall that we use |S| and (S)...


Recommended Reading

  1. Anthony M, Bartlett PL (1999) Neural network learning: theoretical foundations. Cambridge University Press, Cambridge
  2. Anthony M, Biggs N (1992) Computational learning theory. Cambridge tracts in theoretical computer science, vol 30. Cambridge University Press, Cambridge
  3. Arora S, Barak B (2009) Computational complexity: a modern approach. Cambridge University Press, Cambridge
  4. Blumer A, Ehrenfeucht A, Haussler D, Warmuth MK (1989) Learnability and the Vapnik-Chervonenkis dimension. J ACM 36(4):929–965
  5. Dudley RM (1978) Central limit theorems for empirical measures. Ann Probab 6(6):899–929
  6. Dudley RM (1979) Corrections to “Central limit theorems for empirical measures”. Ann Probab 7(5):909–911
  7. Goldberg PW, Jerrum MR (1995) Bounding the Vapnik-Chervonenkis dimension of concept classes parameterized by real numbers. Mach Learn 18(2–3):131–148
  8. Gurvits L (1997) Linear algebraic proofs of VC-dimension based inequalities. In: Ben-David S (ed) Proceedings of the third European conference on computational learning theory, EuroCOLT ’97, Jerusalem, Israel, March 1997. Lecture notes in artificial intelligence, vol 1208. Springer, pp 238–250
  9. Haussler D, Littlestone N, Warmuth MK (1994) Predicting {0, 1} functions on randomly drawn points. Inf Comput 115(2):248–292
  10. Haussler D, Welzl E (1987) Epsilon-nets and simplex range queries. Discret Comput Geom 2:127–151
  11. Karpinski M, Macintyre A (1995) Polynomial bounds for VC dimension of sigmoidal neural networks. In: Proceedings of the 27th annual ACM symposium on theory of computing. ACM Press, New York, pp 200–208
  12. Karpinski M, Werther T (1994) VC dimension and sampling complexity of learning sparse polynomials and rational functions. In: Hanson SJ, Drastal GA, Rivest RL (eds) Computational learning theory and natural learning systems: constraints and prospects, vol I, chap 11. MIT Press, pp 331–354
  13. Kearns MJ, Vazirani UV (1994) An introduction to computational learning theory. MIT Press, Cambridge, MA
  14. Linial N, Mansour Y, Rivest RL (1991) Results on learnability and the Vapnik-Chervonenkis dimension. Inf Comput 90(1):33–49
  15. Littlestone N (1988) Learning quickly when irrelevant attributes abound: a new linear-threshold algorithm. Mach Learn 2(4):285–318
  16. Maass W, Turán G (1990) On the complexity of learning from counterexamples and membership queries. In: Proceedings of the 31st annual symposium on foundations of computer science (FOCS 1990), St. Louis, 22–24 October 1990. IEEE Computer Society Press, Los Alamitos, pp 203–210
  17. Mitchell A, Scheffer T, Sharma A, Stephan F (1999) The VC-dimension of subclasses of pattern languages. In: Watanabe O, Yokomori T (eds) Proceedings of the 10th international conference on algorithmic learning theory, ALT ’99, Tokyo, Dec 1999. Lecture notes in artificial intelligence, vol 1720. Springer, pp 93–105
  18. Natschläger T, Schmitt M (1996) Exact VC-dimension of Boolean monomials. Inf Process Lett 59(1):19–20
  19. Papadimitriou CH, Yannakakis M (1996) On limited nondeterminism and the complexity of the V-C dimension. J Comput Syst Sci 53(2):161–170
  20. Sakurai A (1995) On the VC-dimension of depth four threshold circuits and the complexity of Boolean-valued functions. Theoret Comput Sci 137(1):109–127
  21. Sauer N (1972) On the density of families of sets. J Comb Theory (A) 13(1):145–147
  22. Schaefer M (1999) Deciding the Vapnik-Červonenkis dimension is \(\Sigma _{3}^{p}\)-complete. J Comput Syst Sci 58(1):177–182
  23. Shinohara A (1995) Complexity of computing Vapnik-Chervonenkis dimension and some generalized dimensions. Theoret Comput Sci 137(1):129–144
  24. Vapnik VN (2000) The nature of statistical learning theory, 2nd edn. Springer, Berlin
  25. Vapnik VN, Chervonenkis AY (1971) On the uniform convergence of relative frequencies of events to their probabilities. Theory Probab Appl 16(2):264–280
  26. Vapnik VN, Chervonenkis AY (1974) Theory of pattern recognition. Nauka, Moscow (in Russian)
  27. Wenocur RS, Dudley RM (1981) Some special Vapnik-Chervonenkis classes. Discret Math 33:313–318

Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  1. Hokkaido University, Sapporo, Japan