Motivation and Background
We define an important combinatorial parameter that measures the complexity of a family of subsets taken from a given universe (learning domain) X. This parameter was originally defined by Vapnik and Chervonenkis (1971) and is thus commonly referred to as the Vapnik–Chervonenkis dimension, abbreviated VC dimension. Subsequently, Dudley (1978, 1979) generalized the results of Vapnik and Chervonenkis (1971). The reader is also referred to Vapnik's (2000) book, in which he greatly extends the original ideas, resulting in the theory known as structural risk minimization.
As Anthony and Biggs (1992, p. 71) have put it, “The development of this notion is probably the most significant contribution that mathematics has made to Computational Learning Theory.”
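The parameter can be made concrete through the standard notion of shattering: a class of subsets (equivalently, binary classifiers) shatters a finite point set if it realizes every possible labeling of those points, and the VC dimension is the size of the largest shattered set. The following brute-force sketch in Python illustrates this on a small finite domain; the particular classes (one-sided thresholds and closed intervals on the line) and the domain are illustrative assumptions, not part of the original text:

```python
from itertools import combinations

def shatters(hypotheses, points):
    # The class shatters `points` iff every one of the 2^|points|
    # binary labelings is realized by some hypothesis.
    labelings = {tuple(h(x) for x in points) for h in hypotheses}
    return len(labelings) == 2 ** len(points)

def vc_dimension(hypotheses, domain):
    # Largest d such that some d-element subset of the domain is
    # shattered (brute force; feasible only for tiny domains).
    d = 0
    for k in range(1, len(domain) + 1):
        if any(shatters(hypotheses, s) for s in combinations(domain, k)):
            d = k
    return d

domain = [1, 2, 3, 4, 5]

# One-sided thresholds h_t(x) = [x >= t]: VC dimension 1, since the
# labeling (1, 0) of a pair x1 < x2 is never realized.
thresholds = [lambda x, t=t: x >= t for t in [0, 1.5, 2.5, 3.5, 4.5, 6]]

# Closed intervals h_{a,b}(x) = [a <= x <= b] (a > b gives the empty
# set): VC dimension 2, since (1, 0, 1) on a triple is never realized.
intervals = [lambda x, a=a, b=b: a <= x <= b
             for a in domain for b in domain]

print(vc_dimension(thresholds, domain))  # 1
print(vc_dimension(intervals, domain))   # 2
```

This exhaustive search is purely didactic: for the infinite classes studied in the literature cited below, VC dimensions are established by combinatorial arguments rather than enumeration.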
Recall that we use...
- Anthony, M., & Biggs, N. (1992). Computational learning theory. Cambridge tracts in theoretical computer science (No. 30). Cambridge: Cambridge University Press.
- Gurvits, L. (1997). Linear algebraic proofs of VC-dimension based inequalities. In S. Ben-David (Ed.), Computational learning theory, third European conference, EuroCOLT '97, Jerusalem, Israel, March 1997, Proceedings, Lecture notes in artificial intelligence (Vol. 1208, pp. 238–250). Springer.
- Karpinski, M., & Werther, T. (1994). VC dimension and sampling complexity of learning sparse polynomials and rational functions. In S. J. Hanson, G. A. Drastal, & R. L. Rivest (Eds.), Computational learning theory and natural learning systems, Vol. I: Constraints and prospects (Chap. 11, pp. 331–354). Cambridge, MA: MIT Press.
- Kearns, M. J., & Vazirani, U. V. (1994). An introduction to computational learning theory. Cambridge, MA: MIT Press.
- Littlestone, N. (1988). Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning, 2(4), 285–318.
- Maass, W., & Turan, G. (1990). On the complexity of learning from counterexamples and membership queries. In Proceedings of the thirty-first annual symposium on Foundations of Computer Science (FOCS 1990), St. Louis, Missouri, October 22–24, 1990 (pp. 203–210). Los Alamitos, CA: IEEE Computer Society Press.
- Mitchell, A., Scheffer, T., Sharma, A., & Stephan, F. (1999). The VC-dimension of subclasses of pattern languages. In O. Watanabe & T. Yokomori (Eds.), Algorithmic learning theory, tenth international conference, ALT'99, Tokyo, Japan, December 1999, Proceedings, Lecture notes in artificial intelligence (Vol. 1720, pp. 93–105). Springer.
- Sakurai, A. (1995). On the VC-dimension of depth four threshold circuits and the complexity of Boolean-valued functions. Theoretical Computer Science, 137(1), 109–127. Special issue for ALT '93.
- Vapnik, V. N., & Chervonenkis, A. Y. (1974). Theory of pattern recognition. Moscow: Nauka (in Russian).