Complexity of computing Vapnik-Chervonenkis dimension

  • Ayumi Shinohara
Selected Papers: Approximate Learning
Part of the Lecture Notes in Computer Science book series (LNCS, volume 744)

Abstract

The Vapnik-Chervonenkis (VC) dimension is known to be the crucial measure of polynomial-sample learnability in the PAC-learning model. This paper investigates the complexity of computing the VC-dimension of a concept class over a finite learning domain. We consider a decision problem called the discrete VC-dimension problem: given a matrix representing a concept class F and an integer K, determine whether the VC-dimension of F is greater than K. We prove that (1) the discrete VC-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log² J) variables, and (2) for every constant C, the satisfiability problem in conjunctive normal form with m clauses and C log² m variables is polynomial-time reducible to the discrete VC-dimension problem. These results can be interpreted, in some sense, as saying that the problem is "complete" for the class of sets computable in n^{O(log n)} time.
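To make the decision problem concrete, the following brute-force sketch (not from the paper; the function names and the example matrix F are illustrative) checks whether the VC-dimension of a concept class, given as the rows of a 0/1 matrix, exceeds K by searching for a shattered set of K+1 domain points (columns):

```python
from itertools import combinations

def is_shattered(concepts, cols):
    """Return True if the column set `cols` is shattered: every 0/1
    pattern on those columns is realized by some row (concept)."""
    patterns = {tuple(row[c] for c in cols) for row in concepts}
    return len(patterns) == 2 ** len(cols)

def vc_dimension_exceeds(concepts, k):
    """Discrete VC-dimension problem (illustrative brute force):
    does the VC-dimension of the concept class exceed k?"""
    n_cols = len(concepts[0])
    return any(is_shattered(concepts, cols)
               for cols in combinations(range(n_cols), k + 1))

# Hypothetical example: four concepts over a three-point domain.
F = [
    [0, 0, 0],
    [0, 1, 0],
    [1, 0, 0],
    [1, 1, 0],
]
print(vc_dimension_exceeds(F, 1))  # True: columns {0, 1} are shattered
print(vc_dimension_exceeds(F, 2))  # False: no 3-point set is shattered
```

Since a class with VC-dimension d contains at least 2^d concepts, d never exceeds log₂ of the number of rows, so this brute force runs in roughly n^{O(log n)} time in the size of the matrix, which is consistent with the quasi-polynomial class referred to in the abstract.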


Copyright information

© Springer-Verlag Berlin Heidelberg 1993

Authors and Affiliations

  • Ayumi Shinohara
    1. Research Institute of Fundamental Information Science, Kyushu University 33, Fukuoka, Japan
