Complexity of computing generalized VC-dimensions

  • Ayumi Shinohara
Extended Abstracts
Part of the Lecture Notes in Computer Science book series (LNCS, volume 784)

Abstract

In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays the key role in estimating the polynomial-sample learnability of a class of binary functions. For a class of {0, ..., m}-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the VC*-dimension, the Ψ*-dimension, and the ΨG-dimension. For each dimension, we consider the decision problem of determining, for a given matrix representing a class F of functions and an integer K, whether the dimension of F is greater than K. We prove that the VC*-dimension problem is polynomial-time reducible to the satisfiability problem of length J with O(log² J) variables, which includes the original VC-dimension problem as a special case. We also show that the ΨG-dimension problem is still reducible to the satisfiability problem of length J with O(log² J) variables, while the Ψ*-dimension problem becomes NP-complete.
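The decision problems above take a class F presented explicitly as a matrix whose rows are functions and whose columns are domain points. As an illustrative sketch only (not code from the paper; the function name `vc_dimension` and the sample class are ours), the original binary VC-dimension can be computed by brute force: find the largest set of columns on which the rows realize all possible 0/1 patterns.

```python
from itertools import combinations

def vc_dimension(F):
    """Brute-force VC-dimension of a binary class F, given as a list of
    equal-length 0/1 tuples (rows = functions, columns = domain points).
    A column set S is shattered when the rows restricted to S realize
    all 2^|S| patterns; the VC-dimension is the largest such |S|."""
    n = len(F[0])
    dim = 0
    for k in range(1, n + 1):
        shattered = any(
            len({tuple(f[i] for i in S) for f in F}) == 2 ** k
            for S in combinations(range(n), k)
        )
        if not shattered:
            break
        dim = k
    return dim

# The class of "at most one 1" vectors on three points shatters every
# single point but no pair of points, so its VC-dimension is 1.
F = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
print(vc_dimension(F))  # -> 1
```

The decision version studied in the paper then amounts to testing `vc_dimension(F) > K`. This naive search is exponential in the number of columns, which is exactly why the reductions to satisfiability with O(log² J) variables (and the NP-completeness of the Ψ*-dimension problem) are of interest.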

References

  1. N. Alon. On the density of sets of vectors. Discrete Mathematics, 46:199–202, 1983.
  2. S. Ben-David, N. Cesa-Bianchi, and P. M. Long. Characterizations of learnability for classes of {0, ..., n}-valued functions. In Proceedings of the 5th Annual Workshop on Computational Learning Theory, pages 333–340, 1992.
  3. A. Blumer, A. Ehrenfeucht, D. Haussler, and M. Warmuth. Learnability and the Vapnik-Chervonenkis dimension. JACM, 36(4):929–965, 1989.
  4. D. Haussler. Decision theoretic generalizations of the PAC model for neural net and other learning applications. Information and Computation, 100:78–150, 1992.
  5. N. Linial, Y. Mansour, and R. L. Rivest. Results on learnability and the Vapnik-Chervonenkis dimension. Information and Computation, 90:33–49, 1991.
  6. N. Megiddo and U. Vishkin. On finding a minimum dominating set in a tournament. Theoretical Computer Science, 61:307–316, 1988.
  7. B. Natarajan. On learning sets and functions. Machine Learning, 4(1):67–97, 1989.
  8. B. Natarajan. Machine Learning: A Theoretical Approach. Morgan Kaufmann Publishers, 1991.
  9. S. Nienhuys-Cheng and M. Polman. Complexity dimensions and learnability. In Proc. European Conference on Machine Learning (Lecture Notes in Artificial Intelligence 667), pages 348–353, 1993.
  10. C. H. Papadimitriou and M. Yannakakis. On limited nondeterminism and the complexity of the V-C dimension. In Proc. 8th Annual Conference on Structure in Complexity Theory, pages 12–18, 1993.
  11. A. Shinohara. Complexity of computing Vapnik-Chervonenkis dimension. In Proc. 4th Workshop on Algorithmic Learning Theory, pages 279–287, 1993.
  12. A. Shinohara. Complexity of computing generalized VC-dimensions. RIFIS Technical Report, RIFIS-TR-CS 78, Research Institute of Fundamental Information Science, Kyushu University, 1993.
  13. L. Valiant. A theory of the learnable. CACM, 27(11):1134–1142, 1984.

Copyright information

© Springer-Verlag Berlin Heidelberg 1994

Authors and Affiliations

  • Ayumi Shinohara
  1. Research Institute of Fundamental Information Science, Kyushu University 33, Fukuoka, Japan