Queries Revisited

  • Dana Angluin
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2225)

Abstract

We begin with a brief tutorial on the problem of learning a finite concept class over a finite domain using membership queries and/or equivalence queries. We then sketch general results on the number of queries needed to learn a class of concepts, focusing on the various notions of combinatorial dimension that have been employed, including the teaching dimension, the exclusion dimension, the extended teaching dimension, the fingerprint dimension, the sample exclusion dimension, the Vapnik-Chervonenkis dimension, the abstract identification dimension, and the general dimension.
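The setting sketched in the abstract, exact learning of a finite concept class over a finite domain with queries, and the combinatorial dimensions that bound query complexity, can be made concrete with a small sketch. The toy nested concept class, the function names, and the brute-force dimension computations below are illustrative assumptions and not material from the paper; the halving (majority-vote) learner with equivalence queries and the teaching and Vapnik-Chervonenkis dimensions are the standard notions treated in the references.

```python
from itertools import combinations

# Hypothetical toy example: a finite domain and a finite (nested) concept
# class, each concept represented as the frozenset of its positive points.
DOMAIN = [0, 1, 2, 3]
CONCEPTS = [frozenset(s) for s in ([], [0], [0, 1], [0, 1, 2], [0, 1, 2, 3])]

def halving_learner(equivalence_query):
    """Exact learning with (improper) equivalence queries: propose the
    majority-vote hypothesis over the current version space.  Every
    counterexample eliminates at least half of the remaining concepts,
    so at most log2(|CONCEPTS|) queries are used."""
    version_space = set(CONCEPTS)
    while True:
        majority = frozenset(
            x for x in DOMAIN
            if 2 * sum(x in c for c in version_space) > len(version_space))
        counterexample = equivalence_query(majority)
        if counterexample is None:        # hypothesis equals the target
            return majority
        # Keep only concepts that disagree with the rejected hypothesis
        # on the counterexample (the target is among them).
        version_space = {c for c in version_space
                         if (counterexample in c) != (counterexample in majority)}

def teaching_dimension(concepts, domain):
    """Brute force: for each concept, the smallest sample that rules out
    every other concept in the class; the dimension is the maximum."""
    def smallest_teaching_set(c):
        for k in range(len(domain) + 1):
            for sample in combinations(domain, k):
                if all(any((x in c) != (x in c2) for x in sample)
                       for c2 in concepts if c2 != c):
                    return k
        return len(domain)
    return max(smallest_teaching_set(c) for c in concepts)

def vc_dimension(concepts, domain):
    """Brute force: size of the largest domain subset shattered by the class."""
    def shattered(subset):
        patterns = {frozenset(x for x in subset if x in c) for c in concepts}
        return len(patterns) == 2 ** len(subset)
    return max((k for k in range(len(domain) + 1)
                if any(shattered(s) for s in combinations(domain, k))),
               default=0)

if __name__ == "__main__":
    target = CONCEPTS[2]
    def eq(h):  # equivalence oracle for the chosen target
        diff = [x for x in DOMAIN if (x in h) != (x in target)]
        return diff[0] if diff else None
    print("learned:", sorted(halving_learner(eq)))
    print("teaching dimension:", teaching_dimension(CONCEPTS, DOMAIN))
    print("VC dimension:", vc_dimension(CONCEPTS, DOMAIN))
```

On this nested (chain) class the learner identifies the target with a single equivalence query, the teaching dimension is 2 (an interior concept needs both boundary examples), and the VC dimension is 1 (no two points of a chain can be shattered); for larger classes the same brute-force routines remain correct but scale exponentially, which is why the dimensions listed in the abstract matter as analytical tools.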

References

  1. D. Angluin. Queries and concept learning. Machine Learning, 2:319–342, 1988.
  2. D. Angluin. Negative results for equivalence queries. Machine Learning, 5:121–150, 1990.
  3. E. M. Arkin, H. Meijer, J. S. B. Mitchell, D. Rappaport, and S. S. Skiena. Decision trees for geometric models. In Proceedings of the Ninth Annual Symposium on Computational Geometry, pages 369–378, San Diego, CA, 1993. ACM Press.
  4. J. L. Balcázar, J. Castro, and D. Guijarro. Abstract combinatorial characterizations of exact learning via queries. In Proceedings of the 13th Annual Conference on Computational Learning Theory, pages 248–254. Morgan Kaufmann, San Francisco, 2000.
  5. J. L. Balcázar, J. Castro, and D. Guijarro. A general dimension for exact learning. In Proceedings of the 14th Annual Conference on Computational Learning Theory, 2001.
  6. J. L. Balcázar, J. Castro, D. Guijarro, and H.-U. Simon. The consistency dimension and distribution-dependent learning from queries. In Proceedings of the 10th International Conference on Algorithmic Learning Theory (ALT '99), volume 1720 of LNAI, pages 77–92. Springer-Verlag, 1999.
  7. A. Blumer, A. Ehrenfeucht, D. Haussler, and M. K. Warmuth. Learnability and the Vapnik-Chervonenkis dimension. J. ACM, 36:929–965, 1989.
  8. A. Ehrenfeucht, D. Haussler, M. Kearns, and L. Valiant. A general lower bound on the number of examples needed for learning. Inform. Comput., 82:247–261, 1989.
  9. R. Gavaldà. On the power of equivalence queries. In EUROCOLT: European Conference on Computational Learning Theory, pages 193–203. Clarendon Press, 1993.
  10. S. A. Goldman and M. J. Kearns. On the complexity of teaching. J. of Comput. Syst. Sci., 50:20–31, 1995.
  11. Y. Hayashi, S. Matsumoto, A. Shinohara, and M. Takeda. Uniform characterizations of polynomial-query learnabilities. In Proceedings of the 1st International Conference on Discovery Science (DS-98), volume 1532 of LNAI, pages 84–92, 1998.
  12. T. Hegedüs. Generalized teaching dimensions and the query complexity of learning. In Proceedings of the 8th Annual Conference on Computational Learning Theory, pages 108–117. ACM Press, New York, NY, 1995.
  13. L. Hellerstein, K. Pillaipakkamnatt, V. Raghavan, and D. Wilkins. How many queries are needed to learn? In Proceedings of the Twenty-Seventh Annual ACM Symposium on the Theory of Computing, pages 190–199, 1995.
  14. L. Hyafil and R. L. Rivest. Constructing optimal binary decision trees is NP-complete. Information Processing Letters, 5:15–17, 1976.
  15. N. Littlestone. Learning quickly when irrelevant attributes abound: A new linear-threshold algorithm. Machine Learning, 2:285–318, 1988.
  16. W. Maass and G. Turán. Lower bound methods and separation results for on-line learning models. Machine Learning, 9:107–145, 1992.
  17. M. Moshkov. Test theory and problems of machine learning. In Proceedings of the International School-Seminar on Discrete Mathematics and Mathematical Cybernetics, pages 6–10. MAX Press, Moscow, 2001.
  18. A. Shinohara and S. Miyano. Teachability in computational learning. New Generation Computing, 8(4):337–348, 1991.
  19. L. G. Valiant. A theory of the learnable. Commun. ACM, 27:1134–1142, 1984.

Copyright information

© Springer-Verlag Berlin Heidelberg 2001

Authors and Affiliations

  • Dana Angluin
  1. Computer Science Department, Yale University, New Haven
