Abstract
This paper describes the probably approximately correct (PAC) model of concept learning, paying special attention to the case where instances are points in Euclidean n-space. The problem of learning from noisy training data is also studied.
Supported by NSF grant CCR-9108753. Email: sloan@eecs.uic.edu
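To make the Euclidean setting concrete, the following sketch shows the textbook PAC example of learning an axis-aligned rectangle in the plane by outputting the tightest rectangle around the positive sample points. This is a standard illustration of the model, not an algorithm from this chapter; the target rectangle, sample size, and sampling region below are arbitrary choices for the demonstration.

```python
import random

def tightest_rectangle(sample):
    """Return the smallest axis-aligned rectangle containing the
    positively labeled points, as ((x_lo, x_hi), (y_lo, y_hi))."""
    pos = [p for p, label in sample if label]
    if not pos:
        return None  # no positives seen: hypothesis rejects everything
    xs = [x for x, _ in pos]
    ys = [y for _, y in pos]
    return ((min(xs), max(xs)), (min(ys), max(ys)))

def contains(rect, point):
    """Membership test for an axis-aligned rectangle (None = empty)."""
    if rect is None:
        return False
    (x_lo, x_hi), (y_lo, y_hi) = rect
    x, y = point
    return x_lo <= x <= x_hi and y_lo <= y <= y_hi

# Target concept: the unit square, with instances drawn from [-1, 2]^2.
target = ((0.0, 1.0), (0.0, 1.0))
rng = random.Random(0)
sample = [((rng.uniform(-1, 2), rng.uniform(-1, 2)))
          for _ in range(500)]
sample = [(p, contains(target, p)) for p in sample]

h = tightest_rectangle(sample)
# By construction h lies inside the target, so h classifies every
# sample point correctly; its true error is confined to the thin
# strips of the target that the random sample happened to miss.
```

With high probability over the sample, each of the four strips between `h` and the target has small measure, which is the usual route to the PAC guarantee for this class.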
© 1996 Birkhäuser Boston
Sloan, R.H. (1996). Pac Learning, Noise, and Geometry. In: Kueker, D.W., Smith, C.H. (eds) Learning and Geometry: Computational Approaches. Progress in Computer Science and Applied Logic, vol 14. Birkhäuser Boston. https://doi.org/10.1007/978-1-4612-4088-4_2
Publisher Name: Birkhäuser Boston
Print ISBN: 978-1-4612-8646-2
Online ISBN: 978-1-4612-4088-4