PAC Learning, Noise, and Geometry

Part of the book series: Progress in Computer Science and Applied Logic (PCS, volume 14)

Abstract

This paper describes the probably approximately correct (PAC) model of concept learning, paying special attention to the case where instances are points in Euclidean n-space. The problem of learning from noisy training data is also studied.
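
As a quick illustration of the PAC model applied to a geometric concept class, consider the textbook learner for axis-aligned rectangles in the plane: output the tightest-fitting rectangle around the positive examples. With enough random examples, that hypothesis has small error with high probability. The sketch below is illustrative only, not taken from the chapter; the target rectangle and the uniform distribution on the unit square are hypothetical choices made for the demonstration.

    import random

    def tightest_fit_rectangle(sample):
        # Smallest axis-aligned rectangle enclosing all positive points;
        # None if no positive example has been seen yet.
        positives = [p for p, label in sample if label]
        if not positives:
            return None
        xs = [x for x, _ in positives]
        ys = [y for _, y in positives]
        return (min(xs), max(xs), min(ys), max(ys))

    def contains(rect, point):
        # Membership test for an axis-aligned rectangle (None is empty).
        if rect is None:
            return False
        x0, x1, y0, y1 = rect
        x, y = point
        return x0 <= x <= x1 and y0 <= y <= y1

    # Hypothetical target concept and example distribution (assumptions).
    target = (0.2, 0.7, 0.3, 0.8)
    draw = lambda: (random.random(), random.random())

    # Labeled training sample drawn i.i.d. from the distribution.
    sample = [(p, contains(target, p)) for p in (draw() for _ in range(500))]
    hypothesis = tightest_fit_rectangle(sample)

    # Empirical error estimate on fresh points from the same distribution.
    test = [draw() for _ in range(10000)]
    error = sum(contains(target, p) != contains(hypothesis, p)
                for p in test) / len(test)
    print("estimated error:", error)

Note that the tightest-fit hypothesis can only err on points that lie inside the target but outside the hypothesis, which is why its error shrinks as the sample grows; bounds of this kind are what the PAC framework makes precise.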

Supported by NSF grant CCR-9108753. Email: sloan@eecs.uic.edu

Copyright information

© 1996 Birkhäuser Boston

About this chapter

Cite this chapter

Sloan, R.H. (1996). PAC Learning, Noise, and Geometry. In: Kueker, D.W., Smith, C.H. (eds) Learning and Geometry: Computational Approaches. Progress in Computer Science and Applied Logic, vol 14. Birkhäuser Boston. https://doi.org/10.1007/978-1-4612-4088-4_2

  • DOI: https://doi.org/10.1007/978-1-4612-4088-4_2

  • Publisher Name: Birkhäuser Boston

  • Print ISBN: 978-1-4612-8646-2

  • Online ISBN: 978-1-4612-4088-4
