Machine Learning, Volume 25, Issue 1, pp 51–70

PAC learning of one-dimensional patterns

  • Paul W. Goldberg
  • Sally A. Goldman
  • Stephen D. Scott


Developing the ability to recognize a landmark from a visual image of a robot's current location is a fundamental problem in robotics. We consider the problem of PAC-learning the concept class of geometric patterns where the target geometric pattern is a configuration of k points on the real line. Each instance is a configuration of n points on the real line, and it is labeled according to whether or not it visually resembles the target pattern. To capture the notion of visual resemblance we use the Hausdorff metric. Informally, two geometric patterns P and Q resemble each other under the Hausdorff metric if every point on one pattern is "close" to some point on the other pattern. We relate the concept class of geometric patterns to the landmark matching problem and then present a polynomial-time algorithm that PAC-learns the class of one-dimensional geometric patterns. We also present some experimental results on how our algorithm performs.
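For concreteness, the Hausdorff distance between two finite one-dimensional patterns can be computed directly from its definition. The sketch below is only an illustration of the "every point is close to some point" condition described above, not the paper's learning algorithm; the function names are our own:

```python
def directed_hausdorff(P, Q):
    """Largest distance from any point of P to its nearest point of Q."""
    return max(min(abs(p - q) for q in Q) for p in P)

def hausdorff(P, Q):
    """Hausdorff distance between two finite 1-D point patterns:
    the larger of the two directed distances, so it is symmetric."""
    return max(directed_hausdorff(P, Q), directed_hausdorff(Q, P))
```

For example, `hausdorff([0.0, 1.0, 2.0], [0.5, 1.75])` returns `0.5`: every point of either pattern lies within 0.5 of some point of the other, and the point 0.0 achieves that bound. Two patterns "resemble" each other when this distance falls below a chosen threshold.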


Keywords: PAC learning, landmark matching, robot navigation



Copyright information

© Kluwer Academic Publishers 1996

Authors and Affiliations

  • Paul W. Goldberg (1)
  • Sally A. Goldman (2)
  • Stephen D. Scott (3)
  1. Dept. of Computer Science and Applied Mathematics, Aston University, Birmingham, U.K.
  2. Dept. of Computer Science, Washington University, St. Louis
  3. Dept. of Computer Science, Washington University, St. Louis
