
On Exact Learning Halfspaces with Random Consistent Hypothesis Oracle

  • Nader H. Bshouty
  • Ehab Wattad
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4264)

Abstract

We study exact learning of halfspaces from equivalence queries. The learning algorithm has access to an oracle RCH that returns a random halfspace consistent with the counterexamples received so far from the equivalence-query oracle. Using the RCH oracle, we give a new polynomial-time algorithm for exact learning of halfspaces from majorities of halfspaces, and we show that its query complexity is smaller (by a constant factor) than that of the best known algorithm that learns halfspaces from halfspaces.
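For intuition only, the following minimal Python sketch captures the flavor of such a learner; it is not the algorithm analyzed in the paper. The learner stores the counterexamples received so far, a stand-in RCH oracle finds random consistent halfspaces by rejection sampling, and the next equivalence query is the majority vote of several such halfspaces. The target halfspace, the dimension n = 4, the number k = 5 of voters, and the rejection-sampling scheme are all illustrative assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n = 4  # illustrative dimension; the domain is {0,1}^n

def halfspace(w, b):
    """Return the Boolean function x -> [w.x >= b]."""
    return lambda x: float(np.dot(w, x) >= b)

# Hypothetical target halfspace, chosen only for illustration.
target = halfspace(np.array([2.0, -1.0, 3.0, 1.0]), 2.0)
domain = [np.array(x, dtype=float) for x in itertools.product([0, 1], repeat=n)]

def EQ(h):
    """Equivalence-query oracle: return a counterexample (point, label), or None if h = target."""
    for x in domain:
        if h(x) != target(x):
            return x, target(x)
    return None

def RCH(counterexamples, tries=20000):
    """Stand-in RCH oracle: rejection-sample random halfspaces until one
    labels every stored counterexample correctly."""
    for _ in range(tries):
        w, b = rng.normal(size=n), rng.normal()
        h = halfspace(w, b)
        if all(h(x) == y for x, y in counterexamples):
            return h
    raise RuntimeError("no consistent halfspace found")

def majority(hyps):
    """Majority vote of an odd number of halfspaces."""
    return lambda x: float(sum(h(x) for h in hyps) > len(hyps) / 2)

def learn(k=5, max_rounds=100):
    """Exact-learning loop: each hypothesis is a majority of k random consistent halfspaces."""
    counterexamples = []
    for _ in range(max_rounds):
        h = majority([RCH(counterexamples) for _ in range(k)])
        ce = EQ(h)
        if ce is None:
            return h, len(counterexamples)
        counterexamples.append(ce)
    raise RuntimeError("did not converge")

h, queries = learn()
print("learned exactly after", queries, "counterexamples")
```

Because every halfspace returned by the stand-in RCH oracle is consistent with all stored counterexamples, the majority hypothesis never repeats a counterexample, so the loop terminates after at most |{0,1}^n| equivalence queries in this toy setting.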

Keywords

Polynomial Time, Boolean Function, Concept Class, Query Complexity, Equivalence Query



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Nader H. Bshouty (1)
  • Ehab Wattad (1)
  1. Department of Computer Science, Technion, Haifa, Israel
