
A New PAC Bound for Intersection-Closed Concept Classes

  • Peter Auer
  • Ronald Ortner
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3120)

Abstract

For hyper-rectangles in \(\mathbb{R}^d\) Auer et al. [1] proved a PAC bound of \(O(\frac{1}{\epsilon}(d+\log\frac{1}{\delta}))\), where \(\epsilon\) and \(\delta\) are the accuracy and confidence parameters. It is still an open question whether one can obtain the same bound for intersection-closed concept classes of VC-dimension \(d\) in general. We present a step towards a solution of this problem, showing on the one hand a new PAC bound of \(O(\frac{1}{\epsilon}(d\log d+\log\frac{1}{\delta}))\) for arbitrary intersection-closed concept classes, complementing the well-known bounds \(O(\frac{1}{\epsilon}(\log\frac{1}{\delta}+d\log\frac{1}{\epsilon}))\) and \(O(\frac{d}{\epsilon}\log\frac{1}{\delta})\) of Blumer et al. and Haussler et al. [4,6]. Our bound is established using the closure algorithm, which generates as its hypothesis the smallest concept that is consistent with the positive training examples. On the other hand, we show that maximum intersection-closed concept classes meet the bound of \(O(\frac{1}{\epsilon}(d+\log\frac{1}{\delta}))\) as well. Moreover, we indicate that neither our new bound nor the conjectured bound can hold for arbitrary consistent learning algorithms, giving an example of such an algorithm that needs \(\Omega(\frac{1}{\epsilon}(d+\log\frac{1}{\epsilon}+\log\frac{1}{\delta}))\) examples to learn some simple maximum intersection-closed concept class.
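
The closure algorithm mentioned above is easy to illustrate on the hyper-rectangle class of [1]: since that class is intersection-closed, the smallest consistent concept is simply the coordinate-wise bounding box of the positive training examples. The following Python sketch is not taken from the paper; the names closure_hypothesis and predict and the point-based interface are illustrative assumptions for this special case only.

    # Minimal sketch: the closure algorithm for axis-aligned hyper-rectangles in R^d.
    # The hypothesis is the smallest box containing all positive training examples;
    # names and interface are illustrative, not from the paper.
    import numpy as np

    def closure_hypothesis(positive_examples):
        """Smallest axis-aligned box (lower, upper) containing the positive examples."""
        pts = np.asarray(positive_examples, dtype=float)
        return pts.min(axis=0), pts.max(axis=0)

    def predict(hypothesis, x):
        """Label x positive iff it lies inside the closure box."""
        lower, upper = hypothesis
        x = np.asarray(x, dtype=float)
        return bool(np.all(lower <= x) and np.all(x <= upper))

    # Example in R^2: three positive training points.
    h = closure_hypothesis([(0.2, 0.5), (0.4, 0.1), (0.9, 0.7)])
    print(predict(h, (0.5, 0.5)))  # True: inside the bounding box
    print(predict(h, (1.0, 0.0)))  # False: outside

Because hyper-rectangles are intersection-closed, this bounding box is exactly the smallest concept consistent with the positives, i.e. the hypothesis the closure algorithm outputs for this class.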



References

  1. Auer, P.: Learning Nested Differences in the Presence of Malicious Noise. Theor. Comput. Sci. 185(1), 159–175 (1997)
  2. Auer, P., Cesa-Bianchi, N.: On-Line Learning with Malicious Noise and the Closure Algorithm. Annals of Mathematics and Artificial Intelligence 23(1-2), 83–99 (1998)
  3. Auer, P., Long, P.M., Srinivasan, A.: Approximating Hyper-Rectangles: Learning and Pseudorandom Sets. J. Comput. Syst. Sci. 57(3), 376–388 (1998)
  4. Blumer, A., Ehrenfeucht, A., Haussler, D., Warmuth, M.: Learnability and the Vapnik-Chervonenkis Dimension. J. ACM 36(4), 929–965 (1989)
  5. Floyd, S., Warmuth, M.: Sample Compression, Learnability, and the Vapnik-Chervonenkis Dimension. Machine Learning 21(3), 269–304 (1995)
  6. Haussler, D., Littlestone, N., Warmuth, M.: Predicting {0,1}-Functions on Randomly Drawn Points. Inf. Comput. 115(2), 248–292 (1994)
  7. Helmbold, D., Sloan, R., Warmuth, M.: Learning Nested Differences of Intersection-Closed Concept Classes. Machine Learning 5, 165–196 (1990)
  8. Leighton, F.T., Plaxton, C.G.: Hypercubic Sorting Networks. SIAM J. Comput. 27(1), 1–47 (1998)
  9. Sauer, N.: On the Density of Families of Sets. J. Combinatorial Theory (A) 13, 145–147 (1972)

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Peter Auer (1)
  • Ronald Ortner (1)
  1. Department of Mathematics and Information Technology, University of Leoben, Leoben, Austria
