Simple learning algorithms using divide and conquer
This paper investigates what happens when a learning algorithm for a class C attempts to learn target formulas from a different class. In many cases, the learning algorithm will find a "bad attribute": a property of the target formula that precludes its membership in the class C. To continue the learning process, we build a decision tree according to the possible values of this attribute (divide) and recursively run the learning algorithm for each value (conquer). This paper shows how to run the learning algorithm recursively for each value using the oracles of the target.
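The divide-and-conquer loop described above can be sketched as follows. This is a minimal illustration, not the paper's actual pseudocode: the names (`divide_and_conquer_learn`, `restrict_oracle`), the interface of the base learner, and the toy "class C" are all assumptions, and the base learner is assumed to return either a hypothesis or the name of a bad attribute.

```python
# Sketch of the divide-and-conquer wrapper; hypothetical names throughout.

def restrict_oracle(oracle, var, bit):
    """Simulate the target's membership oracle with variable `var` fixed to `bit`."""
    def restricted(assignment):
        fixed = dict(assignment)
        fixed[var] = bit
        return oracle(fixed)
    return restricted

def divide_and_conquer_learn(base_learner, oracle):
    """Run a class-C learner; on a "bad attribute" report, split and recurse.

    `base_learner(oracle)` is assumed to return either
    ("hypothesis", h) or ("bad_attribute", var).
    """
    kind, value = base_learner(oracle)
    if kind == "hypothesis":
        return ("leaf", value)
    var = value
    # Divide on the bad attribute's two values and conquer each branch,
    # running the same learner against a restricted oracle.
    children = {bit: divide_and_conquer_learn(
                    base_learner, restrict_oracle(oracle, var, bit))
                for bit in (0, 1)}
    return ("node", var, children)

# Toy illustration: let "class C" be the functions that ignore variable 0;
# the target x0 XOR x1 lies outside C, so the wrapper must split on x0.
def toy_learner(oracle):
    if oracle({0: 0, 1: 0}) != oracle({0: 1, 1: 0}):
        return ("bad_attribute", 0)  # x0 matters, so the target is not in C
    # Otherwise learn the truth table over x1 alone.
    return ("hypothesis", (oracle({0: 0, 1: 0}), oracle({0: 0, 1: 1})))

target = lambda a: a[0] ^ a[1]
tree = divide_and_conquer_learn(toy_learner, target)
# tree is a decision tree splitting on x0, with a hypothesis over x1 at each leaf.
```

Fixing the bad attribute in the oracle lets each recursive call behave as if it were learning a genuine member of C, which is the sense in which the wrapper "uses the oracles of the target".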
We demonstrate that applying this idea to some known learning algorithms can both simplify the algorithm and provide additional power to learn more classes. In particular, we give a simple exact learning algorithm, using membership and equivalence queries, for the class of DNF that is "almost" unate, that is, unate with the addition of O(log n) nonunate variables and a constant number of terms. We also find algorithms in different models for Boolean functions that depend on k terms.
Key words: PAC-learning, exact learning, divide and conquer, queries
Subject classifications: 06E30, 41A05, 68T05