Abstract
We describe a simple reduction from the problem of PAC-learning from multiple-instance examples to that of PAC-learning with one-sided random classification noise. Thus, every concept class learnable with one-sided noise — a class that includes everything learnable in the usual two-sided random noise model, as well as others such as parity functions — is learnable from multiple-instance examples. We also describe a more efficient (and somewhat more technically involved) reduction to the Statistical-Query model that yields a polynomial-time algorithm for learning axis-parallel rectangles with sample complexity Õ(d²r/ε²), saving roughly a factor of r over the results of Auer et al. (1997).
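The core of a reduction of this kind can be illustrated concretely. The sketch below is not the paper's construction verbatim, but an assumption-laden illustration of the idea the abstract describes: draw one instance from each bag and give it the bag's label. Instances drawn from negative bags are genuinely negative and correctly labeled, while a positive bag may hand a positive label to a truly negative instance — so the only possible labeling error is one-sided. The bag format and the threshold concept `target` are hypothetical, chosen just to make the example runnable.

```python
import random

def mi_to_noisy_examples(bags, rng=None):
    """Turn multiple-instance examples into single-instance examples.

    Each bag is (instances, bag_label), where bag_label is 1 iff at
    least one instance is positive. Labeling a randomly drawn instance
    with its bag's label produces ordinary examples whose noise is
    one-sided: true positives are never mislabeled negative, but a
    true negative drawn from a positive bag is mislabeled positive.
    """
    rng = rng or random.Random(0)
    return [(rng.choice(instances), label) for instances, label in bags]

# Hypothetical target concept on the reals: x is positive iff x > 10.
def target(x):
    return int(x > 10)

bags = [
    ([3, 15, 7], 1),  # positive bag: contains 15 > 10
    ([1, 2, 4], 0),   # negative bag: every instance is negative
]
for x, y in mi_to_noisy_examples(bags):
    # One-sided noise: a true positive is never labeled negative.
    assert not (target(x) == 1 and y == 0)
```

Under this view, any learner tolerant of one-sided classification noise can be run directly on the converted sample, which is the shape of the reduction the abstract claims.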
References
Auer, P. (1997). On learning from multi-instance examples: Empirical evaluation of a theoretical approach. In Proceedings of the Fourteenth International Conference on Machine Learning.
Auer, P., Long, P., and Srinivasan, A. (1997). Approximating hyper-rectangles: Learning and pseudo-random sets. In Proceedings of the 29th Annual ACM Symposium on Theory of Computing. To appear.
Dietterich, T. G., Lathrop, R. H., and Lozano-Pérez, T. (1997). Solving the multiple-instance problem with axis-parallel rectangles. Artificial Intelligence, 89(1-2):31–71.
Kearns, M. (1993). Efficient noise-tolerant learning from statistical queries. In Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing, pages 392–401.
Long, P. and Tan, L. (1996). PAC learning axis-aligned rectangles with respect to product distributions from multiple-instance examples. In Proceedings of the 9th Annual Conference on Computational Learning Theory, pages 228–234.
Blum, A., Kalai, A. A Note on Learning from Multiple-Instance Examples. Machine Learning 30, 23–29 (1998). https://doi.org/10.1023/A:1007402410823