# A Note on Learning from Multiple-Instance Examples


## Abstract

We describe a simple reduction from the problem of PAC-learning from multiple-instance examples to that of PAC-learning with one-sided random classification noise. Thus, all concept classes learnable with one-sided noise, which includes all concepts learnable in the usual two-sided random noise model plus others such as the parity function, are learnable from multiple-instance examples. We also describe a more efficient (and somewhat technically more involved) reduction to the Statistical-Query model that results in a polynomial-time algorithm for learning axis-parallel rectangles with sample complexity Õ(d²r/ε²), saving roughly a factor of *r* over the results of Auer et al. (1997).
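The intuition behind the reduction can be illustrated with a small sketch (this is an illustrative toy, not the paper's algorithm): a bag is labeled positive iff at least one of its instances is positive, so copying the bag label onto every instance produces instance-level labels whose errors are one-sided. A positive instance always lies in a positive bag and is never mislabeled, while a negative instance drawn into a positive bag is mislabeled positive. The names `flatten_bags` and the toy concept below are hypothetical.

```python
def flatten_bags(bags, true_concept):
    """bags: list of lists of instances; true_concept: instance -> bool.

    Returns (instance, bag_label, noisy) triples, where `noisy` marks
    the instances whose copied bag label disagrees with the true concept.
    """
    flattened = []
    for bag in bags:
        # Multiple-instance semantics: bag is positive iff some instance is.
        bag_label = any(true_concept(x) for x in bag)
        for x in bag:
            # Mislabeling happens only for negative instances in positive bags,
            # so the induced classification noise is one-sided.
            flattened.append((x, bag_label, bag_label and not true_concept(x)))
    return flattened


# Toy concept on integers: x is positive iff x >= 5 (hypothetical example).
concept = lambda x: x >= 5
examples = flatten_bags([[1, 7], [2, 3]], concept)

# Only the negative instance 1, which shares a bag with the positive
# instance 7, is mislabeled:
assert [(x, lbl) for x, lbl, noisy in examples if noisy] == [(1, True)]
# No positive instance is ever mislabeled -- the noise is one-sided:
assert all(not noisy for x, lbl, noisy in examples if concept(x))
```

The one-sidedness is what makes the reduction work: an algorithm tolerant of one-sided random classification noise can then be run directly on the flattened sample.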

**Keywords:** multiple-instance examples, classification noise, statistical queries


## References

- Auer, P. (1997). On learning from multi-instance examples: Empirical evaluation of a theoretical approach. In *Proceedings of the Fourteenth International Conference on Machine Learning*.
- Auer, P., Long, P., and Srinivasan, A. (1997). Approximating hyper-rectangles: Learning and pseudo-random sets. In *Proceedings of the 29th Annual ACM Symposium on Theory of Computing*. To appear.
- Dietterich, T. G., Lathrop, R. H., and Lozano-Pérez, T. (1997). Solving the multiple-instance problem with axis-parallel rectangles. *Artificial Intelligence*, 89(1-2):31–71.
- Kearns, M. (1993). Efficient noise-tolerant learning from statistical queries. In *Proceedings of the Twenty-Fifth Annual ACM Symposium on Theory of Computing*, pages 392–401.
- Long, P. and Tan, L. (1996). PAC learning axis-aligned rectangles with respect to product distributions from multiple-instance examples. In *Proceedings of the 9th Annual Conference on Computational Learning Theory*, pages 228–234.

## Copyright information

© Kluwer Academic Publishers 1998