Machine Learning, Volume 17, Issue 2, pp 115–141

Toward efficient agnostic learning

  • Michael J. Kearns
  • Robert E. Schapire
  • Linda M. Sellie

DOI: 10.1007/BF00993468

Cite this article as:
Kearns, M.J., Schapire, R.E. & Sellie, L.M. Mach Learn (1994) 17: 115. doi:10.1007/BF00993468

Abstract

In this paper we initiate an investigation of generalizations of the Probably Approximately Correct (PAC) learning model that attempt to significantly weaken the target function assumptions. The ultimate goal in this direction is informally termed agnostic learning, in which we make virtually no assumptions on the target function. The name derives from the fact that as designers of learning algorithms, we give up the belief that Nature (as represented by the target function) has a simple or succinct explanation. We give a number of positive and negative results that provide an initial outline of the possibilities for agnostic learning. Our results include hardness results for the most obvious generalization of the PAC model to an agnostic setting, an efficient and general agnostic learning method based on dynamic programming, relationships between loss functions for agnostic learning, and an algorithm for a learning problem that involves hidden variables.
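The abstract mentions an efficient agnostic learning method based on dynamic programming. As an illustrative sketch only (not the paper's exact algorithm), the following shows the flavor of such a method: minimizing empirical 0/1 loss over one-dimensional piecewise-constant hypotheses with at most s pieces, with no assumption that the labels come from any hypothesis in the class. The function name and the specific hypothesis class are choices made for this example.

```python
def best_piecewise_constant(xs, ys, s):
    """Minimum empirical 0/1 loss over hypotheses that are piecewise
    constant with at most s pieces on the real line, via dynamic
    programming. No assumption is made that ys is realizable by
    any such hypothesis (the agnostic setting)."""
    pts = sorted(zip(xs, ys))
    labels = [y for _, y in pts]
    n = len(labels)
    INF = float("inf")

    # Prefix counts of 1-labels for O(1) block-cost queries.
    ones = [0] * (n + 1)
    for i, y in enumerate(labels):
        ones[i + 1] = ones[i] + y

    def block_cost(i, j):
        # Errors if points i..j-1 form one block with its majority label.
        c_ones = ones[j] - ones[i]          # errors if block predicts 0
        c_zeros = (j - i) - c_ones          # errors if block predicts 1
        return min(c_ones, c_zeros)

    # dp[j][i] = min errors on the first i points using at most j blocks.
    dp = [[INF] * (n + 1) for _ in range(s + 1)]
    for j in range(s + 1):
        dp[j][0] = 0
    for j in range(1, s + 1):
        for i in range(1, n + 1):
            dp[j][i] = min(dp[j - 1][k] + block_cost(k, i)
                           for k in range(i))
    return dp[s][n]
```

For example, on labels 0,0,1,1,0,0 three pieces suffice for zero empirical error, while a single constant hypothesis must make two mistakes; the dynamic program finds the optimum in either case.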

Keywords

machine learning, agnostic learning, PAC learning, computational learning theory

Copyright information

© Kluwer Academic Publishers 1994

Authors and Affiliations

  • Michael J. Kearns (1)
  • Robert E. Schapire (1)
  • Linda M. Sellie (2)
  1. AT&T Bell Laboratories, Murray Hill
  2. Department of Computer Science, University of Chicago, Chicago
