Encyclopedia of Algorithms

Editor: Ming-Yang Kao

Active Learning – Modern Learning Theory

Reference work entry
DOI: https://doi.org/10.1007/978-1-4939-2864-4_769

Years and Authors of Summarized Original Work

  • 2006; Balcan, Beygelzimer, Langford

  • 2007; Balcan, Broder, Zhang

  • 2007; Hanneke

  • 2013; Urner, Wulff, Ben-David

  • 2014; Awasthi, Balcan, Long

Problem Definition

Most classic machine learning methods depend on the assumption that humans can annotate all the data available for training. However, many modern machine learning applications (including image and video classification, protein sequence classification, and speech processing) have massive amounts of unannotated or unlabeled data. As a consequence, there has been tremendous interest, both in machine learning and in its application areas, in designing algorithms that most efficiently utilize the available data while minimizing the need for human intervention. An extensively used and studied technique is active learning, where the algorithm is presented with a large pool of unlabeled examples (such as all images available on the web) and can interactively ask for the labels of examples of its own choosing, with the goal of learning an accurate classifier from far fewer labels than classic supervised (passive) learning would require.
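To make this query loop concrete, the following is a minimal sketch of pool-based active learning with margin-based queries, in the spirit of Balcan, Broder, and Zhang [3]: the learner maintains a linear separator and repeatedly asks a labeling oracle for the pool point closest to its current decision boundary. The synthetic 2-D pool, the sign-based oracle, the query budget, and the least-squares refit are all illustrative assumptions, not details taken from the summarized works.

    import numpy as np

    rng = np.random.default_rng(0)
    pool = rng.normal(size=(1000, 2))              # large pool of unlabeled examples
    oracle = lambda x: 1.0 if x[0] >= 0 else -1.0  # stand-in for a human annotator

    w = np.zeros(2)                 # current linear separator (initially undefined)
    labeled_X, labeled_y = [], []
    unqueried = set(range(len(pool)))
    query_budget = 20               # illustrative label budget

    for _ in range(query_budget):
        idx = list(unqueried)
        if np.allclose(w, 0):
            i = int(rng.choice(idx))   # no separator yet: query a random point
        else:
            # Margin-based choice: query the unlabeled point closest to the
            # decision boundary, i.e., where the hypothesis is least certain.
            margins = np.abs(pool[idx] @ w) / np.linalg.norm(w)
            i = idx[int(np.argmin(margins))]
        unqueried.discard(i)
        labeled_X.append(pool[i])
        labeled_y.append(oracle(pool[i]))
        # Refit the separator; least squares is a simple surrogate for the
        # margin-based updates analyzed in the cited works.
        X, y = np.array(labeled_X), np.array(labeled_y)
        w = np.linalg.lstsq(X, y, rcond=None)[0]

    truth = np.array([oracle(x) for x in pool])
    print("pool error after", query_budget, "queries:",
          float(np.mean(np.sign(pool @ w) != truth)))

Querying near the decision boundary concentrates the label budget in the region where the current hypothesis is uncertain; under distributional assumptions such as log-concavity, this style of margin-based querying is known to require exponentially fewer labels than passive learning of linear separators [4].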

Keywords

Active learning; Computational complexity; Learning theory; Sample complexity

Recommended Reading

  1. Awasthi P, Balcan M-F, Long PM (2014) The power of localization for efficiently learning linear separators with noise. In: Proceedings of the 46th annual symposium on the theory of computing (STOC), New York
  2. Balcan M-F, Beygelzimer A, Langford J (2006) Agnostic active learning. In: Proceedings of the 23rd international conference on machine learning (ICML), Pittsburgh
  3. Balcan M-F, Broder A, Zhang T (2007) Margin based active learning. In: Proceedings of the 20th annual conference on computational learning theory (COLT), San Diego
  4. Balcan M-F, Long PM (2013) Active and passive learning of linear separators under log-concave distributions. In: Proceedings of the 26th conference on learning theory (COLT), Princeton
  5. Beygelzimer A, Hsu D, Langford J, Zhang T (2010) Agnostic active learning without constraints. In: Advances in neural information processing systems (NIPS), Vancouver
  6. Cohn D, Atlas L, Ladner R (1994) Improving generalization with active learning. In: Proceedings of the 11th international conference on machine learning (ICML), New Brunswick
  7. Dasgupta S, Hsu D (2008) Hierarchical sampling for active learning. In: Proceedings of the 25th international conference on machine learning (ICML), Helsinki
  8. Dasgupta S, Hsu DJ, Monteleoni C (2007) A general agnostic active learning algorithm. In: Advances in neural information processing systems (NIPS), Vancouver
  9. Hanneke S (2007) A bound on the label complexity of agnostic active learning. In: Proceedings of the 24th international conference on machine learning (ICML), Corvallis
  10. Kearns MJ, Vazirani UV (1994) An introduction to computational learning theory. MIT Press, Cambridge
  11. Koltchinskii V (2010) Rademacher complexities and bounding the excess risk in active learning. J Mach Learn Res 11:2457–2485
  12. Urner R, Wulff S, Ben-David S (2013) PLAL: cluster-based active learning. In: Proceedings of the 26th conference on learning theory (COLT), Princeton
  13. Vapnik VN (1998) Statistical learning theory. Wiley, New York
  14. Zhang C, Chaudhuri K (2014) Beyond disagreement-based agnostic active learning. In: Advances in neural information processing systems (NIPS), Montreal

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Department of Machine Learning, Carnegie Mellon University, Pittsburgh, USA