A Bayesian Active Learning Framework for a Two-Class Classification Problem

  • Pablo Ruiz
  • Javier Mateos
  • Rafael Molina
  • Aggelos K. Katsaggelos
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7252)


In this paper we present an active learning procedure for the two-class supervised classification problem. The methodology exploits the Bayesian modeling and inference paradigm to tackle kernel-based data classification, and it is appropriate for both finite and infinite dimensional feature spaces. Parameters are estimated, using the kernel trick, following the Bayesian evidence approach from the marginal distribution of the observations. The proposed active learning procedure uses a criterion based on the entropy of the posterior distribution of the adaptive parameters to select the sample to be included in the training set. A synthetic dataset as well as a real remote sensing classification problem are used to validate the proposed approach.
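The entropy criterion described above can be illustrated with a minimal sketch. Assuming a Gaussian prior on the weights and a Gaussian noise model (a common simplification; the paper's exact model and hyperparameter estimation via the evidence are not reproduced here), the posterior over the weights is Gaussian, its entropy is determined by the log-determinant of its covariance, and by the matrix determinant lemma the candidate sample that most reduces the posterior entropy is the one with the largest predictive variance. All function names below are illustrative:

```python
import numpy as np

def posterior_cov(Phi, alpha, beta):
    # Posterior covariance of the weights for a Gaussian-prior,
    # Gaussian-noise linear model: Sigma = (alpha*I + beta*Phi^T Phi)^{-1},
    # where Phi is the (kernel) design matrix of the current training set.
    d = Phi.shape[1]
    return np.linalg.inv(alpha * np.eye(d) + beta * Phi.T @ Phi)

def select_next(Sigma, Phi_pool, beta):
    # Adding a candidate x changes the Gaussian posterior entropy by
    # -0.5 * log(1 + beta * phi(x)^T Sigma phi(x))   (matrix determinant lemma),
    # so minimizing the new entropy amounts to maximizing the predictive
    # variance phi(x)^T Sigma phi(x) over the unlabeled pool.
    var = np.einsum('ij,jk,ik->i', Phi_pool, Sigma, Phi_pool)
    return int(np.argmax(var))

# Toy usage: one labeled sample along the first axis; the criterion then
# prefers the pool point exploring the second (still uncertain) direction.
Sigma = posterior_cov(np.array([[1.0, 0.0]]), alpha=1.0, beta=1.0)
pool = np.array([[1.0, 0.0], [0.0, 1.0]])
chosen = select_next(Sigma, pool, beta=1.0)  # index 1
```

The equivalence between entropy reduction and maximal predictive variance is the standard information-based selection rule (cf. MacKay's information-based objective functions); the paper's criterion is stated in terms of the posterior over the adaptive parameters, which this sketch mirrors only for the simplified Gaussian case.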


Keywords: Posterior Distribution, Active Learning, Multispectral Image, Relevance Vector Machine, Kappa Index





Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Pablo Ruiz (1)
  • Javier Mateos (1)
  • Rafael Molina (1)
  • Aggelos K. Katsaggelos (2)

  1. University of Granada, Granada, Spain
  2. Northwestern University, Evanston, USA
