Abstract

Gaussian process classifiers (GPCs) are fully statistical models for kernel classification. We present a form of GPC that is robust to labeling errors in the data set. This model allows for label noise not only near the class boundaries but also far from them, where it can result from mistakes in labeling or from gross errors in measuring the input features. We derive an outlier-robust algorithm for training this model that alternates between iterations of the EP approximation and hyperparameter updates until convergence. We demonstrate the usefulness of the proposed algorithm and its model selection method through simulation results.
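To make the robustness mechanism concrete, below is a minimal sketch assuming the common label-flip form of robust likelihood, p(y | f) = ε + (1 − 2ε)Φ(yf), where Φ is the standard normal CDF; the function names and the value ε = 0.05 are illustrative assumptions, not details taken from the paper. Because this likelihood is bounded below by ε, a point mislabeled far from the class boundary cannot drive its contribution to the posterior toward zero.

from scipy.stats import norm

def probit_likelihood(y, f):
    """Standard GPC likelihood p(y | f) = Phi(y * f), with y in {-1, +1}."""
    return norm.cdf(y * f)

def robust_likelihood(y, f, eps=0.05):
    """Label-flip robust likelihood p(y | f) = eps + (1 - 2*eps) * Phi(y * f).
    With probability eps the observed label is flipped regardless of the
    latent value f, so the likelihood never falls below eps and a single
    outlier has bounded influence on the posterior."""
    return eps + (1.0 - 2.0 * eps) * norm.cdf(y * f)

# A point labeled +1 whose latent value sits deep in the -1 region:
f_outlier = -4.0
print(probit_likelihood(+1, f_outlier))  # ~3.2e-05: one bad label nearly vetoes f
print(robust_likelihood(+1, f_outlier))  # ~0.05: bounded below by eps

In training, EP would approximate the posterior under this non-Gaussian likelihood, with the hyperparameters (presumably including ε) re-estimated between EP sweeps until convergence, following the alternating scheme the abstract describes.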

Keywords

Gaussian Process · Class Boundary · Expectation Propagation · Outlier Model · Gaussian Process Model


Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Hyun-Chul Kim (1)
  • Zoubin Ghahramani (2)
  1. Yonsei University, Seoul, Korea
  2. University of Cambridge, Cambridge, UK
