Inductive Confidence Machines for Regression

  • Harris Papadopoulos
  • Kostas Proedrou
  • Volodya Vovk
  • Alex Gammerman
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2430)


The existing methods of predicting with confidence give good accuracy and confidence values, but are often computationally inefficient. Some partial solutions have been suggested in the past. Both the original method and these solutions were based on transductive inference. In this paper we take the radical step of replacing transductive inference with inductive inference, and define what we call the Inductive Confidence Machine (ICM); our main concern here is the use of the ICM in regression problems. The proposed algorithm is based on the Ridge Regression procedure (which is usually used for outputting bare predictions) and is much faster than the existing transductive techniques. The inductive approach may be the only option available when dealing with large data sets.
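The core idea can be illustrated with a minimal sketch of an inductive (split) conformal regressor built on ridge regression: the data are split once into a proper training set and a calibration set, nonconformity scores are computed on the calibration set, and their empirical quantile gives a fixed-width tolerance region around each new prediction. This is only an assumed illustration: the function name is invented, and plain absolute residuals are used as the nonconformity measure rather than any normalized measure the paper itself may define.

```python
import numpy as np

def icm_ridge_interval(X_train, y_train, X_cal, y_cal, X_test, a=1.0, eps=0.05):
    """Sketch of an Inductive Confidence Machine for regression.

    Fits ridge regression on the proper training set, calibrates on a
    held-out calibration set, and returns (lower, upper) bounds of a
    (1 - eps) tolerance region for each test example.
    """
    # Ridge solution: w = (X^T X + a I)^{-1} X^T y
    d = X_train.shape[1]
    w = np.linalg.solve(X_train.T @ X_train + a * np.eye(d),
                        X_train.T @ y_train)

    # Nonconformity scores on the calibration set: absolute residuals.
    alphas = np.sort(np.abs(y_cal - X_cal @ w))

    # Half-width of the tolerance region: the k-th smallest score with
    # k = ceil((1 - eps) * (n_cal + 1)); capped at the largest score.
    k = int(np.ceil((1.0 - eps) * (len(alphas) + 1)))
    half_width = alphas[min(k, len(alphas)) - 1]

    preds = X_test @ w
    return preds - half_width, preds + half_width
```

Because the ridge model is fitted only once, each new test example costs a single matrix-vector product, which is the source of the speed-up over transductive techniques that refit for every candidate label.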


Keywords: Ridge Regression · Inductive Inference · Kolmogorov Complexity · True Label · Tolerance Region



Copyright information

© Springer-Verlag Berlin Heidelberg 2002

Authors and Affiliations

  • Harris Papadopoulos (1)
  • Kostas Proedrou (1)
  • Volodya Vovk (1)
  • Alex Gammerman (1)
  1. Department of Computer Science, Royal Holloway, University of London, Egham, England
