Abstract
Existing methods of prediction with confidence give good accuracy and confidence values, but are often computationally inefficient. Some partial solutions have been suggested in the past. Both the original method and these solutions were based on transductive inference. In this paper we take the radical step of replacing transductive inference with inductive inference and define what we call the Inductive Confidence Machine (ICM); our main concern here is the use of the ICM in regression problems. The algorithm proposed in this paper is based on the ridge regression procedure (usually used to output bare predictions) and is much faster than the existing transductive techniques. The inductive approach described here may be the only option available when dealing with large data sets.
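The inductive idea in the abstract can be sketched as a split ("inductive") conformal regressor: train a ridge regressor once on a proper training set, compute nonconformity scores on a held-out calibration set, and turn a calibration quantile into a prediction interval for each new object. This is a minimal sketch under assumptions of our own — the synthetic data, the absolute-residual nonconformity score, and all function names are illustrative, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D regression data (illustrative only).
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=200)

# Split into a proper training set and a calibration set.
n_train = 100
X_train, y_train = X[:n_train], y[:n_train]
X_cal, y_cal = X[n_train:], y[n_train:]

def ridge_fit(X, y, a=1.0):
    """Ridge regression with a bias column: w = (X'X + aI)^{-1} X'y."""
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return np.linalg.solve(Xb.T @ Xb + a * np.eye(Xb.shape[1]), Xb.T @ y)

def ridge_predict(w, X):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return Xb @ w

# Train once on the proper training set -- the "inductive" step that
# avoids refitting for every new object, unlike transductive methods.
w = ridge_fit(X_train, y_train)

# Nonconformity scores on the calibration set: absolute residuals.
scores = np.abs(y_cal - ridge_predict(w, X_cal))

# For significance level eps, the 1 - eps prediction interval is the
# point prediction plus/minus the appropriate calibration quantile.
eps = 0.1
k = int(np.ceil((1 - eps) * (len(scores) + 1)))  # rank among n_cal + 1
q = np.sort(scores)[min(k, len(scores)) - 1]

x_new = np.array([[0.5]])
y_hat = ridge_predict(w, x_new)[0]
print(f"90% interval: [{y_hat - q:.3f}, {y_hat + q:.3f}]")
```

Because the model is fitted only once and each new prediction costs a single ridge evaluation plus a quantile lookup, the per-object cost stays constant as the test set grows, which is the efficiency gain the abstract claims over transductive confidence machines.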
© 2002 Springer-Verlag Berlin Heidelberg
Papadopoulos, H., Proedrou, K., Vovk, V., Gammerman, A. (2002). Inductive Confidence Machines for Regression. In: Elomaa, T., Mannila, H., Toivonen, H. (eds) Machine Learning: ECML 2002. ECML 2002. Lecture Notes in Computer Science(), vol 2430. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36755-1_29
Print ISBN: 978-3-540-44036-9
Online ISBN: 978-3-540-36755-0