Abstract
In the statistical setting of the pattern recognition problem, the number of examples required to approximate an unknown labelling function is linear in the VC dimension of the target learning class. In this work we consider whether such bounds exist if we restrict attention to computable pattern recognition methods, assuming that the unknown labelling function is also computable. We find that in this case the number of examples required for a computable method to approximate the labelling function is not only not linear, but grows faster (in the VC dimension of the class) than any computable function. No time or space constraints are placed on the predictors or target functions; the only resource we consider is the training examples.
The task of pattern recognition is considered in conjunction with another learning problem — data compression. An impossibility result for the task of data compression allows us to estimate the sample complexity for pattern recognition.
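The classical linear-in-VC-dimension bound that the abstract contrasts against can be sketched as follows. This is a minimal illustration of the standard realizable-case PAC sample-complexity upper bound (with illustrative constants in the style of Blumer–Ehrenfeucht–Haussler–Warmuth), not a result or method from this paper; the function name and constants are assumptions for the example.

```python
import math

def pac_sample_bound(vc_dim: int, epsilon: float, delta: float) -> int:
    """Illustrative realizable-case PAC sample-complexity upper bound:
    m = O((1/eps) * (d * ln(1/eps) + ln(1/delta))),
    i.e. linear in the VC dimension d for fixed accuracy and confidence.
    Constants follow one common statement of the bound and are not tight."""
    d = vc_dim
    m = (1.0 / epsilon) * (4.0 * math.log(2.0 / delta)
                           + 8.0 * d * math.log(13.0 / epsilon))
    return math.ceil(m)

# For fixed epsilon and delta the bound grows linearly in the VC dimension,
# which is exactly the behaviour the paper shows fails for computable learners.
print(pac_sample_bound(10, 0.1, 0.05))
print(pac_sample_bound(20, 0.1, 0.05))
```

For fixed accuracy and confidence, doubling the VC dimension at most doubles the bound; the paper's point is that no such computable growth rate suffices once both learner and target are required to be computable.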
This work was supported by SNF grant 200020-107616.
© 2005 Springer-Verlag Berlin Heidelberg
Cite this paper
Ryabko, D. (2005). On Computability of Pattern Recognition Problems. In: Jain, S., Simon, H.U., Tomita, E. (eds) Algorithmic Learning Theory. ALT 2005. Lecture Notes in Computer Science(), vol 3734. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11564089_13
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-29242-5
Online ISBN: 978-3-540-31696-1