Abstract
We compare several complexity selection approaches (under the PAC model) and give some practical insights about them. The approaches are based on the well-known VC-dimension and on the more recent notions of Rademacher complexity and covering numbers. The classification task we consider is carried out by polynomials. Additionally, we compare the results of non-regularized and L2-regularized learning and their influence on the selected complexity.
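As a rough illustration of the setting the abstract describes (not the paper's actual procedure, whose bounds come from VC-dimension, Rademacher complexity and covering numbers), the sketch below selects a polynomial degree for a synthetic 1-D classification task by held-out error, once without regularization and once with an L2 (ridge-style) penalty. All names and data here are invented for the example.

```python
# Illustrative sketch only: polynomial complexity selection on synthetic data,
# comparing non-regularized (lam=0) and L2-regularized (lam=1) least-squares fits.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary labels from a cubic decision rule plus noise (assumption).
x = rng.uniform(-1, 1, 200)
y = np.sign(x**3 - 0.3 * x + 0.1 * rng.normal(size=x.size))

train, test = slice(0, 100), slice(100, 200)

def fit_poly(x, y, degree, lam):
    """Least-squares polynomial fit with L2 penalty strength lam."""
    X = np.vander(x, degree + 1)                # columns x^degree, ..., x^0
    A = X.T @ X + lam * np.eye(degree + 1)      # lam=0 gives the plain fit
    return np.linalg.solve(A, X.T @ y)

def error(w, x, y):
    """Fraction of misclassified points under sign(polynomial)."""
    X = np.vander(x, len(w))
    return np.mean(np.sign(X @ w) != y)

for lam in (0.0, 1.0):
    errs = [error(fit_poly(x[train], y[train], d, lam), x[test], y[test])
            for d in range(1, 11)]
    best = 1 + int(np.argmin(errs))
    print(f"lambda={lam}: best degree={best}, test error={errs[best - 1]:.3f}")
```

In the paper's framing, the held-out error above would be replaced by a capacity-based bound (VC, Rademacher, or covering-number); the L2 penalty matters because it constrains the weight norm, which the latter two complexity measures are sensitive to.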
This work has been financed by the Polish Government, Ministry of Science and Higher Education from the sources for science within years 2010–2012. Research project no.: N N516 424938.
Copyright information
© 2012 Springer-Verlag Berlin Heidelberg
Cite this paper
Klęsk, P. (2012). A Comparison of Complexity Selection Approaches for Polynomials Based on: Vapnik-Chervonenkis Dimension, Rademacher Complexity and Covering Numbers. In: Rutkowski, L., Korytkowski, M., Scherer, R., Tadeusiewicz, R., Zadeh, L.A., Zurada, J.M. (eds) Artificial Intelligence and Soft Computing. ICAISC 2012. Lecture Notes in Computer Science(), vol 7268. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-29350-4_12
DOI: https://doi.org/10.1007/978-3-642-29350-4_12
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-642-29349-8
Online ISBN: 978-3-642-29350-4
eBook Packages: Computer Science, Computer Science (R0)