Abstract
The Neyman-Pearson (NP) criterion is one of the most important approaches to hypothesis testing, and it also serves as a criterion for classification. This paper addresses the problem of bounding the estimation error of NP classification in terms of Rademacher averages. We investigate the behavior of global and local Rademacher averages, present new NP classification error bounds based on the localized averages, and indicate how the estimation error can be estimated without a priori knowledge of the class at hand.
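To fix ideas, here is a minimal sketch of the standard NP classification setup and of the kind of data-dependent bound the abstract alludes to; the notation follows the usual NP learning formulation, and the constants shown are the generic ones from global Rademacher arguments, not necessarily those proved in this paper. For a classifier $h$ from a class $\mathcal{H}$, write the class-conditional errors as

$$R_0(h) = P_0\big(h(X) = 1\big), \qquad R_1(h) = P_1\big(h(X) = 0\big),$$

and pose the NP problem at false-alarm level $\alpha$ as

$$\min_{h \in \mathcal{H}} R_1(h) \quad \text{subject to} \quad R_0(h) \le \alpha.$$

For a class $\mathcal{F}$ of $[0,1]$-valued functions and a sample $x_1, \dots, x_n$, the empirical (global) Rademacher average is

$$\hat{\mathfrak{R}}_n(\mathcal{F}) = \mathbb{E}_{\sigma} \sup_{f \in \mathcal{F}} \frac{1}{n} \sum_{i=1}^{n} \sigma_i f(x_i), \qquad \sigma_1, \dots, \sigma_n \ \text{i.i.d. uniform on } \{-1, +1\},$$

and a representative uniform deviation bound states that, with probability at least $1 - \delta$, for every $f \in \mathcal{F}$,

$$\mathbb{E} f \le \frac{1}{n} \sum_{i=1}^{n} f(x_i) + 2\,\hat{\mathfrak{R}}_n(\mathcal{F}) + 3\sqrt{\frac{\ln(2/\delta)}{2n}}.$$

Since $\hat{\mathfrak{R}}_n(\mathcal{F})$ is computed from the sample alone, bounds of this type can be evaluated without a priori knowledge of the class, which is the point the abstract emphasizes; localized averages restrict the supremum to functions of small variance and typically yield sharper rates.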
Additional information
Research supported in part by the NSF of China under Grant Nos. 10801004 and 10871015, and in part by the Startup Grant for Doctoral Research of Beijing University of Technology.
Cite this article
Han, M., Chen, D.R. & Sun, Z.X. Rademacher complexity in Neyman-Pearson classification. Acta Math. Sin.-English Ser. 25, 855–868 (2009). https://doi.org/10.1007/s10114-008-6210-8