Exact Rate of Convergence of Kernel-Based Classification Rule
A binary classification problem is considered in which the a posteriori probability is estimated by the nonparametric kernel regression estimate with a naive kernel. The excess error probability of the corresponding plug-in classification rule over the error probability of the Bayes decision is studied; this excess is decomposed into an approximation error and an estimation error. A general formula is derived for the approximation error. Under a weak margin condition and various smoothness conditions, tight upper bounds on the approximation error are presented. Via a Berry-Esseen type central limit theorem, a general expression for the estimation error is obtained.
Keywords: Lower bound · Upper bound · Classification error probability · Kernel rule · Margin condition
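The rule studied in the abstract can be illustrated concretely: the a posteriori probability P(Y=1 | X=x) is estimated by a kernel regression estimate with the naive (moving-window) kernel, and the plug-in rule classifies by thresholding this estimate at 1/2. The following is a minimal sketch, not the paper's implementation; the function names, the bandwidth value, and the convention of predicting class 0 on an empty window are illustrative assumptions.

```python
import numpy as np

def kernel_posterior_estimate(x, X, Y, h):
    """Kernel regression estimate of P(Y=1 | X=x) with the naive
    kernel K(u) = 1{||u|| <= 1}: the average of the labels Y_i of
    the sample points X_i lying within distance h of x."""
    in_window = np.linalg.norm(X - x, axis=1) <= h  # naive kernel window
    if not in_window.any():
        return 0.0  # empty window: convention, estimate 0 (predict class 0)
    return Y[in_window].mean()

def plug_in_classifier(x, X, Y, h):
    """Plug-in decision: predict class 1 iff the estimated
    a posteriori probability exceeds 1/2."""
    return int(kernel_posterior_estimate(x, X, Y, h) > 0.5)

# Illustrative usage on a tiny 1-D sample
X = np.array([[0.0], [0.1], [0.9], [1.0]])
Y = np.array([0, 0, 1, 1])
print(plug_in_classifier(np.array([0.05]), X, Y, h=0.2))  # near the 0-labeled points
print(plug_in_classifier(np.array([0.95]), X, Y, h=0.2))  # near the 1-labeled points
```

The excess error of this rule over the Bayes risk is exactly the quantity whose approximation and estimation components the paper bounds.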