
Moderating k-NN Classifiers

Pattern Analysis & Applications

Abstract

The performance of a multiple classifier system combining the soft outputs of k-Nearest Neighbour (k-NN) classifiers by the product rule can be degraded by the veto effect. This phenomenon is caused by k-NN classifiers estimating the class a posteriori probabilities using the maximum likelihood method. We show that the problem can be minimised by marginalising the k-NN estimates using the Bayesian prior. A formula for the resulting moderated k-NN estimate is derived. The merits of moderation are examined on real data sets. Tests with different bagging procedures indicate that the proposed moderation method improves the performance of the multiple classifier system significantly.
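For illustration only, the following minimal Python sketch shows the idea outlined above. It assumes the moderated estimate takes the Laplace-style form (k_j + 1)/(k + C) for class j among the k nearest neighbours and C classes; the paper derives the actual formula, and the helper names and the use of scikit-learn here are this sketch's assumptions, not the authors' implementation. The point of contrast is that the maximum-likelihood estimate k_j/k can be exactly zero, and a single zero vetoes the product-rule combination.

```python
"""Illustrative sketch (not the authors' code): moderated k-NN soft outputs
combined by the product rule. The moderation formula (k_j + 1) / (k + C) is
assumed as the Bayesian-prior marginalisation of the ML estimate k_j / k."""
import numpy as np
from sklearn.neighbors import NearestNeighbors


def knn_soft_outputs(X_train, y_train, X_test, k, n_classes, moderate=True):
    """Return per-class a posteriori estimates for each test point.

    moderate=False gives the maximum-likelihood estimate k_j / k, which can
    be exactly zero and so veto the product-rule combination.
    moderate=True gives the (assumed) moderated estimate (k_j + 1) / (k + C),
    which is always strictly positive.
    """
    nn = NearestNeighbors(n_neighbors=k).fit(X_train)
    _, idx = nn.kneighbors(X_test)
    probs = np.empty((len(X_test), n_classes))
    for i, neigh in enumerate(idx):
        counts = np.bincount(y_train[neigh], minlength=n_classes)  # k_j per class
        if moderate:
            probs[i] = (counts + 1) / (k + n_classes)
        else:
            probs[i] = counts / k
    return probs


def product_rule(prob_list):
    """Combine the soft outputs of several classifiers by the product rule.

    With unmoderated estimates, a single zero entry zeroes the whole product
    for that class (the veto effect); moderated estimates avoid this.
    """
    combined = np.prod(np.stack(prob_list), axis=0)
    return combined / combined.sum(axis=1, keepdims=True)  # renormalise
```

In a bagging setting of the kind tested in the paper, each bootstrap replicate of the training set would supply one entry of prob_list, and the product-rule posteriors obtained with and without moderation can then be compared directly.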

Additional information

Received: 21 March 2001, Received in revised form: 04 September 2001, Accepted: 20 September 2001

About this article

Cite this article

Alkoot, F., Kittler, J. Moderating k-NN Classifiers. Pattern Anal Appl 5, 326–332 (2002). https://doi.org/10.1007/s100440200029
