Combining One-Class Classifiers
In one-class classification, target objects must be distinguished from outlier objects. It is assumed that only information about the target class is available, while nothing is known about the outlier class. Like standard two-class classifiers, one-class classifiers hardly ever fit the data distribution perfectly. Using only the best classifier and discarding those with poorer performance may waste valuable information. To improve performance, the outputs of different classifiers (which may differ in complexity or training algorithm) can be combined. This can increase not only the performance but also the robustness of the classification. Because only information about one of the classes is present, combining one-class classifiers is more difficult than combining conventional classifiers. In this paper we investigate whether and how one-class classifiers can best be combined in a handwritten digit recognition problem.
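As a minimal illustration of the idea of combining one-class classifiers (not taken from the paper; the classifiers, data, and thresholds below are all hypothetical), the sketch trains two toy one-class models on the same target data, a Gaussian density model and a nearest-neighbour distance model, and combines their accept/reject decisions with a simple vote:

```python
import statistics

# Hypothetical 1-D target-class training data (only target examples are available).
train = [4.8, 5.0, 5.1, 4.9, 5.2, 5.05]

mu = statistics.mean(train)
sigma = statistics.stdev(train)

def gauss_score(x):
    # Gaussian one-class model: higher = more target-like (negative squared z-score).
    return -((x - mu) / sigma) ** 2

def nn_score(x):
    # Nearest-neighbour one-class model: negative distance to the closest training object.
    return -min(abs(x - t) for t in train)

def combined_accept(x, thresholds=(-9.0, -1.0)):
    # Vote combination: each classifier accepts or rejects using its own
    # (hypothetical) threshold; with two classifiers, requiring the mean vote
    # to exceed 0.5 means both must accept (an AND rule).
    votes = [gauss_score(x) >= thresholds[0], nn_score(x) >= thresholds[1]]
    return sum(votes) / len(votes) > 0.5

print(combined_accept(5.0))   # an object close to the training data
print(combined_accept(20.0))  # a far-away outlier
```

With more classifiers, the same scheme generalises to majority or weighted voting, or the raw scores can be averaged directly instead of thresholded first.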
Keywords: Target Object, Target Class, Combination Rule, Weighted Vote, Support Vector Data Description