Pruned Random Subspace Method for One-Class Classifiers
The goal of one-class classification is to distinguish the target class from all other classes using training data from the target class only. Because it is difficult for a single one-class classifier to capture all the characteristics of the target class, combining several one-class classifiers may be required. Previous research has shown that the Random Subspace Method (RSM), in which classifiers are trained on different subsets of the feature space, can be effective for one-class classifiers. In this paper we show that the performance of the RSM can be noisy, and that pruning inaccurate classifiers from the ensemble can be more effective than using all available classifiers. We propose to apply pruning to the RSM of one-class classifiers using either a supervised area under the ROC curve (AUC) criterion or an unsupervised consistency criterion. With the AUC criterion, performance may increase dramatically, while with the consistency criterion results do not improve but become more predictable.
Keywords: One-class classification · Random Subspace Method · Ensemble learning · Pruning ensembles
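The pruned-RSM idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation (the paper's experiments use the Matlab DD_tools toolbox): every member is trained on a random feature subset using only target-class data, members are ranked by AUC on a small labeled validation set, and only the top-scoring members are kept. The choice of `OneClassSVM` as base classifier, and all data and parameter values, are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import OneClassSVM
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Toy data (illustrative): target class is a Gaussian blob; outliers are uniform.
X_target = rng.normal(0.0, 1.0, size=(200, 10))
X_train = X_target[:150]                      # training uses target class only
X_val = np.vstack([X_target[150:], rng.uniform(-4, 4, size=(50, 10))])
y_val = np.array([1] * 50 + [0] * 50)         # labels used only for pruning

n_members, subspace_dim, keep = 20, 4, 5      # illustrative ensemble settings

members = []
for _ in range(n_members):
    # Random Subspace Method: each member sees a random feature subset.
    feats = rng.choice(X_train.shape[1], size=subspace_dim, replace=False)
    clf = OneClassSVM(gamma="scale", nu=0.1).fit(X_train[:, feats])
    # score_samples: higher means more target-like.
    auc = roc_auc_score(y_val, clf.score_samples(X_val[:, feats]))
    members.append((auc, feats, clf))

# Pruning with the supervised AUC criterion: keep the `keep` best members.
members.sort(key=lambda m: m[0], reverse=True)
pruned = members[:keep]

def ensemble_score(X):
    """Mean target-likeness score over the pruned ensemble."""
    return np.mean(
        [clf.score_samples(X[:, feats]) for _, feats, clf in pruned], axis=0
    )

auc_pruned = roc_auc_score(y_val, ensemble_score(X_val))
print(f"pruned-ensemble AUC on validation: {auc_pruned:.2f}")
```

Note that scoring the pruned ensemble on the same validation set used for pruning is optimistic; a proper evaluation would hold out a separate test set, as in the paper's experiments.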