Bagging, Random Subspace Method and Biding
In recent years, many approaches that achieve high performance by combining multiple classifiers have been proposed. Bagging exploits random replicates of the training samples, while the random subspace method uses randomly chosen feature subsets. In this paper, we introduce a method that selects both samples and features at the same time and demonstrate its effectiveness. The method includes a parametric bagging and a parametric random subspace method as special cases. In our experiments, this method and the parametric random subspace method showed the best performance.
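The combination described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: each base learner is trained on a bootstrap sample of the rows (as in bagging) and a random subset of the columns (as in the random subspace method), and the ensemble predicts by majority vote. The nearest-centroid base learner, the function name `ensemble_predict`, and the fraction parameters are assumptions chosen for brevity; the paper's experiments use different base classifiers (e.g. support vector machines).

```python
import numpy as np

def ensemble_predict(X_train, y_train, X_test, n_estimators=25,
                     sample_frac=0.7, feature_frac=0.7, seed=0):
    """Sketch of combined bagging + random subspace selection.

    Each base learner sees a bootstrap sample of rows AND a random
    subset of features; predictions are combined by majority vote.
    A nearest-centroid classifier stands in as the base learner.
    """
    rng = np.random.default_rng(seed)
    n, d = X_train.shape
    classes = np.unique(y_train)
    votes = np.zeros((X_test.shape[0], classes.size))
    for _ in range(n_estimators):
        # bagging step: bootstrap sample of the training rows
        rows = rng.choice(n, size=int(sample_frac * n), replace=True)
        # random subspace step: random subset of the features
        cols = rng.choice(d, size=max(1, int(feature_frac * d)), replace=False)
        Xs, ys = X_train[np.ix_(rows, cols)], y_train[rows]
        # class centroids on the selected rows/columns; a class absent
        # from the bootstrap sample gets an unreachable (infinite) centroid
        centroids = np.stack([
            Xs[ys == c].mean(axis=0) if np.any(ys == c)
            else np.full(cols.size, np.inf)
            for c in classes
        ])
        dists = np.linalg.norm(
            X_test[:, cols][:, None, :] - centroids[None, :, :], axis=2)
        pred = dists.argmin(axis=1)  # index into `classes`
        votes[np.arange(X_test.shape[0]), pred] += 1
    # majority vote over the ensemble
    return classes[votes.argmax(axis=1)]
```

Setting `sample_frac` to 1.0 with full features recovers plain bagging, while disabling the bootstrap and varying only `feature_frac` recovers the random subspace method, mirroring the special cases mentioned in the abstract.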
Keywords: Support Vector Machine; Recognition Rate; Majority Vote; Feature Subset; True Label