Parallel Perceptrons, Activation Margins and Imbalanced Training Set Pruning
A natural way to deal with training samples in imbalanced-class problems is to prune them, removing redundant patterns, which are easy to classify and probably over-represented, and label-noisy patterns, which belong to one class but are labelled as members of another. This allows classifier construction to focus on borderline patterns, likely to be the most informative ones. To appropriately define these subsets, in this work we will use as base classifiers the so-called parallel perceptrons, a novel approach to committee machine training that, among other things, makes it possible to define natural margins for hidden-unit activations. We shall use these margins to characterize the above pattern types and to iteratively select subsamples of an initial training set that enhance classification accuracy and yield balanced classifier performance even when class sizes are greatly different.
Keywords: Near Neighbor · Minority Class · Activation Margin · Negative Pattern · Positive Pattern
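The margin-based pruning idea can be made concrete with a small sketch. The snippet below is a minimal illustration, not the paper's actual parallel-perceptron procedure: it substitutes a single linear perceptron for the committee, and the thresholds `t_redundant` and `t_noisy`, the synthetic data, and the number of pruning rounds are assumptions made for the example. Patterns whose normalized activation margin confidently agrees with their label are treated as redundant, those whose margin strongly contradicts it as label noise, and only the borderline remainder is kept for the next training round.

```python
# Minimal sketch of margin-based training-set pruning (illustrative only;
# the paper uses parallel perceptron committees, not a single perceptron).
import numpy as np

def train_perceptron(X, y, epochs=50, lr=0.1):
    """Plain perceptron training; labels y must be in {-1, +1}."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (xi @ w + b) <= 0:   # misclassified: update
                w += lr * yi * xi
                b += lr * yi
    return w, b

def prune_by_margin(X, y, w, b, t_redundant=1.0, t_noisy=1.0):
    """Keep borderline patterns; drop confidently correct (redundant)
    and confidently wrong (label-noisy) ones. Thresholds are assumed."""
    act = X @ w + b
    # signed distance to the decision boundary, positive when correct
    margin = y * act / (np.linalg.norm(w) + 1e-12)
    redundant = margin > t_redundant      # easy, probably over-represented
    noisy = margin < -t_noisy             # label strongly contradicted
    keep = ~(redundant | noisy)
    return X[keep], y[keep]

# Usage: iterate train -> prune on a synthetic imbalanced set (30 vs. 300).
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(1.0, 1.0, (30, 2)),     # minority class
               rng.normal(-1.0, 1.0, (300, 2))])  # majority class
y = np.array([1] * 30 + [-1] * 300)

for _ in range(3):                        # a few pruning rounds
    w, b = train_perceptron(X, y)
    X, y = prune_by_margin(X, y, w, b)
print(f"kept {len(y)} patterns: {np.sum(y == 1)} positive, {np.sum(y == -1)} negative")
```

Because easy majority-class patterns accumulate large positive margins, they are pruned first, which is what rebalances the retained training set across rounds.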