Abstract
Multiple classifier systems were originally proposed for supervised classification tasks, and few works have dealt with semi-supervised multiple classifiers. However, there are important pattern recognition applications, such as multi-sensor remote sensing and multi-modal biometrics, that demand semi-supervised multiple classifier systems able to exploit both labelled and unlabelled data. In this paper, the use of two well-known semi-supervised learning methods, namely co-training and self-training, in multiple classifier systems is investigated experimentally. Results reported on benchmark data sets show that co-training and self-training allow unlabelled data to be exploited in different types of multiple classifier systems.
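For concreteness, the sketch below illustrates the two learning schemes on a toy problem. It is a minimal illustration, not the experimental setup of the paper: the base classifier (GaussianNB), the 0.95 confidence threshold, and the split of the features into two "views" for co-training are assumptions made for the example.

```python
# Minimal sketch of self-training and co-training with
# scikit-learn-style classifiers. NOT the authors' code: the base
# learner, threshold, and feature "views" are illustrative choices.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.naive_bayes import GaussianNB


def self_train(clf, X_lab, y_lab, X_unlab, threshold=0.95, rounds=10):
    """Self-training: the classifier labels the unlabelled pool and
    its most confident predictions are added to the training set."""
    for _ in range(rounds):
        if len(X_unlab) == 0:
            break
        clf.fit(X_lab, y_lab)
        proba = clf.predict_proba(X_unlab)
        conf = proba.max(axis=1) >= threshold
        if not conf.any():
            break
        pseudo = clf.classes_[proba[conf].argmax(axis=1)]
        X_lab = np.vstack([X_lab, X_unlab[conf]])
        y_lab = np.concatenate([y_lab, pseudo])
        X_unlab = X_unlab[~conf]
    return clf.fit(X_lab, y_lab)


def co_train(c1, c2, X1, X2, y, U1, U2, threshold=0.95, rounds=10):
    """Co-training: two classifiers, each trained on its own feature
    view, pseudo-label confident examples for the shared set."""
    for _ in range(rounds):
        if len(U1) == 0:
            break
        c1.fit(X1, y)
        c2.fit(X2, y)
        p1, p2 = c1.predict_proba(U1), c2.predict_proba(U2)
        conf = (p1.max(axis=1) >= threshold) | (p2.max(axis=1) >= threshold)
        if not conf.any():
            break
        # label each selected example with the more confident view
        use1 = (p1.max(axis=1) >= p2.max(axis=1))[conf]
        pseudo = np.where(use1,
                          c1.classes_[p1[conf].argmax(axis=1)],
                          c2.classes_[p2[conf].argmax(axis=1)])
        X1, X2 = np.vstack([X1, U1[conf]]), np.vstack([X2, U2[conf]])
        y = np.concatenate([y, pseudo])
        U1, U2 = U1[~conf], U2[~conf]
    return c1.fit(X1, y), c2.fit(X2, y)


# Toy run: 40 labelled examples, the rest unlabelled.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)
model = self_train(GaussianNB(), X[:40], y[:40], X[40:])
V1, V2 = X[:, :5], X[:, 5:]  # two feature "views" for co-training
m1, m2 = co_train(GaussianNB(), GaussianNB(),
                  V1[:40], V2[:40], y[:40], V1[40:], V2[40:])
```

In a multiple classifier system, each ensemble member would roughly play the role of `clf` in self-training, or of `c1`/`c2` in co-training, with the growing pseudo-labelled set shared across the ensemble.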
Keywords
- Feature Subset
- Multiple Classifier
- Classifier Ensemble
- Multiple Classifier System
- Supervised Classification Task
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Didaci, L., Roli, F. (2006). Using Co-training and Self-training in Semi-supervised Multiple Classifier Systems. In: Yeung, D.Y., Kwok, J.T., Fred, A., Roli, F., de Ridder, D. (eds) Structural, Syntactic, and Statistical Pattern Recognition. SSPR/SPR 2006. Lecture Notes in Computer Science, vol. 4109. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11815921_57
DOI: https://doi.org/10.1007/11815921_57
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-37236-3
Online ISBN: 978-3-540-37241-7