Comparing Block Ensembles for Data Streams with Concept Drift
Three block-based ensembles, AWE, BWE and ACE, are compared in the context of learning from data streams with concept drift. AWE updates the ensemble after processing each successive block of incoming examples, while BWE and ACE are additionally extended with drift detectors. Experiments show that these extensions improve classification accuracy, in particular for sudden changes occurring within a block, and reduce computational costs.
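The block-based update described above can be illustrated with a minimal sketch. This is not the authors' exact algorithm; the class names, the toy majority-class base learner, and the accuracy-based weighting are illustrative assumptions, loosely following the AWE scheme of training a candidate on each block and keeping the k best-weighted members:

```python
from collections import Counter

class MajorityClassLearner:
    """Toy base learner (assumption): predicts the majority class of its block."""
    def fit(self, X, y):
        self.label = Counter(y).most_common(1)[0][0]
        return self

    def predict(self, x):
        return self.label

class BlockEnsemble:
    """Sketch of an AWE-style block ensemble: after each block, train a
    candidate classifier, weight every member by its accuracy on the newest
    block, and retain only the k best members."""
    def __init__(self, k=5, make_learner=MajorityClassLearner):
        self.k = k
        self.make_learner = make_learner
        self.members = []  # list of (weight, classifier) pairs

    def _accuracy(self, clf, X, y):
        return sum(clf.predict(x) == t for x, t in zip(X, y)) / len(y)

    def process_block(self, X, y):
        # Train a candidate on the new block, then re-weight all members
        # (including the candidate) by accuracy on that block.
        candidate = self.make_learner().fit(X, y)
        pool = [clf for _, clf in self.members] + [candidate]
        scored = [(self._accuracy(clf, X, y), clf) for clf in pool]
        scored.sort(key=lambda wc: wc[0], reverse=True)
        self.members = scored[:self.k]

    def predict(self, x):
        # Weighted vote across ensemble members.
        votes = Counter()
        for w, clf in self.members:
            votes[clf.predict(x)] += w
        return votes.most_common(1)[0][0]
```

After a sudden drift, members trained on pre-drift blocks score poorly on the newest block and lose voting weight, so the ensemble adapts within one block; the drift detectors discussed in the abstract aim to react even faster, inside the block.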