Online Non-stationary Boosting
Oza’s Online Boosting algorithm provides a version of AdaBoost which can be trained in an online way for stationary problems. One perspective is that this enables the power of the boosting framework to be applied to datasets which are too large to fit into memory. The online boosting algorithm assumes that examples are drawn independently and identically distributed (i.i.d.) from a fixed distribution and therefore has no provision for concept drift. We present an algorithm called Online Non-Stationary Boosting (ONSBoost) that, like Online Boosting, uses a static ensemble size without generating new members each time new examples are presented, but which also adapts to a changing data distribution. We evaluate the new algorithm against Online Boosting, using the STAGGER dataset and three challenging datasets derived from a learning problem inside a parallelising virtual machine. We find that the new algorithm provides equivalent performance on the STAGGER dataset and an improvement of up to 3% on the parallelisation datasets.
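For context, the sketch below illustrates the per-example update commonly described for Oza’s Online Boosting, the baseline against which ONSBoost is evaluated. The class name, the `partial_fit`/`predict` base-learner interface and the helper `poisson` sampler are illustrative assumptions rather than code taken from the paper.

```python
# Illustrative sketch of Oza-style Online Boosting (not the paper's ONSBoost).
# Base learners are assumed to expose partial_fit(x, y) and predict(x).
import math
import random


def poisson(lam):
    # Knuth's method for sampling from a Poisson(lam) distribution.
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1


class OnlineBoosting:
    """Fixed-size online boosting ensemble in the style of Oza (2001)."""

    def __init__(self, base_learner_factory, n_models):
        self.models = [base_learner_factory() for _ in range(n_models)]
        self.lambda_correct = [0.0] * n_models  # weight mass classified correctly
        self.lambda_wrong = [0.0] * n_models    # weight mass misclassified

    def update(self, x, y):
        lam = 1.0  # weight of the incoming example
        for m, model in enumerate(self.models):
            # Poisson(lam) controls how often this member trains on the example,
            # approximating AdaBoost's re-weighting in a single streaming pass.
            for _ in range(poisson(lam)):
                model.partial_fit(x, y)
            if model.predict(x) == y:
                self.lambda_correct[m] += lam
                total = self.lambda_correct[m] + self.lambda_wrong[m]
                lam *= total / (2.0 * self.lambda_correct[m])  # down-weight easy examples
            else:
                self.lambda_wrong[m] += lam
                total = self.lambda_correct[m] + self.lambda_wrong[m]
                lam *= total / (2.0 * self.lambda_wrong[m])    # up-weight hard examples

    def predict(self, x):
        # Weighted vote, with each member weighted by log((1 - error) / error).
        votes = {}
        for m, model in enumerate(self.models):
            total = self.lambda_correct[m] + self.lambda_wrong[m]
            if total == 0.0:
                continue
            error = self.lambda_wrong[m] / total
            if error == 0.0 or error >= 0.5:
                continue
            weight = math.log((1.0 - error) / error)
            label = model.predict(x)
            votes[label] = votes.get(label, 0.0) + weight
        return max(votes, key=votes.get) if votes else None
```

The Poisson sampling is what lets a fixed-size ensemble mimic AdaBoost’s example re-weighting without storing the stream; ONSBoost keeps this fixed ensemble size while additionally adapting the ensemble as the data distribution changes.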
References
- 3. Freund, Y., Schapire, R.: Experiments with a new boosting algorithm. In: Proc. of the Thirteenth International Conference on Machine Learning, pp. 148–156 (1996)
- 4. Kuncheva, L.: Classifier ensembles for changing environments. In: Roli, F., Kittler, J., Windeatt, T. (eds.) MCS 2004. LNCS, vol. 3077, pp. 1–15. Springer, Heidelberg (2004)
- 5. Kuncheva, L.: Classifier ensembles for detecting concept change in streaming data: Overview and perspectives. In: Proc. of the 2nd Workshop SUEMA 2008, ECAI 2008 (2008)
- 6. Li, S., Zhang, Z., Shum, H., Zhang, H.: FloatBoost learning for classification. In: Advances in Neural Information Processing Systems, pp. 1017–1024 (2003)
- 8. Oza, N.C.: Online Ensemble Learning. PhD thesis, University of California, Berkeley, CA (September 2001)
- 10. Scholz, M., Klinkenberg, R.: Boosting classifiers for drifting concepts. Intelligent Data Analysis 11, 3–28 (2007)
- 11. Singer, J., Pocock, A., Yiapanis, P., Brown, G., Luján, M.: Fundamental nano-patterns to characterize and classify Java methods. In: Proc. Workshop on Language Descriptions, Tools and Applications (2009)
- 12. Widmer, G., Kubat, M.: Learning in the presence of concept drift and hidden contexts. Machine Learning 23, 69–101 (1996)