A Dynamic Logistic Multiple Classifier System for Online Classification

  • Amber Tomas
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6713)


We consider the problem of online classification in nonstationary environments. Specifically, we take a Bayesian approach to sequential parameter estimation of a logistic multiple classifier system (MCS), and compare this method with other algorithms for nonstationary classification. We also comment on several design considerations.
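As context for the approach the abstract describes, the sketch below shows what sequential Monte Carlo (particle-filter) estimation of a dynamic logistic model might look like: combining-weights follow a random walk to model nonstationarity, and each incoming label reweights and resamples the particles. This is an illustrative sketch only, not the paper's implementation; all function and parameter names (`smc_dynamic_logistic`, `drift_sd`, etc.) are invented for this example.

```python
import math
import random

def sigmoid(z):
    # Numerically safe logistic function.
    if z < -30.0:
        return 1e-13
    if z > 30.0:
        return 1.0 - 1e-13
    return 1.0 / (1.0 + math.exp(-z))

def smc_dynamic_logistic(stream, n_particles=500, drift_sd=0.05, seed=0):
    """Online dynamic logistic regression via a simple particle filter.

    `stream` yields (x, y) pairs, where x is a vector of inputs (e.g. base
    classifier outputs in an MCS) and y is a label in {0, 1}.
    Returns the prequential (predict-then-update) error rate.
    """
    rng = random.Random(seed)
    particles, weights = None, None
    errors, total = 0, 0
    for x, y in stream:
        if particles is None:
            d = len(x)
            # Prior: independent standard normal combining weights.
            particles = [[rng.gauss(0.0, 1.0) for _ in range(d)]
                         for _ in range(n_particles)]
            weights = [1.0 / n_particles] * n_particles
        # 1. Propagate: random-walk drift models a changing decision boundary.
        particles = [[wj + rng.gauss(0.0, drift_sd) for wj in p]
                     for p in particles]
        # 2. Predict before the label is revealed (sequential error rate).
        probs = [sigmoid(sum(wj * xj for wj, xj in zip(p, x)))
                 for p in particles]
        p_hat = sum(w * q for w, q in zip(weights, probs))
        errors += int((p_hat >= 0.5) != (y == 1))
        total += 1
        # 3. Reweight each particle by the logistic likelihood of y.
        liks = [q if y == 1 else 1.0 - q for q in probs]
        weights = [w * l for w, l in zip(weights, liks)]
        s = sum(weights)
        weights = ([w / s for w in weights] if s > 0
                   else [1.0 / n_particles] * n_particles)
        # 4. Multinomial resampling to avoid weight degeneracy.
        idx = rng.choices(range(n_particles), weights=weights, k=n_particles)
        particles = [list(particles[i]) for i in idx]
        weights = [1.0 / n_particles] * n_particles
    return errors / total

def drifting_stream(n=300, seed=1):
    # Toy nonstationary stream: the true weight vector drifts slowly.
    rng = random.Random(seed)
    w = [2.0, -1.0]
    for _ in range(n):
        x = [rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)]
        y = int(sum(wj * xj for wj, xj in zip(w, x)) > 0.0)
        yield x, y
        w = [wj + rng.gauss(0.0, 0.02) for wj in w]

err = smc_dynamic_logistic(drifting_stream())
```

On the toy drifting stream the filter tracks the moving boundary, so the prequential error rate stays well below chance; the random-walk variance `drift_sd` plays the role of a forgetting factor, trading off reactivity to change against estimation noise.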


Keywords: Decision Boundary · Dynamic Logistic · Sequential Monte Carlo · Average Error Rate · Bayesian Forecast
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.





Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Amber Tomas
  1. Department of Statistics, The University of Oxford, UK
