Incremental Learning of New Classes in Unbalanced Datasets: Learn++.UDNC

  • Gregory Ditzler
  • Michael D. Muhlbaier
  • Robi Polikar
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5997)

Abstract

We have previously described an incremental learning algorithm, Learn++.NC, for learning from new datasets that may include new concept classes without accessing previously seen data. We now propose an extension, Learn++.UDNC, that allows the algorithm to incrementally learn new concept classes from unbalanced datasets. We describe the algorithm in detail and provide experimental results on two representative scenarios (on synthetic as well as real-world data), along with comparisons to other approaches for incremental and/or unbalanced-data learning.
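The core idea the abstract describes — growing an ensemble one batch at a time, with each member voting only on the classes it has actually seen — can be sketched in a few lines. The class names (`NearestCentroid`, `IncrementalEnsemble`), the choice of base learner, and the simple distance-based confidence weighting below are all illustrative assumptions, not the paper's actual method: the real Learn++.UDNC uses a dynamically weighted consult-and-vote scheme with an explicit correction for class imbalance.

```python
import numpy as np

class NearestCentroid:
    """Tiny illustrative base learner: classify by the nearest class centroid."""
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.stack([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def predict_with_conf(self, X):
        # Euclidean distance to every centroid this learner knows about
        d = np.sqrt(((X[:, None, :] - self.centroids_[None, :, :]) ** 2).sum(axis=2))
        j = d.argmin(axis=1)
        labels = self.classes_[j]
        conf = 1.0 / (1.0 + d[np.arange(len(X)), j])  # closer -> more confident
        return labels, conf

class IncrementalEnsemble:
    """Hypothetical sketch of incremental class learning: each new data batch
    trains one new base learner, previously seen data is never revisited, and a
    learner can only vote for classes it has actually seen. Votes are scaled by
    a simple distance-based confidence, a stand-in for the paper's dynamically
    weighted consult-and-vote (which additionally corrects for unbalanced
    class representation)."""
    def __init__(self):
        self.learners = []

    def partial_fit(self, X, y):
        self.learners.append(NearestCentroid().fit(X, y))
        return self

    def predict(self, X):
        # The ensemble's label set is the union of what the members have seen
        classes = sorted({c for l in self.learners for c in l.classes_})
        idx = {c: i for i, c in enumerate(classes)}
        votes = np.zeros((len(X), len(classes)))
        for learner in self.learners:
            labels, conf = learner.predict_with_conf(X)
            for i, (lab, w) in enumerate(zip(labels, conf)):
                votes[i, idx[lab]] += w
        return np.array([classes[j] for j in votes.argmax(axis=1)])
```

For example, a first batch containing only classes 0 and 1 and a later batch containing only the new class 2 yield an ensemble that recognizes all three classes, even though no single member ever saw them all.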

Keywords

Incremental Learning · Ensembles of Classifiers · Learn++ · Unbalanced Data


Copyright information

© Springer-Verlag Berlin Heidelberg 2010

Authors and Affiliations

  • Gregory Ditzler ¹
  • Michael D. Muhlbaier ¹
  • Robi Polikar ¹
  1. Signal Processing and Pattern Recognition Laboratory, Electrical and Computer Engineering, Rowan University, Glassboro, USA
