
Central-Diffused Instance Generation Method in Class Incremental Learning

  • Conference paper

Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning (ICANN 2019)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 11728)

Abstract

Class incremental learning is widely applied in classification scenarios because the number of classes often changes dynamically. Moreover, class imbalance typically arises in class incremental learning when a new class emerges. Previous studies have proposed various methods to handle this problem, but they focus on classification tasks with a fixed class set and cannot adjust the peripheral contour features of the original instance distribution. As a result, classification performance degrades severely in an open, dynamic environment, and the synthetic instances remain clustered within the original distribution. To address class imbalance effectively in class incremental learning, we propose a Central-diffused Instance Generation Method, called CdIGM, which generates instances of the minority class as a new class emerges. The key idea is to randomly shoot direction vectors of fixed length from the center of the new-class instances to expand the instance distribution space. The vectors diffuse to form a distribution that is optimized to satisfy properties yielding a multi-class discriminative classifier with good performance. We conduct experiments on both artificial data streams with different imbalance rates and real-world ones, comparing CdIGM with other proposed methods, e.g. SMOTE, OPCIL, OB and SDCIL. The results show that CdIGM achieves average performance improvements of more than 4.01%, 4.49%, 8.81% and 9.76% over SMOTE, OPCIL, OB and SDCIL, respectively, and outperforms them in terms of overall and real-time accuracy. Our method thus combines the strengths of class incremental learning and class imbalance learning with good accuracy and robustness.
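As a concrete illustration of the generation step described above, the following minimal Python sketch (our own reconstruction under stated assumptions, not the authors' released code; the function name central_diffused_generation and the step_length parameter are hypothetical) draws random unit direction vectors from the centroid of the new-class instances and places synthetic instances a fixed length along them:

    import numpy as np

    def central_diffused_generation(new_class_X, n_synthetic, step_length, seed=None):
        # Shoot random unit direction vectors of fixed length from the
        # center of the new-class instances and return synthetic
        # instances placed at the endpoints of those vectors.
        rng = np.random.default_rng(seed)
        center = new_class_X.mean(axis=0)          # center of the new-class instances
        dims = new_class_X.shape[1]
        # Gaussian samples normalized to unit length give uniformly
        # random directions on the sphere.
        directions = rng.standard_normal((n_synthetic, dims))
        directions /= np.linalg.norm(directions, axis=1, keepdims=True)
        return center + step_length * directions   # diffused synthetic instances

    # Example: expand a three-instance new class in two dimensions.
    X_new = np.array([[0.9, 1.1], [1.1, 0.9], [1.0, 1.0]])
    synthetic = central_diffused_generation(X_new, n_synthetic=50, step_length=0.5, seed=0)

The sketch covers only the central-diffusion sampling itself; in the paper the resulting distribution is additionally optimized to yield a well-performing multi-class discriminative classifier.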


References

  1. Chawla, N.V., Bowyer, K.W., Hall, L.O., Kegelmeyer, W.P.: SMOTE: synthetic minority over-sampling technique. J. Artif. Intell. Res. 16(1), 321–357 (2002). https://doi.org/10.1613/jair.953

  2. Dong, Q., Gong, S., Zhu, X.: Imbalanced deep learning by minority class incremental rectification. IEEE Trans. Pattern Anal. Mach. Intell. PP(99), 1 (2018). https://doi.org/10.1109/tpami.2018.2832629

  3. Huang, C., Li, Y., Chen, C.L., Tang, X.: Learning deep representation for imbalanced classification. In: IEEE Conference on Computer Vision and Pattern Recognition (2016). https://doi.org/10.1109/cvpr.2016.580

  4. Khan, S.S., Madden, M.G.: One-class classification: taxonomy of study and review of techniques. Knowl. Eng. Rev. 29(3), 345–374 (2014). https://doi.org/10.1017/s026988891300043x

  5. Khoshgoftaar, T.M., Golawala, M., Hulse, J.V.: An empirical study of learning from imbalanced data using random forest. In: IEEE International Conference on Tools with Artificial Intelligence, pp. 310–317 (2007). https://doi.org/10.1109/ictai.2007.46

  6. Kuzborskij, I., Orabona, F., Caputo, B.: From N to N+1: multiclass transfer incremental learning. In: IEEE Conference on Computer Vision and Pattern Recognition, pp. 3358–3365 (2013). https://doi.org/10.1109/cvpr.2013.431

  7. Lake, B.M., Salakhutdinov, R., Tenenbaum, J.B.: One-shot learning by inverting a compositional causal process. In: International Conference on Neural Information Processing Systems (2013)

  8. Fei-Fei, L., Fergus, R., Perona, P.: One-shot learning of object categories. IEEE Trans. Pattern Anal. Mach. Intell. 28(4), 594–611 (2006). https://doi.org/10.1109/tpami.2006.79

  9. Mu, X., Ting, K.M., Zhou, Z.H.: Classification under streaming emerging new classes: a solution using completely-random trees. IEEE Trans. Knowl. Data Eng. 29(8), 1605–1618 (2017). https://doi.org/10.1109/tkde.2017.2691702

  10. Mu, X., Zhu, F., Du, J., Lim, E.P., Zhou, Z.H.: Streaming classification with emerging new class by class matrix sketching. In: AAAI, pp. 2373–2379 (2017)

  11. Muhlbaier, M.D., Topalis, A., Polikar, R.: Learn++.NC: combining ensemble of classifiers with dynamically weighted consult-and-vote for efficient incremental learning of new classes. IEEE Trans. Neural Netw. 20(1), 152–168 (2009). https://doi.org/10.1109/tnn.2008.2008326

  12. Pruengkarn, R., Wong, K.W., Fung, C.C.: Imbalanced data classification using complementary fuzzy support vector machine techniques and SMOTE. In: IEEE International Conference on Systems, Man, and Cybernetics (2017). https://doi.org/10.1109/smc.2017.8122737

  13. Rabaoui, A., Davy, M., Rossignol, S., Lachiri, Z.: Improved one-class SVM classifier for sounds classification. In: IEEE Conference on Advanced Video and Signal Based Surveillance (AVSS 2007), pp. 117–122 (2007). https://doi.org/10.1109/avss.2007.4425296

  14. Sebastiani, F.: Machine learning in automated text categorization. ACM Comput. Surv. 34(1), 1–47 (2002). https://doi.org/10.1145/505282.505283

  15. Sheng, C., He, H., Kang, L., Desai, S.: MuSeRA: multiple selectively recursive approach towards imbalanced stream data mining. In: International Joint Conference on Neural Networks (2010). https://doi.org/10.1109/ijcnn.2010.5596538

  16. Wang, S., Minku, L.L., Yao, X.: A learning framework for online class imbalance learning. In: IEEE Symposium on Computational Intelligence and Ensemble Learning (2013). https://doi.org/10.1109/ciel.2013.6613138

  17. Wang, Y., Li, S.: Research and performance evaluation of data replication technology in distributed storage systems. Comput. Math. Appl. 51(11), 1625–1632 (2006). https://doi.org/10.1016/j.camwa.2006.05.002

  18. Wang, Y., Li, X., Li, X., Wang, Y.: A survey of queries over uncertain data. Knowl. Inf. Syst. 37(3), 485–530 (2013). https://doi.org/10.1007/s10115-013-0638-6

  19. Wang, Y., Ma, X.: A general scalable and elastic content-based publish/subscribe service. IEEE Trans. Parallel Distrib. Syst. 26(8), 2100–2113 (2015). https://doi.org/10.1109/tpds.2014.2346759

  20. Wang, Y., Pei, X., Ma, X., Xu, F.: TA-update: an adaptive update scheme with tree-structured transmission in erasure-coded storage systems. IEEE Trans. Parallel Distrib. Syst. 29(8), 1893–1906 (2018). https://doi.org/10.1109/tpds.2017.2717981

  21. Yao, C., Zou, J., Luo, Y., Li, T., Bai, G.: A class-incremental learning method based on one class support vector machine. J. Phys.: Conf. Ser. 1267, 012007 (2019). https://doi.org/10.1088/1742-6596/1267/1/012007

  22. Zhu, Y., Ting, K.M., Zhou, Z.H.: New class adaptation via instance generation in one-pass class incremental learning. In: IEEE International Conference on Data Mining, pp. 1207–1212 (2017). https://doi.org/10.1109/icdm.2017.163

  23. Zhu, Y., Ting, K.M., Zhou, Z.H.: Multi-label learning with emerging new labels. In: IEEE International Conference on Data Mining, pp. 1371–1376 (2016). https://doi.org/10.1109/icdm.2016.0188


Acknowledgements

This work is supported by the National Key Research and Development Program of China (2016YFB1000101), the National Natural Science Foundation of China (Grant No. 61379052), the Science Foundation of Ministry of Education of China (Grant No. 2018A02002), and the Natural Science Foundation for Distinguished Young Scholars of Hunan Province (Grant No. 14JJ1026). The corresponding author is Yijie Wang and her email is wangyijie@nudt.edu.cn.

Author information

Correspondence to Yijie Wang.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Liu, M., Wang, Y. (2019). Central-Diffused Instance Generation Method in Class Incremental Learning. In: Tetko, I., Kůrková, V., Karpov, P., Theis, F. (eds) Artificial Neural Networks and Machine Learning – ICANN 2019: Deep Learning. ICANN 2019. Lecture Notes in Computer Science, vol 11728. Springer, Cham. https://doi.org/10.1007/978-3-030-30484-3_37


  • DOI: https://doi.org/10.1007/978-3-030-30484-3_37


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-30483-6

  • Online ISBN: 978-3-030-30484-3

  • eBook Packages: Computer Science (R0)
