
Learning a dual-branch classifier for class incremental learning

Published in: Applied Intelligence

Abstract

Catastrophic forgetting is a central challenge in class incremental learning, caused both by the learning of new knowledge and by the data imbalance between old and new classes. To alleviate this challenge, we propose a class incremental learning method with a dual-branch classifier. First, inspired by ensemble learning, the proposed method constructs a dual network consisting of two complementary branches to mitigate the impact of data imbalance. Second, an activation transfer loss is employed to reduce catastrophic forgetting at the level of feature representation, preserving the feature separability of old classes. Third, we use the nearest class mean classifier, which is naturally suited to this setting, for classification. Moreover, we formulate an end-to-end training algorithm for the feature extractor and classifier to improve the compatibility between the two modules. Extensive evaluation shows that the proposed method achieves strong incremental recognition ability with less training time. An ablation study further demonstrates the importance and necessity of the dual-branch structure, end-to-end training, and the activation transfer loss.
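The nearest class mean (NCM) classifier mentioned in the abstract admits a compact sketch: each class is summarized by the mean of its feature vectors, and a test sample is assigned to the nearest mean. The sketch below is illustrative only; the class name, the use of plain Euclidean distance, and operating directly on raw feature arrays are assumptions, since in the paper the classifier operates on features produced by the trained dual-branch extractor.

```python
import numpy as np

class NearestClassMeanClassifier:
    """Nearest-class-mean classification: each class is represented by
    the mean of its feature vectors; a test sample is assigned to the
    class whose mean is closest in Euclidean distance."""

    def __init__(self):
        self.class_means = {}  # label -> mean feature vector

    def fit(self, features, labels):
        # features: (n_samples, dim) array of extracted features
        # labels:   (n_samples,) array of integer class labels
        for label in np.unique(labels):
            self.class_means[int(label)] = features[labels == label].mean(axis=0)

    def predict(self, features):
        labels = sorted(self.class_means)
        means = np.stack([self.class_means[l] for l in labels])  # (n_classes, dim)
        # Distance from every sample to every class mean, then argmin per sample
        dists = np.linalg.norm(features[:, None, :] - means[None, :, :], axis=2)
        return np.array(labels)[dists.argmin(axis=1)]

# Toy usage: two well-separated clusters
clf = NearestClassMeanClassifier()
X = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
y = np.array([0, 0, 1, 1])
clf.fit(X, y)
pred = clf.predict(np.array([[0.0, 0.5], [10.0, 10.5]]))  # -> [0, 1]
```

One appeal of NCM in incremental settings is that adding a new class only requires computing one new mean, without retraining the classifier head.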



Acknowledgements

This research was supported by the Key Research and Development Plan of Shanxi Province under Grants 201803D421039 and 201903D121143, the National Natural Science Foundation of China under Grant 61973226, and the Special Project for Transformation and Guidance of Scientific and Technological Achievements of Shanxi Province under Grant 201904D131023.

Author information


Corresponding author

Correspondence to Gang Xie.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Guo, L., Xie, G., Qu, Y. et al. Learning a dual-branch classifier for class incremental learning. Appl Intell 53, 4316–4326 (2023). https://doi.org/10.1007/s10489-022-03556-7
