A Simple Method to Directly Determine Novel Class Weight of CNN for Generalized Few-Shot Learning

Conference paper
Pattern Recognition (ACPR 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 14406)


Abstract

This paper addresses the problem of machine learning for image classification in which a convolutional neural network (CNN) that has already learned a set of base classes must learn novel classes from a small number of examples. Machine learning from a small number of examples is called few-shot learning (FSL). FSL that evaluates classification performance on both base and novel classes in their joint label space is a challenging problem called generalized few-shot learning (GFSL). We propose a simple GFSL method for image classification that adds an output unit for a novel class to the final layer of a CNN already trained on the base classes and directly sets the weight vector of that output unit. Compared with existing methods, the proposed method has the advantage that the norms of the final-layer weight vectors need not be normalized while training on the base classes. We demonstrate the effectiveness of the proposed method on miniImageNet and tieredImageNet, two representative benchmark datasets for FSL.
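As a concrete illustration of the idea of appending an output unit to a base-trained CNN and writing its weight vector directly, here is a minimal PyTorch-style sketch. The abstract does not state the exact weight-setting rule, so the sketch assumes a weight-imprinting-style rule (mean support embedding rescaled to the average norm of the base-class weights); that rule, and all names and shapes below, are illustrative assumptions rather than the paper's method.

import torch
import torch.nn as nn
import torchvision.models as models

# Backbone assumed to have been trained on the base classes
# (randomly initialized here purely as a placeholder).
num_base_classes = 64
backbone = models.resnet18(num_classes=num_base_classes)

def add_novel_class(model, support_images):
    # Append one output unit for a novel class and set its weight directly.
    old_fc = model.fc
    model.fc = nn.Identity()                     # expose penultimate-layer embeddings
    model.eval()
    with torch.no_grad():
        feats = model(support_images)            # (k_shot, feat_dim)
        proto = feats.mean(dim=0)                # mean support embedding
        # Assumed rescaling (not from the paper): match the average norm of the
        # base-class weight vectors so the new unit scores on a comparable scale.
        base_norm = old_fc.weight.norm(dim=1).mean()
        novel_w = proto / proto.norm() * base_norm

    new_fc = nn.Linear(old_fc.in_features, old_fc.out_features + 1,
                       bias=old_fc.bias is not None)
    with torch.no_grad():
        new_fc.weight[:-1] = old_fc.weight       # keep base-class weights intact
        new_fc.weight[-1] = novel_w              # directly set the novel-class weight
        if old_fc.bias is not None:
            new_fc.bias[:-1] = old_fc.bias
            new_fc.bias[-1] = 0.0
    model.fc = new_fc
    return model

# Example: register one novel class from a 5-shot support set (random stand-ins).
support = torch.randn(5, 3, 224, 224)
backbone = add_novel_class(backbone, support)
logits = backbone(torch.randn(1, 3, 224, 224))   # now num_base_classes + 1 logits

In this sketch only the appended weight row is written; the backbone and the existing base-class weights are left untouched, so no gradient-based fine-tuning is needed when the novel class is registered.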



Author information

Corresponding author: Keiichi Yamada.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper

Cite this paper

Yamada, K. (2023). A Simple Method to Directly Determine Novel Class Weight of CNN for Generalized Few-Shot Learning. In: Lu, H., Blumenstein, M., Cho, SB., Liu, CL., Yagi, Y., Kamiya, T. (eds) Pattern Recognition. ACPR 2023. Lecture Notes in Computer Science, vol 14406. Springer, Cham. https://doi.org/10.1007/978-3-031-47634-1_20

  • DOI: https://doi.org/10.1007/978-3-031-47634-1_20

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-47633-4

  • Online ISBN: 978-3-031-47634-1

  • eBook Packages: Computer Science, Computer Science (R0)
