
Filter Pruning for Efficient Transfer Learning in Deep Convolutional Neural Networks

  • Conference paper
Artificial Intelligence and Soft Computing (ICAISC 2019)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 11508)

Abstract

Convolutional Neural Networks are extensively used in computer vision applications. Many convolutional models became widely adopted across a variety of computer vision tasks because of their high accuracy and strong generality. Through Transfer Learning, pre-trained versions of these models can be applied to a large number of different tasks and datasets without the need to train an entire large convolutional model from scratch. We aim to find methods for pruning convolutional filters from these pre-trained models in order to make inference more efficient on the new task. To this end, we propose a genetic-algorithm-based method for pruning the convolutional filters of pre-trained models applied to a dataset different from the one they were trained on. After transferring knowledge from an already trained model to a new task, genetic algorithms search for good solutions to the filter pruning problem through natural selection. We then evaluate the results of the proposed method and compare them with state-of-the-art pruning strategies for convolutional neural networks. Experimental results show that the method maintains network accuracy while producing networks with a significant reduction in Floating Point Operations (FLOPs).
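The approach described in the abstract, evolving keep/drop decisions over a layer's filters with a fitness that trades accuracy against FLOPs, can be sketched in a toy form. The paper does not publish its exact encoding or fitness function; this sketch assumes a single layer, a binary-mask chromosome, truncation selection, one-point crossover, and bit-flip mutation, with a hypothetical `proxy_accuracy` standing in for evaluating the pruned network on the target dataset.

```python
import random

random.seed(0)

N_FILTERS = 64        # filters in the layer being pruned (toy size)
POP_SIZE = 20
GENERATIONS = 30
MUTATION_RATE = 0.02

def proxy_accuracy(mask):
    # Hypothetical stand-in: low-index filters are "important", so accuracy
    # degrades as more of them are dropped. A real implementation would apply
    # the mask to the layer and measure validation accuracy on the new task.
    importance = [1.0 / (i + 1) for i in range(N_FILTERS)]
    kept = sum(w for w, keep in zip(importance, mask) if keep)
    return kept / sum(importance)

def fitness(mask, flops_weight=0.3):
    # Kept fraction of filters serves as a proxy for the FLOPs of the pruned net.
    kept_fraction = sum(mask) / N_FILTERS
    return proxy_accuracy(mask) - flops_weight * kept_fraction

def crossover(a, b):
    # One-point crossover of two binary masks.
    point = random.randrange(1, N_FILTERS)
    return a[:point] + b[point:]

def mutate(mask):
    # Independent bit-flip mutation.
    return [bit ^ (random.random() < MUTATION_RATE) for bit in mask]

def prune_with_ga():
    population = [[random.randint(0, 1) for _ in range(N_FILTERS)]
                  for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        # Truncation selection: keep the fitter half, breed the rest from it.
        population.sort(key=fitness, reverse=True)
        parents = population[:POP_SIZE // 2]
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        population = parents + children
    return max(population, key=fitness)

best = prune_with_ga()
```

Because the surviving parents are carried over unchanged each generation, the best fitness in the population never decreases, so the search converges toward masks that keep the high-importance filters while dropping enough of the rest to lower the FLOPs term.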



Acknowledgments

This work was partially supported under grant no. 5850.0105377.17.9 by Petrobras S.A.

Author information

Correspondence to Caique Reinhold or Mauro Roisenberg.


Copyright information

© 2019 Springer Nature Switzerland AG

About this paper


Cite this paper

Reinhold, C., Roisenberg, M. (2019). Filter Pruning for Efficient Transfer Learning in Deep Convolutional Neural Networks. In: Rutkowski, L., Scherer, R., Korytkowski, M., Pedrycz, W., Tadeusiewicz, R., Zurada, J. (eds) Artificial Intelligence and Soft Computing. ICAISC 2019. Lecture Notes in Computer Science, vol 11508. Springer, Cham. https://doi.org/10.1007/978-3-030-20912-4_19


  • DOI: https://doi.org/10.1007/978-3-030-20912-4_19

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-20911-7

  • Online ISBN: 978-3-030-20912-4

  • eBook Packages: Computer Science (R0)
