Abstract
In recent years, neural networks have achieved great success in various fields of artificial intelligence, including image recognition, object detection, natural language processing, and data analysis. However, training and inference require large numbers of parameters and computations, which hinders deployment on embedded devices with limited resources. Neural network pruning, which reduces memory overhead and increases inference speed, has therefore become an active research topic. In this paper, pruning algorithms are analyzed and summarized from two perspectives: structured pruning and unstructured pruning. The effects of structured and unstructured pruning algorithms are then compared across different datasets and neural network models. Based on the experimental results, we offer suggestions for the future development of neural network pruning.
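To make the distinction between the two families concrete, the following is a minimal sketch (not taken from the paper) that applies each kind of pruning to a separate convolutional layer using PyTorch's torch.nn.utils.prune module; the layer shapes and the 30% pruning ratio are illustrative assumptions.

# Minimal sketch, not the authors' implementation; shapes and ratios
# are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

conv_u = nn.Conv2d(16, 32, kernel_size=3)  # target of unstructured pruning
conv_s = nn.Conv2d(16, 32, kernel_size=3)  # target of structured pruning

# Unstructured pruning: zero the 30% of individual weights with the
# smallest absolute value. The sparsity pattern is irregular, so any
# speedup depends on sparse-kernel or hardware support.
prune.l1_unstructured(conv_u, name="weight", amount=0.3)

# Structured pruning: zero 30% of whole output filters (dim=0), ranked
# by L1 norm. The zeroed filters can later be removed outright, so the
# layer stays dense and accelerates on ordinary hardware.
prune.ln_structured(conv_s, name="weight", amount=0.3, n=1, dim=0)

# Fold the pruning masks into the weight tensors permanently.
prune.remove(conv_u, "weight")
prune.remove(conv_s, "weight")

for tag, m in (("unstructured", conv_u), ("structured", conv_s)):
    zeros = (m.weight == 0).float().mean().item()
    print(f"{tag}: {zeros:.0%} of weights are zero")

Both calls leave roughly 30% of the weights at zero, but only the structured variant produces a regular pattern that translates directly into a smaller dense layer.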
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
About this paper
Cite this paper
Yang, Z., Zhang, H. (2022). Comparative Analysis of Structured Pruning and Unstructured Pruning. In: Hung, J.C., Yen, N.Y., Chang, J.-W. (eds) Frontier Computing. FC 2021. Lecture Notes in Electrical Engineering, vol 827. Springer, Singapore. https://doi.org/10.1007/978-981-16-8052-6_112
DOI: https://doi.org/10.1007/978-981-16-8052-6_112
Publisher Name: Springer, Singapore
Print ISBN: 978-981-16-8051-9
Online ISBN: 978-981-16-8052-6
eBook Packages: Engineering, Engineering (R0)