Abstract
The wide deployment of machine learning models is an essential evolution of artificial intelligence, most notably through the porting of deep neural networks to constrained hardware platforms such as 32-bit microcontrollers. For many IoT applications, deploying such complex models is hindered by two major issues that are usually handled separately. First, for supervised tasks, training a model requires a large quantity of labelled data, which is expensive to collect or even intractable in many real-world applications. Second, the inference process demands memory, computing and energy capacities that exceed those of typical IoT platforms. We jointly tackle these issues by investigating the efficiency of model pruning techniques under the scope of the single domain generalization problem. Our experiments show that a pruned neural network retains the benefits of training with single domain generalization algorithms, despite a larger impact of pruning on its performance. We emphasize the importance of the pruning method, in particular the choice between structured and unstructured pruning, as well as the benefit of data-agnostic heuristics, which preserve their properties in the single domain generalization setting.
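The distinction between the two pruning families discussed above can be sketched as follows. This is a minimal, illustrative example, not the paper's implementation: unstructured pruning zeroes individual small-magnitude weights, while structured pruning removes entire output filters (here, rows of a weight matrix) ranked by their L1 norm, a data-agnostic criterion. The function names and the 2-D weight layout are assumptions made for the sketch.

```python
import numpy as np

def unstructured_prune(w, sparsity):
    """Zero out the smallest-magnitude weights individually.

    `sparsity` is the fraction of weights to remove (e.g. 0.5).
    """
    k = int(sparsity * w.size)
    if k == 0:
        return w.copy()
    # k-th smallest absolute value becomes the pruning threshold
    threshold = np.sort(np.abs(w), axis=None)[k - 1]
    return np.where(np.abs(w) <= threshold, 0.0, w)

def structured_prune(w, sparsity):
    """Zero out whole output filters (rows of a [out, in] matrix).

    Filters are ranked by their L1 norm, a data-agnostic heuristic.
    """
    norms = np.abs(w).reshape(w.shape[0], -1).sum(axis=1)
    k = int(sparsity * w.shape[0])
    pruned = w.copy()
    if k > 0:
        weakest = np.argsort(norms)[:k]  # indices of lowest-norm filters
        pruned[weakest] = 0.0
    return pruned
```

Unstructured pruning reaches higher sparsity at equal accuracy but leaves irregular zero patterns that are hard to exploit on a microcontroller, whereas structured pruning yields a genuinely smaller dense network.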
Acknowledgments
This work benefited from the French Jean Zay supercomputer through the AI dynamic access program. This collaborative work is partially supported by the IPCEI on Microelectronics and Nano2022 actions; by the European project InSecTT (www.insectt.eu; ECSEL Joint Undertaking grant 876038, which receives support from the European Union's H2020 program and from Au, Sw, Sp, It, Fr, Po, Ir, Fi, Sl, Po, Nl, Tu); and by the French National Research Agency (ANR) in the framework of the Investissements d'Avenir program (ANR-10-AIRT-05, irtnanoelec). The document reflects only the authors' view and the Commission is not responsible for any use that may be made of the information it contains.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
Cite this paper
Nguyen, B., Moëllic, PA., Blayac, S. (2022). Domain Generalization on Constrained Platforms: On the Compatibility with Pruning Techniques. In: González-Vidal, A., Mohamed Abdelgawad, A., Sabir, E., Ziegler, S., Ladid, L. (eds) Internet of Things. GIoTS 2022. Lecture Notes in Computer Science, vol 13533. Springer, Cham. https://doi.org/10.1007/978-3-031-20936-9_20
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-20935-2
Online ISBN: 978-3-031-20936-9
eBook Packages: Computer Science, Computer Science (R0)