Abstract
This paper contributes to a better understanding of the energy consumption trade-offs of HPC-scale Artificial Intelligence (AI), and more specifically Deep Learning (DL) algorithms. To this end we developed Benchmark-Tracker, a benchmark tool to evaluate the speed and energy consumption of DL algorithms in HPC environments. We exploited hardware counters and Python libraries to collect energy information through software, which enabled us to instrument a well-known AI benchmark tool and to evaluate the energy consumption of numerous DL algorithms and models. Through an experimental campaign, we show a case example of the potential of Benchmark-Tracker to measure the computing speed and the energy consumption of training and inference for DL algorithms, and of its potential to help better understand the energy behavior of DL algorithms on HPC platforms. This work is a step forward in understanding the energy consumption of Deep Learning in HPC, and it also contributes a new tool to help HPC DL developers better balance their HPC infrastructure in terms of speed and energy consumption.
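As an illustration of the kind of software-based energy measurement the abstract refers to, the sketch below reads Intel RAPL energy counters exposed through the Linux powercap sysfs interface. This is a minimal, hedged example and not the paper's Benchmark-Tracker implementation: the sysfs path is platform-dependent, and the counter's maximum range (used for wraparound handling) must be read from the platform's `max_energy_range_uj` file.

```python
from pathlib import Path

# Typical location of the package-level RAPL domain on Linux
# (an assumption; the exact path varies across machines).
RAPL_DIR = Path("/sys/class/powercap/intel-rapl:0")

def read_energy_uj(domain_dir: Path = RAPL_DIR) -> int:
    """Return the cumulative RAPL energy counter, in microjoules."""
    return int((domain_dir / "energy_uj").read_text())

def energy_delta_joules(before_uj: int, after_uj: int, max_uj: int) -> float:
    """Energy consumed between two counter readings, in joules.

    RAPL counters are cumulative and wrap around at max_uj, so a
    reading taken after a wraparound can be smaller than the one
    taken before it; that case is handled explicitly.
    """
    if after_uj >= before_uj:
        delta_uj = after_uj - before_uj
    else:  # counter wrapped around its maximum value
        delta_uj = (max_uj - before_uj) + after_uj
    return delta_uj / 1e6  # microjoules -> joules
```

A measurement then brackets the workload: read the counter, run the training or inference step, read it again, and convert the difference with `energy_delta_joules`.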
Acknowledgements
This work was supported by the research program on Edge Intelligence of the Multi-disciplinary Institute on Artificial Intelligence MIAI at Grenoble Alpes (ANR-19-P3IA-0003), and the Energy Saving in Large Scale Distributed Platforms - Energumen project (ANR-18-CE25-0008). We also thank all institutions (INRIA, CNRS, RENATER and several Universities as well as other organizations) who support the Grid5000 platform.
Contributions
Thi contributed to the source-code implementation, the execution of experiments, data processing, analysis, and the interpretation of results, with the guidance of Danilo. Danilo was the main writer of Sects. 1, 2, and 6, and Thi was the main writer of Sects. 3, 4, and 5. Finally, all the authors reviewed the final manuscript.
Copyright information
© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG
About this paper
Cite this paper
Carastan-Santos, D., Pham, T.H.T. (2022). Understanding the Energy Consumption of HPC Scale Artificial Intelligence. In: Navaux, P., Barrios H., C.J., Osthoff, C., Guerrero, G. (eds) High Performance Computing. CARLA 2022. Communications in Computer and Information Science, vol 1660. Springer, Cham. https://doi.org/10.1007/978-3-031-23821-5_10
DOI: https://doi.org/10.1007/978-3-031-23821-5_10
Publisher Name: Springer, Cham
Print ISBN: 978-3-031-23820-8
Online ISBN: 978-3-031-23821-5
eBook Packages: Computer Science, Computer Science (R0)