Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model
Article Status
Published
Authors/contributors
- Luccioni, Alexandra Sasha (Author)
- Viguier, Sylvain (Author)
- Ligozat, Anne-Laure (Author)
Title
Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model
Abstract
Progress in machine learning (ML) comes with a cost to the environment, given that training ML models requires significant computational resources, energy and materials. In the present article, we aim to quantify the carbon footprint of BLOOM, a 176-billion parameter language model, across its life cycle. We estimate that BLOOM's final training emitted approximately 24.7 tonnes of CO₂eq if we consider only the dynamic power consumption, and 50.5 tonnes if we account for all processes ranging from equipment manufacturing to energy-based operational consumption. We also study the energy requirements and carbon emissions of its deployment for inference via an API endpoint receiving user queries in real-time. We conclude with a discussion regarding the difficulty of precisely estimating the carbon footprint of ML models and future research directions that can contribute towards improving carbon emissions reporting.
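The dynamic-consumption figure in the abstract follows the standard calculation of energy used multiplied by the carbon intensity of the electricity grid. The sketch below illustrates that arithmetic using the approximate values reported in the paper (roughly 433,196 kWh of dynamic energy and about 57 gCO₂eq/kWh for the French grid powering the Jean Zay supercomputer); the function name and structure are illustrative, not the authors' code.

```python
# Minimal sketch: dynamic-power carbon estimate = energy consumed * grid carbon intensity.
# Input figures are the approximate values reported for BLOOM's final training;
# the helper below is illustrative only.

def dynamic_emissions_tonnes(energy_kwh: float, grid_intensity_g_per_kwh: float) -> float:
    """Convert energy use (kWh) and grid carbon intensity (gCO2eq/kWh) into tonnes of CO2eq."""
    grams = energy_kwh * grid_intensity_g_per_kwh
    return grams / 1_000_000  # 1 tonne = 1,000,000 grams

if __name__ == "__main__":
    # Prints ~24.7 tonnes CO2eq, matching the dynamic-consumption estimate in the abstract.
    print(f"{dynamic_emissions_tonnes(433_196, 57):.1f} tonnes CO2eq")
```

The gap between 24.7 and 50.5 tonnes reflects the additional life-cycle stages the paper accounts for, such as equipment manufacturing and idle (non-dynamic) operational energy.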
Repository
arXiv
Archive ID
arXiv:2211.02001
Date
2022-11-03
Accessed
27/06/2023, 15:47
Extra
arXiv:2211.02001 [cs]
Citation
Luccioni, A. S., Viguier, S., & Ligozat, A.-L. (2022). Estimating the Carbon Footprint of BLOOM, a 176B Parameter Language Model (arXiv:2211.02001). arXiv. http://arxiv.org/abs/2211.02001