Assessing Carbon Footprint: Understanding the Environmental Costs of Training Devices
Costagliola G.; De Rosa M.; Piscitelli A.
2024-01-01
Abstract
Environmental sustainability is a crucial aspect of all human activities, and the field of artificial intelligence has recently been called upon to analyze its impact in terms of energy efficiency and the trade-off with accuracy. We have conducted a literature review to evaluate the sustainability of neural models, with a specific focus on neural transpilers. These models employ neural machine translation to convert source code from one programming language to another. Neural transpilers have attracted significant attention in the scientific community, with multiple studies published between 2022 and 2023 focusing on their potential integration into programmers' development environments. Our analysis focused on the environmental impact of these models, particularly in terms of electricity consumption, carbon emissions, and training time. Of the seventeen primary studies we reviewed, nine reported information about training time and details about training devices. Our findings suggest a need for more detailed reporting on emissions and a better understanding of the carbon footprint associated with training time. This analysis highlights the urgency of considering the environmental costs of AI technologies and the importance of developing sustainable practices within the field.
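As a rough illustration of how the training time and device details reported in primary studies can be turned into a carbon estimate, the sketch below applies the commonly used relation energy = power × time × PUE and emissions = energy × carbon intensity. All numeric values (power draw, PUE, grid intensity, and the example configuration) are illustrative assumptions, not figures taken from the reviewed studies.

```python
# Illustrative sketch: estimating the carbon footprint of model training
# from reported training time and device power draw. All numeric values
# below are assumptions for demonstration, not figures from the review.

def training_emissions_kg(
    gpu_power_watts: float,         # average power draw of one training device
    num_gpus: int,                  # number of devices used in parallel
    training_hours: float,          # reported training time
    pue: float = 1.5,               # assumed data-center Power Usage Effectiveness
    carbon_intensity: float = 0.4,  # assumed grid intensity in kgCO2e per kWh
) -> float:
    """Return an approximate training footprint in kgCO2e."""
    energy_kwh = (gpu_power_watts * num_gpus * training_hours / 1000.0) * pue
    return energy_kwh * carbon_intensity

# Example: a hypothetical transpiler trained for 72 hours on 8 GPUs at 300 W each.
print(f"{training_emissions_kg(300, 8, 72):.1f} kgCO2e")  # ~103.7 kgCO2e
```

Estimates of this kind depend strongly on the assumed PUE and grid carbon intensity, which is one reason the review calls for studies to report these details alongside training time and hardware.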