A curated list of awesome Green AI resources and tools to reduce the environmental impacts of using and deploying AI.
In 2020, the carbon footprint of the Information and Communications Technology (ICT) sector was estimated at 2.1-3.9% of global greenhouse gas (GHG) emissions. The ICT sector continues to grow, and its footprint is projected to roughly double to 6-8% by 2025. For the ICT sector to comply with the Paris Agreement, the industry must cut its GHG emissions by 45% between 2020 and 2030 and reach net zero by 2050 (Freitag et al., 2021).
AI is one of the fastest-growing parts of the sector and is disrupting many other industries (AI Market Size Report, 2022), so it has an important role to play in this reduction. The impacts of ICT, and therefore of AI, are not limited to GHG emissions and electricity consumption: all major impacts (abiotic resource depletion, primary energy consumption, water usage, etc.) should be assessed through Life Cycle Assessment (LCA) (Arushanyan et al., 2013).
AI sobriety means not only optimizing energy consumption and reducing direct impacts, but also studying the indirect impacts and rebound effects that can cancel out any footprint reduction (Willenbacher et al., 2021). It is therefore essential to question whether AI is the right tool for a project before launching it, to avoid indirect impacts and rebound effects later on.
All contributions are welcome. Add links through pull requests or create an issue to start a discussion.
Tools to measure and compute environmental impacts of AI (a minimal CodeCarbon sketch follows the list).
- CodeCarbon – Track emissions from Compute and recommend ways to reduce their impact on the environment.
- carbontracker – Track and predict the energy consumption and carbon footprint of training deep learning models.
- Eco2AI – A Python library that accumulates statistics about power consumption and CO2 emissions while your code runs.
- Zeus – A framework for deep learning energy measurement and optimization.
- Tracarbon – Tracks your device's energy consumption and calculates your carbon emissions using your location.
- EcoLogits – Estimates the energy consumption and environmental footprint of LLM inference through APIs.
- AIPowerMeter – Easily monitor energy usage of machine learning programs.
☠️ No longer maintained:
- carbonai – Python package to monitor the power consumption of any algorithm.
- experiment-impact-tracker – A simple drop-in method to track energy usage, carbon emissions, and compute utilization of your system.
- GATorch – An Energy-Aware PyTorch Extension.
- GPU Meter – Power Consumption Meter for NVIDIA GPUs.
- PyJoules – A Python library to capture the energy consumption of code snippets.
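As a minimal illustration of in-code measurement, the sketch below wraps a workload with CodeCarbon's `EmissionsTracker`. The start/stop API is CodeCarbon's documented usage; the workload itself is a placeholder.

```python
from codecarbon import EmissionsTracker

def workload():
    # Placeholder for the code you want to measure (e.g. a training loop).
    return sum(i * i for i in range(10_000_000))

tracker = EmissionsTracker(project_name="green-ai-demo")
tracker.start()
try:
    workload()
finally:
    emissions_kg = tracker.stop()  # estimated kgCO2eq for the tracked span

print(f"Estimated emissions: {emissions_kg:.6f} kgCO2eq")
```

Several other trackers in this list (carbontracker, Eco2AI, Tracarbon) follow a similar pattern of starting and stopping a tracker around the code being measured.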
Tools to monitor power consumption and environmental impacts (a Scaphandre scraping sketch follows the list).
- Scaphandre – A metrology agent dedicated to electrical power consumption metrics.
- PowerJoular – Monitor power consumption of multiple platforms and processes.
- Boagent – Local API and monitoring agent focused on the environmental impacts of the host.
- vJoule – A tool to estimate the energy consumption of your processes.
- jupyter-power-usage – Jupyter extension to display CPU and GPU power usage and carbon emissions.
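These monitors are system-level agents rather than Python libraries. As one hedged example, Scaphandre can expose power metrics through its Prometheus exporter (`scaphandre prometheus`), which any script can then scrape; the default port (8080) and the metric name `scaph_host_power_microwatts` below are assumptions that may differ across versions.

```python
import urllib.request

# Assumes `scaphandre prometheus` is running on this host. The port and the
# metric name are assumptions; check your Scaphandre version's documentation.
METRICS_URL = "http://localhost:8080/metrics"

def host_power_watts():
    """Scrape Scaphandre's Prometheus endpoint and return host power in watts."""
    with urllib.request.urlopen(METRICS_URL) as resp:
        for line in resp.read().decode().splitlines():
            if line.startswith("scaph_host_power_microwatts"):
                return float(line.split()[-1]) / 1e6  # microwatts -> watts
    return None

if __name__ == "__main__":
    print(f"Host power draw: {host_power_watts()} W")
```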
Tools to optimize energy consumption or environmental impacts (a ZeusMonitor sketch follows the list).
- Zeus – A framework for deep learning energy measurement and optimization.
- GEOPM – A framework to enable efficient power management and performance optimizations.
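Zeus's optimizers are built on top of measurement windows. Below is a minimal sketch of that primitive, assuming the `zeus-ml` package and at least one visible NVIDIA GPU; class and attribute names may differ slightly between versions.

```python
from zeus.monitor import ZeusMonitor

# Assumes the zeus-ml package and one visible NVIDIA GPU.
monitor = ZeusMonitor(gpu_indices=[0])

monitor.begin_window("training")
# ... run a training step or epoch here ...
measurement = monitor.end_window("training")

print(f"Time:   {measurement.time:.2f} s")
print(f"Energy: {measurement.total_energy:.2f} J")
```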
Tools to estimate environmental impacts of algorithms, models and compute resources (a back-of-envelope formula sketch follows the list).
- Green Algorithms – A tool to easily estimate the carbon footprint of a project.
- ML CO2 Impact – Compute model emissions and add the results to your paper with a generated LaTeX template.
- EcoLogits Calculator – Estimate energy consumption and environmental impacts of LLM inference.
- AI Carbon – Estimate your AI model's carbon footprint.
- MLCarbon – End-to-end carbon footprint modeling tool.
- GenAI Carbon Footprint – A tool to estimate energy use (kWh) and carbon emissions (gCO2eq) from LLM usage.
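Most of these calculators apply the same back-of-envelope methodology popularized by Green Algorithms (Lannelongue et al., 2021): energy = runtime × hardware power × usage factor × PUE, then emissions = energy × grid carbon intensity. A sketch with illustrative constants (the default PUE and carbon intensity below are assumptions, not authoritative values):

```python
def estimate_emissions(runtime_h: float,
                       hardware_power_w: float,
                       usage_factor: float = 1.0,
                       pue: float = 1.5,
                       carbon_intensity: float = 475.0) -> float:
    """Back-of-envelope gCO2eq estimate in the spirit of Green Algorithms.

    The default PUE (1.5) and grid carbon intensity (475 gCO2eq/kWh) are
    illustrative assumptions; both vary widely by data center and region.
    """
    energy_kwh = runtime_h * (hardware_power_w / 1000.0) * usage_factor * pue
    return energy_kwh * carbon_intensity

# Example: 24 h on a ~300 W GPU at 80% average utilization.
print(f"{estimate_emissions(24, 300, usage_factor=0.8):.0f} gCO2eq")
```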
Generic tools (a sample Boaviztapi query follows the list):
- Boaviztapi – Multi-criteria impacts of compute resources, taking into account manufacturing and usage.
- Datavizta – Compute resources data explorer, not limited to AI.
- EcoDiag – Compute the carbon footprint of IT resources, taking into account manufacturing and usage (🇫🇷 only).
- LLM Perf Leaderboard – Benchmarking LLMs on performance and energy.
- ML.Energy Leaderboard – Energy consumption of GenAI models at inference.
- AI Energy Star Leaderboard – Energy efficiency ratings for AI models.
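Boaviztapi is exposed as a REST API, so multi-criteria impacts can also be queried programmatically. A hedged sketch against the public demo instance; the endpoint path and query parameters are assumptions based on the project's documentation and may change between versions:

```python
import json
import urllib.request

# Endpoint and parameters are assumptions based on Boaviztapi's public demo
# instance; check https://doc.api.boavizta.org for the current routes.
URL = ("https://api.boavizta.org/v1/cloud/instance"
       "?provider=aws&instance_type=a1.xlarge")

with urllib.request.urlopen(URL) as resp:
    impacts = json.load(resp)

# Expected output: multi-criteria impacts (e.g. GWP, ADP, PE), each split
# into embedded (manufacturing) and usage phases.
print(json.dumps(impacts, indent=2))
```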
- Energy and Policy Considerations for Deep Learning in NLP - Strubell et al. (2019)
- Quantifying the Carbon Emissions of Machine Learning - Lacoste et al. (2019)
- Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models - Anthony et al. (2020)
- Green AI - Schwartz et al. (2020)
- The Energy and Carbon Footprint of Training End-to-End Speech Recognizers - Parcollet et al. (2021)
- Carbon Emissions and Large Neural Network Training - Patterson et al. (2021)
- Green Algorithms: Quantifying the Carbon Footprint of Computation - Lannelongue et al. (2021)
- Aligning artificial intelligence with climate change mitigation - Kaack et al. (2021)
- A Practical Guide to Quantifying Carbon Emissions for Machine Learning researchers and practitioners - Ligozat et al. (2021)
- Unraveling the Hidden Environmental Impacts of AI Solutions for Environment: Life Cycle Assessment of AI Solutions - Ligozat et al. (2022)
- Measuring the Carbon Intensity of AI in Cloud Instances - Dodge et al. (2022)
- Estimating the Carbon Footprint of BLOOM a 176B Parameter Language Model - Luccioni et al. (2022)
- Bridging Fairness and Environmental Sustainability in Natural Language Processing - Hessenthaler et al. (2022)
- Eco2AI: carbon emissions tracking of machine learning models as the first step towards sustainable AI - Budennyy et al. (2022)
- Environmental assessment of projects involving AI methods - Lefèvre et al. (2022)
- Sustainable AI: Environmental Implications, Challenges and Opportunities - Wu et al. (2022)
- The Carbon Footprint of Machine Learning Training Will Plateau, Then Shrink - Patterson et al. (2022)
- Towards the Systematic Reporting of the Energy and Carbon Footprints of Machine Learning - Henderson et al. (2022)
- Towards Sustainable Artificial Intelligence: An Overview of Environmental Protection Uses and Issues - Pachot et al. (2022)
- Method and evaluations of the effective gain of artificial intelligence models for reducing CO2 emissions - Delanoë et al. (2023)
- Making AI Less "Thirsty": Uncovering and Addressing the Secret Water Footprint of AI Models - Li et al. (2023)
- Zeus: Understanding and Optimizing GPU Energy Consumption of DNN Training - You et al. (2023)
- Trends in AI inference energy consumption: Beyond the performance-vs-parameter laws of deep learning - Desislavov et al. (2023)
- Chasing Low-Carbon Electricity for Practical and Sustainable DNN Training - Yang et al. (2023)
- Toward Sustainable HPC: Carbon Footprint Estimation and Environmental Implications of HPC Systems - Li et al. (2023)
- Reducing the Carbon Impact of Generative AI Inference (today and in 2035) - Chien et al. (2023)
- LLMCarbon: Modeling the End-To-End Carbon Footprint of Large Language Models - Faiz et al. (2023)
- The growing energy footprint of artificial intelligence - De Vries (2023)
- Exploring the Carbon Footprint of Hugging Face's ML Models: A Repository Mining Study - Castano et al. (2023)
- Exploding AI Power Use: an Opportunity to Rethink Grid Planning and Management - Lin et al. (2023)
- Power Hungry Processing: Watts Driving the Cost of AI Deployment? - Luccioni et al. (2023)
- Perseus: Removing Energy Bloat from Large Model Training - Chung et al. (2023)
- Timeshifting strategies for carbon-efficient long-running large language model training - Jagannadharao et al. (2023)
- Estimating the environmental impact of Generative-AI services using an LCA-based methodology - Berthelot et al. (2024)
- Towards Greener LLMs: Bringing Energy-Efficiency to the Forefront of LLM Inference - Stojkovic et al. (2024)
- Green AI: Exploring Carbon Footprints, Mitigation Strategies, and Trade Offs in Large Language Model Training - Liu et al. (2024)
- Engineering Carbon Emission-aware Machine Learning Pipelines - Humsom et al. (2024)
- A simplified machine learning product carbon footprint evaluation tool - Lang et al. (2024)
- Beyond Efficiency: Scaling AI Sustainably - Wu et al. (2024)
- The Price of Prompting: Profiling Energy Use in Large Language Models Inference - Huson et al. (2024)
- MLCA: a tool for Machine Learning Life Cycle Assessment - Morand et al. (2024)
- Hype, Sustainability, and the Price of the Bigger-is-Better Paradigm in AI - Varoquaux et al. (2024)
- Addition is All You Need for Energy-efficient Language Models - Luo et al. (2024)
- E-waste challenges of generative artificial intelligence - Wang et al. (2024)
- Reconciling the contrasting narratives on the environmental impact of large language models - Ren et al. (2024)
- Evaluating the carbon footprint of NLP methods: a survey and analysis of existing tools - Bannour et al. (2021)
- A Survey on Green Deep Learning - Xu et al. (2021)
- A Systematic Review of Green AI - Verdecchia et al. (2023)
- Counting Carbon: A Survey of Factors Influencing the Emissions of Machine Learning - Luccioni et al. (2023)
- Towards Efficient Generative Large Language Model Serving: A Survey from Algorithms to Systems - Miao et al. (2023)
- The great challenges of generative AI (🇫🇷 only) - Data For Good (2023)
- Powering Up Europe: AI Datacenters and Electrification to Drive +c.40%-50% Growth in Electricity Consumption - Goldman Sachs (2024)
- Generational Growth — AI/data centers' global power surge and the sustainability impact - Goldman Sachs (2024)
- International Standards for AI and the Environment - ITU (2024)
- Powering artificial intelligence: a study of AI's footprint—today and tomorrow - Deloitte (2024)