What’s the environmental impact of AI and what can we do about it?
Image generated on Midjourney
AI tools have many benefits for businesses and content creators. But there’s a downside to these tools that’s often left unmentioned: the environmental impact.
As part of my mission to help people use Gen-AI tools wisely and well, I talk about the environmental impact in all of my workshops.
So, I thought I’d share some stats and three practical actions here too.
The environmental impact of AI models is complex
Looking at Gen-AI’s impact on the environment isn’t straightforward, not least because there are two sides to the story.
AI models can have a negative impact on the environment. But this technology can also help us pinpoint ways to use resources wisely, predict natural disasters, revolutionise food systems and lots more.
To make things more complex, the AI giants aren’t forthcoming with environmental impact data. This means most of the stats we have at the moment are well-researched estimates or based on older models.
So, here’s what we know.
There are three key ways that generative AI negatively impacts the environment: the training of the models, the use of the models and the making of the hardware.
Let’s look at these individually.
1. The impact of training AI models
As you might expect, it takes huge amounts of energy to train Gen-AI models like ChatGPT.
According to research from the University of Washington, training a single large language model like GPT-3 (an early version of the model behind ChatGPT) can use around 10 gigawatt-hours (GWh) of electricity. “This is, on average, roughly equivalent to the yearly electricity consumption of over 1,000 U.S. households”.
Bear in mind, we’re now working with GPT-4o, a newer and more capable model that’s widely assumed to be larger. So, we could be looking at even more power.
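As a rough sanity check on that household comparison, here’s a quick back-of-envelope sketch in Python. It assumes an average US household uses roughly 10,500 kWh of electricity a year, which is my approximation rather than a figure from the study.

```python
# Back-of-envelope check on the "over 1,000 US households" comparison.
# Assumption: an average US household uses roughly 10,500 kWh of electricity per year.
TRAINING_ENERGY_KWH = 10_000_000      # 10 GWh expressed in kWh
HOUSEHOLD_KWH_PER_YEAR = 10_500       # assumed average annual household usage

households = TRAINING_ENERGY_KWH / HOUSEHOLD_KWH_PER_YEAR
print(f"Roughly {households:,.0f} households' worth of annual electricity")
# Prints roughly 952 households, which is the same ballpark as the quote above.
```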
Gen-AI models use a lot of water too
Power isn’t the only concern. Training and running AI models also require huge amounts of water, primarily to keep the servers in data centres cool.
In the research paper “Making AI Less ‘Thirsty’”, the researchers say: “training GPT-3 in Microsoft's state-of-the-art U.S. data centers can consume a total of 5.4 million liters of water.”
That’s a lot.
2. The impact of using AI models
Unsurprisingly, the ongoing use of Gen-AI tools also has a cost. In fact, NVIDIA estimates that user queries make up 80-90% of the total energy used by Gen-AI systems during their lifetime.
And, according to the very detailed Towards Data Science article by Kasper Groes Albin Ludvigsen, ChatGPT could consume as much electricity in a single month as 175,000 people. Wow.
What about the water usage here?
Well, that’s fairly hefty too.
The same “Making AI Less ‘Thirsty’” paper suggests that 10-50 prompts on GPT-3 (ChatGPT’s predecessor) consume around 500ml of water, depending on when and where the model is deployed.
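To put that into per-prompt terms, here’s a tiny sketch that just divides the numbers quoted above; the 500ml and the 10-50 prompt range come straight from the paper, the rest is simple arithmetic.

```python
# Per-prompt water estimate, based on the 500ml per 10-50 prompts quoted above.
WATER_ML = 500
PROMPTS_LOW, PROMPTS_HIGH = 10, 50

best_case = WATER_ML / PROMPTS_HIGH   # 10 ml of water per prompt
worst_case = WATER_ML / PROMPTS_LOW   # 50 ml of water per prompt
print(f"Roughly {best_case:.0f}-{worst_case:.0f} ml of water per prompt")
```

In other words, somewhere between a couple of teaspoons and a small espresso cup of water every time we hit enter.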
3. The environmental impact of making AI hardware
According to a study from 2011, “70% of the energy a typical laptop will consume during its life span is used in manufacturing the computer”.
So, it’s probably safe to assume that the manufacturing footprint of the complex GPUs powering Gen-AI models is considerably higher than a laptop’s.
We also know that today’s AI giants are using more data centres and computing infrastructure than ever before.
Recent research from the MIT Climate and Sustainability Consortium states that in the third quarter of 2023, Microsoft and Meta each bought three times more NVIDIA graphics processing units (GPUs) than Amazon and Google, which acquired 50,000 units each (Norem, 2023).
That’s a lot of hardware – and a lot of resources and power used to make it.
So, what can we do to help minimise the impact of Gen-AI?
Well, a lot of work has to happen upstream. The tech giants need to do their bit and legislation like the EU AI Act could help too.
But here are three practical things we business leaders and individuals can do now.
1. Use or fine-tune existing models
As we’ve seen, training and creating new models uses a lot of energy. So, if your company wants to train a generative model on your own content, it’s much more energy-efficient to fine-tune an existing model.
As the Harvard Business Review states: “Fine-tuning and prompt training on specific content domains consume much less energy than training new large models from scratch. It can also provide more value to many businesses than generically-trained models”.
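To show what this can look like in practice, here’s a minimal sketch of parameter-efficient fine-tuning (LoRA) using the open-source Hugging Face transformers, peft and datasets libraries. The base model, the file name and the hyperparameters are placeholders for illustration, not recommendations.

```python
# Minimal sketch: fine-tune an existing model with LoRA instead of training from scratch.
# Requires the transformers, peft and datasets libraries.
from transformers import (AutoModelForCausalLM, AutoTokenizer, Trainer,
                          TrainingArguments, DataCollatorForLanguageModeling)
from peft import LoraConfig, get_peft_model
from datasets import load_dataset

BASE_MODEL = "gpt2"  # placeholder: any pre-trained model you already have access to

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# LoRA freezes the base model and trains only a small set of extra weights,
# which is far cheaper than training a new large model from scratch.
model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16, task_type="CAUSAL_LM"))

# Placeholder dataset: swap in your own company content.
dataset = load_dataset("text", data_files={"train": "my_company_content.txt"})["train"]
dataset = dataset.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
    batched=True,
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="finetuned-model", num_train_epochs=1,
                           per_device_train_batch_size=4),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
```

The key point is the one HBR makes above: you’re adapting a model that has already been trained, so the big training bill (in energy as well as money) has already been paid once and is shared by everyone who builds on it.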
2. Check the energy sources of your cloud provider or data centres
Be picky about your cloud provider. You can reduce carbon emissions by choosing providers that run on environmentally friendly energy sources.
For example, a model trained and operated in the US might use fossil fuels, whereas in the Netherlands it’s likely to use more sustainable energy sources.
Recently, Google created more sustainable data centres in the Netherlands and has committed to 24/7 carbon-free energy by 2030.
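If you want to see how much the local energy mix matters, one option is the open-source codecarbon library, which estimates emissions from the carbon intensity of a chosen country’s grid. Here’s a rough sketch; the workload is just a stand-in, and the printed numbers are estimates rather than measurements of a real AI job.

```python
# Sketch: estimate the CO2 footprint of the same workload under two regional energy mixes,
# using the open-source codecarbon library.
from codecarbon import OfflineEmissionsTracker

def workload():
    # Stand-in for a real training or inference job.
    return sum(i * i for i in range(10_000_000))

for country in ("USA", "NLD"):  # compare a US grid mix with the Netherlands
    tracker = OfflineEmissionsTracker(country_iso_code=country)
    tracker.start()
    workload()
    emissions_kg = tracker.stop()  # estimated kg of CO2-equivalent
    print(f"{country}: ~{emissions_kg:.6f} kg CO2eq for the same workload")
```

The same idea applies at a much bigger scale: the greener the grid behind your provider’s data centre, the smaller the footprint of every model you train or run there.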
3. Learn to use Gen-AI models well
If every query we send has an impact, learning to use Gen-AI tools well matters.
If we all craft effective prompts for tools like ChatGPT or Midjourney, we won’t need to try over and over again to get a good result. (There’s a quick before-and-after example at the end of this section.)
In the same way, if we fully understand Gen-AI tools, we’ll know when to use which ones. And when it’s wiser to avoid them altogether.
Last, but not least, if we combine AI and human intelligence, we’ll use AI models more resourcefully and generate valuable results – first time, every time.
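As a quick, made-up illustration of what ‘effective’ looks like, compare a vague prompt with a specific one. The product and details below are invented purely to show the difference.

```python
# Illustration only: a vague prompt versus a specific one.
# A specific prompt is far more likely to give a usable result first time,
# which means fewer regenerations and less wasted compute (and water).
vague_prompt = "Write something about our new product."

specific_prompt = (
    "Write a 150-word LinkedIn post announcing our new project-management app "
    "to small-business owners. Use a friendly, plain-English tone, mention the "
    "free 30-day trial and end with a question that invites comments."
)
```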
There are many things to weigh up when using Gen-AI models: ethics, legal issues, societal impact and more. But I believe we have to consider sustainability too.
After all, we could have the most innovative AI tools possible, but if we don’t have a healthy planet to live on, they won’t be much use.