Published on 10 May 2025

Is ChatGPT bad for the environment?

While concerns about the energy and water demands of AI aren't entirely misplaced, much of the public debate misses crucial context. Rather than focusing on individual generative AI use, more transparency is needed from AI companies on the climate impact of developing their models, writes Sam Gilbert.

Should we be worried about the climate impact of our generative AI use?

Read the news, and you would certainly think so. It’s been widely reported that messaging ChatGPT requires ten times as much electricity as a Google search, and that a full conversation consumes a 500ml bottle of water.

Often, a straight line is drawn between statistics like these and reports that AI data centres are putting grids under strain, delaying the decommissioning of coal-fired power stations, and exacerbating water scarcity. No wonder many people argue that the environmental costs make using generative AI unethical.

But there are several important problems with this story.

The first is that it doesn’t put the energy and water consumption of using generative AI into meaningful context. According to the Electric Power Research Institute (EPRI), a ChatGPT query uses 0.0029 kWh of electricity. Is that a lot? Not compared to running an electric oven for an hour, which uses 4 kWh. Put differently, for every tray of roast potatoes you cook, you could message ChatGPT 1,379 times.

What about 500ml of water? Doing a single load of laundry uses 100 times that amount. The most important insight from these ChatGPT factoids might be that we drastically underestimate how much electricity and water we consume in our daily lives.  
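
For readers who want to check the arithmetic, here is a minimal sketch of the comparisons above, using only the figures already cited:

```python
# Back-of-envelope comparison of per-query ChatGPT consumption
# with the everyday household figures cited above.

CHATGPT_KWH_PER_QUERY = 0.0029  # EPRI estimate
OVEN_KWH_PER_HOUR = 4.0         # electric oven running for one hour

CHATGPT_WATER_LITRES = 0.5      # the contested 500ml-per-conversation claim
LAUNDRY_WATER_LITRES = 50.0     # one load, i.e. 100 times 500ml

queries_per_oven_hour = OVEN_KWH_PER_HOUR / CHATGPT_KWH_PER_QUERY
conversations_per_load = LAUNDRY_WATER_LITRES / CHATGPT_WATER_LITRES

print(f"ChatGPT messages per oven-hour: {queries_per_oven_hour:,.0f}")           # 1,379
print(f"ChatGPT conversations per laundry load: {conversations_per_load:,.0f}")  # 100
```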

Another problem is that “generative AI”, “AI” and “data centres” are routinely conflated in public discourse. Use of large language models (LLMs) by individual users represents a small proportion of all AI use – perhaps as little as 3%, with recommendation, targeting, personalisation, analytical and decisioning systems operated by organisations making up the majority.

And in aggregate, AI accounts for only 14% of data centre energy use, while mundane things like apps, databases, email, video streaming and cloud storage consume far more. Increased adoption of AI is certainly a driver of data centre expansion, but even then, the growth in electricity demand from data centres represents less than 10% of the International Energy Agency’s forecast increase in global electricity demand to 2030. Industry, transportation, appliances and air conditioning are all bigger contributors.

Finally, there is a lot of uncertainty about the accuracy of estimates of generative AI’s energy and water use. Researchers at Epoch AI recently recalculated the energy consumption of a ChatGPT query at 0.0003 kWh – the same as the estimate for a Google search. They also noted that the Google search estimate itself comes from a single 2009 blogpost, and that the real figure today could be higher due to the incorporation of AI-generated overviews into Google search results, or lower due to improvements in chip and data centre efficiency. Software engineer Sean Goedecke mounts a similar challenge to the claim that an exchange with ChatGPT uses 500ml of water, modelling it at just 5ml. Nobody knows the actual numbers except the big tech companies, and they aren’t telling.
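
To see what this order-of-magnitude disagreement means in practice, here is an illustrative calculation of annual consumption under the two estimates. The ten-queries-a-day usage level is an assumption chosen for illustration, not a figure from either source:

```python
# Annual electricity implied by the two per-query estimates.
# NOTE: 10 queries/day is a hypothetical usage level, not a figure
# from EPRI or Epoch AI.
QUERIES_PER_DAY = 10
DAYS_PER_YEAR = 365

for source, kwh_per_query in [("EPRI", 0.0029), ("Epoch AI", 0.0003)]:
    annual_kwh = kwh_per_query * QUERIES_PER_DAY * DAYS_PER_YEAR
    print(f"{source}: {annual_kwh:.1f} kWh/year")
# EPRI:     10.6 kWh/year (under three hours of oven use)
# Epoch AI:  1.1 kWh/year
```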

So, if our personal use of generative AI technologies doesn’t pose much of a threat to the climate, does it follow that wider concerns about the energy and water use of AI and data centres are overblown? Unfortunately not.

The pre-training of LLMs is computationally intensive and hence energy-hungry, with OpenAI’s GPT-3 estimated to have consumed 1,287,000 kWh of electricity before it was released – enough to make more than 2,500 tonnes of steel. In practice it is also wasteful: an outcome of race dynamics in the LLM market is that developers believe they can’t afford to spend time on energy efficiency, even though it is demonstrably possible to cut the electricity consumption of LLM training by two-thirds. To make matters worse, models from different AI labs seem to end up with near-identical capabilities.
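
To get a sense of the scale involved, a rough sketch comparing the cited training figure with per-query consumption, and the saving the two-thirds efficiency claim would imply:

```python
# Scale of pre-training versus inference, using the figures cited above.
GPT3_TRAINING_KWH = 1_287_000
KWH_PER_QUERY = 0.0029  # EPRI per-query estimate

queries_equivalent = GPT3_TRAINING_KWH / KWH_PER_QUERY
print(f"One GPT-3 training run ~ {queries_equivalent:,.0f} queries")
# roughly 444 million queries

saving_kwh = GPT3_TRAINING_KWH * 2 / 3
print(f"Potential saving from efficiency measures: {saving_kwh:,.0f} kWh")
# 858,000 kWh per training run
```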

Different problems arise from the geography of data centres, including those that contain the clusters of GPU chips required to train and run AI systems. The customers of any given data centre are likely to be dispersed across the globe, but the data centre itself is grounded in a specific place and must draw on local supplies of electricity and water (or source its own).

This can lead to undesirable and sometimes bizarre unintended consequences. In 2022, electricity demand from new data centres in the M4 corridor (an area in the UK adjacent to the M4 motorway, which runs from London to South Wales and is a major hi-tech hub) led the Greater London Authority to warn housebuilders that no grid connections would be available in Hillingdon, Ealing or Hounslow for 13 years – a de facto ban on new homes in an area of acute housing shortage.

The following year the Chief Executive Officer of ammunition manufacturer Nammo told the Financial Times that it could not expand production to meet demand from the Ukrainian military because all the surplus electricity in the part of Norway where its plant was located had been absorbed by a data centre supporting the social media platform TikTok. “Our future growth is challenged by the storage of cat videos”, he remarked.

So, what are the implications? Some in the policy community may need to update their mental models of the relationship between generative AI use and data centre expansion. There is little sense in discouraging citizens from using ChatGPT and other generative AI tools for environmental reasons – much better that they insulate their homes and disconnect their hosepipes.

With that said, we are really in the dark about the current and future climate impact of generative AI model development. Those seeking to hold AI companies to account for their electricity and water use currently rely on sporadic voluntary disclosure and modelling by researchers like Sasha Luccioni. Transparency requirements may be needed to force AI companies to report both how much electricity and water they use, and what mix of energy sources their power comes from, so that their true carbon footprint can be quantified.
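
The calculation that such disclosure would enable is straightforward: multiply the electricity drawn from each energy source by that source’s carbon intensity, then sum. A minimal sketch, where the intensity figures are approximate lifecycle medians and the example energy mix is invented for illustration rather than taken from any real disclosure:

```python
# Carbon footprint = sum over sources of (kWh drawn) x (gCO2e per kWh).
# Intensity figures are approximate lifecycle medians in gCO2e/kWh;
# the example mix below is hypothetical, not a real disclosure.

CARBON_INTENSITY = {
    "coal": 820,
    "gas": 490,
    "solar": 48,
    "wind": 11,
}

def footprint_tonnes(kwh_by_source):
    """Return tonnes of CO2e for a given electricity mix."""
    grams = sum(kwh * CARBON_INTENSITY[source]
                for source, kwh in kwh_by_source.items())
    return grams / 1e6  # grams -> tonnes

# Hypothetical disclosure for a single training run (totals 1,287,000 kWh):
mix = {"gas": 800_000, "wind": 400_000, "coal": 87_000}
print(f"{footprint_tonnes(mix):.0f} tCO2e")  # ~468
```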

Lastly, the knock-on effects of increasing energy and water demand from data centres should be on the agenda of the newly formed AI Energy Council. As researcher Lauren E. Bridges suggests, thinking of it as “industrial computing” rather than “the cloud” is a helpful opening move.


The views and opinions expressed in this post are those of the author(s) and not necessarily those of the Bennett Institute for Public Policy.

Authors

Sam Gilbert

Affiliated Researcher

Sam Gilbert is an entrepreneur and researcher working at the intersection of politics and technology. An expert in data-driven marketing, Sam was Employee No.1 and Chief Marketing Officer at Bought...
