Spotlight on AI This Earth Day: ‘AI Is Fundamentally Incompatible With Environmental Sustainability’

News Room
Image: Galyna_Andrushko/Envato Elements

Generative AI is energy-intensive, and the ways in which its environmental impact can be calculated are complex. Consider the downstream effect of generative AI on the environment when examining your company’s own sustainability goals.

  • What side effects might not be immediately visible but could have a major impact?
  • When does most of the energy consumption occur: during training or everyday use?
  • Do “more efficient” AI models actually address any sustainability concerns?

The impact of generative AI on electricity generation, water, and air quality

AI’s impact on air pollution

In December 2024, researchers at the University of California, Riverside, and the California Institute of Technology calculated that training Meta’s Llama-3.1 produced the same amount of air pollution as more than 10,000 round trips by car between Los Angeles and New York City.

The increased air pollution from backup generators at data centers running AI caused regional public health costs of approximately $190 million to $260 million a year, the UC Riverside and Caltech researchers found.

AI’s impact on electricity use

A 2024 report from the International Energy Agency estimated that a single ChatGPT request consumes about 2.9 watt-hours of electricity, nearly 10 times the roughly 0.3 watt-hours of a standard Google search. If Google served all of its searches this way, the agency calculated, it would require almost 10 terawatt-hours of additional electricity per year.
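For a rough sense of the arithmetic, here is a back-of-envelope check in Python of the IEA comparison. The per-query figures are the agency’s estimates as cited above, and the daily search volume is its round number, not a measured count.

    # Back-of-envelope check of the IEA per-query comparison.
    WH_PER_GOOGLE_SEARCH = 0.3    # IEA estimate, watt-hours per search
    WH_PER_CHATGPT_REQUEST = 2.9  # IEA estimate, watt-hours per request
    SEARCHES_PER_DAY = 9e9        # round daily search volume used by the IEA

    extra_wh_per_year = ((WH_PER_CHATGPT_REQUEST - WH_PER_GOOGLE_SEARCH)
                         * SEARCHES_PER_DAY * 365)

    # 1 terawatt-hour (TWh) = 1e12 watt-hours.
    print(f"{extra_wh_per_year / 1e12:.1f} TWh of additional electricity per year")
    # Prints roughly 8.5 -- the IEA's "almost 10 TWh"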

AI’s impact on water use

Drawing more electricity could strain already struggling utilities, leading to brownouts or blackouts. And drawing water from already drought-prone areas, such as rapidly developing Phoenix, Arizona, or the deserts of California, could contribute to habitat loss and wildfires.

SEE: Sending One Email With ChatGPT is the Equivalent of Consuming One Bottle of Water

Does training or everyday use of AI consume more resources?

“Training is a time-consuming and energy-intensive process,” the IEA wrote in its 2025 Energy and AI World Energy Outlook Special Report. At its maximum rated power consumption, one GPU of the type suited to AI training draws about as much electricity as a toaster. The agency calculated it took 42.4 gigawatt-hours to train OpenAI’s GPT-4, the equivalent of the daily household electricity use of 28,500 households in an advanced economy.

What about everyday use? Query size, model size, the degree of inference-time scaling, and other factors determine how much electricity an AI model uses during the inference stage, when it parses and responds to a prompt. These variables, combined with a lack of data about the size and implementation of consumer AI models, make the environmental impact very difficult to measure. However, generative AI undeniably draws more power than conventional computing.
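To make the measurement problem concrete, the toy Python model below shows how those variables compound. The coefficient is purely hypothetical, chosen only to show relative scale; none of these numbers are measured values.

    def inference_energy_wh(prompt_tokens, output_tokens,
                            params_billions, reasoning_passes=1):
        """Toy per-query inference energy estimate, in watt-hours."""
        WH_PER_BILLION_PARAMS_PER_TOKEN = 1e-4  # hypothetical coefficient
        tokens_processed = prompt_tokens + output_tokens * reasoning_passes
        return tokens_processed * params_billions * WH_PER_BILLION_PARAMS_PER_TOKEN

    # The same 100-token question costs very different amounts of energy
    # depending on what kind of model answers it:
    print(inference_energy_wh(100, 300, params_billions=8))        # small model
    print(inference_energy_wh(100, 300, params_billions=70))       # larger model
    print(inference_energy_wh(100, 300, 70, reasoning_passes=10))  # reasoning model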

“The inference phase (also the operational phase) was already responsible for the majority (60%) of AI energy costs at Google even before mass adoption of generative AI applications happened (2019-2021),” wrote Alex de Vries, founder of the research blog Digiconomist and the Bitcoin Energy Consumption Index, in an email to TechRepublic. “Even though we don’t have exact numbers, mass adoption of AI applications will have increased the weight of the inference (/operational) phase even further.”

Meanwhile, AI models continue to expand. “Increasing the model size (parameters) will result in better performance, but increases the energy use of both training and inference,” said de Vries.

DOWNLOAD: This Greentech Quick Glossary from TechRepublic Premium

DeepSeek claimed to be more energy efficient, but it’s complicated

DeepSeek’s AI models have been lauded for achieving as much as their major competitors while consuming less energy and costing less; however, the reality is more complicated.

DeepSeek’s mixture-of-experts approach reduces costs by splitting the model into specialized subnetworks, or “experts,” and activating only the few that are relevant to a given input, so training doesn’t require as much computational power or energy. However, the IEA found that the inference-time scaling method DeepSeek-R1 uses in everyday operation consumes a significant amount of electricity. In general, large reasoning models consume the most electricity: training them is less demanding, but using them is more demanding, according to MIT Technology Review.
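For readers curious about the mechanics, here is a minimal NumPy sketch of top-k mixture-of-experts routing; the sizes and gating scheme are illustrative stand-ins, not DeepSeek’s actual architecture.

    import numpy as np

    rng = np.random.default_rng(0)
    N_EXPERTS, TOP_K, D_MODEL = 8, 2, 16

    # Each "expert" is a small feed-forward weight matrix; the gate (router)
    # scores how relevant each expert is to a given token.
    experts = [rng.standard_normal((D_MODEL, D_MODEL)) for _ in range(N_EXPERTS)]
    gate = rng.standard_normal((D_MODEL, N_EXPERTS))

    def moe_forward(x):
        scores = x @ gate                  # one router score per expert
        top = np.argsort(scores)[-TOP_K:]  # indices of the k highest scores
        weights = np.exp(scores[top] - scores[top].max())
        weights /= weights.sum()           # softmax over the chosen experts
        # Only TOP_K of the N_EXPERTS matrices are multiplied; the rest
        # cost no compute for this token.
        return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

    output = moe_forward(rng.standard_normal(D_MODEL))
    print(f"Active experts per token: {TOP_K} of {N_EXPERTS} "
          f"(~{TOP_K / N_EXPERTS:.0%} of expert compute)")

The trade-off is that all experts must still be trained and held in memory, even though each token only pays the compute cost of a few of them.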

“DeepSeek-R1 and OpenAI’s o1 model are substantially more energy intensive than other large language models,” the IEA wrote in its 2025 Energy and AI report.

The IEA also pointed out the “rebound effect”: a product’s increased efficiency leads more users to adopt it, so the product ends up consuming more resources overall.
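The figures below are invented purely to illustrate that dynamic: halving per-query energy does not halve total consumption if cheaper queries quadruple usage.

    # Made-up figures illustrating the rebound effect.
    wh_per_query = 3.0        # energy per query before the efficiency gain
    queries_per_day = 1e9     # adoption before the efficiency gain
    before_gwh = wh_per_query * queries_per_day / 1e9

    wh_per_query *= 0.5       # the model becomes twice as efficient...
    queries_per_day *= 4      # ...and the cheaper service sees 4x the use
    after_gwh = wh_per_query * queries_per_day / 1e9

    print(f"Before: {before_gwh:.1f} GWh/day")  # 3.0 GWh/day
    print(f"After:  {after_gwh:.1f} GWh/day")   # 6.0 GWh/day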

Can AI offset the resources it consumes?

Tech companies still like to present themselves as good stewards. Google pursues energy-conscious certifications globally, including signing the Climate Neutral Data Centre Pact in Europe. Microsoft, which has likewise reported rising water and electricity use in its 2024 sustainability reporting, is considering reopening a nuclear power plant at Three Mile Island in Pennsylvania to power its AI data centers.

SEE: The proliferation of AI has created a sustained boom in data centers and related infrastructure.

Supporters of AI might argue its benefits outweigh the risks. Generative AI can be used in sustainability projects: it can help comb through massive datasets on carbon emissions or track greenhouse gas emissions. AI companies also continue to work on improving the efficiency of their models. But what “efficiency” really means always seems to be the catch.

“There are some bottlenecks (like e.g. grid capacity) that could hold back the growth in AI and its power demand,” said de Vries. “This is hard to predict, also considering that it’s not possible to predict future demand for AI (for example the AI hype could fade to a certain extent), but any hope for limiting AI power demand comes from this. Due to the ‘bigger is better’ dynamic AI is fundamentally incompatible with environmental sustainability.”

Then there is the question of how far down the supply chain AI’s impact should be counted. “Indirect emissions from the consumption of electricity are the most significant component of emissions from hardware manufacturing [of semiconductors],” said the IEA in the Energy and AI report.

The cost of hardware, and of running it, has fallen as companies better understand the needs of generative AI and pivot to products built for it.

“At the hardware level, costs have declined by 30% annually, while energy efficiency has improved by 40% each year,” according to Stanford University’s 2025 AI Index Report.
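Compounding the two rates the AI Index quotes gives a sense of the pace; the calculation below simply applies them over one, three, and five years.

    # Compound the Stanford AI Index rates quoted above.
    COST_DECLINE_PER_YEAR = 0.30     # hardware cost falls 30% annually
    EFFICIENCY_GAIN_PER_YEAR = 0.40  # energy efficiency improves 40% annually

    for years in (1, 3, 5):
        relative_cost = (1 - COST_DECLINE_PER_YEAR) ** years
        relative_energy = 1 / (1 + EFFICIENCY_GAIN_PER_YEAR) ** years
        print(f"After {years} year(s): cost x{relative_cost:.2f}, "
              f"energy per unit of compute x{relative_energy:.2f}")
    # After 5 years: cost x0.17, energy per unit of compute x0.19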

DOWNLOAD: This IT Data Center Green Energy Policy from TechRepublic Premium

Consider how generative AI affects your business’ environmental targets

Generative AI is becoming mainstream. Microsoft’s Copilot is included by default in some PCs; smartphone makers are eagerly adding video editing AI and assistants; and Google gives out its Gemini Advanced model for free to students.

Tech companies that set promising sustainability targets may find it difficult to hit their goals now that they produce and use generative AI products.

“AI can have dramatic impacts on ESG reports and also the ability of the companies concerned to reach their own climate goals,” said de Vries.

DOWNLOAD: This Customizable Environmental Policy from TechRepublic Premium

According to Google’s 2024 Environmental Report, the tech giant’s data centers consumed 17% more water year over year. Google attributed this to “the expansion of AI products and services” and noted “similar growth in electricity use.” Data center waste generation increased as well.

“As AI adoption accelerates, IT leaders are increasingly aware that smarter devices don’t directly correlate to more efficient power consumption,” said Dan Root, head of global strategic alliances at ClickShare. “The spike in compute demand from AI tools means IT departments must look for offset opportunities elsewhere in their stack.”

As the International Energy Agency pointed out in its 2024 electricity report, both the source of electricity and the infrastructure need to be considered if the world is to meet the energy demands of AI.

“You can make/keep models a bit smaller to reduce their energy requirement, but this also means you have to be prepared to sacrifice performance,” said de Vries.
