by Mary Manning, Heidelberg IWC
Most of us are familiar with Artificial Intelligence (AI) through consumer applications like ChatGPT or customer service chatbots. Perhaps you have used ChatGPT or another generative AI application to summarize an article or email, plan a meal or even a vacation. These tools are easy to use and can help automate daily tasks or brainstorm ideas when we are stuck.
Tech companies are making big bets on the future of AI, with Google planning to spend $85 billion on AI in 2025 alone.1 Mark Zuckerberg reportedly offered one engineer $1.5 billion to join Meta.2 Obviously, the industry sees great potential in this area, with promises that AI will one day cure cancer or solve climate issues. But in pursuing these potential future benefits, we are already incurring significant environmental costs.
Most of the industry investment is directed toward building more data centers to run the increasingly complex software and models behind AI – and these data centers have an enormous environmental footprint, in terms of both electricity and water consumption. A single AI-focused data center can use as much electricity as a small city and as much water as a large neighborhood. The electricity demand created by this pace of data center construction cannot be met sustainably and is largely supplied by fossil fuels like coal and methane gas, although the newest solution seems to be turning to nuclear power. While nuclear power is carbon free, it carries risks, produces long-term radioactive waste, and consumes substantial fossil fuels during construction.
Because they are so easy to use, it may not be clear why an AI search is so much more energy intensive than a traditional Google search. A Google search uses relatively simple algorithms that match keywords against an existing index of web pages and then rank those results, requiring minimal real-time computation. It is similar to looking up a word in a dictionary, where the work of compiling the dictionary has already been done. AI-powered searches (like ChatGPT or Google's AI Overview) function completely differently. Each prompt flows through a server that runs thousands of calculations to determine the best words to use in a response. Instead of retrieving pre-existing answers, the AI model generates entirely new text by processing your query through billions of parameters and computing probability distributions for each possible next word. According to a report by Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity as a Google search. Another way to visualize this: a single ChatGPT query uses roughly as much electricity as lighting a light bulb for about 20 minutes.
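For technically curious readers, the difference in work can be sketched in a few lines of Python. Everything below is invented for illustration – a toy index and a trivial scoring rule stand in for a real search engine and for billions of model parameters – but it shows the shape of the contrast: retrieval is one cheap lookup, while generation must score every candidate word at every step.

```python
import math
import random

# "Traditional search": one cheap lookup in a pre-built index.
# (Toy data; a real index covers billions of pages.)
index = {"heidelberg": "Heidelberg is a city in Germany."}

def keyword_search(query):
    return index.get(query.lower(), "No result.")

# "Generative AI": repeated scoring of every word in the vocabulary.
vocab = ["Heidelberg", "is", "a", "city", "in", "Germany", "."]

def next_word_scores(context):
    # A real model runs billions of parameters here; we fake it
    # with a trivial rule just to show where the work happens.
    return [len(w) - 0.1 * abs(len(context) - i) for i, w in enumerate(vocab)]

def softmax(scores):
    # Turn raw scores into a probability distribution over words.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def generate(n_words):
    random.seed(0)  # fixed seed so the sketch is repeatable
    out = []
    for _ in range(n_words):
        # Every word in the vocabulary is scored at EVERY step.
        probs = softmax(next_word_scores(out))
        out.append(random.choices(vocab, weights=probs)[0])
    return " ".join(out)

print(keyword_search("Heidelberg"))  # one dictionary lookup
print(generate(5))                   # n_words * len(vocab) scoring operations
```

In the toy version the extra work is trivial; in a real model, each of those scoring steps involves billions of multiplications, which is where the tenfold electricity difference comes from.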
This marks a disappointing reversal of recent trends. For at least 20 years, American electricity consumption had hardly grown at all – owing in part to steady advances in energy efficiency, with the tech industry leading that innovation. But that is about to change, mainly due to data centers. Google said its greenhouse gas emissions last year were 48% higher than in 2019, while Microsoft said its emissions had grown 29% since 2020. Google had promised to be carbon neutral by 2030, and Microsoft had set the goal of being carbon negative by that date, but it is hard to imagine they will meet those goals given their plans for AI.
It is also important to recognize who is bearing the cost of these environmental impacts.
“Energy demands are often offloaded to data centers in regions with weak environmental regulations, like the Global South. Amazon, Google, and Microsoft have constructed data centers in Africa and Southeast Asia, attracted by cheap electricity and limited governance oversight. These centers worsen local environmental degradation and natural resource depletion in areas already heavily impacted by climate change. For instance, in Kenya, where droughts and environmental crises have worsened recently, energy-intensive data centers by global tech companies raise significant concerns about the long-term sustainability of AI development in these regions.”3
So while the bulk of the wealth generated by this new technology accrues in the Global North, it is the Global South that pays the price, a phenomenon that Karen Hao, author of Empire of AI, refers to as a new form of colonialism.
So what can we do? First and foremost, consider your personal use of AI. If you are a Google user, you can turn off the automatic AI Overviews that are produced every time you use the Search function. Second, we need to advocate for more and better regulation governing mandatory emissions reporting, energy efficiency standards, restrictions on training AI models during peak grid demand, and requirements for renewable energy use. Finally, note that there may be areas in which AI can help reduce global emissions, such as managing power grids more efficiently, identifying new types of protein to replace meat and dairy in human diets, or lowering the cost of electric vehicles through battery improvements.
Sources:
1 Google's $85 billion capital spend spurred by cloud, AI demand | CNBC
References:
AI has an environmental problem. Here’s what the world can do about that. | UN Environment Programme
Karen Hao on How AI Is Threatening Democracy & Creating a New Colonial World | Democracy Now
The Carbon Footprint of Amazon, Google, and Facebook Is Growing | Sierra Club
AI brings soaring emissions for Google and Microsoft, a major contributor to climate change | NPR
Explained: Generative AI’s environmental impact | MIT News
What Are the Environmental Impacts of Artificial Intelligence? | Union of Concerned Scientists
Google turns to nuclear to power AI data centres | BBC
Massive AI spending shows early payoff for Big Tech | The Hill
Image from Canva.com