Fortune
Jane Thier

AI doesn’t just require tons of electric power. It also guzzles enormous sums of water.

Server room with big data (Credit: Jasmin Merdad - Getty Images)

As if there weren’t already enough arguments against AI from an environmental standpoint, a new waterfall of data might push even the most ambivalent consumer over the edge.

Per the International Energy Agency, energy consumption by global data centers could more than double by 2026, “reaching levels that exceed large nations.” Ironically, “while we’re using AI to solve some of the world’s biggest challenges—from climate modeling to health-care breakthroughs—we’re also contributing to an environmental crisis of a different kind,” Chris Gladwin, a tech founder and CEO, wrote for Fortune recently. 

Now, new reporting finds that OpenAI’s ChatGPT, which uses the GPT-4 language model, consumes 519 milliliters of water, just over one bottle’s worth, to write a 100-word email. That’s according to the Washington Post, in a research collaboration with the University of California, Riverside.

To shoot off one email per week for a year, ChatGPT would use up about 27 liters of water, roughly a jug and a half. Zooming out, WaPo wrote, that means if one in 10 working Americans (about 16 million people) asked ChatGPT to write an email a week, it would cost more than 435 million liters of water.
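For readers who want to check that arithmetic themselves, here is a minimal back-of-envelope sketch using only the figures quoted in this article: 519 milliliters per email, one email a week, and a round 16 million users. WaPo’s published total of more than 435 million liters rests on its own underlying estimates, so the round numbers here land slightly lower but in the same ballpark.

```python
# Back-of-envelope check on the WaPo / UC Riverside figures cited above.
# The 519 ml per 100-word email, the weekly cadence, and the ~16 million users
# are the article's numbers; the exact published total depends on WaPo's own
# rounding and population base.

ML_PER_EMAIL = 519            # milliliters of water per 100-word email
WEEKS_PER_YEAR = 52
USERS = 16_000_000            # roughly one in 10 working Americans, per the article

liters_per_user_per_year = ML_PER_EMAIL * WEEKS_PER_YEAR / 1000   # ~27 liters
total_liters = liters_per_user_per_year * USERS                   # ~432 million liters

print(f"Per user, per year: {liters_per_user_per_year:.1f} liters")
print(f"Across {USERS:,} users: {total_liters / 1e6:.0f} million liters")
```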

While much has been made of the electricity each ChatGPT prompt demands, the conversation around water has gained steam in recent months.

As WaPo explained, every prompt a user enters into ChatGPT is quickly turned into code and “flows through a server that runs thousands of calculations to determine the best words to use in a response.” All of those calculations run on real, physical servers housed in enormous data centers around the world. Spitting out an answer, or carrying out a command, makes the servers heat up, like an old laptop under duress.

This is where water comes in: to keep those all-important servers from overheating and breaking down, data centers rely on cooling mechanisms, often “cooling towers” that themselves require water. How much water and electricity each facility uses depends on the climate where it’s based. West Des Moines, Iowa, is quickly becoming a popular destination, owing to a temperate climate that calls for fewer cooling interventions.

“We haven’t come to the point yet where AI has tangibly taken away our most essential natural water resources,” wrote Shaolei Ren, an associate professor of engineering at UC Riverside who has spent years trying to quantify AI’s climate impact. Nonetheless, Ren called AI’s increasing water usage “definitely concerning.”

Amid rapid population growth and a changing climate, “depleting water resources and aging water infrastructures” are among the most pressing challenges, he wrote in November. “The concern is not only about the absolute amount of AI models’ water usage, but also about how AI model developers respond to the shared global challenge of water shortage.”

Droughts, he noted, are among the most immediate consequences of climate change, and it’s incumbent upon businesses to address water usage in their operations—and tech firms using generative AI top that list. “We already see heated tensions over water usage between AI data centers and local communities,” Ren wrote. “If AI models keep on guzzling water, these tensions will become more frequent and could lead to social turbulence.”

In Microsoft’s sustainability report last year, the company said its global water consumption had spiked 34% between 2021 and 2022. Over the same period, Google’s water usage rose 20%, it wrote in its own report. “It’s fair to say” that the majority of that growth at both companies “is due to AI,” Ren told the AP at the time. (Microsoft’s data centers used up an estimated 700,000 liters of water in training GPT-3, WaPo reported.)

Holly Alpine, formerly Microsoft’s senior program manager of Datacenter Community Environmental Sustainability, resigned from the company earlier this year on principle, she wrote for Fortune, citing what she saw as the company’s ecologically irresponsible AI development.

“Analyst reports suggest that advanced technologies—such as AI or machine learning—have the potential to increase fossil fuel yield by 15%, contributing to a resurgence of oil and potentially delaying the global transition to renewable energy,” Alpine wrote. “The real-world impacts are staggering: A single such deal between Microsoft and ExxonMobil could generate emissions that exceed Microsoft’s 2020 annual carbon removal commitments by over 600%.”

When she was a Microsoft employee, she wrote, she witnessed “dozens” of such deals.
