Tom’s Hardware
Technology
Christopher Harper

Using GPT-4 to generate 100 words consumes up to 3 bottles of water — AI data centers also raise power and water bills for nearby residents


Research conducted by the University of California, Riverside, and shared by The Washington Post on Wednesday highlighted the high costs of using generative AI. It turns out AI requires heavy water consumption — used to cool the servers that generate the data — even when it's just generating text. This is, of course, in addition to the severe toll AI places on the electric grid.

The research noted that exact water usage varies by state and proximity to a data center, with lower water use generally corresponding to cheaper electricity and higher electricity use. Texas had the lowest water usage, at an estimated 235 milliliters needed to generate one 100-word email, while Washington demanded a whopping 1,408 milliliters per email — about three 16.9-oz water bottles.
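As a quick sanity check of the bottle comparison, here is a minimal sketch converting the study's per-email figures from milliliters into 16.9-oz bottles (the milliliter figures come from the research cited above; the bottle size is the article's own comparison point):

```python
# Convert the study's per-email water figures (mL) into 16.9 oz bottles.
ML_PER_FL_OZ = 29.5735  # US fluid ounce in milliliters
bottle_ml = 16.9 * ML_PER_FL_OZ  # ~500 mL per standard bottle

for state, ml in [("Texas", 235), ("Washington", 1408)]:
    print(f"{state}: {ml} mL is about {ml / bottle_ml:.2f} bottles")
```

Running this gives roughly 0.47 bottles for Texas and 2.82 for Washington, which matches the article's "about three bottles" characterization.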

This may not sound like a lot, but these figures add up quickly, especially when people use GPT-4 multiple times a week (or multiple times a day) — and this is just for plain text.

Data centers are heavy consumers of water and electricity, which also drives up the power and water bills of residents in the towns where they are built. For example, Meta needed 22 million liters of water to train its LLaMA-3 model — about how much water is needed to grow 4,439 pounds of rice, or, as researchers noted, "about what 164 Americans consume in a year."

The electric cost of GPT-4 is also quite high. If one in ten working Americans used GPT-4 once a week for a year (52 queries each across 17 million people), the corresponding power demand of 121,517 MWh would equal the electricity consumed by every household in Washington, D.C. (an estimated 671,803 people) for twenty days. That's nothing to scoff at, especially since this is an unrealistically light usage scenario for GPT-4's target audience.
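The aggregate figure above can be broken down into a rough per-query cost. This is a back-of-envelope sketch using only the numbers cited in the article; the per-query energy is a derived estimate, not a figure from the study:

```python
# Back-of-envelope check of the aggregate electricity figure.
users = 17_000_000        # one in ten working Americans
queries_per_user = 52     # once a week for a year
total_mwh = 121_517       # aggregate demand cited in the article

total_queries = users * queries_per_user
kwh_per_query = total_mwh * 1_000 / total_queries
print(f"{total_queries:,} queries, roughly {kwh_per_query:.3f} kWh each")
```

That works out to about 0.14 kWh per 100-word query — which makes it easy to see how heavier, more realistic usage would scale the total well past the stated figure.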

The Washington Post included quotes from OpenAI, Meta, Google, and Microsoft representatives, most of whom reaffirmed their companies' commitments to reducing environmental impact rather than outlining concrete courses of action.

Microsoft representative Craig Cincotta stated that the company is "working toward data center cooling methods that will eliminate water consumption completely" — a promising goal, though the statement offered no specifics on how. The AI profit motive has clearly taken priority over the environmental goals these companies set in the past, so even this quote should be taken with a grain of salt until actual improvements materialize.
