Windows Central
Kevin Okemwa

A new report reveals that ChatGPT consumes 17,000 times more electricity per day than the average US household


What you need to know

  • New research highlights that ChatGPT uses 17,000 times more electricity daily than an average US household.
  • The data scientist behind the study estimates that the AI sector could consume between 85 and 134 terawatt-hours annually by 2027, roughly half a percent of global electricity consumption.
  • ChatGPT reportedly consumes a bottle of water's worth of cooling per query and costs about $700,000 a day to run.

A new report indicates that ChatGPT consumes up to 17,000 times more electricity than the average US household uses daily (via The New Yorker). Elon Musk recently claimed we're on the verge of the biggest technology revolution yet with AI, but warned that there won't be enough electricity to power these advancements by 2025.

Putting this into perspective, The New Yorker indicates that an average US household consumes up to 29 kilowatt-hours daily. That works out to roughly 493,000 kilowatt-hours for ChatGPT every day. With the exponential growth, advances, and rapid adoption of generative AI worldwide, things could go from bad to worse.
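As a quick back-of-the-envelope check of those figures (a sketch using the article's numbers, not any official OpenAI measurement):

```python
# Rough arithmetic behind the report's figures; both inputs are the
# article's assumed values, not measured data.
HOUSEHOLD_KWH_PER_DAY = 29    # average US household, per The New Yorker
CHATGPT_MULTIPLIER = 17_000   # ChatGPT vs. a single household

chatgpt_kwh_per_day = HOUSEHOLD_KWH_PER_DAY * CHATGPT_MULTIPLIER
print(f"ChatGPT: ~{chatgpt_kwh_per_day:,} kWh per day")  # ~493,000 kWh/day
```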

According to a research paper published by Dutch National Bank data scientist Alex de Vries, if Google were to integrate AI into every search, the technology's electricity consumption would surge to approximately 29 billion kilowatt-hours per year. The data scientist notes that this exceeds what many countries, including Kenya, Guatemala, and Croatia, consume annually.
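To put that scenario in the same units used later in this piece, 29 billion kilowatt-hours is 29 terawatt-hours; the one-liner below is just that conversion, not a figure from the paper:

```python
# Unit conversion for de Vries' Google scenario: 1 TWh = 1 billion kWh.
google_ai_kwh_per_year = 29e9  # ~29 billion kWh/year
google_ai_twh_per_year = google_ai_kwh_per_year / 1e9
print(f"~{google_ai_twh_per_year:.0f} TWh per year")  # ~29 TWh/year
```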

While speaking to Business Insider, de Vries indicated:

"AI is just very energy intensive. Every single of these AI servers can already consume as much power as more than a dozen UK households combined. So the numbers add up really quickly."

Per de Vries' estimates (assuming current trends hold), by 2027 the AI sector could consume anywhere from 85 to 134 terawatt-hours annually. While speaking to The Verge, the data scientist added:

"You're talking about AI electricity consumption potentially being half a percent of global electricity consumption by 2027. I think that's a pretty significant number." 

Concerns go well beyond power consumption


ChatGPT is arguably one of the best AI-powered chatbots right now. A recent report disclosed that the AI assistant still dominates mobile market share, even after Microsoft shipped Copilot to iOS and Android users with free access to OpenAI's GPT-4 model and DALL-E 3 image generation technology.

The chatbot has achieved incredible feats, including developing software in under 7 minutes, generating free Windows keys, and bypassing paywalls. While impressive, all of this comes at an exorbitant cost.

OpenAI spends up to $700,000 daily to keep ChatGPT running, which could explain the reports that surfaced last year indicating that the ChatGPT maker was on the verge of bankruptcy. This is amid reports that the chatbot is losing accuracy and seemingly getting dumber.

Last year, new research showed that Microsoft Copilot and ChatGPT could consume enough electricity to power a small country for a year by 2027. This is on top of the high demand for cooling water: the chatbot reportedly goes through a bottle of water for cooling with every query.
