Tom’s Hardware
Jowi Morales

A single modern AI GPU can consume up to 3.7 MWh of electricity per year — the GPUs sold last year alone will use more electricity than 1.3 million homes

Nvidia data center GPUs.

Today's most powerful new data center GPUs for AI workloads can consume as much as 700 watts apiece. At a 61% average annual utilization, that works out to about 3,740,520 Wh, or 3.74 MWh, per GPU per year, fueling concerns about power availability and environmental impact, especially once we zoom out to the total number of GPUs sold last year.
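As a sanity check, the per-GPU figure falls out of a few lines of Python; the 700 W draw and 61% utilization are the assumptions stated above:

```python
# Annual energy for one 700 W data center GPU at 61% average utilization
gpu_power_w = 700          # peak draw of a top AI GPU such as Nvidia's H100
utilization = 0.61         # assumed average annual utilization
hours_per_year = 24 * 365  # 8,760 hours

wh_per_year = gpu_power_w * hours_per_year * utilization
print(f"{wh_per_year:,.0f} Wh, or {wh_per_year / 1e6:.2f} MWh per GPU per year")
# 3,740,520 Wh, or 3.74 MWh per GPU per year
```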

Nvidia sold 3.76 million data center GPUs last year, accounting for 98% of the market. Adding the remaining 2% from Intel, AMD, and other players brings the total to more than 3,836,000 GPUs delivered to data centers in 2023.
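Working backward from Nvidia's reported 98% share gives the industry-wide unit count:

```python
# Estimate total 2023 data center GPU shipments from Nvidia's market share
nvidia_units = 3_760_000   # Nvidia data center GPUs sold in 2023
nvidia_share = 0.98        # Nvidia's share of the market

total_units = nvidia_units / nvidia_share
print(f"{total_units:,.0f} GPUs shipped industry-wide")
# 3,836,735 GPUs shipped industry-wide
```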

Multiply that per-GPU figure by last year's total data center GPU deliveries, and you get 14,348.63 GWh of electricity per year. To put this in context, the average American home uses 10,791 kWh per year, meaning the data center GPUs sold last year consume as much power as about 1.3 million households do annually. At the state level, the California Energy Commission reported that California generated 203,257 GWh in 2022, meaning these GPUs consume about 7% of the state's annual production.
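Chaining those numbers together reproduces the headline figures; the household average (10,791 kWh) and California's 2022 generation (203,257 GWh) are the values cited above:

```python
# Aggregate annual energy for the data center GPUs sold in 2023,
# plus the household and state comparisons quoted in the article.
mwh_per_gpu = 3.74052        # per-GPU figure computed above
total_gpus = 3_836_000       # industry-wide 2023 shipments estimated above

total_gwh = mwh_per_gpu * total_gpus / 1_000      # MWh -> GWh
homes = total_gwh * 1_000_000 / 10_791            # avg US home: 10,791 kWh/yr
ca_share = total_gwh / 203_257                    # CA 2022 output: 203,257 GWh

print(f"{total_gwh:,.2f} GWh per year")                      # 14,348.63 GWh
print(f"about {homes / 1e6:.1f} million US homes")           # ~1.3 million
print(f"about {ca_share:.0%} of California's 2022 output")   # ~7%
```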

However, you should remember that these figures cover only the GPUs themselves. They do not include the CPUs, cooling systems, and other equipment data centers need to run properly. For a sense of whole-system overhead, Nvidia recommends a minimum 850-watt power supply for most RTX 4090 builds, with some cards requiring 1,000 or even 1,200 watts. Scaling by that 850-watt minimum, the servers built around last year's GPUs would need over 17,400 GWh annually at the same utilization. And these numbers still exclude data centers from 2022 and earlier, as well as the many more coming online this year.
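Here is a rough sketch of that whole-system scaling, assuming the 850 W minimum applies across the board and the same 61% utilization holds:

```python
# Rough whole-system estimate: scale the GPU-only total by the
# 850 W minimum system requirement vs. the GPU's own 700 W draw.
gpu_only_gwh = 14_348.63     # GPU-only annual total from above
system_w, gpu_w = 850, 700   # assumed minimum system power vs. GPU power

system_gwh = gpu_only_gwh * system_w / gpu_w
print(f"about {system_gwh:,.0f} GWh/yr at the same 61% utilization")
# about 17,423 GWh/yr at the same 61% utilization
```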

Industry analysts estimate that the data center GPU market will grow 34.6% year-on-year through 2028, meaning we will likely see even more data center GPUs shipped in the coming years. Furthermore, Nvidia's next generation of AI GPUs is expected to draw more power than the current 700-watt H100. Even if per-unit power consumption were to hold steady in the coming years (it won't), power demand for data centers should increase proportionally with the market's growth.
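Holding per-unit power flat and compounding at the cited 34.6% rate gives a sense of the trajectory. This is a simplistic sketch, since it treats the energy use of each year's new shipments as growing in lockstep with the market:

```python
# Project the annual energy use of each year's new GPU shipments,
# compounding at the cited 34.6% CAGR with per-unit power held flat
# (which, as the article notes, is optimistic).
demand_gwh = 14_348.63   # 2023 estimate from above
cagr = 0.346

for year in range(2024, 2029):
    demand_gwh *= 1 + cagr
    print(f"{year}: {demand_gwh:,.0f} GWh")
# 2028: about 63,392 GWh, over 4x the 2023 figure
```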

This unprecedented rise of data centers is raising concerns about our power infrastructure. In fact, the U.S. government is already in talks with tech companies about their AI electricity demands, especially as these new data centers could put undue pressure on the grid. Meta founder Mark Zuckerberg has even said that limited power will constrain AI growth, especially as Enerdata notes that global power production has grown by only 2.5% per year over the last decade.

Nevertheless, tech companies are not blind to this issue. Microsoft, traditionally a software company, is even looking to invest in small modular nuclear reactors for its data centers. This is especially important given its partnership with OpenAI to build a $100 billion AI supercomputer, which will certainly require a ton of power.

The rise of AI in our data-driven society means we need a lot of electricity to power our computing requirements. And we mustn't forget other upcoming technologies that also need a lot of juice, like EVs. Unless we find a way to develop chips (and motors) that deliver more performance while consuming less power, we will likely have to build more power generation facilities and upgrade the supporting infrastructure to deliver that electricity where it's needed.
