On a cool enough day, it is possible to see exhaust fumes curling out of tailpipes on the highway. The environmental impact of turning over a car's engine is immediately visible, which makes it easy to grasp. Opening a web browser likewise sets off carbon emissions, but those emissions are far removed from the end user; they occur in a data center miles away.
That doesn't mean, however, that data center emissions are not real. And when it comes to artificial intelligence, those emissions are even worse.
Related: ChatGPT's development nearly cost this small Midwestern city its drinking water
Large language models (LLMs) like ChatGPT, which run on large clusters of dense, power-hungry GPU chips, require a tremendous amount of electricity. But because OpenAI and its peers have refused to share details of their models with researchers, exact figures are impossible to verify. One Danish scientist estimated that ChatGPT used between 1.2 million and 23.4 million kWh in January.
OpenAI did not respond to a request for comment.
How those numbers translate into emissions is complicated, however. It largely comes down to where the model's data center is located, and therefore the kind of grid that powers it, as well as the volume of requests the model handles in any given period.
Bloom, an open-source LLM developed by Hugging Face, consumed 914 kWh of electricity and generated a carbon footprint of 340 kg of CO2 over an 18-day period in which it received 230,768 requests. ChatGPT, by comparison, passed 100 million monthly users in January, when it logged a total of 600 million visits; the site reached 1.4 billion total visits in August, according to Similarweb.
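To put those figures in rough perspective, here is a back-of-the-envelope sketch (in Python) using only Bloom's reported numbers; the implied grid carbon intensity and per-request footprint are derived values, not figures disclosed by Hugging Face or OpenAI, and they assume emissions scale linearly with energy use and traffic.

```python
# Back-of-the-envelope estimate using Bloom's reported deployment figures.
# Assumption: emissions scale linearly with energy use and request volume.

energy_kwh = 914        # electricity consumed over the 18-day period
emissions_kg = 340      # reported carbon footprint (kg CO2) over the same period
requests = 230_768      # requests served in that window

# Implied carbon intensity of the grid powering Bloom's data center (kg CO2 per kWh)
grid_intensity = emissions_kg / energy_kwh            # ~0.37 kg CO2/kWh

# Implied footprint per request, in grams of CO2
grams_per_request = emissions_kg / requests * 1000    # ~1.5 g CO2 per request

print(f"Implied grid intensity: {grid_intensity:.2f} kg CO2/kWh")
print(f"Per-request footprint: {grams_per_request:.2f} g CO2")
```

Applied to a service fielding hundreds of millions of visits a month, even a gram or two per request adds up quickly, which is why the underlying grid matters so much.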
Beyond their extraordinary (and largely undisclosed) emissions and electricity use, data centers must be kept within very specific temperature ranges; most pump in huge quantities of water to cool their facilities and hold those temperatures.
In July 2022, Microsoft (MSFT) pumped 11.5 million gallons of water, 6% of the district's total water usage, to its Iowa data centers, just weeks before OpenAI completed the training of GPT-4. The company's global water usage jumped 34% from 2021 to 2022, to 1.7 billion gallons, likely because of AI.
The Nordic solution
The solution is simple in concept, but less so in practice: For AI to be sustainable, the data centers that power it must run on green energy and must avoid over-consuming heating and cooling resources.
Verne Global is one company actively pursuing that solution.
Proud to partner with @VerneGlobal on sustainable AI powered by 100% renewable energy 🌎. As part of @huggingface growth, we have been working on extending our data center capacities to meet the new challenges ahead 🚀😎.
— Morgan Funtowicz (@MorganFunto) September 5, 2023
Verne opened its doors in 2012, seeking to provide clients with high-powered, fully sustainable data centers. That mission, CEO Dominic Ward told TheStreet in an interview, depended entirely on the physical location of those data centers.
So, Verne opened its first data center in Iceland, immediately taking advantage of what remains the only country in the world powered entirely by sustainable energy sources. Owing to its geography, Iceland draws on both geothermal and hydroelectric energy, generating a 100% sustainable, reliable mix of power.
Further, the country's temperate climate makes it unnecessary for Verne to pump in the kind of water other locations require for cooling; the center is kept cool simply by virtue of being in Iceland.
Building out a sustainable data center in other locations, Ward said, is a near-impossibility; less ideal locations would likely have to rely on wind or solar power, which, until scientists figure out how to store that energy effectively, remain far less reliable than hydroelectric and geothermal power. And data centers need a reliable power source.
The fact that a remote data center is even an option, Ward said, is a product of the steady expansion of cloud-based operations.
"Long gone are the challenges of trying to convince people that applications of data can work in a remote location," he said, noting the ongoing development of cable systems that "enable data to move around in a fully redundant manner all over the planet. So, everything's just pushing towards applications of data that can sit anywhere unless it's latency-sensitive. And of course, that's actually quite a small subset."
Verne has additional data centers in Finland and London and is developing another location in Norway.
Related: Here's the Steep, Invisible Cost Of Using AI Models Like ChatGPT
Part of Verne's approach, Ward said, went beyond finding optimal locations for its data centers: it also meant building out centers properly designed to handle dense compute.
Data centers are made up of "racks," which house servers and chips. A decade ago, a rack used about five to 10 kilowatts of power on average, Ward said. Now, with Nvidia (NVDA) chips in every server, racks are using 30 or more kilowatts.
"There's only so few places on the planet that that really works efficiently," Ward said. "And that's why we're in a perfect storm at the moment."
Verne's savings are not just in climate impact; its data centers use fewer resources, reducing the fiscal cost of running these power-intensive servers.
The sustainability effort here is largely one of thinking efficiently: where latency is not a factor, location becomes irrelevant. And if a data center doesn't need to be physically close to a company, it can be housed in an environment that balances power availability against efficiency, delivering powerful compute without pushing more carbon into the atmosphere.
"The compute that doesn't need to sit in those locations really shouldn't because you're not just benefiting yourselves as a customer that's saving money and reducing your carbon footprint," Ward said. "You're actually reducing the strain on the entire resource of a whole continent. And that's very much what our positioning has been. If it doesn't need to be in Virginia and Ashburn, then don't put it there."
"I think the entire industry can be doing a lot more on that."
If you work (or used to work) for OpenAI, contact Ian by email at ian.krietzberg@thearenagroup.net or via Signal at 732-804-1223.
Related: The ethics of artificial intelligence: the path toward responsible AI