Artificial intelligence (AI) is a rapidly growing field, and its significant energy consumption is raising concern. OpenAI's ChatGPT, a popular chatbot, is estimated to consume over half a million kilowatt-hours of electricity daily to handle around 200 million requests. For perspective, the average US household uses about 29 kilowatt-hours per day, so ChatGPT's daily energy usage is more than 17,000 times that of a typical household.
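The household comparison is straightforward to verify; the sketch below uses the article's two estimates (both are reported figures, not measurements):

```python
# Back-of-envelope check of the figures above.
chatgpt_daily_kwh = 500_000   # estimated daily ChatGPT consumption (article figure)
household_daily_kwh = 29      # average US household per day (article figure)

ratio = chatgpt_daily_kwh / household_daily_kwh
print(f"ChatGPT uses roughly {ratio:,.0f}x a typical US household per day")
```

The ratio works out to a bit over 17,000, consistent with the claim.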
If generative AI technology becomes more widespread, the energy demands could escalate further. For instance, integrating generative AI into Google's search engine could lead to an annual consumption of 29 billion kilowatt-hours, surpassing the electricity usage of entire countries like Kenya, Guatemala, and Croatia.
Data scientist Alex de Vries has highlighted the energy-intensive nature of AI, noting that a single AI server can consume as much power as more than a dozen UK households combined. Although accurately estimating the AI industry's energy consumption is difficult, given varying model sizes and limited transparency from tech giants, projections suggest the sector could consume between 85 and 134 terawatt-hours annually by 2027.
This forecast implies that AI could account for half a percent of global electricity consumption by 2027, a substantial figure compared to other energy-intensive operations: Samsung uses nearly 23 terawatt-hours annually to power its operations, while Google and Microsoft consume slightly over 12 and 10 terawatt-hours, respectively.
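The "half a percent" figure can be sanity-checked against the projected range. Note that the global electricity consumption figure used here (roughly 25,000 terawatt-hours per year) is an outside assumption, not a number from the article:

```python
# Sanity check on the "half a percent of global electricity" projection.
GLOBAL_TWH = 25_000          # assumed global annual electricity use (not from the article)
low, high = 85, 134          # projected AI-sector range for 2027, in TWh (article figures)

share_low = low / GLOBAL_TWH * 100
share_high = high / GLOBAL_TWH * 100
print(f"AI share of global electricity: {share_low:.2f}% to {share_high:.2f}%")
```

Under that assumption, the upper end of the range lands near half a percent, which matches the forecast's framing.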
OpenAI, a key player in the AI landscape, has not commented on its energy consumption. The concerns surrounding AI's energy usage have prompted discussions about the industry's environmental impact and the need for sustainable practices going forward.