Tom’s Hardware
Technology
Jowi Morales

AI engineers claim new algorithm reduces AI power consumption by 95% — replaces complex floating-point multiplication with integer addition


Engineers from BitEnergy AI, a firm specializing in AI inference technology, have developed a method of artificial intelligence processing that replaces floating-point multiplication (FPM) with integer addition.

The new method, called Linear-Complexity Multiplication (L-Mul), approximates the results of FPM with a simpler, addition-based algorithm while maintaining the high accuracy and precision FPM is known for. As TechXplore reports, the method could cut the power consumption of AI systems by up to 95%, a potentially crucial development for the future of AI.
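To give a sense of how multiplication can be traded for integer addition, here is a minimal Python sketch of the general principle rather than BitEnergy AI's exact L-Mul algorithm: because an IEEE-754 bit pattern is roughly a scaled logarithm of the value it encodes, adding the raw bit patterns of two positive floats (and subtracting a constant bias) approximates their product. The helper names and the use of the 1.0f bit pattern as the bias are illustrative assumptions.

```python
import struct

# Illustrative sketch only: a classic bit-pattern trick for approximating a
# floating-point product with one integer addition. It is NOT BitEnergy AI's
# published L-Mul algorithm, only a demonstration of why the substitution is
# plausible: an IEEE-754 bit pattern grows roughly like log2 of the value,
# so adding bit patterns roughly adds logarithms, i.e. multiplies values.

FLOAT32_ONE_BITS = 0x3F800000  # raw bit pattern of 1.0f, used as a bias offset


def float_to_bits(x: float) -> int:
    """Reinterpret a float32 as its raw 32-bit integer pattern."""
    return struct.unpack("<I", struct.pack("<f", x))[0]


def bits_to_float(n: int) -> float:
    """Reinterpret a 32-bit integer pattern as a float32."""
    return struct.unpack("<f", struct.pack("<I", n & 0xFFFFFFFF))[0]


def approx_mul(a: float, b: float) -> float:
    """Approximate a * b (both positive) using a single integer addition."""
    return bits_to_float(float_to_bits(a) + float_to_bits(b) - FLOAT32_ONE_BITS)


if __name__ == "__main__":
    for a, b in [(3.0, 7.0), (0.25, 1.5), (1234.5, 0.001)]:
        exact = a * b
        approx = approx_mul(a, b)
        rel_err = abs(approx - exact) / exact
        print(f"{a} * {b}: exact={exact:.6g} approx={approx:.6g} rel err={rel_err:.2%}")
```

Running this sketch shows relative errors of a few percent at most; the L-Mul work reportedly refines this kind of approximation so that accuracy stays close to full floating-point multiplication while the hardware only performs additions.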

Since this is a new technique, popular and readily available hardware on the market, like Nvidia's upcoming Blackwell GPUs, isn't designed to handle this algorithm. So even if BitEnergy AI's algorithm is confirmed to perform at the same level as FPM, we still need systems that can handle it. That might give a few AI companies pause, especially after they have just invested millions, or even billions, of dollars in AI hardware. Nevertheless, a massive 95% reduction in power consumption would probably make the biggest tech companies jump ship, especially if AI chip makers build application-specific integrated circuits (ASICs) that take advantage of the algorithm.

Power is now the primary constraint on AI development: the data center GPUs sold last year alone consume more power than one million homes do in a year. Even Google has put its climate targets on the back burner because of AI's power demands, with its greenhouse gas emissions rising by 48% compared with 2019 instead of declining year-on-year as planned. The company's former CEO has even suggested opening the floodgates on power production by dropping climate goals and using more advanced AI to solve the global warming problem.

But if AI processing can be made more power efficient, we may still be able to get advanced AI technologies without sacrificing the planet. A 95% drop in energy use would also ease the burden these massive data centers place on the national grid, reducing the need to rush new power plants into service.

While most of us are amazed by the extra performance each new generation of AI chips delivers, true advancement comes only when those processors become both more powerful and more efficient. So, if L-Mul works as advertised, humanity could have its AI cake and eat it, too.
