TechRadar
Wayne Williams

AMD adds ultra-fast memory to flagship AI Instinct accelerator as it looks forward to next gen CDNA 4 architecture — Instinct MI325X accelerator has 2x memory and 30% more bandwidth compared to Nvidia's H200

AMD Instinct MI325X accelerator.

AMD has unveiled new CPU, NPU and GPU architectures aimed at “powering end-to-end AI infrastructure from the data center to PCs”, alongside an expanded AMD Instinct accelerator roadmap and a new Instinct MI325X accelerator, which it says will be available in Q4 2024.

The new Instinct MI325X offers 288GB of HBM3E memory and 6TB/s of memory bandwidth. AMD says this gives it 2x the memory capacity and 1.3x the bandwidth of “the competition”, by which it means Nvidia's H200, as well as 1.3x better compute performance.
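AMD's multipliers can be sanity-checked with quick arithmetic. This is a minimal sketch: the MI325X figures come from the announcement above, while the H200 figures (141GB of HBM3e, 4.8TB/s) are assumptions drawn from Nvidia's published specifications, not from this article.

```python
# MI325X specs as stated in AMD's announcement
mi325x = {"memory_gb": 288, "bandwidth_tbps": 6.0}

# Assumed H200 specs (Nvidia's public spec sheet, not from this article)
h200 = {"memory_gb": 141, "bandwidth_tbps": 4.8}

memory_ratio = mi325x["memory_gb"] / h200["memory_gb"]
bandwidth_ratio = mi325x["bandwidth_tbps"] / h200["bandwidth_tbps"]

print(f"Memory capacity: {memory_ratio:.2f}x")   # ~2.04x, matching AMD's "2x" claim
print(f"Memory bandwidth: {bandwidth_ratio:.2f}x")  # 1.25x, which AMD rounds to "1.3x"
```

Under these assumed H200 numbers, the memory claim checks out almost exactly, while the bandwidth figure of 1.25x is rounded up slightly in AMD's marketing.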

The memory upgrade is the main change here: the MI325X uses the same CDNA 3 architecture as the MI300X, and clock speeds also appear unchanged at 2.1GHz.

Looking to the future

Following the Instinct MI325X will be the Instinct MI350 series. Expected to be available in 2025, this will be powered by the new CDNA 4 architecture, which AMD says will deliver up to a 35x increase in AI inference performance compared to the Instinct MI300 Series.

That will be followed in 2026 by the AMD Instinct MI400 series, which will be based on AMD’s CDNA Next-Gen architecture. The company, understandably, didn't go into too much detail here.

“The AMD Instinct MI300X accelerators continue their strong adoption from numerous partners and customers including Microsoft Azure, Meta, Dell Technologies, HPE, Lenovo and others, a direct result of the AMD Instinct MI300X accelerator's exceptional performance and value proposition,” said Brad McCredie, corporate vice president, Data Center Accelerated Compute, AMD.

“With our updated annual cadence of products, we are relentless in our pace of innovation, providing the leadership capabilities and performance the AI industry and our customers expect to drive the next evolution of data center AI training and inference.” 

