AnandTech
Ryan Smith

NVIDIA at SC23: H200 Accelerator with HBM3e and Jupiter Supercomputer for 2024

With faster and higher capacity HBM3e memory set to come online early in 2024, NVIDIA has been preparing its current-generation server GPU products to use the new memory. Back in August we saw NVIDIA’s plans to release an HBM3e-equipped version of the Grace Hopper GH200 superchip, and now for the SC23 tradeshow, NVIDIA is announcing its plans to bring to market an updated version of the stand-alone H100 accelerator with HBM3e memory, which the company will be calling the H200.

Like its Grace Hopper counterpart, the purpose of the H200 is to serve as a mid-generation upgrade to the Hx00 product line by rolling out a version of the chip with faster and higher capacity memory. Tapping the HBM3e memory that Micron and others are set to roll out next year, NVIDIA will be able to offer not only accelerators with better real-world performance in memory bandwidth-bound workloads, but also parts that can handle even larger workloads. This stands to be especially helpful in the generative AI space – which has been driving virtually all of the demand for H100 accelerators thus far – as the largest of the large language models can max out the 80GB H100 as it is.
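
To put that capacity pressure in rough numbers, here is a back-of-envelope sketch (not figures from NVIDIA or from this article; the parameter counts and the weights-only assumption are purely illustrative):

# Rough, weights-only estimate of the accelerator memory a large language
# model occupies, ignoring KV cache, activations, and framework overhead.
def weight_footprint_gb(params_billions: float, bytes_per_param: float = 2) -> float:
    # 2 bytes/param for FP16/BF16, 1 for FP8/INT8; 1 GB taken as 1e9 bytes.
    return params_billions * 1e9 * bytes_per_param / 1e9

if __name__ == "__main__":
    hbm_gb = 80  # on-package memory of the current H100
    for params in (7, 70, 175):  # illustrative model sizes, in billions of parameters
        fp16 = weight_footprint_gb(params)
        verdict = "fits" if fp16 <= hbm_gb else "exceeds a single 80GB H100"
        print(f"{params:>4}B params @ FP16 ~ {fp16:5.0f} GB of weights ({verdict})")

Even before counting inference-time state, the larger configurations spill past a single 80GB card, which is exactly the pressure a higher-capacity part like the H200 is aimed at.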

Meanwhile, with HBM3e memory not shipping until next year, NVIDIA has been using the gap to announce its HBM3e-updated parts at its leisure. Following this summer’s GH200 announcement, it was only a matter of time until NVIDIA announced a standalone version of the Hx00 accelerator with HBM3e, and at SC23 this week the company is finally doing so.
