TechRadar
Christian Guyton

Nvidia claims to beat AMD and Intel in key GPU battleground – but I’m not impressed

A smartphone showing a red and green graphic of 'AMD VS Nvidia'.

Wake up babe, new GPU industry beef just dropped! Yes, Nvidia is gunning for AMD and Intel again - this time, over the AV1 video encode performance of its graphics cards.

For those not in the know, AV1 is an open-source video codec used to compress video footage (such as gameplay captured from a graphics card) down to a lower bitrate with minimal loss of quality, so it can be uploaded to the internet without either ruining the image or absolutely tanking your internet connection.

AV1 is more hardware-intensive but also more efficient than older codecs (such as HEVC), and is commonly used by streamers as a result. In a recent blog post detailing an update to the free streaming software OBS Studio, Nvidia was keen to point out that its new RTX 4000-series GPUs offer better AV1 encoding performance than currently available cards from its two main competitors.
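For context, all three vendors expose their hardware AV1 encoders through the same tools streamers already use. A minimal sketch with ffmpeg (assuming a build of ffmpeg 6.0 or later compiled with the vendor SDKs; `gameplay.mp4` and the bitrate are placeholder values, not anything Nvidia or OBS specifies):

```shell
# Hedged sketch: hardware AV1 encodes via ffmpeg's vendor encoders.
# Each command transcodes the same placeholder input at the same target
# bitrate, copying audio untouched, so output quality can be compared.

# Nvidia RTX 40-series (NVENC AV1 encoder):
ffmpeg -i gameplay.mp4 -c:v av1_nvenc -b:v 8M -c:a copy out_nvidia.mp4

# Intel Arc (Quick Sync AV1 encoder):
ffmpeg -i gameplay.mp4 -c:v av1_qsv -b:v 8M -c:a copy out_intel.mp4

# AMD Radeon RX 7000-series (AMF AV1 encoder):
ffmpeg -i gameplay.mp4 -c:v av1_amf -b:v 8M -c:a copy out_amd.mp4
```

Each command will only work on a machine with the matching GPU installed, which is exactly why like-for-like comparisons such as Nvidia's screenshot are hard for readers to reproduce themselves.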

If you’re not a streamer - so, the majority of GPU users - there’s a very good chance this most recent boast from Team Green is utterly meaningless. And even if you are, Nvidia hasn’t exactly provided a ton of evidence to back up its claims; we’ve got one screenshot (not a video? Come on, guys) showing a comparison in Fortnite where, admittedly, the Nvidia-encoded image does look a lot less compressed.

A side-by-side comparison of Fortnite screenshots showing compression rates in AV1 encoded video from AMD, Nvidia, and Intel GPUs. (Image credit: Nvidia, Epic Games)

Encode, decode, reload

The thing is, in practice, it really looks like AV1 performance is broadly similar regardless of which GPU maker you choose. We previously noted that Intel’s Arc cards have impressive AV1 capabilities, and TechSpot reported last year that AMD’s Radeon RX 7900’s AV1 encode performance is ‘almost on par’ with Nvidia’s and Intel’s.

It’s also frankly a joke for Nvidia to make a fuss comparing its RTX 4080 to the Intel Arc A770. You can get an A770 for $329 - roughly a third of what the GPU Nvidia used for comparison costs; of course the 4080 should do better, it’s wildly more expensive!

But I’m not here to rag on Nvidia for pricing - that’s a whole different argument for another day, and one I’ll be more than willing to back down from provided the hotly anticipated RTX 4060 Ti does indeed launch with the rumored $399 price tag.

What I am here to complain about (and get paid for! Isn’t life grand?) is that AV1 encode performance just doesn’t matter. It’s a meaningless victory for Nvidia here because it’s a feature that only mildly varies between GPUs and isn’t even used by the majority of the people buying them.

Nvidia has its strengths when it comes to the latest generation of GPUs, and this sort of claim doesn’t impress anyone; it makes Team Green look frightened of the competition.

And there’s just no need for that; Nvidia is still leaps and bounds ahead of AMD and Intel in terms of both raw performance and market share. After all, the RTX 4090 is basically the most powerful consumer GPU ever made, and Nvidia continues to dominate the rankings of our best graphics cards list. Rest on your laurels for a bit, Nvidia, you don’t need to make everything a battle for supremacy.
