If you're a Linux gamer with an AMD Radeon graphics card, you might be aware that running high resolutions at high refresh rates over the HDMI port comes with all kinds of limitations and bugs. AMD's software engineers had been working on a fix by implementing parts of the HDMI 2.1 specification in the open-source driver, but the whole effort has sadly been shelved because the HDMI Forum rejected it.
The disappointing news was reported by Phoronix, which spotted a brief explanation of the problem from one of AMD's Linux engineers. Try to run something like a 4K, 120 Hz display on a Radeon RX 6000 or 7000-series graphics card via its HDMI port and you'll run into problems under Linux. You wouldn't expect this to happen, because AMD clearly advertises HDMI 2.1 support on all of those models.
However, that's only true for Windows-based PCs, as the HDMI Forum, the body that governs the port's specifications, doesn't permit them to be implemented in open-source software. AMD will have been well aware of this, as it's been a member of the group for a long time. It's not clear what the exact sticking point is, but I strongly suspect it comes down to other group members not wanting the technologies they've developed to be open-sourced.
Either that, or it's the media companies within the Forum who think that anything to do with open source will instantly lead to DRM being cracked and all their content being pirated.
AMD tried to come to some agreement with the HDMI Forum, probably by including just the bare minimum of the HDMI 2.1 spec needed to solve the high-resolution problem, but ultimately the legal teams involved put a stop to the whole thing.
All of this means that Linux gamers with Radeon GPUs will still need to use the DisplayPort output and a suitable DP-to-HDMI cable if they want to drive an HDMI-only 4K display at 120 Hz or more. Finding one that actually works isn't especially straightforward, though, and cheaper ones can cause even more problems.
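If you want to check what a particular port is actually exposing under Linux, the kernel's DRM subsystem publishes connector status and mode names under /sys/class/drm, one directory per connector. The short Python sketch below simply reads those files; connector names (card0-HDMI-A-1, card0-DP-1 and so on) vary from system to system, and the modes file only lists resolutions, so you'll still want a tool like xrandr to confirm which refresh rates are on offer.

#!/usr/bin/env python3
# Minimal sketch: list each DRM connector, whether a display is attached,
# and the mode names the driver advertises for it. Connector names vary
# by GPU and driver; the modes file shows resolutions only, not refresh rates.
from pathlib import Path

for connector in sorted(Path("/sys/class/drm").glob("card*-*")):
    status_file = connector / "status"
    if not status_file.exists():
        continue  # skip entries that aren't display connectors
    status = status_file.read_text().strip()
    modes = (connector / "modes").read_text().split()
    print(f"{connector.name}: {status}, {len(modes)} advertised modes")
    for mode in modes[:5]:  # just the first few mode names
        print(f"  {mode}")

It only reads sysfs, so it runs as a normal user with no extra packages required.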
Or, just to stir the pot a little, they could switch to an Intel Arc or Nvidia GeForce graphics card, as the Linux drivers for those fully support the HDMI 2.1 specs. How have they managed it when AMD can't? In Intel's case, many Arc cards carry an extra chip on the PCB that converts a DisplayPort signal to HDMI, and it's probably a similar story for Nvidia's cards.
So if those GPUs can offer full HDMI 2.1 support under Linux, all is not lost just yet for AMD's graphics cards.