First the crypto miners came for our GPUs. Then it was the semiconductor shortage. And now, just when it's actually possible to buy a new graphics card for something vaguely resembling a sensible price and finally enjoy some high-fidelity gaming for the first time in years, developers have stopped making high-fidelity games that work properly on PC. It's like our blood sacrifices to the PC gaming gods have gone unnoticed.
The list of games that shipped in 2023 with disappointing PC ports is hefty, even six months in. Wo Long: Fallen Dynasty, The Last of Us Part 1, Forspoken, Returnal, Jedi Survivor, Dead Space and Hogwarts Legacy are the most high-profile culprits, and after they've all been checked off the list of big-name PC releases this year, there aren't many left. Even the generally excellent Resi 4 Remake had its wobbles.
Obviously, these releases aren't arriving in poor technical shape as part of a coincidental streak of misfortune. There's something about making games in 2023 that leads to a high risk of making PC gamers weep onto their RGB keyboards.
What are the clues? First, the things these games have in common. Stuttering and poor frame rates are a given across the board, and glitchy ray tracing crops up in Hogwarts and Forspoken. That's not much of a lead.
Their differences are more telling: despite Unreal Engine's ubiquity in the modern industry, there's a broad swathe of game engines in our lineup, including RE Engine, Frostbite, proprietary engines from Team Ninja and Naughty Dog, and both UE4 and UE5.
Release schedules vary, too. While some games arrived on multiple platforms at once, such as Dead Space and Wo Long, others did not. TLOU represented the end point of a famously long wait to see Joel and Ellie on PC.
The nature of our frustrations differs significantly, too. There's a torturously long shader pre-compiling phase in TLOU, whereas it's the lack of a pre-compile process that causes such frequent stuttering in Wo Long. The TLOU port has a lot of graphics options to tweak; it's just that none of them actually fix the fundamental performance issues. And plenty of options aren't a given elsewhere: in 2023, you can't even assume a PC port will include an FOV slider or separate resolution and refresh rate options.
An easy answer to this bizarre plague of bad PC ports is off the table, then. There just aren't easily identifiable commonalities among them, before we even get to the trickier part: linking commonality and causality.
However, developers at two studios I've recently spoken to about modern videogame performance gave me some steers. Both preferred to remain nameless, but their message was the same: this isn't a PC port problem, it's a videogame development problem.
Exhibit A: the growing trend of 'performance' modes in console releases. This is a different face of the same problem that's causing poor PC ports—developers have an increasingly tough time hitting the performance targets required for each platform. And in a way, it's all down to the number of transistors in microprocessors.
You'll have noticed that it's getting dramatically more expensive and more complex to create big games lately, and that the end results are more unpredictable. Since Moore's Law began to falter in the 2010s, we haven't been able to depend on raw technical heft to produce better game visuals, smarter AI, and the other markers of fidelity that shift units and garner heavyweight metascores. That's a big problem for digital creators.
So in simplified terms, developers have to try twice as hard to achieve half the progress that would have been possible 10 years ago. And that's expensive. It means higher headcounts on staff, longer project turnaround times, more components in the pipeline.
Since it's so difficult to raise the visual bar, one developer tells me, the conversations about performance targets that should be had at the beginning of a project aren't always had, let alone adhered to.
"Naturally, we insist that you make the game look as pretty as it possibly can. So that leads into a content discipline process.
"Somebody needs to be a champion for three years to say that this thing has to run 60 fps, because if you spend three, four years making that game and it runs at 30, it's pretty much impossible to make the jump to 60 fps at that point without drastically killing content."
It's easy to see why: jumping from 30 to 60 fps means halving the frame budget, from 33.3ms per frame to 16.7ms, long after most of the content has been built. That issue is compounded, the developer tells me, by the fact that for a long period during a game's development it's not easy to accurately predict how a current build is actually running on its target platforms.
"It's not like, 'Oh, now I did something that made it drop below 60.' It's not a very easy and simple thing to do because when you're developing games, nobody knows where it's gonna come out."
Console performance modes are one solution to that cluster of issues. Giving the player a choice between running the game at a higher frame rate at the expense of fidelity, or a maxed-out quality mode that runs at fewer frames per second—now where have we heard that concept before?
Yes, correct. Our beloved kingdom of multicoloured lights and liquid-cooled componentry. But we've always had the option to tinker with graphics settings in PC gaming, since time immemorial or Jazz Jackrabbit, whichever came first. So while it might be a solution—albeit an unpopular and piecemeal one—for console releases, it doesn't help developers making PC versions.
The difference is, while consoles are closed ecosystems whose performance levels a developer can predict and test on with a good degree of precision, the PC's always been a jamboree of variables.
You might test a certain number of CPU, GPU and RAM configurations during PC performance testing, but you can't account for the thousands of variables that OS versions, drivers, temperatures, and background apps bring to the equation. So the actual real-world performance of your game on a specific user's machine is an unknown quantity. You haven't tested for it, because it's nigh-on impossible, mathematically, to cover so many variables.
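Some back-of-the-envelope maths shows why. The figures below are illustrative guesses rather than anyone's real test matrix, but even deliberately modest numbers multiply out of control:

```cpp
#include <cstdio>

int main() {
    // Illustrative figures only: a deliberately modest test matrix.
    long long cpus = 20, gpus = 30, ramConfigs = 5;
    long long osVersions = 4, driverVersions = 10;

    // Every factor multiplies the number of distinct machines to validate.
    long long combos = cpus * gpus * ramConfigs * osVersions * driverVersions;
    std::printf("%lld distinct configurations\n", combos);  // 120000

    // And that's before temperatures, background apps, overlays and
    // overclocks, which aren't even enumerable.
    return 0;
}
```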
To make matters worse, multiplatform developers have a broad range of performance targets to hit across their console versions, too. Producing an additional Xbox Series S version is an unenviable task for studios targeting Xbox Series X, and the arrival of gen-9 consoles alone has stretched devs thinner in recent years.
In some cases the extra hardware grunt is welcomed, one dev at a leading studio tells me.
"So you might be GPU-bound, so what you have on screen takes too much time to render. Or you might have a CPU bottleneck. And in our case [there are] many situations where [we were] CPU-bound. So the leap to gen-9 allowed us to hit 60 fps solidly."
The trouble is, the more you commit to optimising on a particular console ecosystem, the less likely it is that those optimisations will carry over. Sony's architecture might not be as alien relative to other systems as the PS2 and PS3 were, but there seem to be very few apples-to-apples technical optimisations that work across devices and their operating system backends.
That brings us full circle to the bizarre shader compiling load time that blights TLOU, and the lack of a shader pre-compile process that creates frequent stuttering in Wo Long. These are two contrasting attempts to solve the same problem: wildly different architectures and demands across platforms.
Consoles usually pre-compile shader code before running a game, because they're closed hardware systems and their developers know exactly which GPU they're talking to. So pre-compiling works—it doesn't take noticeably long, and it prevents stuttering once the game's running in real time.
But of course on PC it could be any number of GPUs reading that code and crunching the numbers. And that, one suspects, is why the shader compilation process takes hours in TLOU—it's likely optimised for PS5, not the PC and its limitless variables. Wo Long takes the opposite approach, compiling new batches of shader code on the fly as the game needs them, but it incurs horrendous stutter in doing so.
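In code terms, the two strategies look something like the sketch below. This is hypothetical C++ with made-up stand-in types, not anything from either game's source, but it captures the trade-off: pay the whole compilation cost up front, or pay it in small, frame-hitching instalments.

```cpp
#include <string>
#include <unordered_map>
#include <vector>

// Stand-in types for illustration only; no real engine exposes exactly this.
struct ShaderSource   { std::string id; std::string code; };
struct CompiledShader { std::string gpuSpecificBinary; };

using ShaderCache = std::unordered_map<std::string, CompiledShader>;

// Placeholder for the expensive driver-side translation step. On console the
// target GPU is fixed; on PC this cost varies with every GPU/driver combo.
CompiledShader CompileForCurrentGPU(const ShaderSource& src) {
    return { "binary-for-this-gpu:" + src.code };
}

// Strategy 1: pre-compile everything at load time (the TLOU approach).
// One long wait up front, then smooth frames afterwards.
void PrecompileAll(ShaderCache& cache, const std::vector<ShaderSource>& sources) {
    for (const auto& src : sources)
        cache.emplace(src.id, CompileForCurrentGPU(src));
}

// Strategy 2: compile on first use (the Wo Long approach). No load-time wait,
// but the first draw that needs a new shader stalls mid-frame, and that
// stall is the stutter players feel.
const CompiledShader& GetShader(ShaderCache& cache, const ShaderSource& src) {
    auto it = cache.find(src.id);
    if (it == cache.end())
        it = cache.emplace(src.id, CompileForCurrentGPU(src)).first;  // hitch here
    return it->second;
}
```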
Solutions, then? Well I'm no console architect, API developer, coder, game developer, designer, visual artist or game producer, but here's what I think…
Okay, I'm being facetious. There are no easy answers here. But there are successes if you look around for them—indies, for example.
"All our titles are developed on PC first, then ported over to consoles afterwards", a Devolver spokesperson tells me. That's by no means an easy task, but it does ensure we get great versions of those games here in PC gaming.
Going deeper though, since indies don't generally have the budget to chase triple-A fidelity levels, they use art direction to do the heavy lifting. A Highland Song marries gorgeous, painterly 2D art with dramatic highland scenery. Dredge makes operating an actual dredge boat, one of the most dour activities imaginable, look like something to make your heart sing. The per-pixel cost is small, but the way those pixels hit your eyes is certainly comparable to a ray-traced triple-A full of 4K photogrammetry.
It might not be realistic to suggest big name games simply lower their fidelity levels and make up the shortfall with inspired art direction, but what is certainly possible is to give us greater control over what's happening in our ports.
Seeing the effects of your tweaking happen in real time in the game engine is massive, and a PC gamer will take that every time over even the most rigorously assembled preset. It goes without saying, but we'll always need that FOV slider, ultrawide aspect ratio support, and separate refresh rate and resolution options too.
And in the worst-case scenario, a delay is a lot easier to take than an utterly broken PC port. Developers don't want to release bad ports, but by all accounts the timelines imposed on them often leave little to no choice. If any one change could significantly improve the quality of PC ports over the next 12 months, it's giving developers the time they need. Though our opinion of that might change should Starfield launch looking like a matter transporter malfunction on PC, even after its lengthy development delay.