The original post: /r/pcmasterrace by /u/Ascz on 2025-03-08 04:40:27.
The current Monster Hunter Wilds performance fiasco, together with the NVIDIA 5000 series disaster, compels me to write this post and formally advise everyone: never skimp on the CPU. I put my 30+ years of experience in IT on the line.
If you have to save money, save it on other components. Skimp on the graphics card, never on the CPU. It doesn't matter if you only game or if you use your PC for other things: the CPU always matters. You wouldn't dare wage war against the CPU. CPU is king.
Now for the boring technical part.
Too often I read about people complaining about performance while sporting a 4080 or 4090 paired with an 8th gen Intel or a Ryzen 3. These people are clueless, and unfortunately for them they're being ripped off because of their ignorance. They spend thousands only to be bottlenecked without even knowing what that word means, because they skimped on the CPU.
The current Monster Hunter Wilds situation has also shown us something we already knew: developers are lazy, deadlines are too strict, and for one reason or another the optimization pass is now pretty much non-existent.
Now get this: if you lower your render resolution via FSR or DLSS by setting them to "ultra performance", you'll drop to a pixel count that is roughly half of what mid-range resolutions around the year 2000 offered.
Proof: DLSS ultra performance from 1440p renders at 33.3% of the output resolution on each axis. This means going from 2560x1440 to 853x480. The total number of pixels at 853x480 is about half that of the good old 1024x768, which was mid-range in 2001 (real pr0s back then had 85 Hz 19" CRTs @ 1280x1024). It's roughly 409,000 pixels vs. 786,000.
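If you want to double-check the math yourself, here's a quick napkin-arithmetic sketch in Python. Assumptions: DLSS "ultra performance" renders at 1/3 of the output resolution per axis (the figure quoted above), and 1024x768 stands in for a mid-range 2001 resolution.

```python
# Napkin math for the claim above. Assumes a 1/3-per-axis scale for
# DLSS "ultra performance" and 1024x768 as the 2001 baseline.

def render_res(out_w, out_h, scale=1/3):
    """Internal render resolution for a given output resolution and upscaler scale."""
    return round(out_w * scale), round(out_h * scale)

rw, rh = render_res(2560, 1440)        # -> (853, 480)

pixels_dlss_up = rw * rh               # 409,440 pixels
pixels_2001    = 1024 * 768            # 786,432 pixels

print(f"1440p + DLSS ultra performance: {rw}x{rh} = {pixels_dlss_up:,} px")
print(f"Good old 1024x768:              {pixels_2001:,} px")
print(f"Ratio: {pixels_dlss_up / pixels_2001:.0%}")   # ~52%, i.e. about half
```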
So today we have games that a modern GPU can't render at a playable framerate even at roughly half the pixel count of a mid-range 2001 resolution. That is insane. Even an old GTX 1060 should not struggle with that.
So what's the problem? CPU bottlenecking due to poor optimization, which means the game isn't properly feeding the graphics card. Instead there's likely an over-reliance on CPU-side (essentially software) work because of a shitty port and no optimization. So we have a CPU doing way more work than it should while the GPU sits around waiting for it to compute the unthinkable. I'm 99% certain this is the result of a bad conversion where multi-threading, API optimization (think DirectX/Vulkan), and proper GPU offloading were neglected.
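To make the bottleneck point concrete, here's a crude back-of-the-envelope model (the millisecond numbers are invented for illustration, not measured from Wilds or any real game): each frame, the GPU can only start once the CPU has prepared its work, so whichever side is slower caps your framerate.

```python
# Crude frame-time model of a CPU bottleneck. The numbers below are
# made up for illustration; they are not measurements from any game.

def fps(cpu_ms, gpu_ms):
    # The slower of the two stages dictates how many frames fit in a second.
    return 1000 / max(cpu_ms, gpu_ms)

print(fps(cpu_ms=25, gpu_ms=6))    # ~40 fps: the GPU idles most of the frame
print(fps(cpu_ms=25, gpu_ms=3))    # still ~40 fps: lowering render resolution changed nothing
print(fps(cpu_ms=8,  gpu_ms=6))    # ~125 fps: fix the CPU side and the GPU matters again
```

Which is exactly why cranking DLSS/FSR to ultra performance barely moves the needle in a CPU-bound game.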
So what's the moral of the story? You must not skimp on the CPU. If you skimp on the CPU, you get the worst of today's gaming world shoved in your face with no protection. Now I don't mean that GPUs are useless, or that getting a high-end model is stupid (even though…). But they're secondary. Also, you can swap them whenever you want; they're plug and play, for god's sake. It's like building LEGO. If you have to change your CPU, on the other hand, you'd better pray that the next one you buy uses the same socket (spoiler: it probably won't), or you'll have to change the motherboard and RAM as well.
But you should know that. You should already know all of this. Still I see people blaming their graphics card for low frame rates while running shit-tier CPUs. There are even front-page posts, with top comments sitting at thousands of upvotes, complaining about performance and blaming the video card.
Stop skimping on CPUs. And if you have to blame someone, blame the developers who skip the optimization process or cluelessly port games.