GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Saying "people have always complained about optimization" is a cop-out.

They literally have, though.

Games today look no better than their predecessors a decade ago.

This is because lots of AAA games continue to be designed for the PS4, which came out a decade ago. The reason Black Ops 6 runs at 100 fps on a low-end card today is that it's a PS4 game at heart. Consequently, low settings run fine on 10-year-old hardware, while ultra settings do quite well even on low-end hardware from 2024. Do you know what the new Indiana Jones and Star Wars games have in common? No PS4 version. The minimum GPU for Indiana Jones is an RX 6600, roughly the same GPU the PS5 has. For Star Wars, it's an RX 5600, which is only a year older than the PS5. This isn't a coincidence.

In other words, it's starting to be time for PC gamers to upgrade their machines to be at least as good as consoles. Or join the Playstation Master Race, I suppose, if you can't afford a current-gen GPU.

So what you get is the same-looking shit at much worse performance. And things like RTX compound this: the graphical "improvements" do not outweigh the massive hit to performance compared to baked lighting.

So put the textures on low/medium and turn off RTX if it doesn't make any difference instead of complaining that your card can't handle ultra textures & raytracing.

And it goes beyond graphics too. In the case of Battlefield 2042, it has worse physics, less interaction, fewer props, worse destruction, etc. than its predecessors.

This is Battlefield 3:
[Attached screenshot: Battlefield 3]

Note the low-poly concrete clutter with lighting that doesn't match the interior. Games don't look like this any more. The game's held up incredibly well over 14 years, showcasing diminishing returns in graphics, but asset fidelity has improved a lot in that amount of time.
 
They literally have, though.
You missed the point.

Let me make it simple. If we agree Battlefield 2042 looks no better than its predecessors from a decade ago (for whatever reason), and we installed both games today and ran them at the same presets and resolution (to make it as fair as possible), which game do you think will have the far better performance? THAT is optimization. People may have cried about optimization during the days of Quake, but every new release back then was pushing the technology further. If you took screenshots of the big hitters of the 90's and 00's and lined them up one after another you would see the blatant advances that justified lackluster performance. The same can't be said of the 2020's where every game looks the same and somehow performs worse.

So put the textures on low/medium and turn off RTX if it doesn't make any difference instead of complaining that your card can't handle ultra textures & raytracing.
That doesn't fix the worse A.I., physics, destruction, netcode, and every other aspect that goes into these games.
 
What is optimization? It feels like a buzzword people use without knowing what it means.
People read stories about how the RollerCoaster Tycoon guys wrote hand-tuned assembly to make up for inadequacies in the original Pentium and think they've figured out how 'lazy modern devs' are pulling a fast one on them.
 
I really like my Arc B580 so far. I'm only playing at 1080p60 (75 max on my monitor when I can get away with it) and so far, so good. RDR2 runs like a dream even cranked up, and Space Marine II is playable apart from some weird crashes I had initially. The ASRock Steel Legend version of the card is rather robust. One thing I've noticed, though, is that the fans only turn on under heavy load; for the most part it runs a bunch of stuff without them even turning on, all while sticking at 60°C or below. I can't seem to actually control the fans through my motherboard or software. I'll be digging a bit deeper into that, but so far it hasn't limited me and the card hasn't overheated or anything.

Following the discourse back and forth on the internet, I get the feeling that this is going to be one of those investment purchases that pays good dividends as driver support gets better. I got the card, taxes in, for about $480, and I definitely feel like it was a better purchase than a 4060/7600XT.
 
If we agree Battlefield 2042 looks no better than its predecessors from a decade ago (for whatever reason), and we installed both games today and ran them at the same presets and resolution (to make it as fair as possible), which game do you think will have the far better performance?
They'd run exactly the same, because if that were true (and it's not), it would mean that they just reused the exact same code from 2015 with different assets of the exact same quality. If you modded BF2042 to use BF4 assets, it would probably run faster, since they've likely made a few improvements to the rendering pipeline in 10 years, similar to how MW2019 has a much better optimized graphics engine than MW3.

If you took screenshots of the big hitters of the 90's and 00's and lined them up one after another you would see the blatant advances that justified lackluster performance. The same can't be said of the 2020's where every game looks the same and somehow performs worse.

This happens to every technology. Early improvements are huge and cheap. Late improvements are expensive and marginal. We could be talking about airplanes, steam engines, televisions, cell phones, or video games. Look at the innovations in the first 30 years of the auto industry - windshields, automatic wipers, rearview mirrors, electric headlamps, heaters, etc. - nothing in the last 75 years changes the driving experience as dramatically or cheaply as being able to drive in the dark without rain hitting your face.

Those early innovations in the first 10 years of 3D gaming (about 1993 through 2003), like colored lighting, gloss maps, Gouraud shading, etc., are computationally cheap, conceptually simple, and dramatic in effect. Innovations in the last 10 years are less dramatic and more expensive for the simple reason that all the low-hanging fruit has been picked.
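
To make the "computationally cheap" part concrete, here's a toy sketch of Gouraud shading (the vertex positions, normals, and light direction are made up for the example, and it uses plain Lambert lighting, not anything from a real engine). The point is that the lighting math runs three times per triangle, and every pixel inside is just a weighted average of those three results:

```python
# Toy Gouraud shading: light the 3 vertices, interpolate everything else.
import numpy as np

def lambert(normal, light_dir):
    # One dot product per *vertex*, not per pixel.
    return max(0.0, float(np.dot(normal, light_dir)))

def gouraud_fill(v_xy, v_normals, light_dir, w=8, h=8):
    """Rasterize one triangle, interpolating per-vertex intensities."""
    intensities = [lambert(n, light_dir) for n in v_normals]  # 3 lighting evals total
    img = np.zeros((h, w))
    (x0, y0), (x1, y1), (x2, y2) = v_xy
    area = (x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0)
    for y in range(h):
        for x in range(w):
            # Barycentric weights; the pixel is inside when all three are >= 0.
            w0 = ((x1 - x) * (y2 - y) - (x2 - x) * (y1 - y)) / area
            w1 = ((x2 - x) * (y0 - y) - (x0 - x) * (y2 - y)) / area
            w2 = 1.0 - w0 - w1
            if w0 >= 0 and w1 >= 0 and w2 >= 0:
                # Per-pixel cost: a weighted sum, no lighting math at all.
                img[y, x] = w0 * intensities[0] + w1 * intensities[1] + w2 * intensities[2]
    return img

normals = [np.array([0.0, 0.0, 1.0]),
           np.array([0.7071, 0.0, 0.7071]),
           np.array([0.0, 0.7071, 0.7071])]
print(gouraud_fill([(0, 0), (7, 0), (0, 7)], normals, np.array([0.0, 0.0, 1.0])).round(2))
```

Compare that to per-pixel lighting, where the dot products (and whatever else the material does) run for every single pixel instead of once per vertex.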

The mistake you are making is thinking that if an improvement feels minor, it can be made computationally small. A simple example is color depth. I first started gaming when PC games were EGA, 4 bits per pixel:
[Attached screenshot: 4-bit EGA game (Commander Keen 4)]

Then came the jump to 256 color VGA, 8bpp, 2x more expensive than EGA. The colors were stunning:
[Attached screenshot: 256-color VGA game (Hocus Pocus)]

Or here's what Doom would have looked like in 4-bit EGA vs 8-bit VGA:
[Attached screenshots: Doom in 4-bit EGA vs 8-bit VGA]

In 30 years, despite going from 8bpp to 16bpp to 32bpp, nothing's ever matched that jump from 4-bit to 8-bit in terms of how dramatic and stunning it was (actually, 2-bit CGA to 4-bit EGA was pretty impressive as well). Going from 16bpp to 32bpp was truly marginal. Color banding was a bit better, and there was less dithering. "Lazy programmers refusing to optimize" was not the issue. A hard-working programmer optimizing the shit out of a 16-bit rasterizer would still not have been able to eliminate color banding and dithering. You needed another doubling of color depth. But it was a doubling in cost with marginal results.
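
You can see the diminishing returns as plain arithmetic. Here's a back-of-the-envelope sketch (assuming a 320x200 frame throughout and the classic palette/channel sizes; the numbers are illustrative, not benchmarks of anything): each doubling of bit depth costs the same 2x in memory and bandwidth, but the extra colors buy less and less that the eye can actually see:

```python
# Cost vs. benefit of each color-depth doubling, for one 320x200 frame.
W, H = 320, 200

modes = [
    ("4bpp EGA-style palette",  4,  16),       # 16 colors on screen
    ("8bpp VGA mode 13h",       8,  256),      # 256-entry palette
    ("16bpp high color (565)",  16, 2**16),    # 65,536 colors
    ("32bpp true color (888)",  32, 2**24),    # 16.7M colors (plus 8 unused/alpha bits)
]

prev_bytes = None
for name, bpp, colors in modes:
    frame_bytes = W * H * bpp // 8
    growth = "" if prev_bytes is None else f"  ({frame_bytes / prev_bytes:.0f}x the memory/bandwidth)"
    print(f"{name:25s} {colors:>10,} colors, {frame_bytes:>7,} bytes/frame{growth}")
    prev_bytes = frame_bytes
```

Same 2x price tag every step; only the early doublings bought something you could actually see.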

The reason SSAO, raytracing, 4K, and other technologies give modest results at massive expense is that the low-hanging fruit is gone; that's the path of all technologies. There is no way to "optimize" raytracing to make it 10x faster or 10x more visually stunning. Again, the mistake you are making is thinking that a graphical effect that gives marginal improvements in the look & feel of a game should be computationally cheap, and that just isn't true.
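
For a sense of scale, some napkin math (the resolution, frame rate, and per-pixel ray counts below are assumptions I picked for the example, not measurements of any game or GPU): even a modest hybrid setup at 1080p60 is already in the hundreds of millions of rays per second, and every one of those rays is an acceleration-structure traversal plus triangle intersection tests:

```python
# Rough estimate of ray throughput needed for a modest hybrid-RT setup.
width, height, fps = 1920, 1080, 60
rays_per_pixel = 2   # assumption: e.g. one reflection ray + one shadow ray
bounces = 1          # assumption: a single extra bounce per ray

rays_per_frame = width * height * rays_per_pixel * (1 + bounces)
rays_per_second = rays_per_frame * fps
print(f"{rays_per_frame / 1e6:.1f} M rays per frame")
print(f"{rays_per_second / 1e9:.2f} G rays per second")
```

Cutting that in half takes new silicon or fewer/cheaper rays (denoisers, upscalers), not a cleverer inner loop.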

That doesn't fix the worse A.I., physics, destruction, netcode, and every other aspect that goes into these games.

DICE not making small objects movable or concrete barriers destructible in BF2042, things that were dynamic in BF3, has nothing to do with optimizing code.
 
The mistake you are making is thinking that if an improvement feels minor, it can be made computationally small. A simple example is color depth. I first started gaming when PC games were EGA, 4 bits per pixel:
[Attached screenshot: 4-bit EGA game (Commander Keen 4)]

Then came the jump to 256 color VGA, 8bpp, 2x more expensive than EGA. The colors were stunning:
[Attached screenshot: 256-color VGA game (Hocus Pocus)]
Didn't expect to see Keen 4 or Hocus Pocus today, but here we are.
 
Doom with a dithered EGA renderer:
That runs horribly.
They'd run exactly the same, because if that were true (and it's not), it would mean that they just reused the exact same code from 2015 with different assets of the exact same quality. If you modded BF2042 to use BF4 assets, it would probably run faster, since they've likely made a few improvements to the rendering pipeline in 10 years, similar to how MW2019 has a much better optimized graphics engine than MW3.
As an avid Battlefield player, I can tell you the core engine is the same shit as in BF3: a bit of new code, but underneath it's the same laggy garbage, getting killed behind walls, hits not registering, and so on...
The only thing that really changed is the textures, and that's about it.
It also suffers from the EA shittification where any good weapon gets nerfed into oblivion.
 