It still ran at a dodgy 20-25 fps in 480i on its target platform.
Ight, now take note of how I said The Witcher 3 and GTA V, both released in 2015, ran at playable framerates on a GTX 1060, a mid-range GPU from 2016, in native 1080p. Granted, it didn't hold a stable 60 fps, but it never dropped below 40 fps, so it was firmly in the acceptable range on period-accurate hardware from the PS4 era, which according to your S-graph was exactly the point where you had to justify ramping up hardware demands for better graphics.
I'd like to know exactly why it is that between Pascal and Turing we suddenly ended up in a rat race where all of that ceased to matter, and how much of it was "it's just how it'll be from now on" versus how much was "Jensen said so, so now believe it or die". The way I see it, everyone had it pretty much figured out a decade ago, and it was a straight road from there.
Which isn't something everyone demands. Look at Minecraft and Fortnite, two of the most popular games around: a pixel-art block game and a cartoonish art style. Gamers are fine with stylized graphics if the game itself is good; the whole realism chase is fueled mainly by the industry's delusion that games HAVE to be photorealistic or else they won't be good. Not to mention, games that skip full photorealism and stick to a more stylized look are the ones that age the best and look the best.
It's very easy to land in the uncanny valley with overt realism and break the suspension of disbelief, and much less so if you give it that distinctive video game look. Again, The Witcher 3 and GTA V. They weren't 100% photorealistic; they had just enough stylization to avoid the problems that come with photorealism, still looked good, still played well, and still hold up to this day. I'm saying this because the justification of "we have to make everything run like shit for the sake of ray-traced everything and photorealism" is asinine.
No one is demanding that everything be as photorealistic as possible to justify this.
"problems with tricks like cube mapping and SSR"
Yes, both have issues. Cube maps are static approximations baked from a single point, and SSR can only reflect what's already on screen, so it breaks the moment the reflected geometry leaves the frame. But most other lighting we fake very well at low compute cost, in places where ray tracing would carry a major performance hit for virtually no visual improvement. So why not ray trace only the parts that no conventional trick can match and keep everything else as-is, getting both good visuals and good performance? Why not be smart about it? Why toss everything in the garbage for the sake of ray tracing?
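To make the "be smart about it" point concrete, here's a minimal C++ sketch of that hybrid approach. Everything in it (the GBuffer struct, traceScreenSpace, traceFallback, resolveReflection) is made up for this comment, not any real engine's API; the point is the decision at the bottom: take the cheap screen-space answer wherever it exists, and pay for a traced ray only on the pixels where SSR provably can't see the reflected surface.

```cpp
// Illustrative hybrid reflection resolve: cheap SSR first, expensive
// traced fallback only where SSR has no information. All names here
// are hypothetical, not from any real engine.
#include <cstdio>
#include <optional>
#include <vector>

struct Color { float r, g, b; };

// Toy G-buffer: per-pixel linear depth and lit scene color.
struct GBuffer {
    int width, height;
    std::vector<float> depth;
    std::vector<Color> color;
    int idx(int x, int y) const { return y * width + x; }
};

// SSR core loop: march the reflected ray across the depth buffer.
// It can only ever return what is already on screen; the moment the
// ray leaves the frame, the technique has nothing to say.
std::optional<Color> traceScreenSpace(const GBuffer& gb,
                                      float x, float y, float z,
                                      float dx, float dy, float dz) {
    for (int step = 0; step < 64; ++step) {
        x += dx; y += dy; z += dz;
        int px = (int)x, py = (int)y;
        if (px < 0 || py < 0 || px >= gb.width || py >= gb.height)
            return std::nullopt;              // ray left the screen
        if (z >= gb.depth[gb.idx(px, py)])
            return gb.color[gb.idx(px, py)];  // screen-space hit
    }
    return std::nullopt;                      // ran out of steps
}

// Stand-in for the expensive path (a hardware ray trace, or even
// just a cube-map lookup); here it returns a flat sky color.
Color traceFallback() { return {0.4f, 0.6f, 0.9f}; }

// Per-pixel resolve: cheap trick where it works, expensive path
// only where it provably cannot work.
Color resolveReflection(const GBuffer& gb,
                        float x, float y, float z,
                        float dx, float dy, float dz) {
    if (auto hit = traceScreenSpace(gb, x, y, z, dx, dy, dz))
        return *hit;
    return traceFallback();
}

int main() {
    GBuffer gb{4, 4, std::vector<float>(16, 10.0f),
               std::vector<Color>(16, {1.0f, 0.2f, 0.2f})};
    // A ray staying inside the frame reaches the stored depth and
    // resolves via SSR; a ray marching off-screen falls back.
    Color a = resolveReflection(gb, 1, 1, 0, 0.05f, 0.05f, 0.5f);
    Color b = resolveReflection(gb, 1, 1, 0, -1.0f, 0.0f, 0.1f);
    std::printf("SSR hit:  %.1f %.1f %.1f\n", a.r, a.g, a.b);
    std::printf("fallback: %.1f %.1f %.1f\n", b.r, b.g, b.b);
}
```

That's the whole argument in code form: the expensive path runs on the minority of pixels where the cheap trick fails, not on the entire frame.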
The issue at hand is:
-Hardware demands in games are getting way too high for the level of visual quality we get in return
And the only two explanations for this issue are:
-Nvidia being a dick
-Devs being lazy
What isn't an explanation is:
-It's just the way it is, diminishing returns, deal with it
Unless, of course, you're in one of those two camps yourself, at which point it's just shifting the blame for your own mistakes onto the people who are ultimately affected by them.