It's everything. Just look at GTA: San Andreas on the PS2. The hardware was limited, the ambitions were big, and R* pulled it off by having talented programmers who knew when to write complex code to push the hardware to its limits, and when to use simple tricks that worked well enough and saved on resources. On top of that, those hardware limitations meant they couldn't have a photorealistic game, so they did their best to achieve a distinctive look instead. Graphically, it does look badly dated, with the low-poly models and blurry textures, but stylistically, it still holds up. The orange sunrise over 90s LA suburbs still looks as good as ever.
Nowadays we have an overabundance of hardware capabilities, but instead of bringing more advanced games and graphics, it has brought stagnation, or even regression, because that overabundance gets abused as a buffer for dev laziness. Who cares about making things well if you can just half-ass everything and hope that whoever plays it has a 4070 Ti so it'll even out? And since the "photorealistic graphics" rabbit chase has only picked up speed, games end up aging like milk. Every game ages graphically, but when you ditch all style for the sake of "realism", your game has nothing left to withstand the test of time. The moment the graphics technology gets old, so does the visual style of the game, while a ton of PS2 titles still look good despite being technologically outdated.
Just look at the games from the Pascal era: Metal Gear Solid V, The Witcher 3: Wild Hunt, Grand Theft Auto V on PC. I remember playing TW3 and GTA V on my 1060 and they already looked and ran great, on a mid-range card from around the time those games were released. Games knew how to use the hardware that was available, and they were already this close to catching that photorealism rabbit. But something went wrong: games started looking only marginally better while running significantly worse, and now we've hit a wall where, in many cases, games look worse than those 2015 titles while also running worse than them. But instead of tackling the source of the problem, everyone is gaslit into believing that their hardware is at fault, that they need to buy the newest RTX card with 4x frame generation to make games run well, and for some reason people keep buying into this lie.
Not to mention the whole ray tracing scam, where games, again, only look marginally better than they already did with conventional 3D graphics tricks while demanding significantly more from the hardware. For what exactly? For more reasons to convince you to buy yet another fancy, expensive GPU, when in reality all that's needed is for game developers to do their fucking job right.
Remember: all the "AI" features Nvidia keeps throwing out exist to excuse dev laziness even further, not to fix the sorry state of games today. If everyone, from individual game studios to game engine developers, gave two shits about actually harnessing the computing power we have today, there would be no need to abuse DLSS upscaling and frame generation just to make games playable. They would only be there as a little extra, not a mandatory component: filling in the gaps to reach high-frame-rate 4K gaming, with everything below that achievable natively.