Hardware requirements keep going up, yet games don't look any more advanced than they did 10 years ago.
Nearly every game out there will run just fine on a 7-year-old GPU at the right settings. What's upsetting PC gamers right now is that they believe buying a gaming PC makes them part of the PC Master Race, and they get psychological satisfaction from maxing out settings, turning on MSI Afterburner, and taking screenshots of the game running at 75-100 fps to sneer at console peasants online.
If you run the newest games at settings a Radeon 6700 XT can handle, they look great. How do I know? I have one. But what you don't get is that mental satisfaction of opening the settings window and seeing everything set to "Ultra," which means that in the back of your mind you have to cope with the painful truth that somebody out there, possibly a console peasant with a PS5 Pro, is having a better experience than you.
And that's not fucking fair.
You're PC Master Race.
You paid $1500 for your machine, and that console peasant paid $500 for his.
It doesn't matter that his machine is more powerful; you deserve a better experience. You paid for 12 GB of VRAM and 12 CPU cores. You future-proofed. The game industry owes it to you not to release any cross-platform games with fidelity settings exceeding what your GPU can do until you've gotten a good, solid 5 years out of your system. At the very least, they owe it to you to ensure the console versions look worse than what your aging GPU can manage, regardless of how powerful the console's GPU is.
Because it's not fair. You're PC Master Race. You paid a $1000 premium for your bragging rights. And the minute a developer releases a game whose Ultra textures need 16 GB of VRAM, or that has optional ray-traced reflections, you have to run at something less than Ultra settings, and your bragging rights are gone. And that's not fucking fair.