Lmao no. Absolutely not. The 30xx and 6x00 series brought an improvement so large that they made literally every card before them obsolete overnight. A once-in-a-decade leap in performance. On top of that, the Ampere architecture, like Pascal before it, is fundamentally flawed (without getting too far into it), and the fix likely coming with the 40xx series could boost performance by up to 60% for little increase in production cost. We have so far to go in GPU tech it's insane. We're still looking at 40-60% jumps in performance every generation as AMD catches up to Nvidia in both raw performance and features, and Nvidia tries to stay ahead. Things are just now ramping up.
As for consoles? Literally who cares. Consoles are going the way of the dinosaur as it gets harder and harder to cram the needed tech into a cheap box small enough that consumers don't revolt. People are already pissed at the size of the new generation of consoles, and they already have overheating problems. You can't run CoD: BOCW in 4K without overheating your PS5 and potentially bricking it. Compare that with my years-old 6700K/1080 rig, which handles the same game in 4K at better fps. The chips in the PS5 will regularly sit at 95-98°C under heavy load and spike past 100°C, forcing a shutdown. If consoles want to keep up they'll need more active cooling and larger cases for better airflow. That's a deal breaker for a lot of consumers, and within the next two generations I expect consoles to be in legitimate danger of going extinct.
Yeah, but all of that's the technical and performance side of the coin. He's talking more about the actual appearance, in that we're pretty much able to walk right up alongside photorealism even in games that aren't produced on massive budgets by "AAA" studios. The Skyrim modding scene alone can produce some disturbingly realistic-looking environments, and that's on an engine built before most of today's teenagers were born, thanks to an unhealthy injection of ENB presets and 8k textures.
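Just to put a number on why those 8k textures hurt so much, here's a back-of-the-envelope sketch in Python (the figures assume uncompressed RGBA8 and a full mip chain; real games use block compression like BC7, which roughly quarters this):

```python
# Rough VRAM cost of one 8192x8192 texture, uncompressed RGBA8.
# Assumptions for illustration: 4 bytes/texel; a full mip chain
# adds about one third on top of the base level.
base = 8192 * 8192 * 4               # 256 MiB for the top mip alone
with_mips = base * 4 / 3             # ~341 MiB with the mip chain
print(f"{base / 2**20:.0f} MiB base, {with_mips / 2**20:.0f} MiB with mips")
```

A couple dozen of those in a scene and even a 1080's 8 GB is gone, which is why those modded setups choke on performance long before they stop looking good.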
Getting the game to perform properly is another matter entirely, but getting a game to look as realistic as possible is almost an achieved goal nowadays. You don't see the same leaps you did in the old days, like Daggerfall to Morrowind to Oblivion to Skyrim to Fallout 4; it mostly comes down to how much extraneous crap you litter around the scene to make it look more believable and less like an OG DOOM level. Otherwise, the last major development I can even think of where I noticed an actual leap in fidelity that wasn't just complex, high-poly modeling was subsurface scattering.
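For anyone who hasn't seen why that one mattered: plain Lambertian diffuse goes to black the instant a surface faces away from the light, which is why old skin shaders looked like painted plastic. Subsurface scattering lets light bleed through and soften that cutoff, and the cheapest approximation of the effect is "wrap lighting". A minimal Python sketch of that wrap trick (illustrative only, not any particular engine's implementation):

```python
import math

def lambert(n_dot_l: float) -> float:
    """Classic diffuse: hard cutoff at the light terminator."""
    return max(n_dot_l, 0.0)

def wrap_diffuse(n_dot_l: float, wrap: float = 0.5) -> float:
    """Cheap subsurface-ish look: shift the falloff so light 'wraps'
    past the terminator, the way it bleeds through skin or wax."""
    return max((n_dot_l + wrap) / (1.0 + wrap), 0.0)

# A surface tilted 100 degrees away from the light: Lambert is pitch
# black, while the wrapped term still glows faintly, like a backlit ear.
n_dot_l = math.cos(math.radians(100))
print(lambert(n_dot_l), wrap_diffuse(n_dot_l))  # 0.0 vs ~0.22
```

Real SSS (diffusion profiles, transmittance maps) is far more involved, but that soft terminator is the leap I'm talking about.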
High-poly modeling has existed for as long as 3D modeling itself, but with no shader effects or much of any supporting technology at the time, Bryce 3D didn't exactly make us believe that what we were looking at was real. Everything else has just been a matter of how many cans and piles of trash we can scatter around to make Cyberpunk's cities feel real, right up until a car launches itself off a tin can, courtesy of that amazing physics engine, and body-slams a pedestrian.
We're essentially out of the obvious technological leaps at this point and down to fine-tuning the little, fiddly shit the casual market won't even notice. The diminishing returns on high-poly modeling are enormous: an object with 50k triangles won't look much different from one with 25k, but both will look wildly different from the one with 500. The same wall is coming up fast on texture and shader work, but we're definitely still a long way from that weird Bryce shit.
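To put numbers behind that poly-count wall, here's a quick illustrative sketch (the screen-coverage figure is an assumption picked for the arithmetic, not a measurement):

```python
# Illustrative: pixels available per triangle for an object that
# covers a 400x400-pixel patch of the frame (160,000 px, assumed).
screen_pixels = 400 * 400

for tris in (500, 25_000, 50_000):
    print(f"{tris:>6} tris -> {screen_pixels / tris:>6.1f} px/tri")

# 500 tris -> 320.0 px/tri: every facet is plainly visible.
# 25k tris ->   6.4 px/tri: silhouettes already read as smooth.
# 50k tris ->   3.2 px/tri: halving that again is nearly invisible,
# and once triangles shrink below a pixel the extra geometry can't
# even show up on screen; it just burns GPU time.
```

Doubling triangle budgets past that point buys almost nothing the eye can register, which is exactly why the generational "wow" moments dried up.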
