Are we close to topping out on graphics?

Lmao no. Absolutely not. The 30xx and 6x00 series brought an improvement so large that they made literally every card before them immediately obsolete. A once-in-a-decade leap in performance. Moreover, the Ampere architecture, like Pascal before it, is fundamentally flawed (without getting too far into it), and the fix that is likely coming with the 40xx will increase performance by up to 60% for little increase in production price. We have so far to go in GPU tech it is insane.

You can't drop a nugget like that without explaining more.
 
Whenever I read "x game needs to be remade" comments on youtube (which pop up really often, unfortunately), I just ask myself "Do I really want this game but with more lens flares, 99 fewer GB of free space on my PC, awful outsourced models, stiff Chinese animation, longer load times, microtransactions, but the same 20+ year old gameplay?"
If graphics finally top out, we might be able to free ourselves of the curse of having to buy new consoles every five to seven years, and it might actually set us down the road to agreeing on a standardised gaming platform that can play anything.
They already have that, it's called a DVD player (since most modern videogames are just movies).
 
You can't drop a nugget like that without explaining more.
So here's how the Ampere architecture works: it has three cores in a block; each block contains a tensor core, an FP32 core, and an Integer/FP32 core. FP32 cores are the cores mainly responsible for shaders and shit. Basically, that means it has double the theoretical FP32 cores available over the previous die; however, whenever an integer operation comes along, the half-and-half core has to process it before it can go back to shading work. So instead of doubling the FP32 cores, they effectively added 50%, since the FP32 side of the shared core can only fire half the time.

Now, Nvidia has all but confirmed they will be moving to a 5nm process, which is a massive improvement and should allow that half-and-half core to become a dedicated FP core. Moreover, going to 5nm would allow for a wider architecture, which should theoretically allow for something like 16k CUDA cores, compared to the 3090's just over 10k. And as we discussed, each core will be better, again hypothetically 50% better. This is also assuming they stay with a single monolithic die instead of their theoretical "Hopper" architecture, which would call for many smaller dies and would, for a lot of reasons, further increase performance.

Tl;Dr: many more cores, each core is significantly better.
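To put rough numbers on that core-sharing point, here's a back-of-envelope sketch in Python; the lane counts and the 50/50 integer mix are illustrative assumptions, not Nvidia's published figures:

```python
# Back-of-envelope math for the FP32/INT32 sharing described above.
# Lane counts and the integer mix are assumptions for illustration only.

def effective_fp32_lanes(dedicated, shared, int_fraction):
    # shared lanes lose whatever fraction of issue slots goes to integer work
    return dedicated + shared * (1.0 - int_fraction)

old_style = effective_fp32_lanes(64, 0, 0.0)    # pre-Ampere: 64 pure FP32 lanes
ampere    = effective_fp32_lanes(64, 64, 0.5)   # half-and-half lanes busy with INT half the time
fixed     = effective_fp32_lanes(128, 0, 0.0)   # hypothetical 40xx fix: all lanes dedicated FP32

print(ampere / old_style)  # 1.5 -> the "added 50%, not 100%" claim
print(fixed / old_style)   # 2.0 -> what a dedicated-FP fix would buy
```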
 
Lmao no. Absolutely not. The 30xx and 6x00 series brought an improvement so large that they made literally every card before them immediately obsolete. A once-in-a-decade leap in performance. Moreover, the Ampere architecture, like Pascal before it, is fundamentally flawed (without getting too far into it), and the fix that is likely coming with the 40xx will increase performance by up to 60% for little increase in production price. We have so far to go in GPU tech it is insane. We are still looking at 40%-60% jumps in performance with each generation as AMD catches up to Nvidia in both raw performance and technology, and Nvidia tries to stay ahead. Things are just now ramping up.

As for consoles? Literally who cares. Consoles are going the way of the dinosaur as it becomes harder and harder to enclose the needed tech in a cheap box small enough that the consumer doesn't revolt. People are already pissed at the size of the new generation of consoles, and they already have overheating problems. You cannot run CoD: BOCW in 4k without overheating your PS5 and potentially bricking it. Compare that with my years-old 6700K/1080 rig that can handle the same game in 4k with better fps. The chips in the PS5 will regularly sit at 95-98°C under heavy load and spike to over 100, forcing a shutdown. If consoles want to keep up, they will need more active cooling and larger cases with better airflow. That is a deal breaker for a lot of consumers, and I expect consoles to be in legitimate danger of going extinct within the next two generations.
Yeah, but all of that's the technical and performance side of the coin. He's more talking about the actual appearance, in that we're pretty much able to walk right up alongside photorealism even in games that aren't produced on massive budgets from "AAA" studios. The Skyrim modding scene alone can make some disturbingly realistic-looking environments, and that's on an engine that was made before most of today's teenagers were born, thanks to an unhealthy injection of ENB presets and 8k textures.

Getting the game to perform properly is another matter entirely, but getting a game to look as realistic as possible is almost an achieved goal nowadays. You don't see the same leaps that you did in the old days, like Daggerfall to Morrowind to Oblivion to Skyrim to Fallout 4; it mostly comes down to how much extraneous crap you litter around the scene to make it look more believable and less like an OG DOOM level. Otherwise, the last major development I can even think of where I noticed an actual leap in fidelity that wasn't just related to complex, high-poly modeling was subsurface scattering.

[Image: subsurface scattering example (tumblr_inline_nlhknflwYp1rjzilr_1280.png)]

High-poly modeling has existed since the ability to make 3D models existed in the first place, but due to a complete lack of shader effects or much of any other technology at the time, Bryce 3D didn't exactly make us believe that what we were looking at was real. Everything else has just been a matter of how many cans and piles of trash we can throw around to make us feel like Cyberpunk's cities look real, up until a car launches itself off of a tin can thanks to their amazing physics engine and body-slams into a pedestrian.

We're essentially out of the obvious technological leaps at this point and down to fine-tuning the little, fiddly shit that the casual market won't be able to notice in the first place. There are enormous diminishing returns on high-poly modeling, to the point where an object with 50k triangles won't look very different at all compared to one with 25k, but both will look very different from one with 500. The same wall is coming up very quickly on texture and shader work, but we're definitely a long ways away from this weird Bryce shit.

[Image: Bryce 3D render (372088_QJTE96gBfQ563NFdPUQlHLgrs.jpg)]
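A quick sketch of the triangle-count point above; the 400x400-pixel screen patch is an arbitrary assumption, just to show the shape of the curve:

```python
# Rough illustration of diminishing returns on triangle counts: once
# triangles average only a few pixels each, doubling the count changes
# almost nothing visually.
patch_pixels = 400 * 400  # assume the object covers a 400x400-pixel patch on screen

for tris in (500, 25_000, 50_000):
    print(f"{tris:>6} triangles -> {patch_pixels / tris:7.1f} px per triangle")
# 500 -> 320.0 px each (visibly faceted); 25k -> 6.4 px; 50k -> 3.2 px
```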
 
Keep in mind "128 bit" (the reason for the quotes is that graphics have nothing to do with bits) is an insanely huge number of computations that could theoretically cover everything on Earth, every piece of data ever created throughout history, and the entire internet. The limitation is and always will be storage space.
128 bits is 4x32 bits, which is one row of a 3D transformation matrix, or any 32-bit XYZW vector. A 3D transformation matrix is used to calculate the position of every vertex in the scene graph to set up rendering, so for a console like the PS2 it was important to be able to read data like that in one transfer instead of four, because it will be doing A LOT of matrix math. It was later expanded to 512 bits on the PS3 and Xbox 360, and 512 bits is coincidentally a full 4x4 transformation matrix (16 values x 32 bits). The GPU would do the math for rendering geometry on the PS3/360, but matrix math like this is used a lot outside of rendering as well, for all kinds of gameplay.
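To make those sizes concrete, here's a quick Python sketch (numpy just stands in for the console's vector hardware; the translation matrix is an arbitrary example):

```python
# A 32-bit float4 (XYZW) is exactly 128 bits; a 4x4 float32 matrix is 512 bits.
import numpy as np

vertex = np.array([1.0, 2.0, 3.0, 1.0], dtype=np.float32)  # XYZW, 4 * 32 bits
print(vertex.nbytes * 8)  # 128

# A 4x4 transform (translate by (5, 0, 0) here) is 16 * 32 = 512 bits.
transform = np.array([[1, 0, 0, 5],
                      [0, 1, 0, 0],
                      [0, 0, 1, 0],
                      [0, 0, 0, 1]], dtype=np.float32)
print(transform.nbytes * 8)  # 512

print(transform @ vertex)  # [6. 2. 3. 1.] -> the vertex moved +5 on X
```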

There's still a lot more to be achieved in graphics, but I think it's just harder for most people to be impressed by new stuff, sort of like CGI in movies. There are still many things in realtime graphics that look like absolute dogshit, though.

One thing I was impressed by was the hair on a character in Uncharted 4; the lighting and shading is really good, and hair has always been a pain in the ass. It's still massively flawed in other ways, but still, it was a big step up from the PS3/360.
[Image: Uncharted 4 hair (unchartedhair.jpg)]


This will continue to repeat itself though.
[Image: games tiger 90s.jpg]
 
No way. We're very close to hitting the limit on silicon, but we aren't there yet. We're also looking at graphene as a replacement. So we've still got a lot of headroom.
 
I wish there was half as much focus on things like animation quality, physics simulation and AI as there is on pixel count and shader effects in video game development. Who gives a shit about ray tracing and 4k resolution when your retarded Blade Runner rip-off's NPC movement and car handling are straight out of a PS2 game, and stepping on the wrong environmental prop will randomly launch you 50 feet in the air?
 
As someone who just plays video games, I don't think graphics have topped out yet. I don't think we'll see texture resolutions above 4k for a while due to storage, but there will always be room for improvements in lighting, shadows, models, physics, etc.

I do think Raytracing is the future for high-end lighting and shadows. Improving performance is the major hurdle, though. That's why there has been a big push for DLSS.

As far as Skyrim modding making it look photorealistic, well... even modders can only make it go so far. The problem with Skyrim (besides the bugs) is the engine. After modding that game for years, I've moved back to using Skyrim Realistic Overhaul for the majority of the game's architecture and landscape textures, because they work well with Skyrim's lackluster meshes. 8k Whiterun textures remind me of high-quality Minecraft textures. In my opinion, it just looks off-putting.
 
I do think Raytracing is the future for high-end lighting and shadows. Improving performance is the major hurdle, though. That's why there has been a big push for DLSS.
Raytracing is marketing hoopla. The race for photorealism will always be a losing one because the product looks like ass about a year after release. A stronger focus on art direction (and maybe gameplay? I dunno, people don't much like gameplay anymore) will better serve most games.
 
Raytracing is marketing hoopla. The race for photorealism will always be a losing one because the product looks like ass about a year after release. A stronger focus on art direction (and maybe gameplay? I dunno, people don't much like gameplay anymore) will better serve most games.
Raytracing is good, it's just not good right now. It will help smaller companies and indies produce better looking games down the line for less cost.
 
Good art direction is definitely more appreciated than cutting-edge graphics, and it will also often age much better over the years.

And graphics don't matter if the game itself isn't fun to play.
I agree with this 1000%. Graphics were overrated even during the 360 and PlayStation 3 days. Graphics don't age well as TV and console technology improves, so IMO one may as well go with an aesthetic. Gameplay trumps graphics. If the best thing someone has to say is that it has great graphics, I'll watch a movie instead.
 