how long before the law of diminishing returns for computers hits its limit?

Michael Janke

I was watching a video about a 10-year-old graphics card. to be fair, it was literally the most powerful card 10 years ago, but it was still managing to run games 9 years after its release.
sure the settings and resolution were at minimum and the framerate was pretty bad, but the fact that it could run these things at all is pretty telling.
in the year 2000, the most powerful graphics card of that time stopped working when the PS3 became a thing. it certainly couldn't play triple-A games nine years after its release, no matter how low the settings.
at that rate, the most powerful card we have now should be able to play games like 15 years after its release.
that's just graphics cards. Old CPUs can still be used today as well.
RAM is slightly different: as software gets more bloated, we'll just use up more and more over time.
how much time do you figure is left before we go from diminishing returns to no returns?
 
GPUs are still making pretty good progress with each generation, though it's becoming less and less. I'm pretty sure the next architecture after Ampere will be the point where we start seeing much smaller upgrades each generation. And that's assuming you want to use raytracing or that smart upscaling stuff.
 
I consider a graphics card unusable when it can't run the newest games at absolute minimum settings. the newest and most powerful cards are built for 4K raytracing, so when a card can't even do minimum settings 720p raytracing, that's when it basically becomes useless.
 
For me it's like the middle ground. I'm an FPS slut, so I prefer everything to be at 120. I'll probably pick up a middle-of-the-road Ampere card next and be done for quite a while. Markets will probably be a lot more relaxed by then as well.
 
Moore's law states that computing power doubles every 2 years.
the competence of programmers halves in 3.

if we had more programmers of the ZX Spectrum/C64 calibre, we could replace supercomputers with a cellphone and possibly save 1.21 gigawatts of energy, enough power to go back in time and tell Hitler about gas centrifuges.
 
The biggest leaps in 3D graphics happened between 1994 and 2007; any progress after that has been incremental. The need for better hardware is partially driven by the lack of optimization in games. If you are developing for the newest cards, you are going to get decent FPS without having to resort to all the hacks and tricks that made older games run well.
 
and imagine if people learned to use opcodes
 
You've noticed that computer graphics haven't really advanced in the past 12 years, haven't you? What if I told you that having a nice GPU is really just a band-aid on the leaky dam that is incompetence in programming? There are many games, features, and quality rendering pipelines that we have now that could easily have run on your 10-year-old GPU, and the only reason they don't is shitcode.

We have so much computational overhead for things like games, and it's usually wasted, thanks to human hands never touching batch jobs, like texture scaling and formatting, GI baking, or optimizing code for certain platforms and architectures. When you have a script that processes all of your assets for you, you're not exactly motivated to go back and check its work. Removing useless or excess data from being rendered adds up with each byte removed, almost like how supercar manufacturers shave ounces from common parts for weight reduction. You don't see that in software, so you're running a glut of things you will never see in your game.
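
To make that concrete, here is a rough sketch of the kind of sanity pass those batch jobs never get: walk the texture folder and total up everything shipped at a resolution the game will never actually sample. The assets/textures path and the 2048px ceiling are made-up assumptions for illustration, not anything from a real project.

# rough sketch, assumptions only: hypothetical folder layout and size cap
import os
from PIL import Image  # pip install Pillow

ASSET_DIR = "assets/textures"   # hypothetical project layout
MAX_USEFUL_DIM = 2048           # assumed in-game sampling ceiling

wasted_bytes = 0
for root, _, files in os.walk(ASSET_DIR):
    for name in files:
        if not name.lower().endswith((".png", ".tga", ".dds")):
            continue
        path = os.path.join(root, name)
        try:
            with Image.open(path) as img:
                width, height = img.size
        except OSError:
            continue  # Pillow can't parse every variant of every format
        if max(width, height) > MAX_USEFUL_DIM:
            wasted_bytes += os.path.getsize(path)
            print(f"{path}: {width}x{height}, bigger than anything the game displays")

print(f"~{wasted_bytes / 2**20:.1f} MiB of texture data nobody will ever see at full resolution")

Run that once against any shipped game's unpacked assets and the "ounces add up" comparison stops sounding like an exaggeration.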

If all you do is play games and browse the internet on your computer, you haven't had a good enough reason to get a new CPU in over a decade. That's where most of the overhead is. Unfortunately, you're not pinning your CPU at 100% usage while running Fortnite, because almost everything is being run through the graphics card.
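
You can check that claim yourself with a minimal sketch like the one below: it uses psutil to log per-core CPU load once a second for 30 seconds while a game runs in the background, then reports the busiest core's peak. The 30-second window is an arbitrary choice; if no single core stays pinned, the CPU isn't your bottleneck.

# minimal sketch: sample per-core CPU load while a game runs, then report peaks
import psutil  # pip install psutil

samples = []
for _ in range(30):
    # cpu_percent(interval=1, percpu=True) blocks one second and returns per-core load
    samples.append(psutil.cpu_percent(interval=1, percpu=True))

peak_per_core = [max(core) for core in zip(*samples)]
print("peak load per core:", peak_per_core)
print("busiest core peaked at", max(peak_per_core), "%")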
 