GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

my logic for moore's law breaking down is gaming.
the best graphics card from 10 years ago, in 2012, can run brand new games today.
granted, they run at low resolutions and abysmal framerates, but the fact that they run at all shows the jumps became minimal over time.
a graphics card from 2002 could not run a game from 2012. it'd be impossible. no matter how low you set the graphics settings, it simply won't run. at most you might make it to the start menu, but nothing beyond that.
2.5x more powerful doesn't necessarily mean 2.5x more performance in any real-world way either.
That's largely because of the feature set: games can't run if the hardware capabilities aren't there, and from the early 2000s onward a lot of new stuff was introduced. A game that requires hardware T&L can't run on a perfectly capable TNT2, and if something requires RT cores you can't just run it on a 980 Ti even though it's a perfectly fine card.
 
That's largely because of the feature set: games can't run if the hardware capabilities aren't there, and from the early 2000s onward a lot of new stuff was introduced. A game that requires hardware T&L can't run on a perfectly capable TNT2, and if something requires RT cores you can't just run it on a 980 Ti even though it's a perfectly fine card.
look at it this way: if a 10 year old card can run current games, though badly,
a current 3090 can probably keep running games for the next 15 years at least at a similar rate.
one example i remember of a recent jump that obsoleted old hardware was batman: arkham knight. the ram and processing power it demanded would be impossible for 10 year old hardware. not even 10, more like 6-7 year old hardware from that point in time.
now 10 year old cards are still useful,
and current cards will be useful for the next 15 years.
imagine a reverse of moore's law, if you will:
instead of hardware being obsoleted every 18 months because the power doubles each time, the opposite happens: the useful life of a card increases significantly every few years.
 
Lol, I just wanted to get out of the Xbox ecosystem and then all this shit happens at the perfect time. I really hope the new series of cards brings prices down, but I'm doubtful.
 
i have the youtube downloader github thing. is there a way to raise the resolution of downloads?
videos don't seem to go above 720p
You might want to check the YouTube DL thread.

From what I understand, anything above 720p is served as separate video and audio streams (DASH), so you have to download both and merge them together. I'm no expert though.
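If it's youtube-dl you're using, something like this should do it; a minimal sketch using its Python embedding API (the URL is a placeholder, and ffmpeg has to be installed for the merge step):

```python
# Minimal sketch using youtube-dl's Python API (pip install youtube-dl).
# Above 720p, YouTube serves video and audio as separate DASH streams,
# so we request the best of each and let ffmpeg merge them.
import youtube_dl

opts = {
    "format": "bestvideo+bestaudio/best",  # fall back to best combined stream
    "merge_output_format": "mp4",          # container for the merged file
}
with youtube_dl.YoutubeDL(opts) as ydl:
    ydl.download(["https://www.youtube.com/watch?v=EXAMPLE"])  # placeholder URL
```

The same options work on the command line as `-f "bestvideo+bestaudio/best" --merge-output-format mp4`.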

instead of hardware being obsoleted every 18 months because the power doubles each time, the opposite happens: the useful life of a card increases significantly every few years.
Moore's law has been dead for a while.

But as to your point, there are a few reasons for that. The most obvious is consoles. Devs target console performance; anything PC does above that is a bonus. Even PC games target the mid-range average-joe build. It's also why games look good played on medium settings, whereas in the past the difference was night and day and running a game at min specs basically meant it booted but wasn't playable. This is further exaggerated by the "developing world", where there's an emerging market for games on low-end hardware.

Modern graphics offer diminishing returns due to simple maths. Doubling the resolution of a texture (on each side) requires four times the data. There's also the question of how much of that resolution you can actually see: going from 1K to 8K textures makes little difference if you're still playing at 1080p.
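To put rough numbers on it (assuming uncompressed RGBA at 4 bytes per texel; real games use compressed formats, but the scaling works the same):

```python
# Doubling each side of a texture quadruples its memory footprint.
bytes_per_texel = 4  # assumes uncompressed RGBA8, just for round numbers
for side in (1024, 2048, 4096, 8192):
    mib = side * side * bytes_per_texel / 2**20
    print(f"{side}x{side}: {mib:.0f} MiB")
# 1024x1024: 4 MiB, 2048x2048: 16 MiB, 4096x4096: 64 MiB, 8192x8192: 256 MiB
```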

There's also hesitancy in adopting new tech, made worse by the GPU shortage. While I might be interested in ray tracing and VR, I can't get on board due to the shortage. Not only that, but the attitude of most is that these things are flash-in-the-pan gimmicks, only for those same people to turn around and complain that games aren't pushing hardware forward any more. We didn't see this kind of resistance to real-time physics or bump mapping. People didn't demand a version of Half-Life 2 without the physics so their old laptop could run it, but we saw lots of people demanding a flatscreen version of Half-Life: Alyx.

There's also a general ceiling for basic tasks. At a certain point you don't need more processing power for word processing; raw power isn't the bottleneck.

Finally, things are getting standardised. It's not like the days when you needed different versions of a game for each brand of graphics card. People have generally agreed that X should be done a certain way, Y should be done a certain way, and so on. Since only niche games require features outside of those basics, they're easy to leave as optional.
 
While I might be interested in ray tracing and VR, I can't get on board due to the shortage.
Do you think that as opposed to the days where something like Myst or 7th Guest were "killer apps" that pushed adoption of CD-ROM drives, shit like ray tracing and VR are not only being held back by literal hardware limitations in the sense that consumers can't even get their goddamn hands on the physical materials to even try them, but also because there are no good or interesting reasons to even want to give a shit about them?

What I mean is that not only can I not go out tomorrow and walk (lol) into and out of my local parts shop with the components necessary to become an adopter (at least not for a price remotely close to reasonable if at all) but that there's actually no compelling reason to not just sit where I am with dated but functional hardware and gradually drop settings to hit a framerate that makes me happy?

In other less rambling words: there ain't any good or ground-breaking/revolutionary games so who cares?
 
I'm still trucking along with my 4650G Pro. If you're no framerate fanatic (I'm not; I played 3D stuff in the 90s, when framerates dipping below 20 were the norm, not the exception) it's a really decent APU even for games if you limit the framerate to 30 FPS. I'm always surprised anew at what games run just fine on it. I wouldn't trust it to play the newest graphics orgy from $current_year in a great way, but honestly, I don't play those anyway. Lately I've been playing e.g. Alien: Isolation and it runs just fine with everything turned to ultra at 1920x1200 (and yes, with the framerate limited). I know it's kind of an old game, but still, pretty cool to play something that modern on a computer that I built to be almost completely silent. It's also really power efficient for what it can pull off.
 
Do you think that as opposed to the days where something like Myst or 7th Guest were "killer apps" that pushed adoption of CD-ROM drives, shit like ray tracing and VR are not only being held back by literal hardware limitations in the sense that consumers can't even get their goddamn hands on the physical materials to even try them, but also because there are no good or interesting reasons to even want to give a shit about them?

What I mean is that not only can I not go out tomorrow and walk (lol) into and out of my local parts shop with the components necessary to become an adopter (at least not for a price remotely close to reasonable if at all) but that there's actually no compelling reason to not just sit where I am with dated but functional hardware and gradually drop settings to hit a framerate that makes me happy?

In other less rambling words: there ain't any good or ground-breaking/revolutionary games so who cares?
It's a bit like a chicken/egg thing; CD-ROM had other benefits like storage and being faster than diskettes. CD games like Rebel Assault were actually hot garbage though while moving a lot of CD-ROM readers (it was often bundled with them). I guess its being a crap novelty can be compared to a lot of VR games. 7th Guest and Myst on the other hand, those are good examples of games that are worth playing even today.

The strange thing is that the Valve Index has frequently shown up on the Steam Top 10 bestseller charts for quite some time, Facebook is still trucking along with Oculus, and Sony is making a PSVR successor (unless that was fake news), so people are buying the damn things. No one seems to really be talking about the games though, and that's weird. It can't all be porn, can it?
With some exceptions, a lot of the games seem to be the VR equivalent of mobile games. Maybe people scoff at the idea of paying more than $9.99/14.99 for a game to use on their $1000 headpiece running on an even more expensive PC.

Maybe the hypothetical push for "metaverse" crap will do something. In Facebook's scenario lots of people will have VR helmets at home because they do virtual meetings with them, so why wouldn't the father let the kids mess around with it after dinner? It would be similar to how kids played PC games on dad's work computer in the late 80s/early 90s, and that really built up a future market.

Some interesting things have happened: RE4 can be played with free movement, and those I've seen play it don't seem to have an issue with it. When they said that the problem with VR is that the player can't move like in a standard FPS or they'll get sick, my immediate thought was that people were pussies and it would pass, and that seems to have started happening. I played a Quake 1 port in VR a couple of years ago and that fucker is fast and jarring, but I never felt sick.
When people played/streamed early games on the Oculus DK2, a lot of them had the familiar FPS movement and it didn't seem like much of a problem.

My point with the last paragraph is that the 'do not's have probably hampered game development, and some are fading away. I think it's possible to do some truly wild shit in VR, things that are not possible on conventional game platforms, but the current wisdom is "oh no, don't do that, people will feel uncomfortable, make a shooting gallery instead".
 
CD games like Rebel Assault were actually hot garbage though while moving a lot of CD-ROM readers (it was often bundled with them). I guess its being a crap novelty can be compared to a lot of VR games. 7th Guest and Myst on the other hand, those are good examples of games that are worth playing even today.
I never thought of that, but yes. The early days of CD-based games were full of shitty FMV games, plus the occasional classic like Sonic CD that (at first glance) has no reason it couldn't work on a stock Megadrive.

there ain't any good or ground-breaking/revolutionary games so who cares?
You can argue games like H3VR, Half-Life: Alyx, and Boneworks aren't killer apps, or that you're just not interested in VR. That's not really the point. The point is that we've hit diminishing returns when it comes to things like polycount and texture resolution. If you want meaningful advances in graphics, gameplay, or technology, you have to look beyond bumping up VRAM and the number of cores.

When they said that the problem with VR is that the player can't move like in a standard FPS or they'll get sick, my immediate thought was that people were pussies and it would pass, and that seems to have started happening.
I remember people complaining about getting motion sick playing the original Doom on release, but people moved past that.
 
I'm still trucking along with my 4650G Pro. If you're no framerate fanatic (I'm not; I played 3D stuff in the 90s, when framerates dipping below 20 were the norm, not the exception) it's a really decent APU even for games if you limit the framerate to 30 FPS. I'm always surprised anew at what games run just fine on it. I wouldn't trust it to play the newest graphics orgy from $current_year in a great way, but honestly, I don't play those anyway. Lately I've been playing e.g. Alien: Isolation and it runs just fine with everything turned to ultra at 1920x1200 (and yes, with the framerate limited). I know it's kind of an old game, but still, pretty cool to play something that modern on a computer that I built to be almost completely silent. It's also really power efficient for what it can pull off.
It really is crazy what these Ryzen APUs are capable of. I've got an older Ryzen 3 in a small form factor box hooked up to my TV and it runs older games no problem (Borderlands 2, Skyrim, Arkham Asylum).

I've been doing a lot of gaming on my old laptop recently that has a GTX 950M and honestly I'm not sure I need any more. It's been able to run everything I've wanted to play, and I don't care if I can't play on max settings with super high FPS. If a game isn't fun, no amount of graphical fidelity will change that (for me anyway), and if it is fun all the other stuff doesn't matter past a certain point. I put just over 100 hours into Witcher 3 and had a grand old time; when I was messing around with Afterburner I decided to check how it was actually performing, and it was hovering around 30 fps, about 26 fps in the cities. That right there would cause a lot of people online to reee and say it's completely unplayable, but I didn't even notice.
 
I've been doing a lot of gaming on my old laptop recently that has a GTX 950M and honestly I'm not sure I need any more.
Is that the XP gaming laptop you were looking for in the other thread? (I feel bad about not being able to help you with that. I know nothing about laptops.)

It really is crazy what these Ryzen APUs are capable of. I've got an older Ryzen 3 in a small form factor box hooked up to my TV and it runs older games no problem (Borderlands 2, Skyrim, Arkham Asylum).
I fell down a small form factor rabbit hole recently: NUCs, Raspberry Pis, and others. One video showing a Raspberry Pi 4 (a single-board computer about the size of a PC mouse) running Dreamcast emulation at full speed was crazy to me.
 
Is that the XP gaming laptop you were looking for in the other thread? (I feel bad about not being able to help you with that. I know nothing about laptops.)
No, it's an old gaming laptop I've got running Win 10 (I think it's from maybe 2015? it has an i7-6700HQ). I've been keeping an eye out for an XP-era machine to tinker with at my local flea market, but no luck so far.
I fell down a small form factor rabbit hole recently: NUCs, Raspberry Pis, and others. One video showing a Raspberry Pi 4 (a single-board computer about the size of a PC mouse) running Dreamcast emulation at full speed was crazy to me.
I think I fell down the same rabbit hole a while ago. I ended up buying a Lenovo SFF PC; it's about the size of a book and sits next to my TV, and I use it for basically all multimedia in my living room plus some gaming/emulation. They're meant to be business machines, so Lenovo keeps putting out new iterations, and the performance with the newer Ryzen chips is honestly crazy considering the size.
 
if it is fun all the other stuff doesn't matter
Exactly. That 4K gravelstone texture is not gonna hide the fact that the game is repetitive shit I already found boring 10 years ago. I mostly limit the framerate to avoid the big, noticeable jumps and because it really takes work off the GPU. A smooth 30 FPS feels smooth; a framerate that jumps between 30 and 55 depending on where you look doesn't.

One video showing a Raspberry Pi 4 (a single-board computer about the size of a PC mouse) running Dreamcast emulation at full speed was crazy to me.
I bought a cheap Celeron netbook on a whim that I might or might not keep, 11" tiny and fanless with a battery runtime of about 12 hours. Decent internet machine, but what really blew my mind was it running Daggerfall in DOSBox without breaking even a tiny sweat. On the surface that's no surprise, but I remember perfectly getting Daggerfall the first time and being annoyed at how poorly it ran in some circumstances on the system I'd paid several thousand for back then. It also does Amiga emulation and even some modern (if not recent) 2D games decently. This really is the future. I often forget how mindblowing some of the things that are everyday now would've been to me back then.
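For anyone curious, the dosbox.conf knobs that matter for speed are roughly these; the values are just what I'd try on a machine with CPU to spare, not gospel:

```
[cpu]
# dynamic recompiler core where available
core=auto
cputype=auto
# throw as much host CPU at the emulated machine as it can use
cycles=max
```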
Lenovo SFF PC,
I almost got one of those, but I wanted to use a mechanical drive, and I was also a bit concerned that the cooling in these ready-made units might not be the best decibel-wise, so I custom-built. Also interesting: the 4650G Pro supports ECC RAM in that configuration (you need the Pro version if you want to use ECC with the iGPU). I consider ECC a mandatory feature for my main workstation. I would expect the ECC RAM to add some latency, though, and RAM throughput makes or breaks these APUs.
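Side note: if you want to verify ECC is actually active and not silently disabled, the Linux EDAC counters are one place to look. A minimal sketch, assuming the EDAC driver is loaded and exposes the standard sysfs layout:

```python
# Each "mcN" under the EDAC sysfs tree is a memory controller; nonzero
# counters mean ECC has caught real errors. Adjust paths if your kernel
# lays things out differently.
from pathlib import Path

for mc in sorted(Path("/sys/devices/system/edac/mc").glob("mc[0-9]*")):
    ce = (mc / "ce_count").read_text().strip()  # corrected (survivable) errors
    ue = (mc / "ue_count").read_text().strip()  # uncorrected errors
    print(f"{mc.name}: corrected={ce} uncorrected={ue}")
```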
 
Do you think that as opposed to the days where something like Myst or 7th Guest were "killer apps" that pushed adoption of CD-ROM drives, shit like ray tracing and VR are not only being held back by literal hardware limitations in the sense that consumers can't even get their goddamn hands on the physical materials to even try them, but also because there are no good or interesting reasons to even want to give a shit about them?

What I mean is that not only can I not go out tomorrow and walk (lol) into and out of my local parts shop with the components necessary to become an adopter (at least not for a price remotely close to reasonable if at all) but that there's actually no compelling reason to not just sit where I am with dated but functional hardware and gradually drop settings to hit a framerate that makes me happy?

In other less rambling words: there ain't any good or ground-breaking/revolutionary games so who cares?
Ray tracing was already understood to be a gimmick on the RTX 2000 series GPUs that would take a few years to really catch on. 2023 if you believe this guy, but 2025 seems more likely. Once there have been a few generations of GPUs from AMD and Nvidia with ray tracing acceleration, and consoles have been refreshed, it will become more common and will have been around long enough for some "good game" to really take advantage of it. Adoption will just naturally happen as it benefits game developers, and the hardware accelerators will be in everything, even phones.

VR actually needs something killer to break out of its niche, and it needs new hardware advancements like varifocal lenses to be less crappy. I don't think Half-Life: Alyx was enough.

Half-Life: Alyx received acclaim for its graphics, voice acting, narrative, and atmosphere, and has been cited as VR's first killer app.

I bought a cheap Celeron netbook on a whim that I might or might not keep, 11" tiny and fanless with a battery runtime of about 12 hours. Decent internet machine, but what really blew my mind was it running Daggerfall in DOSBox without breaking even a tiny sweat. On the surface that's no surprise, but I remember perfectly getting Daggerfall the first time and being annoyed at how poorly it ran in some circumstances on the system I'd paid several thousand for back then. It also does Amiga emulation and even some modern (if not recent) 2D games decently. This really is the future. I often forget how mindblowing some of the things that are everyday now would've been to me back then.
OpenMW with cranked up view distance is the true test. Also, allow me to guess your CPU... N4020?
 
Exactly. I got it very cheap, for a bit over 100 euros, as "used" from a commercial seller who was selling these in bulk (it was actually never used, though; that was fairly obvious). eBay in my parts is flooded with these at the moment. I have a feeling it has something to do with COVID and the attempts at homeschooling my government made, which all fell flat.

Usually I'm not a fan of Intel, but it's alright. The only thing I'd really criticize is the screen. 1366x768 is actually fine and has fairly nice DPI at that size (lowering the resolution to 1360x768 makes sense for a fullscreen Linux terminal, though, as no common font size divides 1366 evenly; I mostly use bitmap fonts and don't care much about GUIs); it's just that the screen is generally kinda crappy, from viewing angles to color reproduction. It wouldn't be too difficult to replace the panel with a 1080p IPS one, though. It's such a common notebook and everything is so standardized these days that you could theoretically buy all the spare parts from China and put one together yourself.
 
Ray tracing was already understood to be a gimmick on the RTX 2000 series GPUs that would take a few years to really catch on. 2023 if you believe this guy, but 2025 seems more likely. Once there have been a few generations of GPUs from AMD and Nvidia with ray tracing acceleration, and consoles have been refreshed, it will become more common and will have been around long enough for some "good game" to really take advantage of it. Adoption will just naturally happen as it benefits game developers, and the hardware accelerators will be in everything, even phones.
what advantages tho? looking slightly better in areas no one notices unless they're pointed out, while requiring orders of magnitude more processing power? most consumers get no advantage from raytracing; that's why it's a meme mainly driven by marketing to sell new hardware as the NEXT BEST THING IN GRAPHICS TECH (even though the idea is literally centuries old).

it's like pushing for more polygons in models "so they look more realistic and stuff", completely ignoring the diminishing returns or the point of it.

The strange thing is that the Valve Index has frequently shown up on the Steam Top 10 bestseller charts for quite some time, Facebook is still trucking along with Oculus, and Sony is making a PSVR successor (unless that was fake news), so people are buying the damn things. No one seems to really be talking about the games though, and that's weird. It can't all be porn, can it?
With some exceptions, a lot of the games seem to be the VR equivalent of mobile games. Maybe people scoff at the idea of paying more than $9.99/14.99 for a game to use on their $1000 headpiece running on an even more expensive PC.

Maybe the hypothetical push for "metaverse" crap will do something. In Facebook's scenario lots of people will have VR helmets at home because they do virtual meetings with them, so why wouldn't the father let the kids mess around with it after dinner? It would be similar to how kids played PC games on dad's work computer in the late 80s/early 90s, and that really built up a future market.

Some interesting things have happened: RE4 can be played with free movement, and those I've seen play it don't seem to have an issue with it. When they said that the problem with VR is that the player can't move like in a standard FPS or they'll get sick, my immediate thought was that people were pussies and it would pass, and that seems to have started happening. I played a Quake 1 port in VR a couple of years ago and that fucker is fast and jarring, but I never felt sick.
When people played/streamed early games on the Oculus DK2, a lot of them had the familiar FPS movement and it didn't seem like much of a problem.

My point with the last paragraph is that the 'do not's have probably hampered game development, and some are fading away. I think it's possible to do some truly wild shit in VR, things that are not possible on conventional game platforms, but the current wisdom is "oh no, don't do that, people will feel uncomfortable, make a shooting gallery instead".
software is another chicken/egg problem: while it sells, it doesn't sell enough to warrant AAA development, and since there are no "killer apps", there's no reason for normie joe casual to buy into it. vr is the hardware version of native linux ports.

some other factors are side effects. it might have worked for you, but the big issue for now is still that it affects everyone differently. it's a hard sell to have normie joe casual give it a spin only for him to puke his guts out and be out of commission for 2+ hours from nausea. "you just need to get used to it" doesn't help much when some people simply can't (similar to how some people still get motion sickness from a pancake screen).
the tech is also still not old enough that all the kinks have been figured out. it's finally technically possible to use to a satisfying degree without costing several paychecks, so now people can move on to figuring out the next hurdles. not only did devs have to try to diminish those side effects, there were also no best practices or common formulas established yet. if you go back 15-20 years, most games feel janky with weird input schemes, sometimes completely different per game, while these days everything will be exactly like you'd expect it to be, which in turn makes it easy for normie joe casual to pick up and get into fast.

and last of all, you still need a use case. while VR is great, it's still too much of a hassle and too expensive to be attractive to the mainstream. most genres simply don't work or need more technical advancement, while the ones where it works are more niche, thus limited to a niche crowd, leading back to the chicken/egg problem from the beginning (not to mention the increased costs, since you're literally in the game and can directly eyeball every texture and model). for example, take the most popular games right now: how would they work in vr? why would you even want them in vr when it's much easier, cheaper, and more comfortable to play them the way they work right now? most people simply have no reason to dip into VR besides the occasional gimmick or playing it at a friend's house. so ok, let's try something new to replace them: how, and what could you do with current tech? that would also require expertise and the ability to innovate, which barely exists anymore in the modern video games industry (and indies don't have the money for it, hence the mobile-tier length and simplicity of their games).

as for the metaverse, it's a nice idea; the problem is again what's the point, and the devil is in the details. why would I want to sit in a virtual conference, concert or whatever as the virtual equivalent of a lego figure, because there is no tech to properly capture all your gestures and, more importantly, your facial expressions to give a proper virtual representation of yourself? good luck selling that to companies when the beancounter asks "why not use microsoft teams for that?". for now it's the zoomer version of playstation home (BUT IN 3D THIS TIME!), and will be for a long while, because outside of deep-dive tech (which works completely differently and has its own literal mountain ranges of issues to figure out: who in his right mind would give facebook access to his brain or even optic nerve? how would you research it without turning loads of human guinea pigs into vegetables in the process?) even miniaturization and increases in performance won't solve the fundamental issues.
 
@ZMOT
as for the metaverse, it's a nice idea; the problem is again what's the point, and the devil is in the details. why would I want to sit in a virtual conference, concert or whatever as the virtual equivalent of a lego figure, because there is no tech to properly capture all your gestures and, more importantly, your facial expressions to give a proper virtual representation of yourself? good luck selling that to companies when the beancounter asks "why not use zoom for that?".
I was banking on stupidity and trends for that one. Like, phone conferences worked fine up until corona hit, and suddenly we had to use Teams. On one end there are two people and a laptop, on the other there are three people and a laptop. Bunching together so everyone can see and be seen isn't an option because of workplace social distancing rules, but this is totes better than using the old conference phone.

Someplace, somewhere, there are 17 people in a conference room with 17 laptops zooming into a meeting with 7 other people on the other side of the building, speakers blaring, everyone annoyed and agitated by this forced stupidity. Someone higher up notices the frustration and remembers reading an article about putting VR sets on dairy cows to calm them down...
 
what advantages tho? looking slightly better in areas no one notices unless they're pointed out, while requiring orders of magnitude more processing power? most consumers get no advantage from raytracing; that's why it's a meme mainly driven by marketing to sell new hardware as the NEXT BEST THING IN GRAPHICS TECH (even though the idea is literally centuries old).

it's like pushing for more polygons in models "so they look more realistic and stuff", completely ignoring the diminishing returns or the point of it.
@ZMOT As long as complete photorealism hasn't been achieved, the industry should push graphics forward using techniques such as ray tracing. Game developers will want to use it since it makes lighting easier to do. Whether or not you want to buy into it is up to you, but your hand may be forced if new games start ditching rasterization entirely several years from now. If you don't care because new games suck, then you have no problem, but every new GPU, APU, and phone will have ray tracing acceleration anyway.

It won't require several orders of magnitude more performance. The AI denoising techniques used are what allow it to be done in real time in the first place. Rasterization and ray tracing performance will increase in parallel with new generations of GPUs until nobody cares about rasterization anymore.
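To illustrate the denoising point with a toy example (not any vendor's actual pipeline): a 1-sample-per-pixel estimate is hopelessly noisy, yet even a naive filter recovers a usable image, which is why a modest ray budget plus smart filtering beats brute-forcing more rays:

```python
# Toy sketch: noisy low-sample estimate + simple box filter. Real-time
# denoisers use trained filters, temporal accumulation, and guide buffers;
# this only shows the principle in miniature.
import numpy as np

rng = np.random.default_rng(0)
truth = np.tile(np.linspace(0.0, 1.0, 64), (64, 1))  # smooth "radiance" field
noisy = rng.poisson(truth * 8) / 8.0  # crude stand-in for 1-spp shot noise

k = 4  # filter radius; window is (2k+1) x (2k+1)
pad = np.pad(noisy, k, mode="edge")
denoised = np.zeros_like(noisy)
for dy in range(2 * k + 1):
    for dx in range(2 * k + 1):
        denoised += pad[dy:dy + 64, dx:dx + 64]
denoised /= (2 * k + 1) ** 2

def rmse(img):
    return float(np.sqrt(((img - truth) ** 2).mean()))

print(f"noisy RMSE:    {rmse(noisy):.3f}")
print(f"denoised RMSE: {rmse(denoised):.3f}")  # far lower, at some cost in detail
```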
 