GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I'm still trying to figure out how the fuck a lot of these tech youtubers can get ahold of these high end cards when the average consumer can't find them anywhere.
well you see, they're techtubers. of course they're gonna get the cards. half the time, if they're big enough, they're straight up given to them (they usually have to give them back, of course, but still)
 
  • Agree
Reactions: bigoogabaloogas
I'm still running an RX 580 in my system (and have been for the last 5 years or so), and it's been just fine. It chugs through games from the Xbox One / PS4 era or earlier, and I don't really plan on upgrading to Windows 11. From what I'm hearing, the 24H2 update has fucked with a lot of older titles, so I'll be sticking with Windows 10 for the foreseeable future. This PC is mostly used as a media center / emulation box / older-game machine. I built it largely to replace my Xbox, but I have a PlayStation 5 Slim for the newer titles that catch my interest. This setup has worked out fine for me for the last few years.
 
I'm still trying to figure out how the fuck a lot of these tech youtubers can get ahold of these high end cards when the average consumer can't find them anywhere.
It's not that surprising.

If they
  1. don't get review copies (shills like LTT will ALWAYS get these; anyone who drops a release video on day 1 obviously got one)
  2. don't succeed in sniping a card (with a lot more monetary incentive to do so than someone who just wants to give Jensen Huang his money for better framegen)
  3. don't just straight up pay for a scalper card, given that they can easily make the money back
(which will cover almost everybody actually making money from YT, obviously), then any substantial channel with a big group of Discord sycophants will just be able to borrow a card from some cashed-up cuck who did manage to buy one (possibly even from a scalper) and who would happily lend his rapidly depreciating 'investment' to his parasocial daddy for five minutes of voicechat.
 
  • Like
Reactions: The Ugly One
I've gone full AMD as of some time ago, and other than not having DLSS, I really don't see many issues. From what I've read, AMD is superior in bare-bones price-to-performance, which is why it's the skeleton for handheld decks and the like. Nvidia is apparently only kept artificially alive by every single developer on earth catering to them? It feels like people have gotten so used to Nvidia GPUs they'd sooner buy a dogshit Nvidia piece than a "this is literally superior in every fucking way" card from Intel.

Kinda funny looking back at how big an investment my 5800X3D seemed, but it was mostly installation-wise and not price-wise. Hm.
 
I'm still trying to figure out how the fuck a lot of these tech youtubers can get ahold of these high end cards when the average consumer can't find them anywhere.
Incidentally, the review card batches were NOT affected by the Nvidia ROP problem.

RTX 5090s supplied to independent reviewers had their dies stamped with the term “Press Build”. While Nvidia explicitly denied any performance enhancements, it's speculated that these chips had all 176 ROPs enabled, which has led to claims that certain retail variants might be carrying a “defective” or slightly cut-down GB202 chip with fewer functioning ROPs. Note that this is still a hypothesis, not a proven fact.
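For scale, here's the back-of-the-envelope math on the retail ROP shortfall that was reported at the time (168 functioning ROPs instead of the full 176); the figures are from public reporting, the snippet just does the division:

```python
# Reported figures: the full 5090 spec is 176 ROPs; affected retail cards
# were reported with 168. This just quantifies the shortfall.
full_rops = 176
affected_rops = 168

missing = full_rops - affected_rops
print(f"{missing} ROPs missing, {missing / full_rops:.1%} of the raster back-end")
# -> 8 ROPs missing, 4.5% of the raster back-end
```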
 
X3D chips are godsends for true autist games: tactics/strategy/simulation games and Factorio
I got it more or less for WoW, but didn't feel the perma 120 frames because my render scale was set to 105%, which apparently halves your fps. I swear at this point specs don't matter shit because games don't use half your gear. You need to overshoot by 200% to be guaranteed you'll get even 60 frames in a contained Yakuza scene. I somehow got 120 average in Wilds with all the fake AI shit on, so I hope I can get 60 on high without the fancy effects, but we'll see. If I can't? It literally ain't on me. With any manner of optimization I could run new games really well.
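For what it's worth, a quick sanity check on what 105% render scale actually costs in raw pixels, assuming the slider scales each axis (the usual convention) and a hypothetical 1440p output; the resolution and the per-axis assumption are mine, not from the post:

```python
# Hypothetical 1440p output with a 105% render scale, assuming the slider
# multiplies each axis. Roughly 10% more pixels, which on pixel cost alone
# would not explain a halved framerate.
base_w, base_h = 2560, 1440
scale = 1.05

base_pixels = base_w * base_h
scaled_pixels = (base_w * scale) * (base_h * scale)
print(f"{scaled_pixels / base_pixels:.4f}x the pixels")   # -> 1.1025x
```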
 
It still ran at a dodgy 20-25 fps in 480i on its target platform.
Ight, now take note of how I said The Witcher 3 and GTA V, both released in 2015, ran at playable framerates on a GTX 1060, a mid-range GPU from 2016, in native 1080p. Granted, it didn't hold a stable 60 fps, but it never dropped below 40 fps, so it was firmly in the acceptable range on period-accurate hardware from the PS4 era, which according to your S-graph was the point where you had to justify ramping up hardware demand for better graphics.

I'd like to know exactly why it is that between Pascal and Turing we suddenly ended up in a rat race where all of that ceased to matter, and how much of it was "it's just how it'll be from now on" and how much of it was "Jensen said so, so now believe it or die". The way I see it, everyone had it pretty much figured out a decade ago, and it was a straight road from there.
more realism
Which isn't something everyone demands. Look at Minecraft and Fortnite, two of the most popular games around. A pixel-art block game and a cartoonish art style. Gamers are fine with stylized graphics if the game itself is good; the whole realism chase is fueled mainly by the industry's delusion that games HAVE to be photorealistic or else they won't be good. Not to mention, games that don't go full photorealism and stick to a more stylized look are the ones that age the best and look the best.

It's very easy to land in the uncanny valley with overt realism and break the suspension of disbelief, much less so if you give it that distinctive video game look. Again, The Witcher 3 and GTA V. They weren't 100% photorealistic; they had just enough stylization to not cause any issues related to photorealism, still looked good, still played well, and still hold up to this day. I'm saying this because the justification of "we have to make everything run like shit for the sake of ray-traced everything and photorealism" is asinine. No one is demanding that everything be as photorealistic as possible to justify this.
problems with tricks like cube mapping and SSR
Yes, both have issues. Cubemapping is limited and SSR is buggy. But most other lighting effects we already fake very well at low computational cost, where ray tracing would bring a major performance hit for virtually no visual improvement, so why not ray trace the parts that no conventional tricks can match and keep everything else as-is, so you get both good visuals and good performance? Why not be smart about it? Why toss everything in the garbage for the sake of ray tracing?
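To make the "ray trace only what the fakes can't do" idea concrete, here's a toy sketch of hybrid reflections: SSR first, a cubemap for rough surfaces, and a real ray only for the mirror-like off-screen cases. All the function names, thresholds, and stub renderers here are illustrative, not any actual engine's API:

```python
"""Toy sketch of hybrid reflections: spend rays only where screen-space
reflections (SSR) have no data; fall back to a cubemap for rough surfaces.
Everything here is a stand-in, not a real engine interface."""

from dataclasses import dataclass
from typing import Optional, Tuple

Color = Tuple[float, float, float]

@dataclass
class Pixel:
    roughness: float            # 0 = mirror, 1 = fully diffuse
    reflected_on_screen: bool   # is the reflected geometry visible in the frame buffer?

def ssr_lookup(p: Pixel) -> Optional[Color]:
    # Stand-in for a screen-space march: only works if the source is on screen.
    return (0.6, 0.6, 0.6) if p.reflected_on_screen else None

def cubemap_lookup(p: Pixel) -> Color:
    # Stand-in for a prefiltered cubemap sample: cheap, low detail.
    return (0.5, 0.5, 0.5)

def trace_ray(p: Pixel) -> Color:
    # Stand-in for the expensive path: an actual ray into the scene.
    return (0.9, 0.9, 0.9)

def shade_reflection(p: Pixel) -> Color:
    hit = ssr_lookup(p)
    if hit is not None:
        return hit                  # SSR had the answer: no ray spent
    if p.roughness > 0.4:
        return cubemap_lookup(p)    # rough surface: a cubemap is visually close enough
    return trace_ray(p)             # mirror-like, off-screen case: the only one worth a ray

if __name__ == "__main__":
    print(shade_reflection(Pixel(roughness=0.1, reflected_on_screen=False)))
```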

The issue at hand is:
-Hardware demands in games are getting way too high for the level of visual quality we get in return

And the only two explanations to these issues are:
-Nvidia being a dick
-Devs being lazy

What isn't an explanation is:
-It's just the way it is, diminishing returns, deal with it

Unless, of course, you're one of the two aforementioned parties, at which point it's just shifting the blame for your own mistakes onto the people who are ultimately affected by them.
 
Ight, now take note of how I said The Witcher 3 and GTA V, both released in 2015, ran at playable framerates on a GTX 1060, a mid-range GPU from 2016, in native 1080p. Granted, it didn't hold a stable 60 fps, but it never dropped below 40 fps, so it was firmly in the acceptable range on period-accurate hardware from the PS4 era, which according to your S-graph was the point where you had to justify ramping up hardware demand for better graphics.

I'd like to know exactly why it is that between Pascal and Turing we suddenly ended up in a rat race where all of that ceased to matter, and how much of it was "it's just how it'll be from now on" and how much of it was "Jensen said so, so now believe it or die". The way I see it, everyone had it pretty much figured out a decade ago, and it was a straight road from there.

I would like to know exactly what games you have in mind that were released in 2021 and cannot run at an unstable 60 fps, sometimes dropping as low as 40 fps, on a 4060, a mid-range GPU from 2022.

Which isn't something everyone demands. Look at Minecraft and Fortnite, two of the most popular games around.

You're changing the subject. Your complaint was that more processing power isn't delivering good enough graphics, and I'm pointing out that the issue is that as the list of things we can't render in real time shrinks, the computational cost of whittling that list down is increasing exponentially. Fortnite being an 8-year-old game designed to run on 15-year-old hardware doesn't change that.

so why not ray trace the parts that no conventional tricks can match

This is called partial scene ray tracing, it's already being done, it's still really expensive, and it can't do shadows.

And the only two explanations to these issues are:
-Nvidia being a dick
-Devs being lazy

What isn't an explanation is:
-It's just the way it is, diminishing returns, deal with it

The cost of ray tracing is well known and is not based on how nice the GPU company's CEO is or how hard the developers work. It's a very old technology that Pixar uses in every release. There is a hard mathematical floor, and that floor is really high. Lisa Su is one of the nicest CEOs on the Fortune 500, probably the nicest, and there isn't a single studio that's managed to make ray tracing run acceptably on a 5700 XT, nor will there ever be. Not only that, but AMD cards are worse at ray tracing than NVIDIA cards.
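To put a rough number on that floor: even one ray per pixel per bounce at 1080p60 is already hundreds of millions of rays a second, before any denoising or extra samples. The sample and bounce counts below are illustrative assumptions, just to show the order of magnitude:

```python
# Back-of-the-envelope ray budget for real-time ray tracing.
# The 1 sample/pixel and 2 bounces are illustrative assumptions.
width, height, fps = 1920, 1080, 60
samples_per_pixel = 1
bounces = 2   # primary hit plus one secondary bounce

rays_per_second = width * height * fps * samples_per_pixel * bounces
print(f"{rays_per_second:,} rays per second")   # -> 248,832,000
```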

Another example: 4K is exactly 4x the computational cost of 1080p, even though to many people the difference feels marginal next to the leap from 480p to 720p, which was computationally a smaller jump. No amount of hard work or niceness will change the fact that 8,294,400 is exactly four times 2,073,600. The visual difference is marginal because as things get tinier, our eyes get worse at seeing them. This is simply an outcome of biological evolution; again, the fact that Lisa Su is nice doesn't change this on AMD cards. It's just hard math, embedded in the foundations of the universe itself.
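The pixel counts behind those numbers, for anyone who wants to check them (480p is taken as 640x480 here):

```python
# Pixel counts behind the 4x claim. 480p is assumed to be 640x480.
p_1080 = 1920 * 1080   # 2,073,600
p_4k   = 3840 * 2160   # 8,294,400
p_480  =  640 *  480   #   307,200
p_720  = 1280 *  720   #   921,600

print(p_4k / p_1080)   # 4.0 -> 4K really is exactly 4x the pixels of 1080p
print(p_720 / p_480)   # 3.0 -> the 480p-to-720p jump was a smaller 3x
```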
 
Whoever at Nvidia decided to discontinue the 40 series deserves to be beaten with a tire iron.
The 40 series was the GOAT and I'm tired of pretending it wasn't.
The 50 series is made on the same process node as the 40 series, so every 40 series chip they make is a 50 series chip they don't make. There's no reason to continue making the 40 series at all.
I mean you're right, I just wish they cost the same. That and saying 4090 had a certain feel to it that saying 5090 just doesn't.
I would like to know exactly what games you have in mind that were released in 2021 and cannot run at an unstable 60 fps, sometimes dropping as low as 40 fps, on a 4060, a mid-range GPU from 2022.
What resolution, too? The 4060 gets ragged on, but it's good where it shines: 1080p gaming. It can even go beyond that under certain circumstances.
 
I mean you're right, I just wish they cost the same.

Given that 5070 Tis are going for $1500 on ebay, and there's an 18 month wait for Blackwell datacenter cards, GPUs are underpriced if anything. Not that I "like" this. I'm not paying two fucking thousand dollars for a toy. I spent $1500 on this PC, and that felt like a lot.

But the Biden admin just printed off another 1.8 trillion dollars last year, the Trump admin is set to print off just as much (DOGE is already floating the idea of not reducing the deficit at all and just redirecting all that printing straight to consumer spending), and the AI bubble still hasn't popped, so demand is still massively outstripping supply.
 
Given that 5070 Tis are going for $1500 on ebay, and there's an 18 month wait for Blackwell datacenter cards, GPUs are underpriced if anything. Not that I "like" this. I'm not paying two fucking thousand dollars for a toy. I spent $1500 on this PC, and that felt like a lot.

But the Biden admin just printed off another 1.8 trillion dollars last year, the Trump admin is set to print off just as much (DOGE is already floating the idea of not reducing the deficit at all and just redirecting all that printing straight to consumer spending), and the AI bubble still hasn't popped, so demand is still massively outstripping supply.
I'd wait at least a year before even considering a 50 series card. Your rig isn't going to explode before then. Just wait for supply to catch up naturally. And *hopefully* inflation to equalize.
 
  • Like
Reactions: Flaming Dumpster
My biggest worry is my 4090 catching fire after the warranty ends, or it dying within the warranty period and the manufacturer deciding they want to give me a 5080 instead of a 5090.

I imagine due to the cost and the limited supply, manufacturers are all going to go to extreme lengths to try and screw 4090 owners who end up having to RMA.
 
My biggest worry is my 4090 catching fire after the warranty ends, or it dying within the warranty period and the manufacturer deciding they want to give me a 5080 instead of a 5090.

I imagine due to the cost and the limited supply, manufacturers are all going to go to extreme lengths to try and screw 4090 owners who end up having to RMA.
Just tell them your use case is AI, so they’ll just have to replace it with an H100 if they can’t find you another 4090.
 
My biggest worry is my 4090 catching fire after the warranty ends, or it dying within the warranty period and the manufacturer deciding they want to give me a 5080 instead of a 5090.

I imagine due to the cost and the limited supply, manufacturers are all going to go to extreme lengths to try and screw 4090 owners who end up having to RMA.
4090 Chad? More like 4090 Charred
 
I think we're beginning to hit a wall graphically
IIRC the Nvidia CEO said the same thing about their graphics cards; they're focusing more on AI because they can't really do more with graphics cards beyond upping the voltage and CUDA cores. In a way we're inching closer and closer back to those wall-sized computers (pretend data centers don't exist).
 
IIRC the Nvidia CEO said the same thing about their graphics cards; they're focusing more on AI because they can't really do more with graphics cards beyond upping the voltage and CUDA cores. In a way we're inching closer and closer back to those wall-sized computers (pretend data centers don't exist).
we're also inching closer to Intel being good enough for gaming, even their iGPUs.

the day you can get a CPU or APU with the performance of a 3060 is the day the GPU market starts dying.

Right now the best is the AMD Ryzen 7 8700G, whose iGPU, the Radeon 780M, is pretty close to an RTX 2050 Mobile in power.
 