GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The point is that a large percentage of the playerbase can't even touch DLSS3
The segment which can't use DLSS is "people with old GPUs." NVIDIA doesn't care about what sold GPUs 10 years ago. They care about what will sell them today and tomorrow. What all old computer hardware has in common is that it eventually gets replaced. What matters is that something like 85% of GPUs sold in the last 3 or 4 years support it, and it's getting heavy use. That's why both Intel and AMD are investing in inference-based upscaling. It's just not going away.

People were making this exact same argument against hardware T&L 25 years ago. Voodoo cards didn't support it. Riva TNT and Rage 128 didn't support it. "Nobody can use this except that tiny minority with GeForce 256 cards!" But then, as now, the new technology always wins.

EDIT: There are AI models that can take a drawing as input and produce photorealistic output. It may very well be that the next massive leap in rendering isn't yet another marginal gain from even more shader units, but full-scene inferencing to generate photorealistic images, or however else the model is tuned.
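For anyone curious what that drawing-to-photo step looks like in practice, here's a minimal offline sketch using the diffusers img2img pipeline; the model name and parameters are just illustrative choices on my part, and this is batch inference, nothing like real-time rendering.

[CODE=python]
# Minimal sketch: push a rough drawing through an off-the-shelf img2img
# diffusion pipeline to get a photorealistic-ish image. Model choice and
# parameters are illustrative only.
import torch
from PIL import Image
from diffusers import StableDiffusionImg2ImgPipeline

pipe = StableDiffusionImg2ImgPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

sketch = Image.open("rough_drawing.png").convert("RGB").resize((512, 512))

result = pipe(
    prompt="photorealistic landscape, natural lighting, high detail",
    image=sketch,
    strength=0.6,        # how far the model may drift from the input drawing
    guidance_scale=7.5,  # how strongly the prompt steers the output
).images[0]

result.save("photoreal_output.png")
[/CODE]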
 
So is there any news about Intel Arc B580 restocks? From what I see, all the supposed "restock" dates sellers like Newegg listed have come and gone, with no real noticeable increase in supply/availability (and thereby still high prices). This is making me kinda consider looking into alternatives like the RX 7600 XT.
 
So is there any news about Intel Arc B580 restocks? From what I see, all the supposed "restock" dates sellers like Newegg listed have come and gone, with no real noticeable increase in supply/availability (and thereby still high prices). This is making me kinda consider looking into alternatives like the RX 7600 XT.
I got a notice that newegg had more stock a few days ago, but they ran out in ten minutes. I've been using nowinstock to keep tabs on it but there really hasn't been much restocking here.
 
I got a notice that newegg had more stock a few days ago, but they ran out in ten minutes. I've been using nowinstock to keep tabs on it but there really hasn't been much restocking here.
So wanting a B580 is effectively waging a war against scalpers and scalper bots, right? This, plus new issues being found with the card every week (like the overhead issue discovered a few days back), is really putting me off it. A shame, 'cause I was really hoping to buy it.
 
So wanting a B580 is effectively waging a war against scalpers and scalper bots, right? This, plus new issues being found with the card every week (like the overhead issue discovered a few days back), is really putting me off it. A shame, 'cause I was really hoping to buy it.
I keep getting drawn to the A770, which I can get used for less than the new (non-scalper) price of the B580. Their performance seems roughly equivalent, but I suppose the new technology in the B580 will pay dividends.
 
I keep getting drawn to the A770, which I can get used for less than the new (non-scalper) price of the B580. Their performance seems roughly equivalent, but I suppose the new technology in the B580 will pay dividends.
Actually, yeah, that seems like a good suggestion. Seems like it's more or less competing with the 7600 XT at a lower price. Are there any major issues with the A770?
 
I've got to be honest, every time I look I get information that contradicts the previous information I had.
I've seen that issue too when looking at A770 benchmarks. I think it's safe to say, though, that it's about as good as the 7600 XT while being cheaper. Some benchmarks show the A770 ahead, some show it behind, but it seems to be the better deal, imo. I'll see if any other Kiwis pipe in, but otherwise, if the B580 stays out of stock (or develops even more issues), I might just grab the A770.
 
So is there any news about Intel Arc B580 restocks? From what I see, all the supposed "restock" dates sellers like Newegg listed have come and gone, with no real noticeable increase in supply/availability (and thereby still high prices). This is making me kinda consider looking into alternatives like the RX 7600 XT.
Clearly some people are getting the B580 at MSRP, but it is a low-volume card that Intel loses money on. You can keep working at it or pivot to your alternatives.

Low-end RDNA4 (such as a 7600 XT replacement) could be better, but there's no guarantee we'll see that before March/April. AMD's CES keynote is on Monday (J6!), but they may only announce January launch dates for the 9070 XT and 9070 non-XT. The 9070 XT could be $500-600, the non-XT likely $400+.

The B570 (10 GB) launches on January 16. It could alleviate the supply issues, or you could get it instead, but at its initial $219 I doubt it's going to be a popular choice next to a $249 MSRP B580. It would look better at 75-80% of the B580's price instead of 88%.
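Quick back-of-the-envelope on those percentages (the MSRPs are the announced US prices; the 75-80% target is just my own framing):

[CODE=python]
# Back-of-the-envelope MSRP comparison; prices are announced USD MSRPs.
b580_msrp = 249
b570_msrp = 219

print(f"B570 at ${b570_msrp} is {b570_msrp / b580_msrp:.0%} of the B580's price")
# -> roughly 88%

for ratio in (0.75, 0.80):
    print(f"At {ratio:.0%} of the B580, the B570 would need to be ~${b580_msrp * ratio:.0f}")
# -> ~$187 and ~$199
[/CODE]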
 
4090 was melting connectors
To be fair, that wasn't because of the GPU, but because of the dumb connector they chose to use. The normal Molex connector other graphics cards use is rated for twice what the PCIe spec allows a card to draw through it, so there's plenty of margin to keep it safe even if it isn't plugged in correctly. The new connector is specced within a few percent of what it's physically rated for, so if it isn't plugged in correctly it quickly overheats once the GPU starts doing something intensive. AFAIK every case where a 4090 melted its power cord involved the user making the cable do a tight ninety-degree turn right at the connector. That misaligns the pins and drastically reduces their contact surface, increasing resistance even though the connector appears to be fully plugged in.
Things like that are why good engineers leave a safety margin, Nvidia. You should have just used EPS12V instead.
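If you want the margin spelled out, here's the rough arithmetic; the per-pin current ratings are the commonly cited ballpark figures, not official worst-case specs, so treat the exact ratios as approximate.

[CODE=python]
# Rough safety-margin comparison between the classic 8-pin PCIe connector and
# 12VHPWR. Per-pin ampacities are commonly cited ballpark figures.
def margin(power_pins, amps_per_pin, spec_watts, volts=12.0):
    physical_watts = power_pins * amps_per_pin * volts
    return physical_watts, physical_watts / spec_watts

# 8-pin PCIe: 3 x 12V pins, ~8 A each, spec'd for 150 W per connector
pcie8_capacity, pcie8_margin = margin(3, 8.0, 150)

# 12VHPWR: 6 x 12V pins, ~9.5 A each, spec'd for up to 600 W
hpwr_capacity, hpwr_margin = margin(6, 9.5, 600)

print(f"8-pin PCIe: ~{pcie8_capacity:.0f} W physical vs 150 W spec ({pcie8_margin:.1f}x headroom)")
print(f"12VHPWR:    ~{hpwr_capacity:.0f} W physical vs 600 W spec ({hpwr_margin:.1f}x headroom)")
[/CODE]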
 
To be fair, that wasn't because of the GPU, but because of the dumb connector they chose to use. The normal Molex connector other graphics cards use is rated for twice what the PCIe spec allows a card to draw through it, so there's plenty of margin to keep it safe even if it isn't plugged in correctly. The new connector is specced within a few percent of what it's physically rated for, so if it isn't plugged in correctly it quickly overheats once the GPU starts doing something intensive. AFAIK every case where a 4090 melted its power cord involved the user making the cable do a tight ninety-degree turn right at the connector. That misaligns the pins and drastically reduces their contact surface, increasing resistance even though the connector appears to be fully plugged in.
Things like that are why good engineers leave a safety margin, Nvidia. You should have just used EPS12V instead.
Or included a couple right-angle adaptors in various orientations
 
These videos show off pretty well what I was talking about above: AI-enhanced footage of Shadow of the Colossus, Perfect Dark, F-Zero X, and Modern Warfare 2.
All they did was take video game footage and feed it through an AI enhancer, which just blindly enhances whatever video it's given. Obviously this is not real-time inferencing, and the source input on the really old games is so low-fidelity that the AI has to do an immense amount of lifting, but you can easily imagine that, given higher-quality input like a current-gen game, plus guides like the motion vectors DLSS already uses, a much lighter-weight model could enhance visuals quite a bit.
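To give a flavor of the motion-vector guidance idea: temporal upscalers reproject the previous frame along per-pixel motion vectors so the network gets a cheap hint about what each pixel looked like last frame. A toy version of just that reprojection step (plain PyTorch, with made-up tensor conventions, not NVIDIA's actual pipeline) might look like this:

[CODE=python]
# Toy temporal guidance: warp the previous frame along per-pixel motion
# vectors so it lines up with the current frame. Real upscalers feed the
# warped history (plus depth, jitter, etc.) into a network; this is only
# the reprojection step, with assumed tensor layouts.
import torch
import torch.nn.functional as F

def reproject(prev_frame: torch.Tensor, motion_vectors: torch.Tensor) -> torch.Tensor:
    """prev_frame: (1, 3, H, W); motion_vectors: (1, 2, H, W) in pixels (x, y)."""
    _, _, h, w = prev_frame.shape

    # Base sampling grid in normalized [-1, 1] coordinates.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, h), torch.linspace(-1, 1, w), indexing="ij"
    )
    base = torch.stack((xs, ys), dim=-1).unsqueeze(0)  # (1, H, W, 2)

    # Convert pixel-space motion into normalized offsets and shift the grid
    # to sample where each pixel came from in the previous frame.
    offset = motion_vectors.permute(0, 2, 3, 1) * torch.tensor([2.0 / w, 2.0 / h])
    grid = base - offset

    return F.grid_sample(prev_frame, grid, align_corners=True)

prev = torch.rand(1, 3, 270, 480)   # low-res previous frame
mv = torch.zeros(1, 2, 270, 480)    # zero motion: output matches input
warped = reproject(prev, mv)
[/CODE]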
 
I got lucky a few days ago waking up in the middle of the night to check stocks and managed to snag the ASRock Steel Challenger version of the Intel B580 off PC-Canada. I'll let you know how much suffering it causes me but honestly even at $414.99 vs the $359.99 LE version from Intel it's still pretty good considering the cooler has an extra fan. They didn't even have time to upload a damn image before I was clicking buy. Arrives on Friday. Wish me luck, I'll report back.
 
The point is that a large percentage of the playerbase can't even touch DLSS3 and they will be segmenting even more with DLSS4. Developers are going to care less about supporting it when 99% of people can't even use it.

And how do they determine that 70% of people use DLSS? What telemetry are they using? If I turn it on for a short period (before turning it off), does that count? Some games even have it on by default.
It's bullshit.
Many, many games have it on by default, from what I've seen.
 
Some games even have it on by default.
Does that somehow make it cheating? If they turn it on by default because your eyes can't actually resolve anything above 1600x1200 anyway, and DLSS spares you an unnecessary frame rate hit from rendering at ridiculously high resolutions, that seems like a very sensible choice by the developers.
 
Does that somehow make it cheating? If they turn it on by default because your eyes can't actually resolve anything above 1600x1200 anyway, and DLSS spares you an unnecessary frame rate hit from rendering at ridiculously high resolutions, that seems like a very sensible choice by the developers.

It's not about running at 1600x1200. It's about running at higher settings. Nearly every effect you can adjust in a game has a cost in fill rate. By reducing the total number of pixels rasterized, DLSS lessens the cost of any given effect, enabling higher frame rates. For example, I use DLSS when running Darktide at 1080p on my laptop. My other options are to run at bare minimum settings or get 40 fps.
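To put rough numbers on the fill-rate saving (the per-axis scale factors are the commonly quoted DLSS mode values, so treat the exact figures as approximate):

[CODE=python]
# How much rasterization work upscaling saves at 1080p output; per-axis scale
# factors are the commonly quoted DLSS mode values, so numbers are approximate.
output = (1920, 1080)
modes = {"Native": 1.0, "Quality": 0.667, "Balanced": 0.58, "Performance": 0.5}

native_pixels = output[0] * output[1]
for mode, scale in modes.items():
    pixels = int(output[0] * scale) * int(output[1] * scale)
    print(f"{mode:12s} {pixels:>9,d} px  ({pixels / native_pixels:.0%} of native shading work)")
[/CODE]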

Yes because the implied claim is that 70% of users are purposefully using it.

Why do you keep splitting hairs and moving goalposts? It's like this feature makes you angry. If you like lower settings or worse frame rates, you can turn it off. Most people clearly would rather have better graphics, but for some reason, this really seems to bother you, and you're wishcasting for people to not use it and NVIDIA to stop developing it, when it's obvious from how AMD is responding that it's a big deal.

How is PhysX doing these days?

Bought by NVIDIA and integrated into CUDA. Hardware-accelerated physics (GPUs are now general purpose many-thread floating-point machines) is now a multi-billion-dollar industry. Yet another one Intel missed out on because they thought AVX-512 would be enough.
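As a trivial illustration of the "many-thread floating-point machine" point, the same vectorized physics step runs on a GPU just by putting the tensors there; this is plain PyTorch, nothing PhysX-specific:

[CODE=python]
# GPU physics as bulk floating-point math: a vectorized Euler step over a
# million particles. Nothing PhysX-specific, just PyTorch tensors on CUDA.
import torch

device = "cuda" if torch.cuda.is_available() else "cpu"
n = 1_000_000

pos = torch.rand(n, 3, device=device)
vel = torch.zeros(n, 3, device=device)
gravity = torch.tensor([0.0, -9.81, 0.0], device=device)
dt = 1.0 / 60.0

for _ in range(60):            # one simulated second at 60 steps/s
    vel += gravity * dt        # every particle updated in parallel
    pos += vel * dt
[/CODE]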
 
That's you. My contention is the segmentation and how it will negatively affect games and support.
The feature has been widely adopted over the last 4 years, since the introduction of DLSS2 made it really viable, and virtually all new games now support it. Moreover, Microsoft has now unified them all behind their DirectSR API.

How have you seen games & support negatively affected since 2021?

That's a nice way of saying it was deprecated because segmentation didn't work.

Yes, the more advanced technology of SIMT displaced the PPU (and multiple other types of hardware acceleration as well). Seems like a great example of better technology winning. What was your point?
 
How have you seen games & support negatively affected since 2021?
Yes. One, developers are now using upscaling and frame generation as a substitute for proper optimization. Two, segmentation means that unless you have the latest cards, that poor optimization hurts you even more, since the latest DLSS versions will require newer hardware. If you honestly support this you may have brain damage.
 