GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The 4090 clearly suppressed 4080 sales because buyers were disgusted by the 4080 being so expensive with similar price/perf to the 4090
4080 sales were really cannibalized on both ends - if your aim is a native/internal resolution of 1440p at ultra/high settings, the 4070 and 4070 Ti were both more than powerful enough to do it. If you wanted native/internal 4K at ultra/high settings, even the 4090 struggled here so most gamers just opted to stick to 1440p. Nvidia tried to position the 4080 as being firmly in the 'for serious gamers/prosumers, not enthusiasts/professionals' segment but gamers didn't bite on the upsell and prosumers just bought a 4060 Ti 16 GB for shit like video editing.

With the mid-gen refresh over, AAA gaming in chaos, and AI hype dying down, I actually feel like the 5090 is going to be a much tougher sell than the 4090 was. And I think the 5080 is also DoA. If Nvidia is smart, they won't repeat the same mistake and make a fuckton of 5080s that have to sit unsold.
 
4080 sales were really cannibalized on both ends - if your aim is a native/internal resolution of 1440p at ultra/high settings, the 4070 and 4070 Ti were both more than powerful enough to do it. If you wanted native/internal 4K at ultra/high settings, even the 4090 struggled here so most gamers just opted to stick to 1440p. Nvidia tried to position the 4080 as being firmly in the 'for serious gamers/prosumers, not enthusiasts/professionals' segment but gamers didn't bite on the upsell and prosumers just bought a 4060 Ti 16 GB for shit like video editing.

With the mid-gen refresh over, AAA gaming in chaos, and AI hype dying down, I actually feel like the 5090 is going to be a much tougher sell than the 4090 was. And I think the 5080 is also DoA. If Nvidia is smart, they won't repeat the same mistake and make a fuckton of 5080s that have to sit unsold.
Either the 5070 Ti or the 5090 likely fits its price point better.
 
With the mid-gen refresh over, AAA gaming in chaos, and AI hype dying down, I actually feel like the 5090 is going to be a much tougher sell than the 4090 was.

I wonder how much hotter they can even go on these GPUs before they just plain can't sell them. The 4090 was melting connectors, and the 5090 is supposed to draw over 25% more power at 575W, up from the 4090's 450W. To put that in perspective, the very hottest server CPUs out right now top out at 500W.
 
not to mention that Intel may have just proven that the better card isn't the most powerful, but a decent one that's inexpensive and NOT deliberately crippled.
 
not to mention that Intel may have just proven that the better card isn't the most powerful, but a decent one that's inexpensive and NOT deliberately crippled.

This is in a gaming market that is happy enough with the 10-year-old, discontinued PS4's graphics that marquee titles are still coming out for it. The 4090 and 5090 feel like products that are increasingly desperate to justify their own existence.
 
not to mention that Intel may have just proven that the better card isn't the most powerful, but a decent one that's inexpensive and NOT deliberately crippled.
The problem is that the B580 is just Intel offloading chips they'd already paid for. I don't think the card makes money, and I don't think it's intended to be a sustainable venture. A 272 square mm die on N5 for half the price of AMD's 346 square mm die on N5 is blatantly Intel selling at cost so they can avoid being sued by their shareholders.

The B770 is unlikely to materialize. Once they run out of B570s and B580s, I don't think they're going to make any more.

EDIT: I was erroneously using the graphics complex die size for AMD. My point still stands - the economics of fabbing the B580 don't make sense for its MSRP.
 
The problem is that the B580 is just Intel offloading chips they'd already paid for. I don't think the card makes money, and I don't think it's intended to be a sustainable venture. A 272 square mm die on N5 for half the price of AMD's 346 square mm die on N5 is blatantly Intel selling at cost so they can avoid being sued by their shareholders.

This shareholder suit rumor seems to have materialized out of nowhere. Companies have no obligation not to cancel products. Much more likely is that they're giving up nearly all their margin to get their market share off the ground. They've invested over a billion in gaming GPUs with fuck all market share to show for it, and if they can sell through all their stock, that might be enough to get them some decent brand presence for the next generation to be commercially successful.
 
I'm really not sure which is going up faster with nVidia's GPUs.

The performance, the power use or the price.

And I'd REALLY like to think that performance would be second on the list at worst.
 
That's probably another reason why handheld PCs were unexpectedly popular: we've reached the point where you can get good enough performance from a handheld for portable gaming.
Both Sony and Microsoft came out with really expensive and powerful consoles that didn't sell, while the Switch sold pretty well. I think both are rumored to be making handheld consoles with a power level similar to their prior generation.
 
Should I buy the Battlemage, or is Intel gonna pull the plug and leave me with an unsupported hunk of shit?

>just buy a 4090RTXGTbrbturboOCdickripper

Everything in this godforsaken wasteland costs 3x what it does in burgerland cuz of taxes (thank you based milei for not lowering them), so buying that shit for me is like buying that shit three times over.
 
Every leak as we inch closer to CES 2025 paints a pretty bleak picture for the GPU industry. I pity the fools that will part with their money building PCs post-2025.
 
This shareholder suit rumor seems to have materialized out of nowhere.
I made it the fuck up, but the card did launch suspiciously in mid-December. I do believe they make nothing on the cards though, and it's not just MLID saying that.
Should I buy the Battlemage, or is Intel gonna pull the plug and leave me with an unsupported hunk of shit?

>just buy a 4090RTXGTbrbturboOCdickripper

Everything in this godforsaken wasteland costs 3x what it does in burgerland cuz of taxes (thank you based milei for not lowering them), so buying that shit for me is like buying that shit three times over.
You have to consider your local situation, and none of us track prices in Argentina. We are seeing late January launches for the RTX 5080 and RX 9070 XT, which won't do much on Nvidia's side but may push down AMD prices (especially the 7800 XT). Then it will be additional months before the cheaper cards come out, like the 5070 Ti and below and the 9060 and below.

In theory, Battlemage could be well supported since it shares the Xe2 architecture with millions of Intel CPUs/APUs, but it's Xe2-HPG vs. Xe2-LPG, so who knows? It's clearly better than Alchemist was at launch.

You also need to consider Resizable Bar (ReBar) support if you're using an older system:

>Just buy a used turd.
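
On the ReBar point above: here's a rough sketch, not from anyone in this thread, of how you could check whether a Linux box even exposes the Resizable BAR capability on its GPU. It assumes lspci (pciutils) is installed and that supported cards report a "Resizable BAR" capability line in the lspci -vv dump; run it as root if the capability list shows up as access denied.

# Minimal sketch: scan `lspci -vv` output for a GPU block that advertises
# the Resizable BAR capability. Assumes the pciutils `lspci` tool is present.
import subprocess

def gpu_has_rebar() -> bool:
    dump = subprocess.run(
        ["lspci", "-vv"], capture_output=True, text=True, check=True
    ).stdout
    # lspci separates devices with blank lines; only look at GPU entries.
    for block in dump.split("\n\n"):
        if "VGA compatible controller" in block or "3D controller" in block:
            if "Resizable BAR" in block:
                return True
    return False

if __name__ == "__main__":
    print("ReBar capability reported" if gpu_has_rebar() else "no ReBar capability reported")

Even if the capability shows up, ReBar still has to be enabled in the motherboard firmware (Above 4G Decoding etc.), so treat this as a capability check, not a guarantee that it's active.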
Every leak as we inch closer to CES 2025 paints a pretty bleak picture for the GPU industry. I pity the fools that will part with their money building PCs post-2025.
Imagine playing video games in 2025 when you should be preparing for the drone wars.

The market sucks but there will be decent new cards between $200 and $400. Maybe sub-$200 in a few months if we see Intel B380, AMD RX 9040, etc. You can even play games with 8 GB of VRAM in the current year.
 
What would you recommend? I was thinking a 3080, but isn't it getting too old at this point? Should I go for a 4070 or 4080 instead?
I barely pay attention to used US prices. You'll have to figure it out yourself or present a list of cards with prices. The 3080 was expensive here when I checked ($400+ on eBay), and it only has 10 GB. I'm sure the 4080 is too expensive.
 
DLSS4 better not be series exclusive. Segmenting every version is just full retard. DLSS3 was already pushing it, but why would any developer fuck around with supporting something that 1% of users will ever touch? And it has no longevity because future cards will just run the games native and probably not even support old versions of DLSS. It's a mess.
 
DLSS4 better not be series exclusive. Segmenting every version is just full retard. DLSS3 was already pushing it, but why would any developer fuck around with supporting something that 1% of users will ever touch? And it has no longevity because future cards will just run the games native and probably not even support old versions of DLSS. It's a mess.
Gaymers will accept anything Nvidia does. Developers can auto-detect and turn on DLSS settings instead of counting on users to touch it. Old GPUs can use FSR, XeSS, or another competing upscaler (for example, Unreal has Temporal Super Resolution), or driver-level upscaling.
 
Should I buy the Battlemage, or is Intel gonna pull the plug and leave me with an unsupported hunk of shit?
Intel has a pretty good track record of not suddenly leaving you high and dry on driver support.

DLSS4 better not be series exclusive. Segmenting every version is just full retard. DLSS3 was already pushing it, but why would any developer fuck around with supporting something that 1% of users will ever touch? And it has no longevity because future cards will just run the games native and probably not even support old versions of DLSS. It's a mess.
Something like 75% of gamers use DLSS according to the metrics Nvidia tracks. DLSS3 was a superset of DLSS2, meaning any game that implemented DLSS3 automatically had DLSS2 support, and if DLSS4 is the same way, there should be no compatibility issues.

Generally NVIDIA is much better at software than AMD, which is why they're winning.
 
Something like 75% of gamers use DLSS according to the metrics Nvidia tracks.
I don't even need to look at the data to know that's bullshit or misleading. 70+% of Steam users can't even use DLSS3, which is further proof that segmenting is retarded.
 
I don't even need to look at the data to know that's bullshit or misleading. 70+% of Steam users can't even use DLSS3, which is further proof that segmenting is retarded.
I didn't say DLSS 3. I believe the full stat is that 70% of all gamers who can use it turn it on when it's available, and that covers the vast majority of GPUs sold in the last few years. It's a very popular and useful feature.
 
I didn't say DLSS 3. I believe the full stat is that 70% of all gamers who can use it turn it on when it's available, and that covers the vast majority of GPUs sold in the last few years. It's a very popular and useful feature.
The point is that a large percentage of the playerbase can't even touch DLSS3 and they will be segmenting even more with DLSS4. Developers are going to care less about supporting it when 99% of people can't even use it.

And how do they determine that 70% of people use DLSS? What telemetry are they using? If I turn it on for a short period (before turning it off), does that count? Some games even have it on by default.
It's bullshit.
 