GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

With how disappointing the AMD GPUs are according to kepler's leaks alongside Nvidia being retards as usual, we're going to have another skip generation and a major stagnation issue in the long run.
 
With how disappointing the AMD GPUs are according to kepler's leaks alongside Nvidia being retards as usual, we're going to have another skip generation and a major stagnation issue in the long run.
RDNA4 won't be that disappointing if they improve price/performance.

Whatever the top RDNA4/Navi 48 card is (e.g. 8800 XT) should be competitive with at least the 5070, probably the 5070 Ti, in the $500-600 range. No significant change to raster vs. the 7900 XTX, but better raytracing and efficiency at a lower price.

Then at the bottom, with the small Navi 44 die, the 8600 XT is hopefully a bandwidth-starved 7700 XT (or a bit worse) with 16 GB. And I hope to see actual low-end cards based on that die to replace bad/overpriced cards like the 6500 XT, RTX 3050 6 GB, etc.
 
RDNA4 won't be that disappointing if they improve price/performance.

Whatever the top RDNA4/Navi 48 card is (e.g. 8800 XT) should be competitive with at least the 5070, probably the 5070 Ti, in the $500-600 range. No significant change to raster vs. the 7900 XTX, but better raytracing and efficiency at a lower price.

Then at the bottom, with the small Navi 44 die, the 8600 XT is hopefully a bandwidth-starved 7700 XT (or a bit worse) with 16 GB. And I hope to see actual low-end cards based on that die to replace bad/overpriced cards like the 6500 XT, RTX 3050 6 GB, etc.
I was expecting that Intel's B580 release might change expectations for low-to-midrange GPUs from the other two, but given that the 8600 and 5060 have 8 GB of VRAM and prices above $300, I have extremely low expectations for them. Can't they see that the majority of their customers are starved gamers who just want a GPU in the $100-$300 range, or are they just completely fucking clueless?
 
  • Like
Reactions: Bananadana
but given that the 8600 and 5060 have 8 GB of VRAM and prices above $300, I have extremely low expectations for them.
I don't know what you read from "Kepler", but nobody knows pricing, because it can change right up until hours before the announcement. I also expect AMD to follow up the 7600 XT 16 GB with an 8600 XT 16 GB. Same 128-bit bus and GDDR6, though.
 
I don't know what you read from "Kepler", but nobody knows pricing, because it can change right up until hours before the announcement. I also expect AMD to follow up the 7600 XT 16 GB with an 8600 XT 16 GB. Same 128-bit bus and GDDR6, though.
I was referring to Kepler_L2. Also, I've noticed: why 128-bit? Why not, say, 192-bit? I don't see GPUs with 352-bit buses like the 2080 Ti anymore.
 
I was referring to Kepler_L2. Also, I've noticed: why 128-bit? Why not, say, 192-bit? I don't see GPUs with 352-bit buses like the 2080 Ti anymore.
I know who Kepler_L2 is, but you should never 100% believe any leaker. 128-bit is cheaper and uses less power. The memory controller is on the die taking up silicon area (or a separate MCD for Navi 31/32). We also have large enough VRAM modules now to give 8 or 16 GB to 128-bit cards, which is plenty. Technically, a 128-bit card could have 24 GB GDDR7 soon with the 3 GB modules, although that would be extreme. I think we could see a 128-bit RTX 5060 Sooper 12 GB, once 3 GB GDDR7 is available later in 2025.

With RDNA4, they've boiled it down to just two dies: Navi 48 (256-bit) and Navi 44 (128-bit). They aren't giving up per se, but it's a scaled-down generation. 128-bit is common at the low end, and 256-bit is good enough for mid/high. AMD's Infinity Cache (or Nvidia's large L2 cache) somewhat alleviates the need for a wider memory bus.

2080 Ti was 352-bit because they cut it down from the 384-bit die used in the Titan RTX. That also gave it the weird capacity of 11 GB instead of 12 GB. Having weird capacity amounts could affect the marketability of the card, although we may see that again soon with 192-bit 18 GB, etc.
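If you want the capacity math spelled out (ballpark, my own arithmetic, not from any leak): every 32 bits of bus width is one memory channel with one GDDR module on it, or two in a clamshell layout, so capacity is just channels times module density. A quick Python sketch:

def vram_gb(bus_width_bits, module_gb, clamshell=False):
    # One GDDR module per 32-bit channel; clamshell mounts a second module
    # on the back of the board, doubling capacity at the same bus width.
    channels = bus_width_bits // 32
    return channels * module_gb * (2 if clamshell else 1)

print(vram_gb(128, 2))                  # 8 GB  - typical 128-bit GDDR6 card
print(vram_gb(128, 2, clamshell=True))  # 16 GB - 7600 XT style
print(vram_gb(128, 3))                  # 12 GB - a 128-bit card on 3 GB GDDR7 modules
print(vram_gb(128, 3, clamshell=True))  # 24 GB - the extreme case mentioned above
print(vram_gb(192, 3))                  # 18 GB - the "weird" 192-bit capacity
print(vram_gb(352, 1))                  # 11 GB - 2080 Ti, 384-bit die cut to 352-bit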

The memory controller die (MCD) approach of RDNA3 made it easier to make different bus widths, because you just add/remove an MCD, each of them controlling 64 bits. No binning required and the cost goes down. Therefore there was:
7900 XTX = 384-bit with 6 MCDs
7900 XT = 320-bit with 5 MCDs
7800 XT = 256-bit with 4 MCDs
7700 XT = 192-bit with 3 MCDs

RDNA4 is only using monolithic dies this time. They could partially disable the die to get down to 192-bit and 96-bit, we'll have to see. But 256-bit and 128-bit are the starting points.
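Put another way (just restating the above as arithmetic, nothing new): with RDNA3 the bus width is simply 64 bits per MCD, while on a monolithic RDNA4 die any narrower width would have to come from fusing off controllers on the die itself.

# RDNA3: each MCD carries a 64-bit GDDR6 controller, so width = MCDs * 64.
rdna3_mcds = {"7900 XTX": 6, "7900 XT": 5, "7800 XT": 4, "7700 XT": 3}
for card, mcds in rdna3_mcds.items():
    print(f"{card}: {mcds} MCDs -> {mcds * 64}-bit")

# RDNA4 is monolithic: Navi 48 starts at 256-bit, Navi 44 at 128-bit.
# The cut-down widths below (192-bit / 96-bit) are speculative salvage SKUs.
for die, full in {"Navi 48": 256, "Navi 44": 128}.items():
    print(f"{die}: {full}-bit full, {full * 3 // 4}-bit if a quarter of the controllers are disabled")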
 
With how disappointing the AMD GPUs are according to kepler's leaks alongside Nvidia being retards as usual, we're going to have another skip generation and a major stagnation issue in the long run.
I wonder how much market share Intel is going to snap up if the other two play silly games like that? I don't want to pay a scalper for a B580.
 
  • Like
Reactions: Bananadana
I wonder how much market share Intel is going to snap up if the other two play silly games like that? I don't want to pay a scalper for a B580.
Not much. Even in datacenter, where you'd think buyers are less retarded than consoomers who see "NVIDIA 4090 best GPU ever" and then go buy a 3050 and wonder why it sucks, brand identity wins. By price and performance, EPYC should have been beating the shit out of Xeon since 2019, but it's only now that AMD is reaching a majority share of the server CPU market.

Same is happening in datacenter GPUs. Instinct gives you by far the best bang for the buck, but people still demand NVIDIA's overpriced meltboxes "because AMD doesn't have CUDA" and other such retarded reasoning.
 
  • Informative
Reactions: Kane Lives
Not much. Even in datacenter, where you'd think buyers are less retarded than consoomers who see "NVIDIA 4090 best GPU ever" and then go buy a 3050 and wonder why it sucks, brand identity wins. By price and performance, EPYC should have been beating the shit out of Xeon since 2019, but it's only now that AMD is reaching a majority share of the server CPU market.

Same is happening in datacenter GPUs. Instinct gives you by far the best bang for the buck, but people still demand NVIDIA's overpriced meltboxes "because AMD doesn't have CUDA" and other such retarded reasoning.
Would you say that AMD's software has caught up to Intel's and Nvidia's, then?
 
Would you say that AMD's software has caught up to Intel's and Nvidia's, then?

No, but it's at the level where it's no longer a significant obstacle to software getting released on their platform. HIPify works well enough.
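For anyone curious what "HIPify" actually is: the hipify tools are, to a first approximation, a big mechanical rename of CUDA runtime/library calls to their HIP equivalents, and the kernels themselves mostly carry over untouched. A toy Python illustration of the idea (not the real tool, and only a handful of the renames it knows about):

# Toy sketch of the hipify idea: CUDA -> HIP porting is largely a mechanical
# rename of API calls; kernel code and launch syntax mostly carry over as-is.
CUDA_TO_HIP = {
    "cuda_runtime.h": "hip/hip_runtime.h",
    "cudaMemcpyHostToDevice": "hipMemcpyHostToDevice",
    "cudaMemcpyDeviceToHost": "hipMemcpyDeviceToHost",
    "cudaDeviceSynchronize": "hipDeviceSynchronize",
    "cudaMalloc": "hipMalloc",
    "cudaMemcpy": "hipMemcpy",
    "cudaFree": "hipFree",
}

def toy_hipify(source: str) -> str:
    # Replace longer names first so cudaMemcpyHostToDevice isn't mangled by cudaMemcpy.
    for name in sorted(CUDA_TO_HIP, key=len, reverse=True):
        source = source.replace(name, CUDA_TO_HIP[name])
    return source

cuda_snippet = """#include <cuda_runtime.h>
cudaMalloc(&d_buf, n * sizeof(float));
cudaMemcpy(d_buf, h_buf, n * sizeof(float), cudaMemcpyHostToDevice);
scale<<<blocks, threads>>>(d_buf, n);
cudaDeviceSynchronize();
cudaFree(d_buf);"""

print(toy_hipify(cuda_snippet))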

What language software is programmed in is invisible to the user, of course. However, NVIDIA has done a very good job of socializing the CUDA brand to users rather than just developers. I am especially thinking of someone I know, who hasn't written any code since F77 in grad school, who said, "Well, AMD GPUs look good on paper, but they can't run fast if they don't have CUDA." It's a surprisingly common sentiment.
 
I am just really curious how low-end sales will look 6-12 months from now, once Nvidia's and AMD's low-end cards (God forbid they actually charge $300+ for 8 GB) have been revealed and people have had some time to buy the B580.
With how positive the overall reception has been, I would be surprised if Intel doesn't outperform another round of "planned obsolescence special" 8 GB, $300+ entry-level cards.

You know you REALLY screwed up when they ask you to return your golden parachute to the company like it's a stolen work laptop from the guys who lost their jobs because of you.

That being said, the salary alone is a pittance compared to the $140,433,000 in stock awards (which I think they also want back) that he got in 2021, lol.
To quote that part (page 37) of this lolsuit PDF:
" 177. Plaintiff, as a shareholder and representative of Intel, seeks restitution from the defendants in the Securities Class Action and seeks an order from this Court disgorging all profits, benefits, and other compensation obtained by the defendants in the Securities Class Action due to their misconduct and breach of their fiduciary and contractual duties. "
 
I am just really curious how low-end sales will look 6-12 months from now, once Nvidia's and AMD's low-end cards (God forbid they actually charge $300+ for 8 GB) have been revealed and people have had some time to buy the B580.
With how positive the overall reception has been, I would be surprised if Intel doesn't outperform another round of "planned obsolescence special" 8 GB, $300+ entry-level cards.
There is not a lot of B580 supply, and probably never will be. Intel's Battlemage is not a savior. It is a weak card that they are probably selling at a loss because it uses too much silicon to reach the 7600 XT / 6700 XT / 4060 level. Releasing it in December 2024 allowed them to avoid it becoming part of a shareholder lawsuit.

AMD can easily fumble the low-end, but if they sell an 8600 XT 16 GB at $300, it could have similar price/perf to a $250 B580. Depends on what the actual price point is and how close to the 7700 XT the card can get.

The B580 is around 7600 XT performance, so if the 7600 XT drops in price to match it, you get a card with basically the same raster, worse raytracing, but 4 GB more VRAM.
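Rough value math on that (the relative performance numbers here are my own ballpark guesses, normalized to the 7600 XT, not benchmarks):

# Back-of-envelope price/perf, 7600 XT raster = 1.00. Figures are illustrative guesses.
cards = {
    "B580 ($250)": (250, 1.00),                   # roughly 7600 XT raster
    "8600 XT 16 GB ($300, guess)": (300, 1.20),   # assumed to land near the 7700 XT
    "7600 XT (if cut to $250)": (250, 1.00),
}
base_price, base_perf = cards["B580 ($250)"]
base_value = base_perf / base_price
for name, (price, perf) in cards.items():
    print(f"{name}: {perf / price / base_value:.2f}x the B580's perf per dollar")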

Based on this Zotac leak listing 5090 through 5070, it might be a while before we see 5060 Ti and 5060. Those were already expected to come out later, like previous staggered launches.
 
  • Informative
  • Thunk-Provoking
Reactions: Bananadana and Vecr
Is the Nvidia 3060 12 GB or the Intel B580 the better GPU? Or does that depend on the sale price?
 
AMD Ryzen 7 7800X3D drops to just $359 at Amazon — Secure the fastest Zen 4 gaming chip while supplies last

7800X3D at $359, Ships from zhangqidong art, Sold by zhangqidong art

NVIDIA GeForce RTX 5090 PCB leak reveals massive GB202 GPU package
 