GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Hardware Unboxed claims that the 5080 is really what the 5070 should have been based on specs.
The 80-class card is usually a cut-down 102 die, but this time it's the full-fat 103 die, so it really does look that way.
The 5080 does seem to have a good chunk of OC headroom, however, which is quite strange for a recent card. Perhaps they're saving room for a 24 GB 5080 Super/Ti on the same die?
 
Hardware Unboxed claims that the 5080 is really what the 5070 should have been based on specs.

Not surprised. Nvidia tried that last launch with the original 12GB 4080.
I didn't watch the video, but I guess they're complaining that the 5080 has less than half the CUDA cores of the 5090.

                       x070    x070 Ti    x080    Top Die Size
% cores of 3090 Ti     54.8%   57.1%      81%     628.4 mm^2
% cores of 4090        35.9%   46.9%      59.4%   608.5 mm^2
% cores of 5090        28.2%   41.2%      49.4%   750 mm^2

The thing is that the 5080 stayed at almost exactly the same die size as the 4080 (378 mm^2) on a virtually identical node (4NP vs. 4N; the name is backwards from TSMC's usual N4 because it's a "custom" node). Meanwhile the 5090 die grew by 23% (750 mm^2 vs. 608.5 mm^2 for the 4090), getting very close to the reticle limit.

So while the 5080 is only about +10% performance, the 5090's +25-35% is the result of making the die as fat as technically possible, which also raised the power consumption. Memory bandwidth is overabundant at +78% over the 4090.

By the numbers, the damage was already done by the 40 series. If you shrunk the 5090 back to the 4090's die size (608.5 / 750 = 0.811, then 0.811 * 21760 cores), the 5080 would have about 61% as many cores as that hypothetical 5090.
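A quick sanity check of that arithmetic, assuming the commonly listed shader counts of 21,760 for the 5090 and 10,752 for the 5080 (those counts aren't from the post above, just public spec listings):

```python
# Back-of-the-envelope check of the die-scaling argument above.
# Core counts are publicly listed shader counts, not measured values.
die_4090 = 608.5   # mm^2
die_5090 = 750.0   # mm^2
cores_5090 = 21760
cores_5080 = 10752

scale = die_4090 / die_5090                 # ~0.811
shrunk_5090_cores = cores_5090 * scale      # ~17,650 cores at a 4090-sized die
print(cores_5080 / shrunk_5090_cores)       # ~0.61 -> the 5080 is ~61% of that part
```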
 
It's Nvidia's fault for paper launching the card. They could have built up supply and launched later.

TSMC makes plenty of chips for Nvidia. Nvidia would rather sell a B200 AI accelerator for $40,000 than two RTX 5090s for $4,000.
B200s are back ordered for 18 months. At this point NVIDIA would make more money by exiting gaming, but it would be short sighted.
 
B200s are back ordered for 18 months. At this point NVIDIA would make more money by exiting gaming, but it would be short sighted.
They have 90% market share without even trying. Why would they exit what is effectively a free-money market for them?
 
They have 90% market share without even trying. Why would they exit what is effectively a free-money market for them?

Which will make you more money:

Selling 1000 datacenter GPUs at $40,000 each and 1000 gaming GPUs at $1,000 each

or

Selling 2000 datacenter GPUs at $40,000 each and zero gaming GPUs.

It would be unwise to put all their eggs in the AI basket for other reasons, but not because they would make less quarterly. Fortunately for NVIDIA, it's being run by its founder, not some asshole trying to max his bonus before triggering his golden parachute.
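For what it's worth, the back-of-the-envelope math on those two scenarios (using the round numbers above, not real average selling prices):

```python
# Revenue from the two hypothetical product mixes in the post above.
mixed   = 1000 * 40_000 + 1000 * 1_000   # datacenter + gaming = $41,000,000
dc_only = 2000 * 40_000                  # datacenter only     = $80,000,000
print(dc_only / mixed)                   # ~1.95x -> the all-datacenter mix roughly doubles revenue
```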
 
Hardware Unboxed claims that the 5080 is really what the 5070 should have been based on specs.

Not surprised. Nvidia tried that last launch with the original 12GB 4080.
Hardware Unboxed claims a lot of things. He's currently seething because his audience called him out for testing the 5090 at 1080p.

Trying to claim that a certain GPU 'should' belong to a certain class is goofy autism logic. The 4090 was a massive die that completely shifted the scale of where everything sat, and even putting that aside, we're talking about ludicrously expensive process nodes.

"The 5080 is actually what we would have called the 5070, therefore I feel like I'm getting a bad deal" completely ignores the fact that there are costs associated with fabbing this shit and opportunity costs associated with not using that fab capacity to just shit out more AI accelerators.
 
Hardware Unboxed claims a lot of things. He's currently seething because his audience called him out for testing the 5090 at 1080p.

Trying to claim that a certain GPU 'should' belong to a certain class is goofy autism logic. The 4090 was a massive die that completely shifted the scale of where everything sat, and even putting that aside, we're talking about ludicrously expensive process nodes.

"The 5080 is actually what we would have called the 5070, therefore I feel like I'm getting a bad deal" completely ignores the fact that there are costs associated with fabbing this shit and opportunity costs associated with not using that fab capacity to just shit out more AI accelerators.
Cold, hard truth, but it needs to happen: make 1440p the new standard.
 
Hardware Unboxed claims a lot of things. He's currently seething because his audience called him out for testing the 5090 at 1080p.

Trying to claim that a certain GPU 'should' belong to a certain class is goofy autism logic. The 4090 was a massive die that completely shifted the scale of where everything sat, and even putting that aside, we're talking about ludicrously expensive process nodes.

"The 5080 is actually what we would have called the 5070, therefore I feel like I'm getting a bad deal" completely ignores the fact that there are costs associated with fabbing this shit and opportunity costs associated with not using that fab capacity to just shit out more AI accelerators.

The 50 series is just the best GPU NVIDIA can produce on TSMC's 5nm-class process. There is zero reason to continue producing 40 series GPUs, which is why they've been discontinued. Yes, if you mindlessly buy a new GPU every year, you're wasting your money, but that's no more NVIDIA's fault than it is any other chip maker's fault if you sidegrade every time a new chip comes out on an old node.
 
Intel is canceling Falcon Shores.

Ponte Vecchio was a day late and a dollar short and had a lot of My First GPU design flaws (engineering is hard, and your first design is always shit). One engineer I know described it as, quote, "a massive piece of crap" that is simply unable to saturate its bandwidth. Gaudi 2 & 3 were both held back by Intel's node disadvantage (strangely, the DEI investments did not make up for it), software, and who knows what other issues--I don't have one and haven't talked to anyone who does.
 
On the other hand... what if their shitty products are actually because they aren't hiring enough niggers? :thinking:

They might just need more black girl magic to keep up with AMD and Nvidia.
The discussion many people in silicon want to have, but know they will be immediately fired and blacklisted for having, is how it is even possible for nigger-free TSMC, on its nigger-free island of Taiwan, to rocket past Intel, which has more niggers than any other chip company.
 
The discussion many people in silicon want to have, but know they will be immediately fired and blacklisted for having, is how it is even possible for nigger-free TSMC, on its nigger-free island of Taiwan, to rocket past Intel, which has more niggers than any other chip company.
God bless the slant-eyed bugmen for the X3D chips.
 
Hardware Unboxed claims a lot of things. He's currently seething because his audience called him out for testing the 5090 at 1080p.

Trying to claim that a certain GPU 'should' belong to a certain class is goofy autism logic. The 4090 was a massive die that completely shifted the scale of where everything sat, and even putting that aside, we're talking about ludicrously expensive process nodes.

"The 5080 is actually what we would have called the 5070, therefore I feel like I'm getting a bad deal" completely ignores the fact that there are costs associated with fabbing this shit and opportunity costs associated with not using that fab capacity to just shit out more AI accelerators.
Why the fuck are you testing a 4090/5090 at anything other than 4k?
 
What exactly is causing the Intel Arc B580 to be bottlenecked on some CPUs? Would a Xeon E5-2696 v3 bottleneck the GPU?
 