GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

@The Ugly One if you have first hand information about scummy Nvidia, you can actually help Steve out.


Unfortunately, this particular thing is so specific, and the circle of who knows what went down is sufficiently small, that I can't risk being any more specific than I've been here. Maybe this sounds like My Dad Works At Nintendo, so all I can do is say it isn't.
 
I don't think NVidia will ever pull out entirely, but at worst the GeForce brand would just be reduced to launching a token number of cards with minimal performance increases at a premium price. Kinda like how things are now, but somehow even worse.
Listen to what they've said about the future of AI and games. It would be very stupid for them to abandon the consumer segment and gift the market of gamers and other plebs to their competitors.
 
Listen to what they've said about the future of AI and games. It would be very stupid for them to abandon the consumer segment and gift the market of gamers and other plebs to their competitors.
I think it’s more of a risk analysis thing that they haven’t completely dropped gaming as part of their business. They don’t really know how long this AI craze will go or if it’s just a temporary fad that will see demand for all this machine learning tech drop over time.

They built up their dominance in the gaming sector, so to just completely abandon it would be a waste and risky if the AI bubble does indeed burst.

That’s the only reason I think they haven’t allocated 100% of their silicon to datacenter.
 
I think it’s more of a risk analysis thing that they haven’t completely dropped gaming as part of their business. They don’t really know how long this AI craze will go or if it’s just a temporary fad that will see demand for all this machine learning tech drop over time.

They built up their dominance in the gaming sector, so to just completely abandon it would be a waste and risky if the AI bubble does indeed burst.

That’s the only reason I think they haven’t allocated 100% of their silicon to datacenter.
I think it's part that and part not wanting to make the mistake of giving up their "terminals" business and going all in on the lucrative "mainframes", because 10-20 years from now that might turn out to have been a big mistake. AI is here to stay in whatever forms it takes.
 
They got 'em.


[attached benchmark screenshot]

22% faster than the 4060 at 1080p, in line with the 25% core increase. Around the same perf as 3070 (8GB) and 4060 Ti 8GB, and a little faster than the Intel Arc B580 (12 GB) at 1440p. 1% lows over 60 FPS in this set of games. Obviously falls flat when it runs out of VRAM, and some games will silently degrade textures.

I grabbed the screenshot from comments of this story:
NVIDIA GeForce RTX 5080 SUPER Specs Leak: 24 GB VRAM At 32 Gbps & 10752 Cores at 400W+
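The core-scaling claim above is easy to sanity-check. A quick sketch (core counts here are the commonly cited figures for these SKUs, not something from the screenshot, so treat them as assumptions):

```python
# Sanity check: does the observed 1080p uplift track the CUDA core increase?
# Core counts are the commonly cited figures for these SKUs (assumption).
cores_4060 = 3072
cores_5060 = 3840

core_uplift = cores_5060 / cores_4060 - 1           # 25% more cores
observed_uplift = 0.22                              # from the benchmark screenshot
scaling_efficiency = observed_uplift / core_uplift  # fraction of linear scaling realized

print(f"core increase: {core_uplift:.0%}, realized: {scaling_efficiency:.0%} of linear")
```

Getting ~88% of linear scaling out of extra cores is about what you'd expect when bandwidth and clocks don't grow in proportion.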
 
Around the same perf as 3070 (8GB) and 4060 Ti 8GB
It used to be the case that every generation the 60 series would be around the same performance as the last gen 70 series. Now it's every 2 generations huh?

The article mentions an expected gaming improvement of around 5%. That puts it still behind the 4090, correct? I believe the 5080 was on average around 10-15% slower than the 4090. I'm curious at what price Nvidia will launch that card. It's an 80 class that will likely be slightly slower than the last gen 90 series. Are they seriously going to try and price an 80 class card at over $1,500?
 
It used to be the case that every generation the 60 series would be around the same performance as the last gen 70 series. Now it's every 2 generations huh?
The last time NVIDIA released two successive generations on the same process node was the 700 series and the 900 series, and in that case, the GTX 960 was significantly less powerful than the GTX 770, with about half the memory bandwidth, 3/4 the compute, and half the texture fill rate.
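Those ratios can be checked against the published reference-card specs. The numbers below are the commonly listed figures (base clocks, boost ignored), so they're approximate:

```python
# Reference-card specs as commonly listed (assumed; boost clocks ignored).
gtx770 = {"bandwidth_gbs": 224.0, "fp32_tflops": 3.2, "texture_fill_gt": 134.0}
gtx960 = {"bandwidth_gbs": 112.0, "fp32_tflops": 2.3, "texture_fill_gt": 72.0}

# Compare the GTX 960 to the GTX 770 metric by metric.
for key in gtx770:
    ratio = gtx960[key] / gtx770[key]
    print(f"{key}: GTX 960 is {ratio:.0%} of GTX 770")
```

That works out to roughly 50% of the bandwidth, ~72% of the compute, and ~54% of the texture fill, matching the "half / three-quarters / half" characterization above.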

 
The article mentions an expected gaming improvement of around 5%. That puts it still behind the 4090 correct? I believe the 5080 was on average around 10-15% slower than the 4090. I'm curious what price Nvidia will launch that card. It's an 80 class that will likely be slightly slower than the last gen 90 series. Are they seriously going to try and price an 80 class card at over $1,500?
I forgot a lot. Glancing at Tom's, 4090 was 20% ahead of 5080 in 4K raster back in January. So overclocking and adding some memory will not do much to bridge that gap. Months before launch, I theorized that the 5080 was intentionally nerfed (half the cores and bus width of 5090, little change from 4080/Super) to keep it sellable in China, but then the 5090D was conceived... only to face order cancellations and possible additional nerfing recently:

Nvidia reportedly halts RTX 5090D deliveries in China — undelivered orders canceled, GPU ban speculated
NVIDIA RTX 5090D Rumored To Get Nerfed To Just 14,080 CUDA Cores, RTX 5080 Ti/Super Expected To Release At The End Of 2025
MANLI claims GeForce RTX 5090D will be downgraded to 24GB memory

What a mess. However, there could be other reasons why a 5080 Super may be a better buy than the 4090. Nvidia advertises 1801 TOPS for the 5080, and 1321 TOPS for the 4090. I don't remember if they are trying to compare INT4 to INT8 or something lame. Blackwell should be about 10% more power efficient in gaming than 4080S/4090 out of the box, maybe better with tuning. It has a newer generation of NVENC/NVDEC, and doubles the NVDEC decode capability.
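If the suspicion above is right and the two advertised figures are quoted at different precisions, they aren't comparable as-is: quoted TOPS roughly double each time the precision halves. A back-of-envelope normalization (whether the 1801 vs 1321 figures actually differ in precision is pure assumption):

```python
# Hypothetical normalization: quoted TOPS roughly double each time precision halves.
# Whether Nvidia's advertised 1801 vs 1321 figures differ in precision is an assumption.
def normalize_tops(quoted_tops: float, quoted_bits: int, target_bits: int) -> float:
    """Scale a quoted TOPS figure to a common precision."""
    return quoted_tops * quoted_bits / target_bits

rtx5080_quoted = 1801.0  # advertised; assumed INT4 for this sketch
rtx4090_quoted = 1321.0  # advertised; assumed INT8 for this sketch

# If the assumption holds, the 5080's INT8-equivalent figure would be ~900 TOPS,
# i.e. well below the 4090 rather than ahead of it.
print(normalize_tops(rtx5080_quoted, quoted_bits=4, target_bits=8))
```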

The 5080 is only $999, remember? But it's not exactly rotting on store shelves, although that could be an effect of the low supply. Maybe a $1200 MSRP for the 5080 Super instead of dropping it in at the same price this time.

AMD Radeon RX 9060 XT features 32 RDNA4 CUs, 8/16GB GDDR6 memory and PCIe 5.0×16
A special surprise is the use of a full PCIe 5.0 x16 interface. This is a bit unexpected for the Navi 44 GPU, which is smaller than the Navi 48 used in the RX 9070 series and was initially expected to have only 8 lanes, similar to NVIDIA’s RTX 5060 and 5060 Ti. This wider interface could give AMD an edge, especially when paired with PCIe 4.0 systems.

The total board power is rated at 150W for the 8GB version and 182W for the 16GB version, with AMD not listing any performance or clock differences between them.
Spec leak before Computex presentation in a few hours. Not much new here but PCIe x16 is nice to have on the lower die. You could probably get away with sticking it in a PCIe 3.0 system. Will certainly be cut to x8 if they make a 9050/9040.
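The "fine on PCIe 3.0" point comes down to link bandwidth: per-direction throughput doubles each generation, so gen 3 at x16 matches gen 4 at x8. A rough sketch using nominal ~1 GB/s per lane per direction for PCIe 3.0 (real throughput is a bit lower after protocol overhead):

```python
# Nominal per-lane, per-direction bandwidth in GB/s (approximate; ignores
# encoding/protocol overhead). Doubles each PCIe generation.
GBS_PER_LANE = {3: 1.0, 4: 2.0, 5: 4.0}

def link_bandwidth(gen: int, lanes: int) -> float:
    return GBS_PER_LANE[gen] * lanes

# A 9060 XT at x16 in a PCIe 3.0 board still gets ~16 GB/s each way,
# the same as an x8 card (like the 5060 Ti) in a PCIe 4.0 slot.
print(link_bandwidth(3, 16))  # 16.0
print(link_bandwidth(4, 8))   # 16.0
print(link_bandwidth(5, 16))  # 64.0
```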

It seems likely that the 8 GB model will be faster than the 5060. If it lands at the same MSRP ($300) or lower, it could be interesting for those who don't care about the VRAM.
 
What a mess. However, there could be other reasons why a 5080 Super may be a better buy than the 4090
The only advantages it really has are that you can theoretically buy one brand new, the GDDR7 memory, and the questionable value of MFG.

4090s on eBay are still selling for more money than they were brand new. Even if Nvidia made the 5080 Super $1200 or $1400, it would be cheaper to buy that than a 4090.

Still, short of cutting a GB202 die in half for a 12288 CUDA core die there isn't a whole lot they can do to make any 5080 variant faster than a 4090 in a 1:1 comparison.

It would be very stupid for them to abandon the consumer segment
Good thing I didn't say they would. But AMD still hasn't been able to really pressure them, so they feel like they can just rest on their laurels and not lose market share. Who knows, maybe after this whole 50-series disaster and AMD celebrating their 9070 sales we could see a strong comeback in the 60-series.
 
Who knows, maybe after this whole 50-series disaster and AMD celebrating their 9070 sales we could see a strong comeback in the 60-series.
I don't think AMD can get a clean kill against the 5060 Ti 16 GB, but they can easily undercut the 5060 Ti 8 GB and outperform the 5060 8 GB with their 8 GB model. (Whatever excitement they generate has to be accompanied by a good supply to change anything, obviously.)

5060 Ti 8 GB is still selling for $420 and up on Newegg, :wow:. That's probably what the 9060 XT 16 GB price will rise to after they give it a $350-360 MSRP. No pricing leak yet, but if they announce it tonight we could know within the next 3 hours.
 
Unfortunately, this particular thing is so specific, and the circle of who knows what went down is sufficiently small, that I can't risk being any more specific than I've been here. Maybe this sounds like My Dad Works At Nintendo, so all I can do is say it isn't.
Can you at least point in the direction of things for which there are potential anonymous sources?
 
Confirmed $350 MSRP before AMD livestream utters it:
16GB and FSR4 for $350 is great value. Most models are going to be $50-100 more than that. If there are supply issues they’ll also be scalped to hell and back. So hopefully AMD at least has decent supply ready to ship.
 
Confirmed $350 MSRP before AMD livestream utters it:


AMD Provides Initial Details On The Radeon RX 9060 XT
AMD Ryzen Threadripper 9000 Series Launching In July For Linux Workstations
AMD Announces The Radeon AI PRO R9700 Graphics Coming In July - 32 GB

FSR "Redstone" coming to RDNA4 users in the second half of the year. Looks like it's for better path tracing and frame generation.
I'm excited to see how it performs. It has exactly half the cores of a 9070XT, but no idea what that means for guesstimating the performance.
 
I really don’t get why they even make the 8GB variants. They’re great if you only play Minecraft, Valorant, and LoL I guess.
 