GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

It sounds like that card (the non-XT) will be 70-80% faster than a 6700 XT at about the same power consumption (220-230W), with 16 GB and probably no less than $500 at this point.

AMD Radeon RX 9070 series gaming performance leaked: RX 9070 XT is 42% faster on average than 7900 GRE at 4K

AMD preparing Fluid Motion Frames 2.1 (AFMF)
Combine this with out of the box Linux support, just don't fuck up the price AMD, please?

EDIT: And fix the idle power consumption.
 
How have the prices of used 3090s managed to increase by $200 on average? I swiped one for $800 a few months ago. The only exception is the two slot, but that one runs at MSRP even used.
 
Tough shit, the current 6750 XT/6800 XT stock is way too expensive.
Me: A 3060 Ti is just as good
Me after looking at current prices: Not for $700 it's fucking not

24 gigs of VRAM make these cards useful for AI more so than gaming. Plus there's a GPU panic right now.

The federal government also printed $1.8 trillion in new dollars last year.
 
  • Lunacy
Reactions: Brain Problems
Up to an 18-month wait on those bad boys. But maybe the AI bubble will pop first. I keep hearing in places that matter, "Okay, everyone's spent billions on AI infrastructure. Now we just need to find a way to show them how to make money with it." This is bubble behavior. Everyone's apparently been buying GPUs and software licenses out of pure FOMO, with no idea how any of this stuff will actually benefit their business.

I know of one company that bought a DGX and is paying for ChatGPT to generate marketing material, and it's costing them even more money, since the output is such crap that they have to have professionals review and edit everything anyway.
Even if the AI bubble bursts, at best you are getting a bunch of two-slot enterprise cards with blower-style coolers: 3090s optimized for AI with a 300W TDP instead of 350W, priced at around $1.3k. Not suitable for gaming, but you can stick 3 in a 2-socket Broadwell white box and run with 72GB of VRAM doing god knows what while burning your house down.

Would cum buckets if a bunch of A40s flooded the market for a reasonable price. No fans though, so I would need to do some engineering.

Consumer AI bros aren't selling their stock of 12GB+ VRAM cards because it's like gaming to them.
 
Even if the AI bubble bursts, at best you are getting a bunch of 2 Slot enterprise cards with blower style coolers.
If the AI bubble pops, not only will there be tons of A100s, H100s, and H200s on the secondary market (probably useless for gaming, but still great AI/HPC cards), but the demand for cutting-edge silicon for AI hardware will collapse. Since the demand for gaming GPUs is almost completely unrelated to the demand for enterprise AI hardware, this would mean lots of slack capacity at the fabs for NVIDIA to actually start meeting demand for consumer products.
 
  • Optimistic
Reactions: Allakazam223
How have the prices of used 3090s managed to increase by $200 on average? I swiped one for $800 a few months ago. The only exception is the two slot, but that one runs at MSRP even used.
I got a used 3090 Ti last year for $700. Maybe I should sell it; my 60-series works perfectly fine in everything I (rarely) play. And DLSS is, as @TheUglyOne has mentioned before, really good. At launch, Ninja Gaiden 2 Black didn't support it on the Gamepass version, so I tried FSR... I'm not going to try FSR again.
 
  • Thunk-Provoking
Reactions: Brain Problems
If the AI bubble pops, not only will there be tons of A100s, H100s, and H200s on the secondary market (probably useless for gaming, but still great AI/HPC cards), but the demand for cutting-edge silicon for AI hardware will collapse. Since the demand for gaming GPUs is almost completely unrelated to the demand for enterprise AI hardware, this would mean lots of slack capacity at the fabs for NVIDIA to actually start meeting demand for consumer products.
Assuming we don't get a Next Big Thing after cryptocurrency and AI that sucks up the supply.
 
If the AI bubble pops, this would mean lots of slack capacity at the fabs for NVIDIA to actually start meeting demand for consumer products.
Company invents hyperintelligent benevolent AI, someone at the company deliberately poisons and kills it to bring GPU prices back down
 
  • Optimistic
Reactions: Allakazam223
One thing that needs to be remembered about the 4090/5090 is that these are about the cheapest professional-level cards you can get. They're being used for AI/CUDA applications by individual users or small businesses, and are kind of a waste for gaming outside of 4K.
I think the issue I have with the 4090/5090 is that the burning power connector problem with the 12VHPWR apparently happens a lot more than you'd want. It's not a fun idea to suddenly have your expensive investment catch on fire, or have a connector melt so you have to send it in for repairs.
 
I think the issue I have with the 4090/5090 is that the burning power connector problem with the 12VHPWR apparently happens a lot more than you'd want. It's not a fun idea to suddenly have your expensive investment catch on fire, or have a connector melt so you have to send it in for repairs.
It's just such a dumb issue that should not be happening. We've had cards in the past with high power draws that worked just fine. For some reason (to make the board as small as possible and cut corners), they're now making dumb design choices, and the customers are paying for it.

Idk, feels like Nvidia is just getting sloppy and saying "Whatchu gonna do 'bout it, bish?"
 
  • Agree
Reactions: The Ugly One
It's just such a dumb issue that should not be happening. We've had cards in the past with high power draws that worked just fine. For some reason (to make the board as small as possible and cut corners), they're now making dumb design choices, and the customers are paying for it.

Idk, feels like Nvidia is just getting sloppy and saying "Whatchu gonna do 'bout it, bish?"
We haven't had any consumer graphics card with the power draw of the 4090 or 5090. It's an entirely unprecedented level of power draw in a consumer GPU, outside of meme shit like the Radeon Pro Vega II with its proprietary Apple power connector. The entire reason the cable exists is that 8-pin PCIe power connectors are only rated to provide 150W, meaning a 5090 would need at least four of them, which is just infeasible for the way the PCBs are designed.
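
The connector math above can be sketched as a quick back-of-envelope. This is just an illustration: the TGP figures are approximate, and it assumes the 150W 8-pin rating, the 600W 12VHPWR rating, and 75W from the PCIe slot itself.

```python
import math

# Connector power ratings (per the PCIe CEM / ATX 3.0 specs)
PCIE_8PIN_W = 150   # 8-pin PCIe power connector rating
HPWR_12V_W = 600    # 12VHPWR / 12V-2x6 connector rating
PCIE_SLOT_W = 75    # power the PCIe slot itself can supply

def connectors_needed(board_power_w, connector_w, slot_w=PCIE_SLOT_W):
    """Connectors a card needs once slot power is accounted for."""
    return math.ceil(max(board_power_w - slot_w, 0) / connector_w)

# Approximate board powers, not official figures
for card, tgp in [("RTX 4090", 450), ("RTX 5090", 575)]:
    print(f"{card}: {connectors_needed(tgp, PCIE_8PIN_W)}x 8-pin "
          f"or {connectors_needed(tgp, HPWR_12V_W)}x 12VHPWR")
```

At ~575W, a 5090 lands at four 8-pin connectors even after counting slot power, which matches why Nvidia pushed a single 600W connector instead.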

Part of the problem is such retardedly high power draws in consumer equipment in the first place, and part of the problem is that Nvidia somehow sold the PCI-SIG on standardizing around a design that just barely meets the specifications of what it's supposed to do.

In a saner world, AMD and Intel would have immediately launched a competing high-power standard and fought Nvidia's connector becoming an ATX standard but alas...
 
  • Agree
Reactions: Allakazam223