GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

We haven't had any consumer graphics card with the power draw of the 4090 or 5090. It is an entirely unprecedented level of power draw in a consumer GPU outside of meme shit like the Radeon Pro Vega II with its proprietary Apple power connector. The entire reason the cable exists is that 8-pin PCIe power cables are only specified to provide 150W, meaning a 5090 would need at least four of them, which is just infeasible for the way the PCBs are designed.
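The connector math above is easy to sanity-check. A rough sketch, assuming the 150W 8-pin limit from the post, the 75W a PCIe x16 slot itself can deliver, and the commonly cited ~575W board power of a 5090 (the specific wattages fed in are illustrative):

```python
import math

PCIE_8PIN_W = 150   # spec limit per 8-pin PCIe power cable
SLOT_W = 75         # max power delivered through the PCIe x16 slot itself

def eight_pins_needed(tbp_w: int) -> int:
    """How many 8-pin cables a card needs once the slot's 75 W is accounted for."""
    return math.ceil(max(tbp_w - SLOT_W, 0) / PCIE_8PIN_W)

print(eight_pins_needed(575))  # 575 W card -> 4 cables, matching the post
print(eight_pins_needed(450))  # a 4090-class 450 W card -> 3 cables
```

Which is exactly why the PCB routing for four 8-pin headers becomes infeasible and a single high-power connector looked attractive.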

Part of the problem is such retardedly high power draws in consumer equipment in the first place, and part of the problem is that Nvidia somehow sold the PCI-SIG on standardizing around a design that just barely meets the specifications of what it's supposed to do.

In a saner world, AMD and Intel would have immediately launched a competing high-power standard and fought Nvidia's connector becoming an ATX standard but alas...
What if we just, and now hear me out here, crazy idea--didn't consume enough power to kill a farm animal just to play games?
 
If the AI bubble pops
i doubt it will, next year they're saying AGI and 3 years from now ASI (meaning what they originally meant by AGI), plus the tariffs will make GPUs even more valuable. gaming is more likely to die out as a use case for GPUs than AI at this point. as people have been saying, a 3060 is enough for most games.
 
What if we just, and now hear me out here, crazy idea--didn't consume enough power to kill a farm animal just to play games?
The market has already shown it doesn't care about power consumption by rewarding nvidia and AMD for releasing flagships with increasingly retarded TBPs. I expect by 2035 we'll have kilowatt TBPs and 600W cards will be the new efficiency kings.

(fwiw I agree with you)
 
The market has already shown it doesn't care about power consumption by rewarding nvidia and AMD for releasing flagships with increasingly retarded TBPs. I expect by 2035 we'll have kilowatt TBPs and 600W cards will be the new efficiency kings.

(fwiw I agree with you)
The other ridiculous thing is that these 4090s and 5090s easily undervolt to something like 95-99% of the performance at 2/3 to 3/4 of the power consumption. It's not like these cards are starved for computing power.
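To put that undervolting claim in perf-per-watt terms, a minimal sketch using the post's own ballpark numbers (97% performance at 70% power is an illustrative point inside the quoted ranges, not a measurement):

```python
def perf_per_watt_gain(perf_frac: float, power_frac: float) -> float:
    """Relative efficiency vs. stock settings (1.0 = no change)."""
    return perf_frac / power_frac

# 97% of stock performance at 70% of stock power:
print(round(perf_per_watt_gain(0.97, 0.70), 2))  # ~1.39x the perf/W of stock
```

In other words, giving up a few percent of frames buys back roughly a third more efficiency, which is why these dies clearly aren't being run anywhere near their sweet spot.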
 
i doubt it will, next year they're saying AGI and 3 years from now ASI (meaning what they originally meant by AGI), plus the tariffs will make GPUs even more valuable. gaming is more likely to die out as a use case for GPUs than AI at this point. as people have been saying, a 3060 is enough for most games.
A 3060 is great for AI on the cheap. You can run SDXL on that boy EZPZ. Man, the 3060 Ti is the red-headed stepchild of the 3000 series, next to the 3070, which is almost as bad.
 
Notorious Caveman Beve Sturke has released a video talking about the 50 series ROP disaster.
He argued that Nvidia was either incompetent or knew about the problem and shipped anyway (there's no way). But the 5080 portion seems to have been added to the video last minute since it's breaking news, and it points to a different technical explanation, since the 5080 is supposed to be a fully enabled die. Maybe this is fixable with a firmware/driver update.

Other issues like the use of the shitty power cable with a low safety margin are intentional.
 
$700 for an 8 GB card is cheap in 2024. :cryblood:
There is the 12GB SKU of the 3060 as well, but used 3090s also go for ~$750 here, and that gets you 24GB of VRAM with roughly 4070 Ti performance, minus the AV1 encoder and the better power efficiency. Given the current state of the market it's basically a steal.

At this rate people will have to take the risk and buy used, and at the rate of Nvidia's fuckups, it might end up less risky than trying to get a brand new card.
 
There is the 12GB SKU of the 3060 as well, but also used 3090's go for ~$750 here

Do they still today? As of about 24 hours ago, i was thinking of a 3060 Ti as a nice $300 card, a 4060 as a $400 card, and a used 3090 as a, uh...



:cryblood:
 
Do they still today? As of about 24 hours ago, i was thinking of a 3060 Ti as a nice $300 card, a 4060 as a $400 card, and a used 3090 as a, uh...


View attachment 7023459

:cryblood:
Emphasis on "here". Used 3090s pop up online for around 3000PLN, so ~$760 give or take. That will probably change soon, but you could get a used 3090 for ~1000PLN less than a brand new 4070 Ti. And by used I mean refurbished, with warranty and the standard EU 14-day online purchase return policy.
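For anyone not thinking in złoty, a quick conversion of the prices above (the ~3.95 PLN/USD rate is an assumption to match the "~$760" in the post; actual rates move around):

```python
PLN_PER_USD = 3.95  # assumed exchange rate, not a live quote

used_3090_pln = 3000
new_4070ti_pln = used_3090_pln + 1000   # "~1000 PLN less than a brand new 4070 Ti"

print(round(used_3090_pln / PLN_PER_USD))   # ~759 USD, matching "~$760"
print(round(new_4070ti_pln / PLN_PER_USD))  # ~1013 USD for the new 4070 Ti
```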
 
AMD teases Radeon RX 9070 focusing on sub-$700 price point (archive)

The comment section is going nuts.

Since PhysX Games Are A Problem On NVIDIA RTX 50 GPUs, This User Combined RTX 5090 With RTX 3050 To Solve The Performance Issue (archive)
With the dropping of support for 32-bit PhysX in the Blackwell series, a user tried to fix the performance problem by adding another NVIDIA GPU that comes with PhysX support. The Redditor u/jerubedo tried adding a GeForce RTX 3050 to his gaming PC alongside the GeForce RTX 5090 (this isn't SLI) and dedicated the RTX 3050 strictly for PhysX processes in the NVIDIA Control Panel.

He then ran and benchmarked a couple of PhysX-based games, including Mafia II Classic, Batman Arkham Asylum, Borderlands 2, Assassin's Creed IV: Black Flag, and Mirror's Edge. The final results show an improvement of up to 1425%.
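For anyone squinting at "up to 1425%": percent improvement is relative to the broken baseline, so it translates to a 15.25x multiplier. A quick sketch (the framerates below are hypothetical; only the percentage comes from the article):

```python
def improvement_pct(old_fps: float, new_fps: float) -> float:
    """Percent improvement of new over old, as headlines report it."""
    return (new_fps - old_fps) / old_fps * 100

# e.g. 20 fps with PhysX falling back to CPU vs. 305 fps offloaded to the 3050:
print(improvement_pct(20, 305))  # 1425.0 -> a 15.25x speedup
```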
 
Idk, feels like Nvidia is just getting sloppy and saying "Whatchu gonna do 'bout it, bish?"
Nice. The tried and true Adobe/Autodesk gambit. Your problem doesn't exist, and even if it did, we're certainly not going to admit it by providing a solution in our KB. But... there's a ten-year-old thread in our forum with instructions on how to fix a fifteen-year-old problem that still exists in the current version under certain circumstances.

That last part has been around for a long time, using a secondary GPU as the dedicated PhysX processor (remember when PhysX was a card on its own?), so I'm not surprised.
 


This was a real eyeroll:

Nonetheless, even though NVIDIA specified that the series won't support 32-bit CUDA applications, it didn't explicitly state that they are dropping the PhysX support on the RTX 50 series.

Nonetheless, even though my local convenience store specified that they were no longer stocking Pepsico products, it didn't explicitly state that it was dropping Mountain Dew.
 
What i don't get is why we haven't explored double-height video cards intended to be mounted parallel with the motherboard via a riser card. something like this:
standard gpu on top, double height gpu below. instead of relying on the slot to support the weight it can be screwed down at all four corners
[attachment: mockup of the double-height layout]
 
What i don't get is why we haven't explored double-height video cards intended to be mounted parallel with the motherboard via a riser card. something like this:
standard gpu on top, double height gpu below. instead of relying on the slot to support the weight it can be screwed down at all four corners
View attachment 7026013
Because it's not supported by the ATX standards which means any company making such a thing would have a severely limited userbase to sell into. I'm not even sure what problem this would solve.
 