GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Any reason you're not considering an RX 9070 (not-XT)? They're around the same price as a 5070 and you get more VRAM and better raw perf.

Also GPU prices are trickling downwards towards MSRP so if you wait a month or two you'll probably be able to get a 9070 XT for MSRP.

I don't think the 5070 is really worth $600. I barely think it's worth the $550 MSRP.

If you can wait you should. If you can't, a $600 5070 isn't the worst deal if the alternative is not having a GPU at all or something.
I don't get why the 9070 isn't advertised more. It's much faster than a 5070, has more VRAM, and has similar RT performance in anything that's not path tracing.
 
We all know AMD is going to change their naming scheme with their next GPUs, probably their CPUs too. They're just that shit at naming products.

Whenever the base 60-class card outperforms what I have will probably be when I start to think about upgrading. So at this rate I'll be fine for the next 15 years.
wdym you're not excited for the Ryzen RX 9090 AI?!
 
We all know AMD is going to change their naming scheme with their next GPUs, probably their CPUs too. They're just that shit at naming products.

Whenever the base 60-class card outperforms what I have will probably be when I start to think about upgrading. So at this rate I'll be fine for the next 15 years.
oh boy I hope we get the confusing clusterfuck naming schemes from monitors
 
5090 Gigachad
vs
4090 Chad
vs
3090 Chud

Who would win?
Of the 3, the 3090 is definitely the weakest within its own release lineup: the 4090 was a massive step up, and the 5090 is likewise uncontested by anything, while the 3090 was only 10-15% better than the 3080.
 
Of the 3, the 3090 is definitely the weakest within its own release lineup: the 4090 was a massive step up, and the 5090 is likewise uncontested by anything, while the 3090 was only 10-15% better than the 3080.
The 3090 did come with NVLink though, which makes it still quite attractive for workloads like AI. When Nvidia realised people were buying 3090s instead of workstation cards costing four times as much, they dropped it.
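If you're curious whether a pair of cards can actually do the direct peer-to-peer transfers that NVLink buys you for ML, here's a minimal PyTorch sketch. It assumes two CUDA devices at indices 0 and 1; the device-count and peer-access checks are standard torch.cuda calls.

```python
# Minimal sketch: check whether GPU 0 can directly access GPU 1's
# memory (NVLink on dual 3090s, or PCIe P2P where the platform allows).
# Assumes PyTorch with CUDA and at least two GPUs installed.
import torch

if torch.cuda.device_count() >= 2:
    p2p = torch.cuda.can_device_access_peer(0, 1)
    print(f"peer access 0 -> 1: {p2p}")
    if p2p:
        # Device-to-device copy without a round trip through system RAM.
        a = torch.randn(1024, 1024, device="cuda:0")
        b = a.to("cuda:1")
        print(b.device)  # cuda:1
else:
    print("need at least two CUDA devices for this check")
```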
 
The 3090 did come with NVLink though, which makes it still quite attractive for workloads like AI. When Nvidia realised people were buying 3090s instead of workstation cards costing four times as much, they dropped it.
This, but also, the 3090 is still a capable GPU for a variety of workloads, from gaming to ML. Sure, the 4090 is way better, but at the same time, the 3090 is roughly twice the computing power of a 1080Ti, which was, and still is, a fairly capable GPU.
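As a rough sanity check on that "roughly twice" figure, here's some back-of-the-envelope math using the commonly quoted core counts and boost clocks. The paper ratio comes out above 3x because Ampere's doubled FP32 units inflate the spec sheet; in actual games the effective gap lands closer to the 2x mentioned above.

```python
# Back-of-the-envelope FP32 throughput: cores * 2 FLOPs (FMA) * clock.
# Core counts and boost clocks are the commonly quoted spec numbers.
def tflops(cuda_cores: int, boost_ghz: float) -> float:
    return cuda_cores * 2 * boost_ghz / 1000

gtx_1080_ti = tflops(3584, 1.582)   # ~11.3 TFLOPS
rtx_3090 = tflops(10496, 1.695)     # ~35.6 TFLOPS

print(f"1080 Ti: {gtx_1080_ti:.1f} TFLOPS")
print(f"3090:    {rtx_3090:.1f} TFLOPS")
# Paper ratio ~3.1x; real game throughput is closer to 2x because the
# doubled Ampere FP32 pipes share scheduler and datapath resources.
print(f"paper ratio: {rtx_3090 / gtx_1080_ti:.1f}x")
```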

Also, the 3090 is now extremely affordable on the second-hand market thanks to the 4070Ti matching its raw performance at a much lower MSRP, making it a great option if you want that extra VRAM at a good price. Besides NVLink, Ampere was also the last Nvidia generation that didn't force board partners onto a dangerous power delivery scheme, so you can plug in two or three 6+2-pin connectors and you're off to the races.

The 3090/4070Ti/5070 performance bracket is still a very capable one, and the 4090/5090 only becomes a necessity because of incompetent game devs. So really, is it that bad to stay behind in the tech race when we've already gotten this far?
 
I can't believe some of you are actually buying NV-slop. As if the 5090 price is okay and justified. AMD is at least somewhat open source software-wise and has reasonable pricing.
 
I can't believe some of you are actually buying NV-slop. As if the 5090 price is okay and justified. AMD is at least somewhat open source software-wise and has reasonable pricing.
AMD would have reasonable pricing if their MSRPs weren't literally fake, and they don't offer anything for enthusiast-class builds; if they had a 90-tier GPU I would have considered it.
As for the last part, Nvidia has gotten far better in that respect recently, but they've still got a long way to go.
But basically, I'm not going to buy an AMD GPU when they literally don't offer a product that suits what I'm looking for, because their architecture doesn't scale well enough to compete at the top end.
 
3090 was only 10-15% better than the 3080
Big thing about the 3090 was the VRAM; 24GB is still way more than any game uses, even when you mod in as many 4K textures as you can find. The most I've ever managed to use is 19GB. I use it at 4K and can still run a lot of games at max settings.
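If you want to see where your own games land, here's a minimal sketch that polls the driver for VRAM usage while you play. It assumes nvidia-smi (which ships with the NVIDIA driver) is on your PATH.

```python
# Poll the NVIDIA driver for VRAM usage on GPU 0 and track the peak.
# Assumes nvidia-smi is on PATH; Ctrl+C to stop.
import subprocess
import time

def vram_mib() -> tuple[int, int]:
    out = subprocess.check_output(
        ["nvidia-smi", "--id=0",
         "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    used, total = (int(x) for x in out.strip().split(", "))
    return used, total

peak = 0
while True:
    used, total = vram_mib()
    peak = max(peak, used)
    print(f"{used} / {total} MiB (session peak {peak} MiB)")
    time.sleep(2)
```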

I considered a 5090 for about 20 seconds before seeing that it's going for triple the MSRP and has a plethora of issues. Plus the number of AIB models with waterblocks available is still pretty limited.
I don't get why the 9070 isn't advertised more. It's much faster than a 5070, has more VRAM, and has similar RT performance in anything that's not path tracing.
It has been in enthusiast circles, but a lot of people just default to Nvidia cards; if you mention AMD they won't know what that is. The other problem with the 9070 is that right now it costs a lot more than it's supposed to in a lot of regions.
 
AMD would have reasonable pricing if their MSRPs weren't literally fake, and they don't offer anything for enthusiast-class builds; if they had a 90-tier GPU I would have considered it.
As for the last part, Nvidia has gotten far better in that respect recently, but they've still got a long way to go.
But basically, I'm not going to buy an AMD GPU when they literally don't offer a product that suits what I'm looking for, because their architecture doesn't scale well enough to compete at the top end.
If you truly need something like the RTX 5090, then I guess you don't have much choice, yeah. I only need about half of that in terms of VRAM and GPU compute. And as @Bogan just mentioned, the new NVIDIA GPUs have had all sorts of issues, which turns me off even more. If I pay a ridiculous amount of money for a GPU, I'd expect it to at least not instantly melt the PSU pins or whatever.
 
If you truly need something like the RTX 5090, then I guess you don't have much choice, yeah. I only need about half of that in terms of VRAM and GPU compute. And as @Bogan just mentioned, the new NVIDIA GPUs have had all sorts of issues, which turns me off even more. If I pay a ridiculous amount of money for a GPU, I'd expect it to at least not instantly melt the PSU pins or whatever.
I've seen lots of reports of widespread driver issues. I've been lucky enough not to run into those so far, but the drivers have been pretty shit and unstable since the 50 series dropped.
And the 12VHPWR connector is pretty dogshit; the 5090 should realistically have 2 of them, since at stock they push it basically right to the safety limit of the cable. For that reason alone I've been running an undervolt on mine, just to back away from that limit.
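To put rough numbers on how little headroom there is, here's some quick math using the commonly cited figures (600W connector limit, six live 12V pins rated around 9.5A each, ~575W stock power target for the 5090; treat these as ballpark, not gospel):

```python
# Rough headroom math for the 12VHPWR/12V-2x6 connector, assuming
# commonly cited figures: 600 W connector limit, 6 live 12 V pins,
# ~9.5 A per-pin rating, ~575 W stock 5090 power target.
VOLTAGE = 12.0
LIVE_PINS = 6
PIN_RATING_A = 9.5

def per_pin_amps(board_power_w: float) -> float:
    # Idealised case: current shared evenly across all six 12 V pins.
    return board_power_w / VOLTAGE / LIVE_PINS

for watts in (575, 500, 450):  # stock vs. undervolted power targets
    amps = per_pin_amps(watts)
    print(f"{watts} W -> {amps:.2f} A/pin, "
          f"{amps / PIN_RATING_A:.0%} of the {PIN_RATING_A} A pin rating")
```

Even in the ideal even-sharing case, stock load sits around 84% of the per-pin rating; the melting cases discussed below are what happens when the sharing isn't even.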
 
So with the power connector, what's funny to me is that the 3090Ti has both a 450W TDP and the same 12VHPWR connector as the 4090, yet none of the 3090Ti connectors have melted.

Buildzoid and DerBauer did content pieces about it a couple of months ago. Warning: very technical content.

TL;DR: the entire power draw was going down 2 of the 12 cables instead of being spread out evenly. Plus, the 3090Ti has multiple shunt resistors, while the 4090 and 5090 have just 1.
 
TL;DR: the entire power draw was going down 2 of the 12 cables instead of being spread out evenly. Plus, the 3090Ti has multiple shunt resistors, while the 4090 and 5090 have just 1.
In normal operation it shouldn't be going down 2 of the 6 cables (6 of the 12 are ground, so it's 6 live wires for any card using the 12VHPWR cable). The problem is that any lack of load balancing over the pins can't be detected by the GPU, so in abnormal operation caused by uneven resistance across the pins, power just flows down the wires with the least resistance, meaning they heat up and the connector melts. This is why a few other youtubers who attempted to replicate what DerBauer found were unable to see the same situation: DerBauer's connector had been reused several times and thus had uneven resistances. I would seriously recommend getting a new cable if you move to a new card using the same connector, because of exactly this.
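The mechanism is just a current divider: parallel wires at the same voltage split current in inverse proportion to their resistance. A toy model (the contact resistances here are invented purely for illustration) shows how a couple of good pins end up eating the whole load:

```python
# Toy current-divider model of the 12VHPWR failure mode: six live 12 V
# wires in parallel share ~48 A (575 W / 12 V). Contact resistances
# below are invented for illustration only.
TOTAL_AMPS = 575 / 12  # ~47.9 A for a stock 5090

def share_current(resistances_ohm: list[float]) -> list[float]:
    # Parallel branches see the same voltage, so each wire's share of
    # the total current is proportional to its conductance (1/R).
    g = [1 / r for r in resistances_ohm]
    return [TOTAL_AMPS * gk / sum(g) for gk in g]

scenarios = {
    "fresh connector, even contacts": [0.010] * 6,
    "worn connector, two good pins":  [0.010, 0.010, 0.100, 0.100, 0.100, 0.100],
}
for label, rs in scenarios.items():
    print(label)
    for i, (r, amps) in enumerate(zip(rs, share_current(rs))):
        # I^2 * R: heat dissipated in that pin's contact resistance.
        print(f"  wire {i}: {amps:5.1f} A, {amps * amps * r:5.2f} W of heat")
```

In the even case each pin carries ~8A; in the worn case the two good pins carry ~20A each, more than double their rating, and with only one shunt resistor on the 4090/5090 boards the card just sees the ~48A total and can't tell the difference.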
 
In normal operation it shouldn't be going down 2 of the 6 cables (6 of the 12 are ground, so it's 6 live wires for any card using the 12VHPWR cable). The problem is that any lack of load balancing over the pins can't be detected by the GPU, so in abnormal operation caused by uneven resistance across the pins, power just flows down the wires with the least resistance, meaning they heat up and the connector melts
In DerBauer's video he pointed a thermal camera at the cable to show that only 2 of the wires were actually carrying current. He even cut wires to show it wasn't affecting the card at all.
 