GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

CPUs have gone the other way with their pricing; a Ryzen 7 3700 can be had for $300, and that's a lot of bang for your buck.
The $300-400 range has always been the "7" tier. The first i7, the i7 920 IIRC, was just under $300 ten years ago. Thankfully Zen forced polycore (is that even a term?) CPUs into the market.

But that's the way things should be. What defines a product segment should get better as the years go by. The infuriating part is the fanboys who are fine with being bent over, justifying the sodomy with "what mid range means changed with the intro of 4k" or some shit like that.

Here's a hot take: None of you fuckers have any real use for the cards that are coming out, most new games are garbage anyway, the GTX 780 Ti is still good enough for most things, 4K is useless, hell I'm still more than happy with 720p. Literally the only people who need these cards are people with VR and people working on machine learning.
I only upgrade when there's something I want to play that's out of my spec. I usually coast on hardware until it fails. Right now I have a 2060 and I'm just at 1080p with a sizable Steam backlog.

People rendering on the GPU could also use the high VRAM. Even in cases when system RAM can be used, VRAM is still massively favored. Whether that's worth it is going to be up to the individual use case.
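If you want to see how close a render job actually gets to the VRAM ceiling, a quick check is easy enough. Rough sketch only, assuming an Nvidia card and the nvidia-ml-py (pynvml) package; AMD needs a different route.

[CODE]
# Quick VRAM check before/while a GPU render job runs.
# Assumes an Nvidia card and the nvidia-ml-py package (pynvml).
from pynvml import nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo

nvmlInit()
mem = nvmlDeviceGetMemoryInfo(nvmlDeviceGetHandleByIndex(0))
print(f"VRAM: {mem.used / 2**30:.1f} GiB used of {mem.total / 2**30:.1f} GiB")
nvmlShutdown()
[/CODE]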

As for the 4k thing, I've done some work in places with ultrawides, and they're useful, especially when you need a lot of stuff open. I haven't done a side-by-side with dual 1080/2560. 4k looks to be one of those things they're pushing but the world just isn't ready for: program UIs and sites look like shit at high res, non-native gaming sucks, and the displays are expensive. I'd rather put money towards a 4k display on a Cintiq (or additional hardware).
 
The $300-400 range has always been the "7" tier. The first i7, the i7 920 IIRC, was just under $300 ten years ago. Thankfully Zen forced polycore (is that even a term?) CPUs into the market.

CPU prices going down was an effect of Intel's whoopsie in the late 90s and AMD becoming insanely competitive with the Athlon shortly thereafter. The best graphics card at the time was "always" (this was a hectic ~3-4 year period with a billion cards released) between $200-300, while the best consumer CPU would be like $650 (the P2 450MHz).

The 3090 looks pretty nice with all that VRAM, plus CUDA and Tensor cores for productivity. They have loosened up some of their artificial restrictions in the drivers for their consumer cards, and they even offer productivity-oriented drivers for us plebs. Makes me wonder what the Quadro line will be like.
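For the productivity angle, checking whether a card actually has Tensor cores is a one-liner if you already have PyTorch around. Untested sketch, assuming a CUDA build of PyTorch; Tensor cores showed up with compute capability 7.0 (Volta) and later.

[CODE]
# Report compute capability; Tensor cores exist on 7.0 (Volta) and newer parts.
import torch

major, minor = torch.cuda.get_device_capability(0)
print(f"{torch.cuda.get_device_name(0)}: compute capability {major}.{minor}, "
      f"Tensor cores: {'yes' if major >= 7 else 'no'}")
[/CODE]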
 
RX 5700XT gang, people bitch about drivers but I've had literally zero issues, probably because I'm not a Windows cuck.
I've heard the AMD open source drivers on Linux are pretty good, are the newer cards' drivers good as well?
 
I've heard the AMD open source drivers on Linux are pretty good, are the newer cards' drivers good as well?
Like I said, zero issues. Well, rarely the screen would go black for no reason when playing Subnautica, but unplugging the display and plugging it back in would fix it, and I was running that through Proton, so there could've been all sorts of weird reasons for that.
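If you're not sure which driver Linux actually bound to the card (amdgpu vs the old radeon module, or nouveau vs the Nvidia blob), something like this tells you. Sketch only, assuming the GPU shows up as card0 in sysfs.

[CODE]
import os, platform

# The driver symlink under the DRM device points at the bound kernel module,
# e.g. "amdgpu", "radeon", "nouveau" or "nvidia".
driver = os.path.basename(os.readlink("/sys/class/drm/card0/device/driver"))
print(f"kernel {platform.release()}, GPU driver: {driver}")
[/CODE]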
 
SELL SELL SELL SELL

[attached screenshot: mobile.twitter.com, 2020-09-05]


I count my blessings having bought a 970 millennia ago :(
 
>Buy Radeon VII a few months ago for workstation computer
>Made decision after much research and some comparison to Nvidia GPUs (2080 Ti)
>Not even 4-5 months later the 30XX series comes out with the 3080 and 3090
>$1400 2080 Ti falls to sub-$900 on eBay. $500 loss instantly.
>Radeon VII APPRECIATED in value, now >$800 on eBay
>I bought mine originally for $600, could sell it now for a few hundred more

The chad Radeon VII vs. the virgin Nvidia 2080 Ti.
 
>Buy Radeon VII a few months ago for workstation computer
>Made decision after much research and some comparison to Nvidia GPUs (2080 Ti)
>Not even 4-5 months later the 30XX series comes out with the 3080 and 3090
>$1400 2080 Ti falls to sub-$900 on eBay. $500 loss instantly.
>Radeon VII APPRECIATED in value, now >$800 on eBay
>I bought mine originally for $600, could sell it now for a few hundred more

The chad Radeon VII vs. the virgin Nvidia 2080 Ti.
AMDChads win again, gamers.
 
RX 5700XT gang, people bitch about drivers but I've had literally zero issues, probably because I'm not a Windows cuck.
The 5700XT is honestly finicky, but I'm convinced a lot of issues are people failing to troubleshoot. I see people speculating recently that it might be PSU related. Will be interesting to see if the new Nvidia cards have similar problems.
 
The 5700XT is honestly finicky, but I'm convinced a lot of issues are people failing to troubleshoot. I see people speculating recently that it might be PSU related. Will be interesting to see if the new Nvidia cards have similar problems.

Wasn't the 5700XT the card, at least some models, that smelled like it was burning? AMD (and Nvidia) have been caught doing stupid power shit in the past, with the 5700XT they tried to convince people that 110C was a perfectly normal operating temperature ("Aurora Borealis!? At this time of year, at this time of day...") and before that they had a card that went wildly out of spec with power consumption from the PCIE slot.
 
Wasn't the 5700XT the card, at least some models, that smelled like it was burning? AMD (and Nvidia) have been caught doing stupid power shit in the past, with the 5700XT they tried to convince people that 110C was a perfectly normal operating temperature ("Aurora Borealis!? At this time of year, at this time of day...") and before that they had a card that went wildly out of spec with power consumption from the PCIE slot.
110°C is the junction temperature limit, not to be confused with the standard edge temperature measurement you usually see. I'm pretty sure Nvidia has the same thing, but they hide that information.
Admittedly the card does run hot, especially the reference design, which has a poor cooler and an even worse fan curve. Undervolting or adjusting the fan curve solves this, however.
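If you want to watch the junction sensor yourself instead of trusting whatever overlay, the amdgpu driver exposes the labelled sensors through hwmon. Rough sketch, assuming Linux with amdgpu; the labels are typically edge, junction and mem.

[CODE]
from pathlib import Path

# Find the amdgpu hwmon entry and print each labelled temperature.
# tempN_input values are in millidegrees Celsius.
for hw in Path("/sys/class/hwmon").iterdir():
    if (hw / "name").read_text().strip() != "amdgpu":
        continue
    for label_file in sorted(hw.glob("temp*_label")):
        label = label_file.read_text().strip()
        millideg = int((hw / label_file.name.replace("_label", "_input")).read_text())
        print(f"{label}: {millideg / 1000:.1f} C")
[/CODE]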
 
110°C is the junction temperature limit, not to be confused with the standard edge temperature measurement you usually see. I'm pretty sure Nvidia has the same thing, but they hide that information.
Admittedly the card does run hot, especially the reference design, which has a poor cooler and an even worse fan curve. Undervolting or adjusting the fan curve solves this, however.

Pfft, that's fucking nothing. Years back my 7950x2 used to hit the 125°C limit and turn my computer off. Back then the threat of things bursting into flames was real, and it made real gamers. Worth it to play Stalker with wonderful graphics, to hell with my house and skin.
 
Now would be a good time to have a spare $800, huh.

The Nvidia 2080 Ti's price at launch was ludicrous, and I'm quite satisfied with my 2070 until it dies in however many years' time. It's not like I'm itching to snap up a 30 series, and it's weird how people with a 2080 Ti are so ready to drop somewhere around $1400 for a shiny new card when theirs is what, two years old and still at the very top of the line?

Unless you're a 3D artist who does massive projects or something, I don't get it; it's not like you need much more power if you're just gaming.
 