The $300-400 range has always been the "7" tier. The first i7, the i7-920 IIRC, was just under $300 about 10 years ago. Thankfully Zen forced polycore (is that even a term?) CPUs into the market. CPUs have gone the other way with their pricing: a Ryzen 7 3700X can be had for around $300, and that's a lot of bang for your buck.
But that's the way things should be. What defines a product segment should get better as the years go by. The infuriating part is the fanboys who are fine with being bent over, justifying the sodomy with "what mid-range means changed with the intro of 4K" or some shit like that.
I only upgrade when there's something I want to play that's out of my spec. I usually coast on hardware until it fails. Right now I have a 2060, I'm still at 1080p, and I have a sizable Steam backlog.
Here's a hot take: none of you fuckers have any real use for the cards that are coming out. Most new games are garbage anyway, the GTX 780 Ti is still good enough for most things, 4K is useless, and hell, I'm still more than happy with 720p. Literally the only people who need these cards are people with VR and people working on machine learning.
People rendering on the GPU could also use the high VRAM. Even in cases where system RAM can be used as a fallback, VRAM is still massively favored. Whether that's worth it is going to be up to the individual use case.
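If you want to see how that "does it fit in VRAM" call gets made in practice, here's a minimal Python sketch using the pynvml package (nvidia-ml-py) to check free VRAM before deciding whether a render can stay GPU-only or has to spill into system RAM. The 9 GB scene size and the `fits_in_vram` helper are just made-up placeholders for illustration, not anyone's actual renderer:

```python
# Rough sketch: decide whether a render job fits entirely in VRAM.
# Assumes an NVIDIA GPU and `pip install nvidia-ml-py` (imported as pynvml).
import pynvml


def fits_in_vram(scene_size_bytes: int, device_index: int = 0, headroom: float = 0.9) -> bool:
    """Return True if the estimated scene size fits in the GPU's free VRAM,
    leaving some headroom for the renderer's own buffers."""
    pynvml.nvmlInit()
    try:
        handle = pynvml.nvmlDeviceGetHandleByIndex(device_index)
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)  # .total/.free/.used in bytes
        return scene_size_bytes <= mem.free * headroom
    finally:
        pynvml.nvmlShutdown()


if __name__ == "__main__":
    scene_size_bytes = 9 * 1024**3  # pretend the scene needs ~9 GB
    print("GPU-only render feasible:", fits_in_vram(scene_size_bytes))
```

On an 8 GB card that check fails and you're into out-of-core/system-RAM territory, which is exactly why the big-VRAM cards matter for this crowd and not much else.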
As for the 4K thing, I've done some work in places with ultrawides, and they're useful, especially when you need a lot of stuff open. I haven't done a side-by-side against dual 1080/2560 setups though. 4K looks to be one of those things they're pushing that the world just isn't ready for: program UIs and websites look like shit at high res, non-native gaming sucks, and the displays are expensive. I'd put the money towards a 4K display on a Cintiq instead (or other hardware).