GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I should make sure the new motherboard for my next server has at least two high-speed PCIe slots then, as I'll need one for a video decoding GPU and one for the AI. Having a personal AI will be weird.
 
new titles will lean more into multithreading
this has been constantly repeated since the 8th gen consoles released and games still rarely use more than 4 threads effectively

Well alright then lol, sold
meant to upload this earlier but it turns out that torrenting a 100GB game takes quite a while
CP77-8C-HTOFF.jpg
even 8 Haswell cores (HT off) at 3.8GHz are more than enough to push 100FPS+ in cyberpunk so long as you are not GPU bound

don't overspend on the CPU and get a faster GPU instead if you just want to game
 
  • Feels
Reactions: WelperHelper99
this has been constantly repeated since the 8th gen consoles released and games still rarely use more than 4 threads effectively

Efficiently parallelizing nontrivial workloads is extremely difficult, but if your game isn't CPU bound below 200 fps, why bother putting any effort into improving the parallelism?
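To put rough numbers on why the effort rarely pays off, here's a minimal Amdahl's law sketch (the 60% parallel fraction is just an assumed figure for illustration, not a measurement of any real engine):

```cpp
// Minimal Amdahl's law sketch. The 60% parallelizable fraction is an
// assumption for illustration, not a measurement of any real game engine.
#include <cstdio>

// speedup = 1 / ((1 - p) + p / n), p = parallel fraction, n = core count
double speedup(double p, int n) {
    return 1.0 / ((1.0 - p) + p / n);
}

int main() {
    const double p = 0.6; // assume 60% of a frame's work parallelizes cleanly
    for (int cores : {2, 4, 8, 16})
        std::printf("%2d cores -> %.2fx\n", cores, speedup(p, cores));
    // roughly 1.43x, 1.82x, 2.11x, 2.29x: past 4 cores the gains are scraps,
    // which is why most games still don't load more than a handful of threads.
}
```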
 
  • Like
Reactions: Leaded Gasoline
Hardware requirements are varying degrees of bullshit. The only games that benefit from very high end processors are unusually computationally expensive ones, think strategy games like Stellaris and Civilisation, or expansive sims like Cities. Generally speaking CPU and RAM are the areas a gamer can deemphasise in his budget, GPU reigns supreme. I personally wouldn't go for Intel before 12th gen though, since new titles will lean more into multithreading and that's where Intel are weakest. The guideline is, look at what CPU is in the current consoles, and try to match that. The PS5 uses an 8-core Ryzen clocked at 3GHz. Zen 2 and higher will readily match that (though you'd be a fool to buy a 3700X today), as will Intel 12th gen and higher.

Cyberpunk recommends a 7800X3D, which is pure nonsense. A 3600 will perform just as well, because Cyberpunk is not even slightly CPU bound. I personally would get the 7800X3D if it's an option, because that thing is incredibly good for one of my favourite games, Stellaris, and will probably last the decade out for other games as well, but Cyberpunk doesn't need it.

That's not at all a comparable product, it's not even beige.
To be COMPLETELY FAIR, I am thinking about Dwarf Fortress. An i7 probably can do it, even 12th gen, it's current year... it's just I'm thinking about it, that's all. I want this computer to be viable for roughly 10 years. That's why I've heavily invested in SSDs and multiple drives in general.

Even when new games really won't work on it, I still want it to be able to write a Word document and use some semi-advanced software. I PROBABLY will end up with an i7. Cost, really. But an i9 is definitely in my mind
 
  • Islamic Content
Reactions: Bara Blargg
DF is extremely single-threaded and tries its hardest to run like shit on any hardware it runs on.
From the videos I've listened to on its development... I fully fucking believe that. Hence the i9, which can boost itself to 5GHz per core. It may run like shit, but possibly less *extreme optimism*
 
  • Islamic Content
Reactions: Bara Blargg
From the videos I've listened to on its development... I fully fucking believe that. Hence the i9, which can boost itself to 5GHz per core. It may run like shit, but possibly less *extreme optimism*
I don't know whether it's memory-bound too, but it probably is. That means your 5GHz core doesn't matter because all the RAM bandwidth is being used by cache-inefficient spaghetti code.

I think the best (and cheapest) solution is to make sure to keep your dwarf population low and proactively geld your cats.
 
  • Informative
Reactions: WelperHelper99
I don't know whether it's memory-bound too, but it probably is. That means your 5GHz core doesn't matter because all the RAM bandwidth is being used by cache-inefficient spaghetti code.

I think the best (and cheapest) solution is to make sure to keep your dwarf population low and proactively geld your cats.
Well hopefully 32 gigs can make do for a little while till it can get upgraded lol
 
  • Islamic Content
Reactions: Bara Blargg
Not RAM capacity, RAM bandwidth. You will never escape the dwarven spaghetti. It is futile.

wouldn't it be more bound by memory latency than bandwidth? X3D CPUs perform amazingly in dwarf fortress

Yeah, right. Latency, not bandwidth. Sorry. I'm retarded.
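For anyone wondering what latency-bound actually looks like, here's a rough microbenchmark sketch (the array size and the single-cycle shuffle are my own assumptions, nothing to do with DF's actual code): dependent pointer-chasing loads can't be prefetched, so each one eats a full trip to RAM, while a plain linear sum over the same data just streams at whatever bandwidth the memory can feed.

```cpp
// Rough sketch of latency-bound vs bandwidth-bound access. Sizes are arbitrary
// assumptions; this is illustrative, not DF's actual workload.
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main() {
    const size_t N = 1u << 24;                 // ~16M entries, far bigger than any L3
    std::vector<size_t> next(N);
    std::iota(next.begin(), next.end(), size_t{0});

    // Sattolo's algorithm: turn the array into one big cycle so the pointer
    // chase visits every element in a cache-hostile order.
    std::mt19937_64 rng{42};
    for (size_t i = N - 1; i > 0; --i) {
        std::uniform_int_distribution<size_t> pick(0, i - 1);
        std::swap(next[i], next[pick(rng)]);
    }

    auto time_s = [](auto&& fn) {
        auto t0 = std::chrono::steady_clock::now();
        fn();
        return std::chrono::duration<double>(std::chrono::steady_clock::now() - t0).count();
    };

    volatile size_t sink = 0;
    double seq = time_s([&] {                  // bandwidth-bound: prefetcher-friendly
        size_t s = 0;
        for (size_t i = 0; i < N; ++i) s += next[i];
        sink = s;
    });
    double chase = time_s([&] {                // latency-bound: each load depends on the last
        size_t i = 0;
        for (size_t k = 0; k < N; ++k) i = next[i];
        sink = i;
    });
    std::printf("linear sum: %.3fs   pointer chase: %.3fs\n", seq, chase);
}
```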
Ah, the cache matters. Well, I'll be on 12th gen, so... I guess I have to hope the system as a whole meshes well enough to make it work lol
 
  • Islamic Content
Reactions: Bara Blargg
Ah, the cache matters. Well, I'll be on 12th gen, so... I guess I have to hope the system as a whole meshes well enough to make it work lol
You'll probably be fine, even with older processors. DF is one of those games that'll run fine on literally anything, but will quickly slow down to a uniform crawl on all hardware in certain conditions.
 
  • Informative
Reactions: WelperHelper99
You'll probably be fine, even with older processors. DF is one of those games that'll run fine on literally anything, but will quickly slow down to a uniform crawl on all hardware in certain conditions.
Ah. It must depend on the scale of the fortress then. What it's trying to simulate and such
 
  • Islamic Content
Reactions: Bara Blargg
You know how annoying it is when people (friends/family) come to you because you're the "computer guy" for tech support. I think even more annoying is when people don't listen to your advice. Bit of a sperg but I needed to vent. So my buddy wanted me to build him a new budget computer to replace his old budget computer; he was going to get another prebuilt from META and I convinced him not to, to let me build it, and I wouldn't charge him for labor. So I sent him a pcpartpicker list in his budget, which would have been an amazing computer for the price. Instead, he decided to do his own pcpartpicker build, and it included a 1000W PSU and a 4060. I told him he barely needs half of those 1000W: get a smaller PSU and we can get you a way better GPU. A 6700XT would be in budget, and even a 3060 Ti if we change some other parts. Nope. He wasn't having it. So whatever, he buys his shitty computer, I build it, and now he's complaining that he barely notices a difference over his old 1660 super. Lmao.
 
Yeah, right. Latency, not bandwidth. Sorry. I'm retarded.

Increased L3 cache improves the performance of bandwidth-bound workloads when there are a lot of out-of-order memory accesses. Certain bandwidth-bound workloads were a driving reason behind the development of the original 3D V-Cache enabled EPYCs.
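Rough sketch of that effect (the working-set sizes and access counts are arbitrary assumptions, nothing vendor-specific): the same random, out-of-order reads get far cheaper once the working set fits in the last-level cache, which is exactly the lever a big stacked L3 pulls.

```cpp
// Illustrative sketch only: working-set sizes and index counts are arbitrary.
// Shows how random (out-of-order) accesses speed up once the data fits in cache.
#include <chrono>
#include <cstdint>
#include <cstdio>
#include <random>
#include <vector>

int main() {
    std::mt19937_64 rng{123};
    const size_t accesses = 1u << 22;                    // 4M random reads per size
    for (size_t mib : {4, 16, 64, 256}) {                // working sets from cache-sized to RAM-sized
        const size_t n = mib * (1u << 20) / sizeof(std::uint64_t);
        std::vector<std::uint64_t> data(n, 1);
        std::vector<std::uint32_t> idx(accesses);
        for (auto& i : idx) i = static_cast<std::uint32_t>(rng() % n);

        std::uint64_t sum = 0;
        auto t0 = std::chrono::steady_clock::now();
        for (auto i : idx) sum += data[i];
        double ns = std::chrono::duration<double, std::nano>(
                        std::chrono::steady_clock::now() - t0).count() / accesses;
        std::printf("%4zu MiB working set: %5.1f ns per random read (checksum %llu)\n",
                    mib, ns, static_cast<unsigned long long>(sum));
    }
}
```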

So whatever, he buys his shitty computer, I build it, and now he's complaining that he barely notices a difference over his old 1660 super.

Let me guess. He plays a lot of Fortnite and Counterstrike and was surprised a new GPU didn't magically make those games newer.
 
Let me guess. He plays a lot of Fortnite and Counterstrike and was surprised a new GPU didn't magically make those games newer.
Pretty much. Right now it's CS and cyberpunk, and well a 1660 super can handle cyberpunk at med settings no problem. The difference between med and high is not going to be very noticeable. However, another big part of the problem is that he refused to get a better monitor; he thinks his piece of shit 75Hz Sceptre monitor is "amazing" because "it was 80 bucks and it's curved!". He doesn't realize he wouldn't be experiencing screen tearing on a better monitor, even with the 1660 super.
 
Pretty much. Right now it's CS and cyberpunk, and well a 1660 super can handle cyberpunk at med settings no problem. The difference between med and high is not going to be very noticeable. However, another big part of the problem is that he refused to get a better monitor; he thinks his piece of shit 75Hz Sceptre monitor is "amazing" because "it was 80 bucks and it's curved!". He doesn't realize he wouldn't be experiencing screen tearing on a better monitor, even with the 1660 super.
$80 is cheap for a monitor. My primary monitor goes from $180-$270 depending on sales and discounts. Granted, it's an LG 1.5x ultrawide
 
Pretty much. Right now it's CS and cyberpunk, and well a 1660 super can handle cyberpunk at med settings no problem.
That's another thing I've noticed - I can barely tell the difference among different detail levels these days. The only visible difference between Darktide on high & medium for me is high drops more frames.
 
  • Agree
Reactions: Blue Miaplacidus
$80 is cheap for a monitor. My primary monitor goes from $180-$270 depending on sales and discounts. Granted, it's an LG 1.5x ultrawide
It's not a terrible monitor, but I would never use it as my primary monitor. For a 2nd or 3rd screen it's perfectly viable. Not that mine are like pro-gamer amazing or anything. I have way too many monitors, 10 in total, but for my primary 3 I'm using 3x cheap 32" Acers mounted on a rack. I think they were 140 USD when I got them, which isn't expensive at all. Even one would be a huge upgrade to what he has though; there's a noticeable difference between 75Hz and 240Hz with FreeSync.
That's another thing I've noticed - I can barely tell the difference among different detail levels these days. The only visible difference between Darktide on high & medium for me is high drops more frames.
It really isn't very noticeable when you're actively playing. When you're actually in a game you're not going to notice that shadows aren't feathered and slightly transparent. If you're staring at a still screen, sure, you can find differences in a comparison, but actively playing they definitely don't stick out. It does depend on the game and which settings you change, but for the most part it's not much of a difference, especially at 1080p.
 