> new titles will lean more into multithreading
this has been constantly repeated since the 8th gen consoles released and games still rarely use more than 4 threads effectively
> Well alright then lol, sold
meant to upload this earlier but it turns out that torrenting a 100GB game takes quite a while
> Hardware requirements are varying degrees of bullshit. The only games that benefit from very high end processors are unusually computationally expensive ones, think strategy games like Stellaris and Civilisation, or expansive sims like Cities. Generally speaking CPU and RAM are the areas a gamer can deemphasise in his budget, GPU reigns supreme. I personally wouldn't go for Intel before 12th gen though, since new titles will lean more into multithreading and that's where Intel are weakest. The guideline is, look at what CPU is in the current consoles, and try to match that. The PS5 uses an 8-core Ryzen clocked at 3GHz. Zen 2 and higher will readily match that (though you'd be a fool to buy a 3700X today), as will Intel 12th gen and higher.
To be COMPLETELY FAIR, I am thinking about Dwarf Fortress. An i7 probably can do it, even 12th gen, it's current year... it's just I'm thinking about it, that's all. I want this computer to be viable for roughly 10 years. That's why I've heavily invested in SSDs and multiple drives in general.
Cyberpunk recommends a 7800X3D, which is pure nonsense. A 3600 will perform just as well, because Cyberpunk is not even slightly CPU bound. I personally would get the 7800X3D if it's an option, because that thing is incredibly good for one of my favourite games, Stellaris, and will probably last the decade out for other games as well, but Cyberpunk doesn't need it.
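For anyone wondering why "more threads" keeps underdelivering, it's basically Amdahl's law: if a chunk of the frame loop has to run serially, extra cores stop paying off fast. A quick back-of-the-envelope sketch below; the 60% parallel figure is a made-up assumption for illustration, not a measurement of any real game.

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n)
# p = fraction of the frame's work that actually parallelises
# n = number of threads thrown at it
def amdahl_speedup(parallel_fraction: float, threads: int) -> float:
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / threads)

p = 0.6  # assumed: 60% of a game's frame loop parallelises cleanly
for n in (1, 2, 4, 8, 16):
    print(f"{n:>2} threads -> {amdahl_speedup(p, n):.2f}x")
# 1 -> 1.00x, 2 -> 1.43x, 4 -> 1.82x, 8 -> 2.11x, 16 -> 2.29x
# past 4 threads the serial part dominates and extra cores barely move the needle
```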
That's not at all a comparable product, it's not even beige.
> Dwarf Fortress
DF is extremely single-threaded and tries its hardest to run like shit on any hardware it runs on.
> DF is extremely single-threaded and tries its hardest to run like shit on any hardware it runs on.
From the videos I've listened to on its development... I fully fucking believe that. Hence the i9, which can boost itself to 5GHz per core. It may run like shit, but possibly less so. *extreme optimism*
> From the videos I've listened to on its development... I fully fucking believe that. Hence the i9, which can boost itself to 5GHz per core. It may run like shit, but possibly less so. *extreme optimism*
I don't know whether it's memory-bound too, but it probably is. That means your 5GHz core doesn't matter because all the RAM bandwidth is being used by cache-inefficient spaghetti code.
> I don't know whether it's memory-bound too, but it probably is. That means your 5GHz core doesn't matter because all the RAM bandwidth is being used by cache-inefficient spaghetti code.
Well hopefully 32 gigs can make do for a little while till it can get upgraded lol
I think the best (and cheapest) solution is to make sure to keep your dwarf population low and proactively geld your cats.
> Well hopefully 32 gigs can make do for a little while till it can get upgraded lol
Not RAM capacity, RAM bandwidth. You will never escape the dwarven spaghetti. It is futile.
> I don't know whether it's memory-bound too, but it probably is. That means your 5GHz core doesn't matter because all the RAM bandwidth is being used by cache-inefficient spaghetti code.
wouldn't it be more bound by memory latency than bandwidth? X3D CPUs perform amazingly in dwarf fortress
> wouldn't it be more bound by memory latency than bandwidth? X3D CPUs perform amazingly in dwarf fortress
Yeah, right. Latency, not bandwidth. Sorry. I'm retarded.
> Yeah, right. Latency, not bandwidth. Sorry. I'm retarded.
Ah, the cache matters. Well, I'll be on 12th gen, so... I guess I have to hope the system as a whole meshes well enough to make it work lol
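To put rough numbers on the "your 5GHz core doesn't matter" point: a cache miss stalls the core for a fixed number of nanoseconds regardless of clock, so only cutting the miss rate (bigger L3, better locality) really moves the needle. A crude cost model sketch; every figure below is an assumption for illustration, not a measurement of DF.

```python
# Crude cost model for one entity update in a latency-bound sim:
# compute time scales with clock speed, every cache miss pays a full DRAM round trip.
# All figures are illustrative assumptions.
def unit_time_ns(clock_ghz: float, compute_cycles: int, mem_accesses: int,
                 miss_rate: float, dram_latency_ns: float = 80.0) -> float:
    compute_ns = compute_cycles / clock_ghz            # cycles / (cycles per ns)
    stall_ns = mem_accesses * miss_rate * dram_latency_ns
    return compute_ns + stall_ns

base         = unit_time_ns(4.0, 2000, 400, miss_rate=0.30)
faster_clock = unit_time_ns(5.0, 2000, 400, miss_rate=0.30)
bigger_cache = unit_time_ns(4.0, 2000, 400, miss_rate=0.10)  # fat L3 / better locality

print(f"4.0 GHz, 30% miss rate: {base:7.0f} ns")          # ~10100 ns
print(f"5.0 GHz, 30% miss rate: {faster_clock:7.0f} ns")  # ~10000 ns (+25% clock, ~1% faster)
print(f"4.0 GHz, 10% miss rate: {bigger_cache:7.0f} ns")  # ~3700 ns (same clock, ~2.7x faster)
```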
> Ah, the cache matters. Well, I'll be on 12th gen, so... I guess I have to hope the system as a whole meshes well enough to make it work lol
You'll probably be fine, even with older processors. DF is one of those games that'll run fine on literally anything, but will quickly slow down to a uniform crawl on all hardware in certain conditions.
> You'll probably be fine, even with older processors. DF is one of those games that'll run fine on literally anything, but will quickly slow down to a uniform crawl on all hardware in certain conditions.
Ah. It must depend on the scale of the fortress then. What it's trying to simulate and such.
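A toy per-tick budget shows roughly why a big fortress crawls on everything: the work grows with agents, items and flowing tiles, and past a certain size no CPU fits it in the tick. None of the constants below are DF's real numbers; they're made up purely to show the scaling.

```python
# Toy model: per-tick cost grows with agents * pathfinding work + items + flowing tiles,
# so a bigger fortress blows the tick budget no matter the CPU. Constants are made up.
def tick_cost_ms(agents: int, items: int, flow_tiles: int,
                 path_nodes_per_agent: int = 2000,
                 cost_per_node_us: float = 0.05,
                 cost_per_item_us: float = 0.2,
                 cost_per_tile_us: float = 0.5) -> float:
    pathing = agents * path_nodes_per_agent * cost_per_node_us
    hauling = items * cost_per_item_us
    fluids  = flow_tiles * cost_per_tile_us
    return (pathing + hauling + fluids) / 1000.0   # microseconds -> milliseconds

budget_ms = 10.0   # ~100 ticks per second target
for pop in (20, 80, 200):
    cost = tick_cost_ms(agents=pop * 3, items=pop * 50, flow_tiles=500)
    print(f"pop {pop:>3}: ~{cost:6.1f} ms/tick ({'fine' if cost <= budget_ms else 'FPS death'})")
```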
So whatever, he buys his shitty computer, I build it, and now he's complaining that he barely notices a difference over his old 1660 Super. Lmao.
> Let me guess. He plays a lot of Fortnite and Counterstrike and was surprised a new GPU didn't magically make those games newer.
Pretty much. Right now it's CS and Cyberpunk, and well, a 1660 Super can handle Cyberpunk at medium settings no problem. The difference between medium and high is not going to be very noticeable. However, another big part of the problem is he refused to get a better monitor; he thinks his piece of shit 75Hz Sceptre monitor is "amazing" because "it was 80 bucks and it's curved!". He doesn't realize he wouldn't be experiencing screen tearing on a better monitor, even with the 1660 Super.
> Pretty much. Right now it's CS and Cyberpunk, and well, a 1660 Super can handle Cyberpunk at medium settings no problem. The difference between medium and high is not going to be very noticeable. However, another big part of the problem is he refused to get a better monitor; he thinks his piece of shit 75Hz Sceptre monitor is "amazing" because "it was 80 bucks and it's curved!". He doesn't realize he wouldn't be experiencing screen tearing on a better monitor, even with the 1660 Super.
$80 is cheap for a monitor. My primary monitor goes from $180-$270 depending on sales and discounts. Granted, it's an LG 1.5x ultrawide.
> Pretty much. Right now it's CS and Cyberpunk, and well, a 1660 Super can handle Cyberpunk at medium settings no problem.
That's another thing I've noticed - I can barely tell the difference among different detail levels these days. The only visible difference between Darktide on high & medium for me is that high drops more frames.
> $80 is cheap for a monitor. My primary monitor goes from $180-$270 depending on sales and discounts. Granted, it's an LG 1.5x ultrawide.
It's not a terrible monitor, but I would never use it as my primary monitor. For a second or third screen it's perfectly viable. Not that mine are like pro-gamer amazing or anything. I have way too many monitors, 10 in total, but for my primary 3 I'm using three cheap 32" Acers mounted on a rack. I think they were 140 USD when I got them, which isn't expensive at all. Even one would be a huge upgrade over what he has though; there's a noticeable difference between 75Hz and 240Hz with FreeSync.
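On the tearing point: with sync off, any frame flip that lands mid-scanout splits that refresh between two frames, which is the tear line; an adaptive sync panel just starts the scanout when the frame is ready. A toy model below, with made-up frame and refresh numbers purely to illustrate the idea.

```python
# Toy model of tearing, not a measurement of any real setup.
def tears_per_second(fps: float, refresh_hz: float) -> int:
    """With sync off, count flips that land mid-scanout (each one = a visible tear)."""
    scanout = 1.0 / refresh_hz
    tears = 0
    for i in range(int(fps)):                 # one second of evenly paced frames
        flip_time = i / fps
        phase = (flip_time % scanout) / scanout
        if 0.02 < phase < 0.98:               # flip lands mid-scanout
            tears += 1
    return tears

def tears_with_vrr(fps: float, vrr_min_hz: float, vrr_max_hz: float) -> int:
    """Inside the adaptive sync window the panel refreshes when the frame completes,
    so nothing flips mid-scanout (simplified: real panels use LFC below the range)."""
    return 0 if vrr_min_hz <= fps <= vrr_max_hz else tears_per_second(fps, vrr_max_hz)

print(tears_per_second(90, 75))     # ~90 fps into a fixed 75Hz panel: a tear on most refreshes
print(tears_with_vrr(90, 48, 240))  # same fps into a 48-240Hz adaptive sync panel: 0
```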
> That's another thing I've noticed - I can barely tell the difference among different detail levels these days. The only visible difference between Darktide on high & medium for me is that high drops more frames.
It really isn't very noticeable when you're actively playing. When you're actively playing a game you're not going to notice that shadows aren't feathered and slightly transparent. If you're staring at a still screen, sure, you can find differences in a comparison, but actively playing they definitely don't stick out. It does depend on the game and which settings you change, though; for the most part it's not much of a difference, especially at 1080p.