GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

What about Framework laptops? Have you tried one?

My Framework is from the first batches of preorders, and it's been pretty great, especially considering it was new hardware from a completely unproven company. I'll probably get the 16 sometime next year.

It's not perfect though. Lately the fan tends to spin up annoyingly easily when using Windows 10 and doing typical developer stuff, but it's probably something stupid I've done to Windows. The only issues with Linux (Arch and NixOS) have been the fingerprint reader and getting sleeping/hibernating/whatever-the-fuck-it-is working flawlessly. These issues may have been fixed since, but they don't really affect the way I typically use laptops...so I haven't bothered checking. Obviously this may be a much bigger problem for other people.

I also have a MacBook Pro with an M1 Pro, and despite having mostly left the Apple ecosystem...it's a hell of a machine. I've never heard the fan spin up, and people aren't exaggerating about the battery life. There have been times when I've forgotten to plug it in while building projects, and there hasn't been any human-noticeable difference in performance, fan noise, or excess heat. The x86 emulation(?) is bearable, but I've seen a few bugs and had some erratic performance problems from time to time. My biggest problems have been with Apple's miserly RAM situation and their slow but steady drift towards locking their computers down like they do their iOS devices.

As disappointing as it may be for weird zealots like me, most people are going to be very well served by a MacBook Pro, so long as they don't mind macOS and don't skimp on the RAM. But it's also worth remembering that Apple making nice hardware doesn't mean that Framework laptops are bad! Each Framework launch seems to be a nice improvement while also not robbing users of the ability to repair, upgrade, or modify their computers.
 
I did a quick search too, and couldn't find anyone actually selling anything above 5600MT/s. It'll probably exist one day, but for now I'm fairly sure the only way to get DDR5 over 6000MT/s is if it's soldered.
Yeah, doubling the framerate will make it feel smoother even though the result is super blurry. I'm not a "content creator" either, but LCD panels suffer from smearing issues even when you're just watching TV series (which are 30fps), and IPS is only marginally better than VA there. By contrast, my OLED TVs display things much more crisply.
Mass immigration to save the economy and TN film because it makes guns shoot faster were the two greatest lies they ever sold to the public.
 
Mass immigration to save the economy and TN film because it makes guns shoot faster were the two greatest lies they ever sold to the public.
To be fair TN genuinely does have better response times. It just looks absolutely awful in every other way. TN is to IPS what IPS is to VA, and there is still a huge gap between OLED and VA.
 
Just bought the wifi chip for my board. Surprisingly cheap. I've also been thinking about possibly getting a 12th gen i9, depending on how Christmas turns out. I read up on the specs for games like Cyberpunk and Baldur's Gate; a 12th gen i7 would work but would be pushing it. I've also been looking at Noctua coolers; you guys were right, they are monsters.
 
I've gradually moved back to using my laptop as my daily driver; I've been gaming on a Steam Deck mostly, and 90% of my computing needs don't require desktop power.

Anyone have suggestions for a good USB-C dock that can output to two monitors and power the laptop? The only ports my machine has are two USB-C. The best option seems to be a second-hand HP or Lenovo hub (prices on new ones are insane). I'm not keen on getting an off-brand one from Amazon; I already got one, and when I plug the power in it starts buzzing, and I don't want to burn my house down.
 
Are you upgrading from an i3 on the same board or something? Even a 9th gen i7 would be more than enough for Cyberpunk and Baldur's Gate 3 if you just want 60fps.
I'm reading the hardware requirements on the Steam page, man. A 12th gen i7 can do it, recommended even, but I'd rather have a little extra power.
 
Noctua is overrated. The Thermalright Peerless Assassin gives competing performance for a lot less.
I wouldn't say Noctua is overrated; they've definitely earned their rep with quality coolers and service. But competition will eventually force them to lower their prices significantly.
 
I wouldn't say Noctua is overrated; they've definitely earned their rep with quality coolers and service. But competition will eventually force them to lower their prices significantly.
As of now they're overrated. A large cooler from them is still $100+

The Peerless Assassin is a whopping $30-40. The dual-140 Frost Commander? $44.

Noctua has not really lowered prices at all, because people still buy them just because "it's Noctua".

Call me up when a D15 is $50.
 
I'm reading the hardware requirements on the Steam page, man. A 12th gen i7 can do it, recommended even, but I'd rather have a little extra power.
Hardware requirements are varying degrees of bullshit. The only games that benefit from very high end processors are unusually computationally expensive ones: think strategy games like Stellaris and Civilisation, or expansive sims like Cities. Generally speaking, CPU and RAM are the areas a gamer can deemphasise in his budget; GPU reigns supreme. I personally wouldn't go for Intel before 12th gen though, since new titles will lean more into multithreading and that's where Intel are weakest. The guideline is: look at what CPU is in the current consoles, and try to match that. The PS5 uses an 8-core Zen 2 Ryzen clocked at up to 3.5GHz. Zen 2 and higher will readily match that (though you'd be a fool to buy a 3700X today), as will Intel 12th gen and higher.

Cyberpunk recommends a 7800X3D, which is pure nonsense. A 3600 will perform just as well, because Cyberpunk is not even slightly CPU bound. I personally would get the 7800X3D if it's an option, because that thing is incredibly good for one of my favourite games, Stellaris, and will probably last the decade out for other games as well, but Cyberpunk doesn't need it.
Noctua is overrated. The Thermalright Peerless Assassin gives competing performance for a lot less.
That's not at all a comparable product; it's not even beige.
 
Noctua is overrated. The Thermalright Peerless Assassin gives competing performance for a lot less.
I don't think Noctua fans are that overrated; they're very quiet and reliable. However, their coolers are definitely overpriced compared to the competition. I just like the way they look, nice and classic. I have an NH-U12S with an NF-P12 redux fan for my R5 5600X. Overkill for a 65W CPU, but it looks nice in my case.
 
That's not at all a comparable product; it's not even beige.
Irrelevant, since even Noctua sold out to the "paint it black" crowd.

Besides, the only Noctua fans remotely worth buying are the brown and black ones.

*Edit* Look in amazement as a dual-120 cooler almost matches a D15 at similar dB:

[attached benchmark chart]
 
Just bought the wifi chip for my board. Surprisingly cheap. I've also been thinking about possibly getting a 12th gen i9, depending on how Christmas turns out. I read up on the specs for games like Cyberpunk and Baldur's Gate; a 12th gen i7 would work but would be pushing it. I've also been looking at Noctua coolers; you guys were right, they are monsters.

An i7-12700K can run Cyberpunk 2077 at almost 350 fps.

It can run Baldur's Gate 3 at 430 fps.
 
I think AMD is a lot more undeveloped on the software side before you even get into hardware power comparisons. CUDA is a dream and ROCm leaves me buckbroken every time. God help you if it involves anything new or experimental, which most AI things are going to be.

Try SYCL. Way better than CUDA IME, and JIT compiling means it runs on NVIDIA anyway.
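For anyone curious what that looks like in practice, here's a minimal single-source SYCL 2020 vector-add sketch (assuming a toolchain like DPC++ or AdaptiveCpp; everything here is illustrative, not a benchmark). The point is that one binary gets JIT-compiled for whatever device the runtime finds, NVIDIA included:

```cpp
// Minimal SYCL 2020 vector add. Illustrative sketch: assumes a SYCL 2020
// toolchain (e.g. DPC++ or AdaptiveCpp) with a backend for your GPU.
#include <sycl/sycl.hpp>
#include <iostream>
#include <vector>

int main() {
    constexpr size_t n = 1024;
    std::vector<float> a(n, 1.0f), b(n, 2.0f), c(n, 0.0f);

    sycl::queue q{sycl::default_selector_v};  // picks a GPU if one exists, else CPU
    {
        sycl::buffer bufA{a}, bufB{b}, bufC{c};
        q.submit([&](sycl::handler& h) {
            sycl::accessor A{bufA, h, sycl::read_only};
            sycl::accessor B{bufB, h, sycl::read_only};
            sycl::accessor C{bufC, h, sycl::write_only, sycl::no_init};
            h.parallel_for(sycl::range<1>{n}, [=](sycl::id<1> i) {
                C[i] = A[i] + B[i];  // the kernel, JIT-compiled per device
            });
        });
    }  // buffers go out of scope here and sync back to the host vectors

    std::cout << "c[0] = " << c[0] << '\n';  // expect 3
}
```

No CUDA-specific kernel language and no separate ROCm port; device selection and kernel compilation happen at runtime.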

Now a number of different companies are releasing dedicated AI chips, most recently AMD. Team red also says there will be AI processors on their future graphics cards.
Are these AI chips more efficient at LLM stuff than graphics cards? A yet more refined ASIC basically?

AMD's memory controller isn't as good as NVIDIA's, and my colleagues report having a much tougher time achieving theoretical limits on Instinct as compared to A100s (no H100s yet). Ponte Vecchio is basically a proof of concept product and not worth discussing.

I expect AI-centric chips to eventually outperform GPUs simply because things designed to purpose tend to be more efficient. GPUs weren't designed for AI. They were designed for graphics and that happened to mean they were way better at AI than x86 CPUs. That doesn't mean they're the best possible thing for AI, and they probably aren't.
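To put rough numbers on why the memory side dominates for LLMs: single-stream token generation has to stream every weight from memory once per generated token, so bandwidth, not FLOPS, sets the ceiling. A back-of-envelope sketch (the model size and bandwidth figures below are assumptions for illustration, not measurements):

```cpp
// Back-of-envelope: why single-batch LLM inference is memory-bandwidth bound.
// All figures below are illustrative assumptions, not measured numbers.
#include <cstdio>

int main() {
    const double weight_bytes = 13e9 * 2.0;  // assume a 13B-parameter model in fp16
    const double bandwidth    = 2.0e12;      // assume ~2 TB/s of HBM (A100-class)

    // Every weight is read once per generated token, so this is an upper bound
    // on tokens/second if memory bandwidth were the only limit:
    const double tokens_per_s = bandwidth / weight_bytes;
    std::printf("~%.0f tokens/s upper bound\n", tokens_per_s);  // ~77
    return 0;
}
```

That's part of why an AI-specific chip doesn't need more raw compute than a GPU to win at inference; it mostly needs to feed its memory system better.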
 